Installing postgres on Ubuntu

I recently installed Ubuntu 18.10 and needed to figure out how to install a PostgreSQL database server. Below are the steps. First we install the server components, then we check that the server is running and that we can connect using psql, and finally we change the default password for the postgres user to ‘securePassword’.

# Update package repo and install
sudo apt update
sudo apt install postgresql postgresql-contrib

# Check that the database is running and that you can connect:
sudo -i -u postgres
psql -c "SELECT version();"

# Exit and secure password of postgres user:
sudo -u postgres psql -c "ALTER USER postgres PASSWORD 'securePassword';"
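To sanity-check the new password, you can connect over TCP as the postgres role rather than via the postgres system account; psql will prompt for the password set above:

```shell
# Connect as the postgres role over TCP; enter the new password when prompted
psql -h localhost -U postgres -W -c "SELECT version();"
```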

Installing Anaconda on Windows Subsystem for Linux

Windows Subsystem for Linux (WSL) is a great way of running a Unix environment on a Windows machine. I tend to work on cases involving large scale data science, but am, like most corporate users, tied to a Windows machine. Having access to a fully-fledged Unix environment is key to productivity and work pleasure. In this guide I will show you how to install Anaconda on WSL from scratch.

Anaconda is an environment and package manager for Python. It enables you to install and manage the typical Python’esque data science tools such as TensorFlow and NumPy. It is available as a Windows installer, but running Anaconda from the Windows command line is clunky and doesn’t feel right (at least not after ~18 years of Unix muscle memory). For me, installing conda inside WSL was the way to keep working with my favorite tools.

Step 1: enable WSL feature in Windows 10

First step is to install WSL itself if you haven’t already done so. Installation has two parts – first you enable the WSL in Windows 10, then you install your Linux distribution of choice, which plugs in to the WSL shell. WSL is responsible for translating the Linux (POSIX) syscalls into something the NT kernel can understand and vice versa.

Open powershell.exe and enable the WSL feature:

Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Windows-Subsystem-Linux

This should take a while, so grab a cup of coffee.

Step 2: install Ubuntu

Once done, you can install Ubuntu in two ways: via the Microsoft Store or by running bash.exe. For the latter, press Windows key + R, enter ‘bash.exe’, and hit enter. This will install Ubuntu on top of WSL.

Step 3: download and install Anaconda

Once Ubuntu is installed, open a browser and go to the Anaconda download page.

Pick the 64-bit installer for Linux (not Windows). I prefer Python 3.7 as 2.7 is old, but you may need 2.7 for specific / good reasons.

Instead of downloading in the browser, right-click the button and select ‘copy link’. Go back to the terminal window and download the installer from the command line. We want to do this as it is easier than copying the file into your Linux home directory from your Windows downloads directory.


wget <paste the copied link here>

The installer weighs in at about 653 MB, so the download takes a minute or two on a decent connection (~39 seconds at 17 MB/s in my case).
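Optionally, verify the download before running it by comparing its hash against the SHA-256 value published on the Anaconda site (the glob below assumes the default installer filename pattern; adjust it to whatever you downloaded):

```shell
# Print the SHA-256 of the downloaded installer and compare it by eye
# against the hash published for your installer version
sha256sum Anaconda3-*.sh
```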

Make the file executable and run it:

# Adjust the glob if you downloaded a different installer version
chmod +x Anaconda3-*.sh
./Anaconda3-*.sh

Some text will fly by. Grab another cup of coffee after you have answered a few questions. If you use bash, remember to answer ‘yes’ when asked to add conda to your PATH, so the binary can be resolved from your shell (the install usually lands inside ~/anaconda3).
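Once the installer finishes, you can check that conda resolves correctly (assuming you accepted the default install location and let the installer update your profile):

```shell
# Reload the shell profile so the updated PATH takes effect
source ~/.bashrc

# Both should succeed if conda was added to your PATH
conda --version
which conda
```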

Step 4: create a new environment and install packages

Create a new environment and install your desired packages into it:

conda create -n newenv
conda activate newenv
conda install tensorflow
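If you need a particular interpreter, or want to see what ended up in the environment, a couple of optional variations (the version pin is just an example):

```shell
# Pin the Python version when creating the environment
conda create -n newenv python=3.7

# List your environments, and the packages inside the active one
conda env list
conda list
```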

And you are done. Happy coding!

Phuket Power Lines

‘Nerve ends in power lines’, as Nocturnal Projections, a post-punk outfit from Dunedin, New Zealand, sing. I have always found distribution power lines (that is, the low voltage assets that carry power from HV / zone substations to people’s houses) and their different shapes, sizes, and standards fascinating. Their structure and condition reveal a lot about a country’s policy, engineering expertise, and approach to risk management. For instance, a social democratic / labour policy tends to favour renewal of overhead distribution poles, as it keeps people in jobs inspecting, augmenting or replacing assets, and managing vegetation. The sophistication of electrical safety rules and asset performance strategies shines through in the way pole top structures are managed.

Standards range from underground cabling in Scandinavia through Sydney’s grids and radials to Thailand’s controlled chaos. Here, power lines hang 1.5–2 metres from the ground; you could reach out and grab them in most cases. Some have insulation, but it would be very easy to accidentally snap it and disrupt a whole neighbourhood. Vegetation management, the practice of trimming trees and bush to keep a safe distance from electrical circuits, also appears not to be that common, as the photo above shows. I will keep posting power lines in this category.

The Guardian on migrating from Mongo to Postgres

The Guardian’s Digital blog has posted an excellent write-up on their recent migration from MongoDB to PostgreSQL. For the uninformed, the former is a so-called ‘NoSQL’ database that allows developers to treat persisted data and queries as pure JavaScript objects (JSON). The latter is a traditional SQL database, developed by a global community of open source hackers. The article is great because it gives concrete examples of why PostgreSQL would have been a better choice (scale, stability, maturity), the challenges of live migration in a very large environment, and how to deal with them.

Postgres has existed for more than 22 years. Its community claims it is the world’s most advanced open source database. Having worked extensively with Postgres for almost 10 years, I can testify that this is a true statement.

Mongo, on the other hand, emerged in the late 2000s as the pinnacle of the ‘NoSQL’ movement: developers were thirsty for easier, more lightweight approaches to databases without the need to learn SQL. It worked, and today MongoDB is backed by a NASDAQ-listed company with a $4.5b market cap and 1000+ employees. Mongo was touted as the future of databases for start-ups and full stack engineers who wish to move fast without the cruft and constraints of an orthodox SQL database like MSSQL or Oracle. So when The Guardian decided to build their next generation CMS on top of this technology, it was a big deal for the technology and its commercial potential.

I’m not saying Mongo is a poor choice – it was / is successful because it allows developers to move very fast from concept to production using a familiar technology stack (JavaScript). No need to learn SQL or complex relational algebra. It applied the necessary competitive pressure to prompt other vendors to add similar ease of use and speed to their own stacks (e.g. jsonb support in Postgres), creating much needed innovation in a market that had been dominated for too long by a duopoly (MS and Oracle) resting on its laurels.
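As a small, hedged illustration of that jsonb support (assuming a local Postgres instance and the psql client; the document below is made up), a JSON field can be extracted without leaving SQL:

```shell
# Sketch only: extract a field from a JSON document using jsonb operators
psql -U postgres -c "SELECT '{\"title\": \"My article\", \"tags\": [\"cms\"]}'::jsonb ->> 'title';"
```

The query returns ‘My article’ — the same pick-a-field-out-of-a-document ergonomics that made Mongo attractive, inside a relational database.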