This is my homepage

This is where Rey Dhuny posts internet things.

About + archives

Cheap and cheerful uptime monitoring

When I was setting things up I wanted to check that everything was working as expected, part of which was ensuring that requests were responding with 200 HTTP status codes and, if not, notifying me accordingly.

Running the following script using GitLab’s Pipeline Schedules does the job nicely:
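The script itself hasn’t survived here, but a minimal sketch of a .gitlab-ci.yml that behaves this way might look like the following (the URL is a placeholder):

```yaml
# Illustrative sketch — replace the URL with the site you want to monitor
uptime_check:
  script:
    # Fetch the page, keeping only the HTTP status code
    - 'STATUS=$(curl --silent --output /dev/null --write-out "%{http_code}" https://example.com/)'
    - 'echo "HTTP status: ${STATUS}"'
    # Fail the build unless the status is 200
    - '[ "${STATUS}" = "200" ]'
```

Because the status code is echoed before the check, it shows up in the job log that GitLab emails on failure.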

I created a schedule to run on an hourly basis.

The “build” passes if a 200 status code is returned, otherwise it fails. What’s useful is that if a build fails GitLab will send an email containing the log output, including the HTTP status code :)

How to migrate from Jekyll to WordPress

Last month the decision was made to consolidate years of newsletters, weblog posts and other minutiae from various Jekyll instances to one WordPress-powered deal.

A fair amount of time was spent researching the best, quickest (and easiest!) way to do this but, unsurprisingly, most searches turned up the opposite: migrating from WordPress to Jekyll. Anyway, here’s how it’s done:

Jekyll instance

  1. Install the Jekyll Feed plugin.
  2. Do a jekyll build and ensure the generated feed.xml has all the posts that are intended to be migrated.
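As a quick sanity check on step 2, you can count the posts in the generated feed. This helper is my own addition (jekyll-feed emits an Atom feed, where each post is an entry element):

```shell
#!/bin/bash
# count_feed_entries FILE — count the <entry> elements in an Atom feed,
# i.e. the number of posts that will be migrated
count_feed_entries() {
  grep -o '<entry' "$1" | wc -l | tr -d ' '
}

# e.g. count_feed_entries _site/feed.xml
```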

WordPress instance

  1. Download and install the WP All Import plugin: wp plugin install wp-all-import --activate.
  2. In WordPress Admin, click on the All Import link and upload the feed.xml file generated earlier.
  3. Follow the four-step process to map and import the feed.xml data file into the WordPress instance.

I’d have saved a bunch of time had I been pointed at the WP All Import plugin from the beginning, so hopefully this will help somebody in the future.

Are Google Authenticator accounts saved as part of iOS Backups?

TL;DR Creating an encrypted iOS backup in iTunes will back up your accounts held in Google Authenticator.

Having suffered performance issues with the iOS 11.1.1 update, I decided to restore my iPhone to factory settings, then restore from a backup, but I worried that I would lose the accounts held in Google Authenticator.

Are Google Authenticator accounts saved as part of iOS Backups?

I’m happy to report that if you create an encrypted iPhone backup in iTunes then all the accounts in Google Authenticator will be backed up, and will be present and correct when you restore your iPhone from your backup.

Heroku-style name generator

I put together a bash function to output a Heroku-style name.
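The function itself isn’t included here; a minimal bash sketch of the idea (adjective-noun-number, with illustrative word lists rather than the originals) might look like:

```shell
#!/bin/bash
# heroku_name — print a Heroku-style name like "misty-waterfall-4729"
# (the word lists below are made up for illustration)
heroku_name() {
  local adjectives=(autumn billowing bitter cold dark hidden misty silent)
  local nouns=(breeze cloud fire moon rain river waterfall wind)
  local adj=${adjectives[RANDOM % ${#adjectives[@]}]}
  local noun=${nouns[RANDOM % ${#nouns[@]}]}
  # A four-digit suffix, in the 1000-9999 range
  printf '%s-%s-%d\n' "$adj" "$noun" $((1000 + RANDOM % 9000))
}

heroku_name
```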

Palin — an internet connection indicator for your Mac

Palin screen shot

Palin is an app for your menu bar that shows a green dot when you have an internet connection, or a red dot if you don’t have an internet connection.

Download the latest version here.

I built Palin for two reasons:

  1. I tether using my iPhone on the commute to work – the internet is often flaky and I regularly find myself opening a terminal and running ping
  2. The wifi card on my work machine is rather troublesome, so it’s nice to have the internet status available at a glance.

Palin is named for Michael Palin, whose television travel series Around the World in 80 Days happened to be on television the night before I decided to build it.

How to create (and destroy) your own Twitter archive

EDIT (10/01/15): This has since been replaced by a new version of twttr_autodestructor.


I wrote a handy bash script that creates an archive of all the tweets posted then deletes them from Twitter’s servers. I run it with cron on a weekly basis.


  1. Install t, an awkwardly named command-line tool for Twitter.
  2. Add something like the following to your crontab if you’re planning on running it on a schedule:

     # Run at 23:45 every Sunday
     45 23 * * 0 source /home/kanye/


Set the BOX_USER, TWITTER_USER and BACKUP_FOLDER variables (and don’t forget to create the path to the BACKUP_FOLDER eg. mkdir /home/kanye/archive_kanyewest).


#!/bin/bash
# For all your automatic Tweet backup and destruction needs

# Variables

# The user you're running the script as eg. `kanye`
BOX_USER=kanye
# The Twitter account you want to backup eg. `hello_ebooks`
TWITTER_USER=hello_ebooks
# Where the archives are kept
BACKUP_FOLDER=/home/${BOX_USER}/archive_${TWITTER_USER}
# Dated archive filename
FILE=${TWITTER_USER}_$(date +%d%m%y).csv

# Make workspace directory
mkdir /tmp/twttr_autodestructor && cd /tmp/twttr_autodestructor

# Create archive file
/usr/local/bin/t timeline @${TWITTER_USER} --csv --number 1000 --decode-uris > ${FILE}

# If the file has contents (twttr updates to backup)
if [ -s ${FILE} ] ; then

  # Copy archive
  cp ${FILE} ${BACKUP_FOLDER}/.

  # Remove column headers
  sed -i '1d' ${FILE}

  # Get IDs only
  awk -F"," '{print $1}' ${FILE} > delete_me_column

  # Put the IDs on one line for t
  sed ':a;N;$!ba;s/\n/ /g' delete_me_column > delete_me_row

  # Delete!
  /usr/local/bin/t delete status -f `cat delete_me_row`
else
  # Send an email saying there were no twttr updates to backup
  echo "${FILE} is empty" | mail -s "${FILE} is empty" ${BOX_USER}@localhost
fi

# Delete workspace directory
cd ~ && rm -rf /tmp/twttr_autodestructor


I intend to throw my updates into GitHub, who do some nice formatting with .csv files.

How to get up and running with Hubot on Ubuntu

I’ve been experimenting with Hubot, an extendable chat bot, and wanted to get it running on my own infrastructure after testing it out on Heroku.

  1. Create your Hubot
    1. Generate a Hubot instance
    2. Daemonize Hubot
      1. Create
      2. Create .hubotrc
    3. Push to repo on GitHub/Bitbucket
  2. Spin up an Ubuntu instance (I used Ubuntu 15.04) and clone your Hubot repo
  3. Create Hubot integration on Slack
    1. Add Slack API token to .hubotrc
  4. Start Hubot
    1. sh debug to test
    2. sh start to run proper

Fun with temperature sensors

Late November 2013 I bought a cheap USB temperature sensor for £8.69 and put it in a drawer and forgot about it.

gnuplot graph of temperature against time

Having rediscovered it yesterday I decided to see if I could get it logging some temperatures running on a Raspberry Pi B+.

From what I’ve read this afternoon, there’s a bunch of different variations of this temperature sensor, named TEMPer1. The output from running lsusb was:

Bus 001 Device 008: ID 0c45:7401 Microdia

The following is cribbed from the commands I wrote whilst getting this running and is pretty hacky but seems to work okay. ONWARDS!

Get the drivers for the temperature sensor

# Get the `libusb-dev` library
sudo apt-get install libusb-dev

# Clone the driver
cd ~ && git clone

# Jump to the repo folder
cd usb-thermometer

# Compile
make

# Plug in the temperature sensor + test it works
sudo ./pcsensor

# Allow `pcsensor` to be run without root
sudo cp ~/usb-thermometer/99-tempsensor.rules /etc/udev/rules.d/.

# Copy to `/usr/local/bin`
sudo cp pcsensor /usr/local/bin/

# Test as a regular user (no sudo)
pcsensor

Create a data file

We want to create a data file that can be used to build a graph. I imagine gnuplot could probably work with the unsanitised pcsensor output but, since I’ve never used gnuplot before (coupled with the fact that I’m impatient), I sanitised it first:

# Get `pcsensor` output and throw it in a temporary file
pcsensor -c >> ~/temp.tmp

# Delete date as we only want the temperature for the last 24 hours
sed -i 's/^[^ ]* //' ~/temp.tmp 

# Delete word `Temperature` from `pcsensor` output
sed -i 's/\<Temperature\>//g' ~/temp.tmp

# Delete letter `C` from `pcsensor` output
sed -i 's/.$//' ~/temp.tmp

# Replace spaces with commas `,`
sed -i 's/ \{1,\}/,/g' ~/temp.tmp

# Copy the sanitised data to `temperature_log.txt`
cat ~/temp.tmp >> ~/temperature_log.txt

# Remove the `temp.tmp` file
rm ~/temp.tmp
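Assuming pcsensor -c emits lines shaped like 2013/11/30 17:32:50 Temperature 21.06C (a format inferred from the sed commands above, not confirmed), the whole clean-up can be expressed as one function and checked end-to-end:

```shell
#!/bin/bash
# sanitize — turn a raw `pcsensor -c` line into "HH:MM:SS,temperature"
# (the input format is an assumption based on the seds above)
sanitize() {
  sed 's/^[^ ]* //' \
    | sed 's/\<Temperature\>//g' \
    | sed 's/.$//' \
    | sed 's/ \{1,\}/,/g'
  # 1: drop the leading date field
  # 2: drop the word Temperature
  # 3: drop the trailing C
  # 4: squeeze runs of spaces into a single comma
}

printf '2013/11/30 17:32:50 Temperature 21.06C\n' | sanitize
# → 17:32:50,21.06
```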

Plot a graph

Install gnuplot:

sudo apt-get install gnuplot

Create a gnuplot.conf file for gnuplot:

set terminal png
set datafile separator ","
set output "temperature_graph.png"
set timestamp
set ylabel "Temperature (°C)"
set xlabel "Time"
set xdata time
set xtics rotate
set timefmt "%H:%M:%S"
set format x "%H:%M"
set grid
set key off
plot "temperature_log.txt" using 1:2 with lines lt rgb "#ff66cc"

Plot the graph

gnuplot ~/gnuplot.conf

Put it all together


# Get the current temperature and create a handy graph

# Get `pcsensor` output and throw it in a temporary file
pcsensor -c >> ~/temp.tmp

# Delete date as we only want the temperature for the last 24 hours
sed -i 's/^[^ ]* //' ~/temp.tmp 

# Delete word `Temperature` from `pcsensor` output
sed -i 's/\<Temperature\>//g' ~/temp.tmp

# Delete letter `C` from `pcsensor` output
sed -i 's/.$//' ~/temp.tmp

# Replace spaces with commas `,`
sed -i 's/ \{1,\}/,/g' ~/temp.tmp

# Copy the sanitised data to `temperature_log.txt`
cat ~/temp.tmp >> ~/temperature_log.txt

# Remove the `temp.tmp` file
rm ~/temp.tmp

# Create a new graph
gnuplot ~/gnuplot.conf

Create a cron job

You’ll probably want to create a cron job that will run every 15 minutes, which seems like a nice number:

# take the temperature every 15 minutes
*/15 * * * * source /home/rey/

As I’m only interested in the temperature for the last 24 hours I’ll also delete the temperature_log.txt file at 00:00:

# remove `temperature_log` file every 24 hours at midnight
0 0 * * * rm /home/rey/temperature_log.txt

Spatchcock Tangy Chicken

I made this up as I went along and the result was a deliciously flavoured chicken that I will definitely be making again.

It tastes even better the next day in a chicken and piccalilli sandwich.

Spatchcock Tangy Chicken

Check out the recipe on GitHub: Spatchcock Tangy Chicken

Updates to

EDIT (02/09/14): This is no longer the case – I should write why, soon!

I had some time yesterday to make some changes to

New RSS feed

Although zzmag is primarily a newsletter I appreciate that some folk prefer to read it on the website, while others love their feedreader. You can now subscribe to zzmag using the RSS feed.

New issue URLs

Over the past few weeks I’ve been testing different URL structures. As a proponent of beautiful URLs I wanted a user to be able to navigate through issues using the URL alone.

Long story short, I’ve removed the year and month from the issue URLs:

Old issue URL

New issue URL

New 404 page

To support the new issue URLs, I’ve added a more useful 404 page.

Paragraph IDs

It was a bit of a pain trying to link somebody to a specific paragraph in an issue. A user would have to visit an issue page then either search in-page for a keyword or scroll through the entire issue.

To allow linking to a specific paragraph, each paragraph now has an ID.

For example, to link to the paragraph about NASA’s Astronomy Picture of the Day in Issue 27, inspect the paragraph1 in question and note the ID:

Then append it to the end of the issue URL like so:
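The original example URL hasn’t survived here, but the shape is the usual fragment syntax (the domain below is a placeholder, the ID is taken from the footnote):

```
https://example.com/27#zz:10
```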

I’ve also added a cheeky highlight to help your eyeballs.

  1. This is quite a technical “solution” but it works for me. To inspect the paragraph, right click on the paragraph in question, click Inspect Element then you’ll see the ID, for example: <p id="zz:10"></p>

How I compose my newsletter

Four weeks ago I decided to start putting out my own weekly newsletter. It’s called zzmag, full of internet and delivered every Monday. Some folk have asked how I go about publishing it so I decided to jot down some notes.


  • A text file for your work-in-progress newsletter. I use Simplenote1.
  • A Jekyll-powered website to host your newsletter archive and handily convert Markdown to HTML.
  • An account with TinyLetter to send your newsletter.


  1. Create a new text file for your newsletter.
  2. Write all content in the Markdown format.


  1. Once your newsletter is ready to go out, create a new Markdown file in your Jekyll-powered website and paste in the content from your text file2.
  2. jekyll serve --watch and visit localhost:4000 to view your newsletter.
  3. View the source of the HTML page and copy the HTML markup of the newsletter.


  1. Log in to TinyLetter and click Compose.
  2. Click the View Source Code button on the WYSIWYG editor and paste the HTML markup you copied earlier, replacing any existing markup.
  3. Click Send Preview and admire your handiwork!

Good things!

  • Jekyll provides a handy way of generating only valid, essential HTML markup for your TinyLetter newsletter. In my experience WYSIWYG editors have a bad habit of creating surplus and spurious HTML tags.
  • Own your platform: if anything happens to TinyLetter you have your own archive of newsletters, hosted on your own domain.
  1. I keep my draft newsletter in Simplenote as I make heavy use of the Simplenote web, iPad and iPhone apps. It’s readily available wherever I’m consuming internet. 

  2. Here is Issue 4’s Markdown file as an example 

Multipurpose salsa

This is an amazing salsa that I make in large batches then bag up to freeze. It can be eaten with tortilla chips but is usually put in chilli, curry, casserole, etc.

Multipurpose salsa

Check out the recipe on GitHub: Multipurpose salsa

Corned beef camping stew

I wanted to create a no-fuss stew that could be made without the need for fresh meat. I’m using corned beef and this recipe will feed 2–3 hungry folk.

I’m really happy with how this turned out; you should eat it outdoors on a cool evening with crusty bread.

Corned beef camping stew

Check out the recipe on GitHub: Camping corned beef stew

The 2013 roundup

Following Karl’s 2013 roundup I thought I’d have a go at my own. This took longer than I expected.

OS X Applications

The only new application here is Moom. I spend much of my time with a fair few windows open and Moom gives me an unobtrusive way to organise windows superquick. I would recommend it in a heartbeat.

I’ve been a user of the rest of the applications since their respective births and they do their jobs admirably.

  • Chrome Canary - for development and surfing the information super highway
  • iTunes - for listening to my music
  • Moom - window manager
  • Pixelmator - image editing
  • Quicksilver - application launcher
  • Terminal - where I spend most of my time
  • Tweetbot - to share what I’m having for lunch
  • VLC - to watch all the videos

Terminal Applications

I’m living in the terminal more than ever and have recently moved my instant messaging and email to the command line. I find it more efficient to jump between tmux windows and sessions than between different attention-grabbing applications. A side effect is that I’m using the mouse a lot less, which is better for my wrists.

  • Irssi - IRC client
  • Glances - operating system monitoring tool
  • MCabber - XMPP client
  • Mutt - email client
  • tmux - terminal multiplexer
  • Vim - the text editor of truth

iOS Applications of note

Although I only have two pages of applications on my iPhone I didn’t feel the need to list them all. Here is a small(ish) selection of ones that I use regularly and don’t think I could do without.


Hardware

  • Dell box - an ancient Dell box that runs Debian
  • Filco Majestouch 2 - mechanical keyboard
  • iPad mini
  • iPhone 5
  • Mac Mini - a media centre of sorts
  • MacBook Pro (with Alkr cover) - my primary machine
  • Raspberry Pi - I have two of these and love them for tinkering

Online services

I’m hoping 2014 will be the year I ditch third party email providers and move to a self-hosted deal. I’ve considered different ways to do this but none have struck me as particularly future-proof or time efficient.


I switched to Vim because I was spending much of my time in a terminal. I’ve been using Screen for many years, mainly for kinda persistent tasks (like running a webcam) but found its performance a bit quirky.

These days I need something more scalable and robust so thought I’d bite the bullet and kick the tyres of tmux. It’s pretty sweet, here is my starter config.

I’m not going to write my own instructions on how to tmux but here are a few links that I found helpful:

I’m also currently experiencing a curious bug on Mavericks:

tmux 1.8 has a bug that causes it to unexpectedly underline all characters when the status bar requests a bold color.

Colours on Vim and tmux

When using Vim (7.3) through tmux (1.8) I found that the Vim colour scheme looked strange. After much googling this was the solution that worked for me.

Set the following in your .vimrc1:

set term=screen-256color

Set the following in your .tmux.conf2:

set -g default-terminal "screen-256color"

This was very much trial and error3 but seems to work across both OS X and Debian boxes.

EDIT (5/12/13): Since I was struggling to understand colours in the terminal, I emailed the knowledgeable @geraintrjones who sent me an excellent explanation (with some brilliant Serengeti animal analogies):

Terminal emulators tell the world about their capabilities with something called a $TERM variable.

The $TERM variable is a label that says “I am this kind of thing” - for instance, OS X’s Terminal has its default $TERM set to xterm-256color. You can see this by typing echo $TERM in the terminal. (You might see something else, and this is where the problem starts…)

This doesn’t tell you much on its own, but there is a big database called terminfo that lists all the $TERM variables, and what features something with, say, xterm-256color, supports.

It’s like a label saying ‘Elephant’. When you look up ‘Elephant’ in the animal list, you see it has 4 legs, a trunk, and can support palanquins but requires a mahout.

You can see the terminfo entry for your terminal by typing infocmp $TERM (which will look up that terminal’s $TERM in your system’s terminfo database).

You will see the number of colours listed like this: colors#[number]. xterm-256color, for instance, will show colors#256; xterm-color will show colors#8.

This means that OS X’s Terminal is telling the world it can handle that number of colours. Applications like Vim see a label saying “I have all the xterm-color features!” and think, “Okay, I’d better give this old timer 8 colours”. Or they see xterm-256color and think, “Let’s give this dude 256 colors”.

What should happen is that tmux reads the $TERM value of your terminal, sees xterm-256color (or similar), thinks “Okay, that’s a 256 colour terminal” and sets its $TERM to screen-256color. Vim running in tmux then reads tmux’s $TERM and goes, “Cool, I looked screen-256color up and it has 256 colours - let me crack out the 256 colour palettes, like a colourful boss”.

Terminal tells tmux it’s an Elephant, tmux tells Vim it’s an Elephant, Vim gets the mahouts out and preps the palanquin.

What’s probably happening is that your .bash_profile is setting a different $TERM - for instance, xterm-color.

Tmux then goes, “Okay, this is an 8 colour terminal, I’m gonna set my $TERM to screen”, which only has 8 colours. Vim then reads tmux’s $TERM and serves up a paltry 8 colours only.

So, terminal is telling tmux it’s an Antelope, so tmux tells Vim it’s an Antelope, Vim leaves the mahout at home and you don’t get a palanquin, even though you could handle that sh_t.

So by setting set -g default-terminal "screen-256color" in your .tmux.conf, you are telling tmux to always set its $TERM to screen-256color, and then apps like Vim will look that up in terminfo and see, yup, supports 256 colours.

However, your .bash_profile gets run on every new terminal, right? So it’s run again, overriding the $TERM.

So, telling Vim to use a different $TERM, i.e. set term=screen-256color in your .vimrc, makes it use the features terminfo lists for screen-256color, which includes 256 colours. You can use infocmp screen-256color if you’d like to see this.

So Tmux says it’s an Elephant, then your .bash_profile changes the label to Antelope - then you have to have Vim tell itself “ignore that, it’s really an Elephant”.

Only setting the Vim $TERM means tmux isn’t set up for 256 colour, so Vim tries giving 256 and fails. Only setting the tmux one means tmux is loaded for 256, but due to the .bash_profile changing the $TERM, everything thinks tmux is only prepped for 8.

Really, you shouldn’t be trying to override $TERM in .vimrc - because now, whenever you open Vim it thinks it’s in a terminal that has all the screen-256color features. This is only the case when it’s being opened in tmux!

Really really, you shouldn’t be overriding tmux either - if you make sure your terminal (in this example, OS X’s Terminal) has its $TERM set correctly, tmux will read it, see it can handle 256 colours and automatically set its $TERM to screen-256color.

The best solution would be to gently reassure the terminal that it is in fact an Elephant after all. Which probably means nixing a rogue export TERM=xterm-color from your .bash_profile.

TL;DR echo $TERM in a new terminal. If it doesn’t say xterm-256color, go look at your .bash_profile and wipe out any evil export TERM=.

How to webcam on Linux

I wanted to keep an eye on a family pet when I was out of the house, so I installed Debian on an old box and set about investigating the weird and wonderful world of webcams on Linux1.

I stuck both single and multiple webcam configs on GitHub.

My requirements

  1. A webcam that takes a photo on detecting motion.
  2. Keep an archive of past photos2.
  3. The most recent photo is uploaded to my server.

What I did

After a bit of Googling, I decided on a piece of software called webcam. Here’s how I got up and running:

Install webcam

apt-get install webcam

Create a .webcamrc

touch ~/.webcamrc

Example .webcamrc
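The original config hasn’t survived here; the webcam tool (from the xawtv suite) reads an INI-style file with [ftp] and [grab] sections, along these lines (all values below are placeholders):

```ini
[ftp]
host = example.com
user = webcamuser
pass = secret
dir  = public_html/webcam
file = webcam.jpg

[grab]
device = /dev/video0
text = "webcam %Y-%m-%d %H:%M:%S"
width = 640
height = 480
delay = 3
```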

Run webcam

webcam
Assuming you don’t get any errors your webcam should now be watching for changes. You can view the latest photo at and a local archive can be found at ~/webcam.

On investigating webcam software that supported multiple webcams I discovered Motion, which seems to be the go-to webcam software for Linux that does all the things. After a day of messing about with it I got it working but found it did way more than I needed it to. I’ll stick my config on GitHub when I get the chance.

EDIT (1/10/13) My Motion config is now on GitHub.

Things I need to add to this post: ssh-agent + ssh-add; how webcam resolution is dependent on whether you’re using one USB bus for multiple webcams; managing your local archive with cron.

  1. When I was younger in the days of Mandrake Linux I did try and get various webcams working, to little success. 

  2. You can do loads of cool things with an archive! Shall experiment with ffmpeg

Fish marinade thing

At home I eat a fair amount of meat, which isn’t particularly good for you. I’m trying to wean myself off by eating more delicious fish. Here is a recipe I invented for the barbecue but imagine would work just as well in the oven.

Fish marinade thing

Fish marinade thing recipe

Redirect common URLs

I’ve previously pointed to thoughts on URL design. I am of the school of thought that a user should be able to navigate and understand your website’s hierarchy from the URL alone.

Continuing on this theme, your website should redirect common URLs.

For example, if a user visits a URL they’ve guessed for your website’s Contact page, but your Contact page actually lives somewhere else, the user is going to have a bad time and probably a 404 for their trouble.

The solution is to redirect common URLs to their actual locations1.

Here are a bunch of URLs that I would expect to redirect to their respective locations:

  • (or /terms)
  • (or /signup, /register)
  • (or /login)
  • (or /logout)
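How you implement the redirects depends on your stack; as an nginx-flavoured sketch (all paths below are illustrative — adjust both sides to your site’s real structure):

```nginx
# Send common guesses to the canonical location
location = /terms    { return 301 /tos;    }
location = /register { return 301 /signup; }
location = /signin   { return 301 /login;  }
location = /signout  { return 301 /logout; }
```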

I appreciate the above sounds like common sense but you’d be amazed at the number of websites that don’t do this2.

  1. You could argue that you should be using these common URLs to start with but, you know. 

  2. It’s at this point I try to play click-the-link when the webpage has infinite scrolling — the rage. 

Thoughts on what I do

This morning I was on my way to work, sat on the train tapping away at some code when an elderly gentleman sat next to me.

At that moment I was completely immersed in what I was doing, in this case trying to figure out why a .json file wasn’t being parsed correctly.

I was getting a direct train but I occasionally change halfway. Distracted by Vim, I made the mistake of forgetting which train I was on and got up to change. I quickly realised my mistake and, feeling like a fool, sat back down next to this gentleman, who had now taken my previously held window seat.

When I reopened my laptop, the gentleman (who I’ll call Joe) struck up a conversation. He told me that a few months ago he bought a MacBook after many, many years of using Windows.

I asked him why he decided to switch to a Mac after years of using Windows and he told me that his son and daughter-in-law both use Macs and, after having a go, he found them to be more user friendly. That said, he went on to explain that he was still getting used to the OS X operating system.

What was remarkable about talking to Joe was his appreciation of good user interface. He was genuinely excited by OS X and there were times where he dropped to a guilty whisper as he told me something he found particularly brilliant, as if you shouldn’t feel such delight in a user experience.

My motivation for doing what I do is the opportunity to craft experiences that delight. A delightful user interface should not be an exception but the norm.