Thomas James Just a geek.

Technology, Gadgetry, Photography and Software Development


Latest Posts

PC Engines APU - Home router replacement


I haven't been much of a fan of off-the-shelf home routers for a long while now, always opting to build a Linux box of some sort to do the job. Going back, this used to be an old white-box PC with a couple of network cards, and until recently it was a DreamPlug plug PC. The DreamPlug was starting to have power issues, so it seemed best to proactively replace it.

A colleague suggested I take a look at the PC Engines ALIX and the (then) in-development APU. The ALIX sounded interesting, but given that the APU was right around the corner I decided it was best to wait for the new model. Once the APU was available for sale I ordered the Voyage Wireless Kit, as this included everything I needed (enclosure, SSD, wifi, etc., excluding the PSU). Being shipped from Hong Kong, I hoped it would arrive quickly, which it did thanks to SpeedPost.

As an aside, I choose to run a Linux-based router for two main reasons. First, I like being able to script and control the configuration of the router without having to use the generally very crappy web-based UIs that most ship with. Secondly, I can manage the router as I see fit and am not limited to the device's provided feature set (such as multiple IP-over-IP tunnels), which usually won't change once released. With Linux I can upgrade as new releases occur.

At the heart of the APU is a low-power AMD x86/x64 CPU, unlike the ARM-based DreamPlug. This means that all the regular Linux distributions are suitable without needing anything to specifically target the device. I chose to use Debian.

Internal view of the APU board

Shown left to right: GSM mini-PCIe, wifi & 16GB SSD (mSATA). There is also a SIM card slot on the underside of the SD card reader, wired up to one of the mini-PCIe slots.

The Voyage kit includes everything needed, except a suitable PSU, with some assembly required. For the PSU I found that an old Apple AirPort Extreme PSU works perfectly.

I purchased a USB serial adapter & a null-modem cable to make installing Linux a bit easier, as I wasn't familiar with the device. This is necessary as no form of display-out is provided on the device. I also purchased the Serial app for the Mac to make the software side of the terminal emulation easier.

Linux Installation

The installation tripped me up and had me scratching my head for a bit, trying to get the Debian live CD to work with the APU.

The live CD needs to be configured to output to the serial console, using the correct speed for the APU. To update the live CD configuration, I found that using UNetbootin on Windows (not Mac, if you want to stay sane) was the easiest way to get a writable USB stick on which to update the config files. Alternative suggestions are very welcome.

I found that updating the following successfully allowed the console to be used for display output and a normal Linux install to proceed. Once installed and networking was set up, everything else was performed over SSH.

The PC Engines forum was quite useful for getting this right, the post 'APU + Ubuntu 14.04 LTS - install via serial console' in particular.

In syslinux.cfg, include the following at the start of the file:

# D-I config version 2.0
SERIAL 0 115200 0

Update the kernel entry to include console=ttyS0,115200n8 like so:

label unetbootindefault
menu label Default
kernel /isolinux/rescue64
append initrd=/isolinux/initram.igz rescue64 scandelay=1 console=ttyS0,115200n8 -- rescue32 scandelay=1

In isolinux/isolinux.cfg, include the following at the start of the file:

# D-I config version 2.0
SERIAL 0 115200 0

Linux Setup

I needed to do a couple of things to complete the install on Debian, taking some cues from 'Debian installer USB Stick for PC Engines APU board with mSATA drive'.


From pcengines-apu-debian-iso / profiles / apu.postinst:

cat >/etc/sysctl.d/apu.conf <<EOF

Ensure the serial console is used by grub

Edit /etc/default/grub, update:

GRUB_CMDLINE_LINUX_DEFAULT="quiet console=ttyS0,115200n8"

Then run (per the instructions in the file):

update-grub
Enable non-free package source for apt

Edit /etc/apt/sources.list, ensuring that non-free is included. Then run apt-get update.
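For example, a sources.list entry with non-free enabled might look like this (the mirror and release name will vary with your setup):

```
deb http://ftp.debian.org/debian wheezy main contrib non-free
```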

Install device firmware

Run the following for the ethernet & wifi firmware:

apt-get install firmware-realtek
apt-get install firmware-atheros

Install sensors for temperature monitoring

apt-get install lm-sensors

Final Thoughts

The APU is a very capable little unit, and it has been running stably for over a month in a low-airflow area. I originally had some concerns about heat dissipation, but the unit remains at about 52°C most of the time.

I've still yet to set up the GSM device as a backup network link; hopefully that will be a future post.

This Gist includes output from a number of common Linux utils that provide hardware information, if that's your thing.

DocPad & a new layout


I've been wanting to redesign the layout & theme of this blog for quite some time; it was dated. Faced with the task of developing a custom WordPress theme, I looked at other options. Static site generators that use existing templating libraries & markdown sounded like a good way to go, and allowed me to avoid having to learn to create a WordPress theme at the same time. The frontrunners I considered were Jekyll, Octopress & DocPad.

I decided to go with DocPad.

Although I chose not to use Jekyll, I found Hadi Hariri's article 'Migrating from WordPress to Jekyll' quite helpful in planning out the move to DocPad. It gave me a number of things to consider, much of it applying to DocPad, including migrating existing content to markdown, which can be consumed almost as-is by DocPad. Some manual clean-up of the HTML-to-markdown converted text was required.

Overall the move to DocPad was a pretty smooth one, though plugins are required to achieve much of the like-for-like behaviour I wanted from the WordPress blog. This included clean URLs with the publish date in the path, tag/category listing pages & generating RSS feeds. Most plugins work as expected, with varying options for customising to your liking.

Tweeting links to posts has become a manual task for the moment though... all in good time.

Game of Life as an i386 kernel


I wrote this: Game of Life implemented as an i386 kernel.

After reading through the excellent Kernel 101 – Let’s write a Kernel post by Arjun Sreedharan, I was motivated. I thought it would be interesting to see how difficult it would be to implement Conway's Game of Life simulation as an extension to what I'd already done as part of the tutorial.

I set myself some goals: it needed to be interactive & visual.

The kernel implements screen writing, cursor position updating and polling IO for the keyboard.

The screen is updated using the mapped memory method outlined in the tutorial, while the cursor position and the keyboard use C inline asm to read/write to the necessary ports.

I didn't go to the extent of implementing an IRQ-based keyboard driver, as I just wanted to do the minimum to get a working interactive simulation. The simulation also uses a fixed-size array that's allocated as part of the program, rather than attempting to allocate and manage memory. This certainly limited the data structures available and forced me to really think about the most basic algorithm to get the job done.

It's not pretty, efficient or bug-free, but it was a fun (re)learning exercise for a weekend, taking me back to my uni days learning assembler, although much of what I learnt back then I had clearly forgotten.

Some thoughts on where it could go next:

  • interrupt driven keyboard driver
  • scancode to key events & translation into ASCII
  • memory management
  • user driven initial game state (eg using the arrow keys to mark squares as initially live then triggering the simulation run)

Doing the kernel development on a Mac also posed some additional challenges in being able to successfully compile and link the kernel, as it was proving impossible to get the GNU linker working on the Mac. I found that setting up a Vagrantfile and running Ubuntu in a virtual machine to perform the build was the easiest way around the limitations I ran into.