Time-lapse Photography: Pi Studio

[Article in Beta v0.5]

At the weekend I returned to Bristol Digimakers after missing events in 2017 due to badly timed holidays. I've written about Digimakers numerous times before and am a huge supporter of this Bristol University student-run tech event. Every time I go I love discovering new, cool, interesting, practical, and not-so-practical things one can do with computers. Normally I take a Zumo robot along with various bits and pieces (typically a selection of Raspberry Pi HATs). This time it was time for something different, and hence I presented Raspberry Pi Lightbox Studio. This is an attempt to create a low-cost photo studio in the home / bedroom / kitchen for an all-in price of under £100 (including Pi and camera).


I've been keen on trying time-lapse photography at the small or macro scale. An example: I have a tub of Science Putty which is fun for all of five minutes as one pulls it into shapes and then starts to watch it collapse back to its original tub-shaped state. I say "five minutes" because that is about the limit of my attention span as it is a rather slow process. Enter time-lapse photography to the rescue.

Time-lapse photography is a way of compressing time: a long duration is viewed as a video in a shorter period which has the result of accelerating the rate of change in a scene. Most of us have seen examples like this in films and on YouTube. One way to think of this is the exact opposite of Bullet Time featured in The Matrix, which is a pretty advanced technique that gives the appearance of slowing time (while introducing 3-dimensional camera movement). If Bullet Time (with the help of CGI) makes things appear to move s-l-o-w-l-y then time-lapse makes things move more quickly. Ahah! I do not have to watch my Science Putty collapse in real-time. I can create a video that accelerates the process.

To achieve the effect we take many individual photographs and then stitch them together into the video. Normal videos usually run at around 24 frames per second. For comparison, to watch ice melt I take one photo every 15 seconds (your setting may vary). It will therefore take 15 x 24 = 360 seconds (or 6 minutes) to capture the 24 frames required for one second of footage. We are therefore accelerating time 360 times! Ice will melt in front of our eyes.
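The arithmetic above is easy to check with a quick calculation (a sketch; the 15-second interval is just the example setting):

```python
# Speed-up factor for a time-lapse: one photo every 15 s, played back at 24 fps.
interval_s = 15        # seconds between photographs
playback_fps = 24      # frames per second of the finished video
speedup = interval_s * playback_fps  # real seconds compressed into 1 s of footage
print(speedup)  # 360, i.e. 6 minutes of real time per second of video
```

Change interval_s to suit your subject: slower processes want longer intervals, which push the speed-up factor higher.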

To create a time-lapse effect you need a way to reliably take multiple photographs. Assuming you won't do this by hand (oh the boredom!) you need to be able to program your camera to do this process automatically either with an intervalometer or via the settings on some modern cameras. This is where the Raspberry Pi comes in. Coupled to a Pi Camera (I'm using v2.1 but this will work with v1 as well) we have a very fine degree of control over the creative process of creating a time-lapse movie as the raspistill software features a time-lapse mode.
Assuming you have followed the Raspberry Pi Foundation's camera software set-up guide, the following steps will take you through the process:

  1. Select target and compose scene
  2. Take sample photo
  3. Re-adjust
  4. Re-take sample photo and check all is well
  5. Run raspistill to capture individual photos using the time-lapse option
  6. Stitch the images together into a movie
  7. Watch and enjoy.

You may want to use a stand-in for your target when setting up. For example, for a time-lapse of an ice cube melting use a d6 dice as a temporary target (or a whisky stone) and keep the ice in the freezer until needed. Otherwise you will miss some of the action.

Several challenges present themselves when creating a good time-lapse, and each must be overcome. I* broadly split these into:

  • Subject
  • Lighting
  • Photographing
  • Stitching

* - other issues do exist, but get these four sorted and you'll be a long way towards creating excellent time-lapses


Subject

Essentially, you have two choices: go big or go small. If big (e.g. clouds) then you will have less control over lighting (see below) but can create amazing views of our planet. If small then you need to be aware of potential close-up focus issues, as the Pi camera struggles to focus below 20cm by default. This can be corrected, but do read several sources of information on this first to avoid damaging the camera. YouTube has a good video showing the process. For those not liking the combination of pliers + delicate camera, Adafruit have the answer. Or, for your first attempts, just don't worry about it (I didn't). Does an ice cube melting need to be in perfect focus? I'll leave that to you to decide.

You should position the Pi Camera with a good view of your subject, and take sample photographs to ensure all is well before starting the time-lapse process. It is essential that you do not nudge either camera or subject in any way during the image capture process or the effect will be ruined. Motion time-lapse, where the camera rig is intentionally and automatically moved along a rail between frames, is a whole subject in its own right, and when perfected can create some staggering results.


Lighting

One of the banes of time-lapse photography is flicker. This is created when the lighting between frames differs slightly and/or the camera changes its exposure. With the controls of an SLR camera (such as the excellent Nikon D7200 that I use when not Pi'ing) the latter can be resolved by using manual exposure and white balance settings. There are several ways to adapt to changes in lighting. You could, for instance, embrace lighting changes in your project: time-lapses of clouds during the day -> evening -> night often do this. Alternatively, for controlled indoor scenes (Science Putty) you can use a lightbox that provides a controlled, consistent lighting array (ideally one that minimises shadows).

I am using a £12.99 bit of kit from Puluz. This lightbox has two LED strips top-front and top-back that provide even lighting, though do be aware of bright sunshine from the front as that can still change the light level. From Puluz's promotional images, such as that below, you can see that an added benefit of this lightbox is that a circle can be removed from the top to take photos from above the subject. As a piece of kit it is actually a lot better than its meagre price tag would suggest. Better lightboxes are definitely available, but at this price point it is worth giving it a go. The two LED arrays with supplied cables (each requires plugging in - use two USB ports on your Pi 3) are bright and provide even lighting throughout the box.
Puluz Lightbox


Photographing

When your scene is set use the raspistill command to start taking photographs. Now, go make a cup of tea (or several) as the process will take a long time. Remember: we are accelerating time in video, not in reality (not even the Raspberry Pi can do that...yet). ;)

The guide on time-lapse photography from the Raspberry Pi Foundation is very handy here. Assuming you have updated your Pi to the latest firmware and enabled the camera then the key commands to enter at the terminal prompt are a variant of:

mkdir tl
cd tl
raspistill -t 30000 -tl 2000 -o image%04d.jpg

First we need to make a directory (mkdir) in which to store our time-lapse photographs, as the camera will create a lot of them and hence make a bit of a mess in whichever folder you are currently in. We change directory (cd) into the tl folder and then run raspistill with parameters to output images:

The -t switch is the total time to run the time-lapse for: in this example 30000 milliseconds, which is 30 seconds.
The -tl switch is how often to take a photograph: 2000 milliseconds (2 seconds) in this example.
The -o switch is the image name to save files to disk with, %04d being a numerical increment zero-padded to four digits. This will create a sequence of images named image0001.jpg, image0002.jpg, image0003.jpg, etc.

You should vary the -t and -tl settings as needed; some experimentation is required here. To melt a large BB-8 (from Star Wars) ice cube I used -t 14400000, which is a lot of milliseconds! This equates to 4 hours (1000ms x 60s x 60m x 4h). I decided to take a photograph every 15 seconds, setting -tl to 15000 to do this. This generated 960 images (14400000 / 15000) and required about 4GB of space on my SD card.
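Those numbers are worth sanity-checking before committing four hours; a quick sketch using the settings above:

```python
# Frame count and footage length for the 4-hour ice cube time-lapse.
duration_ms = 14_400_000   # the -t setting: 4 hours in milliseconds
interval_ms = 15_000       # the -tl setting: one photo every 15 seconds
frames = duration_ms // interval_ms
footage_s = frames / 24    # stitched and played back at 24 frames per second
print(frames, footage_s)   # 960 frames -> 40 seconds of finished video
```

A useful habit: work backwards from the length of video you want, then choose -t and -tl to give you the required number of frames.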

Allow the time-lapse to run to completion. Note that if a frame or two ends up corrupted / showing waving fingers in front of the lens then this can be corrected by deleting the offending image, copying its next nearest neighbour and renaming to fill the gap. This does result in two duplicate frames, but at a rate of 24ish per second no-one will likely notice.
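The delete-and-duplicate fix can be scripted rather than done by hand. A minimal sketch (the filenames assume raspistill's image%04d.jpg pattern from above; the function name is my own):

```python
import os
import shutil

def fill_gaps(folder, last_frame, prefix="image", digits=4, ext=".jpg"):
    """After deleting a corrupted frame, copy the previous frame into the gap."""
    for i in range(2, last_frame + 1):
        name = os.path.join(folder, f"{prefix}{i:0{digits}d}{ext}")
        if not os.path.exists(name):
            prev = os.path.join(folder, f"{prefix}{i - 1:0{digits}d}{ext}")
            shutil.copy(prev, name)  # duplicate the neighbour; barely visible at 24ish fps
```

Delete any frames showing waving fingers, run this over the tl folder, and the numbering gaps are filled so the stitching step sees an unbroken sequence.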


Stitching

When all the images have been created ... TBD
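While that write-up is still TBD, the stitching step typically hands the numbered frames to an encoder such as ffmpeg (sudo apt-get install ffmpeg). The sketch below builds a representative command; the flags are illustrative, not necessarily what I used:

```python
# Illustrative only: build the ffmpeg command that stitches image0001.jpg,
# image0002.jpg, ... into a 24 fps H.264 movie. Run from inside the tl folder.
import subprocess

cmd = [
    "ffmpeg",
    "-framerate", "24",        # treat the stills as 24 frames per second
    "-i", "image%04d.jpg",     # matches raspistill's -o pattern
    "-c:v", "libx264",         # H.264 encoding (the processor-intensive bit)
    "-pix_fmt", "yuv420p",     # widest player compatibility
    "timelapse.mp4",
]
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually encode
```

The encode is what hammers the Pi's processor, hence the frame-per-second figures quoted in the kit list below.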

The end result

Here is a movie... TBD

Kit needed

  • Raspberry Pi 3. Other Pis can be used; however, the encoding process to stitch the individual frames together is processor intensive. Even the Pi 3 only manages just over 1 frame per second, in contrast to my 2014 MacBook Pro which regularly hits over 9 frames per second.
  • Official Raspberry Pi Power Supply. With the Pi 3 doing some serious processing during stitching, plus the juice required for the two LED arrays on the Puluz and the camera, a decent PSU makes sense. Note that kits are available that combine the Pi, PSU and SD card (with NOOBS pre-installed) for a good combined price. If you want to keep the lid on your Pi when the camera is attached, do look for a kit with a suitable notch, such as the Pimoroni Pi 3 Essentials Kit.
  • Raspberry Pi Camera v2.1 (or v1 if you have one handy).
  • High capacity SD card. I use Integral cards. 16GB should be considered a minimum if you intend to use the card to save photographed frames and the stitched movie. Alternatively consider a 32GB SD card or an external device such as the Kingston 32GB (or higher) USB 3 external drive.
  • A mini tripod or similar to stabilise the camera.
  • ...for which you'll need an adaptor. I found I did not need a bolt to hold it onto the tripod nut, despite what reviews state. Note that Blu-Tack should probably be avoided as a substitute, as it tends to give way slowly, which could be noticeable as the angle of your camera gradually changes over a number of frames.
  • Something to raise the Pi up a bit. I found that the limited camera cable length meant that I needed to get my Pi closer. I had one of these Clingo things lurking around and it does the job well. Alternatively use the boxes the Pi and Camera come in, or a longer ribbon cable does the trick; Pimoroni sell these.
  • A keyboard and mouse for the Pi. I suggest looking at Rii's offerings. I use the Rii i8, available in RF (Radio Frequency) or Bluetooth variants, for when I need to enter the occasional command into the Pi. Note that the i8 is not suitable for prolonged typing.
  • A monitor to preview and view your movies. Or work headless by connecting your Windows / Mac laptop to the Pi and using it purely to acquire images.

A headless setup used to mean that you could not view a live preview of the Pi Camera's output, even with VNC. Fortunately recent versions of RealVNC, bundled with the Pi, solve this problem; RealVNC provide a good write-up on their website. However, if live previews are not important then a purely command-line experience works well: you can always download the files to your computer (using SFTP in Filezilla, Transmit or similar) and view as needed.

I'm going to call out the Manfrotto PIXI EVO 2 that I bought as perfect for the job. With a claimed maximum load of 2.5kg (plenty for a 750g camera body) it doubles as a table-top tripod for holidays. At about £35 it is something of a bargain given the features, which include easy head-angle adjustment and the ability to operate at a variety of heights. Being able to alter the height and angle of the head is a must, given you will want to target your subject precisely.
Manfrotto PIXI EVO 2 mini tripod

Pro tip: the Pi Camera comes with a little sticky protector tab over the lens. Do remember to remove it before starting to capture images! Also, keep it to use as a mini lens protector if transporting / storing your kit.


Subject ideas for time-lapse photography


Zumo George gets upgrades (part 1)

Everything eventually needs an upgrade. You may think that pencil 1.0 was great, but if Apple has taught us anything: we all need pencil 2.0. I jest, although that said it is time for Zumo George, one of my Raspberry Pi robots to receive the 2.0 make-over. This is brought on by two things:

Previously I had thought of upgrading from the Raspberry Pi A+ to a Zero purely to save some space, enabling me to get a bit o' real estate back, as George measures but 10cm x 10cm. However, I would still have the WiFi dongle a-dongling, only it would be dangling from a micro-to-full-size USB adaptor. Dongles dangling from dongles (there's a song in there somewhere) made me sad: "if only a variant of the Zero came with WiFi", I thought. Fantastic news, Pi fans: the Foundation delivered.

The Raspberry Pi Zero W is essentially a Zero (same CPU, same RAM, same form factor) with the added bonus of a combined WiFi and Bluetooth chip. Also for our inner geek the Foundation has included the coolest antenna I've seen yet which features a triangular resonant cavity. The MagPi magazine covered the antenna in detail just the other day in Issue 55. Proant, a Swedish company, have licensed the tech to the Foundation.
The MagPi, Issue 55. Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported (CC BY-NC-SA 3.0)

Given the move to the slimmest of Raspberry Pis it is also time to move from the Pimoroni Explorer Pro to the Explorer pHAT. This half-pint-sized board has many of the features of its larger sibling and is a perfect match for the Zero W.

Putting it all together, here is the collection of parts:

Zumo George Pi Zero W upgrade

Any observant bod will quickly notice something missing. Yes I hang my head in shame and join the "forgot to order a 40-pin header for the Zero" club. D'oh! eBay quickly to the rescue. Given this tiny omission the build is on temporary hold for a few days. Still, let's get the blade in place because sumo blades == awesome. While we're at it let's have a preview of where the Zero is going to go. With all ports along one long edge I can now have these poking backwards from George. You can also see the extra space I am gaining from the move to the Zero W from the A+.


I am expecting great things from the sumo blade and am already thinking about how to modify my BDD behaviours and code to take advantage: Zumo George shall no longer retreat in fear from Cartmanzilla.

Stay tuned for Part 2, entitled: "Ahah the header has arrived!"

PS: yes those wires are going to get significantly shortened ;)

It's MeArm Pi on Kickstarter

Kickstarter can be a wonderful place to support great new ideas. One project that has sprung up and captured the hearts, minds (and pledges) of folk is MeArm Pi from Mime Industries. Following on from the very successful original MeArm robot arm, Mime are presenting something great to the Raspberry Pi community. The project has already smashed its £10k goal with almost £47k pledged at the time of writing. Doing the maths on the pledges, that represents 844 arms at present. That's a lot of robotic hands to shake! Best of all: you still have until 6pm on March 8 to support the project and acquire your very own robot arm.

Mime describes MeArm Pi as "easy to assemble and not requiring extensive knowledge of electronics, the MeArm Pi STEM kit helps kids and adults learn robotics and teaches them how to code." That's cool. Very cool: robot arms are fun, programming is fun, and programming robot arms is twice the fun.

MeArm Pi

I briefly interviewed Ben Pirt, joint founder of Mime. His passion for the new MeArm is clear: a desire to create a functioning robotic arm platform that simplifies the construction process enormously.

CD: What was the motivation to change the design of MeArm for MeArm Pi?

Ben: "The first MeArm has been built thousands of times (including a fair few times ourself!) and we wanted to broaden its appeal and try get even more children involved in making and programming it. So we decided to look at which parts of the build were particularly difficult. The number of screws came out as a big issue that was catching people out so we tried to re-work the design wherever possible not to need screws. Now the only screws left are on the joints where two pieces hinge together. The grip had a major re-work (from 9 screws down to 1) which made it much simpler to build."

It's worth pausing and considering this: the number of screws and fiddly components in a build really can influence the complexity and hence accessibility of the product. When I received the Maplin robot arm for Christmas a few years back I spent several hours putting together gear boxes, ensuring all was aligned and assembling the thing. While highly enjoyable in its own way (who doesn't like to build things) it was also frustrating: that's a lot of components to assemble *just* to get a fairly simple robot arm up and running! Mime's keen attempt to solve this build complexity problem is admirable.

Once I had built the Maplin arm I wanted to program it using a record and playback mechanism in Python. It was at this point I hit a few snags as precision playback just isn't easily possible with normal motors, and again it looks like MeArm Pi has overcome this issue.

CD: How accurate are the servos with MeArm Pi, i.e.: can you reliably pre-program repeatable movements?

Ben: "The servos are pretty accurate - they use metal gears for extra reliability. The big difference from the Maplin arm is that servos can be relied upon to be nicely repeatable so you can program them to do things again and again. Servos won’t drift out of calibration like motors."

Having non-drifting motors sounds like a dream come true! Don't get me wrong: I love the Maplin arm and easily recommend it to everyone as a low-cost way to get into robotics on the Raspberry Pi. Now though, Mime are offering a viable alternative that combines the hardware with ease of programming. Talking of programming, I asked Ben what else makes the MeArm so great:

Ben: "I think there are a lot of things that make the MeArm Pi better than the Maplin arm:
  • children build it themselves so they get a better understanding of how the mechanics works
  • the motor control is easier from the Raspberry Pi and can be programmed in any number of programming languages
  • the software is better and more suited to beginners"

It's worth noting that supplying purpose-built control software to get up and running quickly is a great idea: it's what makes projects like Pi-Top so readily accessible, for instance. Software for the Maplin arm does exist: we covered this in earlier issues of The MagPi a couple of times; however, it involves getting one's head around the internals of the USB protocol, and while learning about USB Vendor IDs is "fun" in one way it certainly isn't conducive to encouraging people new to robotics into the hobby.

Ben also tells me that the age range of the arm is "officially...11+ but with some parental supervision it can be built by as young as 8 or 9 without too many problems." Producing a product that is interesting and accessible to age groups from primary to adult is a great achievement: "We believe in helping children to have fun whilst learning about technology and the MeArm Pi is completely designed around that goal". Superb.

It seems that MeArm Pi is not the only product that Mime are looking at for the future:

Ben: "This is the first new product from Mime Industries since we formed the company. We’re going to be taking another look at updates to Mirobot as well as rolling the improvements to the MeArm mechanical design over to the other versions. We’ve got lots of ideas for new products but you’ll have to stay tuned for those!"

And stay tuned I most definitely shall.

MeArm Pi, available for another 6 days on Kickstarter.

MeArm Pi

It's all gone quantum at Digimakers

Last Saturday I had a great time at Bristol's Digimakers. I regularly attend this superb event, running a stand and getting the opportunity to talk computers and science with children, parents and teachers. This time around I focused on Behaviour-Driven Development (which I've covered before) with a side order of LED and ePaper displays for the Raspberry Pi and Pi Zero from Pi-Supply.
Digimakers June 2016

Several organisations and lots of students ran demonstrations, workshops and drop-in help sessions throughout the day. This is something especially neat about Digimakers: it's not focussed on a single technology as the supposed solution to all scenarios, but instead showcases lots of complementary technologies. We had Raspberry Pi, Arduino, custom things that I don't quite understand and more besides all used as the basis for a number of very interesting projects.

The computer science and engineering students from the University of Bristol continue to impress. Anthony really hit the nail on the head with his sound wave generator, which produced a fantastic musical accompaniment for the day when hooked up to Apple's Logic Pro X. If you're reading this and looking to hire an audio engineer then he definitely deserves the job!
Andrew's marvellous musical vibrations

Directly opposite was Matthew Hockley with a swarm of cute robots running a simple algorithm in which proximity to their neighbours triggered different light patterns. We talked about how us fallible humans like to anthropomorphise whenever given the chance, and I postulated that the random movement of his swarm would be seen as "good" or "evil" if he put green smiley faces or red angry faces on top of each robot. Matthew agreed that we do tend to read more into such critters than is deserved, as they're not really responsible agents; as Alan Winfield notes in his excellent, accessible book, Robotics: A Very Short Introduction, even Asimov's Three Laws were just a plot device and not something to base a real robot on.

They appear to be benign, but if you look closely you can see them plotting world domination.

Students and a teacher from Cotham School were back with their arcade cabinet, and this time also had two "Mini Me" versions (as I like to think of them) present. Sadly I forgot to get a photo, but these proved extremely popular. I think the brief goes along the lines of: "yes, you can play computer games at school providing you program those games." It's a great idea, very well executed.

Talking of schools: I had a great chat with Stewart Edmondson, CEO of the UK Electronics Skills Foundation. They believe absolutely that teaching software is not enough and that kids should be getting hands on experience of electronics. I wholeheartedly agree! As I started secondary school in the 1980s I caught the last of the software-related computer lessons before "IT" became "ICT" with the "C" somehow (apparently) meaning "Word & Excel". However I never learnt electronics in school and feel very much I'm enormously behind the learning curve here. Although I've built my own circuits, read lots of tutorials in books and The MagPi magazine and bought and experimented with stacks of components it all does feel very unstructured, as though I am missing the fundamental underpinnings that school ought to have taught me. There is a huge benefit to learning things when your brain is still wired to absorb knowledge like a sponge. At Digimakers they brought along an electronics project kit called MicroBox to get those brain cells firing and this proved very popular.

Ok, so what has all this to do with the title of this post? One of the workshops focussed on Quantum Computing for kids (yes, you did read that right!) While I unfortunately was unable to get away from my stand for long enough to listen in I had a wonderful conversation with a 14 year old girl who popped over afterwards. It started in just the way you don't expect a conversation with a teenager to start: "I'm off to Google to study quantum computing as a way to break ciphers." We then conversed about such things, including a detour to discuss the shape of the universe and the relative sizes of different infinities, the difference between passive and active hacking (which, fortunately she is very aware of - this difference needs to be taught in schools!), that she'd spent the morning learning about ciphers in Python in one of the sessions and that she's already up to speed on inspecting web elements and the like... Awesome. This was the highlight of the day for me.

The next Digimakers is on October 29th at At-Bristol. If you are planning on attending you should register in advance as this event is very popular.

Zumo George avoids Cartmanzilla at CukeUp!

On Thursday 14th and Friday 15th April I went to CukeUp! London 2016. The Behaviour-Driven Development community met for two days to share ideas and skills relating to my favourite delivery methodology. The event was fantastic and on a par with last year. Inspired by day one, and with an open slot to deliver a lightning talk lasting just five minutes, I set about writing a presentation describing Cartmanzilla versus Zumo George... at 2am... after a few beers and a rather tasty gin. The following morning I re-wrote much of my talk to eliminate 95% of the whiz-bang transitions that had somehow crept into several slides (not sure how that happened). For some reason I had also thought a clearly marked slot for a "5 minute" talk was 10 minutes in duration (proof positive that gin slows the passage of time), and quickly edited it again after confirming with Matt Wynne that 5=5 and not 5=10. Still, overall I managed to get 80% of my message across in just five minutes.

Skills Matter, who hosted CukeUp! have kindly put a video of my talk online. You will need to register (painless and quick) with Skills Matter to view it.

A quick recap: Cartmanzilla the monster has invaded robot city and the plucky little robots have to keep away from him. Only Zumo George is programmable, and his general behaviours (keep away from monsters) are determined by feature files that contain behavioural specifications written in the Gherkin syntax of BDD:

Given [a precondition]
When [an event]
Then [an outcome]

Each line of the Gherkin causes a related block of test code to be executed, and when every line of test code passes your software is green, i.e.: the behaviours are working as expected.
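The mapping from Gherkin lines to test code can be sketched in a few lines of Python (illustrative only; real tools such as Lettuce or Cucumber do this with decorators and far more care):

```python
import re

steps = {}  # compiled pattern -> step function

def step(pattern):
    """Register a function to run when a Gherkin line matches the pattern."""
    def register(fn):
        steps[re.compile(pattern)] = fn
        return fn
    return register

@step(r"the distance to the monster is (\d+)cm")
def set_distance(cm):
    return int(cm)  # a real step would drive the robot/test state here

def run_line(line):
    """Find and execute the step matching a line of Gherkin."""
    for pattern, fn in steps.items():
        match = pattern.search(line)
        if match:
            return fn(*match.groups())
    raise LookupError(f"no step matches: {line}")
```

Here run_line("When the distance to the monster is 16cm") finds set_distance and calls it with "16"; when every executed step passes, the scenario is green.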

I've covered Zumo George and the use of BDD with this robot over a few prior posts, including a specific write-up of Cartmanzilla vs Zumo George at Bristol Digimakers. What is interesting as I read back over previous posts, and I noted this in my talk, is that my first attempts to write scenarios were essentially attempts to describe the functional aspects of George, whereas my later attempts are closer to the behaviours that I originally envisaged: sneaking towards the monster when he's not paying attention and fleeing when the monster gives chase.


George visits Digimakers to learn about whiskers

I spent yesterday at Bristol Digimakers, having a fantastic time meeting lots of young people who had come along to the event to learn more about coding, robotics, Minecraft, robotics and robotics. There was definitely a theme going on. Digimakers has grown to be the place to go to get hands-on experience of hacking and making. Backed by the University of Bristol (kudos to the ever-energetic Caroline, who does a great deal of the organising) and supported by a host of students and other individuals running their latest coding and hardware inventions, a great vibe could be felt all day long.

As usual at Digimakers I set up a table with various demonstrations using Raspberry Pis, mainly focussed around Zumo George, my Behaviour-Driven Development robot. This time around I also included some Crafty Robots, a Hexbug Ant and Cartmanzilla.


Cartmanzilla towers over the city. The little robots wonder how they will escape.

The aim of my table was to present two concepts. Firstly, programming robots by defining behavioural outcomes (a right-to-left approach, for example Event Storming) rather than a list of functional requirements (a left-to-right approach that may not lead to the desired outcome) allows non-technical people to be more involved in the creation of the robots that will share their environment. I've written about BDD with Zumo George before. Distilling the essence of BDD (conversations that discover outcomes of value that enable us to write tests that drive code) down to something easily digestible by youngsters proved challenging, but in general most seemed to understand. I think this was helped by having a working demonstration: Zumo George was given the behaviour of "don't get caught by Cartmanzilla", which in practical terms meant using his inbuilt IR sensor to retreat from Cartmanzilla when he approached, and to advance when Cartmanzilla retreated (all over the top of a lovely cityscape given to me by the great Tim Cox).

Secondly, I wanted to explore the idea of how prey avoids predators (and how predators catch prey) by looking at three different robots:
  • Crafty Robot just moves randomly and cannot deliberately react to external stimuli (it can, however, sometimes bounce off things it bumps into)
  • Hexbug Ant has bump sensors front and rear and therefore can run away from anything it touches.
  • Zumo George can sort-of see (via his infrared sensor) what is in front and respond accordingly.
After playing with Cartmanzilla and the robots I asked two questions of the youngsters who came to my table:
  • If you were a mouse escaping from a cat which method (random, touch, sight) would you use to keep away from the cat?
  • If you were a cat trying to catch a mouse which method would you use?
For the first question everyone said sight, which is the obvious answer, as assuming that there is enough light for the mouse to see then this keeps a decent distance between it and the claws. For the second I was genuinely surprised that about a third of the students realised the cat would likely use a combination of sight and touch. Cats do just this: as they approach prey they primarily use sight, but when they make the final strike their whiskers swing forward to make contact with the prey which helps guide their jaws and claws in. To help reinforce this point I played a snippet from a BBC documentary that covers exactly this:

Watch the whole video or skip forward to 2m15s where they explain why and show a cat doing this. As the cat gets very close to the mouse it can no longer focus so it uses its whiskers to guide the prey to its mouth. If you have a pet cat you can likely see this in action: if your cat chases string or small toys then drag a toy in front of the cat to get it to almost-but-not-quite pounce (you may need to do this several times!) When the cat thinks about pouncing, but then gives up you can often (it's quick) see its whiskers twitch: that's the reflex starting to move them forwards (but stopping as the cat gives in). It is harder to see if the cat does pounce as this happens in the blink of an eye.

The interesting thing here is that my robot, Zumo George, would benefit from exactly this kind of whisker technology. The Sharp GP2Y0A41SK0F infrared sensor is effective from about 4cm to 30cm. Hence, when an object is closer than ~4cm the sensor's vision is "blurred" and ineffective. This can be seen in the data sheet for the sensor in the graph on page four, which I have reproduced below. This graph shows the voltage returned on the analogue pin for a given distance. Below about 3-4cm the output voltage becomes wildly inaccurate. This is the point at which George's vision blurs, resulting in him sometimes advancing and sometimes retreating, seemingly at random: at this distance he becomes little better at avoiding Cartmanzilla than the Crafty Robots.
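Within its working range the sensor's curve is roughly an inverse relationship between voltage and distance. A toy approximation (the coefficient is my own rough fit, not from the datasheet; calibrate against the real graph before trusting it):

```python
def approx_distance_cm(voltage):
    """Very rough GP2Y0A41SK0F voltage-to-distance conversion.

    Only meaningful in the sensor's ~4-30cm working range; below ~4cm the
    real output becomes wildly inaccurate, which is George's "blurred vision".
    """
    if voltage <= 0:
        raise ValueError("voltage must be positive")
    return 13.0 / voltage  # assumed coefficient; tune against the datasheet

print(round(approx_distance_cm(1.3), 1))  # ~10cm at 1.3V under this fit
```

This is why the behaviours below keep George well outside the 4cm blur zone: the maths simply stops being trustworthy when the monster gets too close.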


Fortunately this is generally not a problem as we define the behaviour of George such that he should not get within 4cm of Cartmanzilla in a .feature file that our Behaviour-Driven Development tool of choice (Lettuce in my case) can parse:

Feature: Move around the city avoiding monsters
In order for Zumo George to keep a safe distance from the monsters
As Zumo George
I want to retreat when the monsters get near

- Retreat if a monster is less than 15 cm away
- Advance if a monster is greater than 15 cm away

From the above feature we have articulated and agreed a general outcome: don't get trodden on by Cartmanzilla as it will ruin your day. We then continue the conversation to discover scenarios of importance. It turns out that there are three, wrapped up in a single line of background in which we agree how close to Cartmanzilla we think is safe, and we add these to the .feature file:

Background:
Given the minimum distance to the monster is 15cm

Scenario: Advance, there are no monsters
When the distance to the monster is 16cm
Then I should "advance" in relation to the monster

Scenario: Stand still, hoping the monster won't notice me
When the distance to the monster is 15cm
Then I should "halt" in relation to the monster

Scenario: Retreat, there are monsters
When the distance to the monster is 14cm
Then I should "flee" in relation to the monster

As you can see, we have defined George's behaviour to be that he should attempt, whenever possible, to stay at least 15cm from Cartmanzilla (the monster).
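Underneath, each Then step boils down to a very small decision rule. Here is a sketch of that rule in Python (the function name is mine, not from George's actual codebase; the real version would be wired up to Lettuce step definitions):

```python
def action_for(distance_cm, min_distance_cm=15):
    """Decide George's move for a given distance to the monster.

    Mirrors the three scenarios in the .feature file: advance when
    beyond the minimum distance, halt exactly at it, flee when inside it.
    """
    if distance_cm > min_distance_cm:
        return "advance"
    if distance_cm == min_distance_cm:
        return "halt"
    return "flee"


# The three scenarios from the feature file:
print(action_for(16))  # advance
print(action_for(15))  # halt
print(action_for(14))  # flee
```

Keeping the rule in one small function is also what makes varying the minimum distance during a conversation so easy: change one number and re-run the scenarios.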

Behaviour-Driven Development works when people have conversations to discover new and different outcomes. It was great to work with the youngsters at my table to vary the minimum distance for George. We could immediately see the little robot scurry backwards when our outcome was that it was unsafe to be so close to Cartmanzilla, or scuttle forwards to a new minimum distance when we felt the outcome was that it was safe to be closer. Being able to talk about, model and try out the effects of varying outcomes in a safe way without causing George to immediately run amok and leap to certain doom from the table edge was great. The kids definitely seemed to enjoy this modelling exercise, and I did too.

Across the rest of the event a large number of other robots could be seen. Here's Steve. He talks back when you talk to him (and sometimes makes embarrassing mistakes):

This is Steve. Steve was apparently "getting a bit murderous".

Tim Cox ran an excellent workshop and had set up a cityscape full of vehicles, interconnected traffic lights each using PiStop (available from 4tronix) and an enormous (by comparison) Me Arm controllable by a Raspberry Pi and, I'm guessing, python-curses judging by the look of the output on the screen. I was impressed with the Me Arm. I have previously done something similar using the Maplin Robot Arm and the Pi, but I don't like the imprecise geared motors in the Maplin arm. By contrast the Me Arm was much more precise even though it too is not using stepper motors. The screen you can see is from Banggood.com.
"Watch out for the Evil Claw" cried the residents of PiCity.

Someone (sorry Someone, I didn't catch your name) had created a Sentry Gun complete with the required "beep......beep.....beep...beep..beep.beep.beep" heard in the film Aliens from the related motion tracker technology.
If you hear this noise run away and hide.

A couple of students presented a fruit-music maker connected to a Raspberry Pi. Their approach was different to what I have seen before as they were not relying on one completing a circuit (touch a wire, and with your other hand touch the fruit to make a sound play), but were instead relying on (we think) a capacitive drop when you touched the fruit ("touch the kiwi fruit and absorb its power").... or perhaps it was due to electromagnetic fields. They are currently going through a process of elimination as they learn how exactly this works. However it worked, it worked well.

Play that funky banana!

Various other workshops and exhibits ran throughout the day including working with a BBC Buggy and, separately, Hedgehog Bots controlled by an Arduino Nano and invented by Scott and Joe, graduates of the University of Bristol. There was also a horizontally scrolling game controlled by a device one wears that picks up on electrical activity in the brain; you moved up by thinking more and down by thinking less... it was important not to actively think about thinking less. Sadly I forgot to get a photo of these great projects.

Saving the best to last there was Josh who presented an AWESOME Persistence of Vision project: several rows of LEDs spinning at about 1000RPM (I think that was the speed...). He had animations running and could draw and persist a cube, the time and all sorts of other patterns. It looked great, was a tidy build and captivated us all like moths to a light bulb.

Must...not...look...at...the...lights. Oooooh shiny.

Digimakers has again lived up to expectations with Caroline and the team keeping everything running smoothly throughout the day.

The next event is currently scheduled for June, hope to see you there.

The MagPi: back in print

Issue 36 of The MagPi magazine was recently released as a downloadable PDF. What ticks the awesome box though is that from this issue onwards the magazine is again available in print.

I worked on ~25 of the first 30 issues of The MagPi writing articles, proofreading and undertaking layout, and recall the fantastic feeling of seeing the magazine printed (thanks especially to Ian McAlpine). With options to purchase from several online Raspberry Pi sites as well as three Kickstarter bundles (including a binder) the obvious missing link was high street distribution. The MagPi has now been under the wing of the Raspberry Pi Foundation for six issues and Issue 36 is the first to be available in the high street.

I'll say that again, with emphasis: in the high street.

It takes an incredible effort to launch a new magazine and arrange for distribution to WH Smith and similar. A HUGE well done to Russell Barnes, magazine editor and the rest of the team.

With the magazine back in print what is it like?

Firstly, the print quality is exceptionally high. The front cover has a joint matt-gloss effect with the title, most of the text and the Minecraft Splat elements in gloss on a light blue background. The cover paper used is also a fairly heavy stock and will survive some bashing (as I discovered when the magazine became an inadvertent fly swat the other day). Internally each page is full-colour and exceptionally clear and easy to read. This feels like a professional magazine in one's hand because, well, it is a professional magazine. Russell and co really know their stuff.

With an increase in size to 100 pages the spine is thick enough that the magazine can sit on a bookshelf and the identity of each issue be determined from the spine. This does show the one drawback to a magazine of this thickness in that the pages will not lie flat. It's not a big problem, but it does mean that when following code tutorials with the magazine on your desk the pages tend to curve. Firmly (but not forcefully) pressing on the magazine once or twice will open up the pages further without damaging the spine.

Yes, you did read the above paragraph correctly: 100 pages. This is the largest normal (i.e. excluding Special Edition 1) issue of the magazine yet. Russell and his team have produced an absolutely fantastic publication with numerous hardware and software tutorials, reviews and features. A quick flip through finds 11 pages of adverts (including three asking people to consider subscribing), which I feel is reasonable for a magazine of this size (and the adverts are all Pi-relevant). Personal favourites in this issue include Extra Lives talking about retro gaming and the book review pages, as these cover not only Pi-specific books but also books of related interest. In this issue a column of the book review pages is devoted to security and penetration testing, which is an incredibly interesting subject.

The tutorials cater for all ability levels, with a straightforward LED exercise in Python on page 23 at one end of the spectrum and, at the other, applying physical forces to Python games to model gravity on pages 58 to 63. This is a very clever bit of code that models the movement of spheres, or celestial bodies (think: planets and asteroids). My favourite quote in the whole issue is found here:

For each planet we have, we want to calculate its effect on every other planet

That's a tough ask! Fortunately the article goes into exquisite detail on both the maths and programming needed to accomplish this.
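To give a flavour of what "every planet affects every other planet" means in code, here is a minimal sketch of summing pairwise Newtonian forces. This is not the magazine's actual listing; the function name and the use of plain (mass, x, y) tuples are my own:

```python
import math

G = 6.674e-11  # gravitational constant, N m^2 / kg^2


def net_force(bodies, i):
    """Net (fx, fy) force on bodies[i] from every other body.

    bodies is a list of (mass, x, y) tuples. An illustrative sketch,
    not the listing from the magazine article.
    """
    m1, x1, y1 = bodies[i]
    fx = fy = 0.0
    for j, (m2, x2, y2) in enumerate(bodies):
        if j == i:
            continue  # a body exerts no force on itself
        dx, dy = x2 - x1, y2 - y1
        r = math.hypot(dx, dy)
        f = G * m1 * m2 / r ** 2  # Newton's law of gravitation
        fx += f * dx / r          # resolve along x
        fy += f * dy / r          # resolve along y
    return fx, fy


# Two equal masses 1m apart attract each other along the x axis
print(net_force([(1000.0, 0.0, 0.0), (1000.0, 1.0, 0.0)], 0))
```

Note the nested loop: for N bodies this is N x (N-1) force calculations per step, which is exactly why the quote is "a tough ask" as the number of planets grows.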

One downside of print though is that if errors creep in then they are irreversible (unless a new print run is undertaken). Before printing The MagPi Volume 1-3 bundles we went back through every single page to update the content for the B+ (which had not been released when we wrote the earliest issues) and to correct any errors we had subsequently found for just this reason. With The MagPi issue 36, as with every magazine, a few gremlins have made it through the editing process and hence have, in print at least, become irreversible. Take the LED article on page 23, for example. The instructions and diagram show a connection to GPIO4 and GND, but the photo shows GPIO3 and +3V3 being used. Likewise, the code listing uses GPIO.BOARD but the pinout diagram for the Pi is numbered for GPIO.BCM. As an introductory article, "Get started with Raspberry Pi", errors like this may confuse the reader.
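For anyone bitten by that particular gremlin: the two numbering schemes refer to the same physical pins. GPIO.BOARD counts the header positions 1-40, while GPIO.BCM uses the Broadcom channel numbers printed on pinout diagrams. A tiny illustration (the mapping entries below are from the standard 40-pin pinout):

```python
# The same pin has two names: a physical header position (BOARD) and a
# Broadcom channel number (BCM). A few entries from the standard
# 40-pin pinout:
BCM_TO_BOARD = {2: 3, 3: 5, 4: 7, 17: 11, 27: 13, 22: 15}

# Under GPIO.setmode(GPIO.BCM) the article's LED is on channel 4;
# under GPIO.setmode(GPIO.BOARD) that very same pin is number 7.
print(BCM_TO_BOARD[4])  # 7
```

So a listing that calls GPIO.setmode(GPIO.BOARD) but then uses the numbers from a BCM pinout diagram will drive entirely the wrong pins, which is exactly the confusion described above.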

Despite the occasional gremlin the overall quality of the content is first rate. A lot of effort has clearly been put into the magazine. Whether you find reading easier in print or in an electronic format is a very personal thing, and with The MagPi available in both formats you can have the best of each: in effect, a print magazine whose text you can also search (via the PDF).

The new look MagPi magazine looks great, feels great and has the superb content we all expect from the publication. Best of all, the print edition is now available for a reduced price to subscribers.

Highly recommended.

PaPiRus ePaper HAT: what about the software?

In my previous post I mentioned that I was still getting the software for PaPiRus installed. Well, success and all is well. Here are a few screenshots of the 2.7” screen. Note as before that this is a pre-production model and I’ve been told that the screen I am using is an earlier version than the one that will be shipped to backers.

PaPiRus is available for three more days on Kickstarter.




PaPiRus ePaper HAT hardware preview

Aaron of Pi Supply has very kindly posted me a PaPiRus ePaper HAT preview unit. This is a work in progress and Aaron was open that the software side of things still needs work. Fair enough, I’m happy with that: after all in a pre-production unit one does not expect perfection. I have previously covered the Kickstarter and the advantages of using an ePaper display. Today we will be looking at the hardware.


The unit was supplied pretty much ready to go with the three sizes of display (1.44”, 2.0” and 2.7”) along with a coin battery and v1.4 of the HAT itself. The first thing to note, and I owe Aaron and team a beer for this design decision, is that the ribbon connector between display and board uses a click up, insert, click down affair, as opposed to the pull, slot in, push arrangement used on the Raspberry Pi Camera module. This makes inserting the ribbon cable easier and the connection notably more secure when clicked back into place.

The rest of the HAT is similarly well thought out (there is even a slot in place to allow the camera ribbon cable to pass through). As with all HATs this will be equally at home on a B+, B v2 or A+ Pi. The combination of Raspberry Pi A+, PaPiRus and 2.7” screen all in a tasty case mounted on a wall displaying data of one kind or another is very appealing, as this will be a very compact, very energy efficient solution for a multitude of projects. I can even see this combination being used as an in-car display screen connected to a car’s OBD port for the same reasons.

The upper side of PaPiRus is mostly empty, giving you a flat platform to rest your chosen screen upon. Along the top are four tactile buttons which will make interacting with the Raspberry Pi and PaPiRus straightforward.


On the reverse we find lots of little components, all pre-soldered with precision. This isn’t some DIY rough-and-ready solution, but a professional product. It re-confirms what I’ve known about Pi Supply for a while: this is a professional outfit producing products that we consumers can rely upon.


For those who like technical details I got my magnifying glass out and noted the following (obviously this is all subject to change, this being a pre-production board):

- IC1 (directly above the edge-mounted ribbon connector) is flash memory from Winbond, part number W25Q32FV. This gives me a clue that the firmware could be updated in the future as needed.
- IC2 (up-left from the battery) is an NXP 8523T which the datasheet tells me is a real-time clock and calendar.
- IC3 (above IC2) is an NXP LM75BD which is a digital temperature sensor and thermal watchdog.
- To the left of IC4 (below the battery) we have a B6MY which Google tells me may be a folding pocket knife. I think my Google Ninja skills failed me at this point.
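Assuming the LM75 really is hanging off the Pi's I2C bus (an assumption on my part; I haven't probed the traces), the interesting bit from a software point of view is decoding its two-byte temperature register: 9 significant bits at 0.5°C per step. A sketch of the decode (in practice the raw bytes would come from an smbus read at the sensor's address):

```python
def lm75_decode(msb, lsb):
    """Decode the LM75's two-byte temperature register (9-bit, 0.5C/LSB).

    The raw bytes would come from an I2C read (e.g. via smbus); that
    this chip is reachable on the Pi's bus is my assumption, not
    something confirmed by Pi Supply.
    """
    raw = (msb << 8 | lsb) >> 7  # keep the 9 significant bits
    if raw & 0x100:              # sign bit set: negative temperature
        raw -= 0x200             # two's complement over 9 bits
    return raw * 0.5


print(lm75_decode(0x19, 0x00))  # 0x19 -> 25.0 C
print(lm75_decode(0xE7, 0x00))  # two's complement -> -25.0 C
```

If the chip is bus-accessible, a wall-mounted PaPiRus display could show ambient temperature with no extra hardware at all, which would be a nice bonus.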

The hole for the reset pin is present on v1.4 of the board (below-left from the battery) and with less than £3k to go at the time of writing we’re sure to unlock this stretch goal.

UPDATED: Most intriguing of all on the v1.4 design is the inclusion of a second ribbon cable connector at CN4 (above battery). Aaron pointed out what I had completely failed to notice: that this is a GPIO breakout (see the last photo on the Kickstarter page).

The board is rock-solid, with no loose chippery to be found anywhere. PaPiRus is on Kickstarter for 5 more days and looks set to be an excellent way to add an ePaper display to your Raspberry Pi.


A Kit a Month from pocketmoneytronics

I arrived home today to find a parcel on my doormat. I’m always excited by parcels; what can it contain? It turns out that this is something I have been looking forward to for several weeks, Andrew Gale’s A Kit A Month Soldering Subscription from pocketmoneytronics.co.uk. I previously bought a little Christmas Tree circuit from Andrew late last year via Kickstarter so was very enthused by the idea of receiving small kits of parts month by month to learn more about electronics. Hence, I signed up to his new Kickstarter without delay.

The first kit is Nellie the nine-volt robot.

Nellie comprises a small PCB, two green LEDs, three resistors, a diode and battery clip with a couple of wires attached. This is a very good kit because it does not try to overdo anything and is presented with easy to understand instructions. Plus, for myself, it is the first time I’ve encountered a discrete diode as a component so there was immediately something to learn.

In the photo below the kit of parts for Nellie is to the left of the battery holder. Andrew has also included a selection of parts to experiment with LEDs and a 7-segment display. The green square in the photograph is sandpaper, thoughtfully included to smooth off the PCB edges. It’s little touches like this that are great to see.


Also included is a great double-sided newsletter that carefully explains the components in a friendly (and not text-book dry) manner, and includes links and suggestions for further things to try out with the two kits.

Nellie was straightforward to solder

It is an absolutely great start to A Kit a Month Soldering Subscription, and I’m looking forward to each future kit.

Here’s Nellie disguised as a webcam.

And now she’s using her battery as a body:

A mechanical arm that can shift gears in less than a twentieth of a second

Look what Father Christmas brought me for Christmas:


Build time was about half of Christmas Day. It is a pretty good design and seems fairly sturdy. The weight of the 4 D battery cells keeps it nicely planted on the table and the inclusion of an on/off switch on the base keeps battery drain to a minimum. My only gripe would be that it does not use stepper motors, and instead emits a worrying “click, click, I’m failing” noise when any gear reaches the end of its travel.

So far it has been connected up to my Linux Mint netbook and runs a treat after following the instructions by maxinbjohn. Where John writes “Select the 'USB ROBOTIC ARM' and build the kernel” you will want to run:

sudo make -f Makefile
sudo make install

from within the directory you downloaded the files to (or use git to clone the files). That seemed to work fine for me.

Now to connect it to one of my Raspberry Pis for a bit of Python programming.

The MagPi: help wanted

The MagPi
Well we’re almost there with Issue 11 of The MagPi. It is truly incredible to see how much effort goes into each issue. From authors to layout artists to graphics designers, proofreaders and testers, it is the community that has sprung up around the magazine (and the Pi of course) that makes it such a success. I’ve discovered that the opportunities to be involved are virtually endless. I’ve written several articles for the magazine but at the moment am concentrating on layout, proofreading and testing, if only because it gives me more time to read all of the great stuff others are writing about.

Then on top there are the events. Did I mention them...? No...? Well, back in December Meltwater (of The MagPi ofc) and I attended an event at @Bristol in, um, Bristol that gave us the opportunity to meet a few hundred young people and assorted parents and teachers. We had Pis on display running various software and hardware demos, and some additional GPIO gadgety-things for all to ooo and aaahh at. All under The MagPi banner. This month we are back at @Bristol for another 200+ young person event. Superb and great fun! Others are attending or running their own Raspberry Pi events and the team at The MagPi is always available to support each other in these escapades.

So this is a shameless plug: if you’re reading this and you read The MagPi and have thought: “hey I could write an article on Gadgety-Widgety-Thing” or “I like those graphics, but I think there should be more pink” (or neon green...) then The MagPi will more than welcome you. Email the editor for more information. And if I meet you at an event then I’ll buy you a beverage of your choice to say thank you.

A little diversion: Mezzo d9 folding bicycle

A little off topic this one, but hey why not. I just purchased a Mezzo D9 folding bicycle. List price £725, bought via an eBay bid for just less than £400. It’s 5 years old (the spec hasn’t changed since then as far as I can tell) and is in excellent condition.

Although it has nothing to do with the Raspberry Pi (though I’m toying with the idea of somehow wiring in a counter to count the number of times I fold and unfold it), it is very similar to the Pi in one respect: a very well thought out and cleverly engineered machine. The folding mechanism uses self-locking clasps and quick release mechanisms throughout, meaning one doesn’t have to twist, twist and twist again some bolt or other during the folding process. Also the main cross bar is solid with no folding joins, which gives the bicycle a degree of rigidity I’ve not seen on other folding bikes. It rides well, albeit with a lowish top speed as it only has 9 gears with quite closely spaced ratios.

And to top it off it has 16 inch wheels so Mr Railway is A-OK with the bicycle (First Great Western define a folding bicycle as having 18 inch wheels or less. Amusingly their T&C of carriage do not require that a folding bicycle actually... ahem... folds. They’re a bit fixated on how embiggened your wheels are).

A couple of pics:
Mezzo D9Mezzo D9 folded

Eurohike solid shell case accidentally for Raspberry Pi

Well would you believe it. Eurohike appear to be making a case for the Raspberry Pi, albeit accidentally. I was in Blacks today and spotted a yellow, hard-shell, mobile-phone-sized Eurohike Safe Case. It is sold as waterproof (I’ve not tested this) with a decent seal and a very solid snap-lock clasp. It is made out of tough ABS plastic and also claims to be shockproof. Inside it has a nice foam-padded compartment large enough for the Pi with SD card inserted. And for those that want to know: it is rated IP68. To find out what that means head on over to: http://en.wikipedia.org/wiki/IP_Code.

EurohikeCase1 EurohikeCase2
EurohikeCase3 EurohikeCase4

Priced at a respectable £15, discounted from £20: a bargain in my eyes. It is perfect for transporting one of my Pis to events, as that Pi permanently has a Nokia 3310 LCD + buttons shield attached; I need easy access to it, so one of the usual Pi cases isn’t quite what I’m after.

I’m likely going to also use it as a waterproof enclosure to mount alongside and control my Nikon AW100 waterproof camera. That is, when I have plucked up the courage to drill a hole to pass a USB cable through the Eurohike case.

Highly recommended, especially with the discount at the moment. Available over at the Blacks website.

Mudding on Raspberry Pi (Part 1)

A Raspberry Pi games console for online text-based Multi-User Dungeons (MUDs)... and you can play while sitting on your sofa. Read More...

Raspberry Pi - the new Amiga community?

It was 1993 and I got my first Amiga... fast forward and the Raspberry Pi has in my eyes taken up the banner the Amiga Community once flew. Read on for a bit of nostalgia and hope for the future of cool computing. Read More...

Raspberry Pi: do NOT read the FAQs!

Here is an odd one for you. The one web page that seems to give my shiny new RPi more headaches than any other I have tried is the official RPi FAQs page: www.raspberrypi.org/faqs. This is a very long page. No, that is an understatement: it is a veeeeerrrry long page. Even my iMac took a few seconds to start rendering it. The RPi just chugs on this page.

Moving the gazillion comments out to per-FAQ sub pages may help, but there you have it, you have been informed ;)

LXDE on Pi, not the fastest cookie (but that's ok)

In my last post I mentioned that I had used the command startx to run the LXDE GUI, aka “Gnome without the hassle” (IMHO). This time around I explore a little of what LXDE has to offer on the RPi.

Raspberry Pi - first hands on moment

Opening the minuscule box that the Raspberry Pi comes in (if it were a plug computer it would challenge them all in the size stakes), those around me at work and I started making the obligatory “ooo” and “aahh” noises.