Time-lapse Photography: Pi Studio

[Article in Beta v0.5]

At the weekend I returned to Bristol Digimakers after missing events in 2017 due to badly timed holidays. I've written about Digimakers numerous times before and am a huge supporter of this Bristol University student-run tech event. Every time I go I love discovering new, cool, interesting, practical, and not-so-practical things one can do with computers. Normally I take along a Zumo robot with various bits and pieces (typically a selection of Raspberry Pi HATs). It was time for something different, and hence I presented the Raspberry Pi Lightbox Studio. This is an attempt to create a low-cost photo studio in the home / bedroom / kitchen for an all-in price of under £100 (including Pi and camera).

time-lapse_digimakers_17022018

I've been keen on trying time-lapse photography at the small or macro scale. An example: I have a tub of Science Putty which is fun for all of five minutes as one pulls it into shapes and then starts to watch it collapse back to its original tub-shaped state. I say "five minutes" because that is about the limit of my attention span as it is a rather slow process. Enter time-lapse photography to the rescue.

Time-lapse photography is a way of compressing time: a long duration is viewed as a video in a shorter period which has the result of accelerating the rate of change in a scene. Most of us have seen examples like this in films and on YouTube. One way to think of this is the exact opposite of Bullet Time featured in The Matrix, which is a pretty advanced technique that gives the appearance of slowing time (while introducing 3-dimensional camera movement). If Bullet Time (with the help of CGI) makes things appear to move s-l-o-w-l-y then time-lapse makes things move more quickly. Ahah! I do not have to watch my Science Putty collapse in real-time. I can create a video that accelerates the process.

To achieve the effect we take many individual photographs and then stitch them together into the video. Normal videos usually run at around 24 frames per second. For example, to watch ice melt I take one photo every 15 seconds (your setting may vary). It will therefore take 15 x 24 = 360 seconds (or 6 minutes) of real time to capture the 24 frames required for one second of footage. We are therefore accelerating time 360 times! Ice will melt in front of our eyes.

To create a time-lapse effect you need a way to reliably take multiple photographs. Assuming you won't do this by hand (oh the boredom!) you need to be able to program your camera to do this automatically, either with an intervalometer or via the settings on some modern cameras. This is where the Raspberry Pi comes in. Coupled to a Pi Camera (I'm using v2.1 but this will work with v1 as well) we have a very fine degree of control over the process of creating a time-lapse movie, as the raspistill software features a time-lapse mode.
time-lapse2_digimakers_17022018
Assuming you have followed the software set-up in the above guide the following steps will take you through the process:

  1. Select target and compose scene
  2. Take sample photo
  3. Re-adjust
  4. Re-take sample photo and check all is well
  5. Run raspistill to capture individual photos using the time-lapse option
  6. Stitch the images together into a movie
  7. Watch and enjoy.

You may want to use a stand-in for your target when setting up. For example, for a time-lapse of an ice cube melting use a d6 die as a temporary target (or a whisky stone) and keep the ice in the freezer until needed; otherwise you will miss some of the action.

To create a good time-lapse several challenges present themselves, and each must be overcome. I* broadly split these into:

  • Subject
  • Lighting
  • Photographing
  • Stitching

* - other issues do exist, but get these four sorted and you'll be a long way towards creating excellent time-lapses

Subject


Essentially, you have two choices: go big or go small. If big (e.g.: clouds) then you will have less control over lighting (see below) but can create amazing views of our planet. If small then you need to be aware of potential close-up focus issues as the Pi camera struggles to focus below 20cm by default. This can be corrected but do read several sources of information on this first to avoid damaging the camera. YouTube has a good video showing the process. For those not liking the combination of pliers + delicate camera Adafruit have the answer. Or, for your first attempts just don't worry about it (I didn't). Does an ice cube melting need to be in perfect focus? I'll leave that to you to decide.

You should position the Pi Camera with a good view of your subject, and take sample photographs to ensure all is well before starting the time-lapse process. It is essential that you do not nudge either camera or subject in any way during the image capture process or the effect will be ruined. Motion time-lapse, where the camera rig is intentionally and automatically moved along a rail between frames, is a whole subject in its own right, and when perfected can create some staggering results.

Lighting


One of the banes of time-lapse photography is flicker. This is created when the lighting between each frame differs slightly and/or the camera changes its exposure. With the controls of an SLR camera (such as the excellent Nikon D7200 that I use when not Pi'ing) the latter can be resolved by using manual exposure mode and white balance settings. There are several ways to adapt to changes in lighting. You could, for instance, embrace lighting changes in your project. Time-lapses of clouds during the day -> evening -> night often do this. Alternatively, for controlled indoor scenes (Science Putty) you can use a lightbox that provides a controlled, consistent lighting array (ideally one that minimises shadows).

I am using a £12.99 bit of kit from Puluz. This lightbox has two LED strips top-front and top-back that provide even lighting, although do be aware of bright sunshine from the front as that can still change the light level. From Puluz's promotional images, such as that below, you can see that an added benefit of this lightbox is that a circle can be removed from the top to take photos from above the subject. As a piece of kit it is actually a lot better than its meagre price tag would suggest. Better lightboxes are definitely available but at this price point it is worth giving it a go. The two LED arrays with supplied cables (that each require plugging in - use two USB ports on your Pi 3) are bright and provide even lighting throughout the box.
Puluz Lightbox

Photographing


When your scene is set use the raspistill command to start taking photographs. Now, go make a cup of tea (or several) as the process will take a long time. Remember: we are accelerating time in video, not in reality (not even the Raspberry Pi can do that...yet.) ;)

The guide on time-lapse photography from the Raspberry Pi Foundation is very handy here. Assuming you have updated your Pi to the latest firmware and enabled the camera then the key commands to enter at the terminal prompt are a variant of:

mkdir tl
cd tl
raspistill -t 30000 -tl 2000 -o image%04d.jpg

First we need to make a directory (mkdir) in which to store our time-lapse photographs, as the camera will create a lot of them and hence make a bit of a mess of whichever folder you are currently in. We change directory (cd) to the tl folder (cd tl) and then run raspistill with parameters to output images:

The -t switch is the amount of time to run the time-lapse for. In this example 30000 milliseconds, which is 30 seconds.
The -tl switch is the interval between photographs, 2000 milliseconds being 2 seconds in this example.
The -o switch is the image name used to save each file to disk, with %04d being a numerical increment. The 4 means pad the number to four digits with leading zeros. This will create sequentially numbered images, i.e.: image0001.jpg, image0002.jpg, image0003.jpg etc.

You should vary the -t and -tl settings as needed. Some experimentation is needed here. To melt a large BB8 (from Star Wars) ice cube I used -t 14400000, which is a lot of milliseconds! This equates to 4 hours (1000 x 60 x 60 x 4 milliseconds). I decided to take a photograph every 15 seconds, setting -tl to 15000 to do this. This generated 960 images (14400000 / 15000) and required about 4GB of space on my SD card.
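Putting those numbers together, the capture command for the BB8 melt looked something like the following (the output filename is just an example following the pattern above):

# 4 hours of capture (14400000 ms), one frame every 15 seconds (15000 ms)
raspistill -t 14400000 -tl 15000 -o image%04d.jpg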

Allow the time-lapse to run to completion. Note that if a frame or two ends up corrupted / showing waving fingers in front of the lens then this can be corrected by deleting the offending image, copying its next nearest neighbour and renaming to fill the gap. This does result in two duplicate frames, but at a rate of 24ish per second no-one will likely notice.
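As a sketch of that repair, assuming (hypothetically) that frame 0123 is the ruined one:

# remove the bad frame...
rm image0123.jpg
# ...and fill the gap with a copy of its nearest neighbour
cp image0122.jpg image0123.jpg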

Stitching


When all the images have been created ... TBD
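A minimal sketch of one way to do the stitching, assuming ffmpeg is installed (sudo apt-get install ffmpeg) - not necessarily the exact approach used here:

# read the numbered frames at 24 frames per second and encode them to an H.264 movie
ffmpeg -framerate 24 -i image%04d.jpg -c:v libx264 -pix_fmt yuv420p timelapse.mp4

Lowering the -framerate value stretches the same set of frames over a longer, slower movie.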

The end result


Here is a movie... TBD

Kit needed


  • Raspberry Pi 3. Other Pi's can be used however the encoding process to stitch the individual frames together is processor intensive. Even the Pi 3 only manages just over 1 frame per second in contrast to my 2014 MacBook Pro that regularly hits over 9 frames per second.
  • Official Raspberry Pi Power Supply. With the Pi 3 having to do some serious processing during the stitching process, coupled with the juice required for the two LED arrays on the Puluz plus the camera, having a decent PSU makes sense. Note that kits are available that combine the Pi, PSU and SD card (with NOOBS pre-installed) for a good combined price. If you want to keep the lid on your Pi when the camera is attached do look for a kit with a suitable notch, such as the Pimoroni Pi 3 Essentials Kit.
  • Raspberry Pi Camera v2.1 (or v1 if you have one handy).
  • High capacity SD card. I use Integral cards. 16GB should be considered a minimum if you intend to use the card to save photographed frames and the stitched movie. Alternatively consider a 32GB SD card or an external device such as the Kingston 32GB (or higher) USB 3 external drive.
  • A mini tripod or similar to stabilise the camera.
  • ...for which you'll need an adaptor. I found I did not need a bolt to hold it onto the tripod nut despite what reviews state. Note that Blu-Tack should probably be avoided as a substitute as it tends to give way slowly, which could be noticeable as the angle of your camera gradually changes over a number of frames.
  • Something to raise the Pi up a bit. I found that the limited camera connector length meant that I needed to get my Pi closer. I had one of these Clingo things lurking around and it does the job well. Alternatively use the boxes the Pi and Camera come in. Alternatively a longer ribbon cable does the trick. Pimoroni sell these.
  • A keyboard and mouse for the Pi. I suggest looking at Rii's offerings. I use the Rii i8 for when I need to enter the occasional command into the Pi which is available in RF (Radio Frequency) or Bluetooth variants. Note that the i8 is not suitable for prolonged typing.
  • A monitor to preview and view your movies. Or work headless by connecting your Windows / Mac laptop to the Pi and using it purely to acquire images.

Using a headless setup used to mean you could not view a live preview of the Pi Camera's output, even with VNC. Fortunately recent versions of RealVNC, bundled with the Pi, solve this problem. RealVNC provides a good write-up on their website. However, if live previews are not important then a purely command-line experience works well: you can always download the files to your computer (using SFTP in Filezilla, Transmit or similar) and view them as needed.
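For example, to pull the frames down from the command line rather than a graphical SFTP client, scp does the job (the hostname and folder here are assumptions - adjust for your own setup):

# copy the whole time-lapse folder from the Pi to the current directory
scp -r pi@raspberrypi.local:~/tl .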

I'm going to call out the Manfrotto PIXI EVO 2 that I bought as perfect for the job. With a claimed load capacity of 2.5kg (plenty for a 750g camera body) it doubles as a table-top tripod for holidays. At about £35 it is something of a bargain given the features, which include easy head angle adjustment and the ability to operate at a variety of heights. Being able to alter the height and angle of the head is a must given you will want to target your subject precisely.
Manfrotto PIXI EVO 2 mini tripod

Pro tip: the Pi Camera comes with a little sticky protector tab over the lens. Do remember to remove it before starting to capture images! Also, keep it to use as a mini lens protector if transporting / storing your kit.


http://www.clingo.com/home-and-office/universal-podium

Subject ideas for time-lapse photography


TBD

Of course the Pi runs NetBSD

NetBSD
NetBSD, a rather good Unix-like operating system, has released support for the Raspberry Pi Zero amongst other Raspberry Pi boards. Version 7.1 was made available on 11 March and can be downloaded from the NetBSD site. Instructions are also provided.

I first came across BSD years ago when a version was supplied on a magazine's cover disk for the Amiga. It started my absolute love of Unix, which culminated in my purchasing two Sun workstations (an Ultra 5 and Ultra 10) to run Solaris some years back. The FreeBSD project has a page that briefly covers the history of the various BSD operating systems and is worth a read.

What is very, very notable about Unix and BSD is stability: the release cycles are such that upgrades happen at a steady pace with very stable component packages. Oh, and to add: NetBSD supports a ridiculous list of computers. Scanning through that list I can see the Acorn Archimedes, Amiga (huzzah!), Cobalt Microservers (I owned a Sun Cobalt Raq 4 for quite a while), Psion PDAs (yes, really), Sega Dreamcast (yes, really really) and many more.

As NetBSD's tagline says: "Of course it runs NetBSD".

I am definitely going to be running NetBSD on one of my Raspberry Pi boards soon. If you are looking for an interesting alternative to Raspbian then do give NetBSD a try.

Zumo George gets upgrades (part 1)

Everything eventually needs an upgrade. You may think that pencil 1.0 was great, but if Apple has taught us anything: we all need pencil 2.0. I jest, although that said it is time for Zumo George, one of my Raspberry Pi robots to receive the 2.0 make-over. This is brought on by two things:


Previously I had thought of upgrading from the Raspberry Pi A+ to a Zero purely to save some space, enabling me to get a bit o'real-estate back as George measures but 10cm x 10cm. However I would still have the WiFi dongle a-dongling, only it would be dangling from a micro-to-full-size USB adaptor. Dongles dangling from dongles (there's a song in there somewhere) made me sad: "if only a variant of the Zero came with WiFi", I thought. Fantastic news Pi fans: the Foundation delivered.

The Raspberry Pi Zero W is essentially a Zero (same CPU, same RAM, same form factor) with the added bonus of a combined WiFi and Bluetooth chip. Also for our inner geek the Foundation has included the coolest antenna I've seen yet which features a triangular resonant cavity. The MagPi magazine covered the antenna in detail just the other day in Issue 55. Proant, a Swedish company, have licensed the tech to the Foundation.
TheMagPi_55_ZeroWAntenna
The MagPi, Issue 55. Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported (CC BY-NC-SA 3.0)

Given the move to the slimmest of Raspberry Pis it is also time to move from the Pimoroni Explorer Pro to the Explorer pHAT. This half-pint size board has many of the features of its larger sibling and is a perfect match for the Zero W.

Putting it all together, here is the collection of parts:

Zumo George Pi Zero W upgrade

Any observant bod will quickly notice something missing. Yes I hang my head in shame and join the "forgot to order a 40-pin header for the Zero" club. D'oh! eBay quickly to the rescue. Given this tiny omission the build is on temporary hold for a few days. Still, let's get the blade in place because sumo blades == awesome. While we're at it let's have a preview of where the Zero is going to go. With all ports along one long edge I can now have these poking backwards from George. You can also see the extra space I am gaining from the move to the Zero W from the A+.

ZumoGeorge_sumo_blade

I am expecting great things from the sumo blade and am already thinking about how to modify my BDD behaviours and code to take advantage: Zumo George shall no longer retreat in fear from Cartmanzilla.

Stay tuned for Part 2, entitled: "Ahah the header has arrived!"

PS: yes those wires are going to get significantly shortened ;)

It's MeArm Pi on Kickstarter

Kickstarter can be a wonderful place to support great new ideas. One project that has sprung up and captured the hearts, minds (and pledges) of folk is MeArm Pi from Mime Industries. Following on from the very successful original MeArm robot arm, Mime are presenting something great to the Raspberry Pi community. The project has already smashed its £10k goal with almost £47k pledged at the time of writing. Doing the maths on the pledges, that represents 844 arms at present. That's a lot of robotic hands to shake! Best of all: you still have until 6pm on March 8 to support the project and acquire your very own robot arm.

Mime describes MeArm Pi as "easy to assemble and not requiring extensive knowledge of electronics, the MeArm Pi STEM kit helps kids and adults learn robotics and teaches them how to code." That's cool. Very cool: robot arms are fun, programming is fun, and programming robot arms is twice the fun.

MeArm Pi

I briefly interviewed Ben Pirt, joint founder of Mime. His passion for the new MeArm is clear: a desire to create a functioning robotic arm platform that simplifies the construction process enormously.

CD: What was the motivation to change the design of MeArm for MeArm Pi?

Ben: "The first MeArm has been built thousands of times (including a fair few times ourself!) and we wanted to broaden its appeal and try get even more children involved in making and programming it. So we decided to look at which parts of the build were particularly difficult. The number of screws came out as a big issue that was catching people out so we tried to re-work the design wherever possible not to need screws. Now the only screws left are on the joints where two pieces hinge together. The grip had a major re-work (from 9 screws down to 1) which made it much simpler to build."

It's worth pausing and considering this: the number of screws and fiddly components in a build really can influence the complexity and hence accessibility of the product. When I received the Maplin robot arm for Christmas a few years back I spent several hours putting together gear boxes, ensuring all was aligned and assembling the thing. While highly enjoyable in its own way (who doesn't like to build things) it was also frustrating: that's a lot of components to assemble *just* to get a fairly simple robot arm up and running! Mime's keen attempt to solve this build complexity problem is admirable.

Once I had built the Maplin arm I wanted to program it using a record and playback mechanism in Python. It was at this point I hit a few snags as precision playback just isn't easily possible with normal motors, and again it looks like MeArm Pi has overcome this issue.

CD: How accurate are the servos with MeArm Pi, i.e.: can you reliably pre-program repeatable movements?

Ben: "The servos are pretty accurate - they use metal gears for extra reliability. The big difference from the Maplin arm is that servos can be relied upon to be nicely repeatable so you can program them to do things again and again. Servos won’t drift out of calibration like motors."

Having non-drifting motors sounds like a dream come true! Don't get me wrong: I love the Maplin arm and easily recommend it to everyone as a low-cost way to get into robotics on the Raspberry Pi. Now though, Mime are offering a viable alternative that combines the hardware with ease of programming. Talking of programming, I asked Ben what else makes the MeArm so great:

Ben: "I think there are a lot of things that make the MeArm Pi better than the Maplin arm:
  • children build it themselves so they get a better understanding of how the mechanics works
  • the motor control is easier from the Raspberry Pi and can be programmed in any number of programming languages
  • the software is better and more suited to beginners"

It's worth noting that supplying purpose-built control software to get up and running quickly is a great idea: it's what makes projects like Pi-Top so readily accessible, for instance. Software for the Maplin arm does exist: we covered this in earlier issues of The MagPi a couple of times; however, it involves getting one's head around the internals of the USB protocol, and while learning about USB Vendor IDs is "fun" in one way it certainly isn't conducive to encouraging people new to robotics into the hobby.

Ben also tells me that the age range of the arm is "officially...11+ but with some parental supervision it can be built by as young as 8 or 9 without too many problems." Producing a product that is interesting and accessible to age groups from primary to adult is a great achievement: "We believe in helping children to have fun whilst learning about technology and the MeArm Pi is completely designed around that goal". Superb.

It seems that MeArm Pi is not the only product that Mime are looking at for the future, either:

Ben: "This is the first new product from Mime Industries since we formed the company. We’re going to be taking another look at updates to Mirobot as well as rolling the improvements to the MeArm mechanical design over to the other versions. We’ve got lots of ideas for new products but you’ll have to stay tuned for those!"

And stay tuned I most definitely shall.

MeArm Pi, available for another 6 days on Kickstarter.

MeArm Pi

It's all gone quantum at Digimakers

Last Saturday I had a great time at Bristol's Digimakers. I regularly attend this superb event, running a stand and getting the opportunity to talk computers and science with children, parents and teachers. This time around I focused on Behaviour-Driven Development (which I've covered before) with a side order of LED and ePaper displays for the Raspberry Pi and Pi Zero from Pi-Supply.
Digimakers June 2016

Several organisations and lots of students ran demonstrations, workshops and drop-in help sessions throughout the day. This is something especially neat about Digimakers: it's not focussed on a single technology as the supposed solution to all scenarios, but instead showcases lots of complementary technologies. We had Raspberry Pi, Arduino, custom things that I don't quite understand and more besides all used as the basis for a number of very interesting projects.

The computer science and engineering students from the University of Bristol continue to impress. Anthony really hit the nail on the head with his sound wave generator, which produced a fantastic musical accompaniment for the day when hooked up to Apple's Logic Pro X. If you're reading this and looking to hire an audio engineer then he definitely deserves the job!
Andrew's marvellous musical vibrations

Directly opposite was Matthew Hockley with a swarm of cute robots running a simple algorithm in which the proximity of their neighbours triggered different light patterns. We talked about how us fallible humans like to anthropomorphise whenever given a chance to do so, and I postulated that the random movement of his swarm would be seen as "good" or "evil" if he put green smiley faces or red angry faces on top of each robot. Matthew agreed that we do tend to read more into such critters than is deserved, as they're not really responsible agents (and the Three Laws, which were just a plot device for Asimov, are not something to base a real robot on), as Alan Winfield notes in his excellent, accessible book, Robotics: A Very Short Introduction.


They appear to be benign, but if you look closely you can see them plotting world domination.

Students and a teacher from Cotham School were back with their arcade cabinet, and this time also had two "Mini Me" versions (as I like to think of them) present. Sadly I forgot to get a photo, but these proved extremely popular. I think the brief goes along the lines of: "yes, you can play computer games at school providing you program those games." It's a great idea, very well executed.

Talking of schools: I had a great chat with Stewart Edmondson, CEO of the UK Electronics Skills Foundation. They believe absolutely that teaching software is not enough and that kids should be getting hands on experience of electronics. I wholeheartedly agree! As I started secondary school in the 1980s I caught the last of the software-related computer lessons before "IT" became "ICT" with the "C" somehow (apparently) meaning "Word & Excel". However I never learnt electronics in school and feel very much I'm enormously behind the learning curve here. Although I've built my own circuits, read lots of tutorials in books and The MagPi magazine and bought and experimented with stacks of components it all does feel very unstructured, as though I am missing the fundamental underpinnings that school ought to have taught me. There is a huge benefit to learning things when your brain is still wired to absorb knowledge like a sponge. At Digimakers they brought along an electronics project kit called MicroBox to get those brain cells firing and this proved very popular.

Ok, so what has all this to do with the title of this post? One of the workshops focussed on Quantum Computing for kids (yes, you did read that right!). While I unfortunately was unable to get away from my stand for long enough to listen in, I had a wonderful conversation with a 14-year-old girl who popped over afterwards. It started in just the way you don't expect a conversation with a teenager to start: "I'm off to Google to study quantum computing as a way to break ciphers." We then conversed about such things, including a detour to discuss the shape of the universe and the relative sizes of different infinities, the difference between passive and active hacking (which, fortunately, she is very aware of - this difference needs to be taught in schools!), that she'd spent the morning learning about ciphers in Python in one of the sessions and that she's already up to speed on inspecting web elements and the like... Awesome. This was the highlight of the day for me.

The next Digimakers is on October 29th at At-Bristol. If you are planning on attending you should register in advance as this event is very popular.