George visits Digimakers to learn about whiskers

I spent yesterday at Bristol Digimakers having a fantastic time meeting lots of young people who had come along to the event to learn more about coding, robotics, Minecraft, robotics and robotics. There was definitely a theme going on. Digimakers has grown to be the place to go for hands-on experience of hacking and making. Backed by the University of Bristol (kudos to the ever-energetic Caroline, who does a great deal of the organising) and supported by a host of students and other individuals running their latest coding and hardware inventions, it had a great vibe all day long.

As usual at Digimakers I set up a table with various demonstrations using Raspberry Pis, mainly focused on Zumo George, my Behaviour-Driven Development robot. This time around I also included some Crafty Robots, a Hexbug Ant and Cartmanzilla.


Cartmanzilla towers over the city. The little robots wonder how they will escape.

The aim of my table was to present two concepts. Firstly, programming robots by defining behavioural outcomes (a right-to-left approach, for example Event Storming), rather than a list of functional requirements (a left-to-right approach that may not lead to the desired outcome), allows non-technical people to be more involved in the creation of the robots that will share their environment. I've written about BDD with Zumo George before. Distilling the essence of BDD (conversations that discover outcomes of value, which enable us to write tests that drive code) down to something easily digestible by youngsters proved challenging, but in general most seemed to understand. I think this was helped by having a working demonstration: Zumo George was given the behaviour "don't get caught by Cartmanzilla", which in practical terms meant using his inbuilt IR sensor to retreat when Cartmanzilla approached and to advance when Cartmanzilla retreated (all over the top of a lovely cityscape given to me by the great Tim Cox).

Secondly, I wanted to explore the idea of how prey avoids predators (and how predators catch prey) by looking at three different robots:
  • Crafty Robot just moves randomly and cannot react to external stimuli (it can, however, sometimes bounce off things it bumps into).
  • Hexbug Ant has bump sensors front and rear and therefore can run away from anything it touches.
  • Zumo George can sort-of see (via his infrared sensor) what is in front and respond accordingly.
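The three strategies above can be sketched as simple policies. This is purely an illustrative model of my own (the function names and the dictionary of sensor readings are invented for the sketch, not code from the actual robots):

```python
import random

def random_policy(_reading):
    """Crafty Robot: ignore the world entirely and pick any direction."""
    return random.choice(["forward", "back", "left", "right"])

def touch_policy(reading):
    """Hexbug Ant: run away from whatever the bump sensors touch."""
    if reading.get("bump_front"):
        return "back"     # touched at the front: reverse away
    if reading.get("bump_rear"):
        return "forward"  # touched at the rear: run forwards
    return "forward"      # nothing touched: keep wandering

def sight_policy(reading, safe_cm=15):
    """Zumo George: keep a safe distance using the IR range estimate."""
    d = reading.get("ir_distance_cm")
    if d is None:
        return "forward"  # nothing in view
    if d < safe_cm:
        return "back"     # too close: retreat
    if d > safe_cm:
        return "forward"  # safely far: advance
    return "halt"         # exactly on the boundary
```

Sight is the only policy that can react before contact, which is why it wins the mouse-versus-cat question below.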
After playing with Cartmanzilla and the robots I asked two questions of the youngsters who came to my table:
  • If you were a mouse escaping from a cat which method (random, touch, sight) would you use to keep away from the cat?
  • If you were a cat trying to catch a mouse which method would you use?
For the first question everyone said sight, which is the obvious answer: assuming there is enough light for the mouse to see, sight keeps a decent distance between it and the claws. For the second I was genuinely surprised that about a third of the students realised the cat would likely use a combination of sight and touch. Cats do just this: as they approach prey they primarily use sight, but when they make the final strike their whiskers swing forward to make contact with the prey, which helps guide their jaws and claws in. To help reinforce this point I played a snippet from a BBC documentary that covers exactly this:

Watch the whole video, or skip forward to 2m15s where they explain why and show a cat doing this. As the cat gets very close to the mouse it can no longer focus, so it uses its whiskers to guide the prey to its mouth. If you have a pet cat you can likely see this in action: if your cat chases string or small toys, drag a toy in front of the cat to get it to almost-but-not-quite pounce (you may need to do this several times!). When the cat thinks about pouncing but then gives up, you can often (it's quick) see its whiskers twitch: that's the reflex starting to move them forwards (but stopping as the cat gives in). It is harder to see if the cat does pounce, as this happens in the blink of an eye.

The interesting thing here is that my robot, Zumo George, would benefit from exactly this kind of whisker technology. The Sharp GP2Y0A41SK0F infrared sensor is effective from about 4cm to 30cm, so when an object is closer than ~4cm the sensor's vision is "blurred" and ineffective. This can be seen on the data sheet for the sensor in the graph on page four, which I have reproduced below. The graph shows the voltage returned on the analog pin for a given distance. Below about 3-4cm the output voltage becomes wildly inaccurate. This is the point at which George's vision blurs, resulting in him sometimes advancing and sometimes retreating, seemingly at random: at this distance he becomes little better at avoiding Cartmanzilla than the Crafty Robots.
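To give a feel for how that analog voltage becomes a distance, here is a rough sketch. The power-law constants are my own approximate fit to the shape of the data-sheet curve, not official figures, and the code deliberately refuses to trust readings outside the sensor's usable band:

```python
def ir_voltage_to_cm(voltage):
    """Approximate distance for a Sharp GP2Y0A41SK0F from its output voltage.

    The constants 12.08 and -1.058 are a rough power-law fit to the
    data-sheet curve, valid only in the ~4cm-30cm band. At higher voltages
    the curve doubles back on itself (the object is closer than ~4cm), so a
    single voltage no longer maps to a single distance: George is "blurred".
    """
    if voltage > 2.3 or voltage < 0.4:
        return None  # outside the usable band: don't pretend we can see
    return 12.08 * voltage ** -1.058
```

Returning `None` in the blind zone is one way to make the "blur" explicit: the calling code then knows it has no reliable reading, rather than acting on a distance that is effectively random.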


Fortunately this is generally not a problem, as we define the behaviour of George such that he should not get within 4cm of Cartmanzilla in a .feature file that our Behaviour-Driven Development tool of choice (Lettuce in my case) can parse:

Feature: Move around the city avoiding monsters
In order for Zumo George to keep a safe distance from the monsters
As Zumo George
I want to retreat when the monsters get near

- Retreat if a monster is less than 15 cm away
- Advance if a monster is greater than 15 cm away

From the above feature we have articulated and agreed a general outcome: don't get trodden on by Cartmanzilla as it will ruin your day. We then continue the conversation to discover scenarios of importance. It turns out that there are three, wrapped up in a single line of background in which we agree how close to Cartmanzilla we think is safe, and we add these to the .feature file:

Background:
Given the minimum distance to the monster is 15cm

Scenario: Advance, there are no monsters
When the distance to the monster is 16cm
Then I should "advance" in relation to the monster

Scenario: Stand still, hoping the monster won't notice me
When the distance to the monster is 15cm
Then I should "halt" in relation to the monster

Scenario: Retreat, there are monsters
When the distance to the monster is 14cm
Then I should "flee" in relation to the monsters

As you can see, we have defined George's behaviour to be that he should attempt, whenever possible, to stay at least 15cm from Cartmanzilla (the monster).
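Behind the scenes, each scenario step needs glue code. This is an illustrative sketch of the decision logic the Lettuce step definitions might exercise; the function name and structure are my own invention, not George's actual code:

```python
def action_for(distance_cm, minimum_cm=15):
    """Decide George's behaviour from the distance to the monster."""
    if distance_cm > minimum_cm:
        return "advance"  # safely far away: close the gap
    if distance_cm == minimum_cm:
        return "halt"     # right on the boundary: stand still
    return "flee"         # too close: retreat
```

Each "Then I should ..." step then simply asserts that the function returns the quoted behaviour for the distance given in the "When" step, so the three scenarios pin down all three branches.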

Behaviour-Driven Development works when people have conversations to discover new and different outcomes. It was great to work with the youngsters at my table to vary the minimum distance for George. We could immediately see the little robot scurry backwards when we decided the outcome was that it was unsafe to be so close to Cartmanzilla, or scuttle forwards to a new minimum distance when we felt it was safe to be closer. Being able to talk about, model and try out the effects of varying outcomes in a safe way, without causing George to immediately run amok and leap to certain doom from the table edge, was great. The kids definitely seemed to enjoy this modelling exercise, and I did too.

Across the rest of the event a large number of other robots could be seen. Here's Steve. He talks back when you talk to him (and sometimes makes embarrassing mistakes):

This is Steve. Steve was apparently "getting a bit murderous".

Tim Cox ran an excellent workshop and had set up a cityscape full of vehicles, interconnected traffic lights each using PiStop (available from 4tronix) and an enormous (by comparison) Me Arm controllable by a Raspberry Pi and, I'm guessing, python-curses, judging by the look of the output on the screen. I was impressed with the Me Arm. I have previously done something similar using the Maplin Robot Arm and the Pi, but I don't like the imprecise geared motors in the Maplin arm. By contrast the Me Arm was much more precise, even though it too is not using stepper motors.
"Watch out for the Evil Claw" cried the residents of PiCity.

Someone (sorry Someone, I didn't catch your name) had created a Sentry Gun complete with the required "beep......beep.....beep...beep..beep.beep.beep" heard in the film Aliens from the related motion tracker technology.
If you hear this noise run away and hide.

A couple of students presented a fruit-music maker connected to a Raspberry Pi. Their approach was different to what I have seen before, as they were not relying on completing a circuit (touch a wire and, with your other hand, touch the fruit to make a sound play), but were instead relying on (we think) a capacitive drop when you touched the fruit ("touch the kiwi fruit and absorb its power")... or perhaps it was due to electromagnetic fields. They are currently going through a process of elimination as they learn exactly how this works. However it worked, it worked well.

Play that funky banana!

Various other workshops and exhibits ran throughout the day, including working with a BBC Buggy and, separately, Hedgehog Bots controlled by Arduino Nano and invented by Scott and Joe, graduates of the University of Bristol. There was also a horizontal scrolling game controlled by a device one wears that picks up on electrical activity in the brain; you moved up by thinking more and down by thinking less... it was important not to actively think about thinking less. Sadly I forgot to get a photo of these great projects.

Saving the best to last, there was Josh, who presented an AWESOME Persistence of Vision project: several rows of LEDs spinning at about 1000RPM (I think that was the speed...). He had animations running and could draw and persist a cube, the time and all sorts of other patterns. It looked great, was a tidy build and captivated us all like moths to a light bulb.

Oooooh shiny.

Digimakers has again lived up to expectations with Caroline and the team keeping everything running smoothly throughout the day.

The next event is currently scheduled for June; I hope to see you there.