Robotics 2: Using the Sharp GP2Y0A41SK0F IR distance sensor with Explorer HAT Pro

The HC-SR04 is a clever component. By measuring the time delay between sending a trigger pulse and receiving the echo, one can easily calculate the distance to an object. Well, that’s the theory. Unfortunately it turns out that it does not play well with the Explorer HAT Pro board it is connected to when using the provided explorerhat library. I’ve observed that ranges to perpendicular objects (which should give the best results) are miscalculated by up to a metre either way. This appears to be a timing issue: the HC-SR04 does not return the distance to an object, but instead holds a connected pin high for the same time it took a pulse of ultrasound to bounce back.

Instead, Zumo George has received an upgrade in the form of the Sharp GP2Y0A41SK0F infrared distance sensor. This has the added benefit of making him look more Goliath* and less WALL-E. It is an analog device that runs at 5V, which matches the Explorer HAT’s analog inputs perfectly. It measures distances accurately from 4cm to 30cm which, for the purpose of “don’t crash into an object”, is perfect.
[Image: Zumo George with the IR distance sensor fitted]
Wiring is straightforward: GND and Vcc go to their respective ground and 5V pins, and the third cable to one of the analog inputs (I use pin four). I bought the sensor on eBay for a few quid and the cable it came with did not have 0.1” pins attached at the breadboard end of things. A quick bit of soldering and heat shrinking later and we’re ready to go.

Next I needed to add to my library of tests for Zumo George to ensure that when he boots up all is A-OK. Let’s write a scenario:

@ir
Scenario: Verify infrared range finder is responding
  Given the distance to an object is greater than 10 cm
  When I read the infrared range finder
  Then I should see a distance greater than 10 cm


Now we execute this with Lettuce. Note though that I have added a tag with the @ symbol to enable me to run just this scenario while we get it working. Hence we:

sudo lettuce -t ir

We need sudo because Pimoroni’s Explorer HAT Pro requires this, and we use the -t parameter to specify the tag to execute.

Immediately we see that our three steps have not been defined, and Lettuce helpfully returns suggested step definitions that assert False. Copying these into place and re-executing moves us from not implemented to not coded, with the three steps each going red.
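For anyone who has not seen it before, the suggestions Lettuce prints are skeletons along these lines (the exact wording below is from memory, so treat it as approximate); each one simply fails until real code replaces the assert:

from lettuce import step

@step(u'the distance to an object is greater than 10 cm')
def the_distance_to_an_object_is_greater_than_10_cm(step):
    assert False, 'This step must be implemented'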

At this point we need to implement some code to make everything work as it should. To do this I have decided to create a new Python module, zumo.py, which will contain the specific functions required by Zumo George. We are going to need a way of determining distance using the GP2Y0A41SK0F sensor, hence in zumo.py I enter the following to create a read_distance() function that we can call from a step definition:

import explorerhat
import time

def read_distance():
  # with help from: http://www.yoctopuce.com/EN/article/an-usb-optical-telemeter
  # and an idea from http://jeremyblythe.blogspot.co.uk/2012/09/raspberry-pi-distance-measuring-sensor.html
  v_readings = []
  # read the voltage 10 times so that we can get a decent average
  for i in range(0, 10):
    v_readings.append(explorerhat.analog.four.read())

  av_voltage = sum(v_readings) / 10.0
  if av_voltage <= 0:
    av_voltage = 0.001  # guard against dividing by zero
  distance_cm = 13 / av_voltage
  return distance_cm


The GP2Y0A41SK0F works on the principle that the distance to an object is inversely proportional to the voltage that can be read from the connection on analog pin four. In other words, the higher the voltage the lower the distance, and our equation takes this into account. The number 13 was determined by looking at the datasheet on the Pololu product page: from the graph we can see that for a reading of 1V we should be about 13cm away from our object, i.e.: 13/1 = 13cm. I got the idea to do things this way from Yoctopuce, who used another Sharp IR sensor. Their magic number was 60 for the GP2Y0A02YK0F, which they obtained in the same way.
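As a quick illustration of the relationship (a throwaway sketch rather than part of zumo.py), the same constant converts any voltage reading into an approximate distance:

def voltage_to_cm(voltage, k=13.0):
  # distance is inversely proportional to voltage: the higher the voltage, the closer the object
  return k / max(voltage, 0.001)

print(voltage_to_cm(1.0))   # ~13cm, straight off the datasheet graph
print(voltage_to_cm(2.0))   # ~6.5cm: double the voltage, half the distance (per this model)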

Into steps.py I then add the following three step definitions, along with the imports they need (including import zumo). The Given sets up the expected result, the When is the event (i.e.: read the distance) and the Then is the comparison. I will be honest and say that in Python I don’t know if using global variables in this way is a good thing or not (note to self: must research this), but it works, so at this stage I am not too concerned:

from lettuce import step

import zumo

@step(u'the distance to an object is greater than (.*?) cm')
def the_distance_to_an_object_is_greater_than(step, distance):
    global minimum_expected_distance
    minimum_expected_distance = distance

@step(u'I read the infrared range finder')
def i_read_the_infrared_range_finder(step):
    global actual_distance
    actual_distance = zumo.read_distance()

@step(u'I should see a distance greater than (.*?) cm')
def i_should_see_a_distance_greater_than(step, expected_distance):
    assert float(actual_distance) > float(minimum_expected_distance), "Distance returned = %d" % actual_distance


Running Lettuce a third time now shows everything is green, meaning our scenario is passing and our code to generate distances is working. Well, sort of. We have to remember that we are now mixing a controlled software environment with real-world robotics and as the adage goes, “anything can happen in the next half hour.” Clearly this scenario will only pass if the distance from Zumo George to an object is greater than 10cm. For me this is perfect as George always starts his working day facing away from obstacles. We could of course change our code to simulate the response from the GP2Y0A41SK0F (not the easiest of components to pronounce, or spell) but then we are not demonstrating the desired real-world behaviour: that when Zumo George is not facing a nearby object he shall be a happy robot ready to drive.
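For completeness, if you ever did want to run the scenario away from the hardware, one option (a sketch, not something Zumo George actually does, and it assumes zumo can be imported without the HAT attached) is to temporarily swap read_distance() for a stub so the steps exercise the same code path with a simulated reading:

import zumo

real_read_distance = zumo.read_distance
zumo.read_distance = lambda: 25.0   # pretend the nearest object is 25cm away
try:
    print(zumo.read_distance())     # the step definitions would now see 25.0
finally:
    zumo.read_distance = real_read_distance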

[Screenshot: Lettuce output for the infrared range finder scenario]

You may notice that something strange has happened though. We are clearly only running one scenario against a single feature by specifying the tag ir, yet Lettuce reports that 2 features have passed (which you would think implies a minimum of two scenarios executed). I think of it this way: by restricting the run to a single tag we have told Lettuce, absolutely, that only a single scenario is within scope, so we are taking responsibility for the other feature and it is counted as passing. This fits with BDD being all-or-nothing: to demonstrate that software is fit for purpose all of our scenarios must pass, and here we are stating that we accept that decision for the out-of-scope feature. Another way to think of it is that as the other feature contains no scenarios that we execute, there is nothing that can fail, i.e.: it passes by default. It is something to be aware of though, as it can make your numbers look a little strange if you are not used to seeing this outcome.

Next time: The Ryanteck 3 Line Follow sensor

* The evil truck from Knight Rider.

Robotics 1: Zumo George, a BDD rover

It is time to build a new rover, and to take a different look at how it is controlled: through a series of behaviours defined using the tenets of Behaviour-Driven Development. BDD is a superb way to undertake development as it emphasises genuine communication and collaboration between business stakeholders, developers and testers. Behaviours are defined as features and scenarios, the latter elicited as specific examples using the Gherkin syntax, for example:

Scenario: Drive forwards
   Given Zumo George is greater than 10cm from a wall
   When power is applied to the motors
   Then Zumo George should drive forwards

BDD scenarios are written before other code, and determine what code is written. Hence no wastage: we develop what is required and it most likely passes first time. When the scenarios (the tests) are executed we can clearly see which steps pass (green), fail (red), or have not yet been implemented (that sort of muddy green-yellowy-brown).

I will be exploring the use of BDD to program this rover in quite some depth over a number of articles, intermixing some electronics to show how BDD is an ideal abstraction to explore behaviour-based robotic control. I’m going to term this BDR, Behaviour-Driven Robotics.

[Image: Lettuce output for Zumo George’s features, including three undefined laser steps]

In the above example you will note two important points. Firstly, in the second feature I am defining scenarios that can be used as an internal diagnostic on Zumo George, for example: “Confirm ultrasonic is responding”. These tests will be executed every time Zumo George boots, and if any test fails then he will flash a red light and not commence roving, which should avoid the problem of an out-of-control robot. Perhaps more importantly, you can see that George has no lasers and has not been programmed to use them in any case (the first three undefined steps). This is fortunate for obvious reasons.
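A hedged sketch of how that boot-time gate could look (the tag name and the explorerhat light call are assumptions on my part, not Zumo George’s actual code):

import subprocess
import explorerhat

# run only the diagnostic scenarios; the test runner exits non-zero if any step fails
failed = subprocess.call(['lettuce', '-t', 'diagnostics']) != 0

if failed:
    explorerhat.light.red.on()   # assumed API: light the red LED and refuse to rove
    print("Self-test failed: staying put")
else:
    print("Self-test passed: ready to rove")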

You will already have noticed that I have named my robot Zumo George and given him agency. I think this is a good thing as he will be expected to mimic certain human behaviours (e.g.: don’t bump your head on a wall while walking / driving). Agency in a robot enables me to mimic human-like behaviours in code. It does however mean I will find myself leaning towards anthropomorphism, referring to Zumo George as “he” rather than “it”.

The chassis of the rover is based on Pololu’s Zumo (hence the rover’s full name) and Pimoroni’s recently released Explorer HAT Pro. The Zumo is available in a number of variants for Arduino such as the all singing Zumo 32U4 which has sensors, buzzer, LCD, accelerometer and more, and also the bare-bones Zumo Chassis Kit which is perfect for the Raspberry Pi as we can add our own electronics.

[Image: Zumo George fully assembled]

The Zumo is very compact, meaning that any Model B is borderline too large. Who would have thought the words “too large” would ever be used to describe the Raspberry Pi! This is because the Zumo is designed to take part in Mini Sumo competitions where the robot must conform to dimensions of 10cm by 10cm. To be honest, a B/B+/v2B would just about squeeze into the available space, and certainly others have created Zumo robots using the B. To minimise the footprint I have opted for a Model A+, which is almost small enough (ah, if only it were 1.5mm thinner; more about this in a later article).

I purchased my Zumo from those excellent people at Pimoroni, and also elected for two 95:1 ratio motors. The motors are intentionally purchased separately to enable you to opt for those that best match your robotics need (essentially outright speed versus torque). You can easily drop in replacement motors if you change your mind at a later date as the plastic motor cover on the chassis is removed with just two screws.

[Image: Zumo George in pieces before assembly]

At this point a confession is in order: I like to tinker with things and see if I can break them. I think this is because my day job is in software testing. Unfortunately this tendency to meddle caused me to break one of the motors, necessitating ordering a third. The lesson quickly learned is never manually rotate the drive shaft of the motor, as you will quickly grind down the gears until it slips horrendously in use. I share this in the hope that you only have to purchase two motors. On the plus side I do have a (slightly crippled) spare motor I can now disassemble to better understand how the gearing works. To be honest the motor mechanism itself is fine, as it is just the cogs that are worn, so there is hope yet that I can resurrect it for a future project.

I considered various options to control Zumo George’s motors and in the end put four possible solutions up against each other in a winner-takes-Zumo knock-out:

PicoBorg is tiny, it really is, and is a great first-step into robotic motor controllers. I’ve had great success using this with a larger robot based on the Magician Chassis. However it is not bi-directional without the use of a pair of 5V relays and including such immediately bulks out the parts list as one also needs a board and cabling to mount them on. Bi-directional capability is essential in a tracked robot (IMHO) as it provides the ability to turn on the spot and not just in an arc. PicoBorg Reverse was briefly considered as a possible solution, but at £31 for the board was felt to be pushing the budget a bit.

The L298N is the go-to staple of bi-directional controllers. With prices in the £1.50 to £4 region it wins on cost. However it is a comparatively bulky thing with a big heat sink rising vertically, and as a 5V device requires additional circuitry for safe usage with the Raspberry Pi. This makes using it in a compact platform somewhat tricky.

The Ryanteck RPi Motor Controller Board is a great bit of kit. It provides bi-directional motors and is compact. Ryan has done an excellent job of documenting the board and providing example code in Python. The GPIO pins are also exposed making it easy to add further electronics (PicoBorg can be mounted on TriBorg for the same effect). I was all set to purchase this board when I spotted an announcement from Pimoroni...

The Pimoroni Explorer HAT Pro is a crazy good board. Using the available easy-peasy Python library one can control both bi-directional motors and take advantage of a large number of other inputs and outputs. These include four capacitive touch sensors, four pads to attach crocodile leads to, four buffered 5V inputs, four buffered 5V outputs, ground pins and buffered 5V analog inputs. Also down one edge of the board is an array of unbuffered 3v3 pins. To finish the board off there is even space for the included mini breadboard to be mounted on top. Coming in at the same size as the A+, this is my new favourite wonder board for the Raspberry Pi.

My parts list is now:

  • Pololu Zumo Chassis
  • 2x 95:1 micro metal gear motors (I needed 3 *ahem*)
  • Pimoroni Explorer HAT Pro with mini breadboard
  • Generic tiny WiFi network adaptor
  • 16GB Integral Micro SD Card
  • Raspberry Pi Camera
  • Bendy arm thing to hold camera up
  • Pimoroni Camera Mount
  • HC-SR04 ultrasonic distance sensor, later replaced with a Sharp GP2Y0A41SK0F IR distance sensor sourced from eBay (see Part Two)
  • 3x plastic legs to raise up the Raspberry Pi Model A (more on this next time)
  • An assortment of wires to hook everything up.
  • Ryanteck 3 Line Follow Sensor (not shown in the above photographs)
You will note two seemingly obvious missing items, namely a battery and some kind of game controller to drive Zumo George for when he is not in auto-roving mode. I am also investigating pan and tilt mechanisms for the camera and / or distance sensor. These parts will be covered in later articles. I have something cunning in mind for the battery but can’t say more about this at present.

Next time: Replacing the HC-SR04 due to a technical hiccup

Some XLoBorg code to get you started

Here’s code to create a graphical compass for your XLoBorg from PiBorg. It’s a work in progress, working fine for 2-dimensional headings (ie: keep the Pi on a flat, level surface). It’s got the code in for 3-dimensional headings using the accelerometer as well as the magnetometer to enable the Pi to be tilted but this is *ahem* not quite working yet (my maths is a bit screwy it seems). I’ll post an update when it’s working good ’n proper.

I’ve included support for Declination which is pretty important to get an accurate compass heading. Just define your location(s) as constants near the top and edit the declinationAngle = BRISTOL / 1000 to equal your location and hey presto.

I also found a very interesting PDF from Honeywell all about how these combined accelerometer + magnetometer compasses work. Definitely worth a read.

# Calculate and generate a graphical display of the compass heading
# calculated from the output of XLoBorg from www.piborg.com

# Author: Colin Deady, 2012.
# Released under Creative Commons Attribution Share Alike,
# http://creativecommons.org/licenses/by-sa/2.5/

# Credit to Bryan Oakley for the MyApp class code that generates the clock face:
# http://stackoverflow.com/questions/6161816/tkinter-how-to-make-tkinter-to-refresh-and-delete-last-lines
# Released under Creative Commons Attribution Share Alike
# (http://stackexchange.com/legal, http://creativecommons.org/licenses/by-sa/2.5/)

# WARNING: THIS IS NOT TO BE USED FOR ANYTHING IMPORTANT!
# THIS IS PURELY PROOF OF CONCEPT FOR FUN.
# NO liability is accepted WHATSOEVER for inaccurate data generated by this program.
# If you use it to navigate the Pacific / Sahara / other place of your choice and
# end up falling over the edge of the world and the last thing you see are some giant
# elephants then that is your get, not mine.

import Tkinter as tk; import time
from math import cos,sin,pi, atan, atan2, floor, asin
import sys

# Load the XLoBorg library
import XLoBorg

# Tell the library to disable diagnostic printouts
#XLoBorg.printFunction = XLoBorg.NoPrint

# Start the XLoBorg module (sets up devices)
XLoBorg.Init()

# define a target direction that we can use to rotate the Pi towards
# this could be built upon to be the target angle that a robot steers towards
targetDirection = 120

# Declination - very important!
# http://en.wikipedia.org/wiki/Declination
# http://www.loveelectronics.co.uk/Tutorials/8/hmc5883l-tutorial-and-arduino-library
# http://www.magnetic-declination.com/
BRISTOL = -2.2
declinationAngle = BRISTOL/1000

# create a "clock face" upon which we can render the compass heading of XLoBorg
class MyApp(tk.Tk):
    def __init__(self, *args, **kwargs):
        tk.Tk.__init__(self, *args, **kwargs)

        self.size = 300

        self.title("Compass")
        self.w = tk.Canvas(self, width=320, height=320, bg="#111", relief="sunken", border=10)
        self.w.pack()

        self.w.create_oval(10,10,330,330, fill="cyan", tags="compassbg1")
        self.w.create_oval(70,70,270,270, fill="blue", tags="compassbg2")
        self.w.create_line(0,0,0,0, fill="red", width="3", tags="compass")
        self.w.create_line(0,0,0,0, fill="yellow", width="3", tag="target")

        legendbg = "#fff"
        uzr1 = tk.Label(self, text="N", bg=legendbg)
        uzr1.place(x=160, y=12)
        uzr2 = tk.Label(self, text="S", bg=legendbg)
        uzr2.place(x=160, y=311)
        uzr3 = tk.Label(self, text="E", bg=legendbg)
        uzr3.place(x=317, y=160)
        uzr4 = tk.Label(self, text="W", bg=legendbg)
        uzr4.place(x=12, y=162)

        e = tk.Button(self, text="Quit", command=self.Quit)
        e.pack()

        self.update_compass()

    def update_compass(self):

        # Read and render the raw compass readings, correcting by our manual calibration values
        compass = XLoBorg.ReadCompassRaw()
        print str(compass[0])+","+str(compass[1])+","+str(compass[2])
        mX = compass[0] - -248
        mY = compass[1] - 370
        mZ = compass[2] - 1384
        # print "mX:"+str(mX)+",mY:"+str(mY)

        # Read the raw accelerometer readings
        accel = XLoBorg.ReadAccelerometer()
        print str(accel[0])+","+str(accel[1])+","+str(accel[2])
        aX = accel[0]
        aY = accel[1]
        aZ = accel[2]

        # some notes on tilt compensation
        # http://www.loveelectronics.co.uk/Tutorials/13/tilt-compensated-compass-arduino-tutorial

        # =======================================================================================
        # The following code does NOT yet work... it should but my maths is a bit screwy
        # When I manage to fix it then we'll have a compass that copes with being tilted, huzzah!
        rollRadians = asin(aY)
        pitchRadians = asin(aX)
        # up to 40 degree tilt max:
        # http://www.loveelectronics.co.uk/Tutorials/13/tilt-compensated-compass-arduino-tutorial
        cosRoll = cos(rollRadians)
        sinRoll = sin(rollRadians)
        cosPitch = cos(pitchRadians)
        sinPitch = sin(pitchRadians)

        vX = mX * cosPitch + mZ * sinPitch
        vY = mX * sinRoll * sinPitch + mY * cosRoll - mZ * sinRoll * cosPitch
        # =======================================================================================

        # get the heading in radians
        heading = atan2(vY, vX)

        # correct for declination
        heading += declinationAngle

        # Correct negative values
        if (heading < 0):
            heading = heading + (2 * pi)

        # Check for wrap due to declination and compensate
        if (heading > 2*pi):
            heading -= 2*pi

        # convert to degrees
        heading = heading * 180/pi

        # get the base degrees
        heading = floor(heading)
        print heading

        angle = heading*pi*2/360
        ox = 165
        oy = 165
        x = ox + self.size*sin(angle)*0.45
        y = oy - self.size*cos(angle)*0.45
        self.w.coords("compass", (ox,oy,x,y))

        angleTarget = targetDirection*pi*2/360
        xt = ox + self.size*sin(angleTarget)*0.45
        yt = oy - self.size*cos(angleTarget)*0.45
        self.w.coords("target", (ox,oy, xt, yt))

        gapAngle = abs(targetDirection - heading) # we want the angle without the sign
        # if the gap angle is more than 180 degrees away then it is closer than we think
        if gapAngle > 180:
            gapAngle = 360 - gapAngle
        if gapAngle < 0:
            print "Target is " + str(gapAngle) + " degrees anticlockwise"
        elif gapAngle > 0:
            print "Target is " + str(gapAngle) + " degrees clockwise"
        else:
            print "Target acquired!"

        self.after(500, self.update_compass)

    def Quit(self):
        self.after(700, self.destroy())

app = MyApp()
app.mainloop()


Remotely access your Raspberry Pi part 1: VNC

If you do not have a monitor permanently attached to your Pi there are several ways to access it remotely. One of the most straightforward is VNC (Virtual Network Computing), which simply displays the desktop of your Pi on another computer that you own.

To use VNC you will first need to configure your Raspberry Pi, and then from your other computer run a VNC client and connect back to the Pi. For the client, try RealVNC which is available for Windows, Mac OS X, Linux and Solaris.

First, configure your Raspberry Pi (configuration will require you to either have a monitor attached or to be connecting from the terminal via ssh):

1) At the terminal run the following commands to install vncserver:

sudo apt-get update
sudo apt-get install vnc-server
vncserver -geometry 1024x768


The first command will ensure that your Raspberry Pi will run any installation commands against the latest available list of packages, ie: it will install the latest version of any program available in the repositories.

The second command installs vncserver. Note that installation is “vnc-server” with the hyphen.

The third command starts the VNC server (this time note the command omits the hyphen) and sets the geometry parameter to indicate that when a connection is made a window containing the Pi’s desktop that is 1024x768 pixels should be sent to the client computer. Change this value to change the screen size of your Pi’s desktop when accessed remotely.

Note that the first time you start vncserver you will be prompted for a password to use when connecting to your Pi remotely (this should not be the same as the password you normally use to log into your Pi!). Ensure that you enter a unique, secure password. You will also be asked to enter an optional password for read-only access - you can skip this second password request for now.

Assuming that vncserver starts successfully you should see something like:

pi@rpi3 ~ $ vncserver -geometry 1024x768

New 'X' desktop is rpi3:1

Starting applications specified in /home/pi/.vnc/xstartup
Log file is /home/pi/.vnc/rpi3:1.log


Note the “rpi3:1” bit. rpi3 is the name of my Raspberry Pi and 1 is the number of the desktop you will be connecting to. We will need this 1 later.
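As an aside, each VNC desktop number maps onto its own TCP port (5900 plus the desktop number), so desktop rpi3:1 is listening on port 5901. If you later want to shut that desktop down again, run:

vncserver -kill :1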

2) On your client computer install the RealVNC Viewer application from the downloads page. Note you should install the Viewer from this page, not the full RealVNC package as you want to connect to your Pi.

3) Back on your Pi run ifconfig at the command prompt in the terminal to determine your IP address. Look for the eth0 entry (assuming your Pi is using a wired network connection and not WiFi; if you are on WiFi look for the wlan0 section instead):

eth0 Link encap:Ethernet HWaddr ****
inet addr:192.168.1.75 Bcast:192.168.1.255 Mask:255.255.255.0
...

The important information you will need is the IP address, which in my case is 192.168.1.75 (yours will vary). This is the address we will connect to from RealVNC Viewer.

4) Back on your client computer (I’m using an Apple Mac for this example, but these instructions will work on Windows as well), run VNC Viewer and enter the IP address of your Pi followed by :1 (the 1 we noted at the end of step 1).

[Screenshot: entering the Pi’s IP address and desktop number in VNC Viewer]

5) Click Connect. Immediately you will see a warning message. This is important - the connection you are creating is not encrypted, which means that if you type sensitive information into your Pi via this method (eg: passwords) then it can potentially be intercepted by a third party. This includes the password used to access your Raspberry Pi via VNC (hence choosing a different password from your normal login password when starting vncserver for the first time). As a result you should not use this unencrypted method to access your Pi over the internet! And you should exercise care when using this method even on your own local network.

[Screenshot: VNC Viewer warning about the unencrypted connection]


6) Assuming we accept the risk, click [Continue].

7) You will be prompted for the password you entered to access VNC earlier.

[Screenshot: VNC Viewer password prompt]

8) Click OK after entering your password. You will see a message stating that RealVNC Viewer is connecting to your Pi. After a few seconds you should see your Pi’s desktop on your own computer. You can now run programs on your Raspberry Pi directly from your client computer.

[Screenshot: the Pi’s desktop displayed remotely in VNC Viewer]

9) Note that if you reboot your Pi you will need to restart vncserver. vncserver can be set to run automatically after a reboot, but given the unencrypted connection being used it is suggested that you do not run it in this form unless needed. Encrypting your VNC connection is beyond the scope of this introductory tutorial but will be covered in the future. However, if you plan on taking your Raspberry Pi to an event and want vncserver to start automatically, then those great people at Adafruit have straightforward instructions ready written for you to follow:

http://learn.adafruit.com/adafruit-raspberry-pi-lesson-7-remote-control-with-vnc/running-vncserver-at-startup

Note that for this step to work you will also need to ensure that your Raspberry Pi boots straight into the desktop. Fortunately Adafruit have this covered too.

http://learn.adafruit.com/adafruits-raspberry-pi-lesson-2-first-time-configuration/booting-into-desktop

Nokia 3310 LCD board for Raspberry Pi

I recently ordered a Nokia 3310 LCD shield from Texy. The beauty is that it comes pre-assembled with 5 micro switches and the 84x48 LCD, so no soldering is required. The price was pretty superb too at £14 + £4 postage. I looked up the list price of just the LCD and it is about £10, hence £8 extra for a fully assembled board delivered to my door is a bargain.

I have some interesting plans afoot for this little board.

To get it up and running I needed to install WiringPi from Gordons Projects to work with Texy’s sample code. Be sure to install WiringPi for Python though, else you’ll sit there confused like I did, wondering what went wrong ;)

[Image: Texy’s Nokia 3310 LCD board mounted on the Raspberry Pi, displaying the time]

Something that is most interesting about the 3310 LCD is that it has no internal font set. Consequently you have to define all characters in hex up-front. Luckily (and VERY thoughtfully) Texy provides sample Python code to introduce the functionality of the board, which includes a fairly small default font. Calling:

text('Hello world')

will display the text string on-screen. One of the first things I did was to port a pretty good large font to Python from C, the latter courtesy of Petras Sadulkis. The large font takes up three rows per character, meaning I had to loop through each row of the array, which takes me into my favourite world of multi-dimensional arrays :) (ask me sometime why I love multidimensional arrays so much...) You can see the board outputting the current time (HH:MM, SS.ms) using this larger font.

Here’s some code to get large text working for you for numbers and a few characters (add this into Texy’s sample code), and I apologise in advance for the gratuitous hex:

# Note: LARGEFONT (defined below) must appear above these functions in your file,
# as it is used as a default argument value.
def display_largechar(char, character_position, display_on_row, font=LARGEFONT):
  try:
    gotorc(0 + display_on_row, character_position)
    for value in font[char][0]:
      lcd_data(value)
    gotorc(1 + display_on_row, character_position)
    for value in font[char][1]:
      lcd_data(value)
    gotorc(2 + display_on_row, character_position)
    for value in font[char][2]:
      lcd_data(value)
    lcd_data(0) # Space in between characters.
  except KeyError:
    pass # Ignore undefined characters.


def largetext(string, display_on_row, font=LARGEFONT):
  character_position = 0
  for char in string:
    display_largechar(char, character_position, display_on_row, font)
    character_position += 2

# Based on http://mbed.org/users/snatch59/code/N3310LCD/
LARGEFONT = {
  '0': [
    [0x00,0x00,0xc0,0xe0,0x70,0x30,0x30,0x30,0x70,0xe0,0xc0,0x00,0x00,0x00,0x00,0x00],
    [0x00,0xff,0xff,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0xff,0xff,0x00,0x00,0x00,0x00],
    [0x00,0x07,0x1f,0x38,0x70,0x60,0x60,0x60,0x70,0x38,0x1f,0x07,0x00,0x00,0x00,0x00],
  ],
  '1': [
    [0x00,0x00,0x00,0xc0,0xc0,0xc0,0xf0,0xf0,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00],
    [0x00,0x00,0x00,0x00,0x00,0x00,0xff,0xff,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00],
    [0x00,0x00,0x00,0x60,0x60,0x60,0x7f,0x7f,0x60,0x60,0x60,0x00,0x00,0x00,0x00,0x00]
  ],
  '2': [
    [0x00,0xe0,0x60,0x70,0x30,0x30,0x30,0x30,0x60,0xe0,0x80,0x00,0x00,0x00,0x00,0x00],
    [0x00,0x00,0x00,0x00,0x00,0x00,0x80,0xc0,0xf0,0x3f,0x1f,0x00,0x00,0x00,0x00,0x00],
    [0x00,0x70,0x78,0x7c,0x6e,0x67,0x63,0x61,0x60,0x60,0x60,0x60,0x00,0x00,0x00,0x00],
  ],
  '3': [
    [0x00,0xe0,0x60,0x70,0x30,0x30,0x30,0x30,0x30,0x60,0xe0,0xc0,0x00,0x00,0x00,0x00],
    [0x00,0x00,0x00,0x00,0x00,0x18,0x18,0x18,0x3c,0x7c,0xe7,0xc3,0x00,0x00,0x00,0x00],
    [0x00,0x38,0x30,0x70,0x60,0x60,0x60,0x60,0x70,0x38,0x1f,0x0f,0x00,0x00,0x00,0x00],
  ],
  '4': [
    [0x00,0x00,0x00,0x00,0x00,0x00,0x80,0xc0,0x60,0xf0,0xf0,0x00,0x00,0x00,0x00,0x00],
    [0x00,0xe0,0xf0,0xdc,0xce,0xc7,0xc1,0xc0,0xc0,0xff,0xff,0x00,0x00,0x00,0x00,0x00],
    [0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x7f,0x7f,0x00,0x00,0x00,0x00,0x00],
  ],
  '5': [
    [0x00,0x00,0xf0,0xf0,0x30,0x30,0x30,0x30,0x30,0x30,0x30,0x00,0x00,0x00,0x00,0x00],
    [0x00,0x00,0x1f,0x1f,0x18,0x18,0x18,0x18,0x30,0xf0,0xc0,0x00,0x00,0x00,0x00,0x00],
    [0x00,0x30,0x70,0x60,0x60,0x60,0x60,0x70,0x38,0x1f,0x0f,0x00,0x00,0x00,0x00,0x00],
  ],
  '6': [
    [0x00,0x00,0x80,0xc0,0xe0,0x60,0x30,0x30,0x30,0x30,0x30,0x00,0x00,0x00,0x00,0x00],
    [0x00,0xfc,0xff,0x33,0x18,0x18,0x18,0x18,0x18,0x30,0xf0,0xc0,0x00,0x00,0x00,0x00],
    [0x00,0x07,0x1f,0x38,0x70,0x60,0x60,0x60,0x60,0x30,0x1f,0x0f,0x00,0x00,0x00,0x00],
  ],
  '7': [
    [0x00,0x30,0x30,0x30,0x30,0x30,0x30,0x30,0xb0,0xf0,0xf0,0x00,0x00,0x00,0x00,0x00],
    [0x00,0x00,0x00,0x00,0xc0,0xf0,0x78,0x1e,0x07,0x01,0x00,0x00,0x00,0x00,0x00,0x00],
    [0x00,0x60,0x78,0x3e,0x0f,0x03,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00,0x00],
  ],
  '8': [
    [0x00,0x80,0xe0,0x60,0x30,0x30,0x30,0x30,0x30,0x60,0xe0,0xc0,0x00,0x00,0x00,0x00],
    [0x00,0x87,0xe7,0x6c,0x18,0x18,0x30,0x30,0x38,0x6c,0xc7,0x83,0x00,0x00,0x00,0x00],
    [0x00,0x0f,0x3f,0x38,0x70,0x60,0x60,0x60,0x60,0x30,0x1f,0x0f,0x00,0x00,0x00,0x00],
  ],
  '9': [
    [0x00,0x80,0xc0,0x60,0x30,0x30,0x30,0x30,0x70,0xe0,0xc0,0x00,0x00,0x00,0x00,0x00],
    [0x00,0x0f,0x3f,0x30,0x60,0x60,0x60,0x60,0x60,0x30,0xff,0xff,0x00,0x00,0x00,0x00],
    [0x00,0x00,0x00,0x60,0x60,0x60,0x60,0x70,0x38,0x1e,0x0f,0x03,0x00,0x00,0x00,0x00],
  ],
  ':': [
        [0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
        [0,00,56,56,56,56,56,56,56,56,56,0,0,0,0,0],
        [0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0]
  ],
  '+': [
        [0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
        [0,0,64,64,64,64,64,254,254,64,64,64,64,64,0,0],
        [0,0,0,0,0,0,0,15,15,0,0,0,0,0,0,0]
  ],
  '-': [
        [0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],
        [0,64,64,64,64,64,64,0,0,0,0,0,0,0,0,0],
        [0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0]
  ],  
}

from datetime import datetime

# clear the screen then display the time HH:MM on one row and SS:ms on another
cls()
now = datetime.now()
largetext(now.strftime("%H:%M"),0)
largetext(now.strftime("%S.%f")[:5],3)
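To keep the display ticking over like the photo above, a minimal refresh loop does the trick (a sketch that assumes Texy’s cls() and the largetext() function above are already loaded):

import time
from datetime import datetime

while True:
  now = datetime.now()
  cls()
  largetext(now.strftime("%H:%M"), 0)
  largetext(now.strftime("%S.%f")[:5], 3)
  time.sleep(0.1)  # redraw ten times a second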

With the large font in place and the time display working I can move on to adding a stop watch, along with some configuration menus for brightness, contrast and a few other parameters. Once this is done the fun begins as I’ll be using it to remotely control my two Nikon cameras using the time and stop watch functionality to control when the Pi triggers the cameras to take a photo.

This is a great board to experiment with. It brings out the hex in all of us ;) Luckily there are tools online to help out with generating hex images in case you need them:

http://www.quinapalus.com/hd44780udg.html (online character generator - this is pretty nifty)
http://en.radzio.dxp.pl/bitmap_converter/ (LCD Assistant: windows tool for converting monochromatic bitmaps to hex data arrays)

As you can probably tell I’m very pleased with this shield. If I were being picky I would say, as Texy notes, that it “almost” fits within the credit-card form factor of the Pi, with the red board under the LCD just poking out a little. I’m hoping this can be sorted, as it means the board won’t fit inside tight-fitting cases designed for the Pi. But apart from that, this shield gets a thumbs up from me.

The MagPi issue 8 is out

The MagPi issue 8 has been released, featuring my second ever published program as I take over The Python Pit again. This time around I show how to use Python’s subprocess to create desktop widgets. Think of the gadgets that Windows Vista and 7 have and you won’t be far off.

The Python program has an intentional flaw, but I only reveal general details about it to the readership. This is by design: one of the things I always liked about programs in those 1980s computer magazines (you know, the ones that listed the code and you had to type it in yourself, back in the days before cover cassettes / disks or discs became the norm) was that you often had to finish the program off, or improve it, to get it to work _just_right_.

If you are reading this, I’ll let you in on a bit more detail to help you out.

As I note at the end of the article, refreshing the Pygame screen for each widget occurs at the same time as each checks for new content. The problem this causes is that if I drag another window on top of either Pygame window it will blank the output until the next check for content, which could be hours away. The fix is quite straightforward: add an if statement that uses datetime to determine when to check for new content, and change time.sleep(28800) and time.sleep(3600) to both be time.sleep(1). This means that each Pygame widget’s screen will refresh every second (change this to 0.1 for a faster refresh, every tenth of a second, at the cost of higher CPU usage by Python), BUT a check for new content will still only happen periodically, when the new datetime value is suitably different from the old datetime value. You do NOT want to just change to time.sleep(1) on its own, as this would cause both URLs to be queried every second for new content, which is much too frequent.
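If that description is hard to follow in the abstract, here is a hedged sketch of the structure (illustrative names only, not the actual MagPi listing): the window is repainted every second, but new content is only fetched once enough time has passed since the last check.

import time
from datetime import datetime, timedelta

CHECK_INTERVAL = timedelta(hours=1)   # stands in for the original time.sleep(3600)
last_check = datetime.min             # forces a content check on the very first pass
content = ""

while True:
    now = datetime.now()
    if now - last_check >= CHECK_INTERVAL:
        content = "fetched at " + now.isoformat()   # stand-in for the real content fetch
        last_check = now
    print(content)        # stand-in for redrawing the Pygame window
    time.sleep(1)         # repaint every second; use 0.1 for a faster refresh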

LedBorg arrives: time for some pretty GPIO-driven colours

Yeay! I bought a LedBorg recently from PiBorg (only cost a few quid) and it has proven very straightforward to set up and use. LedBorg is essentially 3 LEDs on a very small (as in VERY _small_) expansion board for the Raspberry Pi. It is about the width, and half the length, of my index finger, making it possibly the teeniest hardware add-on I have ever installed.

LedBorg’s LEDs can be set to off, low or high for each of the red, green and blue diodes. In combination this can be used to generate up to 26 colours. The setup instructions are clear and easy to follow. Just make sure that if you have a 512MB Raspberry Pi that you follow the Rev 2 instructions that are on the page.

Changing the colour is as easy as issuing the following from the terminal:
echo "RGB" > /dev/ledborg
replacing RGB with the off (0), low (1) or high (2) value for each of the red, green and blue diodes.

A few examples:

White
echo "222" > /dev/ledborg
(all LEDs at maximum output)

Black
echo "000" > /dev/ledborg
(all LEDs are off - strictly speaking the LedBorg is not actually outputting black... it is off)

Red
echo "200" > /dev/ledborg

Darker Red
echo "100" > /dev/ledborg
(appears less intense than full Red’s 200)

Orange
echo "220" > /dev/ledborg

Magenta
echo "202" > /dev/ledborg


Ok, so what practical uses can it be put to? Anything from a random colour generator, to colour waves, to a CPU usage monitor. In the latter case the demo application that can be installed from PiBorg will change LedBorg’s colour output from green to red when the CPU spikes. Here’s an example of a random colour generator that I wrote quickly in Python:

import random, time

while True:
    lbRed = random.randrange(0,3)    # 0 = off, 1 = low, 2 = high
    lbGreen = random.randrange(0,3)
    lbBlue = random.randrange(0,3)

    lbColour = str(lbRed)+str(lbGreen)+str(lbBlue)

    # writing the three-digit string to /dev/ledborg sets the colour immediately
    LedBorg = open('/dev/ledborg', 'w')
    LedBorg.write(lbColour)
    del LedBorg

    print lbColour
    time.sleep(0.1)


This will pick a random value between 0 and 2 for each of the 3 diodes, set these values (causing the LedBorg to light up accordingly) and then print the colour values selected to the terminal. Note that the range is specified as 0,3 because, as tutorialspoint explains, the second value is the stop value and is excluded from the range. Try changing this to 0,2 and you will see that the program never outputs the number 2 (ie: high, LED on maximum) to the terminal. The Python docs do not make this subtlety obvious.
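A quick throwaway check makes the exclusive stop value obvious:

import random

samples = set(random.randrange(0, 3) for i in range(1000))
print(samples)   # only ever contains 0, 1 and 2: the stop value 3 is excluded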

I have various plans for LedBorg using web.py, AJAX and my Android phone for remote control of my Pi when not connected via SSH and with no monitor plugged in. I can see LedBorg as a very handy gadget to give visual feedback that whatever I set on my Android phone has been so set on the Pi.

PiBorg also make the PiBorg (unsurprisingly), an interesting robot controller for the Raspberry Pi. Here’s a thought: if you cross a LedBorg with a PiBorg do you get a Cylon?

Remember the Pi (GTD with RTM on RPi)

You have a Raspberry Pi (or you have placed an order for one). You have a ton of project ideas that you would like to try out, but how are you going to keep track of them? Bits of paper? A notes app on your computer or phone? How are you going to keep your notes in sync as new ideas, and updates to existing ones, come to mind?

This is where Remember The Milk comes in.

Case your Pi part 2: the case arrives and is assembled

This morning I assembled the case that I previously bought from Shropshire Linux User Group via their eBay store.

The case is made from high quality pre-cut acrylic components ready to be assembled. It comes with 8 nuts and bolts to fasten it all together (mine came with 9 - always good to have a spare) and 4 rubber feet. I am a big fan of clear acrylic, as it is very strong and unobtrusive. It is an excellent choice for the casing material.

Read on for assembly instructions (and note the instructions that come with the case, and the YouTube video of the case designers putting it together).

Using an Android phone as an XBMC remote control

I’ve been a little quiet on the RPi front for a week while enjoying XBMC working pretty well. However one thing has been bothering me: the lack of a decent remote control.

Case your Pi

Turns out there are already several case options available for my RPi. I went for one from Shropshire Linux User Group via the eBay link they provide. It cost me a tenner and should arrive in a few days. I like this one because:

a) it is available NOW (and my RPi looks naked and rather vulnerable at the moment!)
b) it fully encloses the RPi.

More info on the case when I receive it.

--

23/05/2012 Update: it arrived! Tomorrow I will have time to assemble it and post a photo.

XBMC Huzzah!

My N570-powered Asus Eee netbook took 12 hours to build OpenELEC for the RPi. And now I have XBMC up and running :)