MagPi Issue 72: AI made easy with Raspberry Pi

BUY IN PRINT WORLDWIDE! magpi.cc/store

The official Raspberry Pi magazine Issue 72 August 2018 raspberrypi.org/magpi

MATHEMATICA & MINECRAFT
Building blocks by the numbers

LEARN VIDEO GAME LOGIC
Discover the brains running games

MANAGE YOUR FILES MADE EASY
Use File Manager to full effect

Artificial intelligence projects for Raspberry Pi

KNIT YOUR OWN SENSOR
Craft an interface for Raspberry Pi

HOW TO BUILD YOUR OWN MINI MAGIC MIRROR

MAKE MUSIC
Create some noise with Raspberry Pi

magpi.cc/store



Welcome

WELCOME TO THE OFFICIAL MAGAZINE
SEE PAGE 14 FOR DETAILS

A wise maker called Ben Driscoll once said: "If you ever code something that 'feels like a hack but it works,' just remember that a CPU is literally a rock that we tricked into thinking."

This month, we're putting that 'hack that works' concept into the trickiest of thinking concepts: artificial intelligence (AI).

AI works right at the coalface of computing. It's exciting, complicated, baffling, and brilliant. But you don't need a PhD in neural networks to get our AI services up and running. These projects are picked to help you understand AI with the minimum of fuss.

The Raspberry Pi is a perfect fit for AI. Because while you might need fast CPUs and multiple GPUs to create AI models (the decision-making plans), you can run models on microcomputers like the Raspberry Pi and access powerful AI tools via cloud services.

And the Raspberry Pi acts as the interface to physical computing hardware, turning those mysterious AI models into practical real-world projects.

AI is one of the most misunderstood technologies in the modern world. A sprinkling of AI dust on a physical computing project can create a seemingly magical device; read on and you'll discover how down-to-earth AI really is.

Lucy Hattersley
Editor

THIS MONTH:

16 ARTIFICIAL INTELLIGENCE
Get to grips with AI using a Raspberry Pi

28 GHOST DETECTOR
The non-existence of spirits won't stop our maker

66 MAKING MUSIC
Turn your Pi into a home recording studio

88 MEDICAL MARVEL
We chat to inventive nurse Ernesto Holguin

FIND US ONLINE raspberrypi.org/magpi GET IN TOUCH [email protected]

EDITORIAL
Editor: Lucy Hattersley, [email protected]
Features Editor: Rob Zwetsloot, [email protected]
Sub Editors: Phil King and Jem Roberts

DESIGN
Critical Media: criticalmedia.co.uk
Head of Design: Dougal Matthews
Designers: Mike Kay and Lee Allen
Illustrator: Sam Alder

PUBLISHING
Publishing Director: Russell Barnes
For advertising & licensing: [email protected] | +44 (0)7904 766523
Director of Communications: Liz Upton
CEO: Eben Upton

DISTRIBUTION
Seymour Distribution Ltd
2 East Poultry Ave, London EC1A 9PT | +44 (0)207 429 4000

SUBSCRIPTIONS
Raspberry Pi Press, Mann Enterprises, Unit E, Brocks Business Centre, Haverhill, CB9 8QP
To subscribe: magpi.cc/subscribe
To get help: magpi.cc/subshelp

CONTRIBUTORS
Wes Archer, Guenter Bartsch, Alex Bate, Brian Beuken, Mike Cook, David Crookes, PJ Evans, Nicola King, Joe McLoone, Martin O'Hanlon, KG Orphanides, David Pride, Jose Marcial Portilla, Nik Rawlinson, Matt Richardson, Richard Smedley, Mark Vanstone, Clive Webster, Robert Zakon

This magazine is printed on paper sourced from The MagPi magazine is published by Raspberry Pi (Trading) Ltd., 30 Station Road, Cambridge, CB1 2JH. The publisher,
sustainable forests and the printer operates an editor, and contributors accept no responsibility in respect of any omissions or errors relating to goods, products or
environmental management system which has services referred to or advertised in the magazine. Except where otherwise noted, content in this magazine is licensed
been assessed as conforming to ISO 14001. under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported (CC BY-NC-SA 3.0). ISSN: 2051-9982.

raspberrypi.org/magpi August 2018 3

Contents
Issue 72 August 2018 raspberrypi.org/magpi

COVER FEATURE
AI MADE EASY 16

TUTORIALS
> PI 101: USING THE FILE MANAGER 36
The humble graphical file manager is very powerful
> KNIT YOUR OWN SENSOR 38
Stretch your imagination in this month's Pi Bakery
> LEARN PYGAME ZERO: PART 2 44
We get into the basics of Pygame logic
> BUILD A MINI MAGIC MIRROR 50
A bit of bedside magic
> DOCUMENTING YOUR CODE 52
Build a site to automatically document Python code
> MODDING MINECRAFT PI 58
Use Mathematica to change your Minecraft world
> MAKE GAMES IN C PART 8 60
Time to optimise and speed up your game

IN THE NEWS
NEW RASPBIAN 06
Big updates to the Pi operating system
PI WARS 2019 08
Start your robot engines
EASY GPIO 10
Pi Zeros with added colour
Some of the tools and techniques shown in The MagPi magazine are dangerous unless used with skill, experience, and appropriate personal protection equipment. While we
attempt to guide the reader, ultimately you are responsible for your own safety and understanding the limits of yourself and your equipment. Children should be supervised.
Raspberry Pi (Trading) Ltd does not accept responsibility for any injuries, damage to equipment, or costs incurred from projects, tutorials or suggestions in The MagPi
magazine. Laws and regulations covering many of the topics in The MagPi magazine are different between countries, and are always subject to change. You are responsible
for understanding the requirements in your jurisdiction and ensuring that you comply with them. Some manufacturers place limits on the use of their hardware which some
projects or suggestions in The MagPi magazine may go beyond. It is your responsibility to understand the manufacturer’s limits.


Contents

THE BIG FEATURE
MAKE MUSIC 66
Turn your Raspberry Pi into a band

REGULARS
> NEWS 06
> TECHNICAL FAQ 64
> BOOK REVIEWS 80
> FINAL WORD 98

YOUR PROJECTS
GHOST DETECTOR 28
Who exactly are you gonna call?
SQUIRREL CAFE 30
A cafe for squirrels, not one you visit
SONIC PI GLOCKENSPIEL 32
Hit the right notes with Sonic Pi
NEMO-PI 34
Conservation of one of nature's wonders

COMMUNITY
> THE MONTH IN PI 84
The latest buzz in the Pi universe
> MEDICAL PI INTERVIEW 88
Ernesto Holguin uses Pi to fight diabetes
> SWAY GRANTHAM PROFILE 90
A teacher trying to make a difference
> UPCOMING EVENTS 92
Where to meet up with the community
> LETTERS 94
You ask us questions, we answer them

REVIEWS
> POE HAT 74
> BEOCREATE 76
> JUICEBOX ZERO 78


News RASPBIAN UPDATED

RASPBIAN UPDATED
Latest OS update gets setup wizard, app store and The MagPi

The Raspberry Pi Foundation has released a new version of Raspbian, including a setup wizard for the first time.

As Simon Long, Senior Principal Software Engineer, Raspberry Pi Foundation, explains about previous versions: "When a new user first boots up a new Pi … they might not have much of an idea what they ought to do next.

"With the new update," Simon continues, "a simple setup wizard runs automatically to walk you through the basic setup operations."

Simon tells us, "The essentials are getting the keyboard and language right (because otherwise input and output don't work properly), setting the WiFi country (as otherwise you can't use the network), changing the password (basic security), checking for updates (basic security and reliability), and establishing a network connection (required to check for updates)."

Personalisation options, such as setting the desktop background, weren't included in the setup wizard, as "the system will work perfectly fine without [them]."

Setting the WiFi country correctly is important since the release of the Raspberry Pi 3B+, as Simon clarifies: "In order for 5G WiFi hardware to be certified for use, it must not radiate unless a country has been set."

The new version of Raspbian also includes a Recommended Software 'app store' for the first time – see the 'Raspbian App Store' box for more details.

A new PDF viewer

Raspbian has used Xpdf since its inception, but this venerable application has become dated in its look, capabilities, and performance.

Right: The new setup wizard guides you through the essential setup process

The new version of Raspbian has replaced Xpdf with qpdfview because, Simon reveals, the default Raspbian PDF viewer "needed to render fast, preload


pages, have a modern UI … and to fit our look and feel."

The more well-known Okular wasn't chosen; Simon explains that "the fact that it's a 200MB install, including an older version of the Qt toolkit than we already ship" counted against it.

Simon continues, "qpdfview is Qt based, which is non-ideal – our standard GUI toolkit is GTK, but I was able to hack the Qt theming to closely resemble GTK."

You might also be pleased to hear that "we are now including the latest issue of The MagPi as a PDF file," Simon confirms. "Look in the 'MagPi' directory in your home directory 'pi'." You'll still be able to download the PDF version of this magazine from magpi.cc/issues.

Upgrade Raspbian through the Terminal in the usual way (sudo apt-get update; sudo apt-get dist-upgrade). To view the startup wizard and Recommended Software, download the image from magpi.cc/PVNGfh.

The x86 desktop version of Raspbian has also been updated with "most of the changes".

RASPBIAN APP STORE

The new version of Raspbian also includes a new app store called Recommended Software. This is a separate application to the software manager, allowing Raspbian users to select which of the previously default software to install.

Not everyone needs BlueJ, Node-RED or VNC Viewer, so including these by default only leads to a larger download, clutter, and confusion. Installation is now optional.

The Recommended Software app brings together all of the software that, as Simon notes, "several third-party companies have generously offered … in some cases giving free licences," as well as software recommended for use on a Pi by the Raspberry Pi Foundation.

For example, a Mathematica licence would normally cost "several hundred pounds," says Simon, while there's also a "free licence for RealVNC Viewer and Server."

The Recommended Software application can be found in the Preferences menu of the new Raspbian release. To add the application to your Raspbian install, use the command sudo apt-get install rp-prefapps.

Left: Raspbian now prompts users to change the password when they first boot up the Raspberry Pi
Above: The new PDF viewer is a welcome update from Xpdf, and there's a digital copy of The MagPi magazine waiting in your 'pi' home directory to read
August 2018 7

News PI WARS 2019

PI WARS 2019

Applications open, challenges announced

The application process for Pi Wars 2019 is now open, and this year sees the first Pi Wars theme: space exploration! Pi Wars co-organiser Mike Horne confirms: "This is the first time we have themed Pi Wars, in honour of the 50th year since the first moon landing." The idea came from previous winner Dave Pride.

The theme means cosmetic changes to both the challenges and the venue, which remains as the William Gates Building of the Cambridge Computer Laboratory. "Lots of painting of the courses is going to occur over the summer," Mike tells us.

Challenger

The space theme introduces new challenges, but Mike says that "the rules are pretty similar to before, with a few tightened up and few loosened."

For example, the new Hubble Telescope Challenge is based on 2018's Over the Rainbow, where robots needed to identify coloured balls and drive to them in sequence. "This was the hardest course by a long way," Mike reveals. "We will be making the targets larger, but have not yet finalised that."

Space Invaders is based on the Duck Hunt challenge, "with the same skills required to score." The Spirit of Curiosity challenge involves driving your robot over an obstacle course, collecting a sample (which "will vary in size, but not by much") and driving back.

Re-entry

Pi Wars 2018 received "over 150 entrants," Mike confirms, "which had to be cut down to just 76, plus reserves." This involves a "very hard couple of days", as Mike tells us that "we cannot just choose previous entrants as we want to make space for noobs."

Mike has this advice for your application form: "Ensure the entrants show their enthusiasm [and] we like detail."

Pi Wars 2019 will take place on 30 and 31 March, so head over to piwars.org to read about rules, challenges, and to apply. Keep up to date with news via the mailing list or Twitter @PiWarsRobotics.

Above: The Pi Wars 2018 obstacle course drew a crowd of roboteers and supporters
Below: Entrants don't need to use expensively assembled robots; LEGO is a decent material for a modifiable challenger


Code · Design · Configure · Analyze

Now free for home projects

A professional control system development tool

CDP Studio is a development platform for industrial control systems, now coming with a free version for non-commercial use. The system can run on a Raspberry Pi, supports C++ and open-source libraries, and has a large feature toolbox including GPIO, I2C and MQTT. Its built-in GUI design tool and features let you code less and do more.

Free download on www.cdpstudio.com

CDP Technologies AS Tel: +47 990 80 900
Nedre Strandgate 29 [email protected]
P.O. Box 144 www.cdpstudio.com
NO-6001 Ålesund, Norway

News COLOUR-CODED PI ZEROS | NEW DIDDYBORG ROBOT

COLOUR-CODED PI ZEROS

Making making easier

Pi Supply has released a pair of Pi Zero boards with colour-coded GPIO pins, as well as a separate colour-coded header, to make your mini-builds a bit easier.

As John Whiting, director of marketing for Pi Supply, explains, "A lot of users have struggled in the past remembering the GPIO layout, so the coloured header is a low-cost way of not having to remember the layout or go off and Google it."

The colour-coding follows the usual convention, where red is +5 V, yellow is +3.3 V (also referred to as 3V3), black is ground, blue is DNC (or reversed I2C), and green shows the GPIO pins.

"The reaction has been great", John tells us. "We've had many users say how helpful this GPIO header is, particularly for beginners."

John adds that "we hope to see a colour-coded header on the main Raspberry Pi boards in the future!"

The colour-coded Pi Zero 1.3 costs £9.16, the Pi Zero W £12.50, and the header only £1.25, all from uk.pi-supply.com.

Above: The colour-coded headers make it easier to connect sensors and outputs to your Pi Zero

NEW DIDDYBORG ROBOT

The v2 proves red ones are faster!

One of the most popular robot kits has received an update, with more powerful motors and a newly designed chassis that's still compatible with add-ons such as the touchscreen and camera mounts.

Timothy Freeburn, director of Freeburn Robotics and PiBorg, explains that "the DiddyBorg v2 Red is essentially a high-speed version of the DiddyBorg." We reviewed the DiddyBorg V2 in The MagPi #70 (magpi.cc/70).

While Timothy and the design team "wanted the DiddyBorg v2 Red to be a similar speed to our very popular MonsterBorg kit", merely changing the gearing on the existing DiddyBorg motors would increase the speed, "but the torque would come right down, which would have made tank steering impossible."

The extra grunt of the DiddyBorg v2 comes from six 590 rpm, 28 mm motors. Timothy notes that "you can easily upgrade the motors" of a DiddyBorg v1 with six P28UK12 motors for £64.80 (total) from piborg.org. The new DiddyBorg v2 Red costs £220 from the same site.

Below: The new DiddyBorg v2 is now as fast as the larger, racier MonsterBorg but still has plenty of torque


ALPINE OS COMES TO THE PI News

ALPINE OS COMES TO THE PI

Light, secure, and fast

The latest version of Alpine Linux, an 'independent, non-commercial, general-purpose Linux distribution designed for power users', supports 64-bit Raspberry Pi models, including the Pi 3B+.

While version 3.2 of Alpine Linux supported the 32-bit Raspberry Pi 2, the new release uses an aarch64 image to support 64-bit Pi devices – you can download this image for free from alpinelinux.org/downloads.

Alpine Linux 'is built around musl libc and Busybox', giving the distro a claimed install size of 'around 130MB' while 'a container requires no more than 8MB'.

Alpine Linux is said to be 'designed with security in mind' by using an 'unofficial port of grsecurity/PaX' while 'all userland binaries are compiled as position-independent executables (PIE) with stack-smashing protection.'

Which surely all makes Alpine Linux a sound choice for 'a home PVR, or an iSCSI storage controller, a wafer-thin mail server container, or a rock-solid embedded switch', according to the About page.

Above: Alpine Linux is a small, focused OS ideal for making Pi-based single-use devices. Image credit: Alpine Linux, see alpinelinux.org

NOW TRENDING
The stories we shared that flew around the world

REVERSE-EMULATED SNES
magpi.cc/hNHkYT
To make a point about the nature of humour, Tom Murphy VII 'reverse-emulated' an unmodified NES to run SNES games via a Raspberry Pi embedded in the NES game cartridge.

TESLONDA
magpi.cc/plSXSC
What do you get if you cross a Tesla engine with a 1981 Honda Accord? Not much, unless you then integrate a Raspberry Pi to unleash the 536 hp and achieve 0–60 mph in 2.48 seconds!

THERMAL PAPER POLAROID
magpi.cc/gbPIac
Tim Jacobs revived the thrill of instant photography by incorporating a Pi Zero, a £2 webcam, and a tiny thermal printer in an old Polaroid camera body.

News PI-MONITORED SOURDOUGH STARTER

PI-MONITORED SOURDOUGH STARTER

Use a Pi for the perfect loaf

Justin Lam, a mechatronics engineer (in training), had a problem: how to make the perfect sourdough bread. He applied his knowledge of Linux, the Raspberry Pi, and image and statistical analysis to predict peak fermentation to achieve the perfect crumb.

Sourdough bread uses slightly unpredictable 'wild yeast' rather than the "stable and consistent" instant dry yeast used in standard bread, Justin explains. But, instant yeast "lacks the complex flavours that wild yeast [in] sourdough starter provides."

To bake a sourdough loaf, you need to take some starter (a mixture of flour and water, exposed to the air to 'catch' the wild yeast) and 'feed' it with some flour and water. You then have to wait for this mixture to achieve maximum fermentation before proceeding to make a loaf. If you start making that bread dough too early or too late in the fermentation process, the resultant loaf won't be good eating.

Even worse, Justin tells us that while "the time to max fermentation will be consistent" if you use the same volumes of starter and feed, "temperature and humidity [changes] throughout the season will alter" the time until maximum fermentation.

BreadCam

Justin's solution was to monitor the rise of the starter with a Pi Zero fitted with a Raspberry Pi camera. Justin says, "I was able to leverage my previous experience with image analysis" to build a system that could accurately monitor the fermentation process in a jar.

Justin used the Pi Zero "for speed of development" and because "the RPi community is a significant advantage over other platforms."

He then used a 'threshold algorithm' to measure the level of the starter in the jar, graphing the rise over time. His starter typically achieved peak fermentation after eight hours, with a window of around one hour to start forming a loaf-sized dough to bake.

Justin says the analyses he did were "extremely simple", using Scikit-learn. The details are well documented on Justin's blog – see magpi.cc/cCGydQ.

Far Left: The Pi Zero with camera was strapped to a tape measure; the jars of sourdough starter were backlit in the final project
Below: Justin says his analyses of the visual and statistical data were "extremely simple" – perhaps 'relatively' might be more accurate?
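The 'threshold algorithm' idea can be shown with a toy sketch (not Justin's published code): with the jar backlit, image rows above the starter are bright and rows inside it are dark, so scanning down a column of per-row brightness values for the first value below a threshold gives the starter level.

```python
def starter_level(row_brightness, threshold=128):
    """Return the index of the first image row (top to bottom) darker
    than `threshold`, a stand-in for the top surface of the starter.
    `row_brightness` is a list of mean pixel values, one per row."""
    for i, value in enumerate(row_brightness):
        if value < threshold:
            return i
    return len(row_brightness)  # no dark rows: jar looks empty

# Simulated backlit jar: bright air above, dark starter below.
rows = [220] * 60 + [40] * 40   # 100-row image, starter in bottom 40 rows
level = starter_level(rows)
print(level)  # 60: row index where the starter surface sits
```

Logging that index against time gives exactly the kind of rise-over-time graph the article describes.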


HACKSTER AND ARM AUTONOMOUS ROBOT CHALLENGE News

HACKSTER AND ARM AUTONOMOUS ROBOT CHALLENGE

Win some goodies with your autonomous vehicle design

Hackster has joined with processor designer ARM to launch the Autonomous Robot Challenge, 'to push the limits of what low-cost, open-source hardware and deep learning' can do for humanity.

The basics of the challenge are to design a machine based around an ARM device (such as a Raspberry Pi) that can do at least one of these two tasks: 'autonomously transport a package in an urban, rural, or underwater environment'; 'autonomously assist in a real-world scenario'.

Adam Benzion, CEO of Hackster, explains, "We feel that the era of working drones, driving machines, diving systems has arrived. … We want to see how society can benefit from these new capabilities and combine the trifecta of low-cost UAVs, AI, and [machine learning] into new inventions."

Rex St John, Senior Manager, IoT Ecosystem at ARM, adds: "We feel there is a good side to the topic of autonomous machines that we want to highlight and inspire the creativity of developers to think more broadly about how this low-cost AI technology can be used for positive reasons."

You can enter by "submitting code, schematics, and BOM [bill of materials]," reveals Adam, adding that "it has to be a functional project. Not perfect, but inventive and working."

The Autonomous Robot Challenge is offering some great prizes for various categories, such as an X PlusOne HD drone for the two projects making best use of AI, and a Geth-spider-like Robugtix T8X robot for the two most creative projects. See magpi.cc/TzzoWh for details.

Participation is open until 30 September 2018, 11:59pm Pacific Time. Winners will be announced by 15 October 2018.

Above: Use an ARM-powered device, such as the Raspberry Pi, to invent an autonomous robot
Below: The open-source DonkeyCar is a suggested device that might form a useful basis for your project. See donkeycar.com for more details



SUBSCRIBE TODAY FROM JUST £5

SAVE UP TO 35%

FREE! £5 MOD MY PI VOUCHER
FOR ALL SUBSCRIBERS

Subscription benefits:
FREE! Delivery to your door
EXCLUSIVE! Raspberry Pi offers and discounts
NO OBLIGATION! Leave any time*

Pricing

Rolling Subscription: £5 a month
Quick and easy to set up
No long-term commitment
* Leave any time applies to Rolling Subscription only

Subscribe for a year:
£55 (UK)
£80 (EU)
£90 (USA)
£95 (Rest of World)

magpi.cc/subscribe


JOIN FOR 12 MONTHS AND GET A
FREE PI ZERO W STARTER KIT
WITH YOUR SUBSCRIPTION

Subscribe in print for 12 months today and you'll receive a kit worth £20:
Pi Zero W
Pi Zero W case with three covers
USB and HDMI converter cables
Camera Module connector

SUBSCRIBE ON APP STORES FROM £2.29


Feature

AI MADE EASY

Give your Raspberry Pi an IQ boost with some AI projects

Last year we released a very special
issue of The MagPi that included the
Google AIY Projects Voice Kit, and it
was a huge smash. People went on to
send us loads of fun voice assistant
projects made out of the kit, getting

their introduction to the world of
artificial intelligence in the process.

With the maturation of cloud
computing, image recognition,
and voice assistants, AI is quickly
becoming something that the

masses can have access to.
What exactly is AI and machine
learning, though? And how can you

use it with a Raspberry Pi?
Read on to find out…


WHAT IS AI?

JOSE MARCIAL PORTILLA
Jose is the Head of Data Science at Pierian Data Inc, and an AI expert. He has several AI learning courses you can find here: magpi.cc/rxcYLk

The term 'artificial intelligence' was coined in 1956 at a workshop in Dartmouth College by John McCarthy, a few years after Alan Turing had written his now famous paper 'Computing Machinery and Intelligence' in 1950.

A common misunderstanding is to conflate the terms 'machine learning' and 'artificial intelligence'. Machine learning is the use of algorithms that have the ability to learn from a data source, allowing the algorithm to then create predictions or decisions based on that data. Examples include spam email classification, housing price prediction, and product recommendations in e-commerce.

While machine learning is extremely useful, we typically don't think of these single machine-learning algorithms as 'intelligent' in the same way we think of humans as 'intelligent'. This is why machine learning is a subset of the larger goal of artificial intelligence.

There are also instances of what is commonly referred to as 'narrow AI' tasks; these include more complex applications of neural networks (a specific framework of machine-learning algorithms modelled after biological neurons). These narrow AI tasks can include things such as image classification, language translation, and face recognition. There have been huge advancements in this field over the past few years due to a large increase in availability in computational power, especially due to advancements in GPU performance.

We have recently seen developments from Google in these sort of narrow artificial intelligence tasks, including creating a computer that can defeat the world's best Go players (magpi.cc/FRLdsa), developing self-driving cars with WayMo (magpi.cc/inTtzd), and creating a Google Assistant capable of calling and creating appointments with interactions (magpi.cc/itbNbz). These developments help pave the way towards the creation of 'strong AI', which are artificially intelligent systems that become indistinguishable from the intelligence of humans. This is a future where AI can begin to develop music, create works of art, and hold a normal conversation with a human.

While there have been many developments in these individual topics, we're still far away from a truly artificially intelligent system, but working on these more narrow AI problems can help researchers solve the issues that may face them when pursuing general strong AI.

AlphaGo Zero learns to play Go (a board game) without help. After 70 hours it becomes a superhuman player and after 40 days trains itself to become, arguably, the best Go player on Earth magpi.cc/oSPVEz.
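The 'learning from data' idea can be made concrete with a toy example of the housing-price prediction mentioned above (ours, not from the article): fit a straight line to size/price pairs by least squares, then predict the price of an unseen house. The data here is made up purely for illustration.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b to the data points."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

# House sizes (square metres) and prices (in thousands) -- made-up data
# that happens to follow price = 3 * size exactly.
sizes = [50, 70, 90, 110]
prices = [150, 210, 270, 330]

a, b = fit_line(sizes, prices)
print(round(a * 100 + b))  # predicted price for a 100 m^2 house: 300
```

A real system would use many more features and a library such as scikit-learn, but the principle is the same: the algorithm's parameters come from the data, not from hand-written rules.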


LINE FOLLOWING WITH OPENCV

Is your robot just a bit too… robotic?

YOU'LL NEED
OpenCV: opencv.org
Pi Camera Module: magpi.cc/camera
Full code: magpi.cc/FwbnYS

The OpenCV (Open Source Computer Vision) Python library can add some fairly heavyweight visual AI to your Raspberry Pi robot. This hugely powerful library has over 2500 different functions, including a comprehensive set of both classic and state-of-the-art computer vision and machine-learning algorithms. These algorithms can be used to detect and recognise faces and identify and track objects. In this example, we'll get you started by showing how OpenCV can be used to detect a line and how we can then train a robot to follow it.

>STEP 01 Setting up OpenCV

Undoubtedly the trickiest part of the whole process is getting OpenCV installed and running on your Raspberry Pi. The installation process depends on which Pi / OS setup you are using. Fortunately, there are some very good guides out there, including this one by the very awesome Adrian Rosebrock: magpi.cc/PwLKfE.

>STEP 02 Capture an image

Capturing images is done with the picamera.capture() function. We set the resolution fairly low (320×240) as this keeps the image size small and lowers the processing power required to


analyse each one. Once the image is captured, we also then crop it to ensure we get just the central part that we're interested in.

>STEP 03 Blur the image

As we're not really interested in the actual details of the image, we apply a Gaussian blur on the whole thing to blur out unnecessary details. This helps level out any noise in the image. Don't worry about the name – this is just an OpenCV function.

>STEP 04 Use the findContours() function to find the line

Firstly, we make a negative of the image as this makes it easier for OpenCV to detect edges. We then use the findContours() function to find the edges of the line. With some simple maths, we can then work out exactly where in the image the line appears.

>STEP 05 Let's get moving

Now we know where in the image the edges of the line are, we can instruct our robot to move accordingly. There is some very simple logic that can be applied at this point. The pseudocode below should help.

if LINE in centre of image:
    GO STRAIGHT ON
if LINE is on left of image:
    TURN LEFT
if LINE is on the right of the image:
    TURN RIGHT

This will work, but you can get a lot more subtle and complex than this if you want. For example, speed for both turning and driving can be altered depending on how far away from the centre the line is.

Now, it doesn't matter what robot platform you use (think CamJam EduKit or the Tiny 4WD from Coretec Robotics), all of the above will remain the same. It is only the instructions to actually drive the motors that will change depending on your setup.

A full version of the code described above can be found here: magpi.cc/FwbnYS.

FURTHER ADVENTURES: SOMEWHERE OVER THE RAINBOW!

There is much, much more that you can do with OpenCV. In the 'Somewhere Over the Rainbow' event in Pi Wars this year, robots had to autonomously locate and then drive to four different coloured balls set in the corners of a 1.2 m square 'arena' (some teams were more successful at this than others!). The colour/object tracking the teams used was mainly OpenCV-based and takes a similar approach to finding a line. Except it's a ball. And there's four of them. And they're all different colours!
magpi.cc/aUzwfk

Left: The picture the Pi Camera takes of the line in front of it
Below: Performing some editing techniques makes it readable to the Pi

IMAGE

RECOGNITION WITH

MICROSOFT AI
It’s remarkably simple to use this computer vision service to give
you a description of what the Raspberry Pi can ‘see’

Microsoft’s Computer Vision Cognitive Service is a feature-rich, cloud-based API providing image analysis that can be easily and quickly integrated into your project. In addition to providing a description of what it ‘sees’, the service is able to categorise and tag images, detect human faces, and recognise text, among other features. Pricing is free for up to 5000 transactions per month, and thereafter $1 or less per 1000 transactions for core features.

Send a picture

In order to get started with the Computer Vision service, an API key is required and may be obtained at magpi.cc/dMRkhi. Using the API is simply a matter of sending a POST request with the API key, an image, and a list of the desired visual analysis features, then processing the returned result. The image may either be a URL or a file. The visual features include: Categories, Tags, Description, Faces, ImageType, Colour, and Adult. Additional details may also be requested, such as identifying celebrities and landmarks. A full list of all the options and what each provides is available on the API documentation page at magpi.cc/vOeFzE.

The recognition.py listing is an example in Python for requesting a description of an image stored locally in the file /tmp/image.jpg. The returned result will be of the form shown below:

{
  "description": {
    "captions": [
      {
        "text": "The description of the image appears here",
        "confidence": 0.9234897234987234
      }
    ]
  },
  "requestId": "c11894eb-de3e-451b-9257-7c8b168073d1",
  "metadata": {
    "height": 600,
    "width": 400,
    "format": "Jpeg"
  }
}
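Once the response comes back, pulling out the caption is a few lines of standard-library Python. Here's a sketch using the sample response above (the caption text and confidence are the placeholder values from the listing):

```python
import json

# Sample response in the shape shown above (caption text and
# confidence are the placeholder values from the listing)
raw = '''{
  "description": {
    "captions": [
      {"text": "The description of the image appears here",
       "confidence": 0.9234897234987234}
    ]
  },
  "requestId": "c11894eb-de3e-451b-9257-7c8b168073d1",
  "metadata": {"height": 600, "width": 400, "format": "Jpeg"}
}'''

analysis = json.loads(raw)
# The most likely caption is the first entry in the captions list
caption = analysis["description"]["captions"][0]
print(caption["text"])                  # The description of the image appears here
print(round(caption["confidence"], 2))  # 0.92
```

The confidence value lets you decide whether to trust the description or ask the service for more detail (Tags, Categories) instead.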



Robert Zakon’s Seeing Wand may look a little
rough around the edges, but it’s a very smart
device. It uses a Pi Zero and Camera Module,
along with Microsoft Cognitive Services, to
identify objects at which it is pointed.

SEEING WAND

The basics of this project can be applied to build a ‘Seeing Wand’, a project that tells you what it’s being pointed at! You can find out more details on how to build one here: magpi.cc/pfOPwB.

The wand was built to help a blind person ‘see’ a bit more of the world around them – it’s a bit of a makeshift contraption though, using a breadboard and a PVC tube. Still, it’s pretty wonderful.

In order to learn more about the Computer Vision service and test it out using your own images (without having to write code if you wish), check out: magpi.cc/fFLtpJ.

CODE: recognition.py
LANGUAGE: PYTHON
DOWNLOAD: magpi.cc/DWcGRT

#!/usr/bin/python
import httplib, urllib, base64, json, re

# CHANGE {MS_API_KEY} BELOW WITH YOUR MICROSOFT VISION API KEY
ms_api_key = "{MS_API_KEY}"

# setup vision API
headers = {
    'Content-Type': 'application/octet-stream',
    'Ocp-Apim-Subscription-Key': ms_api_key,
}
params = urllib.urlencode({
    'visualFeatures': 'Description',
})

# read image
body = open('/tmp/image.jpg', "rb").read()

# submit request to API and print description if successful or error otherwise
try:
    conn = httplib.HTTPSConnection('westcentralus.api.cognitive.microsoft.com')
    conn.request("POST", "/vision/v1.0/analyze?%s" % params, body, headers)
    response = conn.getresponse()
    analysis = json.loads(response.read())
    image_caption = analysis["description"]["captions"][0]["text"].capitalize()
    conn.close()
    print image_caption
except Exception as e:
    print e.args



SPEECH RECOGNITION WITH ZAMIA

AI projects don’t always have to be connected to the internet, and Zamia shows us how to do offline voice recognition

YOU’LL NEED

Active speakers
USB microphone
eliza.py magpi.cc/DUkkTT

Voice assistants are all the rage these days – but most of them rely on cloud services requiring an internet connection, which may cause data privacy issues and obscure their inner workings.

>STEP 01

Installation

Most of the more advanced speech-recognition software we are using here is not yet part of Raspbian. We will therefore rely on a third-party repository from the Zamia-Speech project. To set up the additional APT source, execute with root permissions (sudo -i):

echo "deb http://goofy.zamia.org/repo-ai/raspbian/stretch/armhf/ ./" >/etc/apt/sources.list.d/zamia-ai.list
wget -qO - http://goofy.zamia.org/repo-ai/raspbian/stretch/armhf/bofh.asc | sudo apt-key add -

With that in place, you can install the required packages using APT (again with root permissions):

apt-get update
apt-get install python3-nltools python-kaldiasr-doc kaldi-chain-zamia-speech-en pulseaudio-utils pulseaudio

>STEP 02

ASR on a WAV file

We start out small: to try out the Kaldi-ASR speech-recognition engine, we will feed it a simple recording of a spoken sentence. We will use one of the example files that comes with it (or any other 16 kHz mono WAV file, for that matter):

zcat /usr/share/doc/python-kaldiasr-doc/examples/dw961.wav.gz > dw961.wav

The code to run Kaldi ASR on
this example is shown in the
wav_decoder.py code.
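Before feeding a recording to the decoder, it is worth checking that it really is 16 kHz, 16-bit mono, since that is the format the Zamia-Speech models expect. This sketch (file and function names are our own, not from wav_decoder.py) generates a short test tone in that format with the standard-library wave module, then verifies the header:

```python
import wave, struct, math

# Kaldi models from Zamia-Speech expect 16 kHz, 16-bit, mono input.
# Generate a half-second 440 Hz test tone in that format (a stand-in
# for dw961.wav from Step 02), then verify the header before decoding.
RATE = 16000

with wave.open('test16k.wav', 'wb') as w:
    w.setnchannels(1)          # mono
    w.setsampwidth(2)          # 16-bit samples
    w.setframerate(RATE)
    for n in range(RATE // 2):
        sample = int(20000 * math.sin(2 * math.pi * 440 * n / RATE))
        w.writeframes(struct.pack('<h', sample))

def is_decodable(path):
    """True if the WAV file matches what the ASR model expects."""
    with wave.open(path, 'rb') as w:
        return (w.getnchannels(), w.getsampwidth(), w.getframerate()) == (1, 2, RATE)

print(is_decodable('test16k.wav'))  # True
```

A stereo or 44.1 kHz file would fail this check and should be resampled (for example with sox) before being handed to Kaldi ASR.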

>STEP 03

Live speech recognition

The live_vad.py program is the foundation of our voice assistant: we record samples from our microphone using Pulse Recorder, feed it into a Voice Activity Detector (VAD), and decode it using Kaldi ASR. The MODELDIR variable points to the embedded English Zamia-Speech speech-recognition



Ideal for offline speech recognition, Zamia
Speech offers pre-built Kaldi ASR packages
for Raspbian, complete with pre-trained
models for English and German. Everything
is free, cloudless, and open source

model which we installed from their APT source earlier. You can experiment with different volume settings if you are not satisfied with the recognition results.

>STEP 04

Text to speech

A voice assistant should not only recognise natural language, it should also be able to produce spoken answers to communicate back to the user. We will use eSpeak NG here for speech synthesis since it is free software and easy to set up. The espeakng_tts.py code is a short example of how to use it.

>STEP 05

Intents and actions

We need to figure out what the user wants us to do: the user’s intent. We want to keep things simple for now so we will have just three intents: HELLO (say hello), LIGHTS, and RADIO (toggle the lights and the radio on or off). add_utt is a utility function that we use to create a static mapping between natural language utterances such as ‘switch on the radio’ and corresponding intents. With that in place, we can simply look up the user’s utterance to find the intent and generate a response. The va_simple.py code is our complete bot at this stage.

>STEP 06

Patterns and ELIZA

To make our bot tolerant against input variations, we can use patterns: ‘switch the (radio|music) (on|off)’ will already generate four patterns. Instead of a simple mapping between input and patterns, we will use their edit distance (the minimum set of insertion, deletion, and substitution operations to transform one into the other) to compute the closest match – this will make our bot tolerant against utterances we did not foresee. If we do not find a close enough match, we will turn to ELIZA to generate a response. The va_eliza.py code comprises the completed bot.

>STEP 07

Taking it further

There are endless possibilities to improve upon this voice assistant. You can get creative: make it tell jokes, say quotes from your favourite movies, or come up with your own witty responses. Or make it more practical: add a wake word and attention mechanism, make it tell the current time or date; or, if you do not need it to run offline, add online skills like fetching the weather report or looking up definitions on Wikipedia.

You can also improve its language capabilities: use more advanced speech synthesis tools like SVOX Pico or Google’s Tacotron for more natural-sounding responses, or go to zamia-speech.org to learn how to adapt the speech-recognition model to your domain.

CODE:
LANGUAGE: PYTHON
DOWNLOAD: magpi.cc/zbtuSx
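The closest-match idea from Step 06 boils down to a Levenshtein distance over the known patterns. Here's a sketch of it in plain Python (the function names and the example pattern table are our own, not taken from the va_eliza.py download):

```python
# Compare the recognised utterance against every known pattern by edit
# distance and pick the nearest intent; fall back to ELIZA (None here)
# if nothing is close enough.

def edit_distance(a, b):
    """Minimum number of insertions, deletions, and substitutions
    needed to turn string a into string b (Levenshtein distance)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

PATTERNS = {
    'hello computer': 'HELLO',
    'switch the radio on': 'RADIO',
    'switch the radio off': 'RADIO',
    'switch the lights on': 'LIGHTS',
    'switch the lights off': 'LIGHTS',
}

def best_intent(utterance, max_dist=8):
    """Return the intent of the closest pattern, or None when nothing
    is close enough (that is when we would turn to ELIZA)."""
    pattern = min(PATTERNS, key=lambda p: edit_distance(utterance, p))
    if edit_distance(utterance, pattern) > max_dist:
        return None
    return PATTERNS[pattern]

print(best_intent('switch the lights on'))  # LIGHTS (exact match)
print(best_intent('switch the light on'))   # LIGHTS (one edit away)
```

Small recognition errors such as a dropped ‘s’ still land on the right intent, which is exactly the tolerance the article describes.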



INSPIRATIONAL AI PROJECTS

More examples of some of the amazing things you can do with AI on Raspberry Pi

ARTIFICIAL LIFE PROJECT

MAKER: MICHAEL DARBY
AI TYPE: SIMULATION
magpi.cc/ZKcLUY

Something you’d more
traditionally associate with AI,
this simulation creates artificial
lifeforms in Python code, giving
them different traits and allowing
them to ‘breed’ and die and such.
It would be quite a horrifying experiment if they were real, but they’re not, so it’s fine. Anyway, it’s all displayed beautifully on a Unicorn HAT.

As a bit of a bonus, Michael
figured out a way to plug it into
Minecraft Pi to get a different kind
of visual representation of what
goes on in the simulation.

COUNTING BEES

MAKER: MAT KELCEY
AI TYPE: VISION
magpi.cc/BHXFpo

Like the name suggests, this
project is used to count bees,
specifically in a beehive.
According to Mat on his blog about it, he couldn’t find a decent enough, non-intrusive system to do the trick. So he made one!

It uses a standard Raspberry Pi
Camera Module, and the Python
code uses TensorFlow (and some
manual training) to detect the
bees in the image. Over time it’s
become a lot more accurate,
through a mixture of machine
learning and image optimisation.



POKÉDEX

MAKER: ADRIAN ROSEBROCK
AI TYPE: VISION
magpi.cc/gYUpxN

There have been several Pokédex toys in the past 20 or so years since the Pokémon games were released. However, none possessed the actual capability to use the image of a Pokémon to identify it like in the games and TV show.

Using hundreds of images, from official artwork to renders and plushes and fan-works, Adrian managed to train a Raspberry Pi to be able to identify five of the 807 Pokémon currently in existence (minus different forms, regional variants, Mega evolutions, etc.).

The results? From his blog about it, pretty accurate. Maybe Nintendo will license it and put it in Pokémon GO?

DRONE OBJECT DETECTION

MAKER: ARUN GANDHI
AI TYPE: VISION
magpi.cc/vZXAXa

Drone technology, and drone automation, is always improving. Pairing drones with image recognition seems like a no-brainer, and this fusion of technology has been used in Africa to monitor house construction projects.

The images capture the various building stages of the houses in an attempt to make sure that if any problems arise, the team will know about it. Using nearly 1500 images, they trained the AI to expertly look out for the signs of different parts of house construction so it could be relayed back for inspection. This project was created with the NanoNets API (nanonets.com).

INTELLIGENT DOOR LOCK

MAKER: KHAIRUL ALAM
AI TYPE: VISION
magpi.cc/Cbwaih

This project is pretty complex, but basically it’s a smart lock. All you need to do is train it to recognise people you know and if they come to your door, it will let you know who is there. It’s more complicated than just installing a camera you can stream over the web, but it means you can receive a text message about who it is rather than squint at a camera from across town.

This particular project uses
Alexa, and it can also open the
door without you being physically
present. It will even welcome your
guest. Proper Star Trek stuff.



Projects SHOWCASE

ANTHONY DIPILATO

Based in the Miami/Fort Lauderdale
area of Florida, Anthony is a full-stack
developer who enjoys designing,
building, and coding things.
anthonydipilato.com

GHOST DETECTOR

Equipped with all manner of sensors, this Ghost Detector is designed to discover paranormal presences. A spooked Phil King investigates

Quick Facts

> Build details are at magpi.cc/bLOSrp
> Code is on GitHub: magpi.cc/zbDYrV
> The ‘wooden’ case is actually 3D-printed from PLA
> The project took two months to build
> The detector includes a cooling fan

A 7-inch touchscreen shows the live camera view overlaid with sensor data

Aided by two sets of IR LEDs, an infrared camera captures images in the dark

Twin antennas amplify the signal for the EMF sensors

The truth is out there… At least that’s what they used to say on The X-Files. Well, if there is any paranormal activity around, Anthony DiPilato’s sensor-packed Ghost Detector aims to find it. While he built the device around two years ago, this top-secret project has only just been shared with the wider world. The idea stemmed from a desire to create a home-made present for his father. “My dad watches a lot of paranormal investigation shows,” says Anthony, “so I thought it would be a fun project to give as a Christmas gift.”

Infrared camera

While the project started off as a simple Arduino-based EMF (electromagnetic field) meter, it quickly evolved into something far more ambitious. “I saw Raspberry Pi offers an infrared camera,” recalls Anthony, “so I decided to build something that could record video with overlaid sensor data.”

The Raspberry Pi records video, audio, and sensor data, then saves it to a USB flash drive. Mounted on top of the device, an official Raspberry Pi 7-inch touchscreen provides a user interface, while also displaying the data from the

numerous sensors and a live video view from the infrared camera.

Featuring a pistol-grip handle, the body of the detector was 3D-printed on Anthony’s newly acquired Monoprice Maker Select. He designed the enclosure using the Autodesk Fusion 360 CAD software, which offers a free licence for hobbyists.

“Since it is a pseudoscientific instrument, I wanted to make it look as ridiculous as possible,” he tells us. “So I included rabbit-ear telescopic antennas [for the EMF sensors] and a Geiger tube. I thought the stained wood enclosure would match that aesthetic.”

Sensory overload

Continuing the theme of making it as ludicrous as possible, Anthony crammed the detector with “as many sensors as I could fit.” Along with the EMF sensors, there’s a magnetometer (compass), altimeter, temperature and barometric pressure sensor, microphone, and a Geiger counter to measure radioactivity. Most of the sensors and other electronics are mounted on stripboard, including two 5 V 3 A step-up power supplies, an Arduino Nano, and a logic level converter to interface the Nano to the Raspberry Pi.

The Geiger counter is a separate board, while its Geiger tube is mounted on the front along with the camera and two lots of infrared LEDs either side. To power the device, two Panasonic 18650 3400 mAh batteries are housed in the handle.

From start to finish, the Ghost Detector took Anthony around two months to build: “The only major issue I encountered was the control board for my 3D printer burned out, and needed to be replaced before I could finish the project.” It took him a mere two days to program the software, mainly comprising Python scripts.

Asked if he has detected any unearthly presences, Anthony replies, “I only tested at a few local places that are supposedly haunted, but I was not able to record any conclusive evidence of hauntings.” He did discover that blood absorbs infrared light, however, showing up all the veins in his arm on the camera view – which looks pretty spooky.

BUILD A GHOST DETECTOR

>STEP-01

3D-printed enclosure

After creating several prototypes and making adjustments, Anthony 3D-printed the final enclosure from Hatchbox Wood PLA, then sanded and stained it for a wood-style finish.

>STEP-02

Add loads of sensors

A magnetometer and temperature/pressure sensor are mounted on stripboard, along with an Arduino Nano connected to dual EMF sensors. The Geiger counter is a separate board.

>STEP-03

Keep your cool

A cooling fan blows air into a duct that vents up the back of the enclosure. Added partly for aesthetic effect, twin telescopic antennas are linked to the EMF sensors.

Above Anthony tested the electronics out before cramming them into the 3D-printed enclosure


Projects SHOWCASE

CARSTEN DANNAT AKA ‘THE SQUIRREL GASTRONOMER’

A software engineer from Ahrensburg in Germany, Carsten opened the Squirrel Cafe in 2007, later adding the Raspberry Pi IoT project to monitor it.
thesquirrelcafe.com

THE SQUIRREL CAFE

Predict the weather with… squirrels and nuts!? Nicola King lifts the lid on an ingenious project

A mechanical switch is pressed whenever the lid is opened by a squirrel

The nuts are visible behind a glass panel

A Raspberry Pi is connected to the switch, LED display, and a USB webcam to take photos

Quick Facts

> It correctly predicted the cold winter of 2017-18
> Carsten plans to add a scale to weigh the nuts…
> … for more accurate measuring of nut consumption
> Raccoons have broken into the feeder
> A video ‘security camera’ now monitors all visitors

Back in 2012, Carsten Dannat was at a science summit in London, during which a lecture inspired him to come up with a way of finding correlations between nature and climate. “Some people say it’s possible to predict changes in weather by looking at the way certain animals behave,” he tells us. “Perhaps you can predict how cold it’ll be next winter by analysing the eating habits of animals? Do animals eat more to get additional fat and excess weight to be prepared for the upcoming winter?”

An interesting idea, and one that Germany-based Carsten was determined to investigate further. “On returning home, I got the sudden inspiration to measure the nut consumption of squirrels at our squirrel feeder”, he says. Four years later and his first prototype of ‘The Squirrel Cafe’ was built, incorporating a first-generation Raspberry Pi.

A tough nut to crack

A switch in the feeder’s lid is triggered every time a squirrel opens it. To give visual feedback on how often the lid has been opened, a seven-segment LED display shows the number of openings per meal break. A USB webcam is also used to capture images of the squirrels, which are tweeted automatically, along with the stats on the nuts eaten and time taken. Unsurprisingly perhaps, Carsten says that the squirrels are “focussed on nuts and are not showing interest at all in the electronics!”

So, how do you know how many nuts have actually been eaten by the squirrels? Carsten explains that “the number of nuts eaten per visit is calculated by counting lid openings. This part of the source code had been reworked a couple of times to get adjusted to the squirrel’s behaviour while grabbing a nut out of the feeder. Not always has a nut been taken out of the feeder, even if the lid has been opened.” Carsten makes an assumption that if

the lid hasn’t been opened for at least 90 seconds, the squirrel went away. “I’m planning to improve the current design by implementing a scale to weigh the nuts themselves to get a more accurate measurement of nut consumption,” he says.

Left A squirrel enjoying a tasty treat at the Squirrel Cafe

Below The results of a raccoon’s rampage

Just nuts about the weather!

The big question of course is what does this all tell us about the weather? Well, this is a complicated area too, as Carsten illustrates: “There are a lot of factors to consider if you want to find a correlation between eating habits and the prediction of the upcoming winter weather. One of them is that I cannot differentiate between individual squirrels currently [in order to calculate overall nut consumption per squirrel].”

Some people say it’s possible to predict changes in weather by looking at the way certain animals behave

He suggests that one way around this might be to weigh the individual squirrels in order to know exactly who is visiting the Cafe, with what he intriguingly calls “individual squirrel recognition” – a planned improvement for a future incarnation of The Squirrel Cafe.

Fine-tuning of the system aside, Carsten’s forecast for the winter of 2017/18 was spot-on when he predicted, via Twitter, a very cold winter compared to the previous year. He was proven right, as Germany experienced its coldest winter since 2012. Go squirrels!

SECRET SQUIRREL

>STEP-01

Lid switch

When a squirrel opens the lid, a mechanical switch is triggered. This replaced a magnetic reed switch, which Carsten says wasn’t totally reliable.

>STEP-02

Nut supply

The feeder is filled with peanuts. Since these are split into halves, it’s assumed that each lid opening results in half a nut being consumed by the squirrel.

>STEP-03

Tweepy tweet

After each meal visit, the Tweepy Python library is used to automatically tweet the details and a photo taken by a connected USB webcam.


Projects SHOWCASE

ROBIN NEWMAN

Now retired, Robin was once the Head of IT at Oundle School in Peterborough. He currently enjoys experimenting with his Raspberry Pi boards.
magpi.cc/Dwyabx

SONIC PI GLOCKENSPIEL

This project strikes a chord for its clever use of Sonic Pi and LEGO to make sweet music, as David Crookes discovers

Quick Facts

> It uses Sonic Pi version 3.0.1
> LEGO forms the Pi-controlled hammers
> Spacing between the centre of notes is 24 mm
> The note range used is :C6 to :E7
> Robin’s glockenspiel was vintage Cold War East Germany

Robots have already blown their own trumpet: Toyota developed a humanoid in 2004 which could play When You Wish Upon a Star by clasping the instrument in its hands and blowing air through its mouth. Since then, we’ve seen robots play drums and guitar; even going as far as recording an album. But now we’ve heard the results of a Raspberry Pi playing a glockenspiel and it’s been music to our ears.

It’s all thanks to Robin Newman whose love of computers and music goes way back. In the 1980s, he networked 24 BBC Micros and had them play parts of a Bach Brandenburg Concerto. “Today, 80 percent of my work with Raspberry Pi boards involves Sonic Pi,” he says. He got the idea for a Sonic Pi-controlled glockenspiel after seeing similar projects online that used an Arduino.

“Version 3.1 was a game changer because it allowed Sonic Pi to communicate with the outside world using either MIDI signals or Open Sound Control messages,” he explains. “It enables Sonic Pi to interact with Python-controlled devices and to interact easily with

You can pick up glockenspiels on eBay and it’s also possible to buy eight-note children’s toys, although Robin says some may have “dubious tuning”

The project uses a Raspberry Pi 3 fitted with a RasPiO Pro Hat that can easily accommodate 11 circuits

The LEGO hammers use a 15 beam with a 5×3 angle on the end, pivoted on an axle of length three


START MAKING MUSIC

>STEP-01

Wire the Pro Hat

After grabbing a glockenspiel, Robin restricted the notes he’d use and accommodated the driver circuitry on a RasPiO Pro Hat. He used three TP102 Darlington power transistors. A breadboard was used to distribute signals to the solenoids.

>STEP-02

Get out the LEGO

The hammers were built from LEGO and, by happy coincidence, the width of the basic hammer mechanism was three LEGO beams (or 24 mm). Since ten notes on Robin’s glockenspiel occupied 234 mm, it made for a good fit.

>STEP-03

Completing the setup

After wiring up the solenoids, Robin propped the glockenspiel on wooden blocks to allow the hammers to fit under the instrument. He used Sonic Pi to send OSC information to the Python script and specify the notes.

signals to and from the Pi’s GPIO pins. I wanted to use the fact that it could control a glockenspiel and play itself at the same time to accompany the instrument.”

Setting up

Robin already had a glockenspiel. A 30-year-old gift to his son, it was languishing in his attic. As such, he sought to produce an easily constructed project that could be added to the instrument. The Pi, he envisaged, would control hammers to strike the glockenspiel’s metal bars and he decided to use solenoids as the actuators.

“I bought a 5 V, Adafruit-sourced solenoid and I already had a suitable power supply to hand,” he recalls. “I also had a power transistor and projection diode from an Arduino starter kit. I was able to connect them up to a GPIO pin and use the GPIO Zero LED command to switch it on and off. This worked fine and so the question was how could this small movement be used to hit the keys.” It was then that he turned to LEGO.

Hammer time

Before the Pi was launched, Robin had spent a few years working with the LEGO EV3 system, mainly producing colour-sorting robots. “After some experimentation, it turned out to be beautifully simple to produce a hammer mechanism out of LEGO which could then be driven by the solenoid, providing just the right kick,” he says. To do this, he had the LEGO hammers strike the notes from underneath, allowing gravity to return them to their resting position. “It’s important that the hammers strike the notes and then immediately move away from them so that they don’t dampen the sound,” he explains. From then on, the software could work its magic.

Robin wrote a Python script to drive the GPIO pins and therefore the solenoids, and he delayed the notes sent to the glockenspiel by 0.3 seconds. This compensated for the delay between the Sonic Pi playing a note and the note sounding in a speaker, allowing the Pi to accompany the glockenspiel without it sounding odd.

“I’m now looking at a system of overhead hammers,” Robin reveals, keen to continue refining the project. “This will open up the range of glockenspiels that can be used.”

Below Robin has been experimenting with overhead hammers, winding 38 cm of insulated 1/0.6 wire around a biro to form a weak spring, allowing them to return to a rest position clear of the metal notes


Projects SHOWCASE

DIEMO NIEMANN

Diemo Niemann is the CEO of the
Save-Nemo Foundation in Germany
which is helping to save coral reefs
from damage.
save-nemo.org

NEMO-PI

Coral reefs are threatened by divers and climate change, but one organisation hopes to save them by using a Raspberry Pi as an underwater ‘weather station’. David Crookes reports

Quick Facts

> Nemo-Pi has been tested to a depth of up to 50 m
> The first Nemo‑Pi has been running for a year
> Each Nemo-Pi setup costs $600 max
> The price depends on the number of sensors
> Eight sensors are currently being tested

For the past two years, the Save Nemo Foundation has worked hard to protect the coral reefs off the coast of Thailand and Indonesia. Its members have been sinking concrete blocks next to the reefs, allowing diving and snorkelling boats to safely moor by using them as anchor points. In doing so, they’ve eased the problem of boat crews dropping anchor directly into the reefs, which has already caused significant damage.

But while that has had a positive effect on the creeping destruction, the organisation spotted another opportunity. “We realised we could do more by making these moorings smart,” says its CEO Diemo Niemann. “So we created a plan to collect underwater physical and chemical data that is not only essential for science, but helpful for local people and their business.” The result? Nemo-Pi, a device able to measure temperature, visibility, pH levels, and the concentration of CO2 and nitrogen oxide at each anchor point.

Oh buoy

Every one of the concrete moorings has a buoy attached on top and, as it bobs in the water, it shows boat crews where they can anchor. The idea behind Nemo-Pi is to put an encased Raspberry Pi into the buoy, attach it to a solar panel for power, include a GPS device so that its location can be determined, and run an array of sensors from

A waterproof 10 W solar panel is controlled by this WittyPi Mini RTC and power management module

The Pi can be powered each day for 90 minutes – enough for 15 daily measurements to the server. New GSM/GPRS modules consuming under 2 Ah of power are being tested

Housed within a saltwater-resistant buoy, it’s important that the Pi has sufficient power to collect and send data


MONITORING THE SEAS

The Nemo-Pi project, which won a Google Impact Challenge 2018 award, needs volunteer programmers. Interested? Email [email protected]

the computer into the sea that can then feed back vital information.

A team of programmers has been busy coding in Python and C++ on a slimmed Debian environment to create the Nemo‑Pi itself. Meanwhile, much testing has been carried out to ensure the project is saltwater resistant and able to withstand high levels of UV irradiation. It is important that the entire setup is simple, sustainable, affordable and reliable, not to mention energy-efficient.

Saving energy

“The Nemo-Pi has a modified real-time clock and GPRS/GPS hub,” Diemo explains. “With this, the device is powered up and down to save energy and send its data direct to our server, which takes care of the visualisation and processing. During the night, Nemo-Pi is automatically powered off and we have developed a library to optimise data transmission against sunlight, power consumption and battery load. It means a Pi can operate endlessly – or at least until the battery or solar gives up.”

Although the project is still in a pre-production stage (“we have eight sensors under long-term alpha and beta testing,” says Diemo), the setup works well on a number of levels. Since it allows water conditions to be monitored, holidaymakers – as well as underwater, hotel, and cruising businesses – can determine whether it’s worth making a trip.

“Hundreds of dives and snorkelling trips have been cancelled each day while out on the water because of bad conditions,” Diemo continues. “If you know where to be at the right time, you can save gasoline, working hours, and unnecessary anchoring, and because of that we can help to generate corals and sea life.”

Nemo-Pi also assists scientists, universities, and governments by picking up on signs of climate change and dangers to marine wildlife. “It is a small but important step against global warming and pollution of the sea and it all helps to make our ocean more blue again,” Diemo concludes.

>STEP-01

On the surface

Buoys are attached to concrete moorings. Inside the prototypes for Nemo-Pi is a Raspberry Pi – chosen over an Arduino following a year of research. A flexible and robust solar panel and battery unit is also included.

>STEP-02

Senses from the deep

Sensors connect to the Pi and run out from the buoy. They are submerged, measuring different physical and chemical data at various depths. New sensors are being added, such as one measuring refraction of light underwater – testing suspended matter/visibility.

>STEP-03

Processing the data

Readings taken from the sensors are uploaded live to a public web server. A dashboard displays the data and allows observers to figure whether tours are feasible or whether the conditions need further scientific analysis.


Tutorial RASPBERRY PI 101: BEGINNER’S GUIDE TO RASPBIAN FILE MANAGER

MASTER THE RASPBIAN FILE MANAGER

Find and manage files and folders on your Raspberry Pi and connected servers

You’ll Need
> Raspberry Pi running Raspbian
> Keyboard
> Mouse or trackpad

File Manager sits at the heart of the Raspbian operating system, giving you a graphical, easily navigated view of your files, folders, and drives. Switchers from Windows or macOS might find the unfamiliar Linux folder structure daunting at first, but – as we’ll demonstrate here – you can easily slim down what’s on show for a clearer view of your own personal assets.

Once you get used to the structure and start to learn where Raspbian places files that are stored outside your user folder – like web assets at /var/www/html/, for example – you can skip directly to those folders without stepping out of the graphical environment. Use CTRL+L or ALT+D to move the cursor to the path box and type in the location directly.

You can also connect to remote folders on WebDAV and FTP servers, thus managing websites directly within File Manager, without the kind of FTP software commonly required on other platforms.

Don’t confuse File Manager with the system menus. These are shortcuts to specific apps or utilities installed on your Raspberry Pi. The File Manager is far more flexible, giving you direct access to specific locations, and putting a friendly, graphical face on the text commands like ls (to list files), cd (to change directory), and mv (to move or rename files) used in the Terminal environment.
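For comparison, the three Terminal commands mentioned above behave like this – the directory and file names here are just examples, not anything File Manager creates for you:

```shell
cd "$(mktemp -d)"          # work somewhere disposable
mkdir docs && touch docs/notes.txt
ls docs                    # list files: shows notes.txt
cd docs                    # change into the directory
mv notes.txt notes-old.txt # move (here: rename) the file
ls                         # now shows notes-old.txt
```

Everything you do in File Manager with clicks and drags boils down to commands like these.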

Use the Return and Parent icons to navigate back to the previous directory

Files and folders are displayed in the main central window

The side pane displays all the files and folders in the directory tree

HOW TO: NAVIGATE FILES AND FOLDERS IN RASPBIAN

>STEP-01
Open File Manager
Click the File Manager icon in the Menu bar. By default, File Manager shows large icons for each file and folder from your Home directory. All the files and folders are also displayed in the sidebar.

>STEP-02
Moving around
Double-click a folder (such as Documents) to move the main window to that folder. Now use the Return (left) or Parent (up) arrow to move back to the Home directory. You can also click on folders in the sidebar.

>STEP-03
Drag and drop with ease
The easiest way to copy a file is to open a second window with CTRL+N. Now click and drag the file from one window to another.

>STEP-04
Quickly Go
Click on the Go menu to find shortcuts to the Home folder, Desktop, and Wastebasket. You can also click the down arrow to the right of the directory path to reveal a history of recently visited places.

>STEP-05
Open files
Double-click a file to open it in the default program. You can choose a different program to handle the file by right-clicking the file and picking ‘Open with…’. Close the file using the Close (‘X’) icon when you are done.

>STEP-06
Remote file management
Manage remote files on FTP and WebDAV servers or via SSH by picking ‘Connect to Server’ from the Go menu. Pick the server type from the drop-down, then provide login details.

Tutorial WALKTHROUGH MIKE COOK

MIKE’S PI BAKERY Veteran magazine author from the old
days and writer of the Body Build series.
Co-author of Raspberry Pi for Dummies,
Raspberry Pi Projects, and Raspberry Pi
Projects for Dummies.
magpi.cc/259aT3X

KNIT YOUR OWN STRETCH SENSOR

Make a Stretch Armstrong out of code and some thread

You’ll Need
> Knitting Nancy
> Conducting thread – smooth
> Wool or other thread
> Reel holder
> A/D converter, either commercial based on the MCP3008 or as we made in The MagPi #68
> 2 × Crocodile clips
> 270 Ω resistor
> Crochet hook, snipe-nose tweezers, darning needle (optional)

Initially it seems improbable that you could knit your own sensor, but you can. With the right sort of knitting and some conductive thread, you can make a stretch sensor quite simply. There is a touch more of the experimental nature about this project than most of the ones we do, but it is none the less rewarding for that. Here at the Bakery we are old enough to have been brought up in a time when everyone was taught needlework at primary school, but unfortunately this seems no longer to be the case. We firmly believe that there is no such thing as a gender-specific activity.

Knitting Nancy

A Knitting Nancy can be made from an old wooden cotton reel and four very small nails, but for not very much more you can purchase a fancy one. These are often built in the form of a cartoon-like figure. Their purpose is to produce ‘French knitting’, which is tubular. Due to the way the thread is looped back on itself, it acquires a sort of springiness. By incorporating a conducting thread along with the basic thread, we can make a sensor that changes its resistance as it is stretched.

How it works

Conductive thread is made from tiny slivers of stainless steel foil, spun together to make a thread. We used the smooth variety for these experiments; this type is not so ‘hairy’ as the normal type, but there is nothing to stop you using that.

Stainless steel is not the best electrical conductor, but it is good enough when used to stitch circuit paths over a short distance; this is because when you stitch with it, the thread is under tension. This means that the short foil pieces make good contact with others that surround it. However, when the thread is not under tension, and even more when

Figure 1 A representation of how fibres in conducting thread are aligned when under tension (low resistance) and are bunched up under compression (high resistance)

Pictured: Stretch Armstrong display, Knitting Nancy, A/D converter, and stretch sensor

KNITTING YOUR SENSOR

>STEP-01
Making a bobbin holder
On a 150 mm by 90 mm piece of wood (actually a floorboard off-cut) we drilled three 6 mm holes part way through and glued in three 6 mm diameter 80 mm long dowels. A piece of 2 mm foam was glued to the underside to ensure a firm grip on the table and also to prevent it scratching the table.

it is under a slight compression, the thread’s fibres bunch up and tend to not make contact over their whole length. This leads to an increase in resistance of the thread. This effect is illustrated in Figure 1. The result is that as the knitted tube is stretched, its resistance drops.

Details of how to knit your own sensor are shown in the numbered steps. We got a resistance range of about 140 Ω between the stretched and unstretched sensor; this was for tube lengths of about 24 cm.

There are many videos on French knitting on the internet. One of the first we found (magpi.cc/frZJnV) shows you how to cast on and off.

Figure 2 Calculating and measuring the sensor resistance. R1 (270 Ω) sits between Vin (3V3) and the A/D input; the stretch sensor Rs sits between the input and ground, so Vout is the voltage reading across Rs:

Total current I = Vin / (R1 + Rs)
Voltage across Rs: Vout = I × Rs
Therefore Rs = R1 × 1 / (Vin/Vout − 1)

>STEP-02
Start knitting
The initial setup of the threads is shown in the photo. We started knitting a few rounds with a base thread before laying the conducting thread beside it. You then have to lift the two threads over the hooks together. Occasional gentle tugs on the thread out of the base stop the threads bunching up. Be careful not to make the tension too tight, as it will make lifting the threads too difficult and you could snap the conducting thread.

>STEP-03
Making a stitch
This image shows the three steps in making a stitch. Start by pulling the thread over the next free peg, so you have two threads on that peg. Then hook the lower thread and loop it over the upper thread and place it the other side of the peg. You can use a needle, stick or crochet hook, but our favourite was a pair of snipe-nosed tweezers. Finally, pull the thread onto the next peg and repeat.

Figure 3 Physical wiring attaching the resistor and sensor to the A/D converter

Reading the resistance

The stretch sensor is connected to an input channel of an A/D converter using crocodile clips, which are known in the US as alligator clips. In the Drum Sequencer project of The MagPi #68, we showed you how to make an A/D converter. Figure 2 shows you not only how to connect this up, but also the formula needed to calculate the resistance of the sensor. We can find the resistance of the sensor if we know the voltage across it, and divide that by the current passing through it. This is called Ohm’s law.

We have a resistor, R1, of 270 Ω connected between the 3V3 supply and the analogue input, with the sensor wired between the analogue input and ground. Resistors in series simply add up to

give the total resistance, so we can calculate the
total current. And knowing this value of current,
we can measure the voltage across the sensor.
Unfortunately, these formulae require you to know
the resistance of the sensor, which is what we are
trying to measure. Fortunately, it is just a simple
simultaneous equation problem that can be solved
by the substitution method.
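Putting those steps together in code gives the rearranged formula directly. This is a minimal sketch of the calculation only; the function name sensor_resistance and the test voltage are ours, not from the printed listing:

```python
VIN = 3.3   # supply voltage, volts
R1 = 270.0  # fixed resistor, ohms

def sensor_resistance(vout):
    """Rearrange the voltage-divider formula to make Rs the subject."""
    if vout <= 0:
        return 0.0            # no voltage reading: report zero
    if vout >= VIN:
        return float("inf")   # open circuit: infinite resistance
    return R1 / (VIN / vout - 1)

# When Vout is half the supply, Rs must equal R1
print(sensor_resistance(1.65))  # → 270.0
```

The two guard clauses mirror the divide-by-zero checks in the stretch.py listing.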


Finally, you can manipulate the resulting equation so that Rs is the subject of the equation. We will leave this as an exercise to the reader to verify the final formula. Figure 3 shows the physical wiring of the sensor to the A/D converter, with R1 being connected to 3V3 and A0, and the sensor’s connection wires to ground and A0.

Figure 4 The Stretch display for two different stretches

Stretch Armstrong

Stretch Armstrong was a doll that could be stretched
to a very large extent and then recover. This was
about 40 years ago, but it made a reappearance a few
years back. We are going to do a software version
of this, and you can use an image of your favourite


Armstrong; we chose Alexander Armstrong from Pointless, and created an image 300 pixels high by 96 wide, with a transparent background, and placed it in a folder called images. Alongside this we put the code shown in the stretch.py listing.

This Python 3 program is written under the Pygame framework, and is quite simple. The Vin and R1 variable values are defined at the start of the program. The code shows the standard values – but, for more accuracy, you can use measured values. The calculation of the sensor’s resistance is done in the readSensor function, and a couple of if statements stop the code from trying to divide by zero. The drawScreen function actually does the rescaling of the image and works out where to put the head so the feet always stay in the same place. The Ω (ohm) and ∞ (infinity) symbols are printed using their Unicode character numbers (0x3a9 and 0x221E). The results of two different stretch measurements are shown in Figure 4.

>STEP-04
Casting off
When you have knitted a long enough sensor, you can cast off. Lift the stitch off the last peg you used and loop it over the peg next to it. Then, just like a normal stitch, pull the thread that was already on the peg over it. Repeat this with the next peg and so on until you only have one stitch on one peg. Then cut the thread about 8 cm from the stitch, and thread it through a darning needle. Use this to darn the end so it is knotted and will not fray. If this sounds complicated then just view one of the many online French knitting videos.

>STEP-05
Mixing the threads
This shows three sensors we made. The top one was a combination of hairy wool, knitting elastic, and conducting thread; the middle one just knitting elastic and conducting thread; and finally, non-stretchy crochet thread and conducting thread. Note that knitting elastic is elastic with a cotton thread wrapped round it; it was first used in the 1940s/1950s for knitting swimming costumes, but the least said about that the better. We found the bottom example worked best.

Taking it further

While Stretch Armstrong is a simple example of the use of the sensor, there are many others. One that springs to mind is to use several sensors to control an on-screen virtual puppet, or even a servo-controlled real puppet. It could be used to control effects in a MIDI sound setup, or even used to weigh things. You can do this by attaching the sensor to a piece of bungee elastic so that it takes a lot more force to stretch it. Note that for most applications you don’t have to calculate the resistance, just use the raw A/D readings.

stretch.py

Language: PYTHON 3
DOWNLOAD: magpi.cc/1NqJjmV
PROJECT VIDEOS: Check out Mike’s Bakery videos at: magpi.cc/DsjbZK

import pygame, os, time, random
import spidev

pygame.init() # initialise graphics interface

os.environ['SDL_VIDEO_WINDOW_POS'] = 'center'
pygame.display.set_caption("Stretch")
pygame.event.set_allowed(None)
pygame.event.set_allowed([pygame.KEYDOWN, pygame.QUIT])
textHeight = 26 ; font = pygame.font.Font(None, textHeight)
screenWidth = 206 ; screenHeight = 500
screen = pygame.display.set_mode([screenWidth, screenHeight],0,32)
background = (150,100,40) ; pramCol = (180,200,150)
lastReading = 0.0 ; reading = 200
Vin = 3.3 ; R1 = 270 # or replace with measured values

def main():
    global lastReading, reading
    loadResource()
    initScreen()
    while(1):
        checkForEvent()
        readSensor()
        if lastReading != reading:
            drawScreen()
            lastReading = reading
        time.sleep(0.1)

def drawScreen():
    pygame.draw.rect(screen,background,(0,474,screenWidth,screenHeight),0)
    pygame.draw.rect(screen,background,(0,0,screenWidth,450),0)
    drawWords(str(reading),40,478,pramCol,background)
    if reading > 600:
        drawWords(chr(0x221E),116,478,pramCol,background) # infinity
    else:
        drawWords(str(Rs)+" "+chr(0x3a9),116,478,pramCol,background)
    stretch = 500 - reading
    if stretch > 2 :
        stretched = pygame.transform.smoothscale(armstrong, (96,stretch))
        screen.blit(stretched,(54,450 - stretch))
    pygame.display.update()

def drawWords(words,x,y,col,backCol) :
    textSurface = font.render(words, True, col, backCol)
    textRect = textSurface.get_rect()
    textRect.left = x
    textRect.top = y
    screen.blit(textSurface, textRect)

def initScreen():
    pygame.draw.rect(screen,background,(0,0,screenWidth,screenHeight),0)
    drawWords("Reading Resistance",16,454,pramCol,background)

def loadResource():
    global spi, armstrong
    spi = spidev.SpiDev()
    spi.open(0,0)
    spi.max_speed_hz = 1000000
    armstrong = pygame.image.load(
        "images/Armstrong.png").convert_alpha()

def readSensor():
    global reading, Rs
    adc = spi.xfer2([1,(8)<<4,0]) # request channel
    reading = (adc[1] & 3)<<8 | adc[2] # join two bytes together
    if reading != 0:
        Rs = R1*( ( 1/ (Vin / (reading * Vin /1024) -1 ) ) )
        Rs = int(Rs) # convert into integer
    else:
        Rs = 0

def terminate(): # close down the program
    print("Closing down")
    pygame.quit() # close pygame
    os._exit(1)

def checkForEvent(): # handle events
    global reading
    event = pygame.event.poll()
    if event.type == pygame.QUIT :
        terminate()
    if event.type == pygame.KEYDOWN :
        if event.key == pygame.K_ESCAPE :
            terminate()
        if event.key == pygame.K_d : # screen dump
            os.system("scrot -u")

# Main program logic:
if __name__ == '__main__':
    main()
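The readSensor() function above joins the two bytes returned over SPI into a single 10-bit value. The bit arithmetic on its own looks like this; the two byte values are made-up examples, not real ADC output:

```python
# Two bytes as spi.xfer2 might return them (example values)
high, low = 0x02, 0xC8

# Keep only the bottom 2 bits of the high byte, shift them up
# 8 places, then OR in the low byte to get a 0-1023 reading
reading = (high & 3) << 8 | low
print(reading)  # → 712
```

The MCP3008 only uses the bottom two bits of the first byte, which is why the & 3 mask is needed.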

Tutorial STEP BY STEP MARK VANSTONE

Educational software author from the
nineties, author of the ArcVenture series,
disappeared into the corporate software
wasteland. Rescued by the Raspberry Pi!
technovisualeducation.co.uk
twitter.com/mindexplorers

Instructions are displayed here

The game has four coloured
buttons that light up when
they are pressed

Let's get this Brian
battle underway

PYGAME ZERO PART 02

VIDEO GAME LOGIC WITH

SIMPLE BRIAN
Recreate a classic electronic game using Pygame Zero

You’ll Need
> Raspbian Jessie or newer
> An image manipulation program such as GIMP, or images from magpi.cc/keaiKi
> The latest version of Pygame Zero (1.2)
> A good memory

Long, long ago, before the Raspberry Pi existed, there was a game. It was a round plastic box with four coloured buttons on top and you had to copy what it did. To reconstruct this classic game using Pygame Zero, we’ll first need a name. To avoid any possible trademark misunderstandings and because we are using the Python language, let’s call it ‘Brian’. The way the game works is that Brian will show you a colour sequence by lighting up the buttons and then you have to copy the sequence by pressing the coloured buttons in the same sequence. Each round, an extra button is added to the sequence and you get a point for each round you complete correctly. The game continues until you get the sequence wrong.

>STEP 01
Run, run as fast as you can
In the first part of this series (magpi.cc/71), we ran our Pygame Zero code by typing the pgzrun command into a Terminal window, as previously there was no way of running a Pygame Zero program directly from IDLE. With the 1.2 update of Pygame Zero, there is now a way to run your programs directly from IDLE without using a Terminal window. You will need to make sure that you are running version 1.2 or later of Pygame Zero. If the code in Figure 1 does not run straight from IDLE then you may need to upgrade Raspbian on your Pi.

figure1.py

import pgzrun

# Your program code will go here

pgzrun.go()

Figure 1 With Pygame Zero V1.2 you can now run your program from IDLE with these extra lines top and bottom

>STEP 02
The stage is set
If we are using Pygame Zero, the point is to get things happening quickly, so let’s get some stuff on the screen. We are going to need some images that make up the buttons of the Brian game. You can make your own or you can get ours from GitHub at magpi.cc/keaiKi. The images will need to be in an images directory next to your program file. We have called our starting images redunlit, greenunlit, blueunlit, and yellowunlit because all the buttons will be unlit at the start of the game. We have also got a play button so that the player can click on it to start the game.

figure2.py

Language: PYTHON
DOWNLOAD: magpi.cc/hPLqsi

import pgzrun

myButtons = []
myButtons.append(Actor('redunlit', bottomright=(400,270)))
myButtons.append(Actor('greenunlit', bottomleft=(400,270)))
myButtons.append(Actor('blueunlit', topright=(400,270)))
myButtons.append(Actor('yellowunlit', topleft=(400,270)))
playButton = Actor('play', pos=(400,540))

def draw(): # Pygame Zero draw function
    screen.fill((30, 10, 30))
    for b in myButtons: b.draw()
    playButton.draw()

pgzrun.go()

Figure 2 Set up the screen by creating Actors and draw them in the draw() function

Above There are nine positions that an Actor’s co-ordinates can be aligned to when the Actor is created

>STEP 03
Getting the actors on stage
As we saw in part one, we can create Actors by simply supplying an image name and a position on the screen for it to go. There are several ways of supplying the position information. This time we are going to use position handles to define where the character appears. We will use the same co-ordinates for each quadrant of the whole graphic, but we will change the handle names we use. For these Actors we can use bottomright, bottomleft, topright, and topleft, as well as the co-ordinates (400,270), which is the centre point of our whole graphic. Have a look at Figure 2 and see how this is written.

>STEP 04
Look at the state of that
We now need to add some logic to determine if each button is on or off and show it lit or unlit accordingly. We can do this by adding a variable to each of our Actors. We want it to be either on or off, so we can set this variable as a Boolean value, i.e. True or False. If we call this variable ‘state’, we can add it to the Actor by writing (for the first button): myButtons[0].state = False. We then do the same for each of the button Actors with their list numbers 1, 2, and 3 because we defined them as a list.

>STEP 05
Light goes on, light goes off
We have defined a state for each of our buttons, but now we have to write some code to react to that state. First let’s make a couple of lists which hold the names of the images we will use for the two states. The first list will be the images we use for the buttons being lit, which would be: buttonsLit = ['redlit', 'greenlit', 'bluelit', 'yellowlit']. We then need a list of the unlit buttons: buttonsUnlit = ['redunlit', 'greenunlit', 'blueunlit', 'yellowunlit'].

figure3.py

import pgzrun

myButtons = []
myButtons.append(Actor('redunlit', bottomright=(400,270)))
myButtons[0].state = False
myButtons.append(Actor('greenunlit', bottomleft=(400,270)))
myButtons[1].state = False
myButtons.append(Actor('blueunlit', topright=(400,270)))
myButtons[2].state = False
myButtons.append(Actor('yellowunlit', topleft=(400,270)))
myButtons[3].state = False
buttonsLit = ['redlit', 'greenlit', 'bluelit', 'yellowlit']
buttonsUnlit = ['redunlit', 'greenunlit', 'blueunlit', 'yellowunlit']
playButton = Actor('play', pos=(400,540))

def draw(): # Pygame Zero draw function
    screen.fill((30, 10, 30))
    for b in myButtons: b.draw()
    playButton.draw()

def update(): # Pygame Zero update function
    bcount = 0
    for b in myButtons:
        if b.state == True: b.image = buttonsLit[bcount]
        else: b.image = buttonsUnlit[bcount]
        bcount += 1

pgzrun.go()

Figure 3 Adding a state to the buttons means that we can change the image in our update() function and draw them as lit
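Stripped of the Pygame Zero parts, the image swap in the update() function is plain list indexing driven by a Boolean. Here is a stand-alone sketch you can run anywhere; our Button class is only a stand-in for a Pygame Zero Actor:

```python
buttonsLit = ['redlit', 'greenlit', 'bluelit', 'yellowlit']
buttonsUnlit = ['redunlit', 'greenunlit', 'blueunlit', 'yellowunlit']

class Button:
    """Minimal stand-in for a Pygame Zero Actor with a state flag."""
    def __init__(self):
        self.state = False  # False = unlit
        self.image = None

myButtons = [Button() for _ in range(4)]
myButtons[1].state = True  # pretend the green button is pressed

# The same loop shape as the tutorial's update() function
bcount = 0
for b in myButtons:
    if b.state == True: b.image = buttonsLit[bcount]
    else: b.image = buttonsUnlit[bcount]
    bcount += 1

print([b.image for b in myButtons])
# → ['redunlit', 'greenlit', 'blueunlit', 'yellowunlit']
```

Because the lit and unlit lists share the same ordering, one counter picks the right image for either state.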

GLOBALS: You can read global variables inside a function, but if you change the value of the variable, you must declare it as global in the function.

Then we can use these lists in an update() function to set the image of each button to match its state. See Figure 3 to see how we can do this.

>STEP 06
Switching images
We can see from Figure 3 that each time our update() function runs, we will loop through our button list. If the button’s state is True, we set the image of the button to the image in the buttonsLit list. If not (i.e. the state variable is False) then we set the image of the button to the image in the buttonsUnlit list. At this stage you can test to see if your logic is working by setting the state variables at the top of the code to True and seeing whether the buttons display lit when you run the program.

>STEP 07
What happens if I press this button?
The idea of a button is that you can press it and something happens, so the next thing we need to write is a way to allow the user to press the buttons and make them light up. We can do this with the Pygame Zero functions on_mouse_down() and on_mouse_up(). What we need to test in these functions is if the mouse has been clicked down; if it has, we should set our button state to True. We also need to test if the mouse button has been released, in which case, all the buttons should be set to False. We can test the value we are passed (pos) into these functions with the method collidepoint(), which is part of the Actor object.

Above When the mouse is clicked on a button, we switch the unlit image for the lit image

>STEP 08
Ups and downs
We can write a test in on_mouse_down() for each button, to see if it has been pressed, and then change the state of the button if it has been pressed. We can then write code to set all the button states to False in the on_mouse_up() function, and our update() and draw() functions will now reflect what we need to see on the screen from those actions. Look at Figure 4 and you will see how we can change the state of the buttons as a response to mouse events. When you have added this to your program, test it to make sure that the buttons light up correctly when clicked.

figure4.py

def on_mouse_down(pos):
    global myButtons
    for b in myButtons:
        if b.collidepoint(pos): b.state = True

def on_mouse_up(pos):
    global myButtons
    for b in myButtons: b.state = False

Figure 4 Check for mouse clicks on mouse down, and set all buttons back to unlit on mouse up

>STEP 09
Write a list
Now that we have our buttons working, we will need to make a way to use them in two different ways. The first will be for the game to display a sequence for the player to follow, and the second is to receive input from the player when they repeat the sequence. For the first task we will need to build a list to represent the sequence and then play that sequence to the player. Let’s define our list at the top of the code with buttonList = [] and then make a function def addButton(): which will create an additional entry into the sequence each round.

>STEP 10
That’s a bit random
We can generate our sequence by generating random integers using the random module. We can use this module by importing it at the top of our code with: from random import randint. We will only need the randint() function, so we can import that function specifically. To add a new number to the sequence in the addButton() function, we can write buttonList.append(randint(0, 3)), which will add a number between 0 and 3 to our list. Once we have added our new number, we will want to show the player the sequence, so add a line in the addButton() function: playAnimation().

figure5.py

def playAnimation():
    global playPosition, playingAnimation
    playPosition = 0
    playingAnimation = True

def addButton():
    global buttonList
    buttonList.append(randint(0, 3))
    playAnimation()

Figure 5 addButton() generates a random number between 0 and 3, then adds this number to the end of the list called buttonList. Then we call the playAnimation() function

>STEP 11
Playing the animation
We have set up our function to create the sequence. Now we need a system to play that sequence so that the player can see it. We will do this with a counter variable called playPosition. We define this at the

start of our code: playPosition = 0. We will also need a variable to show that our animation is playing: playingAnimation = False, also written at the start of the code. We can then define our playAnimation() function that we used in the previous step. Look at Figure 5 to see how the addButton() and playAnimation() functions are written.

>STEP 12
Are we playing?
So, once we have set our animation going, we will need to react to that in our update() function. We know that all we need to do is change the state of the buttons and the draw() function will handle the visuals for us. In our update() function, we have to say: “If the animation is playing then increment our animation counter, check that we haven’t reached the end of the animation and if not then light the button (change its state to True) which is indicated by our sequence list.” This is a bit of a mouthful, so in Figure 6 we can see how this translates into code.

figure6.py

def update(): # Pygame Zero update function
    global myButtons, playingAnimation, playPosition
    if playingAnimation:
        playPosition += 1
        listpos = math.floor(playPosition/LOOPDELAY)
        if listpos == len(buttonList):
            playingAnimation = False
            clearButtons()
        else:
            litButton = buttonList[listpos]
            if playPosition%LOOPDELAY > LOOPDELAY/2:
                litButton = -1
            bcount = 0
            for b in myButtons:
                if litButton == bcount: b.state = True
                else: b.state = False
                bcount += 1

Figure 6 The update() function with a check to see if the animation is playing. If it is, we need to work out which buttons to light and move the animation on

>STEP 13
Getting a bit loopy
We can see from Figure 6 that we are incrementing the play position each time update() is called. What we want to do is keep each button in the sequence lit for several refreshes, so we divide the playPosition by a predefined number (LOOPDELAY) to get the list position that we want to display. We round the result downwards with the math.floor() function. To use this function, you will have to import the math module at the top of your code. So if LOOPDELAY is 80 then we will move from one list position (listpos) to the next every 80 times update() is called.

>STEP 14
A dramatic pause
Still in Figure 6, we check to see if we have reached the end of the buttonList with listpos. If we have then we stop the animation. Otherwise, if we are still running the animation, we work out which button should be lit from our buttonList. We could just say “light that button and set the rest to unlit”, but before we do that we have a line that basically says: “If we are in the second half of our button lighting loop, set all the buttons to unlit.” This means that we will get a pause in between each button being lit when no buttons are lit. We can then just loop through our buttons and set their state according to the value of litButton.

>STEP 15
Testing the animation
Now, ignoring the fact that we have a play button ready and waiting to do something, we can test our animation by calling the addButton() function. This function adds a random button number to the list and sets the animation in motion. To test it, we can call it a few times at the bottom of our code, just above the pgzrun.go(). If we call the addButton() function three times then three numbers will be added to the buttonList list and the animation will start. If this all works, we are ready to add the code to capture the player’s response.

% MODULO: The % symbol is used to get the remainder after a division calculation. It’s useful for creating smaller repeats within a larger loop.

>STEP 16
I need input
We can collect the player’s clicks on the buttons just by adding another list, playerInput, to the definitions at the top of the code and adding a few lines into our on_mouse_down() function. Add a counter variable bcount = 0 at the top of the function and then add one to bcount at the end of the loop. Then, after if b.collidepoint(pos): we add playerInput.append(bcount). We can then test the player input to see if it matches the buttonList list we are looking for. We will write this as a separate function called checkPlayerInput() and call it at the end of our on_mouse_down() function. As we now have the basis of our game, refer to the full listing to see how the rest of the code comes together as we go through the final steps.
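The LOOPDELAY arithmetic from Steps 13 and 14 can be tried on its own. In this sketch we shrink LOOPDELAY from 80 to 4 (our choice, purely to keep the printout short) and trace which button is lit on each simulated update() call:

```python
import math

LOOPDELAY = 4          # shortened from the tutorial's 80 for a quick trace
buttonList = [2, 0]    # a two-step sequence: blue then red

lit_sequence = []
for playPosition in range(len(buttonList) * LOOPDELAY):
    listpos = math.floor(playPosition / LOOPDELAY)
    litButton = buttonList[listpos]
    if playPosition % LOOPDELAY > LOOPDELAY / 2:
        litButton = -1  # second half of each loop: pause with all unlit
    lit_sequence.append(litButton)

print(lit_sequence)  # → [2, 2, 2, -1, 0, 0, 0, -1]
```

You can see each button held lit for the first half of its loop, then a -1 pause before the next one, exactly the behaviour Figure 6 produces on screen.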

Tutorial STEP BY STEP

CHANGING >STEP 17 brian.py
THE DELAY
Game over man 001. import pgzrun
We have used
the constant The checkPlayerInput() function will check the 002. from random import randint
LOOPDELAY
...buttons that the player has clicked against the list held in buttonList, which we have been building up with the addButton() function. So we need to loop through the playerInput list with a counter variable (let's call it ui) and write if playerInput[ui] != buttonList[ui]: gameOver(). If we get to the end of the list, and both playerInput and buttonList are the same length, then we know that the player has completed the sequence and we can signal that the score needs to be incremented. The score variable can be defined at the top of the code as score = 0. In our on_mouse_up() function, we can then respond to the score signal by incrementing the score and setting the next round in motion.

>STEP 18
Just press play

We still haven't done anything with that play button Actor that we set up at the beginning. Let's put some code behind it to get the game started. Make sure you have removed any testing calls to addButton() at the bottom of your code (Step 15). We will need a variable to check whether the game has started, so put gameStarted = False at the top of the code with the other variables, and then in our on_mouse_up() function we can add a test: if playButton.collidepoint(pos) and gameStarted == False: and then set the gameStarted variable to True. We can set a countdown variable when the play button is clicked so that there is a slight pause before the first animation starts.

TIP: LOOPDELAY is the value used for timing our loops; if the game is running too slow, decrease this value at the top of the code.

The listing begins with the setup code:

import pgzrun  # provides the Pygame Zero runtime and pgzrun.go()
import math
from random import randint  # used by addButton()

WIDTH = 800
HEIGHT = 600

myButtons = []
myButtons.append(Actor('redunlit', bottomright=(400,270)))
myButtons[0].state = False
myButtons.append(Actor('greenunlit', bottomleft=(400,270)))
myButtons[1].state = False
myButtons.append(Actor('blueunlit', topright=(400,270)))
myButtons[2].state = False
myButtons.append(Actor('yellowunlit', topleft=(400,270)))
myButtons[3].state = False
buttonsLit = ['redlit', 'greenlit', 'bluelit', 'yellowlit']
buttonsUnlit = ['redunlit', 'greenunlit', 'blueunlit', 'yellowunlit']
playButton = Actor('play', pos=(400,540))
buttonList = []
playPosition = 0
playingAnimation = False
gameCountdown = -1
LOOPDELAY = 80
score = 0
playerInput = []
signalScore = False
gameStarted = False
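The comparison logic described above can be tried on its own, outside Pygame Zero, before wiring it into the game. This standalone sketch is ours (the listing does the same job inside checkPlayerInput()): it walks the player's clicks against the target sequence exactly as the ui loop does.

```python
def check_sequence(player_input, button_list):
    """Compare the player's clicks so far against the target sequence.

    Returns 'gameover' on a wrong press, 'score' once the whole
    sequence has been entered correctly, and 'continue' otherwise.
    """
    ui = 0
    while ui < len(player_input):
        if player_input[ui] != button_list[ui]:
            return 'gameover'
        ui += 1
    if ui == len(button_list):
        return 'score'
    return 'continue'

# A round whose sequence is red(0), blue(2), green(1):
print(check_sequence([0, 2], [0, 2, 1]))     # correct so far, keep clicking
print(check_sequence([0, 2, 1], [0, 2, 1]))  # full match: round complete
print(check_sequence([0, 3], [0, 2, 1]))     # wrong button pressed
```

Returning a result rather than calling gameOver() directly keeps the sketch testable; in the game itself the same comparison sets signalScore or ends the round.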

>STEP 19
Finishing touches

We're nearly there with our game: we have a way to play a random sequence and build that list round by round, and we have a way to capture and check user input. The last things we need are some instructions for the player, which we can do with the Pygame Zero screen.draw.text() function. We will want an initial 'Press Play to Start' message, a 'Watch' message for when the animation is playing, a 'Now You' message to prompt the player to respond, and a score message to be displayed when the game is over. Have a look at the draw() function in the complete listing to see how these fit in. There are many ways we can enhance this game; for example, the original electronic game had sound too, but that will be covered in another part of this series.

USING TEXT
If you are going to use text, it's a good idea to display some on the first screen, as it can take a while to load fonts for the first time. You may get unwanted delays if you load them later.

The listing continues with the draw() function:

def draw():  # Pygame Zero draw function
    global playingAnimation, score
    screen.fill((30, 10, 30))
    for b in myButtons: b.draw()
    if gameStarted:
        screen.draw.text("Score : " + str(score), (310, 540), owidth=0.5, ocolor=(255,255,255), color=(255,128,0), fontsize=60)
    else:
        playButton.draw()
        screen.draw.text("Play", (370, 525), owidth=0.5, ocolor=(255,255,255), color=(255,128,0), fontsize=40)
        if score > 0:
            screen.draw.text("Final Score : " + str(score), (250, 20), owidth=0.5, ocolor=(255,255,255), color=(255,128,0), fontsize=60)
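The update() function in the listing drives the animation by advancing a frame counter, playPosition, and dividing it by LOOPDELAY to find which entry of buttonList should be lit; it blanks the second half of each period so that two identical buttons in a row blink rather than merge into one long flash. That arithmetic can be checked in isolation (the helper name lit_button_at is ours, not part of the listing):

```python
import math

LOOPDELAY = 80  # frames per sequence step, as in the listing

def lit_button_at(play_position, button_list):
    """Return the button index lit at this frame, -1 during the blanked
    half of each period, or None once the sequence has finished."""
    listpos = math.floor(play_position / LOOPDELAY)
    if listpos == len(button_list):
        return None  # animation over
    lit = button_list[listpos]
    if play_position % LOOPDELAY > LOOPDELAY / 2:
        lit = -1  # second half of the period: all buttons off
    return lit

sequence = [2, 2]  # blue twice in a row
print(lit_button_at(10, sequence))   # first half of step one: blue lit
print(lit_button_at(50, sequence))   # second half: everything off
print(lit_button_at(90, sequence))   # step two lights blue again
print(lit_button_at(160, sequence))  # past the end: animation done
```

The -1 result maps to no button matching bcount in the listing's loop, which is what switches every Actor to its unlit image.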

VIDEO GAME LOGIC WITH SIMPLE BRIAN | Tutorial
Language: PYTHON
Download: magpi.cc/hPlqsI

The remainder of the listing:

        else:
            screen.draw.text("Press Play to Start", (220, 20), owidth=0.5, ocolor=(255,255,255), color=(255,128,0), fontsize=60)
    if playingAnimation or gameCountdown > 0:
        screen.draw.text("Watch", (330, 20), owidth=0.5, ocolor=(255,255,255), color=(255,128,0), fontsize=60)
    if not playingAnimation and gameCountdown == 0:
        screen.draw.text("Now You", (310, 20), owidth=0.5, ocolor=(255,255,255), color=(255,128,0), fontsize=60)

def update():  # Pygame Zero update function
    global myButtons, playingAnimation, playPosition, gameCountdown
    if playingAnimation:
        playPosition += 1
        listpos = math.floor(playPosition/LOOPDELAY)
        if listpos == len(buttonList):
            playingAnimation = False
            clearButtons()
        else:
            litButton = buttonList[listpos]
            if playPosition%LOOPDELAY > LOOPDELAY/2: litButton = -1
            bcount = 0
            for b in myButtons:
                if litButton == bcount: b.state = True
                else: b.state = False
                bcount += 1
    bcount = 0
    for b in myButtons:
        if b.state == True: b.image = buttonsLit[bcount]
        else: b.image = buttonsUnlit[bcount]
        bcount += 1
    if gameCountdown > 0:
        gameCountdown -= 1
        if gameCountdown == 0:
            addButton()
            playerInput.clear()

def gameOver():
    global gameStarted, gameCountdown, playerInput, buttonList
    gameStarted = False
    gameCountdown = -1
    playerInput.clear()
    buttonList.clear()
    clearButtons()

def checkPlayerInput():
    global playerInput, gameStarted, score, buttonList, gameCountdown, signalScore
    ui = 0
    while ui < len(playerInput):
        if playerInput[ui] != buttonList[ui]: gameOver()
        ui += 1
    if ui == len(buttonList): signalScore = True

def on_mouse_down(pos):
    global myButtons, playingAnimation, gameCountdown, playerInput
    if not playingAnimation and gameCountdown == 0:
        bcount = 0
        for b in myButtons:
            if b.collidepoint(pos):
                playerInput.append(bcount)
                b.state = True
            bcount += 1
        checkPlayerInput()

def on_mouse_up(pos):
    global myButtons, gameStarted, gameCountdown, signalScore, score
    if not playingAnimation and gameCountdown == 0:
        for b in myButtons: b.state = False
    if playButton.collidepoint(pos) and gameStarted == False:
        gameStarted = True
        score = 0
        gameCountdown = LOOPDELAY
    if signalScore:
        score += 1
        gameCountdown = LOOPDELAY
        clearButtons()
        signalScore = False

def clearButtons():
    global myButtons
    for b in myButtons: b.state = False

def playAnimation():
    global playPosition, playingAnimation
    playPosition = 0
    playingAnimation = True

def addButton():
    global buttonList
    buttonList.append(randint(0, 3))
    playAnimation()

pgzrun.go()

August 2018 raspberrypi.org/magpi
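Stepping back from the Pygame Zero specifics, the round structure of the game is simple: addButton() appends a random button to buttonList and replays the animation, the player's clicks are gathered into playerInput, and a complete match signals a score and starts the next round. That cycle can be exercised without any graphics; this simulation is ours, not part of the listing:

```python
from random import randint

button_list = []  # the growing target sequence, as in the listing
score = 0

def add_button():
    # Start a round: extend the sequence with a random button index (0-3).
    button_list.append(randint(0, 3))

def round_won(player_clicks):
    # The round is won when the clicks exactly echo the sequence.
    return player_clicks == button_list

# Simulate three rounds with a perfect player who echoes the sequence back.
for _ in range(3):
    add_button()
    if round_won(list(button_list)):
        score += 1

print(score)  # a perfect player scores one point per round: 3
```

Because the 'player' simply copies the sequence, the result is deterministic even though the sequence itself is random, which is a handy property when testing game logic.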

Tutorial | STEP BY STEP

WESLEY ARCHER
Raspberry Pi fan and the man behind Raspberry Coulis. Enjoys building Pi-related projects and writing up guides.
raspberrycoulis.com
twitter.com/RaspberryCoulis

MINI PI-POWERED SMART MIRROR
Build your own mini smart mirror to display the temperature and humidity

You'll Need
> Raspberry Pi Zero W
> Micro Dot pHAT (magpi.cc/2cfq7Ob)
> BME280 sensor
> Prototyping wire and soldering iron
> MOSSEBO 13×18 cm photo frame (IKEA)
> Reflective mirror window film
> Beebotte account (beebotte.com)

New parents can be obsessed with ensuring their child is not too hot or cold. Sure, they could use a standard thermometer, but that's not as fun as using a Raspberry Pi. Smart mirrors are all the rage, and we'll show you how to build a mini smart mirror to display the temperature and humidity in a room, without breaking the bank.

>STEP-01
Prototype your circuit first

We recommend testing your circuit with a breadboard before soldering anything. The BME280 sensor is 3V tolerant and connects to the SDA and SCL pins on the Pi, but shares them with the Micro Dot pHAT, so it will require intricate wiring and soldering. We found that stripping a small part of the wire in the middle and soldering to that, so the wire is shared, worked well. Or you could solder two wires to the one GPIO pin; i.e. one at the top, one at the bottom.

>STEP-02
Applying the mirrored window film

We found a UK supplier of mirrored window film that issued free A5-sized samples; they've since started charging 99p, which is still great value. Applying the mirrored window film is fairly straightforward, but make sure you follow the instructions! We used soapy water, a squeegee, and patience when applying ours to make sure no air bubbles were trapped, and we applied two layers to add to the effect. Make sure it dries!

The data from the BME280 sensor is displayed on the Micro Dot pHAT
The MOSSEBO frame from IKEA is ideal and inexpensive for this build
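The sensor-reading and display code comes later in the original tutorial. As a rough sketch of the idea, here is how readings might be squeezed into the Micro Dot pHAT's six characters; the format_reading helper is ours, and the commented-out lines assume the Pimoroni bme280 and microdotphat libraries rather than anything shown in these steps:

```python
def format_reading(temperature, humidity):
    """Format a temperature (deg C) and humidity (%) reading as short
    strings suitable for a six-character display, e.g. '21.3C', '48%'."""
    return "{:.1f}C".format(temperature), "{:.0f}%".format(humidity)

# On the Pi itself you would feed this from the sensor over the shared
# I2C bus, for example (assumed libraries, not part of this article):
#   from bme280 import BME280
#   from microdotphat import write_string, show, clear
#   bme280 = BME280()
#   temp_text, hum_text = format_reading(bme280.get_temperature(),
#                                        bme280.get_humidity())
#   clear(); write_string(temp_text); show()

print(format_reading(21.34, 48.2))
```

Keeping the formatting separate from the hardware access means the display logic can be tested on any machine, with the I2C reads swapped in only on the Pi.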

