So your Pi Glove grew out of Project New York – tell us about that.
I was invited to the second Picademy back in July. The first day was workshops and education, and the second day was a pure hack day, and we had a bit of a chat with Eben [Upton] and some other people who wanted to support us and build something. We wanted to come up with something you could physically use, and we decided to use Scratch GPIO.
So we came up with this idea that if you touched a thumb and finger together, it would move a character on-screen, so that was the original Michael Jackson glove. And at the end we thought that we could actually make something more ergonomic than a phone or a Google Glass – something that fits on the hand – and I said I’d go away and use Python to make something creative for social media, so taking photographs, playing music.
That’s how it started. The Project New York element was actually from my son. I showed him the video after Picademy and he asked what I was going to do now, and I said I’d build a glove that lets you do all this stuff. And he said, “Huh, right. We live in a village – this isn’t New York.” So that was it then – I dedicated the project to New York!
What is the Pi Glove set up to do?
From the index finger, if you press the first button it’ll take a picture using picamera. The second button will tweet that picture to your Twitter account (saying something like “This picture was taken at this time”). The third button is an mp3 player; so you’ve got a list of ten songs on there and it randomly picks one and plays that – that was using Pygame. And then the last one… you know Paul Beech, from Pimoroni? Yeah, so he was on the original Picademy course with me. And after that I did something called Deer Shed, which is kind of a fine technology festival out here in the sticks, and he was there too.
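The four-button layout described here is easy to picture as a simple dispatch table. This is only a hypothetical sketch of that mapping – the button numbers, action names and helper are assumptions, not the actual Pi Glove code – with the third button's random song choice shown the way Pygame-based players typically do it:

```python
import random

# Hypothetical mapping of the four index-finger buttons to actions;
# names are illustrative, not taken from the real Pi Glove source.
ACTIONS = {
    1: "take_picture",   # capture a photo with picamera
    2: "tweet_picture",  # post the latest photo to Twitter
    3: "play_music",     # play a random song through Pygame
    4: "train_times",    # read out the next departures
}

def dispatch(button):
    """Return the action name for a finger button press."""
    return ACTIONS.get(button, "unknown")

def pick_random_track(tracks):
    """Choose one mp3 from the list, as the third button does
    with its list of ten songs."""
    return random.choice(tracks)
```

In a real glove, each action name would be a function wired to a GPIO callback; the chosen track would then be handed to `pygame.mixer.music` to play.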
So we were chatting and one of the things we were talking about is train times – I was saying that if you’re at the station and want to check when the next train arrives, you have to get your phone out, check the app, load it up – wouldn’t it be better if you pressed a button in your pocket and it read it out in your ear, which platform to go to and stuff like that? So he told me to get in touch with this guy who’d done some scraping. Basically, he uses Google Sheets to scrape the data, which downloads as a text file onto the Raspberry Pi; the Pi then parses it and reads it out. You’ve got to be connected to Wi-Fi, and I developed it so that it always works between two stations that you travel between, but the next stage is to develop something to find out where you are and then pull the data for that.
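The parse-and-read-out step could look something like the sketch below. The comma-separated file format is an assumption – the real Google Sheets scrape may produce something different – but it shows the idea of turning the downloaded text into sentences ready for a speech engine:

```python
# Hypothetical parser for the downloaded train-times text file.
# Assumed format (one departure per line): "09:14,2,York"
# i.e. time, platform, destination – not the actual scrape format.

def parse_departures(text):
    """Turn each departure line into a sentence to be spoken."""
    sentences = []
    for line in text.strip().splitlines():
        time, platform, dest = line.split(",")
        sentences.append(
            f"The {time} to {dest} leaves from platform {platform}."
        )
    return sentences
```

Each sentence would then be passed to eSpeak so the glove can read the departures into your ear.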
So you have four finger buttons – do you have plans for some kind of hierarchical structure? Maybe a palm button so that if you go down into the mp3 layer, you have volume control, track skipping and so on mapped out?
That will be the next stage. I was chatting to sixth form students about what they want to do, and it’s simple things like checking the time. Someone said, “If I wake up in the middle of the night then I usually go for my phone – how good would it be to just press a button and have it tell me the time?” And we got talking about things like if you have special needs or you require assistance, how you could develop for that – you could have some kind of extra button to press that would load up a particular menu.
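One way the hierarchical idea discussed here could be modelled is as a nested structure, where a palm button descends into a sub-menu whose entries remap the finger buttons. All of the menu names below are hypothetical, just to illustrate the shape:

```python
# Hypothetical nested menu layout: a sub-menu is a dict, a leaf
# action is None. A palm button press would call enter() to descend.
MENU_TREE = {
    "camera": None,
    "time": None,
    "music": {
        "volume up": None,
        "volume down": None,
        "skip track": None,
    },
}

def enter(tree, choice):
    """Descend into a sub-menu; returns None for leaf actions."""
    return tree.get(choice)
```

So entering the music layer would hand back the volume and track-skipping controls, ready to map onto the four finger buttons.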
So I use Python eSpeak to read out what menu you’re inside, because there’s no GUI or anything to tell you where you are other than this voice saying, “You are now in the camera”. It’s not a polished article by any means but it works. I was looking online to see what there is already and I was struggling to find stuff – there’s an Arduino drum glove which you move up and down, and when you touch your fingers on the desk it plays a drum beat, but there was nothing else like this. Even in terms of the hardware – what is out there? So eventually I envisage having some kind of glove where it’s more tactile, rather than pushing buttons.
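The voice feedback described here – announcing the menu because there is no GUI – can be sketched by building the spoken phrase and handing it to eSpeak. This assumes the `espeak` command-line tool is installed; the real glove uses Python eSpeak bindings, so treat this as an illustration rather than the actual code:

```python
import subprocess

def menu_message(name):
    """Build the spoken confirmation for entering a menu."""
    return f"You are now in the {name}"

def announce(name):
    """Speak the message aloud (requires the espeak CLI tool)."""
    subprocess.run(["espeak", menu_message(name)])
```

Calling `announce("camera")` would then say “You are now in the camera”, exactly the kind of audible breadcrumb described above.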
Have you heard of Imogen Heap and the project MiMu gloves? They’re very different, with gesture control.
Yes! Well, Sam Aaron, who does Sonic Pi, was at one of the Picademy sessions, and the next day, when he saw the project, he was talking about a friend of his who is trying to get more audience participation in music; so if the audience moves to the left then the volume goes up, and if they move to the right then something else happens. Phenomenal. He was very passionate about the idea of the audience impacting on the music they hear, rather than having ‘I’m a performer, I’m just going to play something’. If the audience doesn’t like a certain bit and wants to change it, they just walk to the right-hand side and then it alters it.
You could add tilt control to the Pi Glove, so it could read if you’re tilting to the left or right with a wave.
Well, it started with the idea that everyone takes pictures – Instagram is bombarded with pictures – but actually, how many times are people posing for those pictures and how many times are they just clicking away? The original concept was that if I’ve got to get a phone out to take a picture, okay, that’s fairly quick, but in twenty to fifty years’ time people will look back and say, “Do you remember when we used to have to unlock our phones and put the code in, and we had to wait for this app to load – isn’t it better to just press a button and it takes a picture?”
So the idea was that the camera would be located in the breast pocket or something like that – near eye level, taking a picture of what’s in front of you – and the issue with that is the quality of the picture, but I think people are becoming so snap-happy that you just want to be able to take photos quickly if something happens. It’s ironic, actually – Barack Obama just asked for funding for 50,000 police cameras mounted on officers’ uniforms, to take pictures and stuff like that, so it’s the same idea. And I think that’s the next step: people walking around with these cameras.