What was the inspiration behind Bullet Pi?
So I’d seen The Matrix and also a BBC programme called Supernatural: The Unseen Powers Of Animals, where they shot dolphins jumping out of the water and they could just freeze time, and then spin round so you saw the water drops, and I thought that was a really amazing effect. And reading how it was done, it was with a whole array of cameras and it seemed quite simple. So I had the Raspberry Pi and a couple of cameras sat on my desk, and it was sort of like, ‘Well, if it seems that simple, then surely if we just got a whole load of Raspberry Pis and cameras – and they’re not that expensive – why shouldn’t we just be able to build it?’ It was one of those moments where you think it should work but you don’t actually know. So what I did was buy four cameras and I tried it, and that looked like it would work – it gave us a tantalising glimpse that we could do it.
So it was a good concept test?
Yeah, but even so it wasn’t conclusive – it told us it was looking likely, but we still didn’t actually know. So then it was just a case of buying a few more cameras. You get to the stage where all you can think is, ‘Right, let’s go for it. Let’s have a full set of 48 cameras and just build it.’ It was only really when we got the first set of images out, and then stitched them together, that you could see that having a whole load of cameras and firing them all at once is pretty much all we needed. It was one of those things where ‘Wouldn’t it be cool if…?’ became ‘You know, we might be able to…’ and then ‘Oh, we’ve done this!’
That’s where the best Pi projects come from! So how do you set it off?
It’s gone through a few iterations and, as we’ve tried it, we’ve made a few refinements. In the latest setup, we’ve got all the Pis booted into Raspbian and, on boot, they run a little Python script we’ve got. We’ve fitted them so every Pi’s got a PiFace Control and Display module that you can have simple interactions with using buttons – it just drives a menu. So that lets us know that all the Pis have booted up, because if you’ve got 48 Pis and a network cable comes out, it’s tricky to find out which one when they all look the same. That was really just a luxury in terms of helping to debug, and it also means we can reconfigure the array if we want to do a lateral version.
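The boot-time check could be sketched along these lines – a minimal illustration, not the actual Bullet Pi script: each Pi broadcasts an ‘I’m up’ message so the operator can see which of the 48 units have booted. The port number, message format and camera IDs here are all assumptions.

```python
# Hypothetical boot announcer for one camera Pi in the array.
# On boot, broadcast a ready message so the operator can tell
# which units have come up without checking 48 identical boxes.
import socket

STATUS_PORT = 5005  # assumed port for status broadcasts


def status_message(camera_id: int) -> bytes:
    """Build the 'I have booted' message for this camera position."""
    hostname = socket.gethostname()
    return f"READY camera={camera_id:02d} host={hostname}".encode()


def announce(camera_id: int) -> None:
    """Broadcast the ready message once over UDP."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(status_message(camera_id), ("255.255.255.255", STATUS_PORT))


if __name__ == "__main__":
    announce(7)  # this Pi's position in the 48-camera ring
```

A listener on the operator’s machine would then tick off camera IDs as the messages arrive; in the real rig, the PiFace menu provides the same reassurance locally on each unit.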
What are you using to stitch the images?
That’s FFmpeg. We have had to do some things with the alignment just so you get a smooth video out, because some of the cameras are at slightly different angles, and if you don’t correct for that the picture tends to jump up and down. So basically we take a set of calibration images, work out whether there’s a positive or negative vertical offset for each camera, and then when we stitch them together we effectively chop off the top and the bottom and just use the middle of the frame.
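The offset-and-crop idea could look something like this sketch: given a measured vertical offset per camera from the calibration shots, compute an FFmpeg `crop` filter expression for each one so every frame keeps the same middle band. The offsets, frame sizes and filenames are made up for illustration.

```python
# Sketch: turn per-camera calibration offsets into FFmpeg crop filters
# so every frame in the stitched sequence keeps the same middle band.

def crop_filters(offsets, frame_h, frame_w):
    """Return an ffmpeg crop expression per camera.

    offsets: measured vertical offset per camera, in pixels
             (positive = image shifted down relative to the reference).
    """
    margin = max(abs(d) for d in offsets)   # worst-case misalignment
    out_h = frame_h - 2 * margin            # common output height
    # FFmpeg's crop filter is crop=w:h:x:y, where x:y is the
    # top-left corner of the region to keep.
    return [f"crop={frame_w}:{out_h}:0:{margin + d}" for d in offsets]


# Example with four cameras and a 2592x1944 still (made-up numbers):
filters = crop_filters([0, 4, -3, 1], frame_h=1944, frame_w=2592)
# filters[0] == "crop=2592:1936:0:4"
# Each frame would then be cropped with something like:
#   ffmpeg -i cam00.jpg -vf "crop=2592:1936:0:4" cam00_aligned.jpg
```

Cropping every camera to the same output height, shifted by its own offset, is what stops the picture bobbing up and down as the view spins round.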
Is that handled by a script or do you do it manually once you’ve taken the shots?
We actually trigger it in two parts, but there’s nothing to stop the two scripts being put together on the command line and all being run at once. The only reason we run it in two stages is that the script to stitch the images together takes about a minute and sometimes, if you take an image and then the person you’re taking the image of says ‘I don’t like that, let’s do it again’, it saves cancelling something that’s already started. So for practical reasons we run it as two separate operations, but effectively they’re just two commands: one command to take the images and one command to stitch the images, and there’s nothing to stop it all being automated. You could put a pressure pad in there so that when people stood in the middle of the bullet rig it would automatically trigger.
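The two-stage workflow described above could be wrapped like this – a sketch only, with placeholder script names standing in for the real capture and stitch commands:

```python
# Sketch of the capture-then-stitch workflow. The script names
# are placeholders, not the actual Bullet Pi commands.
import subprocess


def plan_commands(take_name: str, auto_stitch: bool = True):
    """Return the commands for one take, in the order they would run."""
    cmds = [["./capture_all.sh", take_name]]   # fire all 48 cameras at once
    if auto_stitch:
        # Stitching takes about a minute, so skip it if a retake is likely.
        cmds.append(["./stitch.sh", take_name])
    return cmds


def run_take(take_name: str, auto_stitch: bool = True) -> None:
    """Run the planned commands; a pressure pad or key press could call this."""
    for cmd in plan_commands(take_name, auto_stitch):
        subprocess.run(cmd, check=True)
```

Keeping the capture and stitch steps as separate commands is what makes the retake case cheap: you only re-run the first command until the subject is happy, then stitch once.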
Is the PiFace a crucial component or was it just something you wanted to use?
It was a luxury for us that we had it kicking around. It turned out to save us time when we were debugging. It’s not essential – you can build it without – but it made life much easier. We’ve also tried it in other configurations, and being able to go through a menu on each Pi and set where it is in the sequence is a lot easier when you can relate to the Pi you’ve got in your hand, rather than having to look up what number it is, SSH into it and set a file.
What configurations have you tried?
We’ve tried semicircles and also straight lines, and that way you get the effect of the camera panning along, as opposed to spinning around a point. We’ve got future plans, which do involve taking it into more wild and wacky situations. So far we’ve learned from the examples we’ve done inside, and as we’ve got the setup faster and more automated, and picked up all the little tweaks and experience from it, we’re becoming more ambitious with it. There are definite plans to mount it on impressive things outside and get some sports photography going with it. There’s one person who actually booked us for their wedding after they’d seen it – they wanted photographs of their guests, and it would be different, something unusual to capture the guests in a 360-degree spin. That’s the other nice thing about it – community. You put something out there online and suddenly it goes off in a different direction.
Do you plan to take it to any more Jams?
I think we’d like to do more things with it. One of the things I’d like to do is try and play around more with fusion of movements. So rather than having someone standing in the circle and then performing in the circle, see whether we could actually get a portable version and take the rig around – that way you could follow the action, if you like, and also get the video coming out of the Raspberry Pis. I would also like to experiment with background motion blur, things like that. We also think fireworks could be quite interesting, come November time.