ElectroSuper LED tunnel

We speak to artists and electricians Fred Sapey-Triomphe and Yann Guidon, who make Mons railway station sparkle with a supersized LED installation



Fred Sapey-Triomphe
Fred Sapey-Triomphe is a visual artist who lives and works in Paris. Trained at the École Nationale Supérieure des Beaux-Arts in Paris and the École Boulle, Fred produces digital art installations, using light as a raw material.
Yann Guidon
Yann Guidon is an engineer who specialises in electronics and algorithmics. He has designed two open source CPU architectures – the F-CPU and the YASEP – and collaborated with many other artists on LED installations.

The ElectroSuper installation looks amazing! How long have you both been collaborating?

Fred  Yann and I have been working together since February 2013. I wanted to create a large-scale LED display, but I’m a visual artist with no background in electronics, so I couldn’t make it by myself. I met Yann through mutual friends and went to his studio early in 2013 – I had been asked to do another project for a show in Normandy, France, so I had a little budget and I asked Yann if he would like to start working on it.

Yann  And it was not one project but two projects!

One of these was the Rosace, right?

Fred  Yes, it’s an electronic rosace inspired by the antique rosace gadget (Ed: a persistence of vision mill, like a zoetrope or phenakistoscope), so that was the first project we worked on together. It was quite a challenge indeed! Very complex. It turned out wonderfully, and it was great to work with him.

How did you begin work on the ElectroSuper at Mons railway station?

Fred  Well, basically we had a show in 2014 at the Centre Pompidou-Metz, in the east of France, close to the German border and very close to Belgium, and a guy there in Metz told me that the city of Mons was looking for something to dress up its railway station.

Part of the European capital project, right?

Fred  Yeah exactly, so they were looking for a project to cover up the railway station.

Yann  It’s a temporary railway station made of prefabricated materials – it’s not very nice, not up to the standard and panache of the event.

Fred  So this guy said, “Why don’t you send your portfolio to Mons? Your work could fit very well with what they are looking for.” So that’s what I did, and a few weeks later I received a positive answer, and then we started negotiations – it was kind of complex, but we finally signed the contract three months before the opening. We only had two months to produce the whole piece, which is a 42m long ceiling screen, so it was a big piece and we had very little time to achieve it.

The installation interacts with passers-by in the tunnel – how does that work?

Fred  The idea was to cover the ceiling of a passenger walkway – people getting off the train have to take this path to get into the station, so this is the first thing visitors see as they arrive in Mons. We were asked to create something engaging, powerful, colourful, something that would put the visitor in a good mood for their visit to the city. We wanted it to be interactive, so Yann decided to put in four massive infrared sensors, at the entry and exit points of the tunnel. The images displayed on the screen change according to the number of visitors.

Yann  I put each pair of sensors one metre apart, so when I pick up a series of pulses I know that something is moving in one direction on the pathway.
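The logic Yann describes can be sketched in a few lines: with two sensors a metre apart, whichever one fires first tells you which way someone is walking. This is a hypothetical illustration only – the sensor names, event format and function are assumptions, not the installation’s actual code.

```python
# Hypothetical sketch: infer walking direction from a pair of IR sensors
# mounted one metre apart at one end of the tunnel.

def direction(events):
    """events: list of (timestamp, sensor_id) tuples for one gate, where
    sensor_id is 'outer' or 'inner'. Returns 'entering', 'leaving', or
    None if only one sensor fired."""
    first = {}
    for t, sensor in sorted(events):
        first.setdefault(sensor, t)  # keep the earliest pulse per sensor
    if 'outer' in first and 'inner' in first:
        # The sensor that fired first reveals the direction of travel.
        return 'entering' if first['outer'] < first['inner'] else 'leaving'
    return None

print(direction([(0.0, 'outer'), (0.4, 'inner')]))  # entering
```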

Is it reacting to the direction alone or the number of passengers as well?

Yann  It’s not that fine-grained, because the sensors that could fit the project have about 30 seconds of release time, so if more people come I cannot pick them up!

So it’s kind of tidal, then – every time there’s a new movement, it sends another wave of colour?

Fred  Yes, that’s a good way to explain it. Also, this project runs for the whole year, and we designed it so that the visual effect varies with the seasons. Right now it’s August and there is more daylight than in December, so we had to create specific videos for each season.

Aside from the sensors, what other hardware are you using?

Yann  Fred designed the whole structure and helped build it – there are a lot of wooden structures, with special treatment for the wood because it has to withstand snow, rain and sun. He found premade elements onto which we could affix LED strips – 2 x 3.5 metres, like a tile. So we split the whole surface into six sections; each section is 40 x 70 LEDs, so 2,800 LEDs. We have 16,800 in total and it’s about one watt per LED, so if you multiply everything out you get more than 16 kW. The length is 42m and we have to transmit data across this distance, which creates problems of integrity, reliability and so on, so I chose Ethernet for data transmission because it’s cheap and well supported.
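The figures Yann quotes multiply out as follows – a quick arithmetic check, nothing more:

```python
# Recomputing the numbers from the interview: six sections of 40 x 70 LEDs,
# at roughly one watt per LED.
sections = 6
leds_per_section = 40 * 70              # 2,800 LEDs per section
total_leds = sections * leds_per_section
power_watts = total_leds * 1.0          # ~1 W per LED
print(total_leds, power_watts / 1000)   # 16800 LEDs, 16.8 kW
```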

Yann  We are very careful about reliability, and we have a lot of experience now with making something that is not too expensive but that works and withstands the weather and other hazards. Many people start by driving a WS2812 strip with an Arduino, which works with one strip, and then to make a screen they add more and more; it will work on the table, but when you move to outdoor installations the constraints are enormous and an Arduino doesn’t cut it. So I created a special board, the WizYasep, for driving thousands of LEDs, and you can put boards in parallel to drive even more and make the system more reliable.

So you are using a different WizYasep board for each section?

Yann  Yes, I used a hierarchical structure. At the top we have a single Raspberry Pi B+ running a customised Raspbian – for example, I removed everything to do with X Window, which saved something like 1 GB, leaving even more space for storing the videos. Then I made some adaptations: I hardened the operating system so it runs in read-only mode. In these installations we never know if or when the power supply will be cut, so there is a risk of wearing out the SD card if it is storing logs and so on. So I put the system in read-only mode; when I log in via SSH it switches to read-write mode so I can configure it, then I reboot and it starts a simple script that runs a simple C program, which reads data from the SD card and sends it over the network. There is also a modification to add a real-time clock, because we are in a public space and there is the issue of saving energy. When the sun is out there is no point having the LEDs working, so from about 10am to 5pm the system is turned off – it stops sending data – and when the WizYasep boards see that no data is coming, they set everything to black so that it draws less current.
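The daytime blanking schedule could be expressed as a simple time-window check against the real-time clock. The cutoff times below come from the interview; the function itself is a minimal sketch, not the installation’s code.

```python
# Sketch of the energy-saving schedule: between roughly 10:00 and 17:00
# the Pi stops sending frames, and the controller boards blank the LEDs.
from datetime import time

def leds_active(now):
    """Return False during the daytime blanking window (~10:00-17:00)."""
    return not (time(10, 0) <= now < time(17, 0))

print(leds_active(time(12, 30)))  # False: daytime, screen blanked
print(leds_active(time(21, 0)))   # True: evening, screen on
```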

The Raspberry Pi is connected to a little 8-port 100 Mbit hub, and the average load of the Raspberry Pi at full speed is about 10-15% CPU usage. That’s fine, because one frame is about 50 KB, and multiplied by 25 frames per second that’s less than 1.5 MB per second.
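The bandwidth figure checks out if each of the 16,800 LEDs takes 3 bytes (one byte each for red, green and blue – the usual WS2812 format, though the exact wire format here is an assumption):

```python
# Checking the bandwidth arithmetic: 16,800 RGB LEDs at 3 bytes each,
# sent 25 times per second.
frame_bytes = 16800 * 3                # ~50 KB per frame
rate = frame_bytes * 25                # bytes per second
print(frame_bytes, rate / 1e6)         # 50400 bytes, ~1.26 MB/s
```

That comfortably fits in the 100 Mbit (~12.5 MB/s) Ethernet link, which matches the low CPU load Yann reports.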

Going back to the display, how are you getting the videos to change to interact with the passengers?

Yann  Fred prepares video sequences and exports them in a special format, so I can process them and turn them into files that can be read back. From there I can modify the program to, for example, apply filters or speed up or slow down the playback. For now the system is streamlined and smooth, because not much processing is done on the Pi – it just reads a big block of SD card memory, cuts it into pieces, and sends those pieces in a specific order through the network to the six controller boards, according to a fairly simple algorithm that minimises network congestion. Each board has a big buffer so it can buffer a lot of data, and then all the screens are updated at exactly the same time on receiving a tiny broadcast packet, which ensures the screens are perfectly synchronised and you don’t see the seams.
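The two-phase update Yann describes – stage each board’s slice, then latch all six with one broadcast – could be sketched like this over UDP. The addresses, port numbers and packet layout are invented for illustration; only the overall pattern comes from the interview.

```python
# Minimal sketch of the staged-then-latched frame update. Each controller
# board buffers its slice of the frame; a single broadcast packet then
# makes all six display at the same instant, hiding the seams.
import socket

CONTROLLERS = [("192.168.1.%d" % (10 + i), 5000) for i in range(6)]
BROADCAST = ("192.168.1.255", 5001)

def split_frame(frame, n=6):
    """Cut one full frame of pixel data into n equal slices, one per board."""
    size = len(frame) // n
    return [frame[i * size:(i + 1) * size] for i in range(n)]

def send_frame(sock, frame):
    # Phase 1: stage each section's pixel data in its board's buffer.
    for addr, data in zip(CONTROLLERS, split_frame(frame)):
        sock.sendto(data, addr)
    # Phase 2: one tiny broadcast latches every board simultaneously.
    sock.sendto(b"LATCH", BROADCAST)
```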