When cycling data becomes the input, not the output, with Raspberry Pi
Twelve months on from Le Grand Départ of the Tour de France in Yorkshire.
A year since the peloton passed through North Leeds on its way to the ceremonial start at Harewood House, captured by a group of young children using their Raspberry Pi mini computers.
Planned, coded and created using window sills and trees along the route, their project produced a time-lapse video to share as an output.
A data output of moving images: a digital storytelling piece that would have fitted creatively into the Computing curriculum. Well, if it had formally been in place then.
So what can happen in a year if the TdF legacy conversation is explored but we consider Computing this time?
As the 2015 start approaches, many of us, particularly in Yorkshire, think back to those iconic images and lasting moments shared a year ago. That made me think about using exactly those moments to explore project opportunities, activities, resources and, more than anything, perception and confidence with Computing, from both a work and a social/community perspective.
We've had a Raspberry Jam in Leeds this year, another one scheduled for July and plans to introduce a regular diary date. All very timely!
The Côte de Buttertubs section of the race is the still image in my mind which encompasses the spirit of last year's TdF in Yorkshire. So could we use Strava with cyclists on that particular section of the route and, this year, use their data as an input with Raspberry Pi? It's an attempt to shift from the peloton as the output of a time-lapse to using data from the route as the input. No pro cyclists in 2015 as we move towards more open data and Strava with local riders : )
So how could we build contextual models to tell the story of one or two cyclists on that ascent?
Which tools could we use and whose experiences could we tap into?
How could the data be visualised and would one method create more understanding than another?
Would that depend on the creator and/or the viewer?
Is it possible at all to use the data from Buttertubs to share a rider's experience?
Photo courtesy of https://twitter.com/liverpoolmerc/
Here’s a digital storytelling piece about one cyclist's ascent of Côte de Buttertubs using their speed and cadence data. Through a sonification activity, Sonic Pi is used to blend three elements, giving a different tone, output and ultimately a very different story from what the data would ordinarily imply during this ascent.
Cadence data is used for the melody
The small range of speed data (km/h) can be heard as notes selected at random from that range
Listen out for the Queen 'Bicycle Race' undertones
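To illustrate the two data mappings above, here is a minimal Python sketch of the idea: cadence readings become MIDI notes directly, while a note is drawn at random from the narrow speed range. The cadence and speed values are invented sample data, not the actual Buttertubs file, and the real piece was coded in Sonic Pi rather than Python.

```python
import random

# Hypothetical sample data from the ride (not the actual Buttertubs data):
# cadence in rpm, speed in km/h.
cadence = [62, 65, 70, 68, 60, 58, 63]
speed_kmh = [13.4, 14.1, 15.8, 16.2, 14.9, 13.7, 18.6]

# Cadence values in this range already sit inside the MIDI note range
# (0-127), so each reading can be played directly as a note.
melody = [int(rpm) for rpm in cadence]

# Speed spans only ~13-19 km/h, so rather than mapping each reading,
# a note is picked at random from that narrow range, echoing Sonic Pi's
# rrand_i / choose idiom.
speed_range = range(int(min(speed_kmh)), int(max(speed_kmh)) + 1)
accompaniment = [random.choice(speed_range) for _ in melody]

print(melody)          # MIDI notes taken straight from cadence
print(accompaniment)   # random notes drawn from the speed range
```

In Sonic Pi, the same idea would be expressed with `play` and `rrand_i` inside a loop; the sketch just shows how little transformation the cadence data needs.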
For computer scientists, the Sonic Pi script below will look like a very long-winded approach to coding. We talked a lot about possibilities, what we wanted to incorporate and how we wanted to use the techniques to display differences in the data and to create an atmospheric piece. We coded as we talked, and tried to understand the storytelling element of the data as we went along too.
As a bit of context, here are some thoughts and considerations discussed in planning:
Freddie Mercury was inspired to write 'Bicycle Race' after watching the 1978 Tour pass his hotel in France, where he was writing at the time. As he was inspired by the Tour, and we recognise the local impact of the TdF legacy, we wanted that piece of music incorporated. It also gave us the impetus to collaborate and blend numerous elements and data sets into one piece while developing our Sonic Pi experience.
Conversations with music tech students at a recent Jam cemented planning thoughts after we considered the range of data.
Cadence was the chosen data to use as the melody due to its neat fit within the MIDI note range.
The small range of speed data, in km/h and between 13 and 19, prompted initial thoughts of converting to integers and then into base 16 or base 8 to form chords. We then changed approach and learned about selecting random numbers from a range in Sonic Pi.
Note length, amplitude and sleep were the variables used to try to capture the storytelling element.
Above all else, could the output ever tell the story and accentuate it from the rider's perspective? We thought about it and went in a completely different direction. Abstract, really. We wanted to manipulate the data, but the output would be melodic. Almost calm : )
Other resources and project ideas to build contextual models from the data?

[Video: High Speed Racing on Figure of Eight track, from Cannybots on Vimeo]

Next month we'll look to use the Cannybots racers at the Leeds Jam, as we intend to race the data from two riders 'on the flat'. A sharp contrast to the ascent, but one which fits with modelling using the sensors on the racers. The speed data can be imported into this Python script (finish-line report to follow). We'll also take a peek at how the data could look as blocks in Minecraft.
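The actual Cannybots racing script isn't reproduced in the post, but as a rough sketch of what "racing the data" from two riders on the flat might look like, here is a minimal Python example. The rider names, speed samples and one-second sampling interval are all invented for illustration.

```python
# Hypothetical sketch: which rider's speed data covers more ground?
SAMPLE_INTERVAL_S = 1.0  # assume one speed reading per second


def distance_covered(speeds_kmh):
    """Total distance in metres from per-second speed samples (km/h)."""
    return sum(v / 3.6 * SAMPLE_INTERVAL_S for v in speeds_kmh)


def race(rider_a, speeds_a, rider_b, speeds_b):
    """Return the name of the rider whose data covers more ground."""
    d_a = distance_covered(speeds_a)
    d_b = distance_covered(speeds_b)
    return rider_a if d_a >= d_b else rider_b


# Invented sample data for two riders:
winner = race("Rider 1", [25, 27, 26, 28], "Rider 2", [24, 26, 27, 27])
print(winner)
```

In the Cannybots version, each rider's distance would presumably drive a racer along the track in real time rather than being totalled up front.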
How could and would heart rate data be visualised in a virtual world?
What if we added Buttertubs to the Minecraft World and built structures based on the data?
Maybe. But then again our local Minecraft expert, who incidentally stood and watched the TdF at that very point, is working on the Leeds Minecraft map, so perhaps we'll explore that further in the autumn term.
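As a rough illustration of that idea, one way to visualise heart-rate data in a Minecraft world is to scale each reading into the height of a block column. This sketch uses invented heart-rate values; the actual block placement on a Raspberry Pi would go through the mcpi library's `setBlock`, left commented here.

```python
# Hypothetical sketch: scale heart-rate readings (bpm) into block-column
# heights for a Minecraft world. Data values are invented.


def column_heights(heart_rates, max_height=20):
    """Scale heart-rate readings linearly to column heights of 1..max_height."""
    lo, hi = min(heart_rates), max(heart_rates)
    span = hi - lo or 1  # avoid division by zero for flat data
    return [1 + round((bpm - lo) * (max_height - 1) / span)
            for bpm in heart_rates]


# One column per reading along the x axis, placed via mcpi, e.g.:
# from mcpi.minecraft import Minecraft
# mc = Minecraft.create()
# for x, h in enumerate(column_heights(hr_data)):
#     for y in range(h):
#         mc.setBlock(x, y, 0, 1)  # block id 1 = stone

print(column_heights([120, 135, 150, 165, 180]))
```

The rider's hardest efforts on the climb would then stand out as the tallest columns in the world.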
Link to data & Sonic script on GitHub: