
How the Guardian Made RioRun

Our interactive podcast brought an Olympic marathon to everyday people



News organizations devote considerable resources to creating journalism about the Olympics, but we wondered if it was possible to create journalism that would allow our readers to take part as well.

Fortunately, in 2016 many of our readers carry around pocket-sized supercomputers loaded with sensors that we can make use of. So in an early brainstorming meeting, we asked ourselves what we could do with the data from those sensors that would provide readers with a meaningful Olympic experience—and that would allow us to spend our afternoons outside and call it “testing.”

Several weeks later, we unveiled RioRun, an “interactive podcast” that takes you on a guided tour of the Rio de Janeiro Olympic marathon course—all 26.2 miles of it—as you run.

When you open RioRun and go for a jog (or walk) wherever you are in the world, your phone’s GPS tracks your progress and triggers audio clips about your virtual surroundings in Rio based on the distance you’ve covered in real life. Our narrator, Valerie Lapinski, is on hand to recount the city’s history, politics and culture as you go, and the Guardian’s Latin America correspondent, Jonathan Watts, tells you about recent news events leading up to the Olympics. Marathon coach Bob Larsen also chimes in with advice on distance running at key points throughout the race. All the while, ambient audio recorded in the Rio neighborhoods you’re “running through” plays in the background, immersing you in Brazil’s street life.

Here’s how we developed the idea, how we built the interactive, and what we learned along the way.

The app in use on a smartphone

The finished app.

From Sprint to Marathon

Our initial idea was to use the accelerometer that’s inside every modern smartphone to measure how fast our users run and let them compare themselves to Olympians.

We were inspired by past interactives like Slate’s comparison of Usain Bolt against bygone runners and the New York Times’ One Race, Every Medalist Ever, both from 2012. But we wondered: Could we get readers to physically test their own running time against the athletes? That would give an even more awe-inspiring perspective on the achievement of top Olympians: Usain Bolt is only a fraction of a second faster than the next fastest man, but he’s a lot faster than you and me.

We hoped to be able to determine how long it takes a reader to run 100m, for example, and at the end tell them how much slower they are than the fastest men and women on Earth, but also how they would fare against Olympians of old (Thomas Burke, the winner of the 100m race at the 1896 Athens Olympics, finished in 12.0 seconds—slightly more achievable than the current 9.58-second world record).

But early prototypes dashed our hopes: while you can use acceleration and orientation sensors to determine movement (it’s called inertial navigation, and it’s used in fields like aviation), the consumer-grade sensors used in smartphones are nowhere near good enough.

Still, focusing on running felt like the right move, since lots of people run and very little equipment is needed. A lot of us on the team are runners—occasionally we’ll even run home together, something our coworkers describe as “adorable”—which added to our enthusiasm.

So, if we couldn’t use inertial navigation, we’d just have to use geolocation. And since geolocation accuracy is limited, that meant we had to track people over longer distances than 100m—much longer.

A marathon was the obvious choice.

After we shifted the idea to a longer course, it was clear that the focus of the interactive would need to change too. After all, learning how much slower you are over 100m than Usain Bolt is mildly interesting; perhaps you’ll try again and do better next time. It’s a wholly different proposition to learn how many hours you finished behind marathon world record holder Dennis Kimetto.

Instead of having runners focus on comparing themselves against Olympic athletes, we decided to put Rio de Janeiro itself at the heart of the interactive. We used the marathon route as a jumping-off point to tell the story of Rio and these Olympic games. The interactive podcast was born.

Photo from editing process

Aliza Aufrichtig and Jan Diehm editing the script (about two hours of audio, which had to be carefully paced).

Sound and Fury

We were excited to do something ambitious with audio, but audio is notoriously tricky on the mobile web. There are basically two ways to go about it, neither of them good:

  • Using an <audio> element is the easy way. Unfortunately, you can only have one clip playing at a time, and playback can only be triggered by a user gesture. For RioRun, which can have up to three layered clips playing simultaneously, triggered by your distance, this was a non-starter.

  • Using the Web Audio API’s AudioBufferSourceNode gives you lots more control, but with the rather unappealing trade-off that it might crash your phone. That happens because the decoded audio—typically 11 times larger than the source MP3 file—has to fit into the phone’s limited memory, as the sketch below illustrates.
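For illustration, here is roughly what the AudioBufferSourceNode route involves, and why the whole decoded clip ends up in memory. A minimal sketch, not RioRun’s production code:

const context = new AudioContext();

// The entire MP3 is fetched and decoded up front; the decoded AudioBuffer
// holds raw PCM samples, which is where the memory blow-up comes from.
fetch( 'commentary.mp3' )
  .then( response => response.arrayBuffer() )
  .then( encoded => context.decodeAudioData( encoded ) )
  .then( audioBuffer => {
    const source = context.createBufferSource();
    source.buffer = audioBuffer;
    source.connect( context.destination );
    source.start(); // can be triggered by distance, not just a user gesture
  });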

Never the sort of people to be discouraged by such things, we set about solving the problem once and for all with Phonograph, an open source library that splits up large audio files into smaller chunks that can safely be held in memory and decoded as and when they’re needed.

Screen shot

How Phonograph works: overlapping chunks of audio are swapped in and out as needed.

This allows us to stream in background audio as you move into a new neighborhood, crossfading from the existing background audio, and seamlessly looping the clip back on itself with another crossfade when it ends, all the while playing commentary over the top. That would be impossible with <audio>, and dangerous with AudioBufferSourceNode.
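In practice, playing a clip with Phonograph looks something like this. A rough sketch following the library’s README; the file name is ours:

import { Clip } from 'phonograph';

// Only small, overlapping chunks of this file are decoded and
// held in memory at any one time.
const ambience = new Clip({ url: 'ipanema-ambience.mp3' });

ambience.buffer().then( () => {
  ambience.play();
});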

You can read about the gory details of Phonograph on Medium.

Photo of recording process

Nadja Popovich and Valerie Lapinski (the host of RioRun, and head of video and audio at Guardian US) recording it.

Measuring Progress

To track a runner’s distance, you need more than raw location data.

Your phone determines your location from a variety of sources—GPS, cell tower triangulation, Wi-Fi networks—with varying degrees of accuracy, and your location will often bounce around. If you plot a typical series of geolocation points on a map, particularly from an old phone, it will look like an angry person drew it.
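The raw fixes come from the browser’s geolocation API. Here’s a minimal sketch of capturing them (the recordPoint handler is hypothetical):

const watchId = navigator.geolocation.watchPosition(
  position => {
    recordPoint({ // hypothetical handler that stores the fix
      lat: position.coords.latitude,
      lon: position.coords.longitude,
      accuracy: position.coords.accuracy, // radius in metres
      timestamp: position.timestamp
    });
  },
  error => console.error( error.message ),
  { enableHighAccuracy: true }
);

// later, when the session ends:
// navigator.geolocation.clearWatch( watchId );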

We built an app, Marshal, to capture geolocation data that we could use to refine an algorithm for cleaning it up. (The name comes from (a) course marshals, who keep runners on track during a race, (b) the computer science concept of marshaling data, which is how it gets from phone to server, and (c) the US Marshals, who among other things track down fugitives—we weren’t capturing runners, but we were capturing runners’ data.)

Two screen shots of Marshal

A Marshal session, before and after correction. (Each session has a name generated by namey-mcnameface, to make it easy to share notes.)

The algorithm we settled on was crude but reasonably effective: we eliminate points that have an accuracy radius much larger than the median, remove those that would require someone to run at unrealistic speeds, cap the speed at 12mph (just slower than Dennis Kimetto), and divide the result by 1.2. As kludgy as it sounds, this produces remarkably accurate results in most cases, and because we show progress in 0.1 mile increments, runners are unlikely to notice when we get it wrong.
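A hypothetical sketch of that clean-up: the 12mph cap and the 1.2 divisor come from the description above, while the 2× median cut-off, the 25mph “unrealistic” threshold and the function names are our own illustrative choices.

const EARTH_RADIUS_MILES = 3959;
const MAX_MPH = 12;         // just slower than Dennis Kimetto
const UNREALISTIC_MPH = 25; // illustrative threshold for GPS jumps

function haversineMiles( a, b ) {
  const toRad = deg => deg * Math.PI / 180;
  const dLat = toRad( b.lat - a.lat );
  const dLon = toRad( b.lon - a.lon );
  const h = Math.sin( dLat / 2 ) ** 2 +
    Math.cos( toRad( a.lat ) ) * Math.cos( toRad( b.lat ) ) * Math.sin( dLon / 2 ) ** 2;
  return 2 * EARTH_RADIUS_MILES * Math.asin( Math.sqrt( h ) );
}

function cleanDistanceMiles( points ) {
  // 1. discard points whose accuracy radius is much larger than the median
  const accuracies = points.map( p => p.accuracy ).sort( ( x, y ) => x - y );
  const median = accuracies[ Math.floor( accuracies.length / 2 ) ];
  const plausible = points.filter( p => p.accuracy <= median * 2 );

  if ( plausible.length < 2 ) return 0;

  // 2. walk the remaining points, dropping GPS jumps and capping speed
  let distance = 0;
  let last = plausible[ 0 ];

  for ( const point of plausible.slice( 1 ) ) {
    const miles = haversineMiles( last, point );
    const hours = ( point.timestamp - last.timestamp ) / 3.6e6;
    if ( hours <= 0 || miles / hours > UNREALISTIC_MPH ) continue;

    distance += Math.min( miles, hours * MAX_MPH ); // cap at 12mph
    last = point;
  }

  // 3. apply the fudge factor
  return distance / 1.2;
}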

Handmade Maps

We’re big fans of MapboxGL, but for this project we wanted a very specific look and feel, and quickly concluded that our best option was to render our own map of Rio.

Mapzen has a wonderful vector tile service that lets you freely download GeoJSON tiles of OpenStreetMap data. GeoJSON is very easy to manipulate—combining tiles into a single object, removing unnecessary properties and so on—but it’s not very efficient. Happily, Mapbox maintains two open source libraries called geobuf and pbf, which together shrink GeoJSON down to a tiny binary file on the server and decompress it on the client without any loss of data. (Remarkably, the decompression is actually faster than using JSON.parse!) It goes something like this:


// in node.js
const fs = require( 'fs' );
const Pbf = require( 'pbf' );
const geobuf = require( 'geobuf' );

const data = getGeojsonSomehow();
const buffer = geobuf.encode( data, new Pbf() );
fs.writeFileSync( 'path/to/geodata.geobuf', buffer );

// on the client
const xhr = new XMLHttpRequest();
xhr.responseType = 'arraybuffer';

xhr.onload = () => {
  const pbf = new Pbf( xhr.response );
  const data = geobuf.decode( pbf );

  doSomethingWith( data );
};

xhr.open( 'GET', 'path/to/geodata.geobuf' );
xhr.send();

Once we have the OpenStreetMap data in the app, we can render it using a <canvas> element (we initially used SVG because it’s easier to work with, but it soon became obvious that SVG was much too slow for the task). We use one canvas for the map itself and another for the overlay showing your progress, meaning we only have to re-render the map when it moves or when the user zooms in or out. It’s not quite fast enough to render at 60fps on old phones, so while the map is moving we render at half resolution. Because it’s moving, this trick is all but imperceptible.
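The half-resolution trick can be as simple as shrinking the canvas’s backing store while leaving its CSS size alone. A hypothetical sketch:

function setupMapContext( canvas, moving ) {
  const dpr = window.devicePixelRatio || 1;
  const scale = moving ? dpr / 2 : dpr; // half resolution while moving

  // the backing store shrinks; CSS stretches it back to full size
  canvas.width = canvas.clientWidth * scale;
  canvas.height = canvas.clientHeight * scale;

  const ctx = canvas.getContext( '2d' );
  ctx.setTransform( scale, 0, 0, scale, 0, 0 ); // keep drawing in CSS pixels
  return ctx;
}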

All told, the map rendering code (including geobuf and pbf) weighs in at about 8kb, and the geodata is a one-time download of 257kb. By comparison, doing the same thing with MapboxGL would involve over 900kb of code and data, meaning a slower initial load. There are definitely lots of situations where our approach wouldn’t be appropriate, but we’ll probably end up adapting it and using it in other visualizations in future.

Adding Sights to Sounds

Screen shot

After a few test outings it was clear that the look of this project was going to be just as important as how it functioned. We went from rough utilitarian mocks, to something that emphasized the controls and progress, and finally to something that we hope captures a bit of Rio’s joyful ambience.

The code and the design worked in tandem to guide our choices. As we got closer to understanding how people would interact with RioRun, we realized that we wanted to create a game-like atmosphere. We built in badges that unlock as you pass cultural landmarks and distance milestones along the route. Since this was an experience designed around Rio’s sounds, the design was our only chance to give people a glimpse into the city’s sights.

We also made sure to include a lot of social sharing opportunities for users—each badge has an associated share image for Facebook and Twitter. If we were asking people to complete a marathon (or at least part of it!), we wanted to reward them and give them an opportunity to brag.

Facebook card

One of the Facebook cards runners can share as they complete the course.

Designing alongside the development did mean that we sometimes went through several iterations before hitting on the final design. But it also allowed us to be quicker and more adaptive when we encountered user experience hurdles, like where to put all of the distance, time and speed stats (they ultimately ended up in a bottom drawer) or how much information to give people on the screen at the end of a session (we broke it down into two pages).

The designs went through some heavy evolution (some stages are not even pictured above), and so did the name. Initially inspired by Pokémon GO, we called the project RioGO, but in the end we chose the more straightforward name RioRun.

Crossing the Finish Line

Photo of runner on track.

Testing at the Red Hook Recreation Area in Brooklyn.

So far, at least one RioRunner, from Helsinki, has completed the entire course. And the development of RioRun was enough of a long, hard slog that we feel like we’ve run a marathon. Was it worth it?

Without giving our figures away, it’s fair to say that our unique-user numbers are lower than our interactives would normally get. But when you consider how much commitment is involved—how many pieces of journalism expect you to change clothes?—the analytics look much rosier. People from over 600 cities, including Rio itself, and from every continent except Antarctica have taken part in RioRun. Lots of them have come back to continue the marathon, in both outdoor and treadmill modes.

The trend in recent years has been towards making interactive journalism as low-friction and digestible as possible, but our data clearly shows that there’s an appetite for journalism that demands more from readers, as long as the reward is there.

The project was also a tremendous learning experience for our team. It was the first time we’d created something that didn’t work on desktop, the first time we’d focused so heavily on audio, the first time we’d experimented with Progressive Web App techniques, and the first time we’d built something that relies so heavily on phone sensor data—all things that will pay dividends in future storytelling experiments.

Photo of runner on track.

The end of the road.
