We Are Never Neutral

A Q&A with SRCCON:POWER speaker Sara Wachter-Boettcher.


(Photo: Ben Grey)

SRCCON:POWER is coming up so soon, on December 13 and 14. Before then, we’re previewing the speakers selected to talk at the event. Here’s our Q&A with designer and author Sara Wachter-Boettcher.

Sara Wachter-Boettcher on the Illusion of Neutrality in Data, Tech, and Humans

Would you introduce yourself to our readers, please?

I’m Sara Wachter-Boettcher, and I’ve recently realized I don’t know what to call myself anymore! I thought I was going to be a journalist in college, but I took a turn into content strategy and UX in my twenties, and loved it. In 2011, I started my own consulting practice, Rare Union, where I work with a lot of different organizations on building more inclusive, useful content and experiences online.

A few years ago—as I watched Facebook continually shove tragic memories into users’ faces, or online form fields fail my trans and nonbinary friends, or Twitter become a safe haven for fascists—I became increasingly interested in the way my peers’ design and product choices were affecting users’ lives and emotions. I started writing and speaking about how those of us who create digital experiences can do a much better job of considering unintended consequences and potential harms in our work, and of preventing bias and exclusion from bleeding into the products we build. In 2016, I first explored these themes in Design for Real Life, a short book for design and tech practitioners I cowrote with Eric Meyer, and in 2017, I expanded on the topic in a more mainstream book called Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech.

My talk at SRCCON:POWER will build on the themes in that book, and on the responsibility of all of us who have the power and platforms to shape others’ perceptions of the world to turn a much more critical eye toward how technology is made, used, and talked about.

Do you have any advice for someone in a newsroom who wants to act thoughtfully in opposition to the prevailing ethics of a platform?

I don’t work in a newsroom and only ever played at being a journalist, so I don’t want to tell folks who spend their days there how to do their jobs! But I will say, I think one of the biggest things we need from newsrooms right now is direct, honest language. When someone in power lies, call it a lie—and not five grafs deep in the story. When a group holds fascist beliefs, call it a fascist group. And, in the same vein, when a tech company collects data it doesn’t need and hasn’t really acknowledged it’s even collecting, call it what it is: surveillance.

In the tech industry, there’s a long history of obfuscation: we obsess over concepts like “engagement” and “stickiness”; we throw around words like “delight.” Those terms tend to glide right over what’s actually happening—massive data collection (which, as we can see, is constantly misused or hacked), addiction-like behaviors, and worse. We need journalists who aren’t willing to accept a tech platform’s obfuscation—just like we need journalists who aren’t willing to accept it from politicians.

What are a few high-level takeaways from your talk that you wish everyone could hear, even if they can’t be in the room at SRCCON:POWER?

I plan to talk about the ways we’ve been conditioned to design for a narrow audience and deliver experiences that are positive for a few, but ultimately really harmful. One high-level takeaway would be to understand that we are never neutral—we go into everything with biases, and we have to be a lot more up front about how those affect the work we do. We also need to ask a lot tougher questions of the data we’re relying on, or that we’re training our billions of bots and algorithmic systems on. Because that data, like us, is also going to be biased. And finally, we need to take responsibility for the things we put out into the world, whether we’re journalists writing a gushing profile of a dapper young man who just happens to espouse Nazi ideology (but looks great doing it!), or a product manager at a tech platform who green-lights a new feature that makes it easier for that same man to abuse people.

Learn more about SRCCON:POWER, and view the full schedule.
