Reading: Technically Wrong

I didn’t take as much time to read this vacation, but I did finish Technically Wrong by Sara Wachter-Boettcher. Everyone should read this book. It’s a very clear explanation of the ways bias creeps into our software and of the ways software silently tracks us, which affects all of us but often hits vulnerable people even harder. One of the key points Wachter-Boettcher makes is that people outside of tech think it’s magic, way too complicated for the average person to understand, so we should leave the tech people alone because they’re geniuses. This idea is perpetuated by people in tech themselves, but also by a lot of the rest of us, who do a lot of hand waving when we’re working with technology: “Oh, this stuff is too complicated. I’m not smart enough to understand that.” But, Wachter-Boettcher explains, it’s not really that hard to understand. Technology was created by humans, so it’s understandable. And it’s biased.

The good news is there’s actually no magic to tech. As opaque as it might seem from the outside, it’s just a skill set—one that all kinds of people can, and do, learn. There’s no reason to allow tech companies to obfuscate their work, to call it special and exempt it from our pesky ethics. Except that we’ve never demanded they do better.

For many years, the call for diversity in every field has hinged on the idea that a diverse workforce creates better ideas and products that appeal to a broader market. In other words, there has always been an appeal to the bottom line. That should be true in tech, too, but it’s even more important there, because just a few tech companies control so much of our lives and are collecting data about those lives. What those companies allow us to say about ourselves influences who they think is using their products, clicking on ads, and so on. Can you select only one of two genders when you sign up for a service? Only one race? Can you opt out of giving that information at all? If the people designing software and interfaces assume that there are only two genders to choose from and that people identify as a single race, then that’s the only information they get and the information they use to build more products. Do autofill options assume that because you’re a woman, you must be a nurse or a teacher? Does a Google image search for “engineer” return only men?
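To make that form-design point concrete, here is a minimal sketch (mine, not the book’s; the class and field names are hypothetical) of how a sign-up form’s data model quietly encodes its designers’ assumptions:

```python
# A hypothetical sign-up schema. Whatever the form allows is all the
# data the company will ever collect about who uses its products.
from dataclasses import dataclass, field
from typing import List, Optional

GENDERS = ("male", "female")  # restrictive: anyone else simply can't answer


@dataclass
class RestrictiveProfile:
    gender: str  # must be one of GENDERS, and answering is mandatory
    race: str    # exactly one value, with no "prefer not to say" option


@dataclass
class InclusiveProfile:
    gender: Optional[str] = None  # free text; None means "not answered"
    races: List[str] = field(default_factory=list)  # zero or more selections
```

Every product built on the first schema inherits its blind spots, which is exactly the feedback loop the book describes.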

These are all examples of decisions programmers (who are mostly white and mostly men) make based on their own biases. And increasingly, decisions are made by programs themselves, which are fed biased data: biased because it was the only data the programmers had, because the way it was collected didn’t capture the full range of information, or because the program was built to focus on averages rather than ranges. There are all kinds of ways programs inherit human bias. They are not, as technology gurus constantly tell us, neutral.
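Here is a toy illustration (my own, not from the book) of that last point: a “predictor” that simply returns the most common label in its training data will faithfully reproduce whatever skew the data contains.

```python
# A toy "autofill" predictor that inherits bias from skewed training data:
# it reports the average (most frequent) case, never the full range.
from collections import Counter

# Hypothetical historical records, skewed by past hiring patterns.
training_data = [
    ("woman", "nurse"), ("woman", "nurse"), ("woman", "teacher"),
    ("man", "engineer"), ("man", "engineer"), ("man", "doctor"),
]


def predict_profession(gender: str) -> str:
    """Return the most frequent profession seen for this gender."""
    professions = [p for g, p in training_data if g == gender]
    return Counter(professions).most_common(1)[0][0]


print(predict_profession("woman"))  # -> "nurse": the average, not the range
```

Nothing in that code is malicious; the bias arrives entirely through the data, which is the point.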

This is why I think teaching Computer Science is important, and teaching it not just as programming but as a broad set of skills (including thinking about ethics) that can lead to understanding the world we live in, a world increasingly mediated by technology. Just think about the tech products you got for the holidays (or a recent birthday): a new phone, Alexa or Google Home, a smart TV, a streaming device, a new car with built-in everything. Now step back and think about how you got those gifts: through Amazon or another online retailer, or at a store where you paid with Apple Pay. And think about how you spent your vacation: surfing Facebook, posting pictures there of your family enjoying the time off, streaming movies, traveling and going through E-ZPass, tweeting about your travels, posting an Instagram photo of your pet. Now think about how much those things reveal about you. To plead ignorance about these things is dangerous, Wachter-Boettcher explains, because then you give the tech companies a pass on being ethical, on eliminating bias from their products, on being better citizens of the world.

I’ll leave you with a quote that sums up for me why everyone should, one, read this book and, two, take Computer Science along with the Humanities and Sociology:

As a result, the system prizes technical abilities—and systematically devalues the people who bring the very skills to the table that could strengthen products, both ethically and commercially: people with the humanities and social science training needed to consider historical and cultural context, identify unconscious bias, and be more empathetic to the needs of users.

We need that kind of perspective now more than ever. Tech has become both a driving force in the economy and a mediator of almost every experience you can imagine. And, more and more, those experiences aren’t driven solely by human choices, but by decisions made by machines. This means the biases and blind spots that tech perpetuates aren’t just worming their way into individual hearts and minds, but literally becoming embedded in infrastructures that humans can’t easily see, much less critically assess or fix (p. 176).

 
