
Move Fast and Break People: ‘Technically Wrong’ Examines Toxic Tech and What to Do about It

Oct 20th, 2017 6:45am

“There’s nothing wrong with you,” writes Sara Wachter-Boettcher in her new book Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech. Rather, “There is something wrong with tech.”

“Technically Wrong” tackles several issues plaguing the average person’s use of tech, including why social media is so favorable to harassers and bots, and how AI algorithms need to be carefully vetted for bias before they are deeply embedded in software.

Geared toward the average person using social media, Wachter-Boettcher’s book explains why diversity matters and why algorithms, which are just code that someone wrote, are susceptible to their creators’ biases. She breaks down the culture of industry giants like Facebook, Reddit and Twitter to explain why they are set up to encourage online harassment and fake news. It’s a fascinating read, well-written, engaging and thought-provoking, even for a hardened geek such as myself.

Part of the reason she wrote the book, Wachter-Boettcher said, is that discussions about technology and its role need to become part of a central dialogue in our country. She wants tech companies to feel like they no longer have a free pass.

Wachter-Boettcher peppers her narrative with real-life examples of why having only young men in the design room leads to tech that does not work for everyone, even though it is sold that way. This business model leads end-users to think they are the ones doing it wrong.

For example, the young designers of Facebook’s “Year In Review” feature failed to consider that not everyone had a great year. The algorithm that put the slideshow together chose your most-commented post and put that front and center. For one user, wrote Wachter-Boettcher, that was a picture of his six-year-old daughter who died of cancer. And there was no way for him to opt out or turn it off.
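
To make the failure mode concrete, here is a minimal sketch in Python of an engagement-driven highlight picker like the one described above (hypothetical code, not Facebook’s actual implementation). Nothing in the selection logic asks whether a post is something the user wants resurfaced, and there is no opt-out to consult.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    comment_count: int

def year_in_review_highlight(posts: list[Post]) -> Post:
    # Naive engagement-driven selection: the most-commented post "wins",
    # whether the comments were congratulations or condolences.
    return max(posts, key=lambda p: p.comment_count)

posts = [
    Post("Started a new job!", comment_count=40),
    Post("Our daughter passed away today.", comment_count=215),
]

# Surfaces the most painful post of the year; no opt-out is checked,
# because none exists in the data model.
print(year_in_review_highlight(posts).text)
```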

This case, she argued, is one example of why representation matters, and one of many illustrating how the absence of a cross-section of society at the design table leads to tech that fails nearly every segment of society not represented there.

“The founders believe so deeply in their own vision — and have been rewarded for it for so long — that they don’t realize how narrow that vision is, and how many humans it could harm.” — Sara Wachter-Boettcher

But it’s not just representation. Tech’s business culture, summed up in Facebook’s famous motto “move fast and break things” and its emphasis on “the hacker way,” calls for shipping software as fast as possible with no need to think about ramifications, because you can always come back and fix it later.

That is why, she says, it was so easy for fake news to take hold on Facebook. “Combine the deeply held conviction that you can engineer your way out of anything with a culture focused on moving fast without worrying about the implications, and you don’t just break things. You break people.” (Emphasis hers.)

These companies share an abdication of responsibility, she argues. “A collective shrug of the shoulders at the harm they have caused, and an unwillingness to take responsibility for preventing it in the future.”

It’s not just that tech companies lack diverse staffs, she argues. “It’s that the founders believe so deeply in their own vision — and have been rewarded for it for so long — that they don’t realize how narrow that vision is, and how many humans it could harm.”

Artificial Intelligence: You Are What You Teach

But even more disturbing to Wachter-Boettcher is the rise of artificial intelligence (AI) and the way it absorbs implicit bias. In a chapter called “Algorithmic Inequity,” Wachter-Boettcher explains the basics of algorithms, then discusses how defining what is “normal” in an algorithm not only perpetuates bias but embeds it if left unchecked.

The results, she explains, seem scientific and data-driven. But when the source data holds bias, and it all does, the results are not neutral.

“This means biases and blind spots that tech perpetuates aren’t just working their way into individual hearts and minds,” she writes, “but literally becoming embedded in infrastructure that humans can’t easily see, much less critically assess or fix… The bias sticks around — long after we’ve realized it’s there.”

The problem isn’t with the technology, she said. “It’s with the assumptions that technologists so often make that the data they have is neutral and that anything at the edge can be written off. And once those assumptions are made, they wrap themselves up in a pretty polished software package, making it even harder for everyone else to understand what’s actually happening under the surface.”
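
As a toy illustration of that point (my own example, not one from the book), consider a “model” that learns approval rates from historical hiring decisions. If the historical labels encode a reviewer’s bias, the trained model reproduces that bias exactly, while presenting its output as a neutral, data-driven score.

```python
# Hypothetical illustration: each record is (school, hired) from past
# decisions in which reviewers favored one school regardless of merit.
history = [
    ("state_school", False), ("state_school", False), ("state_school", True),
    ("ivy", True), ("ivy", True), ("ivy", False),
]

def train(records):
    # "Training" here just estimates P(hired | school) from historical
    # labels; whatever bias produced those labels comes along for free.
    counts = {}
    for school, hired in records:
        hires, total = counts.get(school, (0, 0))
        counts[school] = (hires + hired, total + 1)
    return {school: hires / total for school, (hires, total) in counts.items()}

model = train(history)

# The "data-driven" score is the old bias, now wrapped in code:
# roughly {'state_school': 0.33, 'ivy': 0.67}
print(model)
```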

The ramifications, she said, are huge, creating disruption in ways we never anticipated and aren’t prepared for.

“The truth is that most of tech isn’t about to start questioning itself. Not if it doesn’t have to,” she writes.

But What’s a Dev to Do?

Up until now, Wachter-Boettcher said in an interview, tech has been operating on the premise of “can we do this?” “We need a lot more perspective,” she said, “beyond just ‘can we technically do something?’ And a lot more of the, like, ‘should we do something?’ And involving [that question] in our tech companies at high levels, the decision-making levels.”

That’s all pretty high-level stuff for company decision-makers. I asked her what a developer can do to move the needle.

Wachter-Boettcher wants us to take away a deeper understanding of just how powerful our work can be. But tech is also limited if you’re only coming at it from a technical perspective; people building tech need to consider the emotional, social and political ramifications of their work, she said.

“Because, the way that technology has evolved in the past decade or so particularly, the technical really touches everything else that we do,” she said, noting the current U.S. Congressional investigation into the political smear ads Facebook ran on behalf of Russian buyers during the 2016 campaign.

On the developer level, there are changes that need to happen, she said. You can’t fix the industry’s meritocracy problem on your own, but there are things you can do.

Start making different kinds of choices and asking new questions, she suggested. When you make design decisions, ask: “What assumptions am I making? What happens if I’m wrong about these assumptions? Does it still work?”

Also, think all the way through potential outcomes. “Have you thought about how people might misuse your product? What happens when it gets into the hands of the worst people?” she asked. “Because that is likely to happen.”

AI and machine learning raise another set of questions: “Do you know where your data sets are coming from? Machines learn from something; where does your training data come from? Is it representative of the broad expanse of humans who will use the product? Are you testing the results on the other end? What kind of auditing is happening on the other end?”
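
One concrete shape that “testing the results on the other end” can take is a simple audit comparing a model’s positive-outcome rate across groups. The sketch below is my own minimal example, not a method prescribed in the book; the 0.8 threshold echoes the common “four-fifths” rule of thumb.

```python
from collections import defaultdict

def audit_positive_rates(predictions, threshold=0.8):
    """Flag groups whose positive-outcome rate falls well below the best group.

    predictions: iterable of (group_label, approved) pairs.
    threshold: minimum acceptable ratio to the best group's rate.
    """
    counts = defaultdict(lambda: [0, 0])  # group -> [positives, total]
    for group, approved in predictions:
        counts[group][0] += approved
        counts[group][1] += 1
    rates = {g: pos / total for g, (pos, total) in counts.items()}
    best = max(rates.values())
    flagged = {g: r for g, r in rates.items() if r < threshold * best}
    return rates, flagged

preds = [("group_a", True), ("group_a", True), ("group_a", False),
         ("group_b", True), ("group_b", False), ("group_b", False)]
rates, flagged = audit_positive_rates(preds)
print(rates)    # group_a: ~0.67, group_b: ~0.33
print(flagged)  # group_b is flagged: below 80% of group_a's rate
```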

That is why she is encouraging people, average people, to push back, suggesting they complain more when products don’t meet their needs.

One of the ways change happens is when the company feels pressure, she explained. Make it visible; sharing it online and tagging the company is a place to start.

“Nothing happens without public pressure,” she said.
