New Book Identifies 26 Lines of Code that Changed the World
“I can’t take any credit for the title,” Torie Bosch tells me about the name of her new book, “but I thought it was perfect… It’s got this cheekiness to it.”
Bosch is the editor of the new book “You Are Not Expected to Understand This”: How 26 Lines of Code Changed the World. In a preface, Bosch describes the book’s 29 different authors as “technologists, historians, journalists, academics, and sometimes the coders themselves,” explaining “how code works — or how, sometimes, it doesn’t work — owing in no small way to the people behind it.”
With tantalizing chapter titles like “Wear this code, go to jail” and “the code that launched a million cat videos,” each chapter offers a lively appreciation for programmers, gathering up stories about not just their famous lives but their sometimes infamous works. (In Chapter 10 — “The Accidental Felon” — journalist Katie Hafner reveals whatever happened to that Harvard undergraduate who went on to inadvertently create one of the first malware programs in 1988…)
The book quickly jumps from milestones like the Jacquard Loom and the invention of COBOL to bitcoin and our thought-provoking present, acknowledging both the code that guided the Apollo 11 moon landing and the code behind the 1962 videogame Spacewar. The director of the Smithsonian Institution's Center for the Study of Invention and Innovation writes in Chapter 4 that the game "symbolized a shift from computing being in the hands of priest-like technicians operating massive computers to enthusiasts programming and hacking, sometimes for the sheer joy of it."
I myself contributed chapter 9, about a 1975 comment in some Unix code that became “an accidental icon” commemorating a “momentary glow of humanity in a world of unforgiving logic.” This chapter provided the book with its title. (And I’m also responsible for the book’s index entry for “Linux, expletives in source code of”.)
So Bosch applauds the title for “keeping with the ethos of this book, of being really about the humans in coding.”
But more importantly, the book gives all this history some relevant modern context about unintended impacts — especially in a chapter by Charlton McIlwain, a media/culture/communications professor at New York University.
While noting that the computer revolution coincided with the Black civil rights movement, McIlwain points out that this ultimately led to a flawed crime-data algorithm that "laid the cornerstone" for what would become today's surveillance infrastructure.
And there’s also a larger problem, McIlwain adds in his chapter on the Police Beat Algorithm (which was also published online). “Belief in the objective and infallible nature, and in the predictive power of data, continues to run rampant among technology purveyors, law enforcement personnel, public officials and policy influencers.”
The first excerpt from You Are Not Expected to Understand This is live! @cmcilwain chronicles the time LBJ and IBM teamed up to solve America’s crime problem. The “solution” still haunts us. https://t.co/qv3Ku2HHO4
— Torie Bosch (@thekibosch) November 11, 2022
It’s all packed into a 216-page paperback (also available as an audiobook), which Bosch described in our interview as really smart people sharing an “astounding” number of illuminating anecdotes. Programmer/author Ellen Ullman writes in the book’s introduction that programs “are works of the imagination that must then make the hazardous crossing into the structured world of code,” then compares hazards like the rain of security vulnerabilities to the omnipresent yet invisible solar wind.
Bosch also edits Future Tense, a self-described “citizen’s guide to the future” exploring technology’s intersection with society and public policy (published on Slate in partnership with Arizona State University and the public policy think tank New America). And she brings this interest in larger societal issues to the book’s rollicking romp through the history of the very human art of programming.
In the book’s final chapter, data journalism professor Meredith Broussard warns that code doesn’t just change the world, but also “makes culture incarnate” — and can make social change harder. (“The next frontier in gender rights is inside databases.”)
But on a more hopeful note, Bosch tells me in an interview that “Just thinking through how things were, how they’ve changed, why they’ve changed, can really help us in thinking through what we want to make the future look like — and how to get there.”
Here’s more from that interview.
The book’s last chapter reminds us that computer systems are sociotechnical constructs, and “need to be extensively updated on a regular basis. Just like humans.” Is the book in some way a call to action? Once you do recognize that computer code, the work of humans, is not immune to errors, doesn’t that almost dare you to take some next logical step?
Oh, I hope so. I really hope that someone might read this, and it would prompt them to spend some more time thinking about ethical decision-making in computers, thinking about the potential implications of the work they do, and how to do their work in a way that benefits people… If you’re in a situation in which you’re being asked by your bosses to code something to help a car evade emissions inspections, like Lee Vinsel writes about — the chapter on the Volkswagen emissions scandal… I hope that thinking about these sorts of things will help people think about when they might want to say no. Or how they might want to think about what kinds of companies they’re working with.
One lofty idea I had is I would love it if people used this book in computer science classes. I think it’s really important in computer science education to give students an opportunity to think through these sorts of issues.
Sara Wachter-Boettcher, author of Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech, calls your book “provocative” — not just essential reading on coding’s history and culture, but also “bursting with humanity.” Was that a deliberate decision, to focus the book on the very human stories that surround code?
The idea for this book actually originated in a project that I worked on for Slate back in 2019…. [The Lines of Code That Changed Everything] And in talking through this idea of a book with Hallie Stebbins from Princeton University Press, one of the things we agreed on really early was that people had to be a really big part of this….
Sometimes, it feels like code or at least technology is inevitable. That as a consumer or someone who maybe doesn’t work within technology, it feels like these products arrive to you fully formed — you know, this is the way the Roomba was meant to be. But the fact is that all code is the result of human decisions, and those human decisions are made by people who may not be fully caffeinated one morning. Or may have, you know, had a really great day, or may have certain biases that they’re not even aware of. And so if we’re talking about code, we’re talking about the humans who made the decisions that created the code.
And those decisions, of those humans, end up affecting all of our lives in often very unexpected ways.
That sounds like a warning and a hope, all bundled together.
That’s exactly right. One of the big hopes for this book is that it’s something that will have sort of an upstairs/downstairs appeal. I think that there really is something here for people who work with code day in and day out, offering them an opportunity to step back and think a little bit holistically about their work… Sometimes if you’re in the trenches, it’s hard to do that.
But I also really want this to be understandable for someone who doesn’t think about code much, who can use this book as an opportunity to think more deeply about the way technology affects their lives in ways they maybe hadn’t thought about before.
So how much power do programmers have today? They say software is eating the world — but doesn’t that mean that programmers are the ones carving up the slices?
I’ve been really interested over the past several years to watch the power of the tech activists and tech labor movements. I think they’ve shown really immense power to effect change, and power to say, “I’m not going to work on something that doesn’t align with what I want for the future.” That’s really something to admire.
But of course, people are up against really big forces. These are the most powerful companies in the world, in many cases — so I don’t want to oversell or overstate the burden on individuals, in the face of giant, society-wide obstacles. But I do have so much admiration for the people who are working to create better systems.
My favorite illustration in the book shows Neil Armstrong walking on the moon — and then to the left of the image is the code that got him up there. Is there perhaps also a hopeful message in there — maybe a once and future hope, that if we innovated once, we can innovate again — now forearmed with some knowledge about what could go wrong?
Absolutely. I think the message is that code is human, and humans are always changing — and arguably coming up with brilliant ideas to make things better. So we can keep doing that…
We just have to really think carefully through what we want to do differently and how to get there. And to remember that all of this comes down to individuals.