Are Programmers Ethically (and Legally) Responsible for Their Code?

Is the tech industry ethical? Does it even have to be? Consumer demand for features matters more than security anyway. And millennials don’t even care about privacy.
These are all common myths about tech ethics. And the greatest myth is that developers don’t care whether the code they are writing, testing or releasing is ethical. Or is that the real myth? In 2018, the annual Stack Overflow developer survey of more than 100,000 members of the international developer community addressed ethics in coding for the first time. Here is a summary of the responses to its four ethics questions:
- Would the responding developers continue to write code for unethical purposes if they found out about those purposes? A majority, 58.4 percent, said No, while more than a third said it would depend on what it was.
- How would developers report unethical code? Almost half said it depended, while about a third said only within the company. About 13 percent said publicly.
- Do developers have an obligation to consider the ethical implications of their code? Almost 80 percent said Yes.
- Who is ultimately responsible for code that accomplishes something unethical? About 58 percent said upper-level management, 23 percent said the person who came up with the idea, while only 20 percent felt the coder was responsible.
This seems right in line with the case of James Liang, a Volkswagen developer sentenced to three years in prison for his part in the automaker’s decade-long scheme of selling diesel cars that were well past U.S. environmental standards but were programmed to look like they weren’t. When arguing for house arrest, Liang’s lawyer, Daniel Nixon, said that his client was not a “mastermind” of the emissions fraud, but rather that Liang “blindly executed a misguided loyalty to his employer.”
This anecdote echoes the survey trend: while developers acknowledge they should be thinking about ethics in the code they’re writing and releasing, in the end they don’t feel the weight of responsibility. Most assume that falls on leadership.
Anne Currie, founder of the Coed Ethics conference, says it’s not that developers don’t care; it’s that they don’t think they have any power to make a difference. She argues that they do. While much of the conference covered the psychological and philosophical basis of decision making, along with examples of profits and world domination winning the ethical debate, it also offered actionable takeaways. Here we share what developers can do to influence ethical decision making within their companies, along with a couple of examples of developers who did just that.
If Google Can Do It, Can Anyone?
The conference kicked off by scaring the crowd a little with one of the more extreme cases in which a lack of tech ethics or accuracy actually leads to the loss of innocent human life. Human rights lawyer Cori Crider shared the story of signature strikes: targeted drone killings based on machine learning algorithms, most notably carried out by the U.S. military against people whose geolocational life patterns, social networks and travel behavior match those of a terrorist.
Former NSA and CIA director Michael Hayden famously said: “We kill people on metadata.”
This data is inherently flawed: in one case, a journalist embedded with Al Qaeda turned up as the top “known terrorist” result simply because he was a good reporter on the scene.
“The data isn’t perfect, it’s never perfect, but that’s what modern warfare looks like these days,” Crider said. “Algorithms and warfare are making probabilistic judgments about what is a threat or not.”
She said these imperfections have cost hundreds if not thousands of civilian lives in the drone wars, not counting those who have also died in active war zones like Afghanistan, Syria and Iraq, which she says saw 6,000 civilian deaths in 2017 alone. Crider says that because humans are using the algorithms and corners are being cut, it leads to an “indifference to civilian life.”
The issue sparks an ethical debate: should the best developers try to improve the machine learning behind these deadly signature strikes, or should tech companies stay out of the defense sphere?
A slew of Google’s employees had something to say about it when the tech giant won a bid to work with the U.S. Department of Defense to develop artificial intelligence algorithms to process drone video feeds. Google said its work on Project Maven was for “non-offensive uses,” but about four percent of its staff disagreed, writing an open letter to their CEO, made public via the New York Times, arguing that Google should not be in the business of war. Google decided not to renew its Project Maven contract and even created a new set of AI principles, which include that its AI research should not be used for weapons.
“I think the Project Maven example shows that you guys as developers have actually a lot more power than you think,” Crider said to the audience, “and you can and should ask questions:
- What am I actually building?
- What’s the supply chain?
- What other uses are there?”
She ended her talk by saying that developers need to recognize their own power, negotiate not just pay but the work itself, ask questions, and ask for help from more knowledgeable resources (like human rights law firms) when they don’t know the answer.
“We all have a collective responsibility to be more informed.” — Cori Crider
Responsible Development
So devs have power, but what does a responsible development process look like? This was the topic of the talk given by think tank Doteveryone’s Sam Brown and Container Solutions’ Ádám Sándor.
Sándor says making tech responsibility the new norm all comes down to tracking how data moves around a company.
He asked “Whose problem is it if data gets stolen? Was it devs not thinking, ops not securing or management not giving enough budget? In these situations, it’s very easy to think ‘This isn’t my own problem, I’m just a cog in the machine.'”
Sándor contends that the breaking down of intra-company silos and the agile movement are putting everybody (project managers, UX designers, software developers) on the same team, and that now is the perfect time to build ethics into these processes. This shows up in the growing popularity of multidisciplinary teams with shared ownership, where developers own the software all the way to production.
“That’s how we get the powerful developer team to make the connections,” he said.
“Start a conversation with people inside your company you’ve never talked with about this issue before.” — Yanqing Cheng
Brown said she believes developers and product owners have always had the power, but that her data responsibility think tank is working to put it all together into a meaningful action plan. It all starts with a definition of what responsible technology really is.
She went on to say that developers and product owners “must acknowledge the impact that these technologies have been having on our relationships and our communities and the very institutions that form our societies.”
At the highest level, this means making sure tech is:
- not creating or deepening inequality
- recognizing and respecting people’s dignity and human rights
- giving people confidence and trust in its use
This led to the development of the 3C Model to support this approach, which has devs and product owners asking questions and considering responsibility at each stage of a project.
This adaptable framework comes down to developers being cognizant of:
- Context — understanding how the technology operates in the wider world from the very start of development, including how you understand the user journey, building with a diverse team, and designing inclusively
- Consequences — how the tech is going to be monitored and supported, how it can affect social norms, security and reliability, and how to anticipate unintended consequences
- Contribution — holistically considering cross-functional, cross-sector ownership, algorithm inputs, and best practices.
The idea is that teams will soon implement a responsible tech product assessment to follow along the way. Even pausing to review responsible and ethical criteria during retrospectives can lead to more ethical behavior: include ethical considerations in your documentation and backlog, note what you missed or didn’t have the budget to test, and act transparently with users if you are releasing a less secure new product.
“Don’t build something if you don’t have the budget to build the security infrastructure properly. Knowing your limits is also important to behave ethically.” — Ádám Sándor
“Have you done these things that you need to consider to be responsible? The act of actually having to stop and consider and having a checklist that you can go against in all the areas of responsibility in a given process,” Brown said.
“You have to think about what matters to you on a daily basis and what makes you sleep at night — then you can take the forward step.” — Andrea Kock
She clarified that there is no gold standard; ethics and ethics assessment are part of continuous improvement. Each organization can create its own dashboard to measure against and leadership training to support the behavior. And it’s not just about ethics, it’s about good business, too.
Yes, these may all seem like simple ideas, and the agile development world hardly needs more frameworks and canvases, but many of the speakers argued that starting small leads to empowerment.
In the end, Currie doesn’t look at Coed Ethics as just a conference, but as the starting point for a conversation and a movement, as she looks to make London the epicenter of a tech ethics ripple that will eventually flow around the tech world.