
Cultural Bias in Artificial Intelligence

Camille Eddy, a student pursuing a mechanical engineering bachelor's degree at Boise State and an intern at Nvidia, discusses the issue of bias in machine learning.
Aug 7th, 2018 11:03am


Culture Bias in AI with Camille Eddy

Advertising and white papers may make artificial intelligence seem like a pie-in-the-sky proposition, with easy analysis, deep insights, and fair algorithms available everywhere. The reality, however, is that AI can expose an even darker side of our own humanity, acting more as a mirror than as sky-pie. We saw this when Microsoft put an AI-driven bot on Twitter, only to have it spout racist statements shortly thereafter.

Camille Eddy, currently a student pursuing a mechanical engineering bachelor's degree at Boise State, already has a long career as a high-tech robotics intern at places like Alphabet and HP. She's interning at Nvidia, in fact, when she's not out on the speaking circuit. At OSCON, she spoke on the topic of recognizing cultural bias in AI.

“Some of the things we’ve seen are misclassification or misidentification. For example, Microsoft’s Tay AI, a bot that was released on Twitter, was famously easily influenced by people talking to it in racist and sexist ways, and it reflected that. People would say ‘This is an idea, you should hold this idea,’ and it did. [We’re] talking about ways it can reflect our own biases as a society, and how that might not be something that we want,” said Eddy.

Eddy takes the long view of this problem, expecting it to be one that teams will need to keep addressing for as long as there is AI to train. “I don’t think we will be a society that will produce technologies that will never have bias, because we’re not like that as human beings. We’re always going to have some type of bias. I am really interested in how we can de-bias technology in general and go broader,” said Eddy.
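De-biasing in practice often starts with the training data itself. As a rough illustration only (this is not a technique from Eddy's talk), the sketch below reweights training examples so that an under-represented group carries as much total weight as the majority group during training. The synthetic data, group labels, and choice of classifier are all assumptions made here for the example.

```python
# Minimal sketch of "reweighing" to counter group imbalance in training data.
# All data here is synthetic and purely illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic dataset: group 0 is heavily over-represented relative to group 1.
n_major, n_minor = 900, 100
n_total = n_major + n_minor
X = rng.normal(size=(n_total, 3))
group = np.array([0] * n_major + [1] * n_minor)
y = (X[:, 0] + 0.5 * group + rng.normal(scale=0.5, size=n_total) > 0).astype(int)

# Weight each example inversely to its group's frequency so both groups
# contribute equally to the training loss.
weights = np.where(group == 0, n_total / (2 * n_major), n_total / (2 * n_minor))

model = LogisticRegression().fit(X, y, sample_weight=weights)
print("accuracy, group 0:", model.score(X[group == 0], y[group == 0]))
print("accuracy, group 1:", model.score(X[group == 1], y[group == 1]))
```

Fairness toolkits such as IBM's AIF360 package more principled versions of this idea, but the broader point Eddy makes stands: de-biasing is a deliberate choice a team has to keep making, not a property that falls out of the technology on its own.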

In this Edition:

0:48: Tell us about some of the problems that have surfaced in AI.
1:43: How do we plan to take this into account for the future?
5:59: Who needs to be involved to make sure this doesn’t happen in a software project?
7:02: How did you become interested in AI?
13:13: How does this apply to emerging fields?
14:08: Who were you really trying to reach with your OSCON session about bias in AI?

Microsoft is a sponsor of The New Stack.

Feature image by James Pond on Unsplash.
