Lack of diversity in data science perpetuates AI bias


SOURCE: SILICONANGLE.COM
MAR 09, 2022

Data privacy measures such as the General Data Protection Regulation and the California Consumer Privacy Act are expanding the definition and protection of sensitive personal data. Anonymization efforts, though valiant, can only go so far.

“You can only manage what you measure, right?” said Hannah Sperling, business process intelligence, academic and research alliances at SAP SE. “But if everybody is afraid to touch sensitive data, we might not get to where we want to be. I’ve been getting into data anonymization procedures, because if we could render more workforce data usable, especially when it comes to increasing diversity in STEM or in technology jobs, we should really be letting the data speak.”
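Sperling did not detail the anonymization procedures she has been exploring, but a common first step is to drop direct identifiers and generalize quasi-identifiers until no small group of employees can be singled out. The sketch below is purely illustrative: it assumes a hypothetical workforce table with columns such as employee_id, name, age, location and job_family, and applies a simple k-anonymity-style suppression rule rather than any specific SAP procedure.

```python
# Illustrative sketch only: a k-anonymity-style pass over a hypothetical
# workforce table. Column names and the K threshold are assumptions, not
# details from the interview.
import pandas as pd

K = 5  # minimum group size before a record is considered safe to release


def anonymize(df: pd.DataFrame) -> pd.DataFrame:
    # Drop direct identifiers outright.
    out = df.drop(columns=["employee_id", "name"])

    # Generalize quasi-identifiers: exact age -> decade band, "City, Country" -> country only.
    out["age_band"] = (out["age"] // 10 * 10).astype(str) + "s"
    out["location"] = out["location"].str.split(",").str[-1].str.strip()
    out = out.drop(columns=["age"])

    # Suppress any record whose quasi-identifier combination is shared by
    # fewer than K employees, so no one can be re-identified from a rare combination.
    quasi = ["age_band", "location", "job_family"]
    group_sizes = out.groupby(quasi)["age_band"].transform("size")
    return out[group_sizes >= K]


if __name__ == "__main__":
    # Hypothetical input file with columns: employee_id, name, age, location, job_family, gender.
    workforce = pd.read_csv("workforce.csv")
    print(anonymize(workforce).head())
```

The remaining columns, such as gender or job family, can then be analyzed in aggregate to measure representation without exposing individual employees.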

Sperling spoke with Lisa Martin, host of theCUBE, SiliconANGLE Media’s livestreaming studio, during the Women in Data Science (WiDS) event. They discussed data anonymization and the inherent bias of human-generated analysis.

Complete objectivity is logically impossible

Taking the human factor out of analysis is not only idealistic; it is also the wrong path, according to Sperling. Since analysis inherently looks backward at data shaped by human decisions, she believes the right approach is to recognize and adjust for the biases embedded in that data.

“I’m sometimes amazed at how many people still seem to think that data can be unbiased,” Sperling said. “The sooner that we realize that we need to take into account certain biases, the closer we’re going to get to something that represents reality better and might help us to change reality for the better as well.”

A lack of diversity in data science has perpetuated bias in artificial intelligence systems, from soap dispensers that recognize only light-colored skin to automated decisions on hiring, financial applications and parole approvals.

“There is a big trend around explainability, interpretability in AI worldwide because awareness around those topics is increasing,” Sperling explained. “That will show you the blind spots that you may have, no matter how much you think about the context. We need to get better at including everybody; otherwise you’re always going to have a certain selection bias.”
