Whether you're searching for information on Google, being subjected to facial recognition technology built by Amazon, wondering if your Facebook post will be taken down or curious whether your resume made it to the top of the pile, behind-the-scenes systems fueled by essentially unknowable algorithms are powering these and countless other online and offline processes.
A bill introduced Wednesday by Democratic Sens. Ron Wyden and Cory Booker, called the Algorithmic Accountability Act, aims to force tech giants to audit their machine-learning and artificial intelligence systems for bias. A similar bill has been introduced in the House of Representatives.
Tech firms have been loath to share much information about how these internal computational systems work. As such, they are often characterized as "black boxes" with the potential for abuse. Where there's no light, critics say, bias and discrimination against groups based on race, gender and other traits can flourish.
“The harms are not evenly distributed, but this is in our lives, right?” Meredith Whittaker, a co-founder of the AI Now Institute, asked in a recent conversation with Kara Swisher. “There are license-plate profiling AIs that are sort of tracking people as they go over different bridges in New York. You have systems that are determining which school your child gets enrolled in. You have automated essay scoring systems that are determining whether it’s written well enough. Whose version of written English is that? And what is it rewarding or not? What kind of creativity can get through that?”
The bill would empower the Federal Trade Commission to create and enforce new rules for companies to check for accuracy, bias and potential privacy or security concerns in their automated systems, and to correct them if issues are discovered. Smaller firms that make less than $50 million per year are exempted, unless they are data brokers with information on at least 1 million customers.
“Computers are increasingly involved in so many of the key decisions Americans make with respect to their daily lives — whether somebody can buy a home, get a job or even go to jail,” Wyden said in an interview with The Associated Press.
Advocates for a more ethical approach to artificial intelligence have pointed out that machine-learning systems are still built by humans (among Silicon Valley's biggest players, that typically means white men) and therefore should not be seen as neutral or infallible.
“[Amazon] took two years to design, essentially, an AI automatic resume scanner,” the AI Now Institute's Kate Crawford told Swisher on the podcast. “And they found that it was so biased against any female applicant that if you even had the word ‘woman’ on your resume that it went to the bottom of the pile.”
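The failure mode Crawford describes, a model inheriting bias from skewed training labels, can be sketched in a few lines. The data and the word-weight scorer below are a hypothetical toy for illustration, not Amazon's actual system:

```python
from collections import Counter

# Hypothetical historical hiring data: (resume text, hired?) pairs in
# which resumes containing "women's" were disproportionately rejected.
history = [
    ("captain chess club", True),
    ("software engineer python", True),
    ("women's chess club captain", False),
    ("women's coding society lead", False),
    ("python developer", True),
    ("women's robotics team", False),
]

def train_word_weights(history):
    """Weight each word by how often it appears in hired vs. rejected resumes."""
    weights = Counter()
    for text, hired in history:
        for word in text.split():
            weights[word] += 1 if hired else -1
    return weights

def score(resume, weights):
    """Sum the learned word weights; higher scores rank nearer the top of the pile."""
    return sum(weights[w] for w in resume.split())

weights = train_word_weights(history)

# The word "women's" picks up a negative weight purely because of the
# skewed labels, so two otherwise-identical resumes score differently:
print(score("chess club captain", weights))          # higher score
print(score("women's chess club captain", weights))  # lower score
```

The point of the sketch is that no rule about gender was ever written; the penalty emerges automatically from biased historical labels, which is exactly the kind of effect the bill's mandated audits are meant to surface.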