Tackling “Coded Bias”

Machine learning is the latest technological craze. Machine learning algorithms, a subset of artificial intelligence, are trained on data to make predictions. Major companies such as Meta, Amazon and Google are funneling their massive vaults of data into machine learning algorithms. Google Maps, for example, combines historical traffic patterns with aggregate location data to avoid traffic jams and determine the best routes. 
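As a rough illustration of that idea (not an example from the documentary, and far simpler than what Google Maps actually does), the sketch below "learns" from example data and then predicts on new inputs: a least-squares line fit to hypothetical distance-versus-travel-time data.

```python
# Minimal sketch of "fed data to make predictions": fit a line to
# example data, then predict for an input the model has never seen.
# All numbers here are made up for illustration.

def fit_line(xs, ys):
    """Fit y = a*x + b by ordinary least squares."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return a, b

# Hypothetical historical data: trip distance (miles) -> time (minutes)
distances = [1, 2, 4, 8]
times = [5, 9, 17, 33]

a, b = fit_line(distances, times)

def predict(distance):
    return a * distance + b

print(predict(6))  # predicted minutes for an unseen 6-mile trip -> 25.0
```

Real systems use far richer models and vastly more data, but the pattern is the same: historical examples in, predictions out.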

Among the hottest machine learning algorithms are those that identify people's faces. Some believe these algorithms are more biased and invasive than the public has been led to believe. 

Director Shalini Kantayya’s documentary, “Coded Bias,” argues that machine learning algorithms are being used in unfair, biased ways. “Coded Bias” follows various authors and researchers to show misuses of machine learning in modern society. The documentary heavily focuses on Dr. Joy Buolamwini, founder of the Algorithmic Justice League. 

Kantayya had no experience with machine learning algorithms prior to the creation of this documentary. “I was brand new to the subject matter, so the journey that the audience goes on with ‘Coded Bias’ is the steep learning curve that I myself went on,” she said. She was shocked by the lack of regulation or government oversight of machine learning algorithms used to identify people. 

“Coded Bias” makes the point that machine learning, a concept considered by many to be absolute and unbiased, is nothing more than an echo of society’s existing biases. The documentary shows various places where machine learning is used, in addition to various areas where machine learning could be misused.  

One such place is Atlantic Plaza Towers, an apartment complex in the Ocean Hill-Brownsville area of Brooklyn. Prior to the creation of “Coded Bias,” the building’s landlord requested to add facial recognition software to its key fob entry system, which would require residents to scan their faces to enter their apartments. Buolamwini met with some residents as they aired complaints. 

At the same time, the documentary follows Buolamwini’s fight against the use of biased facial recognition algorithms. Buolamwini’s journey began when facial recognition software consistently failed to recognize her face. She soon realized that the software was more effective at recognizing men and lighter-skinned people. 
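A finding like that comes from measuring a model's accuracy separately for each demographic group instead of overall. The sketch below, with entirely hypothetical group labels and results, shows the basic bookkeeping behind that kind of audit.

```python
# Hedged sketch of a bias audit: compute per-group accuracy from a
# list of (group, predicted, actual) records. The groups and results
# below are hypothetical, for illustration only.

def accuracy_by_group(records):
    """Return {group: fraction of correct predictions} for each group."""
    stats = {}
    for group, predicted, actual in records:
        correct, total = stats.get(group, (0, 0))
        stats[group] = (correct + (predicted == actual), total + 1)
    return {g: correct / total for g, (correct, total) in stats.items()}

# Hypothetical audit of one face-matching model, evaluated per group
results = [
    ("group A", "match", "match"),
    ("group A", "match", "match"),
    ("group B", "no match", "match"),
    ("group B", "match", "match"),
]
print(accuracy_by_group(results))
# -> {'group A': 1.0, 'group B': 0.5}
```

An overall accuracy of 75% would hide the gap; breaking results out by group is what reveals it.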

The documentary alternates among issues in facial recognition software, deeper fundamental issues in machine learning and personal grievances with machine learning, arguing that the harvesting of face data requires guidelines and oversight. It concludes with Buolamwini’s testimony to Congress: “Our faces may well be the final frontier of privacy.” 

More information on “Coded Bias” can be found at https://www.codedbias.com/. This documentary can be streamed on Netflix. 

About The Author

Karim Gueye

Gueye (Computer Science '22) is part of the Vector writing and copy editing teams.
