The Dangers of Algorithmic Bias


In this special guest feature, Sanjay Arora, CEO and Founder of Million Short, examines the dangers of algorithmic bias. Sanjay is also the founder of Nextopia, a search and navigation engine for online retailers. He first honed his search-engine expertise while building Nextopia from a small team into one of the largest eCommerce employers in Toronto, serving over 1,600 customers and partnering with platforms like Yahoo and Shopify. He is dedicated to disrupting the search engine status quo and to improving the way people discover information online.

For most people, it’s impossible to go any length of time without interacting with an algorithm. Algorithms are behind every online search we make, every resume we review, and every potential dating partner we swipe on. That means they have a significant influence over the way we perceive and experience the world.

Algorithms enable us to accomplish things that previous generations could never have dreamed of. But their ubiquity can also lead to serious social consequences.

Algorithms Adopt Society’s Unjust Biases

It’s tempting to regard algorithms as objective, unbiased constructs that are free from the corruptive influence of human prejudices and biases. Unfortunately, this is seldom the case. Because algorithms are designed by humans – and increasingly learn by observing human behavior – they tend to adopt the biases of their developers and of society as a whole.
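To make the mechanism concrete, here is a toy sketch in Python. The data and numbers are entirely invented for illustration; the point is only that a naive model trained on historically biased decisions will reproduce the disparity baked into its training data.

```python
# Hypothetical historical hiring records: (group, hired?).
# The numbers are invented, and deliberately skewed by past bias.
history = (
    [("male", True)] * 80 + [("male", False)] * 20
    + [("female", True)] * 40 + [("female", False)] * 60
)

def hire_rate(records, group):
    """Fraction of applicants in `group` who were historically hired."""
    outcomes = [hired for g, hired in records if g == group]
    return sum(outcomes) / len(outcomes)

def naive_model(group):
    """A 'model' that just predicts hire when the historical rate
    for the group exceeds 50% -- it inherits the data's disparity."""
    return hire_rate(history, group) > 0.5

print(naive_model("male"))    # True  -- 80% historical hire rate
print(naive_model("female"))  # False -- 40% historical hire rate
```

No prejudiced rule was written anywhere in this code; the bias arrives entirely through the data the model "learns" from, which is exactly how real systems go wrong.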

In recent years, the phenomenon of “algorithmic bias” has manifested itself in a number of disturbing ways. Algorithms employed by job search platforms have been shown to recommend male candidates for executive roles more frequently than female candidates, perpetuating gender-based employment discrimination. One researcher found that online searches for names associated with the black community were 25% more likely to generate ads insinuating that the person has a criminal background – whether or not that insinuation is true.

In some cases, the consequences of biased algorithms are even more severe. COMPAS is a software tool that uses algorithms to predict which convicted criminals are likely to re-offend. Judges throughout the country rely on COMPAS forecasts to help them determine sentences, bail amounts, and parole decisions. There’s just one problem: ProPublica recently found that COMPAS discriminates against people of color, incorrectly labeling black offenders as high-risk twice as often as it does white offenders.

Similar racial biases have been found in algorithms that determine whether or not a person’s loan application should be approved. These are decisions with serious ramifications for people’s lives, and they’re being influenced by algorithms that perpetuate – rather than mitigate – unjust biases.

Facing Our Own Biases

At the end of the day, we have to face the fact that algorithmic biases generally reflect the biases of society. An algorithm can be a dark mirror that reflects our own conscious and unconscious prejudices right back at us. On an individual level, this can produce a feedback loop that continually reinforces our prejudices rather than challenging them.

Personalization algorithms like those employed by Google News and Facebook’s News Feed create “filter bubbles” that degrade our capacity for empathizing with people who have opinions and ideas that differ from our own. This phenomenon has serious social consequences, as leaders like Bill Gates and Angela Merkel have begun to point out.
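The feedback loop behind a filter bubble can be sketched in a few lines of Python. This is a deliberately crude stand-in for a real recommender – the topics and the "always recommend the most-clicked topic" rule are invented for illustration:

```python
from collections import Counter

# Hypothetical catalog of topics a feed could surface.
articles = ["politics-left", "politics-right", "sports", "science"]

def recommend(click_history):
    """Recommend whatever topic the user has clicked most often."""
    if not click_history:
        return articles[0]
    return Counter(click_history).most_common(1)[0][0]

# Simulate a user who dutifully clicks whatever is recommended.
clicks = ["politics-left"]
for _ in range(5):
    clicks.append(recommend(clicks))

print(set(clicks))  # the feed has narrowed to a single topic
```

Even this trivial loop converges immediately: each recommendation reinforces the click history that produced it, and the other three topics never surface again.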

Algorithmic bias is so dangerous in part because we seldom see it coming. It’s time we begin treating algorithms with the same level of scrutiny that we apply to human actors. And it’s crucial that developers critically examine their own prejudices and those of society so that they can correct for unjust biases in the design of their algorithms.

 

Sign up for the free insideBIGDATA newsletter.
