Coded Bias review: Eye-opening documentary faces up to racist technology

Meet the activists fighting back against the human failings hardwired into the algorithms already ruling our lives.

Scientist and campaigner Joy Buolamwini faces the consequences of racially biased algorithms.

When exactly did computers start making decisions about our health care, our jobs, our access to opportunities, even whether we can walk down the street without being arrested? The new documentary Coded Bias delves into the chilling reality of how much power technology holds over us right here, right now — and introduces the campaigners fighting the tech that’s shaped by our worst human failings.  

In select theaters and streaming online, Coded Bias begins with Joy Buolamwini. We meet this infectiously curious Canada-born and Mississippi-raised computer scientist sporting Wakanda earrings in an MIT office filled with Lego, reading not a comic book but a book about how to make comic books. She talks about her youthful enthusiasm for the idea of technology transcending the problems of the real world — until a project involving facial recognition made her realize human failings are in fact hardwired into the things we build. The film shows a face analysis system fail to notice this talented Black engineer until she puts on a white mask: an eye-opening visual metaphor for how technology perpetuates the biases of the people who make it.

Facial analysis becomes a starting point for director Shalini Kantayya to deftly widen the scope. Coded Bias shines a light on the worryingly unregulated power that’s increasingly exerted by algorithms, data science, machine learning and so-called artificial intelligence, all doing the bidding of programmers and engineers and billionaires who are passing on their conscious and unconscious biases. The result is a brave new world of machines that looks a lot like it’s shaped by the racist and sexist power structures of the past.

The film dispenses with the familiar Hollywood comparisons to HAL and the Terminator before diving into the origin story of AI: a summer workshop of white academics at Dartmouth College in 1956. This serves as an interesting bit of history, but it also sets up an important theme. Today's innovation is beholden to the decisions not just of the people working in the field now, but of the people who laid the foundations and guided us to where we are today — steered knowingly or unknowingly by their own attitudes about what science should be and who it's for.

From yesterday’s (mostly white, mostly male) theorists to today’s (mostly white, mostly male) tech billionaires, there’s a marked continuity between the people in charge then and who’s in charge now. “Data is a reflection of our history,” as Buolamwini puts it. “The past dwells within our algorithms.”

This isn't some troubling but abstract thought experiment. AI shapes our lives now. The most chilling parts of Coded Bias highlight real examples of algorithms already making decisions about your credit, your health, your housing, your college or job applications, your access to possibility. The system holds your hope of a better life in its hands — and there's no appeal if the computer says no.

It’s alarming how much global power is accruing with the “big nine” tech giants: Amazon, Google, Facebook, Tencent, Baidu, Alibaba, Microsoft, IBM and Apple. Even more sinister is the prospect of law enforcement seizing on new technology before it’s been tested, or weighed by lawmakers. Sure, we can point at China’s unsettling Black Mirror-style social credit system and say at least we’re not like that. But in supposedly democratic capitalist economies, the problem is actually more insidious: over 117 million people in the US are registered in face recognition networks, for example. 

And on the camera-packed streets of London, the film meets activist group Big Brother Watch, which is both mounting high-level legal challenges against mass surveillance and pounding the pavement to help people threatened by police because a hidden camera pointed them out. One scene shows a schoolboy bundled off a busy street by a posse of plainclothes officers after a facial recognition system misidentified him. The schoolboy was Black. Another scene shows an absurd confrontation as police fine a man who covered his face while passing a facial recognition camera. Even if the technology were accurate in identifying suspects — it very much isn't — law officers are clearly using it to designate a behavior as suspicious, provoke that very behavior and manufacture punishment.

If facial analysis can’t accurately recognize Black faces and machine learning can’t unpick the human biases leaching into the data that feeds it, then the technology clearly has no place holding such monumental sway over people. 

The film pushes back against the seductive yet dangerous idea that technology in general is neutral and can be delegated difficult decisions. Allowing technology to make such significant decisions means abdicating responsibility for the catastrophic social and cultural divides humanity has stoked over the generations. If we let computers do the thinking, even if it means they keep making the same mistakes we have, then we opt out of reckoning with those mistakes.

Coded Bias is an essential warning, but it isn’t here to plunge you into existential despair as you sink back into doom-scrolling on your smartphone. Buolamwini, who founded campaign group the Algorithmic Justice League to fight back against bias in decision-making systems, personifies hope. Throughout the film, she and the filmmakers seek out and join forces with other campaigners, many Black, many women, chipping away at unseen monoliths of control from apartment blocks and hair salons and MIT labs to the corridors of power. 

If there's one thing to take away from Coded Bias, it's that this needs to be challenged now. We've opted into an unimaginably vast surveillance project controlled by a handful of profit-driven companies, billionaires and states, in which democratically elected lawmakers and the people themselves are steps behind, and we didn't read the terms and conditions. But Buolamwini and a league of activists around the world have already effected real change as tech giants rethink their systems and lawmakers tackle the implications.

Insightful documentary Coded Bias doesn't just begin with face recognition: it recognizes the women facing the future.

This article was first published on cnet.com.
