Outrage over police brutality has finally convinced Amazon, Microsoft, and IBM to rule out selling facial recognition tech to law enforcement. Here's what's going on.

Authored by businessinsider.com and submitted by maxwellhill

IBM, Amazon, and Microsoft have all committed to stop selling facial recognition to law enforcement, at least temporarily.

While activists have been campaigning for the companies to do this for years, the Black Lives Matter movement appears to have tipped the scale.

As facial recognition becomes more widely used to catch criminals, illegal immigrants, or terrorists, there is mounting concern about how the technology might be abused.

Each company has made subtly different promises about its sales ban.


Three of the world's biggest tech companies have backed off selling facial recognition to law enforcement amid ongoing protests against police brutality.

IBM announced on Monday it is halting the sale of "general purpose" facial recognition. Amazon on Wednesday announced it was imposing a one-year suspension on the sale of its facial recognition software to law enforcement. Microsoft followed suit on Thursday, saying it does not sell facial recognition tech to US police forces and will not do so until legislation is passed governing the use of the technology.

It's something of a U-turn: for years, activists and academics have been urging companies not to sell facial recognition to law enforcement on the basis that it exacerbates racial injustice.

Facial recognition is becoming an increasingly popular tool for government agencies and law enforcement to track down criminals, terrorists, or illegal immigrants. But with the Black Lives Matter protests sweeping across the world in the wake of George Floyd's death, there are now fresh calls for the big tech companies to stop selling the technology to police.

The argument long put forward by civil rights groups and AI experts is that facial recognition disproportionately affects people of color in two ways.

Firstly, like any policing tool operated by systemically racist societies or institutions, it will inevitably be used to target people of color more often.

Secondly, the datasets used to train facial recognition algorithms are often made up predominantly of pictures of white men. That ingrains racial bias into the software, making it more likely to misidentify women and people of color, which in turn can lead to more wrongful arrests.
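To make that second point concrete, here is a hypothetical sketch, not any company's actual system, of a toy 1:N identification experiment. The only assumption baked in is that the matcher produces noisier face embeddings for an underrepresented group (a stand-in for being trained on fewer examples of that group); that assumption alone is enough to produce a higher misidentification rate for that group. All names and numbers are illustrative.

```python
# Hypothetical illustration only: a toy 1:N face identification experiment.
# Noisier embeddings stand in for a model that represents one group less
# reliably; no real model or dataset is used here.
import numpy as np

rng = np.random.default_rng(0)

def misidentification_rate(embedding_noise, n_people=200, dim=32):
    """Simulate a gallery of n_people known faces and one probe image per
    person, match each probe to its nearest gallery entry, and return the
    fraction of probes matched to the wrong identity."""
    identities = rng.normal(size=(n_people, dim))  # "true" face vectors
    gallery = identities + rng.normal(scale=embedding_noise, size=(n_people, dim))
    probes = identities + rng.normal(scale=embedding_noise, size=(n_people, dim))

    # Nearest-neighbour search by Euclidean distance (the 1:N database lookup).
    dists = np.linalg.norm(probes[:, None, :] - gallery[None, :, :], axis=-1)
    predicted = dists.argmin(axis=1)
    return float((predicted != np.arange(n_people)).mean())

# Well-represented group: tight embeddings. Underrepresented group: noisier
# embeddings from the same matcher.
print("misidentification rate, well-represented group:", misidentification_rate(0.5))
print("misidentification rate, underrepresented group:", misidentification_rate(1.5))
```

The point of the toy setup is only that an accuracy gap requires no malicious intent: if the underlying model represents one group less reliably, routine database searches will return wrong matches for that group more often.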

Here is a breakdown of exactly what each company has promised:

IBM is halting the sale of "general purpose" facial recognition altogether.

Amazon is imposing a one-year suspension on sales of its facial recognition software to law enforcement.

Microsoft says it does not sell facial recognition to US police forces and will not do so until legislation governing the technology's use is passed.

graebot on June 13th, 2020 at 15:36 UTC »

Let's be real. As soon as the public eye moves on, sales will be back on. You can trust huge companies to make money any way they can get away with.

lilrabbitfoofoo on June 13th, 2020 at 15:28 UTC »

But they will all "sell" it to newly formed opaque corporations that will rebrand the tech and sell it to law enforcement...

GORDON1014 on June 13th, 2020 at 15:23 UTC »

Instead they are going to rent out the private police force they form that has access to these technologies