Shareholders want Amazon’s ‘faulty’ face recognition software to be used by police

Montage of faces simulating Amazon's facial recognition software
Glen Black

Amazon shareholders voted not to restrict the company’s sales of automated facial recognition (AFR) software. And the margin of the vote was shocking.

Only 2.4% voted to restrict

On 22 May, shareholders voted against a proposal to stop sales of Rekognition, Amazon’s AFR software, to US police forces. And on 28 May, details of the vote were published. They showed that just 2.4% of shareholders were in favour of restricting Rekognition’s sales. The motion needed at least 50% to pass. Meanwhile, 27.5% of shareholders voted for a second proposal asking Amazon to conduct an independent human rights assessment of the software.

Before the breakdown was published, the American Civil Liberties Union (ACLU) said the proposal “should serve as a wake-up call” to Amazon about “the real harms of face surveillance”. Digital privacy campaign group Open MIC, which organised the proposal for an independent assessment, nonetheless celebrated the result. It described the number of shareholders voting for an assessment as “precedent-setting”, saying it shows that they:

understand the social and business risks of facial recognition technology and are prepared to press for corporate accountability on this issue over the long-term.

“Useful applications”

Amazon defended Rekognition, telling technology website TechCrunch that it “has many useful applications in the real world”. It also rebutted concerns about the software, stating:

Our quality of life would be much worse today if we outlawed new technology because some people could choose to abuse the technology.

The company also called for legislation on the use of AFR software. The technology is currently unregulated by the US federal government, although authorities in San Francisco banned it from the city.

Both votes were held on a ‘one share, one vote’ basis, meaning those with the most shares had the biggest influence over the outcome. Amazon founder Jeff Bezos retains about 12% of shares, and the four biggest individual shareholders are all senior Amazon employees. Furthermore, the company itself had tried to stop the vote but was overruled by the US government. Instead, it recommended shareholders vote against both proposals.

Gender and ethnic bias

The Amazon shareholders’ decision came after real-world trials showed at least one police force in Oregon misusing the AFR software. And in April, researchers from the Massachusetts Institute of Technology showed that Rekognition displayed gender and ethnic bias. Amazon reportedly pitched its software to Immigration and Customs Enforcement (ICE) in the summer of 2018. ICE has been criticised by groups such as the ACLU for posing “threats to civil liberties” and has been implicated in racial profiling. However, technology website the Verge said there was “no indication that ICE ultimately purchased or used the system”.

A 2018 statement, signed by 67 civil liberties groups, faith and ethnicity advocates, press freedom organisations, digital privacy firms, and lawyers guilds, called for Amazon to:

stop powering a government surveillance infrastructure that poses a grave threat to customers and communities across the [US]. … People should be free to walk down the street without being watched by the government. Facial recognition in American communities threatens this freedom. In overpoliced communities of color, it could effectively eliminate it.

Yet votes against both proposals showed an overwhelming desire to place marketability over ethics. Perhaps it’s because those holding a majority of shares are least likely to be targeted by law enforcement.

Fighting back

People are also fighting back against the use of AFR software by police in the UK, where Japanese firm NEC’s NeoFace software has been in use since 2014. As in the US, police trials in England and Wales have consistently misidentified large numbers of people.

As this type of technology begins to take hold across the world, it’s necessary for people everywhere to hold governments and corporations accountable for its use. Because, as the UK government’s own surveillance commissioner Tony Porter pointed out in 2015, the mere presence of surveillance changes communities. And it’s a few wealthy shareholders who are driving that change for their own interests.

Featured image via YouTube – AWS Online Tech Talks
