AI News: Police Technology Updates

Amazon's face-detection technology for police shows bias, researchers say

NEW YORK - Facial-detection technology that Amazon is marketing to law enforcement often misidentifies women, particularly those with darker skin, according to researchers from MIT and the University of Toronto.

Privacy and civil rights advocates have called on Amazon to stop marketing its Rekognition service because of worries about discrimination against minorities.

The new study, released late Thursday, warns of the potential for abuse of facial-detection technology and the threats it poses to privacy and civil liberties.

Matt Wood, general manager of artificial intelligence with Amazon's cloud-computing unit, said the study tested “facial analysis” and not “facial recognition” technology; facial analysis, he said, “can spot faces in videos or images and assign generic attributes such as wearing glasses.”

“If you sell one system that has been shown to have bias on human faces, it is doubtful your other face-based products are also completely bias free,” Joy Buolamwini, one of the researchers, wrote.

Amazon’s website credits Rekognition with helping the Washington County Sheriff’s Office in Oregon reduce the time it took to identify suspects from hundreds of thousands of photo records.

Response: Racial and Gender bias in Amazon Rekognition — Commercial AI System for Analyzing Faces.

Affect recognition is one way of describing facial attribute classification that deals with facial expressions and inferred emotions.

Just because a company makes a claim about being able to determine attributes about an individual based on her face doesn’t make it true, appropriate, or even scientifically valid.

Gender Classification

Our studies on facial analysis technology sold by companies like Amazon have focused on binary gender classification to provide just one example of how facial analysis technology can be biased.

Again, like its peers, Amazon should submit its models, both the new ones and the legacy ones still being used by customers, to the national benchmarks for evaluating the accuracy of systems that analyze human faces.

Companies like Amazon often provide AI services that analyze faces in a number of ways, offering features like labeling gender or providing identification services.

In my studies of commercial AI systems, I find time and time again that the internal accuracy rates reported by companies are at odds with the external accuracy rates reported by independent third parties.

We have to keep in mind that when we talk about accuracy for a facial analysis technology, accuracy is determined using a set of face images or videos collected for testing these systems.

With technologies like machine learning, which learn patterns of things like faces from data, data is destiny, and we are destined to fail the rest of the world if we rely on pale male datasets to benchmark our AI systems.

Back in 2014, computer vision researchers gained a false sense of universal progress in facial recognition when Facebook announced 97% accuracy on the gold-standard benchmark of the day, Labeled Faces in the Wild.

I decided to choose three African countries and three European countries to balance the benchmark on skin type, including faces of some of the lightest-skinned individuals as well as very dark-skinned individuals.

All systems performed better on male faces than female faces overall, and all systems performed better on lighter-skinned faces than darker-skinned faces overall.

Error rates were as high as 35% for darker-skinned women, 12% for darker-skinned men, 7% for lighter-skinned women, and no more than 1% for lighter-skinned men.
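The disaggregated evaluation described above, reporting error rates per demographic subgroup rather than one overall number, can be sketched as follows. The group names, labels, and toy counts here are illustrative stand-ins, not the study's actual data:

```python
from collections import defaultdict

def error_rates_by_group(records):
    """Compute classification error rates disaggregated by demographic group.

    `records` is a list of (group, predicted_label, true_label) tuples;
    the group names and labels are illustrative.
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Toy data mirroring the *pattern* reported (not the study's real results):
# 3 of 10 darker-skinned women misgendered, 0 of 10 lighter-skinned men.
records = (
    [("darker_female", "male", "female")] * 3
    + [("darker_female", "female", "female")] * 7
    + [("lighter_male", "male", "male")] * 10
)
rates = error_rates_by_group(records)
# rates["darker_female"] -> 0.3, rates["lighter_male"] -> 0.0
```

A single aggregate accuracy over these twenty records would be 85%, which is exactly how subgroup disparities like the ones above get hidden in headline numbers.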

The discrepancies in thresholds and reporting methodology are exactly why we need stricter standards, detailed user guidance, and external evaluation that is based on real-world use.

Anytime a company claims to have a completely accurate system, or claims it is unable to replicate a result, but does not provide a detailed demographic breakdown of its benchmarks or evaluation methods, be skeptical.

Any publicly funded agency using or considering Amazon's services needs to check which versions are deployed and demand external testing of those versions for bias.

However, policing often relies on looking for a profile of a person of interest based on demographic attributes like gender, race, and age, as well as physical characteristics like facial hair.

To determine whether a face in an image or video belongs to a person of interest, the detected face may need to be compared against databases containing more than 100 million faces.
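A one-to-many search of this kind is commonly sketched as a similarity comparison between face embeddings, with a threshold deciding which candidates count as matches. The embeddings, gallery size, and threshold value below are illustrative assumptions (real systems use learned embeddings over galleries of millions), but the sketch shows why the threshold choice discussed above matters so much: moving it directly changes who gets flagged.

```python
import numpy as np

def match_face(probe, gallery, threshold=0.9):
    """Return indices of gallery embeddings whose cosine similarity to the
    probe embedding meets the threshold, plus all similarity scores.

    `probe` is a 1-D embedding vector; `gallery` is a 2-D array with one
    embedding per row. Values here are toy stand-ins for real learned
    face embeddings.
    """
    probe = probe / np.linalg.norm(probe)
    gallery_norm = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    sims = gallery_norm @ probe          # cosine similarity per gallery face
    return np.nonzero(sims >= threshold)[0], sims

# Tiny illustrative "gallery" of three 2-D embeddings.
gallery = np.array([[1.0, 0.0],
                    [0.0, 1.0],
                    [0.9, 0.1]])
probe = np.array([1.0, 0.05])
hits, sims = match_face(probe, gallery, threshold=0.9)
# hits -> indices 0 and 2; lowering the threshold would add more candidates.
```

At the scale of 100-million-face databases, even a tiny false-match rate at a given threshold translates into many innocent people surfacing as candidates, which is why disclosed thresholds and demographic breakdowns are central to evaluating these systems.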

In the US alone, Georgetown released a study showing that 1 in 2 adults, more than 117 million people, have their faces in facial recognition databases that can be searched without a warrant using AI systems that haven't been audited for accuracy.

As an expert on bias in facial analysis technology, I advise Amazon to 1) immediately halt the use of facial recognition and any other kind of facial analysis technology in high-stakes contexts like policing and government surveillance, and 2) submit the company's models, including those currently in use by customers, to the National Institute of Standards and Technology benchmarks.

Joy Buolamwini is the founder of the Algorithmic Justice League, which uses art and research to illuminate the social implications of artificial intelligence.


The National Police Lab AI: collaboration between scientists and Police

The automatic online processing of police reports and the analysis of hours of image material in seconds: these are just two examples of how the police could benefit from artificial intelligence.

‘This collaboration will give us the knowledge we need to be able to respond to these changing times.’ Scientists from the University of Amsterdam working at the lab will focus on artificial intelligence applications that automate time-consuming police work and support difficult tasks.

The University of Amsterdam will concentrate on extracting relevant information from large quantities of data, as computer programs are far better at recognising underlying patterns than humans are.

‘Once ready, the chatbot will know exactly which questions to ask to ensure that police reports are processed successfully.’ Four researchers will initially start their research at Utrecht University.

BREAKING: Vancouver Police Are Using Prediction Technology for Pre-Crime Arrests!

Remember the film Minority Report? Well, the Vancouver Police Department has just made that ...

AI Glasses Pick Out Criminals in China | Tech Trends


Police using AI to spot fake robbery claims

Police are now using AI to spot fake robbery claims: Spain has adopted an artificial intelligence system capable of uncovering false crime and theft claims.

GITEX showcases latest trends in Artificial Intelligence technology

The 38th edition of GITEX Technology Week has wrapped up in Dubai.

Can Artificial Intelligence Tell What a Guilty Person Looks Like?

Facial recognition is going mainstream. The technology is increasingly used by law-enforcement agencies and in schools, casinos and retail stores, spurring ...

8 NEW ROBOT Inventions That will Shock You

Check out these mind-blowing new advancements in robotics ...

10 Jobs Artificial Intelligence Can’t Take Away From Humans

Let's face it. Pretty soon, robots will take over the world, and humanity will become a distant memory. The good news is, by the time technology catches up to The ...

AI POLICE STATE: China to use technology to predict crimes BEFORE they happen

China is hoping to use artificial intelligence (AI) to look into the future and ...

AI Bots Now Help Police

Artificial intelligence now helps with custody decisions. This information was recently released ...

Russian Army Alien Tech Terminator Robots Cyborgs To Crush US Military. Don't Believe? Watch This.

It's not a propaganda video but a scientific explanation of Russian military capabilities that will blow your mind, because they are centuries ahead of American ...