Facial recognition tech misidentifies people of color up to 100x more than white faces: report
Dive Brief:
- Top facial recognition systems misidentify people of color at higher rates than white people, according to a federal study from the National Institute of Standards and Technology (NIST).
- Asian and African American faces were misidentified up to 100 times more frequently than white faces in some algorithms when conducting one-to-one matching. In one-to-many matching (the type of search done by law enforcement investigators that compares an image to a database of other faces), African American women were falsely identified most frequently.
- The rate of misidentification can lead to false accusations and even security concerns, allowing access to imposters, according to the report. The study assessed 189 software algorithms from 99 developers, representing a majority of the industry.
Dive Insight:
The findings are sure to throw more momentum behind efforts to regulate or restrict the growing use of facial recognition technology by governments and private businesses, especially in law enforcement.
More police departments have explored the use of the technology for crowd control or to identify suspects. The Federal Bureau of Investigation (FBI) has run over 390,000 facial recognition searches since 2011, according to The Washington Post. The Department of Homeland Security has also said it could use facial recognition at the border and for travelers, although it dropped plans this month to seek permission to use the technology to scan travelers coming in and out of the country.
Concerned about the potential civil rights implications, cities including San Francisco; Oakland, CA; and Somerville, MA have passed bans on government use of the software. Portland, OR is weighing a ban that would extend to private businesses as well. But those restrictions might prove more difficult to enforce than previously thought.
San Francisco is poised to amend its restriction on facial recognition technology, permitting city employees to use devices like Apple products that already have facial recognition built in as an unlocking feature.
Some Democratic lawmakers, who have explored restrictions on the tools, said the NIST report should give users pause. NIST reviewed algorithms from tech giants like Intel and Microsoft, although notably the study does not cover the algorithm Amazon uses in its Rekognition system, which has been marketed to police departments.
Among U.S.-developed algorithms, there were high rates of false positives in one-to-one matching (the kind used to unlock a cellphone or verify identity at a border) for Asians, African Americans and American Indians, with American Indians showing the highest rate of false positives. However, algorithms developed in Asian countries did not show the same high failure rates.
In a statement, NIST computer scientist Patrick Grother said that finding was an "encouraging sign that more diverse training data may produce more equitable outcomes, should it be possible for developers to use such data." Overall, he said, the report should help policymakers and developers think "about the limitations and appropriate use of these algorithms."
Recommended Reading:
- Smart Cities Dive: San Francisco to amend facial recognition ban
- Smart Cities Dive: What San Francisco's facial recognition technology ban means for other cities