Capitol: Police identify assailants using Clearview AI and facial recognition

According to Clearview AI’s CEO, law enforcement use of his company’s facial recognition technology jumped 26% the day after the Capitol attack. As first reported by The New York Times, CEO Hoan Ton-That confirmed that Clearview saw a sharp spike in search volume on January 7, 2021.

Exploiting captured images

The January 6 attack was broadcast live on cable channels and captured in hundreds of images and live feeds. Photos and videos showed the rioters’ faces as they entered the Capitol building. The FBI and other agencies have asked for the public’s help in identifying participants.

According to the Times, the Miami Metro Police Department used Clearview AI to identify some of the rioters, sending possible matches to the FBI’s joint terrorism task force. The Wall Street Journal reported that an Alabama police department was also using Clearview to identify faces in the riot footage before forwarding the information to the FBI.

Using Facial Recognition Technology

Some facial recognition systems used by authorities rely on official images, such as driver’s license photos. Clearview’s database, by contrast, contains some 3 billion images scraped from social media and other websites, which accounts for its effectiveness. The practice was revealed by a Times investigation last year.

In addition to raising serious privacy concerns, the practice of scraping pictures from social media violates the platforms’ rules. Following the investigation, technology companies issued numerous cease-and-desist orders to Clearview.

Clearview AI: facial recognition and controversy

Nathan Freed Wessler, Deputy Project Director of the ACLU’s Speech, Privacy, and Technology Project, said that while facial recognition is not a new technology, it is not regulated by federal law, and its potential for surveillance of communities of colour has rightly led state and local governments across the country to ban its use by law enforcement.

Wessler argued that if the use of the technology by police services becomes normalized, everyone knows who it will be used against most: members of Black and brown communities who already suffer under a racist law enforcement system. Clearview AI stated in May that it would stop selling its technology to private companies and would provide it to law enforcement only. According to the company, some 2,400 law enforcement agencies across the United States use Clearview’s software.
