THE LATEST THINKING

The opinions of THE LATEST’s guest contributors are their own.

Good Use of Technology or Big Brother?

W. Scott Cole

Posted on February 18, 2020 08:10

Any time a new technology comes on the scene that law enforcement can use, each of us has to decide whether the way it is used is a good use or whether Big Brother is reaching even further into our lives. I offer no opinions this time (I hope), but I would be interested in knowing how readers feel.

The technology this time is facial recognition. The company using it is named Clearview AI, and its tool is offered solely to law enforcement. Its one mission is to help identify victims of sexual abuse, in particular child victims. I’m sure we can all agree the police can use all the help they can get in that battle, and Clearview presents what some call a very powerful tool for victim identification.

It allows police to identify the names and locations of minors in pornographic videos and photos when there is no other way to identify them. In one case in Indiana, detectives ran photos of 21 victims of a single criminal through Clearview’s database and received back 14 identifications, the youngest just 13 years old. A victim identification officer in Canada called Clearview’s technology “the biggest breakthrough in the last decade” in fighting child pornography, even though until recently the focus had been on identifying the abuser rather than the victim.

The company is very secretive and operated in near obscurity until recently, when the New York Times ran a piece revealing that law enforcement agencies across the country, at both the local and federal levels, use Clearview’s tools.

Critics of Clearview point to how the company obtains the materials that power its tool, among other issues. There is obviously no problem with investigators uploading photos to Clearview to run through its algorithms, but it doesn’t stop there. Clearview stores those pictures, which, in time, could create a massive database of victims. In addition, the company scrapes photos off social media sites, including Facebook, Twitter, Venmo, and YouTube. At this time, Clearview’s database contains over three billion pictures obtained from the public internet without the knowledge of the people whose photos were put into the database.

In addition, the software has not been tested for accuracy by an independent agency, and facial recognition technology makes a LOT of mistakes. According to Liz O’Sullivan, Technology Director at the Surveillance Technology Oversight Project, this is especially true for children, both because their faces change so much as they age and because children are very seldom included in the data sets used to train the algorithms.

A mistake in this one area could have a devastating effect on misidentified child victims and their families. It could also ruin the lives of misidentified perpetrators and their families because, far too often, especially in child sex cases, “innocent until proven guilty” does not apply, and it is possible to obtain a conviction based solely on an accusation.

Says Ms. O’Sullivan, “The exchange of freedom and privacy for some early anecdotal evidence that it might help some people is wholly insufficient to trade away our civil liberties.” It seems Twitter, YouTube, LinkedIn, and others agree and have sent Clearview formal cease-and-desist letters demanding it stop mining photos from their sites. Clearview’s response is that it has a First Amendment right to those photos. I see a court battle coming.
