Facial recognition technology is quickly gaining traction, especially through the application Clearview AI. Professor Buzz Scherr discusses the legal implications, Axon's response to the technology, and how states are handling the privacy concerns.

We are now accepting applications for our International Criminal Law and Justice Programs. Learn more at http://law.unh.edu/iclj


A. J. Kierstead (Host):

Facial recognition technology is quickly gaining traction, especially Clearview AI. To discuss the concerns and response, I'm joined by Professor Buzz Scherr. This is the UNH Law Podcast. Learn more about the law school and apply by visiting law.unh.edu. Opinions discussed are solely the opinion of the faculty or host and do not constitute legal advice or necessarily represent the official views of the University of New Hampshire. So Buzz, we've discussed the concerns around facial recognition systems a few times in the past. But it seems like a few key players, especially Clearview AI, have really come to the forefront of concerns about digital privacy and law enforcement taking advantage of it.

Buzz Scherr:

Very much so. Clearview AI, AI being short for artificial intelligence, has put together a database of well over a million digital representations of people's faces by mining a variety of sources, particularly social media. And it has been offering that database to law enforcement agencies for their use in searches using facial recognition technology.

Buzz Scherr:

The difficulty is that the quality of the pictures from Clearview AI is not all that great. And there is a huge lack of clarity as to whether people agreed ... depending on the social media entity, or whatever other circumstance, the private commercial company. Whether, when they gave their facial image to that site or that company, they agreed for it to be put in a database for use by law enforcement.

Buzz Scherr:

Probably the huge, huge majority of those people had no clue that their pictures were going to be used in that way. And so the broad issue it raises is: what privacy rights do you retain when you put something out in some version of the public?

A. J. Kierstead (Host):

Yeah. This technique is called scraping, where they literally go over social media platforms, match the name and the image, take that information, and put it into their database. And this would be a terms of service issue too, wouldn't it?

Buzz Scherr:

It would. And boy, the terms of service for all the different websites and social media locations that they scraped are immensely varied. So it would be interesting to drill down ... and I haven't done this yet ... on just one or two or five or 10 of those websites and ask, did the terms of service include any mention of this kind of scraping for use by law enforcement? My guess is no. Probably what they have is some really broad "we can use this for whatever purpose we want to."

A. J. Kierstead (Host):

But also, what makes it hard is that a lot of these profiles are just public, and anyone with the proper technology can go in and take the data without even having to get into the system.

Buzz Scherr:

Yes. But the question is ... an individual can, a private entity can, a private individual can. But when you-

A. J. Kierstead (Host):

Sell that data, basically.

Buzz Scherr:

... When you sell that data to the government. One of the technical constitutional problems is that there's this thing called the third-party doctrine, based on an old 1970s case, Smith versus Maryland, which says that when you lodge information with a third party, like in that case a phone company, you do not have a reasonable expectation of privacy in that information.

Buzz Scherr:

That doctrine's been around for a long time. But the US Supreme Court decided a case a few years ago, US versus Carpenter, involving cell phone geolocation information in the possession of your cell phone provider ... that is, which towers your cell phone pinged on a certain date and at a certain time, and where those towers were. The US Supreme Court said that if the police want to get that information from your cell phone provider, they need a search warrant supported by probable cause.

Buzz Scherr:

So when a police agency goes to Clearview AI and says, run this face through your system, is that the same as the Smith versus Maryland case, or is it the same as the US versus Carpenter cell phone geolocation case? I tend to think at this point it's probably a good bit closer to the cell phone geolocation case, but that hasn't been decided yet.

Buzz Scherr:

The other problem is a reliability problem. If in the scraping, they acquire a photograph of varying quality-

A. J. Kierstead (Host):

Varying angle.

Buzz Scherr:

... And varying angle, and they acquire a name that is attached to that photograph. You only have to go onto Twitter and see the photos that people put up under their quote-unquote names to realize that there can be a lot of unreliable connections.

A. J. Kierstead (Host):

Yes. There's some composer called A.J. Kierstead up in Canada that always shows up when you search my name.

Buzz Scherr:

Right. Which makes me think I may want to do that when I get home tonight. So there are some not-insignificant reliability issues. And so there's a set of compounding issues that really cause great concern. So much so that at the end of last week, if I'm not mistaken, the New Jersey attorney general acted, at the end of the week in which an article came out in the New York Times about Clearview AI and what's been happening with them ... a very well-researched and well-written article.

A. J. Kierstead (Host):

Yeah. I'm going to link to that in the description. They also did an episode on The Daily, which is the New York Times podcast, which was fascinating.

Buzz Scherr:

That came out on, like, a Monday or a Tuesday. By Friday, the New Jersey attorney general had ordered that the state police not use Clearview AI in any fashion.

A. J. Kierstead (Host):

The podcast was really fascinating too, because there's this very shady background with the owner of the company, who's also the primary developer. He was using some fake name on LinkedIn as his marketing person.

Buzz Scherr:

Associated with a picture of who? Just out of curiosity.

A. J. Kierstead (Host):

Yes. And then they did a search for the journalist who was writing the article ... had law enforcement search for her name, and her name didn't show up. But when she got ahold of the owner, he said, oh yeah, you're in there. So basically it raises this weird question: how can you trust the data that's in there if this company is so secretive about everything, about how it operates?

Buzz Scherr:

Yeah. The other thing it's symbolic of is the latest craze. You can see it on a monthly basis, if not more frequently: a technology company comes up with an interesting idea and puts forward what I would call not-yet-ready-for-primetime technology. And it becomes the latest thing since sliced bread, or the 21st-century equivalent of sliced bread.

Buzz Scherr:

And then, over time, very often it starts to unravel, and too many government organizations or significant commercial entities are overly invested too quickly. So part of what's so hard to do, particularly with the onslaught of new technologies that we're seeing in this century, is to be patient and let things shake out for a good bit of time before we make them the be-all and end-all of decision making.

A. J. Kierstead (Host):

It seems like they did a really good job from the sales perspective, where they just kept saying, we're only going to push it to law enforcement. Law enforcement doesn't necessarily publicize everything they're doing on the investigative side, so they're not putting on Facebook, hey, guess what we now have for a new technology to find the latest criminal.

Buzz Scherr:

Here's an interesting ... another version of that. There's a private company called Axon Corporation. It's one of the leading providers of technology to police departments in the United States; they are one of the two leading providers of body-worn cameras to police departments. They have a quasi-independent Ethics and Science Advisory Board that has outside lawyers and ethicists on it, as well as technicians and scientists.

Buzz Scherr:

They took a look at the broad category of facial recognition technologies. And in the last six months, they ended up recommending to Axon that it not yet provide facial recognition technology to police departments to accompany their body-worn cameras. They did not feel comfortable with the current state of the reliability of the algorithms the technology was using. I mean, think about what that says. It's one thing for a government agency-

A. J. Kierstead (Host):

They could make so much money if they decided to do that. They're the provider of body cameras.

Buzz Scherr:

It's one thing for a government agency like the National Institute of Standards and Technology ... which is a very significant and well-respected government agency that tests every new technology under the sun, as an independent arbiter of the quality of the technology. They came out with a study saying the algorithms being used by most facial recognition technology providers are not yet reliable: too many false positives, too many false negatives, and bias against certain groups ... everybody, basically, but middle-aged non-bearded men.

Buzz Scherr:

But it's another thing for a company with an awesome business opportunity to decline that business opportunity. I mean, that's a really powerful statement about current concerns. So you pair that up with what's going on with Clearview AI and the attorney general's office, and it's a very interesting set of developments.

A. J. Kierstead (Host):

Because basically, if I had to guess, the best way for a company to collect the volume of data that would be required for an artificial intelligence to work would be this kind of shady scraping mechanism. I can't imagine there's enough photos otherwise ... I mean, Axon would probably be one of the closest. They've probably got millions of hours of video in their databases.

Buzz Scherr:

Yeah, but the photos you get from body cams and public surveillance cameras-

A. J. Kierstead (Host):

Yeah, a little Go-Pro-y.

Buzz Scherr:

... Generally, they're just not really good. You need high-quality photos. The best that are out there, such as they are ... and I don't mean to say they're awesome ... are driver's license photos. And New Hampshire already doesn't allow that.

A. J. Kierstead (Host):

Interesting. Now, speaking of New Hampshire, how is the state looking at this technology?

Buzz Scherr:

There is a facial recognition technology ban pending in the legislature. It has made it through one House committee with an 18-2 recommendation that the technology be banned, and there was a voice vote on the House floor, the House as a whole, to support that finding. And then, as process dictates, they sent it to another House committee, the House Criminal Justice Committee, and they're going to hold hearings on it.

Buzz Scherr:

So it's making its way through the legislature, on the front end, in a very positive way in support of a ban. It's interesting that, right now, concern about this technology is bipartisan. A voice vote means that nobody raised their voice to say no.

A. J. Kierstead (Host):

Yeah. So it is very apparent. I mean, this is early-days technology. A good comparison, just to think about how accurate this technology could be ... Tesla has invested millions and millions of dollars into using camera technology to let a car just follow the road, which primarily means following signs and lines on the road, and following maps.

Buzz Scherr:

How's that going?

A. J. Kierstead (Host):

They've barely been able to roll it out. It works on the highway, and they're just now beginning to change lanes on the highway. That's literally as far as it's gotten.

Buzz Scherr:

Yeah, it's complicated stuff.

Buzz Scherr:

It's really hard.

Buzz Scherr:

I testified in front of the House committee hearing on the facial recognition technology ban. And the way I characterized it is, it's just not-ready-for-primetime technology. And we are so in love with the newest technology that our impatience overcomes the not-ready-for-primetime piece too quickly.

A. J. Kierstead (Host):

Thanks for listening to the UNH Law Podcast. Be sure to comment on and subscribe to the show on Apple Podcasts, Google Play, or Spotify. See you next Thursday.

A. J. Kierstead (Host):

More than 94% of our law grads get jobs in the open market within 10 months of graduation. That's better than Harvard and Yale. Join the powerhouse, now accepting applications at law.unh.edu
