Surveillance And Local Police: How Technology Is Evolving Faster Than Regulation

Journalist Jon Fasman says local police departments are increasingly using powerful surveillance tools — with little oversight. (Alexandra Schuler / picture alliance via Getty Images)

Your local police department may know more about you than you think. Journalist Jon Fasman says local police are frequently able to access very powerful surveillance tools — including publicly accessible CCTV cameras, automatic license plate readers and cell phone tracking devices — with little oversight.

Fasman embedded with different police departments across the country to see how officers integrate technology into their day-to-day jobs. Newark, N.J., for instance, uses a program called Citizen Virtual Patrol, which allows at-home viewers to stream video from cameras placed around the city.

"It gives people an eye on the entire city," Fasman says. "I live about 50 miles north of Newark and I can log in at my desk and see the feed from any one of the 126 cameras that the new public safety department has placed around the city."

Police departments say such technologies help reduce crime. But in his new book, We See It All: Liberty and Justice in an Age of Perpetual Surveillance, Fasman explores the privacy issues related to these tactics — especially as regulation varies state by state.

"The rules are all over the place with a lot of this technology because it's so new, because it changes so quickly," Fasman says. "In general, even when there are regulations, there often aren't penalties for violating them or not strong enough penalties. ... I have not written an anti-technology book. I have written a pro-democracy, pro-regulation book."



Interview highlights

On Citizen Virtual Patrol, a network of publicly accessible cameras

The Citizen Virtual Patrol is a network composed of publicly owned cameras that people can access from a laptop. Now, the idea behind this was to sort of allow people to observe and perhaps testify to crimes from behind a veil of anonymity. ... Technically, the cameras don't show anything that an observer on the street couldn't see. They show public streets. It's not aimed at anyone's apartment. It's not looking inside anyone's window.

On the other hand, it does show people's yards, where they have a slightly higher expectation of privacy, and it could provide some information that people wouldn't want known. For instance, let's say I had an ex who lived in Newark and I was watching the camera trained on her house. If I saw her leave with some suitcases and then saw no activity at her house for a couple of days, I could surmise that she wasn't there. I could also know when she goes out, when she comes home, who comes to her house, all these things that, of course, I could find out if I observed her in front of her house, but that would make me visible. This renders me invisible and it lets anyone who logs in observe an enormous swath of the city. So that's another thing I think we need to think about when we're thinking of police technologies, and that is that any single instance may be unobjectionable. But when it comes to scale, you're talking about something very different.

On police use of automatic license plate readers, or ALPRs

These things are small cameras that attach to police cars that you really wouldn't see unless you were looking for them. They're sort of little flat cameras either on the front or on the roof of the police car, and what they do is, as they pass, they capture an image of each license plate and translate that image into plain letters and numbers. They log the geospatial data, that is, where the car was and what time it was observed, and it all goes into a database. And so, again, the issue with that is one of scale.

There is nothing illegal about the police noting the license plates of cars parked in public, where they were parked and when they were parked there. But if you looked down your street and saw a police officer writing down every license plate all day, every day, you might wonder why he was doing that. A police department, if it had to assign people to do that, might wonder whether it was worth the manpower to have someone just noting down license plates.
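As a rough illustration of the record-keeping described above, here is a minimal sketch in Python of the kind of entry an ALPR system might log for each plate it reads: the plate text produced by optical character recognition, the location of the patrol car, and a timestamp. The field names and structure are hypothetical stand-ins, not any vendor's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class PlateRead:
    """One ALPR observation: the plate text plus where and when it was seen.
    Hypothetical fields, not any real vendor's schema."""
    plate: str             # letters and numbers recovered from the image by OCR
    latitude: float        # geospatial data: where the car was observed
    longitude: float
    observed_at: datetime  # when the plate was captured

# At its simplest, the database Fasman mentions is just an ever-growing
# collection of these records, one per plate the camera passes.
database: list[PlateRead] = []
database.append(
    PlateRead(plate="ABC1234", latitude=40.7357, longitude=-74.1724,
              observed_at=datetime.now(timezone.utc))
)
```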

On the lack of regulation of ALPR data collection

I'm not suggesting in writing about the dangers of ALPRs that we eliminate them, because it does help find stolen cars. It does help find cars used in crimes. The issue is what about the 99.99 percent of cars that aren't involved in crimes? What do you do with their data?

States have wildly different laws on how long that data can be kept. In New Hampshire, for instance, if the car is not associated with any crime or is not being looked for, then you've got to delete it within three minutes. There are other states that set 24-hour limits, but there are a lot of states that set no limits at all and they just throw these pictures into a huge database. Often those databases are poorly secured. So in 2015 there was a journalist who just stumbled onto the Boston Police Department's entire database of ALPR data. It's that sort of thing, the collection at scale, the lack of regulations over how long that information is kept — and often the lack of security over how it's kept — that combine to make these sorts of technologies really worrying.
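The retention rules Fasman contrasts amount to a purge policy: reads not tied to any investigation must be deleted once they reach a certain age, and some states set no age at all. Here is a minimal, self-contained sketch of that policy; the three-minute and 24-hour windows come from the passage above, while the record layout and example plates are invented for illustration.

```python
from datetime import datetime, timedelta, timezone

# Each read is just (plate, time observed), a pared-down stand-in for a real record.
reads = [("ABC1234", datetime.now(timezone.utc) - timedelta(hours=5)),
         ("XYZ9876", datetime.now(timezone.utc) - timedelta(seconds=30))]

def purge_expired(reads, retention, flagged_plates):
    """Drop reads older than the retention window unless the plate is flagged
    (e.g. a stolen car or a car associated with a crime).
    retention=None models a state with no limit: nothing is ever deleted."""
    if retention is None:
        return reads
    cutoff = datetime.now(timezone.utc) - retention
    return [(plate, seen) for plate, seen in reads
            if plate in flagged_plates or seen >= cutoff]

# A New Hampshire-style rule: unflagged reads deleted within three minutes.
print(purge_expired(reads, timedelta(minutes=3), flagged_plates=set()))
# A 24-hour-limit state keeps both reads; a no-limit state keeps everything forever.
print(purge_expired(reads, timedelta(hours=24), flagged_plates=set()))
```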

On "ShotSpotter" technology, designed to detect gunshots and dispatch police

ShotSpotter is an acoustic sensor designed to detect the sound of gunshots. I saw these in place in Newark, N.J. They often look like little white diamonds or rectangles up on traffic light poles, and they're trained to recognize what ShotSpotter calls "loud, impulsive sounds" between 120 and 160 decibels. When it does hear such a sound, it sends an alert to the ShotSpotter headquarters, where a human listens to it and figures out, was that a gunshot? Was it a car backfiring? When I was at ShotSpotter's headquarters in California, there was an alert caused by a truck's Jake Brake, the engine brake that releases a tremendous amount of sound quite quickly. Once it hears a gunshot, it notifies the local police department. It tells the police department how many shots, where, when, and it essentially can dispatch officers to the scene of a suspected shooting.
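In outline, the detection step described above is a threshold filter followed by human review: sensors flag loud, impulsive sounds in roughly the 120-to-160-decibel range and forward them for a person to classify before police are dispatched. Here is a toy sketch of that pipeline; apart from the decibel range quoted above, the impulsiveness test, numbers, and function names are invented, and the real classifier is proprietary.

```python
def is_candidate_gunshot(peak_db: float, rise_time_ms: float) -> bool:
    """Flag 'loud, impulsive sounds': roughly 120-160 dB with a very fast onset.
    The 10 ms onset threshold is an invented stand-in for 'impulsive'."""
    return 120.0 <= peak_db <= 160.0 and rise_time_ms < 10.0

def send_for_human_review(peak_db: float, location: str) -> None:
    # In the real system a reviewer decides whether the sound was a gunshot,
    # a car backfiring, a truck's engine brake, etc., and only then is the
    # local police department told how many shots, where and when.
    print(f"Alert: {peak_db:.0f} dB impulsive sound near {location}; queued for review.")

def handle_sound(peak_db: float, rise_time_ms: float, location: str) -> None:
    if is_candidate_gunshot(peak_db, rise_time_ms):
        send_for_human_review(peak_db, location)

handle_sound(peak_db=135.0, rise_time_ms=4.0, location="traffic light pole 17")
```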

On suspicion in communities about ShotSpotter and the importance of how these technologies are rolled out

Here's what I think is interesting about ShotSpotter: As I was reporting this story ... a lot of people believed that this sort of technology was being used to overhear private conversations. Now, I think that is extremely unlikely. To my knowledge, there's never been a case that's been brought based on a conversation overheard by ShotSpotter. Every police department, everyone from ShotSpotter said it doesn't hear conversations. It's trained to recognize loud, sudden sounds. That's not how people talk.

But [this is] an instructive lesson in how deploying technology can improve or worsen relations between police and the communities they police. I think that in too many instances, police departments approve the purchase of ShotSpotter and deploy it without doing the work of going into the communities they police and saying, "Listen, this is what's going up on these traffic lights. Here is how it works. It doesn't hear conversation. Here's how we know it doesn't hear conversation. If you have any questions about it, please come and talk to me." Instead, those communities, which often have a long history of distrust of police built up over years for valid reasons, just see another piece of tech up there.


If you were a citizen of color who lived in a community that had a long history of distrust with police and all of a sudden the police said this hears gunshots, you might think to yourself, what else does it hear? ... Technology is not good or bad in itself, but police have to be very careful about how they roll it out, and have to go out of the way to gain the trust of the public, particularly in those communities that have a long history of distrust of the police.

On the "Stingray," or IMSI Catcher (international mobile subscriber information), that collects phone data, similar to a cell tower

A Stingray mimics a cell phone tower and it gets your phone to connect to it. And what happens then is that all of the metadata on your phone, that is, the non-voice-call data, can then be read, and that includes texts you might send, websites you might browse, who you called and how long you talked for, even without knowing the actual substance of the conversation ... [It] connects to the phone ... and it geolocates you. ...

Increasingly, it's deployed by court order, but that hasn't always been the case. And what happens is even when it's deployed by court order against a specific subject, the data from every other phone in that area is hoovered up. Now, again, this happens on a stakeout, too. If the police are staking out a suspect, they see all kinds of people walking past. The difference is they don't retain the data from all those people walking past. In the case of data hoovered up by Stingrays, that often does get kept for longer than it should. And again, this is an issue in which there's no question that Stingrays can help police catch serious criminals. But there just need to be some regulations over when they can be used and what happens to the data hoovered up incidentally during those stakeouts.
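The retention problem Fasman raises here can be sketched the same way: a court order names one target, but the device logs every phone in range, and the question is what happens to the bystanders' records. Below is a toy sketch; the record fields, identifiers, and the filtering rule are invented for illustration of the principle, not taken from any real system.

```python
from dataclasses import dataclass

@dataclass
class PhoneRecord:
    """Hypothetical stand-in for the metadata a cell-site simulator logs."""
    subscriber_id: str   # the identifier the device captured
    location: str        # where the phone was geolocated
    metadata: str        # e.g. numbers called, call durations, sites visited

def retain_only_target(captured: list[PhoneRecord], target_id: str) -> list[PhoneRecord]:
    """Keep the court-ordered target's records and discard bystanders' data,
    which is the rule Fasman argues often isn't followed in practice."""
    return [rec for rec in captured if rec.subscriber_id == target_id]

captured = [
    PhoneRecord("TARGET-001", "Main St", "called 555-0101 for 3 min"),
    PhoneRecord("BYSTANDER-042", "Main St", "browsed example.com"),
]
print(retain_only_target(captured, target_id="TARGET-001"))
```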

On the problem with predictive policing programs that use data to determine where to deploy officers

Predictive policing programs ... are programs that ingest an enormous amount of historical crime data and say, based on the data, based on past practice, these are the areas that we think are likely to be at elevated risk for crime today. So this is where you need to deploy your patrol officers. My concern about that is that historical crime data is not an objective record of all crimes committed in a city. It is a record of crimes that the police know about. And given the sort of historic pattern of overpolicing minority communities, overpolicing poor communities, these programs run the risk of essentially calcifying past racial biases into current practices.
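Fasman's worry is easiest to see as a feedback loop: the program scores areas by past recorded incidents, patrols go where the scores are highest, and heavier patrol produces more recorded incidents in those same areas, which raises their scores again. Here is a toy sketch of that loop; the starting counts and the "more patrol means more recorded incidents" assumption are invented to illustrate the mechanism, not drawn from any real system.

```python
from collections import Counter

# Hypothetical historical crime data: recorded incidents per area. These are
# crimes the police know about, not an objective record of all crimes committed.
recorded = Counter({"area_a": 40, "area_b": 38, "area_c": 12})

def predict_hotspots(history: Counter, k: int = 1) -> list[str]:
    """Rank areas by past recorded incidents and flag the top k for extra patrol."""
    return [area for area, _ in history.most_common(k)]

for day in range(5):
    hotspots = predict_hotspots(recorded)
    for area in hotspots:
        # Invented assumption: patrolled areas log a few extra incidents per day
        # simply because more officers are there to observe them.
        recorded[area] += 3
    print(f"day {day}: patrol {hotspots}, counts {dict(recorded)}")

# area_a pulls further ahead each day even if underlying crime is similar across
# areas, which is the risk of calcifying past bias into current practice.
```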

On justifying invasive technologies because they're effective


I want to make one point about efficacy as justification: There are a whole lot of things that would help police solve more crimes that are incompatible with living in a free society. The suspension of habeas corpus would probably help police solve more crimes. Keeping everyone under observation all the time would help police solve more crimes. Allowing detention without trial might help the police solve more crimes. But all of these things are incompatible with living in a free, open, liberal democracy.

So when we think about these technologies and what we are willing to accept, we shouldn't just think about whether they will help police solve more crimes, because almost all of them will — at least on the margins. The question is: Is it worth the cost to our privacy and liberty to implement this technology? And if so, what limits are we willing to set? What penalties do we want for failing to observe these limits? So it's really a question not just of whether the technology works, but is it worth the cost? And if it's not worth the cost, can we devise a way in which the police can have the tool that they want to solve crimes and we can be comforted that it won't be abused, it won't be used against us, it won't be used to surveil us?

Sam Briger and Thea Chaloner produced and edited the audio of this interview. Bridget Bentz, Molly Seavy-Nesper and Meghan Sullivan adapted it for the Web.

Copyright 2021 Fresh Air. To see more, visit Fresh Air.

Dave Davies is a guest host for NPR's Fresh Air with Terry Gross.