Essential Liberty
The Bob Zadek Show


Automated License Plate Readers are a danger to civil liberties – and yet no one is talking about them


Read the full transcript below.


“We know where you are, and where you’ve been”

Bob Zadek: To paraphrase a scary old public service announcement I used to hear on television – remember television? – "It's 10 PM. Do you know where your children are?"

To modernize that quote, 21st-century police departments can boast, right now, “Whatever time it is, we know where you are." Unfortunately, that's probably the case.

Today's guest, Jonathan Hofer, is a research associate at the Independent Institute based in Oakland, California. He has studied privacy law, local governance, and the impact of emerging technologies on civil liberties.

Jonathan will introduce us today to a tool which adds teeth to that theoretical boast by police departments: "Whatever time it is, we know where you are." And perhaps I should add, "And where you've been."

Jonathan, welcome to the show today.

Jonathan Hofer: Well, thank you so much.

Bob Zadek: So, Jonathan, is it an exaggeration for me to frighten our audience into believing that local governments around the country always know where we are? And if that's true, how do they learn that?

Jonathan Hofer: We have a lot of reasons to be concerned. Post-Edward Snowden leaks, a lot of attention in the context of government mass surveillance has been on things like the CIA and NSA. But in reality, the most prolific tool of mass surveillance today is automated license plate readers. Even though they've only been widespread for a few years now, they're quickly exploding onto the scene in the United States, and if your town doesn't have one, it's soon coming to a place near you.


The Rise of Sophisticated Cameras on Every Street Corner

Bob Zadek: Automated license plate readers seem a little bit bland. So, somebody reads license plates; people read all kinds of stuff. What's wrong with reading a license plate, or is there more to the story than that?

Jonathan Hofer: Automated license plate readers (ALPRs) are specially designed high-speed cameras that have the ability to read the alphanumeric characters on a license plate. They read the numbers and letters using special software called optical character recognition (OCR). It's the same technology that allows you to keyword search a PDF that you might have scanned, or, if you've ever gone on Google Books, lets you search printed books. Same technology.

They were originally developed around the 1970s in the United Kingdom and then started popping up internationally. And for many decades, they were neither advanced nor practical. To the extent that they were used, it was only for things like regular traffic monitoring, or sometimes for electronic toll collection or parking enforcement.

But around the early 2000s, and especially into the 2010s, they quickly exploded with law enforcement in the US when police started surmising that this would be a really good crime-fighting tool.

They're generally stationed at either busy intersections, or over a busy roadway, or they could be mounted on a police cruiser or other city infrastructure, and when a car passes by the camera the license plate is detected from the image and a timestamp is applied to that scan. There's also basically a GPS coordinate. They log your latitude and longitude. And this poses a number of civil liberties concerns.
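To make that description concrete, here is a minimal sketch, in Python, of the kind of record a single ALPR scan produces: the OCR'd plate, a timestamp, and the camera's coordinates. The field names and values are hypothetical; real vendors will use their own schemas.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class PlateScan:
    """One ALPR detection, roughly as described above. All names hypothetical."""
    plate: str           # alphanumeric string produced by OCR
    timestamp: datetime  # when the plate passed the camera
    latitude: float      # camera position, effectively a GPS fix
    longitude: float
    camera_id: str       # which fixed or cruiser-mounted camera fired

# A single fabricated scan record:
scan = PlateScan(
    plate="7ABC123",
    timestamp=datetime(2022, 11, 25, 22, 14, tzinfo=timezone.utc),
    latitude=37.8044,    # roughly Oakland
    longitude=-122.2712,
    camera_id="cam-042",
)
print(scan.plate, scan.latitude, scan.longitude)
```

Each record is benign on its own; the civil liberties questions below arise from storing millions of them in one place.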

The Fourth Amendment Concern

Bob Zadek: To follow it sequentially, we first have a camera, which records the information you just described. So, now, we have a bit of data stored electronically, rather benign. You started to say how it gets used, but there's a step before that. When the data is organized, it's collected from many locations, so we can know where the car has been.

Is the data available at a website on Google? Tell us about who's holding the data and what the data might reveal about the car. Not about the driver – it's not about humans. Not yet at least. So, tell us that story.

Jonathan Hofer: Yeah, I'll actually say it's a bleak reality that it is the driver and humans, because usually these cameras try and take at least six pictures of each individual car, and that's per manufacturer's recommendation, just to make sure the license plate is legible. That can include the occupants of the car being photographed. When we're talking about this license plate data, we're not simply talking about the storage of the literal license plate number. What we're really talking about is a roadmap of where you've been, where you've traveled. And when you have a series of these cameras, you can learn a lot about people's lives.

Usually, when a camera scans a license plate, that scan is uploaded to a centralized database. Now, it varies from jurisdiction to jurisdiction who has access and who retains this data. There are usually three types of data storage arrangements. It could be that an individual department stores its own data, for example, maybe your local police department or the California Highway Patrol. It could be a private third party. For example, in Northern California, there's a company called Vigilant Solutions. They keep their own records and then they make them available to state and federal agencies.

Bob Zadek: For a fee?

Jonathan Hofer: Yeah, for a fee, of course. 

Bob Zadek: Don’t forget that part. 

Jonathan Hofer: Yeah. And then, there are also intelligence-sharing bodies. Usually, these are organized under the banner of the Department of Homeland Security. They popped up post-9/11 as counterterrorism centers. They're usually what are referred to as fusion centers, and what they do is aggregate data from multiple sources and then make it available to other agencies and other states, both federal and local.

The big concern in terms of just general civil liberties is that when I have your travel records-- and I should remind the audience that this isn't just randomly spotting license plates. In the case of Los Angeles, for example, they're retaining this data for upwards of five years. And if I have your travel data for five years, I really have an intimate picture of who you are. I know where you work. I know where you live. I know what bar you frequent. I could even know where your place of worship is. I've even made the point to other people: I'm sure the federal government would like a National Firearms Registry, but if we can infer that you have a firearm, because you're traveling to a gun show, or to the gun store, or a firing range, which is usually remote, that's good enough for them. And in fact, we have instances of the state police in Virginia doing just that.
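The kind of inference described here takes only a few lines of code. Below is a minimal sketch, with fabricated sample data and hypothetical names, of how "home" and "work" could be guessed from a pile of timestamped scans of one plate, simply by bucketing by time of day:

```python
from collections import Counter
from datetime import datetime

# Fabricated (timestamp, lat, lon) triples for a single plate.
scans = [
    (datetime(2022, 3, d, hour), 37.80, -122.27)  # late-night cluster
    for d in range(1, 11) for hour in (22, 23)
] + [
    (datetime(2022, 3, d, 9), 37.87, -122.26)     # weekday-morning cluster
    for d in range(1, 11)
]

def top_location(records, hours):
    """Most frequently scanned spot during the given hours of day."""
    counts = Counter((lat, lon) for ts, lat, lon in records if ts.hour in hours)
    return counts.most_common(1)[0][0]

# Where the car sits late at night is probably "home";
# where it sits on weekday mornings is probably "work".
home = top_location(scans, hours={21, 22, 23, 0, 1})
work = top_location(scans, hours={8, 9, 10})
print("likely home:", home)
print("likely work:", work)
```

The same counting trick, pointed at weekend scans near a house of worship or a firing range, yields exactly the inferences described above.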

So, it really creates a massive Fourth Amendment concern once you're collecting this data, having a large historical dataset that you can analyze for individual people. And especially, this isn't just for people who are accused or suspected of criminal wrongdoing. This is anyone who's just driving on the street.

Bob Zadek: One thing you mentioned a few seconds ago made me perk up. I perked up for all of it; I haven't been napping, Jonathan. But you said, in Northern California, I believe you pointed out, that the data collection was done by a private party. So, first of all, I imagine, since I'm a private party, I am allowed to have as my hobby standing on the freeway taking pictures of license plates. That's not against the law. I presume there's no law that regulates collection by as many private parties as want to collect the data, if they have the right to put their cameras on poles or whatever they need. We'll get to law enforcement in a moment. But in the first instance, we know that data is today a commodity. It is saleable; its value depends upon how it's packaged and how it's organized. But data per se, facts, are a commodity which is routinely bought and sold, starting from mailing lists and going on from there.

Now, we have a private party, which gets the right by leasing or easements, whatever it needs to get, to put a bunch of cameras all over the place and it collects the data. Now, that private party, are they or are they not themselves free to sell that data or are there privacy regulations that regulate you and I from buying a whole lot of cameras, putting them on poles, collecting the data, and then saying, "We got some data about who's driving on this freeway, and we'll sell it to you"? Is there any regulation or law governing that activity?

Jonathan Hofer: Yeah, that's a great point and it's a really interesting legal question. By and large, since the late 60s, case law in the United States regarding surveillance has largely focused on the issue of whether you have an expectation of privacy. That comes from the Katz v. United States decision in 1967. And one of the things we look at when asking whether you have an expectation of privacy is: is it in plain view, is it in the public square? Obviously, roads are public. When you put those together, you don't really have an expectation of privacy over your license plate. And in fact, the Supreme Court has explicitly addressed this topic, and they've explicitly said, "No, you don't have an expectation of privacy in your license plate. It's not illegal to photograph a person's license plate." You could set up your camera on the roadway, but it does start to broach another Fourth Amendment legal territory once you start aggregating the data.

Bob Zadek: Well, let's back up just a moment. The concept of a right to privacy describes the relationship between the individual and government. A right to privacy doesn't describe the relationship between private parties. Say you and I, who are bold and entrepreneurial, decide we have developed a business model. Put law enforcement aside; we are not law enforcement, we're not deputized. We're just two guys with time on our hands. We put up a bunch of cameras and we start to collect the data to build an inventory, which we hope to sell, because we hope somebody will want to buy pictures of license plates going down the freeway. If I stop there, and we take our information, and we make it available, we package it, it's on the internet, and for $19.95 you can buy this data, something like that. Anything wrong or illegal about that? Or does this not implicate the right to privacy you referred to a few moments ago?

Jonathan Hofer: You should actually be totally in the clear. There are no state or federal laws that would prevent you from doing that.

Bob Zadek: Okay. The aggregation, the collection and organization of data by a private party simply as a business activity, that's not the problem, per se. It's only when government invites itself to the party, if you will, and wants to take advantage of this data that we implicate the right to privacy you so accurately described a few moments ago.

Okay. So, now, back to our story, Jonathan. We now have the data being collected and aggregated. And there are organizations, governmental and perhaps private, that will take the data from one county, and then from the next county. And before you know it, you have the state. And then you can weave it together, and through the miracle of digital magic, you can isolate a license plate from all of these pots of data that you have collected. And now you know everything about a human being you would ever want to know.

Jonathan Hofer: Absolutely. 


Trouble in Oakland – Jonathan’s Brush with ALPRs

Bob Zadek: Okay. Now, we know what the technology is. We know what can be done. But something piqued your interest at the Independent Institute, which caused you to write a somewhat alarmist-- well, it was well tempered, because you're a calm guy. But it was, "Hey, people, there's something going on that you should be aware of." Sort of like Harold Hill singing about trouble in River City, but your issue is far less benign than a pool hall. So, tell us why this gets you agitated.

Jonathan Hofer: Yeah. My interest in license plate readers comes from a personal story. Back when I was an undergraduate at Cal, my brother and I were traveling back to Oakland in a rental car from Thanksgiving break. We're just a few miles away from home and we get the Contra Costa Sheriff's Department flashing its lights behind us, and we're like, "Okay, what are we doing? We're not speeding, we're following the traffic laws." And then, it gets even weirder when the deputy gets on his loudspeaker and says, "Exit the freeway." That's not normally what you would do for just a routine ticket.

We end up in what's basically a vacant shopping center parking lot. It's a cold November night. It's basically pitch black. Some time passes, and we notice that one of the officers gets out of his car, has his gun out, and does the thing where he says, "Now, take the keys out of the ignition. Put them on the roof. Put your hands outside the window." Some more time passes, and now we're completely surrounded. At that point, maybe six, seven, eight other officers are surrounding our car, in probably about four or five different vehicles. They all have their guns out. They tell us to exit the vehicle separately. When it's my turn to get out of the car, I hear an officer telling me, "Take so many steps backwards. Take so many steps to my left," and so forth.

Out of nowhere, I get tackled from behind, just like absolutely-- it's like a linebacker or something. This wasn't a friendly handcuffing. And in that process, one of the sheriffs just puts a gun to the back of my head as he's just yelling at me. It was totally incomprehensible. I followed their directions very explicitly. I was fully cooperative. I wasn't saying anything. And after that, they separated us out into two separate cars. 

About 45 minutes pass, and they say, "Hey, we're sorry, but we thought your car was stolen. One of the automated license plate readers that you passed had identified your car as stolen, and one of our deputies pulled up behind you and flagged your vehicle." What had happened was that the car we were traveling in had been reported stolen in San Jose a long time ago. It was recovered. The police even called up the owner of the vehicle, because it was a car-sharing rental. They said, "Hey, we got your stolen car," and the owner is like, "What are you talking about?"

Bob Zadek: [laughs] 

Jonathan Hofer: What had happened was, the San Jose PD never updated the stolen car registry. And so, the Contra Costa Sheriff, acting upon outdated information, turned our traffic stop into a high-risk felony stop, which is the usual police procedure for handling ALPR flags like this. My takeaway was that this could really happen to anyone. And I looked to see if this had happened to other people. Sure enough, it has happened all over the nation. Just a year or two prior to my case, a woman in San Francisco had settled with the San Francisco PD over a similar stop where they pulled her out at gunpoint. They even had guns on her after they handcuffed her, and they alleged she was a car thief. But her burgundy Lexus wasn't the gray GMC truck they were looking for.

Bob Zadek: [laughs] 

Jonathan Hofer: They didn't even check the make and model. But there's a real serious risk when they're relying on this inaccurate technology. And sometimes, it's just a failure on the camera's part, but sometimes, it's also a failure to accurately maintain the data. This is commonly pooled data. Many departments are contributing to these databases, and one error on the part of a single party really magnifies.
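The failure mode behind that stop can be sketched as a pooled "hot list" that one contributor forgets to update. Everything in this Python sketch is hypothetical and radically simplified; it only illustrates how a single missing update propagates to every department reading the shared list:

```python
# A shared "stolen vehicle" hotlist that multiple departments write to.
hotlist = set()

def report_stolen(plate):
    """Any contributing department can add a plate."""
    hotlist.add(plate)

def mark_recovered(plate):
    """...and is supposed to remove it once the car is recovered."""
    hotlist.discard(plate)

def alpr_hit(plate):
    """What every camera on the shared pool checks when a plate passes."""
    return plate in hotlist

# One department reports a rental stolen...
report_stolen("7ABC123")

# ...the car is recovered, but the registry is never updated:
# mark_recovered("7ABC123")   # <- the missing step

# Months later, a camera in a different county scans the same plate:
print(alpr_hit("7ABC123"))  # still flagged: stale data triggers a felony stop
```

Because every reader of the pool trusts the list, the error is not contained to the department that made it.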

Installing Safeguards

Bob Zadek: There's a risk of error. Now, if a listener to our discussion were less sensitive to violations of personal freedom, violations of a right to privacy, isn't a response: well, of course, in every aspect of life, mistakes are sometimes made? In fact, police sometimes (we hope not often) arrest the wrong person. They discover the error, and they do the best they can to apologize to the person they wrongly arrested. And that's not an argument that police shouldn't arrest people. That's an argument that police should be more careful, or that people who are victimized should be compensated. If they're compensated, then all the other taxpayers, who didn't have anything to do with the wrongful arrest, have to pay. Well, that's part of society. That can be absorbed by society.

Are you making the argument that there is something inherently evil, or at least anti-civil-liberties, about the technology and the process? Or are you making the argument that, okay, there is lots of good here: recovering a stolen car is good, finding somebody who's on the lam is good. That's good for all of us. We are safer. Therefore, the technology does a lot of good, so let's focus on the bad and mitigate it. So, what exactly is the headline of the message you wish the audience to have when they come into contact with, or read about, or are voting for funds for automated license plate readers?

Jonathan Hofer: Yeah, great point. I would say that I am optimistic that, given sufficient safeguards, ALPRs could be used to help law enforcement address crime. However, as they are used in California especially, and how they're used nationally, I do not believe their current use is justified. I believe, without safeguards, they are an inherent civil liberties risk, and I believe the benefit to law enforcement is minimal to nonexistent.

One of the things we have to keep in mind is that these tools are not demonstrably effective. It's not just the risk of pulling over the wrong person: studies have shown that they fail to deter things like car theft. They don't provide general crime deterrence.

Take the Piedmont PD. Piedmont is a small city surrounded by Oakland. I did a study of their cameras, because they kept basically the most complete data in the nation, as far as I'm aware. Their ALPR scans don't correlate with stolen vehicle recoveries. They don't correlate with investigative leads, either. Investigative leads could include things like identifying a suspect, locating a witness, or spotting a stolen vehicle. They literally don't help with that. So, I'm not saying it's time to completely write these off. But as of right now, we need to do it right or not do it at all. And right now, we're not doing it right.

The Misuse of Surveillance Technology

Bob Zadek: Now, the technology is most useful, if at all, in locating stolen vehicles, in finding criminals or suspected criminals who are escaping, and they're escaping by car. With widespread use of this technology, sooner or later, they will trip a camera, and the camera will help law enforcement locate them. Sounds like a good idea. But you have described a technology which goes well, well, well beyond finding the car. But we started our conversation by discussing what it tells us about the life of the user of the car.

Let's leave the safe zone of finding a stolen car. It kind of works. Okay, you'll find a fair amount of stolen cars, and you'll find a couple of nerdy guys in the East Bay who were done wrong and found themselves on the evening news. But okay, you are collateral damage, Jonathan. Welcome to the club. Tell us about the misuse, or the possible misuse, of that information, and what you suggest ought to be done about it. It's the weaving together, the mosaic. Perhaps you can explain the concept of the mosaic, because that works its way into your article and then into the conversation. So, tell us what else, as a byproduct, this technology provides to anybody who wants it, and what the dangers are.

Jonathan Hofer: Yeah, and I might characterize what we mentioned previously, on pulling over the wrong people and how it shows a picture of people's lives, this way: ALPRs are bad when they work incorrectly, and they're probably even worse when they work perfectly. That's what we're talking about: their role in mass surveillance. A couple of things to consider is how the data is stored. We touched on this briefly, but sometimes it's a common pool of information, and usually individual officers or departments will have access to the data. And you would think that given the sensitivity of this data, it would be tightly regulated, or controlled, or there would be a lot of safety features. Unfortunately, that's usually not the case.

What we found is that many of these departments don't have really good cybersecurity standards. Sometimes, they'll give access to the wrong people. In some cases, officers retire from the department, and they still have access to this database. Sometimes, you have nefarious bad actors, who are using the databases to either stalk ex-wives or ex-girlfriends. There's a semi-notable case in Washington, where other police officers were stalking this female police officer. They accessed her record something like 400 times. 

One of the things I think is worthwhile to mention is that California, for example, has sanctuary city laws. And a part of that is not handing over data to, let's say, Immigration and Customs Enforcement, which is a kind of Jeffersonian principle. I'm not taking a position on sanctuary cities. But there is something to be said for the people explicitly saying that their local government shouldn't be tasked with enforcing federal law. At the very least, it could be diverting valuable law enforcement resources. What data-sharing arrangements do is skirt those laws. Let's say you have loose privacy regulations in one jurisdiction. Well, you can use that jurisdiction's database access to get data from that place, or you can say, "Hey, ICE, here's access to all our license plates. We're not helping enforce immigration, but here you go."

And to your point about mosaic theory: we briefly talked about how taking a picture of someone's license plate is obviously not illegal. And I don't mean to say that that, in and of itself, is, per se, wrong. However, it does edge into unconstitutional territory when you start putting pieces together. This is what legal scholars call mosaic theory, and it says that even if collecting individual datapoints is legal, once you create a picture with those points, it becomes a Fourth Amendment search. And this chiefly stems from two semi-recent Supreme Court cases.

In 2012, there's United States v. Jones. Scalia wrote the opinion on this, and it was in the greater D.C. area. Police put a GPS tracker on this guy's car, saying that he's dealing drugs. The car travels out of their jurisdiction, the warrant's expired, and they're still tracking him. The Supreme Court unanimously said the government cannot attach a GPS tracker to a vehicle and monitor it like that. Okay, and then you flash forward a few years to 2018. Gorsuch is on the bench now, and there's this Carpenter v. United States case. The facts of the case revolved around a string of store robberies. In that case, the government used cell phone tower data to track the cell phones of the suspects. The Supreme Court says, "You can't do that. Not only can you not do that, you can't even retrace the steps of the suspects using a week or more of historical data without a warrant."

Now, when you combine those two cases, I think you have a serious argument that says, "License plate readers, as they are practiced in California and throughout the nation, are unconstitutional." ALPR location data is not technically GPS; it's not using a satellite. However, it is an effective substitute for GPS data. It's logging your latitude and longitude, just stored in a database rather than beamed from a satellite. And so, they are creating a picture of your lives, and they can use the historical data to track your whereabouts. Now, this has not been in front of the Supreme Court yet. In fact, there's really no ALPR case that's been before the Supreme Court, and nothing coming up that suggests the Supreme Court is going to rule on ALPR use generally.

However, lower courts in some states have endorsed this mosaic theory. The most recent one I'm aware of is Commonwealth v. McCarthy, from the Massachusetts Supreme Judicial Court. A suspect was spotted crossing a bridge where there is an ALPR. And there's another case in Texas, maybe at the Court of Appeals; I'm not sure which district it was. They also endorsed the theory. However, in both those cases, the courts ruled in favor of the state. Their reasoning was that with only one camera, and with only one scan in the Texas case, there wasn't sufficient data to form a mosaic.

However, that's not what's happening in California. Piedmont, the city I mentioned previously, is less than two square miles, and they have almost 40 cameras. In Los Angeles, there are 500 cameras. These litter the streets. They're scanning your car more than once. Some guy in San Leandro did a public records request on his car; it had been scanned something like a hundred times. My contention is that that does constitute a mosaic search and therefore would require a warrant.

What About Private Surveillance? A Thought Experiment

Bob Zadek: So, the danger is an unintended use, because the technology is not sold as a way to reveal, to anybody who cares, where you and I have been, or at least where our car has been. Presumably we were in it, but that's a different element of proof. At least, where our car has been over a period of time. Those who market the technology and those who advocate for its use are typically, as I appreciate it, law enforcement, who just say, "We want to catch escaping felons, and stolen cars, and things of that nature." And perhaps Amber Alerts, maybe. They don't sell it as, "We'll build the life, the hidden story of Jonathan Hofer," so that you have no secrets, Jonathan, at least not anymore. I guess that's a byproduct. That's what I said earlier: "Hey, we have the data. How else can we use it? And we are in law enforcement. So, let's look for law enforcement-type uses, like proving he's a drug dealer, as opposed to proving whether his car is stolen."

I think what's happened is, once you have the technology, that's information that's nowhere else available. So, "Hey, why not use it for a different purpose?" And then, as you mentioned several times with decisions of the Supreme Court, we have what we always have: the technology is ahead of the law, and the law is trying to catch up. "Wait a minute, how do we apply a statute that was drafted when this technology didn't exist? How do we apply it, or not?" So, that's what I think you are explaining.

Now, there's another question that occurred to me. And, Jonathan, I'm going to ask you a question on the air to be saved for all time, but I say to myself, "This guy is never going to know the answer to this question." So, I tell the audience, if you don't, it doesn't mean you're humiliated. But what occurred to me is, I was wondering, let's say there is a civil dispute. A plain old, boring lawsuit. And an element of proof that one side needs is the story which would be told by the mosaic theory, of where somebody else's car has been over a period of time, to prove a point in a civil case.

Now, if one of the parties in the civil case had a witness who could testify. Maybe, Jonathan, you had no life, and all you did was obsess about this car and follow it all over the place, which you're allowed to do. You weren't stalking; that's just your hobby. You follow this green car. And if you were to testify, you're allowed to testify. There are no privacy rights at all. Now, take the same need to prove, and the litigant contacts the private company that maintains these cameras and says, "Hey, we understand you've got these cameras all around town. Can we pay you? We want to run a license plate search and know where this car has been for the last four weeks." There's nothing wrong with that. That doesn't offend you, or does it?

Jonathan Hofer: Well, that's a great point. That has actually come up in a court case, interestingly enough. The argument was-- and this was from a company that aggregates data and leases it, or rents out access, to police. They said, "We have a First Amendment right to this data." And they asked the question, "Since license plates are public, what's the difference between us and a newspaper?" Because a newspaper uses its eyeballs, observes the facts, records them, and publishes them. What's the difference? I think that's a very important argument to address. Unfortunately, there's no court case where this has butted up against mosaic theory. And I would hold that private people and companies do have a right to store this data. However, it does become concerning once the government is using it for what is effectively mass surveillance. So, yeah, that's a great point.

Bob Zadek: So, say none of the data was collected by government; it was all collected by private parties, and maybe each county franchised that out to one license-plate-reader collection company. Now we have the data lawfully being collected, because you told me that's allowed for private parties. And now it's like cell phone records collected by Verizon: the police need a warrant to get that cell phone data from Verizon, and they get the warrant if they can satisfy a judge that they're entitled to it. Under that system, putting aside the possibility of abuse (there's always abuse), is your concern materially satisfied if we simply said law enforcement could never be the collector?

Jonathan Hofer: I don't necessarily have a problem with law enforcement collecting their own scans. And I would say that requiring law enforcement to get a warrant to obtain the data from third parties would be eons ahead of where we are now. I think most of my concerns would actually be resolved. And honestly, I don't have a problem with private companies-- like toll collection, or things like parking enforcement and private parking garages; I'm totally okay with that. I don't think the government has a legitimate interest in keeping records on people not suspected of criminal wrongdoing. And so, while I don't criticize private individuals or companies collecting license plate records, I do have a problem with what the government would do with those.

Bob Zadek: Is there any-- because clearly, there is the possibility of abuse, and clearly there are examples, you being one of them. And I didn't realize I'd found this goldmine of a perfect guest for my show. I had no idea: you not only talk the talk, you walk the walk, having found yourself lying face down on the pavement that dark, foggy night somewhere in the East Bay. How lucky we all are to have you on the show. Would your concerns be mitigated, maybe mitigated substantially, if there were a system that "adequately" compensated the victims?

And now, you have the cost of the system when governments decide whether to buy it. The cost of the system would be the total of what the government pays for the hardware and the maintenance, plus the error factor and the cost of those errors, so that the victims of mistakes get compensated, as best they can be. How do you compensate someone for lying face down on the pavement when he just wants to be home under the sheets? We can figure out a way to do that; that's for somebody else to do. And if you compensate the victims, and the cost is borne by the source of the mistake, as a libertarian, or libertarian-ish (I will presume libertarian), are you okay now? Is the downside covered, so that now we only get the upside, and society as a whole is not penalized?

Jonathan Hofer: Yeah, in that case, we really want to make sure that the benefits of the technology outweigh all possible costs. So, not just lawsuits, but also any maintenance costs and other errors it causes. I don't want to fleece the taxpayers with the burden of law enforcement incorrectly pulling over people. However, I do think it's prudent that cities that operate ALPRs carve out some remedy for the people they've wronged. San Francisco has done this in their license plate reader policy. They've carved out a private right of action. They say, "If our surveillance equipment injured you, you have the right to sue us if we don't make it better." I think that is important to do. Maybe a broader thing-- that also brings up the issue of, do police have malpractice insurance? Do they have a stockpile of [unintelligible 00:46:12] so they can pay out lawsuits? That brings up a lot of other issues-- [crosstalk] 

Bob Zadek: We are getting into qualified immunity, aren't we?

Jonathan Hofer: Yeah. 

The Related Challenges of Facial Recognition

Bob Zadek: Okay. Regretfully-- I would love to, but that's a whole other show, a show that I have done and will do more of. Now, as we start to run out of time, one last thought, Jonathan, briefly. This conversation could just as well be about facial recognition.

Jonathan Hofer: Absolutely. 

Bob Zadek: We have CCTV cameras all over the place; the UK has a gazillion of them. Not only do we know where cars are, we know where your face has been, and presumably you're attached to it. Therefore, we know where you are. How much of what we have discussed today applies with equal force to facial recognition, and is one more sinister than the other? We only have about a minute or two, Jonathan.

Jonathan Hofer: Yeah, I'll be quick. Yes, the issues are very similar. In fact, they're used similarly, tactically, by police. They have the same data collection problems, the same tracing-someone's-steps problems. But I think the bigger concern is actually license plate readers, just based on their numbers. Facial recognition systems are few and far between. And a lot of cities have outright banned facial recognition-- Seattle, Portland, San Francisco, a few others on the East Coast. Most cities have not banned license plate readers. They're used everywhere.

Bob Zadek: Jonathan, on the subject of facial recognition, I can't figure out which is harder-- changing your face so the cameras don't get you, or changing your license plate. But putting that issue aside, what is the future? We are now, of course, where we are with facial recognition and with license plate readers. What's the next step in that technology, just so nobody goes to bed tonight feeling private?

Jonathan Hofer: Yeah, one of my big fears is that the United States will quickly adopt a lot of the tactics and strategies that the CCP uses to control its Uyghur population. The Chinese government has used a lot of surveillance tools, most notably facial recognition. They have large databases on people's purchases and internet search history. The United States already has internet search on lockdown with the NSA. And few people realize that the United States is one of the largest importers of Chinese surveillance technology in the world. 

However, I think we also have reason to be optimistic. This story is not over, and there's a lot that people can do. A lot of measures have passed in the Bay Area and elsewhere that have successfully curbed government encroachment on people's privacy. And I think that could be replicated elsewhere.

Bob Zadek: The optimistic point of view is this: I don't think we're going to stop the technology. Therefore, the issue is not whether the government has, or even uses, technology to collect. The issue is the use, and it's easier to control the use. If you misuse it, you go to jail or you pay a fine. That's the way to control the use. So, rather than try to stifle or control the technology-- that's not going to happen-- let's just control the use and limit what the government is allowed to do with its technology. I find myself not caring all that much if somebody in Tulsa, Oklahoma, who will never meet me, knows that I went to have doughnuts yesterday, as long as he doesn't tell my wife. I don't care if she knows that I had doughnuts. But if that information is misused in a public capacity, that's what gets me concerned. While controlling the use is difficult, there's reason to be optimistic; there's no prospect of controlling the technology. So, Jonathan, if we both can agree on that, we leave the show with neither one of us accused of being a Luddite. We can walk proudly among our libertarian friends and not be ashamed.

Tell us about the Independent Institute, what they are, what they do, and how folks can follow your work and the work of the institute.

Jonathan Hofer: Yeah, the Independent Institute is a public policy think tank located in Oakland, California. Our website is and our mission is to advance free people and free markets.

Bob Zadek: We've been talking to Jonathan Hofer. Jonathan is a research associate at the Independent Institute, located in Oakland, California. They are elite. They have a huge family of scholars and experts who publish regularly under the Independent Institute banner, as does Jonathan. Jonathan, do you run an active Twitter feed?

Jonathan Hofer: Yeah, the best way to follow my work is searching on my profile at I have my own author profile. You can read commentary, blog posts, things of that nature.

Bob Zadek: That's Jonathan Hofer, H-O-F-E-R. Jonathan, thank you so much. I'm sorry about your loss when you were lying face down in Oakland. If I went to that parking lot, would I see a chalk outline of your body lying there? 

Jonathan Hofer: Yeah. [crosstalk] 

Bob Zadek: Did they leave that as a monument?

Jonathan Hofer: [laughs] I hope they did. 

Bob Zadek: Jonathan, thank you so much for your time and to all my friends out there. Thank you for giving us an hour of your most valuable time. We hope you have found it worthwhile. So long for now.

Bob talks about the issues that affect our lives on a daily basis from a purely libertarian standpoint. He believes in small government, fewer taxes, and greater personal freedom.

America has lost its way, but it cannot and does not need to be reinvented. Our founders were correct about their approach to government, as were John Locke, Adam Smith, and the other great political philosophers who influenced them. The country's first principles are economic and social freedom, republicanism, the rule of law, and liberty. Bob believes we must take the best of our founding principles and work from them, because a country without principles is just a landmass.