Automated license plate reader (ALPR) systems have become a routine sight on roads across the United States. They can perform valuable functions, including parking enforcement, speed trap verification, stolen vehicle identification, Amber Alert responses, and more.
ALPR systems have always concerned privacy activists, but a new generation of camera systems marketed by companies such as Flock Safety is causing particular alarm. Unlike traditional license plate readers, such as those from Motorola Solutions, these new systems are both AI-driven and connected to a centralized database that local and federal law enforcement agencies can access with little oversight.
Instead of police searching for license plates linked to specific crimes, these AI-driven systems can examine millions of license plates to analyze the behavior of every car caught on camera. With this information, they can identify “suspicious” travel patterns, such as driving patterns that may be associated with drug trafficking activity.
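To make concrete how simple such flagging can be, here is a minimal, purely hypothetical sketch in Python. The vendors’ actual scoring models are proprietary, so everything below (the function name, its parameters, and the frequency rule itself) is an assumption used only for illustration:

```python
from collections import defaultdict
from datetime import timedelta

def flag_suspicious(sightings, corridor_cameras,
                    window=timedelta(days=7), threshold=4):
    """Flag plates seen repeatedly on a watched corridor within a time window.

    Hypothetical illustration only: `sightings` is a list of
    (plate, camera_id, timestamp) tuples as an ALPR camera might emit them,
    and this crude frequency rule stands in for whatever heuristics a real
    vendor system applies.
    """
    passes = defaultdict(list)
    for plate, camera_id, ts in sightings:
        if camera_id in corridor_cameras:
            passes[plate].append(ts)

    flagged = set()
    for plate, stamps in passes.items():
        stamps.sort()
        # Sliding window: flag if `threshold` corridor passes fall within `window`.
        for i in range(len(stamps) - threshold + 1):
            if stamps[i + threshold - 1] - stamps[i] <= window:
                flagged.add(plate)
                break
    return flagged
```

A rule this crude would flag commuters and delivery drivers just as readily as traffickers, which is exactly why treating such output as grounds for a stop is so dangerous.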
People are routinely being identified as suspects and subjected to vehicle stops based on uncorroborated evidence that US constitutional law experts say fails to meet the “probable cause” standard required by the Fourth Amendment. These centralized and increasingly ubiquitous AI-driven ALPR systems constitute a mass dragnet surveillance system that poses a direct threat to the privacy and freedom of everyone in the US.
A nationwide surveillance network
Flock Safety alone deploys over 40,000 ALPR camera systems (many of them likely installed illegally), and its network is used by more than 5,000 law enforcement agencies in over 4,000 cities across 49 states.
There are fewer than 300 million registered vehicles in the US, but Flock processes 20+ billion license plate scans per month, an average of more than 65 scans per vehicle, creating a detailed record of nearly every driver’s movements. These scans are sent to a centralized database, where law enforcement agencies from across the country can view drivers’ license plate numbers, locations, directions of travel, and times of recording without any need for a warrant. Between December 2024 and October 2025, 3,900 agencies logged some 12 billion searches through the Flock network.
But it’s not just about direct targeted searches. Those billions of untargeted license plate scans are analyzed by sophisticated AI software and combined with other details about a vehicle, such as its make, model, and color (but not facial scans), to create unique “fingerprints” that can accurately track your car’s journeys. This means anyone who cares to look can easily gain detailed insight into your daily activities.
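As a hedged sketch of the principle (Flock’s actual pipeline is proprietary, and the attribute list, hashing step, and trip-grouping logic below are illustrative assumptions), a fingerprint might be derived from a plate read plus visually detected attributes, then used to stitch camera sightings into journeys:

```python
import hashlib
from itertools import groupby

def fingerprint(plate, make, body_type, color):
    """Derive a stable key from a plate read plus visual attributes (illustrative)."""
    raw = f"{plate}|{make}|{body_type}|{color}".lower()
    return hashlib.sha256(raw.encode()).hexdigest()[:16]

def reconstruct_journeys(sightings, gap_minutes=30):
    """Group timestamped sightings of the same fingerprint into journeys.

    `sightings` is a list of (fingerprint, camera_id, timestamp) tuples;
    sightings separated by less than `gap_minutes` count as one trip.
    """
    ordered = sorted(sightings, key=lambda s: (s[0], s[2]))
    journeys = []
    for fp, group in groupby(ordered, key=lambda s: s[0]):
        trip, prev_ts = [], None
        for _, camera_id, ts in group:
            # A long gap between sightings starts a new journey.
            if prev_ts is not None and (ts - prev_ts).total_seconds() > gap_minutes * 60:
                journeys.append((fp, trip))
                trip = []
            trip.append((camera_id, ts))
            prev_ts = ts
        journeys.append((fp, trip))
    return journeys
```

In a real system the matching is presumably fuzzier than an exact hash, but the effect the article describes is the same: once sightings share a key, anyone with database access can replay a car’s movements.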
And if the AI algorithm deems your vehicle’s activity to be suspicious in any way, law enforcement agencies are increasingly likely to use that information to stop and search your vehicle, often citing spurious evidence to bypass the Fourth Amendment’s prohibition against unreasonable searches and seizures.
No oversight or accountability
Flock and similar companies act as private contractors for local law enforcement agencies, which have much looser transparency and legal requirements than federal authorities do. As private customers, local law enforcement officers are actively encouraged to collaborate with other local enforcement customers as a “smarter way” to combat crime.
This cozy atmosphere of casual collaboration, which Flock fully encourages, also makes it easy for authorities to bypass local privacy regulations and severely reduces the accountability of officers who flout them.
Officers also routinely evade legal restrictions on Flock searches by deliberately misframing the reason for a search. For example, a sheriff’s office in Texas searched data from more than 83,000 Flock cameras to track down a woman suspected of self-managing an abortion (a suspicion that turned out to be wrong). To justify the search, officers claimed they were looking for “a missing person” and that “it was about her safety”.
Local authorities know how sensitive this license plate reader data can be, as amply illustrated by a recent court case in which the city of Washington fought tooth and nail, unsuccessfully, to prevent public access to Flock camera images.
Evidence also shows that local police forces are more than happy to informally share this information with federal authorities such as the FBI and ICE, allowing these agencies to sidestep their legal and constitutional obligations.
Evidence of ALPR AI profiling
Although police forces across the country deny using such AI-generated ALPR tip-offs to initiate vehicle stops, there is mounting evidence that this is becoming routine procedure:
Federal Border Patrol predictive program flagged vehicles that were stopped and searched
An AP investigation reported that a “predictive intelligence” program run by the United States Border Patrol resulted in people being stopped, searched, and in some cases arrested.
Local police departments receive ALPR alerts and make vehicle stops
CBS News documented multiple cases where ALPR camera matches wrongly triggered police stops due to misread characters or faulty database matches, resulting in innocent motorists being pulled over at gunpoint.
Flock’s ALPR logs show searches tied to immigration and protests
Analyses of Flock Safety search logs show dozens of searches connected to immigration enforcement and to protests and activist activity.
Investigators are using AI to obtain automated information
A US Department of Justice (DOJ) report describes how AI is used in law enforcement surveillance, cautioning against discrimination in automated systems. It specifically mentions ALPR as an increasingly common way to obtain automated information used in investigations.
A new wake-up call for privacy
Like all new technologies, AI is a double-edged sword. Its potential for solving many of humanity’s most intractable problems is huge, but so too is its potential for harm. Describing the telescreen that allowed Big Brother to access the most intimate spaces of every household in his seminal novel 1984, George Orwell wrote:
“How often, or on what system, the Thought Police plugged in on any individual wire was guesswork. It was even conceivable that they watched everybody all the time. But at any rate they could plug in your wire whenever they wanted to. You had to live — did live, from habit that became instinct — in the assumption that every sound you made was overheard, and, except in darkness, every movement scrutinized.”
The combination of AI and ubiquitous, always-on surveillance systems that can track our every movement far exceeds even Orwell’s most dystopian visions. Recent news that Flock is partnering with Ring, the world’s most popular smart doorbell manufacturer, demonstrates just how ubiquitous and invasive this form of tracking already is.
Proton was founded in 2014 in response to Edward Snowden’s revelations about the extent and reach of the dragnet programs run by the US and its Five Eyes partners to spy on just about everyone, including US citizens. But rather than prompting a wake-up call and an open debate about public consent for mass surveillance, governments around the world (very much including that of the US) have since doubled down on the practice.
The fact that invasive AI-backed ALPR systems such as Flock’s are privately owned is no barrier to government agencies abusing them. On the contrary, the arm’s-length relationship enables those agencies to bypass the legal and constitutional safeguards specifically intended to protect ordinary citizens from government overreach.
