Automated License-Plate Reader (ALPR) systems have become a routine sight on roads across the United States. These systems perform valuable functions, including parking enforcement, speed-trap verification, identifying stolen vehicles, and responding to Amber Alerts.

ALPR systems have always concerned privacy activists, but a new generation marketed by companies such as Flock Safety is causing particular alarm. Unlike traditional license-plate readers, such as those from Motorola Solutions, these new systems are both AI-driven and connected to a centralized database that local and federal law enforcement agencies can access with little oversight.

Instead of police searching for license plates linked to specific crimes, these AI-driven systems can examine millions of license plates to analyze the behavior of every car caught on camera. With this information, they can identify “suspicious” travel patterns, such as driving patterns that may be associated with drug trafficking activity.
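To make the concern concrete, here is a minimal, purely illustrative Python sketch of how such behavioral flagging can work. Flock’s actual models are proprietary and far more sophisticated; the rule, thresholds, and names below are invented for illustration only:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class PlateRead:
    plate: str       # plate string as read by the camera
    camera_id: str   # which camera produced the read
    seen_at: datetime

def flag_suspicious(reads: list[PlateRead],
                    max_hours: float = 6.0,
                    min_cameras: int = 5) -> set[str]:
    """Flag plates seen at many different cameras within a short window
    (an invented stand-in for a 'suspicious travel pattern' rule)."""
    by_plate: dict[str, list[PlateRead]] = {}
    for r in reads:
        by_plate.setdefault(r.plate, []).append(r)

    flagged: set[str] = set()
    for plate, hits in by_plate.items():
        hits.sort(key=lambda r: r.seen_at)
        hours = (hits[-1].seen_at - hits[0].seen_at).total_seconds() / 3600
        cameras = {r.camera_id for r in hits}
        if hours <= max_hours and len(cameras) >= min_cameras:
            flagged.add(plate)
    return flagged
```

Nothing in a rule like this distinguishes a drug courier from a delivery driver or a commuter; the “suspicion” is purely statistical.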

People are routinely identified as suspects and subjected to vehicle stops based on uncorroborated evidence that US constitutional law experts say fails to meet the “probable cause” standard required by the Fourth Amendment. These centralized and increasingly ubiquitous AI-driven ALPR systems constitute a mass dragnet-surveillance system that poses a direct threat to the privacy and freedom of everyone in the US.

A nationwide surveillance network

Flock Safety alone deploys over 40,000 ALPR systems (many likely illegally), and its network is used by 5,000+ law enforcement agencies in over 4,000 cities across 49 states.

There are fewer than 300 million registered vehicles in the US, but Flock processes 20+ billion license plate scans per month, creating a detailed location-tracking record. These scans are sent to a centralized database, where law enforcement agencies from across the country can view drivers’ license plate numbers, locations, directions of travel, and times of recording, all without a warrant. Between December 2024 and October 2025, 3,900 agencies logged some 12 billion searches through the Flock network.
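The structural problem is that once every scan lands in one queryable table, a driver’s entire movement history is a single lookup away. A minimal sketch of what such a record and search might look like (the field names here are assumptions, not Flock’s actual schema):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Scan:
    plate: str
    lat: float
    lon: float
    heading: str       # direction of travel, e.g. "NB"
    seen_at: datetime
    camera_owner: str  # agency or private customer operating the camera

def movement_history(db: list[Scan], plate: str) -> list[Scan]:
    # Any participating agency can reconstruct a vehicle's movements
    # with a simple filter -- no warrant, no judge, no probable cause.
    return sorted((s for s in db if s.plate == plate),
                  key=lambda s: s.seen_at)
```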

But it’s not just about direct, targeted searches. Those billions of untargeted license plate scans are analyzed by sophisticated AI software and combined with other details about a vehicle, such as its make and model (but not facial scans), to create unique “fingerprints” that can accurately track your car’s journeys. This means anyone who cares to look can easily gain detailed insight into your daily activities.
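As a toy illustration of the “fingerprint” idea, the sketch below combines attributes a camera can extract into a single key that links sightings across the network. Flock’s real feature set and matching are proprietary and presumably rely on learned embeddings with fuzzy matching rather than an exact hash; the attributes chosen here are assumptions:

```python
import hashlib
from dataclasses import dataclass

@dataclass(frozen=True)
class VehicleAttributes:
    plate: str      # may be partial or misread
    make: str
    body_type: str  # e.g. "sedan", "pickup"
    color: str

def fingerprint(v: VehicleAttributes) -> str:
    # Deterministic key from visible attributes. Because every camera
    # computes the same key for the same vehicle, journeys can be
    # stitched together without anyone searching for a specific plate.
    raw = "|".join([v.plate, v.make, v.body_type, v.color]).lower()
    return hashlib.sha256(raw.encode()).hexdigest()[:16]

# Two cameras that extract the same attributes produce the same key:
cam_a = VehicleAttributes("ABC1234", "Toyota", "sedan", "gray")
cam_b = VehicleAttributes("ABC1234", "Toyota", "sedan", "gray")
assert fingerprint(cam_a) == fingerprint(cam_b)
```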

And if the AI algorithm deems your vehicular activity suspicious in any way, law enforcement agencies are increasingly likely to use that information to stop and search your vehicle, often citing spurious evidence to bypass the Fourth Amendment’s prohibition against unreasonable searches and seizures.

No oversight or accountability

Flock (and similar companies) act as private contractors for local law enforcement agencies, which face much looser transparency and legal requirements than federal authorities do. As private customers, local law enforcement officers are actively encouraged to collaborate with other local law enforcement customers as a “smarter way” to combat crime.

This cozy atmosphere of casual collaboration (which Flock fully encourages) also makes it easy for authorities to bypass local privacy regulations and severely reduces the accountability of officers who flout those regulations.

Officers also routinely evade legal restrictions on Flock searches by deliberately misrepresenting the reasons for a search. For example, a sheriff’s office in Texas searched data from more than 83,000 Flock cameras to track down a woman they (wrongly) suspected of self-managing an abortion. To justify this search, they claimed they were searching for “a missing person” and that “it was about her safety”.

That local authorities know how sensitive this ALPR data can be is amply illustrated by a recent court case in which a city in Washington State fought (unsuccessfully) tooth and nail to prevent public access to Flock camera images.

Evidence also shows that local police forces are more than happy to informally share this information with federal authorities such as the FBI and ICE, allowing these agencies to sidestep their legal and constitutional obligations.

Evidence of ALPR AI profiling

Although police forces across the country deny using such AI-generated ALPR tip-offs to initiate vehicle stops, there is mounting evidence that doing so is becoming routine procedure, with officers often citing spurious evidence to bypass the Fourth Amendment’s “probable cause” requirement:

A federal border-enforcement predictive program flagged vehicles that were then stopped and searched.

An AP investigation reported that a United States Border Patrol “predictive intelligence” program resulted in people being stopped, searched, and in some cases arrested.

Local police departments receive ALPR alerts and make vehicle stops.

CBS News documented multiple cases where misread characters or bad database matches produced false ALPR alerts, leading officers to wrongfully stop innocent motorists, in some cases at gunpoint.

Flock’s ALPR logs show searches tied to immigration and protests.

Analyses of Flock Safety search logs show dozens of searches connected to immigration enforcement and to protests and activist activity.

Investigators are using AI to gather information automatically.

A US Department of Justice (DOJ) report and policy review describes how AI is used in law-enforcement surveillance, cautioning against discrimination in automated systems. The report specifically mentions ALPR as an increasingly common way to obtain automated information used in investigations.

A new wake-up call for privacy

Like all new technologies, AI is a double-edged sword. Its potential for solving many of humanity’s most intractable problems is huge, but so too is its potential for harm. Describing the telescreen that gave Big Brother access to the most intimate spaces of every household, George Orwell wrote in his seminal novel 1984:

“How often, or on what system, the Thought Police plugged in on any individual wire was guesswork. It was even conceivable that they watched everybody all the time. But at any rate they could plug in your wire whenever they wanted to. You had to live — did live, from habit that became instinct — in the assumption that every sound you made was overheard, and, except in darkness, every movement scrutinized.”

The combination of AI and ubiquitous, always-on surveillance systems that can track our every movement far exceeds even Orwell’s most dystopian visions. Recent news that Flock is partnering with Ring — the world’s most popular smart doorbell manufacturer — demonstrates just how ubiquitous and invasive this form of tracking already is.

Proton was founded in 2013 as a response to Edward Snowden’s revelations about the extent and reach of the US and its Five Eyes partners’ dragnet program to spy on just about everyone (including US citizens). But rather than prompting an open debate about public consent for mass surveillance, those revelations have been followed by governments around the world (very much including that of the US) doubling down on the practice.

The fact that invasive AI-backed ALPR systems such as Flock’s are privately owned is no barrier to government agencies abusing them. On the contrary, the arm’s-length relationship enables those agencies to bypass the legal and constitutional safeguards specifically intended to protect ordinary citizens from government overreach.