Surveillance Has Wings Now
December 1, 2025
In America, our surveillance state rarely announces its next evolution with a press conference or an official announcement. Instead, it creeps forward in procurement filings and vague "requests for information" that seem innocuous until you read the fine print. The FBI's recent request for AI-enhanced surveillance drones equipped with facial recognition, weapons detection, and mass-tracking capabilities signals one of those silent but seismic shifts. It is a blueprint for a dystopian future in which aerial monitoring, even of peaceful, constitutionally protected activity, becomes unaccountable.
AI-powered surveillance doesn't just expand state power; it erases the boundary between targeted investigation and mass monitoring. As Matthew Guariglia of the Electronic Frontier Foundation told The Intercept, these systems are "built to do indiscriminate mass surveillance of all people." And while the FBI's request invokes the language of "public safety," history tells a different story: tools built for terrorism and crime prevention inevitably end up turned on protesters and marginalized communities.
The U.S. government has repeatedly used aerial technologies to monitor political speech, often in defiance of constitutional norms. During the George Floyd protests of 2020, DHS deployed Predator-class drones over Minneapolis, later expanding operations to at least 15 cities. The U.S. Marshals Service likewise used drones to record demonstrators in Washington, D.C. What the FBI now seeks is not merely more drones, but drones driven by AI: systems able to identify faces, scan license plates, chart movement patterns, and detect "anomalous behavior" without human oversight. Real-time facial recognition deployed at demonstrations not only invites government abuse; the mere perception of being watched chills participation and undermines the freedoms of association and assembly. The FBI's request sits atop a decade of precedent in which every new surveillance tool ends up pointed at those demanding justice rather than those committing violence.
While the Bureau pitches weapon-detection AI as a safety tool, the technology's current reality falls far short of reliability. School AI weapons-detection systems, for example, have repeatedly misidentified everyday objects as firearms and failed to detect actual guns in monitored environments. These systems don't just "make mistakes"; they generate false alerts that escalate police encounters, creating even greater danger in communities where armed responses are already the norm. "No company has yet proven that AI firearm detection is a viable technology," Guariglia told The Intercept. A misread shadow could trigger a police response with lethal consequences.
Modern surveillance doesn't expand through sweeping new laws passed in the open. It grows through quiet "mission creep," in which agencies justify new tools under narrow circumstances and then gradually broaden their use. Documents obtained by the ACLU show that local police departments' "public safety drone programs" evolve into wide-ranging aerial monitoring systems deployed at parades, community gatherings, and peaceful demonstrations. New York City offers a cautionary example: the NYPD's drone flights increased 3,200% between 2019 and 2024, yet faced minimal oversight and were repeatedly used for purposes unrelated to emergencies. The FBI's proposed AI-powered drones would accelerate this dynamic dramatically, because these systems do not merely watch; they interpret. And their interpretations, shaped by algorithmic bias, disproportionately misidentify Black, Brown, and immigrant faces.
The FBI's pursuit of AI-enabled surveillance drones is not a technical upgrade; it is an authoritarian escalation. It represents a transition from human-directed monitoring to automated dystopia, where a flying camera powered by machine learning becomes the first arbiter of who is dangerous, who is suspicious, and who deserves police attention.
— Omar Dahabra
In Partnership with Capitol Commentary
About the Author
Capitol Commentary Founder & Editor
Omar Dahabra is the founder and chief editor of Capitol Commentary, a political platform focused on bringing independent political analysis to both domestic and global affairs.