EFF Investigation: AI Product for Police Reports is Designed to Hinder Audits | Electronic Frontier Foundation
EFF investigation into Axon's Draft One AI tool for police reports reveals it lacks audit trails, making it impossible to distinguish AI-generated content from officer-written content and raising critical concerns about accountability and transparency in AI deployment within the criminal justice system.
Metadata
Importance: 62/100 · press release · news
Summary
The EFF investigated Axon Enterprise's Draft One, an AI tool that generates police report narratives from body-worn camera audio. The investigation found the product lacks meaningful audit features—AI-generated drafts are not saved, leaving no record of what was written by AI versus an officer. This makes it impossible for judges, defense attorneys, or the public to assess AI influence on police reports.
Key Points
- Draft One does not save AI-generated drafts or edited versions; text disappears once the window closes, leaving no audit trail.
- No mechanism exists to distinguish AI-written content from officer-written content in final police reports.
- The lack of transparency makes it impossible to detect biased language, inaccuracies, or misinterpretations introduced by the AI.
- Axon's existing relationships with thousands of police agencies accelerate deployment of Draft One with minimal oversight.
- EFF argues police should not use AI to write reports until fundamental accountability and transparency questions are resolved.
1 FactBase fact citing this source
| Entity | Property | Value | As Of |
|---|---|---|---|
| Electronic Frontier Foundation (EFF) | investigation | Investigated Axon's 'Draft One' AI-written police report product, designed to hinder audits | 2025 |
Cached Content Preview
HTTP 200 · Fetched Apr 7, 2026 · 9 KB
Axon Enterprise’s Draft One Allows No Real Transparency and Accountability
Press Release · July 10, 2025
SAN FRANCISCO – Axon Enterprise's Draft One product, which uses generative artificial intelligence to write police report narratives based on body-worn camera audio, seems designed to stymie any attempts at auditing, transparency, and accountability, an Electronic Frontier Foundation (EFF) investigation has found.
The investigation – based on public records obtained from dozens of police agencies already using Draft One, Axon user manuals, and other materials – found the product offers meager oversight features. As a result, when a police report includes biased language, inaccuracies, misinterpretations, or lies, there is no record showing whether the culprit was the officer or the AI. This makes it extremely difficult, if not impossible, to assess how the system affects justice outcomes over time.
“Police should not be using AI to write police reports,” said EFF Senior Policy Analyst Matthew Guariglia. “There are just too many questions left unanswered about how AI would translate the audio of situations, whether police will actually edit those drafts, and whether the public will ever be able to tell what was written by a person and what was written by a computer. This is before we even get to the question of how these reports might lead to problems in an already unfair and
... (truncated, 9 KB total)
Resource ID: kb-b32c8765b3705645