Cop’s AI-generated police report claims officer “turned into a frog”

Michael Gwilliam

An AI program built to write police reports went wildly off script after it claimed a Utah officer had transformed into a frog.

The bizarre error surfaced in December when the Heber City Police Department in Utah tested new artificial intelligence tools designed to cut down on paperwork.

Instead of delivering a clean incident summary, the software generated a report stating an officer had shape-shifted into an amphibian.

Police later discovered the system had accidentally pulled dialogue from a Disney film playing in the background during the body camera recording.

Disney film causes AI police report mix-up

“The AI report writing software picked up on the movie that was playing, which happened to be The Princess and the Frog,” Sgt. Keel told FOX 13 News.

“That’s when we learned the importance of correcting these AI-generated reports,” he added.

The department began trialing two separate pieces of software earlier in December: Draft One and Code Four.

Code Four was created by George Cheng and Dylan Nguyen, both 19, described as MIT dropouts. The program analyzes body cam audio and automatically produces full police reports.

Draft One, the second tool, was announced last year by police tech company Axon. That system also generates written reports directly from body camera recordings.

Draft One uses OpenAI’s GPT language models. Axon promotes the technology as a way to save officers hours of administrative work.

In the frog incident, Draft One was the tool that produced the faulty report.

The embarrassing mistake was uncovered during a mock traffic stop meant to demonstrate the tool’s capabilities.

According to FOX 13, the resulting AI document was filled with inaccuracies and required extensive corrections before it could be considered usable.

Keel said the tool still offers clear benefits, despite occasional glitches.

The Heber City sergeant claims the AI software now saves him six to eight hours of work every week.

“I’m not the most tech-savvy person, so it’s very user-friendly,” he told the outlet.

Keel reiterated that the weekly time savings are genuine, describing the program as simple to learn and easy to operate.

The incident highlights growing pains as law enforcement agencies rush to adopt AI for daily operations.

Police and security organizations have increasingly turned to AI for report drafting, facial recognition, and surveillance assistance. However, automation errors continue to raise concerns, and this isn’t the first time AI has caused problems for policing.

In 2025, AI security software flagged a student after misidentifying his bag of Doritos as a firearm.

A separate incident saw an innocent man arrested after an AI system incorrectly identified him as a banned casino patron, even reporting a 99% match.

As such, human oversight remains essential, something Keel made clear following the frog fiasco.
