OKLAHOMA CITY (news agencies) — A body camera captured every word and bark uttered as police Sgt. Matt Gilmore and his K-9, Gunner, searched for a group of suspects for nearly an hour.
Normally, the Oklahoma City police sergeant would grab his laptop and spend another 30 to 45 minutes writing up a report about the search. But this time he had artificial intelligence write the first draft.
Pulling from all the sounds and radio chatter picked up by the microphone attached to Gilmore’s body camera, the AI tool churned out a report in eight seconds.
“It was a better report than I could have ever written, and it was 100% accurate. It flowed better,” Gilmore said. It even documented a fact he didn’t remember hearing — another officer’s mention of the color of the car the suspects ran from.
Oklahoma City’s police department is one of a handful to experiment with AI chatbots to produce the first drafts of incident reports. Police officers who’ve tried it are enthusiastic about the time-saving technology, while some prosecutors, police watchdogs and legal scholars have concerns about how it could alter a fundamental document in the criminal justice system, one that plays a role in who gets prosecuted or imprisoned.
Built with the same technology that powers ChatGPT and sold by Axon, the company best known for developing the Taser and as the dominant U.S. supplier of body cameras, the tool could become what Gilmore describes as another “game changer” for police work.
“They become police officers because they want to do police work, and spending half their day doing data entry is just a tedious part of the job that they hate,” said Axon’s founder and CEO Rick Smith, describing the new AI product — called Draft One — as having the “most positive reaction” of any product the company has introduced.
“Now, there’s certainly concerns,” Smith added. In particular, he said district attorneys prosecuting a criminal case want to be sure that police officers — not solely an AI chatbot — are responsible for authoring their reports because they may have to testify in court about what they witnessed.
“They never want to get an officer on the stand who says, well, ‘The AI wrote that, I didn’t,’” Smith said.
AI technology is not new to police agencies, which have adopted algorithmic tools to read license plates, recognize suspects’ faces, detect gunshot sounds and predict where crimes might occur. Many of those applications have come with privacy and civil rights concerns and attempts by legislators to set safeguards. But the introduction of AI-generated police reports is so new that there are few, if any, guardrails guiding their use.