Police Say AI That Hallucinates Is Ready to Write Police Reports That Could Put People in Jail

“The open question is how reliance on AI-generative suspicion will distort the foundation of a legal system dependent on the humble police report.”

Experts are raising concerns about the adoption of AI-written police reports by US police departments.

The police technology company Axon revealed the AI tool, dubbed “Draft One,” back in April. According to Axon, which also makes tasers and other weapons, the program leverages OpenAI’s GPT-4 large language model to generate police reports from police body camera audio. It has been touted as a productivity enhancer that can reduce the amount of time officers spend on paperwork.

“If an officer spends half their day reporting, and we can cut that in half,” Rick Smith, the CEO of Axon, told Forbes at the time, “we have an opportunity to potentially free up 25 percent of an officer’s time to be back out policing.”

Police reports, however, are far more sensitive than ordinary paperwork, and generative AI is a technology prone to what professionals in the field refer to as “hallucination”: a catch-all term for common failures like made-up facts and other inaccurate information frequently found in synthetic text.

Nevertheless, American police departments are beginning to test the waters with Draft One in states like Colorado, Indiana, and Oklahoma. Some departments have even allowed officers to use the program for any type of case, rather than only small-scale incident reports. Naturally, experts are concerned about the repercussions. Given the fundamental role that police reports play in both legal and investigative procedures, is it prudent, or even ethical, to outsource them to AI?

Andrew Ferguson, a professor of law at American University who wrote the first law review article on AI-generated police reports, told the AP that he was worried automation and ease of use would make police officers less meticulous in their writing.

Axon has defended the effectiveness of its drafting tool, telling the AP that its AI product manager, Noah Spitzer-Williams, has “access to more knobs and dials” than the “average ChatGPT user would have.” He went on to suggest that Draft One’s ability to “embellish or hallucinate” the way ChatGPT does is limited, since Axon has disabled features like GPT-4’s “creativity dial,” among other things.

Despite the knobs and dials, this type of automation still raises serious ethical and legal concerns. Humans are fallible too, and they undoubtedly carry their own biases.

But human judgment lies at the core of effective policing, and when you begin to erode that with automated technologies, it’s important to have a public conversation about what may be lost in the process.

That includes people’s lives, many of which have already been adversely affected by law enforcement investigations that relied on unfinished AI technology.

“The open question is how reliance on AI-generative suspicion will distort the foundation of a legal system dependent on the humble police report,” Ferguson wrote in his review, which was published a month ago.
