Police Department Investigating Real Crimes Using AI-Powered Detective Bots
According to Sky News, a British police department is testing an AI-powered technology that could help solve cold cases by compressing decades of detective effort into a few hours.
However, no information is currently available about the accuracy rate of the Australian-developed platform, called Soze, which raises serious concerns because AI models can produce wildly inaccurate findings or fabricate information outright.
Soze is being tested by the Avon and Somerset Police, which serves parts of South West England. The program scans and examines material such as financial records, videos, social media accounts, and emails.
According to Sky, the AI scanned the evidence from 27 “complex” cases in around 30 hours, the equivalent of 81 years of human labor. It is easy to see why the police are interested: such figures amount to a force multiplier on steroids, which appeals to law enforcement agencies operating under tight budgetary and staffing constraints.
In an interview with Sky, Gavin Stephens, the UK’s National Police Chiefs’ Council Chair, said, “You might have a cold case review that just looks impossible because of the amount of material there and feed it into a system like this which can just ingest it, then give you an assessment of it.”
“I understand that to be really, really helpful.”
Minority Report
Another AI project Stephens described was compiling a database of knives and swords that numerous suspects have allegedly used to attack, maim, or kill victims in the United Kingdom.
Although Stephens is optimistic about rolling out these AI tools soon, it would be wise to make sure they work as intended first.
Artificial intelligence is notoriously prone to errors and false positives, perhaps most consequentially in law enforcement. The whole endeavor recalls the 2002 Steven Spielberg film “Minority Report,” based on Philip K. Dick’s novella of the same name.
The US Commission on Civil Rights recently condemned the use of AI in policing due to these alarming mistakes.
There is a misconception that because machines are performing the analysis, their accuracy and dependability will be unmatched. But because these systems are built on data gathered by people, who may be biased or simply incorrect, they are inevitably flawed from the start.