This story was originally published by the WND News Center.
An organization that defends civil liberties in the digital world is reporting on a new scheme being developed at a national lab: the creation of a digital police officer.
The concept “reads like a pitch for the most dystopian buddy cop movie ever,” explains the report by Dave Maass at the Electronic Frontier Foundation.
The work on a “D-PO,” which now is being presented as a “visionary concept,” is going on at the Pacific Northwest National Laboratory, which is run by Battelle on behalf of the U.S. Department of Energy.
Researchers are working on “forecasting a future where police and border agents are assisted by artificial intelligence, not as a software tool but as an autonomous partner capable of taking the steering wheel during pursuits and scouring social media to target people for closer investigation,” the report said.
EFF uncovered the work through a review of publicly available materials and Freedom of Information Act requests.
“We need to design computing systems that are not simply tools we use, but teammates that we work alongside,” the project explains at one point.
“For years, civil liberties groups have warned about the threats emerging from increased reliance by law enforcement on automated technologies, such as face recognition and ‘predictive policing’ systems. In recent years, we’ve also called attention to the problems inherent in autonomous police robots, such as the pickle-shaped Knightscope security patrol robots and the quadrupedal ‘dog’ robots that U.S. Department of Homeland Security wants to deploy along the U.S.-Mexico border,” the foundation explained.
But it said the newest iteration “goes so much further.”
The idea is that AI learns “from the human and its environment,” and then uses that knowledge “to help guide the team without requiring specific instructions from the human.”
In its scenario, PNNL explains, the two “officers” get an alert of a robbery in progress, and immediately drones are tapped, face recognition is used, self-driving tech is incorporated, and algorithmic prediction is brought into play.
“While Officer Miller drives to the site of the robbery, D-PO monitors camera footage from an autonomous police drone circling the scene of the crime. Next, D-PO uses its deep learning image recognition to detect an individual matching the suspect’s description. D-PO reports to Officer Miller that it has a high-confidence match and requests to take over driving so the officer can study the video footage. The officer accepts the request, and D-PO shares the video footage of the possible suspect on the patrol car’s display. D-PO has highlighted the features on the video and explains the features that led to its high-confidence rating,” EFF’s report explained.
Then there’s a discussion between Miller and the digital officer about how to apprehend the suspect.
“The authors leave the reader to conclude what happens next. If you buy into the fantasy, you might imagine this narrative ending in a perfect apprehension, where no one is hurt and everyone receives a medal–even the digital teammate. But for those who examine the intersection of policing and technology, there are a wide number of tragic endings, from the mistaken identity that gets an innocent person pulled into the criminal justice system to a preventable police shooting–one that ends in zero accountability, because Officer Miller is able to blame an un-punishable algorithm for making a faulty recommendation,” EFF’s report said.
The organization reported that the tech apparently is a “long way off,” but noted that one city police department already has expressed interest in the capabilities.
The report also notes that the tech is being pitched as an option for U.S. Customs and Border Protection (CBP) officers.
“CBP is infamous for investing in experimental technologies in the name of border security, from surveillance blimps to autonomous surveillance towers. In the PNNL scenario, the Border Inspections Teammate System (BITS) would be a self-directed artificial intelligence that communicates with checkpoint inspectors via augmented reality (AR) headset,” the report said.
EFF then warned of the problems of adopting unproven tech, which is “often based on miraculous but implausible narratives promoted by tech developers and marketers, without contemplating the damage they might cause.”
“Society would be better served if the PNNL team used their collective imagination to explore the dangers of new policing technologies so we can avoid the pitfalls, not jetpack right into them,” the report noted.