Let's talk about killer robots

Looking for Thanksgiving table conversation that isn’t politics or professional sports? Okay, let’s talk about killer robots. It’s a concept that long ago leapt from the pages of science fiction into reality, depending on how loose a definition you use for “robot.” Armed military drones abandoned Asimov’s First Law of Robotics — “A robot may not injure a human being or, through inaction, allow a human being to come to harm” — decades ago.

The topic has resurfaced recently amid the growing prospect of killer robots in local law enforcement. Boston Dynamics, one of the era’s best-known robot makers, raised public policy alarms when it showed footage on our stage in 2019 of its Spot robot being used in Massachusetts State Police training exercises.

The robots were not armed; rather, the exercise was designed to determine how they could help keep officers out of harm’s way during a hostage or terrorist situation. But the prospect of deploying robots in scenarios where human lives are at immediate risk was enough to prompt an inquiry from the ACLU, which told TechCrunch:

We urgently need more transparency from government agencies, which need to be upfront with the public about their plans to test and deploy new technologies. We also need state regulations to protect civil liberties, civil rights, and racial justice in the age of artificial intelligence.

Meanwhile, last year the NYPD cut short a deal with Boston Dynamics following strong public backlash after images surfaced of Spot being deployed in response to a home invasion in the Bronx.

For its part, Boston Dynamics has been vocal in its opposition to the weaponization of its robots. Last month it signed an open letter, along with fellow industry leaders Agility, ANYbotics, Clearpath Robotics and Open Robotics, condemning the practice. It states:

We believe that adding weapons to robots that are remotely controlled or autonomous, widely available to the public, and capable of moving to previously inaccessible places where people live and work raises new risks of harm and serious ethical issues. Weaponized applications of these robots with new capabilities will also damage public trust in the technology in ways that will damage the enormous benefits they will bring to society.

The letter is believed to be, in part, a response to Ghost Robotics’ work with the US military. When images of one of its robotic dogs surfaced on Twitter sporting a rifle, the Philadelphia firm told TechCrunch it took an agnostic stance on how its military partners use the systems:

We don’t do the payloads. Will we promote and advertise any of these weapon systems? Probably not. This is a difficult one to answer. Since we sell to the military, we don’t know what they do with it. We will not dictate to our government customers how to use the robots.

We draw the line where they are sold. We only sell to US and allied governments. We don’t even sell our robots to corporate customers in competitive markets. We get a lot of inquiries about our robots in Russia and China. We do not ship there, even for our corporate customers.

Boston Dynamics and Ghost Robotics are currently embroiled in litigation involving several patents.

This week, via local news site Mission Local, a new killer robot concern surfaced – this time in San Francisco. The site notes that a policy proposal being reviewed by the city’s Board of Supervisors next week includes language about killer robots. The “Law Enforcement Equipment Policy” begins with an inventory of the robots currently in the possession of the San Francisco Police Department.

There are 17 of them in total, of which 12 are active. They are largely designed for bomb detection and disposal – which is to say, none are specifically designed to kill.

“The robots listed in this section must not be used outside of training and simulations, criminal arrests, critical incidents, exigent circumstances, warrant enforcement, or during suspicious device evaluations,” the policy notes. It then adds, even more disturbingly, “Robots will only be used as a lethal force option when the risk of loss of life to members of the public or officers is imminent and outweighs any other force option available to SFPD.”

Effectively, per that language, robots can be used to kill in order to potentially save the lives of officers or the public. Perhaps that seems innocuous enough in this context. At the very least, it appears to fall within the legal definition of “justified” deadly force. But new concerns arise in what appears to be a profound policy shift.

For starters, using a bomb disposal robot to kill a suspect is not without precedent. In July 2016, Dallas police did just that, in what was believed to be a first in US history. “We saw no other option but to use our bomb robot and place a device on its extension for it to detonate where the suspect was,” Police Chief David Brown said at the time.

Second, it’s easy to see how such a precedent could be invoked in a CYA scenario, should a robot be used this way intentionally or accidentally. Third, and perhaps most alarmingly, one can imagine the language applying to the future acquisition of a robotic system designed to do more than just detect and dispose of explosives.

Mission Local adds that the chair of the SF Board of Supervisors Rules Committee, Aaron Peskin, attempted to insert the more Asimov-friendly line, “Robots shall not be used as a use of force against any person.” The SFPD apparently struck out Peskin’s change, replacing it with the current language.

The renewed conversation around killer robots in California arises in part from Assembly Bill 481. Signed into law by Gov. Gavin Newsom in September of last year, the legislation is designed to make policing more transparent. That includes an inventory of military equipment used by law enforcement.

The 17 robots included in the San Francisco document are part of a longer list that also includes a Lenco BearCat armored vehicle, flash-bang grenades and 15 submachine guns.

Last month, Oakland police said they would not seek approval for armed remote-controlled robots. The department said in a statement:

The Oakland Police Department (OPD) is not adding armed remote vehicles to the department. OPD did take part in ad hoc committee discussions with the Oakland Police Commission and community members to explore all possible uses for the vehicle. However, after further discussions with the chief and the executive team, the department decided it no longer wanted to explore that particular option.

The statement followed a public backlash.

The toothpaste is already out of the tube with Asimov’s first law. The killer robots are here. As for the second law – “A robot must obey the orders given it by human beings” – that one is still mostly within our grasp. It’s up to society to determine how its robots behave.