

This robot chose to injure the man who built it. Here's why its inventor is pleased


By Ben Guarino, The Washington Post

Published June 22, 2016


Stick a finger beneath Alexander Reben's robot, and it might jab you. It won't be much of a wound. Reben built the robot to inflict the least amount of pain that still, technically, counts as harm.

But there will be blood. If the robot so chooses, you will come away with a tiny puncture in your index finger. It has claimed some half-dozen victims so far.

Or, maybe, the robot will leave you alone.

No one can say for sure what will happen, not even its creator. "I view it as a piece of tangible philosophy," Reben, a roboticist and artist based in Berkeley, Calif., told The Washington Post by phone early Friday morning. Reben's devices mix art and technology, often whimsically, like his solar-powered music box that plays "You Are My Sunshine."

This new machine, however, was not designed for whimsy. It's the first-ever robot to "autonomously and intentionally" break the First Law of Robotics, Reben says on his website. The law, one of a trio of famous science fiction principles created by author Isaac Asimov, declares that robots must not allow harm to befall humans.

As it appeared in Asimov's 1942 short story "Runaround," the First Law states: "A robot may not injure a human being, or, through inaction, allow a human being to come to harm." (The Second Law: "A robot must obey orders given it by human beings except where such orders would conflict with the First Law." And the Third: "A robot must protect its own existence as long as such protection does not conflict with the First or Second Law." In fiction and academic literature, the laws have a way of not working out.)

Despite undermining a 74-year-old sci-fi foundation, Reben's robot is a simple machine. Reben previously used the metal arm for an automated head-scratcher, inspired by the pleasure orbs of the Woody Allen film "Sleeper." People sat in a chair and Reben's arm, tipped with a wire brush, would massage their scalps. This, the roboticist said, created a disconcerting sense of intimacy between machine and human.

"The robot would make people shiver," he said. "They start feeling really weird about it."

Reben rebuilt the robot arm to ever-so-slightly bring the pain. The retrofit took a few days and a few hundred dollars, the BBC reports. The robot arm is a small machine, its base no larger than a piece of printer paper. A sensor, similar to a laptop track pad, detects when someone places a finger beneath the arm. If the robot decides to strike, it does so in a quick downward swoop.

The prick is just powerful enough to slice open a small hole in the skin. But that's threatening enough, Reben told The Post, to psych people out. "It's hard not to get sweaty and nervous," he said.

Crucially, Reben argues the robot makes a decision to harm, unlike any robot designed before. "I thought about military robots," he said, "and they don't fulfill all the tick marks." In the case of a Predator drone and other unmanned military aircraft, human operators decide to open fire. And radar-controlled sentry cannons, developed for the Navy, shoot at flying intruders that fit a pre-programmed description. In Reben's view, such systems do not choose to fire, in the same way land mines do not choose to explode when stepped on.

A poke from the Asimov lawbreaker robot, on the other hand, is the result of a decision. A finger on the sensor triggers a set of software processes, which arrive at a prick-or-not outcome. "The decision to hurt a person," Reben said, "happens in a way that I can't predict." The software does not use machine learning or artificial intelligence to decide, but neither is it as simple as a 50:50 coin flip. When asked what the likelihood of being stabbed was, Reben said, "I don't know the probability."
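
The article never reveals how Reben's software actually reaches its prick-or-not decision, only that it is neither machine learning nor a plain coin flip, and that the outcome surprises even him. Purely as an illustration of how such behavior could be built, the sketch below seeds a decision from sensor noise and clock jitter and runs it through a chaotic map, so any single outcome is effectively unpredictable while the long-run rate is not an even split. The sensor function, the map, and the threshold are all hypothetical, not Reben's design.

```python
import time
import random


def read_finger_pressure() -> float:
    """Hypothetical stand-in for the trackpad-like sensor.

    The real hardware interface is not described in the article, so this
    simulates a noisy analog reading between 0.0 and 1.0.
    """
    return random.random()


def decide_to_prick(pressure: float) -> bool:
    """Illustrative decision: not machine learning, not a 50:50 coin flip.

    The choice is driven by a logistic map in its chaotic regime, seeded
    from the sensor reading mixed with microsecond-level clock jitter, so
    even the person who wrote it cannot anticipate a given outcome.
    This is an assumption for illustration only.
    """
    # Seed the iteration with sensor noise plus timing jitter.
    x = (pressure + (time.time_ns() % 1_000_000) / 1_000_000) % 1.0
    x = min(max(x, 1e-6), 1 - 1e-6)  # keep the map away from fixed points

    # Iterate the logistic map with r = 3.99 (chaotic regime).
    for _ in range(50):
        x = 3.99 * x * (1.0 - x)

    # An off-center threshold makes the long-run rate something other
    # than 50:50, but each individual trial is effectively unpredictable.
    return x > 0.6


if __name__ == "__main__":
    if decide_to_prick(read_finger_pressure()):
        print("Arm swoops down: prick.")
    else:
        print("Arm stays still: spared.")
```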

Reben would like to avoid spreading hysteria (at MIT, he studied how humans and robots can work in concert) and hepatitis B (he uses sterile needles). Instead, he wants the robot to be provocative.

"It's clear that I programmed this, and easy for people to say I'm responsible," he said. But if a robotic system built by many people and many corporations caused harm, "where does that accountability lie?"

The first robot designed to violate a set of hypothetical laws is no longer hypothetical, Reben points out. It's a real machine that demands gut reactions - and, every so often, the need for a Band-Aid.
