Saturday, April 20th, 2024


Here's what happens when police pull over a driverless car

By Chico Harlan, The Washington Post

Published Oct. 25, 2018

As self-driving cars become increasingly common on American streets, an obvious question arises:

What happens when police want to pull over a robot-driven vehicle that has no human backup driver?

In its recently updated "Emergency Response Guide," Alphabet's Waymo - which has hundreds of autonomous Chrysler Pacifica minivans on the road in Phoenix - provides a protocol that may offer a glimpse of what's to come.

"The Waymo vehicle uses its sensors to identify police or emergency vehicles by detecting their appearance, their sirens, and their emergency lights," the guide states. "If a Waymo fully self-driving vehicle detects that a police or emergency vehicle is behind it and flashing its lights, the Waymo vehicle is designed to pull over and stop when it finds a safe place to do so."

Once it has come to a stop, a Waymo vehicle can unlock its doors and roll down its windows, allowing someone from the company's support team to communicate with law enforcement, according to the guide. If there are passengers in the vehicle, the guide states, Waymo's "rider support specialists" can communicate with them via speakers, displays and "in-vehicle telecommunications."

If necessary, a Waymo employee may even be dispatched to the scene in person. The company says employees may also be sent to crash scenes to interact with police and passengers.

"The Waymo vehicle is capable of detecting that it was involved in a collision," the guide states, noting that if a vehicle's air bag is deployed, its self-driving capability is disabled. "The vehicle will then brake until it reaches a full stop and immediately notify Waymo's Fleet Response specialists."

Waymo has been testing its fleet of autonomous Chrysler Pacifica minivans in Phoenix for years, but the vehicles have been ferrying the public around portions of town without a backup driver for nearly a year. The company, which has a 600-vehicle fleet in Phoenix, says its testing is "picking up speed" and recently announced plans to order thousands more Pacificas as it expands into other cities.

Hoping to avoid a patchwork of varying state laws, companies like Waymo have been pushing for a single set of federal rules that would help them expand nationally, unleashing tens of thousands of self-driving vehicles onto public roads.

"Some areas, like Connecticut and the District of Columbia, ban autonomous vehicles without a human in the driver's seat. Others, like Michigan and Washington, allow it only if certain conditions are met," Bloomberg reported in January.

"California, home to many self-driving and other car technology companies, such as Lyft Inc. and Uber Technologies Inc., also requires a human behind a steering wheel, a California Department of Motor Vehicles spokesperson told Bloomberg Law."

With both the number of autonomous vehicles and the rules governing them expanding, U.S. transportation regulators are already debating whether police should have the ability to disable driverless cars during emergencies, according to Reuters.

Even if the answer is yes, regulators acknowledge, a host of other consequential questions must be answered. In meetings over the summer, Reuters reported, many of the experts present argued that the same tools police might use to control self-driving cars could be exploited by hackers and terrorists.

Many regulators, the article notes, "agreed that it is a question of when, not if, there is a massive cyber security attack targeting" driverless vehicles. They added that "planning exercises are needed to prepare for and mitigate a large-scale, potentially multimodal cyber security attack."
