Users of robot taxis in San Francisco targeted with a new form of harassment

SAN FRANCISCO - Stephanie, a tech worker in San Francisco, was drawn to the robot taxis that Alphabet’s Waymo operates in the city because she felt more secure without a human driver. The vehicles provide hundreds of thousands of driverless rides each month, city data shows.

“It’s so much safer, especially for a woman,” she said in a phone interview this month. “You’re not getting in the car with some strange man.”

But when a self-driving ride turned scary last month, she found herself wishing there had been a stranger behind the wheel after all.

Stephanie recalled riding home with her sister in one of Waymo’s driverless Jaguar SUVs around 10:30 on a Saturday night when a car carrying several young men began following them. The car tailed the robotaxi closely, its occupants honking and yelling, “Hey, ladies - you guys are hot.”

If she or another human had been driving, it would have been easy to reroute the car to avoid leading the pursuers to her home. But she was scared and didn’t know how to change the robot’s path. She called 911, but a dispatcher said they couldn’t send a police car to a moving vehicle, Stephanie recalled.

“All the safety things then become unsafe if someone is following you,” she said of the incident, which Waymo confirmed and has not been previously reported.

As Waymo expands its service in San Francisco, Los Angeles, Phoenix and Austin, some passengers like Stephanie have found that traveling by robotaxi can make riders into sitting ducks for a new form of public harassment.

The Washington Post spoke with four Waymo passengers, three of them women, who said they experienced harassment or what felt like threats to their safety from people who followed, obstructed or attempted to enter a driverless vehicle they were riding in.

Some had reported the incidents to the police or to 911, and all spoke on the condition that their last names not be used out of concern for their personal safety.

All the riders said their experiences with Waymo had generally been positive but that the company should improve how it responds to threats to riders’ personal safety.

“We take these events very seriously and understand how upsetting they can be,” Waymo spokesman Ethan Teicher said in an email. He said incidents of harassment or attempts to enter one of the company’s vehicles during a ride are “extremely rare.”

Support agents stay on the line with riders who call in such incidents and work with law enforcement as appropriate, Teicher said. Passengers can tell a vehicle to pull over or change its next stop or destination using the Waymo app, or ask a support agent to make similar changes. But support staff cannot redirect a vehicle’s specific route, he said.

Elliot, a tech worker in San Francisco, recalled in a phone interview a “scary” situation during a Waymo ride late one night in October. A pedestrian tried to enter the driverless vehicle as it waited at a red light.

“Go away,” Elliot yelled as the man knocked on the window and briefly flashed what looked like a knife, video of the incident viewed by The Post showed. “What are you doing?” Elliot asked, before the man walked away.

Elliot said the encounter unfolded so quickly that he didn’t have time to call Waymo support or the police as it happened, but he recorded video of the interaction and filed a police report a few days later.

In the moment, Elliot said, he wished someone could have “slammed on the gas and gotten away from this guy,” adding that Waymo should change how its vehicles respond in such situations. The San Francisco Police Department in an email confirmed the incident and said it is investigating.

Waymo’s sensor-studded vehicles have become a constant presence on San Francisco streets. Tourists gawk at them and transit riders are confronted by bus shelter ads from the company proclaiming, “Welcome to the future.”

But while Alphabet’s fleet has become the first large-scale deployment of commercial driverless service, San Francisco’s journey as a robotaxi test bed has at times been bumpy. Autonomous vehicles have obstructed other traffic, including firefighters at emergency scenes. A grisly collision with a pedestrian initially struck by a human-driven vehicle forced General Motors’ Cruise to end its commercial service in the city in 2023.

The harassment incidents are the latest example of how it has proven easier to program robotaxis to observe road signs and speed limits than to handle the nuanced human aspects of driving on public roads.

Michael Brooks, executive director of the Center for Auto Safety, a nonprofit consumer advocacy group, said in an email that the vehicles “are way behind when it comes to interpreting human behavior and responding appropriately.”

San Francisco residents have also learned that self-driving vehicles can be manipulated. During a recent test ride through Golden Gate Park in San Francisco, a handful of laughing cyclists held up a Waymo vehicle carrying a Washington Post reporter by weaving in and out of the lane in front of it.

“We see a lot of prank-type behavior,” Brooks said, but some incidents have been more violent. Waymo vehicles have been defaced with graffiti and in February, one was set alight on a crowded street in Chinatown.

Missy Cummings, an engineering professor at George Mason University who directs the university’s Autonomy and Robotics Center, said it’s to be expected that some people will take advantage of situations where no driver is present.

She also theorizes that self-driving cars can trigger Americans’ anxiety about being replaced by technology. “People are just tired of and fearful of tech that seems to be coming at them from all sides in a negative way,” Cummings said.

Whatever the causes, those who have experienced harassment or other incidents during a robotaxi ride say operators like Waymo need to provide better support for passengers during and after such moments.

Madelline, a 25-year-old restaurant server in San Francisco, said that during a recent Waymo ride at around 2 a.m., the driverless vehicle had to stop after two drivers ahead began yelling at each other and throwing things out of their cars in what appeared to be a road rage dispute.

The two cars blocked an intersection and one person got out of one of the vehicles.

“I was definitely panicking a little bit,” Madelline said, as her car waited for the road to clear instead of turning off as a human driver might. In her distress she called her sister for support and didn’t consider Waymo’s support line.

She would like to have more control over a robotaxi’s route but still prefers Waymo rides to using Uber or Lyft, whose drivers sometimes make her uncomfortable. “Being able to be alone in a car feels a lot safer,” Madelline said.

The night that Stephanie was followed and catcalled during a driverless ride, she said the other car gave up the chase when the Waymo was a minute from her house. She and her sister arrived home safely, though terrified.

Stephanie didn’t catch the car’s license plate number, which the 911 dispatcher requested after her ride concluded. Waymo vehicles, like other driverless cars in development, use multiple cameras to help make sense of the world around them. But when she later asked the company for the car’s video footage, hoping it had captured the license plate, Waymo declined to provide it, she said.

She would like closer coordination between Waymo and first responders and says she is now unsure about self-driving rides after dark. “I would feel safe taking it during the day,” Stephanie said. But “at night, maybe I’m safer having someone else in the car just in case something happens.”

Waymo’s spokesman Teicher said the company has a close relationship with emergency responders. “We also respond to lawful investigative requests for information, and we will pursue legal action when appropriate,” he said.

Distressed riders might be helped sooner if robotaxis could respond to incidents without their passenger having to call customer support.

But passengers don’t have the option to “command the vehicle to do something that’s against the road rules,” such as drive at or even into a human in self-defense, said Phil Koopman, an electrical and computer engineering professor at Carnegie Mellon University who has been studying self-driving cars for nearly 30 years.

Cummings, of George Mason University, said solving such personal safety issues is going to take a combination of planning and sensing by vehicles themselves and coordination with remote operations teams.

In September, Amina V. was on her way to a hair appointment when a man stepped in front of her robotaxi and the car stalled in the middle of the street. She already had been recording herself in the Waymo, so she turned the camera to capture the man hitting on her while her car stood frozen in San Francisco’s Soma neighborhood.

She later posted the video on Instagram and X, where Waymo acknowledged the incident.

Amina said in an interview this month that she felt “annoyed” and “powerless” because the car wouldn’t move while pedestrians - first one, then two - were blocking the robotaxi’s path. She called Waymo’s support line, and the operator said they had seen the situation through their cameras.

“They were pretty helpful,” Amina recalled, noting that Waymo asked if she needed police, which she declined because by then the men had left. Waymo offered Amina her next ride free; she took her mom, who was visiting from Michigan.