Uber Self-Driving Car Death Ruling Sets a Scary Precedent – Next City

This March 18, 2018 file photo from video from a mounted camera provided by the Tempe Police Department shows an interior view moments before an Uber SUV hit a woman in Tempe, Ariz. (Tempe Police Department via AP, File)

Editor’s note: Author Angie Schmitt will be talking more about pedestrian deaths and her new book at a Next City webinar on Wednesday, September 30. Register here.

Recently we learned that someone will face charges, after all, in the death of Elaine Herzberg, the first person killed by a self-driving car. But it’s not the powerful tech company that programmed the car that killed her in March 2018 in Tempe, Arizona.

Instead, Maricopa County prosecutors this month brought negligent homicide charges against a woman named Rafaela Vasquez. Vasquez had been working that night as a backup driver; video footage shows she was watching the television show The Voice on her phone at the time of the crash.

Nevertheless, it is troubling to see blame in this tragedy laid so heavily at the feet of a low-level employee. While Vasquez’s behavior certainly reflected bad judgment, we know that in many ways she was set up for this nightmare scenario by decisions made by executives at Uber’s Advanced Technology Group.

Just the week before Herzberg was killed, for example, a whistleblower at the company named Robbie Miller emailed company executives warning about safety problems:

“A car was damaged nearly every other day in February,” he told top officials. “We shouldn’t be hitting things every 15,000 miles.” He also noted prophetically: “Several of the drivers appear to not have been properly vetted or trained.”

There were additional problems, as we write in our new book Right of Way. In the period preceding this case, Uber had used two backup drivers in all of its cars. That was to account for a widely understood phenomenon known as “automation complacency.” A report on this case by the National Transportation Safety Board found that Vasquez’s inattention — she checked her phone 23 times in the three minutes before the crash — was a “typical effect.”

The level of concentration required to monitor a self-driving car for hours and hours, it turns out, is nearly impossible for a single person to maintain. The task is simply too boring for the human brain to remain vigilant about for an extended period of time. But in late 2017, Uber had changed its policy, moving from two backup drivers to one — presumably to save money.

Even given all that — the problems with the tech, the inadequacy of Uber’s backup safety, and Vasquez’s inattention — the whole thing might have been avoided had Uber not deactivated some critical safety features in the car. As the NTSB noted in its report on the incident, the car detected Herzberg in the road about six seconds prior to the collision. But Uber had programmed the car to override the emergency braking feature. Vasquez did not even receive a warning that the computer had detected Herzberg in the road ahead.

Looking back, it’s clear that financial considerations outweighed safety for Uber. The company wanted to introduce fully driverless taxi service beginning at the end of 2018. It had lost a jaw-dropping $577 million in the first quarter of 2018 and it was looking ahead to an initial public offering in 2019.

In this instance, the casualty of the company’s recklessness was Herzberg, a homeless woman who had been sleeping in an encampment in a nearby park at the time of her death. Uber settled with Herzberg’s family for an undisclosed amount and briefly suspended its self-driving car operation. But officials chose not to pursue criminal charges against the company.

Even after this case, self-driving car companies like Uber continue to operate with impunity in the United States. There have never been any federal regulations governing their testing or operation. Instead the federal government has issued a set of recommended best practices and relies on a system of voluntary reporting.

Vulnerable people like Herzberg need the government to take an interest in protecting them. Instead the federal government and, in this case, Arizona Governor Doug Ducey brushed aside serious safety concerns raised by state residents and allowed companies like Uber to test a potentially deadly product on the public without their explicit consent.

This type of official indifference to the safety of vulnerable road users is unfortunately a wider issue. Pedestrian deaths have risen roughly 50 percent in the last decade. Many of the same forces in the Herzberg case are at work in this wider safety crisis.

The profits of automakers, who have made a fortune selling oversized pickups and SUVs, have been prioritized over the safety of the public, even as these vehicles’ dangers to pedestrians become better understood. The victims in these cases — like Herzberg — tend to be marginalized; they are disproportionately Black, Latino, Native, lower-income and elderly. Perhaps that’s what makes their deaths so easy to overlook.

But we failed Elaine Herzberg on a systemic, structural level, just as we’re failing the more than 6,500 pedestrians killed each year in the United States. This is really a story about power and vulnerability, not about one individual’s failure. Still, that unequal power dynamic is apparent even in the assignment of blame.

Angie Schmitt is the author of Right of Way: Race, Class and the Silent Epidemic of Pedestrian Deaths in America, which was published in August by Island Press.

Charles T. Brown is a senior researcher and adjunct professor at Voorhees Transportation Center at Rutgers University.

Tags: cars, uber, pedestrian safety, autonomous vehicles, tempe
