Mike Isaac’s Super Pumped (p. 207) reports that complaints about sexual misconduct and assault (typically though not exclusively between drivers and passengers) were so common that Uber created a twenty-one-category classification system for these problems.
“Safe Rides Fee” did nothing for safety
Mike Isaac’s Super Pumped (p. 172) reports that in 2014, Uber added a “Safe Rides Fee” — but used that revenue for general purposes, with no specific investment in safety. He quotes a former employee: “We boosted our margins saying our rides were safer. It was obscene.”
Levandowski hired a lobbyist for autonomous vehicles without safety drivers
While working at Google, Anthony Levandowski hired a lobbyist in Nevada to advocate for a law allowing autonomous vehicles to operate without backup safety drivers. Google did not know about this effort, which ran contrary to Google’s careful approach.
Source: Mike Isaac’s Super Pumped (p. 143)
Victims of sexual assault, rape, harassment, and gender-motivated violence criticized Uber’s arbitration clause
Fourteen victims of sexual assault, rape, harassment, and gender-motivated violence criticized Uber’s arbitration clause, which prevented them from bringing lawsuits about the harm they suffered. Their letter to Uber’s Board of Directors asked that Uber remove (or agree not to enforce) its arbitration clause as to these complaints. They noted a California case in which Uber aggressively sought to force one of their complaints into confidential arbitration. They also noted pending legislation in the United States Congress and New York State Senate that would disallow companies from requiring victims of sexual harassment or assault to proceed in arbitration.
News coverage from The Mercury News and Recode.
Uber backup drivers fell short in safety functions
CityLab reported widespread shortcomings among the backup drivers responsible for supervising Uber’s self-driving cars. First, it is unclear whether humans can reliably supervise machines that work well most of the time: the task requires intense concentration to catch the occasional error, while tempting distractions abound. Second, Uber’s 8-to-10-hour shifts, with a single 30-minute lunch break, were grueling, and drivers were often assigned to repeat the same driving “loops,” which likely made the task particularly dull. Additional challenges included working entirely alone (after Uber removed a second staff person from each vehicle) and, CityLab reported, the vehicles’ frequent hard braking.
Meanwhile, CityLab spoke with multiple drivers who were dismissed from Uber for safety infractions, including using a phone while a vehicle was in motion — undermining any suggestion that all safety drivers do as instructed.
Reduced safety sensors on self-driving cars
Amid scrutiny after an Uber vehicle struck and killed a pedestrian, it emerged that Uber had reduced the number of safety sensors on its self-driving cars, creating blind spots for detecting pedestrians in certain locations.
Ordered to take self-driving vehicles off Arizona roads
After an Uber self-driving vehicle struck and killed a pedestrian in Tempe, Arizona, the state’s governor ordered all self-driving Uber vehicles off the road. The governor called Uber’s approach “an unquestionable failure to comply” with the state’s expectations for public safety.
Removed second staff person from autonomous cars
Historically, Uber’s autonomous cars had two staff members onboard: one to take over driving in case of problems, and another to monitor onboard systems to track performance and label data. But Uber later moved to a single operator. Reviewing 100 pages of internal company documents, the New York Times reported that some employees expressed safety concerns about the change. Among other concerns, they noted that solo work would make it harder to remain alert during monotonous driving.
Broadly, problems seem to have unfolded as internal critics worried. One Uber autonomous-car safety driver was fired after being seen asleep at the wheel. And when an Uber vehicle struck and killed a pedestrian in Tempe, Arizona, early review of the onboard video showed the staff person looking down or sideways, perhaps at a phone or at onboard systems, but not at the road.
Self-driving cars fell short of expectations
Reviewing 100 pages of internal company documents, the New York Times reported that Uber vehicles were falling short of company objectives. For example, Google cars could drive an average of nearly 5,600 miles before a driver had to take control from the computer, whereas Uber vehicles struggled to meet the company’s target of one intervention every 13 miles.
Self-driving vehicle struck and killed pedestrian
An Uber self-driving vehicle struck and killed a pedestrian in Tempe, Arizona.
Early reports indicated that the pedestrian was crossing a roadway after dark, outside a crosswalk, and that Uber would probably be deemed not at fault in this incident.
But review of the crash video raised multiple concerns. First, Uber’s onboard driver, responsible for taking over in case of system problems, was looking down or sideways and hence unable to see the pedestrian. If her hands were on the steering wheel, ready to take over driving from the computer, that is not apparent from the video. Second, the pedestrian was making steady progress across the roadway, giving the vehicle’s sensors time to detect her. Third, some experts said a standard automatic emergency braking system, even on ordinary commercial vehicles, would have been able to detect the pedestrian and at least apply the brakes.
Velodyne, which makes the LIDAR sensors used on Uber’s autonomous cars, expressed surprise that the Uber vehicle hit the pedestrian. A Velodyne spokesperson explained in an email: “We are as baffled as anyone else. … Certainly, our Lidar is capable of clearly imaging Elaine and her bicycle in this situation.” Velodyne suggested that Uber’s software might be at fault, explaining that “[o]ur Lidar doesn’t make the decision to put on the brakes or get out of her way” and that Uber’s systems would need to make those decisions.