
The world's first robot car death was the result of human error – and it can happen again




On November 20, the National Transportation Safety Board (NTSB) released the results of its investigation into the fatal 2018 Uber crash in Tempe, Arizona, widely believed to be the world's first death by a self-driving car.

But rather than pinning the blame on Uber's robotic car, investigators instead detailed the many human errors that culminated in the death of 49-year-old Elaine Herzberg. And they sounded a warning: it could happen again.

"If your company is testing automatic driving systems on public roads, this crash, it was about you," NTSB chairman Robert Sumwalt said in his opening statement at the hearing yesterday.

When the board read aloud its findings on the probable cause of the crash in Tempe, the first person to receive blame was Rafaela Vasquez, the safety driver of the vehicle involved in the crash. Vasquez was never called out by name, but her failures as a watchdog for the automated driving system were put on sharp display by the NTSB.

In the minutes before the impact, Vasquez was reportedly streaming an episode of The Voice on her phone, in violation of Uber's policy prohibiting phone use. In fact, investigators determined that she had been looking down at her phone, and away from the road, for over a third of the total time she had been in the car up until the moment of the crash.

Vasquez's failure "to monitor the driving environment and the operation of the automated driving system because she was visually distracted throughout the trip by her personal mobile phone" was cited as the leading cause of the crash. But she shares the blame with her employers at Uber, whose inadequate safety culture also contributed to Herzberg's death, the board said. Likewise, the federal government bore its share of responsibility for failing to better regulate autonomous vehicle operations.


Photo: ABC 15

"In my opinion, they have presented technology development here before saving lives," said NTSB board member Jennifer Homendy, of the National Highway Traffic Safety Administration (NHTSA), which is responsible for regulating vehicle safety standards.

At the time of the crash, Uber's Advanced Technologies Group had no corporate safety plan or guidance document identifying the roles and responsibilities of individual employees in managing safety, said Michael Fox, senior highway accident investigator at the NTSB. The company also lacked a safety division and did not have a dedicated safety manager responsible for risk assessment and mitigation. In the weeks before the crash, Uber made the fateful decision to reduce the number of safety drivers in each vehicle from two to one. That decision removed an important layer of redundancy that could have helped prevent Herzberg's death.

Not only was Vasquez alone in the car at the time, but her complacent trust in the car's automated driving system also contributed to the crash. And that trust was badly misplaced. The car detected Herzberg crossing the street with her bike 5.6 seconds before impact. But even though the system continued to track Herzberg until the crash, it never correctly identified her as a pedestrian, nor did it correctly predict her path.

"Independence in automation … must be in everyone's vocabulary," said Bruce Landsberg, board member of NTSB.

Vasquez was completely unaware of these failures until it was too late. One of the implicit lessons of the Uber crash and the subsequent NTSB investigation is the under-utilization of safety drivers in self-driving cars, says Mary "Missy" Cummings, director of the Humans and Autonomy Lab at Duke University. Uber and other companies that test self-driving cars typically hire independent contractors as safety drivers to ride along and rack up miles. They are treated as little more than bodies in seats. Instead, they should be seen as critical partners in a testing protocol who can provide very useful feedback, Cummings said.

"Of course, this would cost money," she said. "Despite everyone's lip service as security is of paramount importance, no one I know of supports safety drivers in this way."

Uber's aggressive corporate culture, with its push to test autonomous vehicles before the technology was ready, was exposed not only by this investigation but also by the lawsuit brought by Waymo, the self-driving company spun out of Google, which accused Uber of stealing its self-driving trade secrets.


Photo: Uber

"The driver had a chance to save his life," a former Uber ATG employee who was with the company at the time of the crash told us The limit, "But Uber had dozens."

Uber ATG was under enormous pressure to show results to the company's new CEO, Dara Khosrowshahi, who was reportedly considering shutting down the division due to rising R&D costs. That pressure led the group to cut corners, though the company has since made significant progress in addressing these mistakes.

NTSB board members saved their most withering assessments for the federal government. Homendy blamed NHTSA for prioritizing technological development over saving lives, and she called the agency's voluntary guidance so "lax" as to be "laughable."

The voluntary safety guidelines were first established under President Obama out of concern that restrictive rules for self-driving car testing could stifle innovation. Those rules were made even more permissive under President Trump, who went further by eliminating an all-star federal advisory committee on vehicle automation that was meant to serve as a "critical resource" for the Department of Transportation. Trump shelved the committee without even telling any of its members, The Verge recently reported.

So far, only 16 companies have submitted voluntary safety reports to NHTSA, and many of them amount to little more than "marketing brochures," said Ensar Becic, project manager and human performance investigator in the Office of Highway Safety. That is only a fraction of the more than 60 companies testing self-driving cars in California alone.

"I mean you might as well say we want your judgments, but we really don't need that," Homendy said during the hearing. "So why are you doing this?"

The Department of Transportation has issued three versions of its automated vehicle safety guidance, and it plans to issue a fourth version incorporating lessons learned from the Tempe crash, said Joel Szabat, acting under secretary for policy at the department, during a separate hearing on November 20. (The original document is called "Automated Driving Systems: A Vision for Safety." "They should rename it 'A Vision for Lack of Safety,'" Homendy said.)

Today, there are no federal laws requiring AV operators to demonstrate the safety of their vehicles before testing them on public roads, or to provide disengagement and malfunction data for their automated driving systems. The government's only role is reactive: to issue a recall if a part is defective, or to open an investigation in the event of a crash.

In its final report, the NTSB recommends that this change. AV operators should be required, not merely encouraged, to submit safety assessments if they want to test their vehicles on public roads, it argues, and there should be an ongoing evaluation process to determine whether AV operators are meeting their safety objectives.

But the day after the report was released, NHTSA's top administrator testified at a Senate hearing that Congress should pass a law to speed up the deployment of fully self-driving cars that lack traditional controls like steering wheels and pedals. Currently, the agency is only allowed to exempt a total of 25,000 vehicles per year from federal motor vehicle safety standards.

"When we hear from the industry, the cap may be too small," said NHTSA's acting administrator James Owens.

A previous attempt to lift restrictions on cars without human controls failed. Senate Democrats blocked the bill, citing insufficient safety safeguards. A second attempt is in the works, but it remains to be seen whether it can gather enough votes to pass.

Overly restrictive federal regulations at this stage of a rapidly changing technology are likely to cause significantly more harm than good, said Raj Rajkumar, professor of electrical and computer engineering at Carnegie Mellon University. "Just because Uber and their operator were behaving badly, not everyone else should be punished," he said.

Autonomous vehicles should save lives, not take them. And if people don't trust the companies that build the technology, the life-saving potential of self-driving cars will never be realized. The cars will roam the streets, empty and underutilized, until the operators pull the plug.

Studies show that half of adults in the United States think automated vehicles are more dangerous than traditional human-driven vehicles. Opinions are already hardening, and it is unclear what can be done to undo the damage the Uber crash has already caused.
