Tesla Autopilot Cars Closer to Recall

DETROIT –
Teslas with partially automated driving systems are a step closer to being recalled after the United States stepped up its investigation into a series of collisions with parked emergency vehicles or trucks with warning signs.
The National Highway Traffic Safety Administration said Thursday it was upgrading the Tesla probe to an engineering analysis, another sign of increased scrutiny of the electric vehicle maker and of automated systems that perform at least some driving tasks.
Documents released Thursday by the agency raise serious concerns about Tesla’s Autopilot system. The agency found that Autopilot is being used in areas where its capabilities are limited, and that many drivers take no action to avoid crashes despite warnings from the vehicle.
The probe now covers 830,000 vehicles, nearly all of the vehicles the Austin, Texas, automaker has sold in the United States since the start of the 2014 model year.
NHTSA reported that it found 16 crashes into emergency vehicles and trucks with warning signs, causing 15 injuries and one death.
Investigators will evaluate additional data and vehicle performance and “explore the extent to which Autopilot and related Tesla systems may exacerbate human factors or behavioral safety risks, compromising the effectiveness of driver supervision,” the agency said.
A message was left Thursday seeking comment from Tesla.
An engineering analysis is the final stage of an investigation, and in most cases NHTSA decides within a year whether to seek a recall or close the probe.
In the majority of the 16 crashes, the Teslas issued collision alerts to drivers just before impact. Automatic emergency braking intervened to at least slow the cars in about half of the cases. On average, Autopilot gave up control of the cars less than a second before the crash, NHTSA said in documents detailing the probe.
NHTSA also said it was reviewing crashes involving similar patterns that did not include emergency vehicles or trucks with warning signs.
The agency found that in many cases drivers had their hands on the wheel as required by Tesla, but did not take action to avoid an accident. That suggests drivers are complying with Tesla’s monitoring system, but it doesn’t guarantee they’re paying attention.
In crashes where video is available, drivers should have seen first responder vehicles an average of eight seconds before impact, the agency wrote.
The agency will have to decide whether there is a safety defect with Autopilot before pursuing a recall.
Investigators also wrote that a driver’s use or misuse of the driver monitoring system “or operation of a vehicle in an unintended manner does not necessarily preclude a system defect.”
The agency’s documents all but say Tesla’s method of making sure drivers pay attention isn’t good enough, that it is defective and should be recalled, said Bryant Walker Smith, a law professor at the University of South Carolina who studies automated vehicles.
“It’s really easy to have one hand on the wheel and be completely disengaged from driving,” he said. Monitoring the position of a driver’s hands is not effective because it measures only physical position, he said. “It’s not measuring their mental awareness, their engagement or their ability to react.”
Similar systems from other companies, such as General Motors’ Super Cruise, use infrared cameras to watch a driver’s eyes or face to make sure they are looking ahead. But even those systems still allow a driver to zone out, Walker Smith said.
“This is confirmed in study after study,” he said. “It’s an established fact that people can look engaged and not be engaged. You can have your hands on the wheel and be looking forward and still not have the situational awareness that is required.”
In total, the agency reviewed 191 crashes but removed 85 because other drivers were involved or there was not enough information to make a definitive assessment. Of the remaining 106, the primary cause of about a quarter of the crashes appears to be running Autopilot in areas where it has limitations or in conditions that can interfere with its operation.
“For example, operation on roads other than limited-access highways, or operation in low traction or visibility environments such as rain, snow, or ice,” the agency wrote.
Other automakers limit the use of their systems to limited-access divided highways.
The National Transportation Safety Board, which has also investigated some of the Tesla crashes dating back to 2016, recommended that NHTSA and Tesla limit Autopilot use to areas where it can operate safely. The NTSB also recommended that NHTSA require Tesla to have a better system to ensure drivers pay attention. NHTSA has yet to act on the recommendations; the NTSB can only make recommendations to other federal agencies.
In a statement, NHTSA said there were no vehicles available for purchase today that can drive themselves. “Every available vehicle requires the human driver to be in control at all times, and all state laws hold the human driver responsible for the operation of their vehicles,” the agency said.
Driver assistance systems can help avoid accidents but must be used correctly and responsibly, the agency said.
Tesla performed an online Autopilot software update last fall to improve camera-based detection of emergency vehicle lights in low-light conditions. NHTSA asked why the company had not issued a recall for the change.
NHTSA began its investigation in August last year after a series of crashes since 2018 in which Teslas using the company’s Autopilot or Traffic Aware Cruise Control systems collided with vehicles at scenes where the first responders used flashing lights, flares, an illuminated arrow sign, or hazard cone warnings.