14 March 2021

Yes, Lying about Self Driving Cars Is a Bad Thing, Elon

The NTSB has called out Elon Musk and Tesla for serial lying about their self-driving capabilities, saying that this puts the driving public at risk.

This is not a surprise. 

Tesla's culture comes from the height of the Dot Com bubble, with a "we'll fix it in beta" mentality, which is negligent at best, and potentially criminal when dealing with 4,000-pound, high-speed death machines like automobiles:

The National Transportation Safety Board has filed comments blasting the National Highway Traffic Safety Administration for its permissive regulation of driver-assistance systems. The letter was dated February 1 but was only spotted by CNBC's Lora Kolodny on Friday. The letter repeatedly calls out Tesla's Autopilot for its lax safety practices and calls on NHTSA to establish minimum standards for the industry.

The dispute between federal agencies is the result of Congress dividing responsibility for transportation safety among multiple agencies. NHTSA is the main regulator for highway safety: every car and light truck must comply with rules established by NHTSA. NTSB is a separate agency that just does safety investigations. When there's a high-profile highway crash, NTSB investigators travel to the scene to figure out what happened and how to prevent it from happening again. NTSB also does plane crashes and train wrecks, allowing it to apply lessons from one mode of transportation to others.

………

Under then-President Donald Trump, NHTSA largely let automakers do what they liked when it came to advanced driver-assistance systems (ADAS) and prototype driverless vehicles. NHTSA has generally waited until safety problems cropped up with ADAS systems and dealt with them after the fact. NTSB argues NHTSA should be more proactive, and it put Tesla and Autopilot at the center of its argument.

………

The NTSB also calls for NHTSA to require driver-monitoring systems to ensure drivers are paying attention to the road while driver-assistance systems are active.

"Because driver attention is an integral component of lower-level automation systems, a driver-monitoring system must be able to assess whether and to what degree the driver is performing the role of automation supervisor," NTSB argued. "No minimum performance standards exist for the appropriate timing of alerts, the type of alert, or the use of redundant monitoring sensors to ensure driver engagement."

………

Finally, NTSB argues that NHTSA should require automakers to limit use of driver-assistance systems to the types of roads they're designed for. For example, some ADAS systems are designed to only work on limited-access freeways. Yet few cars actually enforce such limitations. Many systems can be activated on roads the systems weren't designed for.

………

The NTSB mentions Tesla 16 times in the report—far more than any other automaker. This is partly because Tesla vehicles have figured so prominently in the NTSB's work. NTSB says it has investigated six crashes involving driver-assistance or self-driving systems between May 2016 and March 2019. Four of those were fatal. One of these four was the 2018 death of Elaine Herzberg after she was hit by an Uber self-driving prototype. The other three were Tesla owners who relied too much on Autopilot, and it cost them their lives.

………

In its report on the crash, NTSB noted that, at the time of the crash, Autopilot software was only designed for use on controlled-access freeways—not rural highways where cars and trucks can enter the highway directly from driveways and side streets. NTSB pointed out that its report on the Brown crash "recommended that NHTSA develop a method to verify" that companies selling driver-assistance systems like Autopilot have safeguards to prevent customers from using the systems on roads they aren't designed for. Such a system might have prevented Brown from activating Autopilot on the day of his death.

………

"The NTSB remains concerned about NHTSA's continued failure to recognize the importance of ensuring that acceptable safeguards are in place so the vehicles do not operate outside of their operational design domains and beyond the capabilities of their system designs," the agency wrote. "Because NHTSA has put in place no requirements, manufacturers can operate and test vehicles virtually anywhere, even if the location exceeds the AV control system's limitations."

NTSB then called out Tesla again, specifically criticizing the decision to release its "full self-driving beta" software to a few dozen customers.

"Tesla recently released a beta version of its Level 2 Autopilot system, described as having full self-driving capability," NTSB wrote. "By releasing the system, Tesla is testing on public roads a highly automated AV technology but with limited oversight and reporting requirements."

This is negligent behavior, both on the part of Tesla and on the part of the NHTSA, and it has already gotten people killed.

1 comment:

Stephen Montsaroff said...

Expecting Musk to tell the truth is like expecting apples to fall up.
