13 June 2022

It Certainly Sounds Like Something He Would Do

It appears that the investigation of Tesla's Autopilot and self-driving systems has moved into high gear.

This is not a surprise; Elon Musk and Tesla have misrepresented the system since its introduction.

What I do find interesting, though, is that regulators have concerns that Autopilot may be detecting an oncoming crash and deactivating itself about a second before impact in order to minimize liability.

This would allow Tesla to claim that Autopilot was not active at the time of the crash:

U.S. authorities are escalating and expanding a probe into Tesla’s controversial automated driving feature in a move that could prompt a mandatory recall.

On Thursday, the National Highway Traffic Safety Administration, an agency under the guidance of Transportation Secretary Pete Buttigieg, said it would be expanding a probe and look into 830,000 Tesla cars across all four current model lines, 11% more vehicles than they were previously examining.

The move came after the agency analyzed a number of accidents that revealed patterns in the car’s performance and the associated driver’s behavior, concluding that the findings warranted an upgrade to an "Engineering Analysis" from a previous "Preliminary Evaluation." An Engineering Analysis can be the precursor to a recall.

It said the purpose of escalating the investigation was to “explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision.”

………


On Thursday, NHTSA said it had discovered in 16 separate instances when this occurred that Autopilot “aborted vehicle control less than one second prior to the first impact,” suggesting the driver was not prepared to assume full control over the vehicle.

CEO Elon Musk has often claimed that accidents cannot be the fault of the company, as data it extracted invariably showed Autopilot was not active in the moment of the collision.

(emphasis mine)

While any indication that the system was designed to shut off when it sensed an imminent accident would damage Tesla's image, legally the company would still be a difficult target.

I get that drivers are ultimately responsible for their vehicles, but if Autopilot were designed with this sort of switch, it would constitute an acknowledgement that Tesla knew the system was being misused and was trying to cover this up.

Given that Tesla has already been caught using non-disclosure agreements to keep its customers from telling the NHTSA about potentially dangerous design flaws, this behavior certainly fits a pattern.

When one considers some of the other actions taken by Tesla, and by Musk, it certainly sounds like something he would do.
