08 April 2024

Gee, You Think?

In depositions in a California lawsuit against Tesla, soon-to-be-former employees (for telling the truth) have stated that Elon's vaunted AI-driven autonomous navigation system does little more than follow the lines painted on the road.

Given that lane markings inevitably wear away, and given how erratically roads get repainted, this is a recipe for disaster:

[Photo: a road with badly faded lane markings]

So this fatally confuses the Tesla sooper sekret self-driving software?

In Tesla’s marketing materials, the company’s Autopilot driver-assistance system is cast as a technological marvel that uses “advanced cameras, sensors and computing power” to steer, accelerate and brake automatically — even change lanes so “you don’t get stuck behind slow cars or trucks.”

Under oath, however, Tesla engineer Akshay Phatak last year described the software as fairly basic in at least one respect: the way it steers on its own.

“If there are clearly marked lane lines, the system will follow the lane lines,” Phatak said under questioning in July 2023. Tesla’s groundbreaking system, he said, was simply “designed” to follow painted lane lines.

Phatak’s testimony, which was obtained by The Washington Post, came in a deposition for a wrongful-death lawsuit set for trial Tuesday. The case involves a fatal crash in March 2018, when a Tesla in Autopilot careened into a highway barrier near Mountain View, Calif., after getting confused by what the company’s lawyers described in court documents as a “faded and nearly obliterated” lane line.

The driver, Walter Huang, 38, was killed. An investigation by the National Transportation Safety Board later cited Tesla’s failure to limit the use of Autopilot in such conditions as a contributing factor: The company has acknowledged to the National Transportation Safety Board that Autopilot is designed for areas with “clear lane markings.”

………

In the months preceding the crash, Huang’s vehicle swerved in a similar location eleven times, according to internal Tesla data discussed by Huang’s lawyers during a court hearing last month. According to the data, the car corrected itself seven times. Four other times, it required Huang’s intervention. Huang was allegedly playing a game on his phone when the crash occurred.

The NTSB concluded that driver distraction and Autopilot’s “system limitations” likely led to Huang’s death. In its report, released about two years after the crash, investigators said Tesla’s “ineffective monitoring” of driver engagement also “facilitated the driver’s complacency and inattentiveness.”
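To make concrete what "follow the lane lines" means in practice: the textbook way to do it is classical computer vision, edge detection plus a Hough transform over camera frames. Tesla's actual software is proprietary and reportedly neural-network based, so the sketch below is purely illustrative (the function name detect_lane_lines, the thresholds, and the region-of-interest choice are all my assumptions), but it shows why faded paint is fatal to any lines-first approach: low contrast means no edges, and no edges means nothing to steer along.

# Illustrative sketch only, NOT Tesla's implementation. Textbook lane
# detection: Canny edge detection followed by a probabilistic Hough
# transform, using OpenCV.
import cv2
import numpy as np

def detect_lane_lines(frame: np.ndarray) -> list:
    """Return line segments (x1, y1, x2, y2) that look like lane markings."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)

    # Edge detection. Faded paint has little contrast against the asphalt,
    # so it falls below these (assumed) thresholds and simply disappears.
    edges = cv2.Canny(blurred, threshold1=50, threshold2=150)

    # Restrict attention to the lower half of the frame, roughly where
    # the road surface sits in a forward-facing camera view.
    mask = np.zeros_like(edges)
    h, w = edges.shape
    mask[h // 2:, :] = 255
    roi = cv2.bitwise_and(edges, mask)

    # Fit straight segments to the surviving edge pixels.
    lines = cv2.HoughLinesP(roi, rho=1, theta=np.pi / 180, threshold=40,
                            minLineLength=60, maxLineGap=25)
    return [] if lines is None else [tuple(l[0]) for l in lines]

Whatever Tesla's implementation details, the failure mode testified to above is the same: once the paint degrades past what the detector can pick up, the lane effectively ceases to exist as far as the car is concerned, and the steering controller has nothing left to track.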

I would also add that Tesla has lied about its so-called self-driving capabilities, which would have contributed to driver complacency.

Why Elon has not been frog-marched out of his offices in handcuffs over these lies, lies which he has personally made, is beyond me.
