
An engineer alleges that the Tesla self-driving video was staged.

Tesla engineer Ashok Elluswamy has testified that a video the company used in 2016 to advertise its self-driving technology was staged. According to the senior engineer’s testimony, the video was meant to demonstrate capabilities, such as stopping at a red light and accelerating at a green light, that the system did not actually possess.

Elon Musk, Tesla’s CEO, promoted the video on Twitter in October 2016 as proof that “Tesla drives itself,” and it remains available on the company’s website.

However, Ashok Elluswamy, director of Autopilot software at Tesla, stated in the transcript of a July deposition, taken as evidence in a lawsuit against Tesla over a 2018 fatal crash that killed former Apple (AAPL) engineer Walter Huang, that the Model X in the video was not driving itself using technology Tesla had actually deployed.

According to a transcript of his testimony, Elluswamy stated that the purpose of the video was not to accurately represent what was offered to customers in 2016, but rather to illustrate what could be built into the system.

Elluswamy’s previously unreported statement marks the first time a Tesla employee has confirmed and explained how the video was made.

“The person in the driver’s seat is just there for legal reasons. He is not acting in any way. The vehicle is self-driving,” the tagline for the video reads.

According to Elluswamy, the Tesla Autopilot team set out, at Musk’s request, to design and record a “demonstration of the system’s capabilities.” In 2020, the National Transportation Safety Board found that Huang’s fatal crash was most likely caused by his distraction and the limitations of Autopilot. It concluded that the driver was not entirely responsible for the tragedy and that Tesla’s “ineffective monitoring of driver involvement” contributed to the accident.

Elluswamy, Musk, and Tesla have not publicly addressed the issue since. Tesla has cautioned users that they must maintain control of their cars by keeping their hands on the wheel while using Autopilot.

According to the company’s website, the features of Tesla’s technology “do not make the vehicle autonomous.” Instead, they are intended to assist with steering, braking, speed, and lane changes.

The Tesla used 3D mapping along a predetermined route from a house in Menlo Park, California, to Tesla’s then-headquarters in Palo Alto to produce the video, he added.

Drivers intervened to take control during test runs, he said. When the team tried to demonstrate that the Model X could park itself without a driver, he added, a test car slammed into a fence in Tesla’s parking lot.

When Tesla released the footage, Musk tweeted, “Tesla drives itself (no human input at all) across urban streets to the interstate to streets, then finds a parking spot.”

In 2021, after a number of crashes involving Autopilot, some of them fatal, the U.S. Department of Justice launched a criminal investigation into Tesla’s claims that its electric cars can drive themselves, according to Reuters.

Citing unnamed sources, the New York Times reported in 2021 that Tesla engineers had produced the 2016 Autopilot advertisement without mentioning that the route had been pre-mapped or that a car had crashed while attempting to complete the shoot. Elluswamy was called as a witness in a lawsuit against Tesla over the 2018 crash in Mountain View, California, which killed Apple engineer Walter Huang.

Referring to Elluswamy’s July deposition, Andrew McDevitt, the attorney for Huang’s wife, told Reuters that it was “clearly misleading to feature the video without any disclaimer or asterisk.”

According to Elluswamy, drivers can “fool the system” into thinking they are paying attention by providing false feedback from the steering wheel. But he said he saw no safety concerns with Autopilot as long as drivers were paying attention.

A few months ago, an advocacy group warned that Tesla’s Full Self-Driving option poses a serious danger to young pedestrians, but few took notice at the time. The group said that during its testing, Tesla’s self-driving software failed to recognise child-sized mannequins and struck numerous objects, raising questions about the credibility of the world’s largest electric-vehicle manufacturer.

Tesla’s Full Self-Driving (FSD) software was also the subject of a safety test by The Dawn Project, which reached the same conclusion: FSD collided with stationary child-sized mannequins along its route. The Dawn Project has since deemed Tesla’s self-driving technology a public threat and is pressing Congress to outlaw it.

Both of these revelations could create problems for Tesla now that Elluswamy’s testimony is public. While it is unclear whether further legal action is being considered against Tesla, public perception and customer trust have undoubtedly been affected.
