Between July 2021 and October 2022, the US Department of Transportation recorded 605 crashes involving vehicles equipped with advanced driver assistance systems (ADAS), the category that includes Tesla's Autopilot, and 474 of them were Teslas.
On November 13, a disturbing video of a Tesla crash in Guangdong, China, went viral. The footage showed a Tesla apparently attempting to park, then swerving erratically back onto the road and accelerating uncontrollably until it crashed into a building, killing two people and injuring three more.
China is Tesla's second-largest market, accounting for 60% of its sales, so what happens there can adversely affect the company's bottom line.
Many social media users reported that tweets containing the disturbing footage kept being deleted from the platform.
This video of a Tesla trying to park and instead taking off at high speed, killing two people seems to keep getting deleted, weird!
pic.twitter.com/SGEcZcx6Zq
— Stop Cop City (@JoshuaPHilll) November 13, 2022
The driver, an unnamed 55-year-old, reportedly lost control of the white Model Y as he attempted to stop outside his family's store. He claimed that although he repeatedly pressed the brake pedal throughout the minutes-long acceleration, the car's automated systems malfunctioned and a technical problem prevented him from stopping the vehicle.
The November 5 crash, in the city of Chaozhou in Guangdong province, killed a motorcyclist and a high school girl on a bicycle.
Tesla has pledged to assist local police in investigating the fatal incident, but it has denied allegations that its vehicle or technology is to blame, noting that the footage shows no brake lights illuminated and that its data logs record no attempt by the driver to depress the brake pedal during the uncontrolled journey.
In its own investigation of the events leading up to the crash, Tesla reported that it was in fact the accelerator that was heavily engaged throughout.
This is not, however, the first time Tesla's Autopilot technology has allegedly been linked to fatal road accidents, nor is it the first time the automaker has disputed suggestions that its own software or hardware was at fault.
These are the capabilities that earned Tesla's Model Y a 5-star crash-test rating pic.twitter.com/9SPBjslrEJ
— Insider Tech (@TechInsider) November 17, 2022
Self-driving cars under scrutiny
Since June 2021, the National Highway Traffic Safety Administration (NHTSA), the US federal government's road safety regulator, has required all automakers and vehicle tech companies to provide “timely and transparent notification” of any road accident involving automated or advanced driver assistance systems (ADAS) within 24 hours.
Of the 600-plus ADAS collisions reported since last summer, 18 have been fatal, many of them involving Tesla vehicles, two of which were reported to the NHTSA between September and October this year. The NHTSA notes one limitation of this dataset: because multiple reports may be submitted for a single crash, the figures may not represent the true number of crashes. In June, the agency also upgraded its defect investigation covering 830,000 Tesla vehicles to a deeper engineering analysis.
The first known death reportedly tied to Tesla's self-driving functionality came in 2016, when 40-year-old Joshua Brown's Model S, driving on Autopilot, allegedly ran full-speed into a white 18-wheel truck after its sensors failed to distinguish the trailer against the bright sky. Brown was killed when the top of his car was “torn off by the force of the collision.” Tesla shared condolences over the tragedy, but it has also been accused of shifting the blame.
Three more people were killed last year in two separate Tesla Autopilot crashes; both accidents were reportedly caused by self-driving Model S cars veering off the road and bursting into flames after hitting trees. Both incidents raised questions about the fire safety of the cars' lithium batteries, as well as their software, after reports revealed that at least one of the passengers may have been killed by the fire rather than the collision, having been unable to open the door.
“Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert,” said Tesla.
In August this year, an activist group released a report suggesting that some Tesla cars in self-driving mode were unable to detect children in the road.
The report is one part of a broader campaign, The Dawn Project, to independently test Tesla car safety. The project is funded by Dan O'Dowd, a California software entrepreneur and billionaire who has been a vocal critic of Elon Musk.
Read my interview with @washingtonpost on The Dawn Project's campaign to ban @Tesla Full Self-Driving and make computers safe for humanity. https://t.co/kHqzDcV4ep pic.twitter.com/53xJrhOVoi
— Dan O'Dowd (@RealDanODowd) November 13, 2022
On Tuesday this week, meanwhile, the Australian government recalled more than 1,000 Teslas over a software calibration issue causing steering defects, warning that the “Electronic Power Assist Steering system (EPAS) may not operate as intended.”
Furthermore, at the Shanghai auto show last April, a woman climbed atop one of the cars at Tesla's booth to protest that she had almost died when her Tesla's brakes failed; she was one of two Chinese citizens later sued by the car giant.
Tesla filed a suit against one Model S owner for making “unverified, ungrounded and defamatory remarks.”
A female Tesla owner climbed on top of a car’s roof at the Tesla booth to protest her car’s brake malfunction at the Shanghai auto show Monday. The booth beefed up its security after the incident. pic.twitter.com/ct7RmF1agM
— Global Times (@globaltimesnews) April 19, 2021
Some have suggested the company's marketing strategy may be partly to blame: hyping the automated functionality to a level that seems to remove responsibility from drivers could make them inattentive and over-reliant on the vehicle's Autopilot capabilities.
Does autopilot undermine human agency?
In the first case to attempt to navigate the tricky landscape of human agency in self-driving cars, Kevin George Aziz Riad is currently facing trial after his Tesla Model S crashed into another car in 2019, killing the two people inside it.
Riad's car was in Autopilot mode at the time of the crash, so the jury faces a difficult question, as law professor and self-driving car expert Edward Walters put it: “Who's at fault, man or machine?”
Tesla is not facing any criminal charges over the incident, but the nature of the crash has opened up a much-needed conversation on the philosophical issue of driver versus vehicle autonomy, a topic that is sure to resurface as more manufacturers roll out autopilot and driverless features.
This landmark case in California, the recent fatal crash in China, and many other autopilot-related road accidents all underline the urgent need for more clarity on the safety of self-driving vehicle technologies.
But as well as raising questions about autonomy, safety, and responsibility, these reports have opened the floor for consumers and autonomous vehicle companies to debate who is more trustworthy behind the wheel: humans or technology?
Tesla has stated:
“We’re building Autopilot to give you more confidence behind the wheel, increase your safety on the road, and make highway driving more enjoyable … The driver is still responsible for, and ultimately in control of, the car.”
Silicon Valley has long believed in the superiority of technology, holding that self-driving cars would mean fewer accidents and fewer road deaths. That belief is even reflected in the optimistic tone of the US Department of Transportation's page on “Automated Vehicle Safety.”
But that belief may well be shaken as more road accidents expose the limits of the technology.
It is encouraging that the Transportation Department has developed an “interactive AV test tracking tool” to keep the public informed about developments in this area. As time passes and the evidence accumulates, clearer answers may emerge.
Correction: This article has been updated since publication to clarify the limitations of the NHTSA dataset on ADAS crashes.
Editor’s Note: The opinions expressed here by the authors are their own, not those of Impakter.com — In the Featured Photo: Black Tesla. Featured Photo Credit: Andres Jasso/Unsplash