Senators Call for FTC Investigation Into Tesla's Misleading Advertising of Driving Automation Systems
Amid a series of Tesla crashes, U.S. Senators Richard Blumenthal (D-CT) and Edward J. Markey (D-MA), members of the Senate Commerce, Science, and Transportation Committee, voiced serious concerns about Tesla’s misleading advertising and marketing of its Autopilot and Full Self-Driving (FSD) features to consumers, and called on the Federal Trade Commission (FTC) to launch an investigation and take enforcement action.
“Tesla’s marketing has repeatedly overstated the capabilities of its
vehicles, and these statements increasingly pose a threat to motorists
and other users of the road,” wrote the senators to FTC Chair Lina Khan.
“Accordingly, we urge you to open an investigation into potentially
deceptive and unfair practices in Tesla’s advertising and marketing of
its driving automation systems and take appropriate enforcement action
to ensure the safety of all drivers on the road.”
The full text of the letter is below.
August 18, 2021
The Honorable Lina Khan
Federal Trade Commission
600 Pennsylvania Avenue, NW
Washington, D.C. 20580
Dear Chair Khan,
We write to express our serious concerns about Tesla’s misleading advertising of its Autopilot and Full Self-Driving (FSD) features. Tesla’s marketing has repeatedly overstated the capabilities of its vehicles, and these statements increasingly pose a threat to motorists and other users of the road. Accordingly, we urge you to open an investigation into potentially deceptive and unfair practices in Tesla’s advertising and marketing of its driving automation systems and take appropriate enforcement action to ensure the safety of all drivers on the road.
On August 13, 2021, the National Highway Traffic Safety Administration (NHTSA) opened a formal investigation into Tesla’s Autopilot feature after identifying 11 crashes with Autopilot engaged since 2018 that involved a Tesla striking one or more vehicles at first responder sites. This is not the first time NHTSA has investigated Tesla’s Autopilot. While a previous investigation, opened after a fatal crash and closed in 2017, did not identify any defects with Autopilot, NHTSA has now opened a new defect investigation into the system.
Tesla’s Autopilot and FSD are partially automated and include lane keeping assistance and adaptive cruise control features that can help prevent driver stress and fatigue when properly used. They are not fully autonomous features, however, and there are no fully autonomous vehicles currently available on the market. In fact, NHTSA estimates that fully automated safety features and true highway autopilot will not be ready until at least 2025. Understanding these limitations is essential, for when drivers’ expectations exceed their vehicle’s capabilities, serious and fatal accidents can and do result.
We fear that Tesla’s Autopilot and FSD features are not as mature and reliable as the company pitches to the public. On April 22, 2019, Tesla posted a video on its YouTube channel titled “Full Self-Driving” showing a Tesla driving entirely on its own. Tesla CEO Elon Musk has also repeatedly boasted about Tesla’s systems. In July 2020 and again in January 2021, Mr. Musk claimed to consumers that Tesla vehicles would soon reach Level 5 autonomy, or full automation. Unfortunately, Tesla’s advertising and marketing is reaching a large audience: the “Full Self-Driving” video has been viewed more than 18 million times. While Tesla has buried qualifying disclaimers elsewhere on their website, the link in the video’s caption redirects to a purchasing page that fails to provide additional information about the true capabilities of the vehicle.
Tesla drivers listen to these claims and believe their vehicles are equipped to drive themselves – with potentially deadly consequences. At least 11 people have been killed in crashes with Autopilot activated since Tesla introduced the feature in 2015. In May, the driver of a Tesla Model 3 was killed after the vehicle crashed on a highway in California. The driver had previously posted a video of his Tesla online in which it drove itself without human assistance. Autopilot was activated when the crash occurred. Less than two weeks after the fatal crash, California Highway Patrol arrested a man for riding in the backseat of his Tesla while the vehicle was in Autopilot on the highway. After his arrest, the driver cited Mr. Musk’s statements about the vehicle’s abilities as justification for his actions. It is clear that drivers take Tesla’s statements about their vehicles’ capabilities at face value and suffer grave consequences.
Advocates and other federal agencies have repeatedly called on the FTC to act on Tesla’s possible false advertising of its driving automation systems. In 2018, the Center for Auto Safety and Consumer Watchdog wrote to then-FTC Chairman Joseph Simons urging the FTC to investigate Tesla’s deceptive and unfair practices in the advertising and marketing of Autopilot after two fatal crashes. They renewed their request to the Commission in 2019 following additional fatal incidents. NHTSA also sent Mr. Musk a cease-and-desist letter in 2018 over his claims about the vehicles’ safety and, importantly, asked the FTC to investigate the claims under its “unfair or deceptive acts or practices” authority. As the FTC has noted in other matters, the Commission has a significant role in protecting consumers against false, misleading, and dangerous advertising in car sales.
Despite these warnings, Tesla has persistently misrepresented the capabilities of its cars and the company’s progress towards safe Autopilot and FSD technology. On an earnings call in January this year, Mr. Musk claimed Tesla vehicles would be fully autonomous by the end of the year. On July 9, 2021, Tesla released beta version 9 of what it brands to consumers as “Full Self-Driving” software, a subscription feature (recently made available to all Tesla owners) that costs hundreds of dollars per month but – despite its name – does not deliver full autonomy. After the update, drivers have posted videos online showing their updated Tesla vehicles making unexpected maneuvers that require human intervention to prevent a crash. Mr. Musk’s tepid precautions tucked away on social media are no excuse for misleading drivers and endangering the lives of everyone on the road. As Tesla makes widely available its FSD and Autopilot technology and doubles down on its inflated promises, we are alarmed by the prospect of more drivers relying more frequently on systems that do not nearly deliver the expected level of safety.
Tesla and Mr. Musk’s repeated overstatements of their vehicles’ capabilities – despite clear and frequent warnings – demonstrate a deeply concerning disregard for the safety of those on the road and require real accountability. Their claims put Tesla drivers – and all of the traveling public – at risk of serious injury or death. In light of these concerns, we urge you to swiftly open an investigation into Tesla’s repeated and overstated claims about its Autopilot and Full Self-Driving features and take appropriate enforcement action to prevent further injury or death as a result of any Tesla feature.
Thank you for your attention to this important matter, and we look forward to your response.