Latest news with #TeslaFullSelfDriving

Miami Herald
5 days ago
- Automotive
Why Tesla Faces Crackdown, $58K Daily Fine Over Its Marketing
The French Ministry of the Economy has threatened to hit Tesla with a $58,000 daily fine if the automaker doesn't end what the department considers deceptive commercial practices. France is the latest country to take issue with the name of the Tesla Full Self-Driving (FSD) feature, since the software isn't fully autonomous or operating at Level 5 autonomy. The ministry's investigation began in 2023 following reports to France's consumer complaint service SignalConso.

In addition to ruling that Tesla engaged in misleading business practices regarding the fully autonomous driving capacity of its vehicles and the availability of certain options and trade-in offers, the ministry found that Tesla did not specify the date, deadline, or location for car deliveries, did not state whether a purchase was made on credit, and had customers make payments before the end of the withdrawal period consumers are entitled to when they finance a purchase with an assigned credit, according to Electrek. Tesla was also cited for not providing receipts when customers made partial cash payments and for not issuing refunds for orders within the required deadlines. Tesla has four months to comply with the ministry's order before fines begin.

In April, China began cracking down on Tesla's FSD marketing with new rules banning car companies from using words like "self-driving," "autonomous driving," "smart driving," and "advanced smart driving." Instead, China's government recommended automakers describe features like FSD as "combined assisted driving." The regulation arrived after Tesla had already renamed FSD "Intelligent Assisted Driving" following its launch in China, suggesting the transition occurred as the investigation unfolded.

While Tesla doesn't face a federal ban on its FSD terminology in the U.S., California lawmakers banned the company from using the marketing terminology in 2022. An excerpt from California's law reads: "A manufacturer or dealer shall not name any partial driving automation feature, or describe any partial driving automation feature in marketing materials, using language that implies or would otherwise lead a reasonable person to believe, that the feature allows the vehicle to function as an autonomous vehicle, as defined in Section 38750, or otherwise has functionality not actually included in the feature," according to Autobody News.

FSD is also hitting roadblocks in Stockholm, Sweden, where city officials have rejected Tesla's request to test the technology on its streets. Stockholm's traffic department cited safety risks to its citizens and infrastructure and "heavy pressure from other ongoing innovation tests," Teslarati reports. In Australia, an ongoing lawsuit filed in February accuses Tesla of overpromising on self-driving features while flagging other issues such as instances of phantom braking.

Tesla's regulatory scrutiny from France is part of a global trend targeting the automaker's sales practices. The $58,000 daily fine Tesla faces from France's Ministry of the Economy, China's new guidelines, and California's ban show how consumer protection is becoming more critical as daily driving functions become increasingly automated and confusion about their capabilities grows. However, Tesla's recent sales struggles could push it to comply in hopes of maintaining access to major global markets.

Copyright 2025 The Arena Group, Inc. All Rights Reserved.

Yahoo
12-06-2025
- Automotive
The Dawn Project and Tesla Takedown to Demonstrate the Danger Tesla Full Self-Driving Poses to Children in Live Austin Safety Tests
AUSTIN, Texas, June 12, 2025 (GLOBE NEWSWIRE) -- Public safety advocacy group The Dawn Project is collaborating with Tesla Takedown to demonstrate the critical safety defects of Tesla's Full Self-Driving software in a live demonstration in Downtown Austin today.

The Dawn Project, in partnership with Tesla Takedown and ResistAustin, will demonstrate that the latest version of Tesla Full Self-Driving, version 13.2.9, will run down a child crossing the road while illegally blowing past a stopped school bus with its red lights flashing and stop sign extended. The event will begin at 11am CDT on Thursday 12 June at the intersection of Camacho and Mattie Streets by Mueller Park in Austin, and the media and public are encouraged to attend.

The organizations chose to conduct the demonstration on 12 June because it is the day Bloomberg reported would mark the beginning of Tesla's Robotaxi service in Austin. Tesla has failed to meet this reported deadline, and this week Elon Musk 'tentatively' set a new public launch date of 22 June. Tesla is also headquartered in Austin.

Both The Dawn Project and Tesla Takedown are calling for a boycott of Tesla. For the past two years The Dawn Project has urged the public not to buy new Tesla vehicles and to sell their Tesla shares until the automaker fixes the safety-critical defects in 'Full Self-Driving' and conclusively demonstrates that it is safe for use on public roads. Tesla Takedown has organized nationwide protests against Elon Musk and Tesla in recent months.

The Dawn Project first warned Tesla that Full Self-Driving would illegally overtake stopped school buses in November 2022, via a full-page ad campaign in The New York Times. Tesla took no action to address the issue. The Dawn Project subsequently broadcast a Super Bowl commercial in February 2023 showing video footage of Tesla Full Self-Driving blowing past stopped school buses. Tesla still did nothing. Elon Musk's reaction was to crow on Twitter that the shocking footage would 'greatly increase public awareness that a Tesla can drive itself'. In March 2023, just one month after The Dawn Project's Super Bowl PSA, a self-driving Tesla illegally blew past a stopped school bus in North Carolina and struck a child, who was hospitalized with a fractured neck and a broken leg.

Commenting on the collaboration with Tesla Takedown, Founder of The Dawn Project Dan O'Dowd said: 'Self-driving software that illegally blows past stopped school buses and runs down children crossing the road must be banned immediately. Tesla's failure to address this critical safety defect demonstrates Elon Musk's utter negligence and contempt for public safety. If Tesla's engineers cannot fix this egregiously dangerous safety defect, they should be fired. If they can fix it but are choosing not to, they should be prosecuted.

'The National Highway Traffic Safety Administration must step up and ban Full Self-Driving from public roads to protect children. It is disappointing that the federal regulator in charge of road safety has taken no action to hold Tesla accountable. NHTSA must do its job and ban Tesla's defective Full Self-Driving technology from public roads before a child is killed.'

A spokesperson for Tesla Takedown commented: 'The government is failing to protect children from Tesla Full Self-Driving by allowing Tesla to test defective software on public roads. It is time for the public to step up to keep children safe from Tesla Full Self-Driving. The only way to stop Elon Musk's reckless self-driving experiments is to boycott Tesla.'

ResistAustin organizer Nevin Kamath commented: "Austinites are not Elon's personal crash-test dummies. ResistAustin is appalled that Musk chose our town as the launching pad for this dangerous technology and we encourage the people of Austin to boycott this dangerous service."

Contact: Arthur Maltin, arthur@ / dawnproject@ / (+1) 805-335-7807


WIRED
19-03-2025
- Automotive
DOGE's Dodgy Numbers Employ a Tesla Technique
Safety researchers claim Elon Musk's auto company has a long history of potentially misleading stats. Now it looks like his government department is following suit.

Elon Musk has pledged that the work of his so-called Department of Government Efficiency, or DOGE, would be 'maximally transparent.' DOGE's website is proof of that, the Tesla and SpaceX CEO, and now White House adviser, has repeatedly said. There, the group maintains a list of slashed grants and budgets, a running tally of its work. But in recent weeks, The New York Times reported that DOGE has not only posted major mistakes to the website—crediting DOGE, for example, with saving $8 billion when the contract canceled was for $8 million and had already paid out $2.5 million—but also worked to obfuscate those mistakes after the fact, deleting identifying details about DOGE's cuts from the website, and later even from its code, that made them easy for the public to verify and track.

For road-safety researchers who have been following Musk for years, the modus operandi feels familiar. DOGE 'put out some numbers, they didn't smell good, they switched things around,' alleges Noah Goodall, an independent transportation researcher. 'That screamed Tesla. You get the feeling they're not really interested in the truth.'

For nearly a decade, Goodall and others have been tracking Tesla's public releases on its Autopilot and Full Self-Driving features, advanced driver-assistance systems designed to make driving less stressful and safer. Over the years, researchers claim, Tesla has released safety statistics without proper context; promoted numbers that are impossible for outside experts to verify; touted favorable safety statistics that were later proved misleading; and even changed already-released safety statistics retroactively. The numbers have been so inconsistent that Tesla Full Self-Driving fans have taken to crowdsourcing performance data themselves. Instead of public data releases, 'what we have is these little snippets that, when researchers look into them in context, seem really suspicious,' alleges Bryant Walker Smith, a law professor and engineer who studies autonomous vehicles at the University of South Carolina.

Government-Aided Whoopsie

Tesla's first and most public number mix-up came in 2018, when it released its first Autopilot safety figures after the first known death of a driver using Autopilot. Immediately, researchers noted that while the numbers seemed to show that drivers using Autopilot were much less likely to crash than other Americans on the road, the figures lacked critical context. At the time, Autopilot combined adaptive cruise control, which maintains a set distance between the Tesla and the vehicle in front of it, and steering assistance, which keeps the car centered between lane markings. But the comparison didn't control for the type of car (luxury vehicles, the only kind Tesla made at the time, are less likely to crash than others), the person driving the car (Tesla owners were more likely to be affluent and older, and thus less likely to crash), or the types of roads where Teslas were driving (Autopilot operated only on divided highways, but crashes are more likely to occur on rural roads, especially connector and local ones).

The confusion didn't stop there. In response to the fatal Autopilot crash, Tesla did hand over some safety numbers to the National Highway Traffic Safety Administration, the nation's road safety regulator.
Using those figures, the NHTSA published a report indicating that Autopilot led to a 40 percent reduction in crashes. Tesla promoted the favorable statistic, even citing it when, in 2018, another person died while using Autopilot. But by spring of 2018, the NHTSA had copped to the number being off: the agency had not wholly evaluated the effectiveness of the technology in comparison to Teslas not using the feature, relying, for example, on air bag deployment as an inexact proxy for crash rates. (The airbags did not deploy in the 2018 Autopilot death.)

Because Tesla does not release Autopilot or Full Self-Driving safety data to independent, third-party researchers, it's difficult to tell exactly how safe the features are. (Independent crash tests by the NHTSA and other auto regulators have found that Tesla cars are very safe, but these don't evaluate driver-assistance tech.) Researchers contrast this approach with that of the self-driving vehicle developer Waymo, which often publishes peer-reviewed papers on its technology's performance.

Still, the unknown safety numbers did not prevent Musk from criticizing anyone who questioned Autopilot's safety record. 'It's really incredibly irresponsible of any journalists with integrity to write an article that would lead people to believe that autonomy is less safe,' he said in 2018, around the time the NHTSA figure publicly fell apart. 'Because people might actually turn it off, and then die.'

Number Questions

More recently, Tesla has continued to shift its Autopilot safety figures, leading to further questions about its methods. Without explanation, the automaker stopped putting out quarterly Autopilot safety reports in the fall of 2022. Then, in January 2023, it revised all of its safety numbers. Tesla said it had belatedly discovered that it had erroneously included in its crash counts events where no airbags or active restraints were deployed, and that some events had been counted more than once. Now, instead of dividing its crash rates into three categories, 'Autopilot engaged,' 'without Autopilot but with our active safety features,' and 'without Autopilot and without our active safety features,' it would report just two: with and without Autopilot. It applied those new categories retroactively to its old safety numbers and said it would use them going forward.

That discrepancy allowed Goodall, the researcher, to peer more closely into the specifics of Tesla's crash reporting. He noticed something in the data. He expected the new 'without Autopilot' number to simply be an average of the two old 'without Autopilot' categories. It wasn't. Instead, the new figure looked much more like the old 'without Autopilot and without our active safety features' number. That's weird, he thought. It's not easy—or, according to studies that also include other car makes, common—for drivers to turn off all their active safety features, which include lane-departure and forward-collision warnings and automatic emergency braking. Goodall calculated that even if Tesla drivers were going through the burdensome and complicated steps of turning off their EV's safety features, they'd need to drive far more miles than other Tesla drivers to create a sensible baseline. The upshot: Goodall wonders whether Tesla is making its non-Autopilot crash rate look higher than it actually is, which would make the Autopilot crash rate look much better by comparison. The discrepancy is still puzzling to the researcher, who published a peer-reviewed note on the topic last summer.
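To make the expected arithmetic concrete: merging two reporting categories should produce a mileage-weighted average of their crash rates, which is pulled toward whichever category covers more miles. The short Python sketch below illustrates only that general principle; every mileage and crash figure in it is invented for illustration and is not Tesla's data.

```python
# Hypothetical sketch of the mileage-weighted averaging a merged category implies.
# All numbers below are invented for illustration; they are NOT Tesla's figures.

def crash_rate(crashes: float, miles: float) -> float:
    """Crashes per million miles driven."""
    return crashes / (miles / 1_000_000)

# Old-style reporting: two separate non-Autopilot pools (hypothetical values).
active_safety_miles, active_safety_crashes = 4_000_000_000, 2_500  # active safety features on
no_features_miles, no_features_crashes = 500_000_000, 900          # all active safety features off

# Merging the pools gives a mileage-weighted average of the two rates,
# which lands between them but close to the rate of the larger pool.
combined_rate = crash_rate(active_safety_crashes + no_features_crashes,
                           active_safety_miles + no_features_miles)

print(crash_rate(active_safety_crashes, active_safety_miles))  # ~0.63 per million miles
print(crash_rate(no_features_crashes, no_features_miles))      # ~1.80 per million miles
print(combined_rate)                                            # ~0.76, dominated by the bigger pool

# If a published combined rate instead sat near the ~1.80 "all features off" figure,
# that smaller pool would have to account for an implausibly large share of the miles.
```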
Tesla 'put out this data that looks questionable on first glance—and then you look at it, and it is questionable,' he claims. 'Instead of taking it down and acknowledging it, they change the numbers to something that is even weirder and flawed in a more complicated way. I feel like I'm doing their homework at this point.' The researcher calls for more transparency. So far, Tesla has not put out more specific safety figures. Tesla, which disbanded its public relations team in 2021, did not reply to WIRED's questions about the study or its other public safety data.

Direct Reports

Tesla is not a total outlier in the auto industry when it comes to clamming up about the performance of its advanced technology. Automakers are not required to make public many of their safety numbers. But where tech developers are required to submit public accounting of their crashes, Tesla is still less transparent than most. One prominent national data-submission requirement, first instituted by the NHTSA in 2021, requires makers of both advanced driver-assistance and automated-driving tech to submit public data about their crashes. Tesla redacts nearly every detail about its Autopilot-related crashes in its public submissions.

'The specifics of all 2,819 crash reports have been redacted from publicly available data at Tesla's request,' says Philip Koopman, an engineering professor at Carnegie Mellon University whose research includes self-driving-car safety. 'No other company is so blatantly opaque about their crash data.' The federal government likely has access to details on these crashes, but the public doesn't. But even that is at risk: late last year, Reuters reported that the crash-reporting requirement appeared to be a focus of the Trump transition team.

In many ways, Tesla—and perhaps DOGE—is distinctive. 'Tesla also uniquely engages with the public and is such a cause célèbre that they don't have to do their own marketing. I think that also entails some special responsibility. Lots of claims are made on behalf of Tesla,' says Walker Smith, the law professor. 'I think it engages selectively and opportunistically and does not correct sufficiently.' Proponents of DOGE, like those of Tesla, engage enthusiastically on Musk's platform, X, applauded by Musk himself. The two entities have at least one other thing in common: ProPublica recently reported that there is a new employee at the US Department of Transportation—a former Tesla senior counsel.