
Latest news with #DanO'Dowd

The Dawn Project Urges Legislators To Ban Tesla Full Self-Driving Over Critical Safety Defects Published in New Report to Congress

Yahoo

08-07-2025

  • Automotive
  • Yahoo


SANTA BARBARA, Calif., July 08, 2025 (GLOBE NEWSWIRE) -- Public safety advocacy group The Dawn Project has called on legislators to ban Tesla's Full Self-Driving software from public roads over the litany of critical safety defects uncovered by the group's safety tests of the software. Today, The Dawn Project shared a report of its findings with legislators and key regulators, including the National Highway Traffic Safety Administration (NHTSA), which has numerous open investigations into Tesla Full Self-Driving.

The report comes amid Tesla's rollout of its 'Robotaxi' service in Austin, Texas, which has been plagued with safety-critical errors such as the Robotaxis driving on the wrong side of the road, blocking intersections, and nearly colliding with other cars.

The Dawn Project recently held a live demonstration of its safety tests in Austin which showed that the latest publicly available version of Tesla Full Self-Driving will run down a child crossing the road while illegally blowing past a stopped school bus with its red lights flashing and stop sign extended. The demonstration recreated a tragic incident in North Carolina in which a child was run down by a self-driving Tesla as they exited a school bus. The self-driving Tesla had blown past the school bus's red flashing lights and stop sign before striking the child, who suffered a fractured neck and a broken leg and was placed on a ventilator.

The Austin test was run eight times, and Full Self-Driving ran down the child mannequin while illegally blowing past the school bus on every single run. Tesla's Full Self-Driving software did not disengage, or even alert the driver that there had been a collision, on any of the test runs.

The Dawn Project has catalogued thousands of safety-critical errors committed by Tesla's Full Self-Driving software on public roads and constantly updates this database as new errors are identified. The group also maintains a publicly accessible database of safety-critical and other driving errors committed by Tesla's Robotaxis in Austin. NHTSA has reported 50 fatalities and 2,185 crashes involving Tesla's self-driving technology.

The Dawn Project's Report to Congress outlines the key findings from the group's safety tests and demands that legislators take immediate action to protect road users from Tesla Full Self-Driving by banning the software until Tesla conclusively proves it is safe.

Dan O'Dowd, Founder of The Dawn Project, commented: 'Self-driving software that illegally blows past stopped school buses and runs down children crossing the road must be banned immediately. It is only a matter of time before a child is killed while getting off a school bus because of Elon Musk and Tesla's utter negligence and contempt for public safety.'

'The National Highway Traffic Safety Administration must step up and ban Tesla Full Self-Driving from public roads to protect children, pedestrians, and other road users. It is disappointing that the federal regulator in charge of road safety has taken no action to hold Tesla accountable. NHTSA must do its job and ban Tesla's defective Full Self-Driving technology from public roads before more people are killed.'

'Legislators should protect their constituents from Tesla Full Self-Driving by calling for this dangerous and defective software to be banned immediately.'

A copy of The Dawn Project's annual report can be viewed here. Contact: [email protected]


Tesla Model Y fails self-driving test, hits child-sized dummies 8 times: Why Elon Musk should be worried

Express Tribune

16-06-2025

  • Automotive
  • Express Tribune


At a recent demonstration in Texas, a Tesla Model Y operating in Full Self-Driving (FSD) mode was shown failing to stop for a stationary school bus and striking child-sized dummies. The tests, organised by advocacy groups The Dawn Project, Tesla Takedown, and ResistAustin, replicated the scenario eight times, each time with the Tesla Model Y ignoring the bus's flashing lights and stop signs. Video footage from the demonstration showed the vehicle driving past the bus and colliding with the mannequins intended to represent children.

The demonstration has raised fresh concerns about the readiness of autonomous vehicle technology. Tesla's system, officially named Full Self-Driving (Supervised), requires active driver supervision and issues escalating warnings if the driver does not respond. The company has repeatedly cautioned users that failure to comply could lead to serious injury or death.

While Tesla was not involved in the demonstration, this is not the first time its autonomous technology has drawn scrutiny. In April 2024, a Tesla Model S using FSD was involved in a fatal accident in Washington State in which a motorcyclist was killed. The Dawn Project, whose founder Dan O'Dowd also leads a company developing competing driver-assistance software, has previously run campaigns highlighting perceived flaws in Tesla's FSD system.

The incident comes amid anticipation surrounding Tesla's new Cybercab, an all-electric, fully autonomous vehicle initially set for rollout on 22 June. Chief Executive Elon Musk has since hinted at a delay, saying the company is 'being super paranoid about safety' and suggesting the first vehicle to autonomously drive from the factory to a customer's home could launch on 28 June:

'Tentatively, June 22. We are being super paranoid about safety, so the date could shift. First Tesla that drives itself from factory end of line all the way to a customer house is June 28.'
— Elon Musk (@elonmusk) June 11, 2025

As the debate around autonomous vehicle safety intensifies, the industry continues to face questions about whether current technology can meet the expectations and responsibilities of full autonomy.

Tesla blows past stopped school bus and hits kid-sized dummies in Full Self-Driving tests

Engadget

15-06-2025

  • Automotive
  • Engadget


A revealing demonstration of Tesla's Full Self-Driving mode is raising concerns about whether fully autonomous cars are ready to hit the streets. Tesla has reportedly pushed back the rollout of its upcoming all-electric, fully autonomous car, the Cybercab, while a recent demonstration in Austin, Texas showed a Tesla Model Y running through a school bus' flashing lights and stop signs and hitting child-size mannequins. The tests were conducted by The Dawn Project, along with Tesla Takedown and ResistAustin, and showed Tesla's Full Self-Driving software repeating the same mistake eight times.

It's worth noting that Tesla's autonomous driving feature is formally known as Full Self-Driving (Supervised) and "requires a fully attentive driver and will display a series of escalating warnings requiring driver response." Tesla even warns that "failure to follow these instructions could cause damage, serious injury or death." However, this is not the first time that Tesla's FSD software has found itself in hot water. The Dawn Project, whose founder Dan O'Dowd is the CEO of a company that offers competing automated driving system software, previously took out ads warning about the dangers of Tesla's Full Self-Driving and how it would fail to yield around school buses. In April 2024, a Model S using Full Self-Driving was involved in a crash in Washington in which a motorcyclist died.

With anticipation building for an eventual Cybercab rollout on June 22, the company's CEO posted some additional details on X. According to Elon Musk, Tesla is "being super paranoid about safety, so the date could shift." Beyond that, Musk also posted that the "first Tesla that drives itself from factory end of line all the way to a customer house is June 28."
