Safety Advocates Slam NHTSA for Pushing “Robot Car Technology”
The National Highway Traffic Safety Administration should stop pushing the adoption of “robot car technology” until more testing is done and enforceable safety standards are promulgated, according to a July 28 letter sent by three advocacy groups to NHTSA Administrator Mark Rosekind.
Still from YouTube video of Tesla S owner using autopilot mode
The groups said the letter was sent in response to Rosekind's “recent assertion” that NHTSA cannot "stand idly by while we wait for the perfect" before autonomous-driving systems are deployed in the U.S.
“The question is not whether autonomous technology must be perfect before it hits the road, but whether safety regulators should allow demonstrably dangerous technology, with no minimum safety performance standards in place, to be deployed on American highways,” wrote the signatories: Joan Claybrook, former NHTSA administrator and president emeritus of Public Citizen; Clarence Ditlow, executive director of the Center for Auto Safety; Carmen Balber, executive director of Consumer Watchdog; and John M. Simpson, Consumer Watchdog's Privacy Project director.
“Instead of seeking a recall of Tesla’s flawed [Autopilot] technology, you inexcusably are rushing full speed ahead” to promote the deployment of self-driving technology rather than first developing enforceable safety standards, the writers argued.
The advocates also said they were "dumbfounded that the fatal crash of a Tesla Model S in Florida that killed a former Navy SEAL did not give you pause, cause NHTSA to raise a warning flag, bring you to ask Tesla to adjust its software to require drivers' hands on the wheel while in autopilot mode, or even to rename its 'autopilot' to 'pilot assist' until the crash investigation is complete. Instead, you doubled down on a plan to rush robot cars to the road."
The writers contended that “technology with such an obvious flaw should never have been deployed, and should not remain on the road.”
The letter also charged Rosekind and his colleagues with becoming “giddy advocates of self-driving cars, instead of sober safety regulators tasked with ensuring that new systems don't kill people.”
The advocates added that “adequate safety standards developed in the full light of day are crucial to ensuring imperfect technologies do not kill people by being introduced into vehicles before the technology matures.”
On the other hand, the groups conceded that autonomous technology “can save lives someday.”
Yet they stressed that self-driving technologies “should only be implemented” after thorough testing and a rulemaking to set enforceable safety standards.
"That is why we petitioned NHTSA for a rulemaking to set standards for automatic emergency braking, rather than rely on a meager auto industry-friendly voluntary agreement worked out behind closed doors that cannot be enforced," the advocates wrote. "If mandatory standards had been in place for automatic emergency brakes before the Florida crash, it might have been prevented."