NTSB Faults Trucker and Motorist in Fatal Tesla Crash
David Cullen, [Former] Business/Washington Contributing Editor
Joshua Brown's Tesla sedan, after the crash with a truck that took his life. Photo: NTSB/Florida Highway Patrol
The National Transportation Safety Board has ruled that a truck driver’s failure to yield the right of way and a car driver’s “inattention due to over-reliance on vehicle automation” were the probable cause of the May 7, 2016, collision of a tractor-trailer and a Tesla Model S 70D sedan operating with its “Autopilot” system engaged.
The crash, the first fatal accident in the U.S. involving a car operating under automated control, claimed the life of the Tesla’s driver, 40-year-old Joshua Brown of Canton, Ohio.
In a statement on the report, which was issued on Sept. 12, NTSB said it also found that the operational design of the Tesla’s vehicle automation “permitted the car driver’s overreliance on the automation,” noting its design “allowed prolonged disengagement from the driving task and enabled the driver to use it in ways inconsistent with manufacturer guidance and warnings.”
“While automation in highway transportation has the potential to save tens of thousands of lives, until that potential is fully realized, people still need to safely drive their vehicles,” said NTSB Chairman Robert L. Sumwalt III. “Smart people around the world are hard at work to automate driving, but systems available to consumers today, like Tesla’s ‘Autopilot’ system, are designed to assist drivers with specific tasks in limited environments. These systems require the driver to pay attention all the time and to be able to take over immediately when something goes wrong.”
He added that “safeguards, that should have prevented the Tesla’s driver from using the car’s automation system on certain roadways, were lacking and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened.”
Per NTSB, the report’s findings include:
The Tesla’s automated vehicle control system was not designed to, and could not, identify the truck crossing the Tesla’s path or recognize the impending crash. Therefore, the system did not slow the car, the forward collision warning system did not provide an alert, and the automatic emergency braking did not activate.
The Tesla driver’s pattern of use of the Autopilot system indicated an over-reliance on the automation and a lack of understanding of the system limitations.
If automated vehicle control systems do not automatically restrict their own operation to conditions for which they were designed and are appropriate, the risk of driver misuse remains.
The way in which the Tesla “Autopilot” system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement.
Tesla made design changes to its “Autopilot” system following the crash. The change reduced the period of time before the “Autopilot” system issues a warning/alert when the driver’s hands are off the steering wheel. The change also added a preferred road constraint to the alert timing sequence.
Fatigue, highway design and mechanical system failures were not factors in the crash. There was no evidence indicating the truck driver was distracted by cell phone use. While evidence revealed the Tesla driver was not attentive to the driving task, investigators could not determine from available evidence the reason for his inattention.
Although the results of post-crash drug testing established that the truck driver had used marijuana before the crash, his level of impairment, if any, at the time of the crash could not be determined from the available evidence.
As a result of its investigation, NTSB has issued seven new safety recommendations. One recommendation has been issued to the Department of Transportation, three to the National Highway Traffic Safety Administration, two to the manufacturers of vehicles equipped with Level 2 vehicle automation systems, and one each to the Alliance of Automobile Manufacturers and Global Automakers.
NTSB said the safety recommendations for autonomous vehicles address the need for:
Event data to be captured and available in standard formats on new vehicles equipped with automated vehicle control systems.
Manufacturers to incorporate system safeguards to limit the use of automated control systems to conditions for which they are designed, and for there to be a method to verify those safeguards.
Development of applications to more effectively sense a driver’s level of engagement and alert when engagement is lacking.
Manufacturers to report incidents, crashes, and exposure numbers involving vehicles equipped with automated vehicle control systems.
The board also reiterated two safety recommendations that it had issued to the National Highway Traffic Safety Administration in 2013. These deal with minimum performance standards for connected vehicle technology for all highway vehicles, as well as the need to require installation of the technology, once developed, on all newly manufactured highway vehicles.
The abstract of NTSB’s final report, which includes the findings, probable cause, and safety recommendations, is available online. The final report itself will be publicly released in the next several days. The webcast of the board meeting for this investigation will be available on NTSB’s website for 90 days.