Tesla Motors CEO Elon Musk unveils a new all-wheel-drive version of the Model S car in Hawthorne, California October 9, 2014.
Lucy Nicholson | Reuters
On Monday, the National Highway Traffic Safety Administration said it “immediately” launched another investigation into Tesla following a fatal crash in Spring, Texas, over the weekend.
Two men died in the crash Saturday night, and no one appeared to be behind the wheel, according to multiple press interviews with local police.
The electric vehicle, a 2019 Tesla Model S, crashed into a tree and burst into flames. One person was found in the front passenger seat and another in the rear seat of the vehicle.
Police and NHTSA have not yet completed their investigations. Preliminary reports are not conclusive, and questions remain about whether Tesla’s advanced driver-assistance systems were engaged before or during the crash.
The company’s systems are marketed under the brand names Autopilot, Full Self-Driving and Full Self-Driving beta. Tesla includes Autopilot as standard in all of its newer vehicles, and it sells Full Self-Driving for $10,000, with a subscription option in the works.
Autopilot and Full Self-Driving (FSD) technology do not make Tesla vehicles safe to operate without a driver at the wheel. Some customers who purchase the FSD option also get access to a “beta” version, letting them try the newest features on public roads before all the bugs are worked out.
The company says in its owners’ manuals that drivers must use Autopilot and FSD only with “active supervision.”
At the same time, CEO Elon Musk touts these systems as safe and continuously improving on Twitter, where he has 50 million followers, and in media appearances.
On an episode of the popular Joe Rogan Experience podcast in February, Musk and Rogan discussed how Tesla drivers could play chess on their cars’ touchscreens while driving, even though they shouldn’t. (They have to press a button stating that they are the passenger.)
In that same episode, Musk also said, “I think Autopilot’s getting good enough that you won’t need to drive most of the time unless you really want to.”
The great hope for autonomous and automated driving systems in development today is that — like seatbelts, automatic emergency braking, airbags and other technologies that became standard — they will prevent crashes or lessen their impact. There were 36,096 fatalities in motor vehicle traffic crashes in 2019, according to NHTSA data.
The National Highway Traffic Safety Administration has opened around 28 investigations into crashes involving Tesla vehicles to date, about 24 of which remain active.
The National Transportation Safety Board, an independent federal agency that investigates crashes to determine contributing factors, has called on NHTSA to impose strong safety standards on automated vehicle technology. In its recommendations, the NTSB has specifically called out Tesla for poor safety practices and has expressed frustration with NHTSA’s reluctance to take action following multiple fatal crashes involving Uber and Tesla vehicles.
Fatal crashes involving Tesla’s Autopilot have killed Joshua Brown in Florida, Walter Huang in California and Jeremy Banner in Florida, in addition to the two men who died in Texas. An Autopilot-involved crash also killed Tesla driver Gao Yaning in China, and another in Japan took the life of a pedestrian, Yoshihiro Umeda.
Here’s the full statement that an NHTSA spokesperson sent to CNBC about the Spring, Texas, crash:
“NHTSA is aware of the tragic crash involving a Tesla vehicle outside of Houston, Texas. NHTSA has immediately launched a Special Crash Investigation team to investigate the crash. We are actively engaged with local law enforcement and Tesla to learn more about the details of the crash and will take appropriate steps when we have more information.”