NTSB Concludes Study of the First Tesla Model S Crash Fatality

The First Tesla Model S Crash

In May 2016, the first fatal Tesla Model S crash occurred. The NTSB (National Transportation Safety Board) released preliminary reports on the crash, stating that Tesla’s Autopilot was engaged at the time. This was the first fatality recorded in a Tesla Model S, and it raised many questions, the foremost being: is Autopilot safe?

Tesla responded to the Model S crash with their own investigation into Autopilot safety. After some consideration, they made a few changes to Autopilot’s operational software. Though Tesla saw the need to upgrade Autopilot, they stood by their work, noting that Autopilot is in a beta phase and that users must agree to its terms before operating the system, namely, to stay vigilant rather than, say, watch a movie while driving down the highway. But what did the NTSB conclude about this famous crash? Was Autopilot to blame?

The Official Crash Findings

According to a factual report by the NTSB, the Model S issued several warnings to the driver before the crash occurred. These warnings were ignored. It is impossible to know why the driver did not respond despite the apparent warnings, though news outlets have frequently reported that he was watching a movie at the time.

The NTSB did not find fault with Tesla for the accident, but rather with human error. They concluded that if Tesla’s instructions for Autopilot had been followed, the Model S crash likely would not have occurred. The driver of the truck bears some responsibility, but ultimately the Model S driver should have remained alert.

Tesla and Autopilot

Tesla has clear instructions for Autopilot use, or do they? This has been a subject of much debate. Many are concerned that Tesla has misled consumers into believing that Autopilot is 100% self-sufficient. Consumer Reports has weighed in on the matter, stating:

“By marketing their feature as ‘Autopilot,’ Tesla gives consumers a false sense of security,” says Laura MacCleery, vice president of consumer policy and mobilization for Consumer Reports. “‘Autopilot’ can’t actually drive the car, yet it allows consumers to have their hands off the steering wheel for minutes at a time. Tesla should disable automatic steering in its cars until it updates the program to verify that the driver’s hands are on the wheel.”

Since the crash, Tesla has made several upgrades to improve safety and functionality. Consumer Reports’ vice president of consumer policy agrees that “In the long run, advanced active safety technologies in vehicles could make our roads safer.” So what can we take away from this tragedy? Autopilot is a good thing in the long run, but stay vigilant today. Real-time data collected from Tesla owners will help propel new safety technology. In the meantime, be careful, take warnings seriously, and do not assume that Autopilot will save you. We can work with Autopilot, within the system’s capabilities, to reduce human error and make the roads a little safer every day.