
Tesla shares blame for fatal Autopilot crash according to NTSB report

In the NTSB’s report on the fatal Tesla crash, the blame was placed on the driver of the semi-truck, the Tesla driver, and the car’s automated systems

The U.S. National Transportation Safety Board (NTSB) has completed its investigation into a fatal crash between a semi-truck and a Tesla Model S that was using automated driving systems. The reasons for the crash are complex, but the report highlights issues with self-driving vehicles that deserve serious attention.

The incident happened in May 2016 in Florida. It gained wide media attention because the fatality was the driver of a Tesla Model S who was using the car's "Autopilot" semi-automated driving system. Blame for the wreck has been directed at both the commercial vehicle's driver and the Tesla driver, and based on the evidence from the crash, the NTSB's report faults both drivers as well as the way Tesla's Autopilot handled the situation.

Tesla Motors has taken a lot of flak for the name of its system and for its reliance on small print to explain that it is not, in fact, the fully autonomous driving system the name might imply. To the company's credit, though, it has revised much of its marketing and has changed the software that controls the Autopilot system, changes the NTSB report acknowledged.

Yet blame for the crash itself is less important than what can be learned from it: namely, the inherent dangers of autonomous vehicles, our perception of them, and how they'll function in a world with both human and computer drivers on the road. The near future of vehicle automation will shape the public's perception of self-driving vehicles for some time to come.

In the NTSB's report on the fatal Tesla crash, the blame was placed on the driver of the semi-truck, the Tesla driver, and the car's automated systems. All three drivers (truck driver, car driver, and computer) made serious mistakes that ultimately led to the accident.

The semi-truck driver did not yield proper right of way, causing the big rig to move in front of the Tesla unexpectedly. The driver of the Model S was not paying attention to the road at all, relying solely on the car's automated driving systems. The Autopilot system was not designed for fully automated driving and had no way of "seeing" the crossing truck due to limitations in its sensor setup. Nor did the Tesla adequately engage the driver with warnings about his inattention to the road or the task of driving.


The crash proceeded as follows: the truck driver failed to yield the right of way and pulled into the Tesla's path. The only indication of possible impairment was a trace of marijuana in the truck driver's blood; the investigation found no other impairment or distraction.

Meanwhile, the Model S driver was not paying attention to the road at all, though exactly what he was doing is undetermined. His cause of death was definitively crash-related, indicating that he did not suffer a medical emergency or other problem that could have led to the incident. According to the car's recording software, the driver had a history of misusing the Autopilot system in this way.

The Tesla Model S' Autopilot system had alerted the driver several times to his inattention but, the NTSB found, had not done enough to prevent him from relinquishing all control to the car. Furthermore, the sensors and systems on board were not capable of registering the truck or its potential (and eventual) crossing of the car's path, and so did not engage emergency braking or avoidance maneuvers. That latter point attests to the often misunderstood nature of today's semi-automated driving systems.

From these findings, the NTSB listed several recommendations for semi-automated vehicles to meet. In its own investigation into the crash, and with early input from the NTSB, Tesla found problems with the Autopilot driver-inattention warning system and has since taken steps to remedy them. Tesla Motors has also revised most of its marketing materials to emphasize that Autopilot is not capable of completely autonomous vehicle operation and that drivers are still required to remain engaged in driving even when the system is activated.

The NTSB is recommending that manufacturers put restrictions in place to keep semi-automated vehicle control systems working within the confines of their design conditions, preventing drivers from misusing them. A system designed for commutes at highway speeds, for example, would refuse to engage at lower speeds and would not function in driving situations that require reading road signs or yielding to pedestrians at crossings.
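To make the idea concrete, here is a minimal sketch of the kind of design-domain gate the NTSB is describing. All of the signal names, road types, and thresholds below are illustrative assumptions, not details from the report or from any manufacturer's actual system:

```python
from dataclasses import dataclass

@dataclass
class VehicleState:
    speed_mph: float
    road_type: str               # e.g. "divided_highway", "urban" (assumed labels)
    cross_traffic_possible: bool

# Assumed values for a hypothetical highway-only system
MIN_ENGAGE_SPEED_MPH = 45
ALLOWED_ROAD_TYPES = {"divided_highway"}

def automation_may_engage(state: VehicleState) -> bool:
    """Allow the semi-automated system only inside its design conditions."""
    if state.speed_mph < MIN_ENGAGE_SPEED_MPH:
        return False  # below the speed range the system was designed for
    if state.road_type not in ALLOWED_ROAD_TYPES:
        return False  # road signs and pedestrians may need to be handled
    if state.cross_traffic_possible:
        return False  # e.g. uncontrolled intersections, as in this crash
    return True
```

A production system would draw these inputs from map data and onboard sensors, but the principle is the same: the automation simply refuses to operate outside the conditions it was designed for.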

Today, most semi-automated driving systems at the consumer level are based around adaptive cruise control designs. These are made to watch traffic on a freeway or highway, where multiple lanes are available but cross-traffic and pedestrians do not exist. These systems commonly require the driver to keep hands on the steering wheel at all times and are now often augmented by "driver awareness" indicators that measure how attentive the driver is. Most gauge the driver's ability to keep the vehicle within its lane without assistance; some also track the driver's head position, steering input, and position in the seat.
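A typical implementation escalates from warnings to disengagement the longer the driver stays hands-off. The timings and action names below are invented for illustration; the report does not specify any manufacturer's actual thresholds:

```python
# Hypothetical attention watchdog: map the time since the driver's last
# steering-wheel torque input to an escalating response. All timings
# here are assumptions for the sketch.

HANDS_OFF_WARN_S = 15        # first visual warning (assumed)
HANDS_OFF_CHIME_S = 30       # audible chime (assumed)
HANDS_OFF_DISENGAGE_S = 60   # slow the car and disengage (assumed)

def attention_response(seconds_since_input: float) -> str:
    """Return the action to take for a given hands-off duration."""
    if seconds_since_input < HANDS_OFF_WARN_S:
        return "ok"
    if seconds_since_input < HANDS_OFF_CHIME_S:
        return "visual_warning"
    if seconds_since_input < HANDS_OFF_DISENGAGE_S:
        return "audible_chime"
    return "disengage_and_slow"
```

The NTSB's criticism of the crash vehicle amounts to saying that this escalation did not go far enough: the warnings repeated, but the car never forced the issue.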

The NTSB also called for vehicle event data to be captured in all semi-automated vehicles and made available in standard formats so investigators can use it more easily. The board called on manufacturers to incorporate robust system safeguards limiting the use of automated control systems, and for the development of applications that more effectively sense the driver's level of engagement.
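No standard schema existed at the time of the report, so the record below is purely hypothetical; every field name is invented to illustrate the kind of data investigators would want in a common format (the 74 mph cruise set speed is the figure reported from this crash's data record):

```python
import json

# A made-up example of a standardized automation event record.
# Field names are illustrative assumptions, not an actual schema.
event = {
    "timestamp_utc": "2016-05-07T15:40:00Z",  # time of day invented
    "automation_state": "engaged",
    "vehicle_speed_mph": 74.0,
    "cruise_set_speed_mph": 74.0,
    "driver_torque_detected": False,
    "warnings_issued": ["visual", "audible"],
    "braking_commanded": False,
}

print(json.dumps(event, indent=2))
```

With every manufacturer emitting records like this in one agreed format, investigators could compare incidents across brands instead of reverse-engineering each proprietary log.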

The NTSB also asked manufacturers to more closely report incidents involving semi-automated vehicle control systems. These recommendations were issued to the National Highway Traffic Safety Administration, the U.S. Department of Transportation, the Alliance of Automobile Manufacturers, the Global Automakers group, and to individual manufacturers designing and implementing autonomous vehicle technologies.

With the release of the NTSB's summary report today, the U.S. Department of Transportation also released its own guidance on automated driving systems. These federal guidelines are voluntary: vehicle manufacturers are asked, not required, to follow them.

Source: NTSB

8 comments
SimonClarke
The NTSB partially blames Tesla, but the guy only touched the steering wheel for 20 seconds in 37 minutes. You are 'required' to at least 'touch' the steering wheel. As a car driver you are still responsible for your driving and the car. If I let go of the steering wheel for a long period in my car and crashed into another (I drive a Dacia), would they be partially to blame for my actions?
Daishi
If something never works you tend not to rely on it. When it seemingly works 99% of the time, it becomes harder as humans not to let our guard and focus slip. There will be a period in semi-autonomous vehicles where it will be easy for drivers to gain false confidence in the capabilities. I call this the "uncanny valley" of autonomous vehicles, and it's probably going to result in more accidents than this one, leading to some hurdles in public perception along the way. I don't think many people realize how much work is still left to be done.
ljaques
Tincanny Valley, Daishi. Yes, we humans tend to relegate boring and repetitive things to habit where we can. As to assigning cause, I'd give the inattentive driver 75% of the blame, the rude truck driver 15%, Tesla software 5%, and Tesla marketing 5%. Any normal driver should have been able to avoid hitting the truck, and the vast majority of them would have had one hand on the wheel and one depressing the horn ring the whole time.
ljaques
I forgot to mention that all the new semi- and fully-automated cars should have safety overrides which 1) sound a loud alarm inside the cab and 2) apply the brakes before hitting an immovable object if the driver doesn't react. I kinda like the expanding steering wheel and the self-foaming cars in Demolition Man. Foam instead of air bags is safer on the ears, so I hope there is research in that direction, DM tech being SciFi.
Koolski
To me, if I have to 1) pay attention and 2) have my hands on the steering wheel, then it's useless. I'll just drive 100% by myself. Of course, I've been driving for 35 years, so giving up control to a computer, even though I'm a software engineer, will be a tough sell.
T N Args
I believe Ford has a stated ambition to release for sale by 2022 a car that has no steering wheel, no accelerator pedal and no brake pedal. Let that put into context these 'early days' issues.
Dennis Taylor
NTSB said nothing about vehicle speed. The data record from the Tesla showed that the cruise set speed (Autopilot reference speed) was 74 mph. U.S. Route 27 speed limit at the crash site is 65 mph. Wouldn't any state trooper cite "excessive speed" as a factor in the crash? Follow-up question: why should Tesla (or any other automaker) allow the vehicle speed to exceed the posted speed limit - especially near an uncontrolled intersection on an unlimited access highway?
Nik
Vehicles are lethal weapons, as has been demonstrated recently by terrorists. Letting them loose, without human control..... well, you get what comes!