Tesla Autopilot Crash: What Needs Fixing

By Junko Yoshida, EE Times

The U.S. NTSB dissects March 23 Tesla crash, uncovers dilemma

The National Transportation Safety Board’s widely anticipated preliminary report on the crash of a 2017 Tesla Model X on March 23 in California is out.

The review of the Mountain View accident remains “preliminary,” and the NTSB offers neither a probable cause nor any recommended fixes. Even so, the play-by-play chronology of the moments leading up to the crash, compiled by the NTSB, is riveting. Based on the performance data downloaded from the vehicle, here’s what the safety board wrote in its report:

  • The Autopilot system was engaged on four separate occasions during the 32-minute trip, including continuous operation for the last 18 minutes 55 seconds prior to the crash.
  • During this almost 19-minute segment, the vehicle provided two visual alerts and one auditory alert for the driver to take the steering wheel. These alerts came more than 15 minutes before the crash.
  • In the 60 seconds prior to the crash, the driver’s hands were detected on the steering wheel on three separate occasions for a total of 34 seconds. For the last six seconds before the crash, the vehicle didn’t detect any hands on the wheel.
  • At eight seconds before impact, the Tesla was behind a lead vehicle, traveling about 65 mph.
  • Seven seconds before the crash, the Tesla began to steer left, following the lead vehicle.
  • At four seconds before the crash, the Tesla was no longer following a lead vehicle.
  • In the three seconds before impact with the crash attenuator, the Tesla sped up from 62 mph to 70.8 mph. No pre-crash braking or evasive steering was recorded.

With this record, the NTSB’s preliminary report unveils new details, including exactly when Autopilot was turned on, and how often and for how long the driver’s hands were on the wheel in the minute before the crash.

The industry is already abuzz with Monday-morning quarterbacking on this. Among the questions raised: Was Tesla’s driver monitoring system adequate to ensure the driver’s proper use of the Autopilot system? Why was the vehicle silent while driving straight into a concrete median barrier? Did the car’s forward radar notice anything amiss while careening into the barrier?

EE Times enlisted the help of Phil Magney, founder of VSI Labs, to break this all down. VSI Labs also suggested a few real solutions to some of the problems it observed.

Why a 2-minute-plus grace period?
While pointing out that “the driver misused the system,” Magney also blamed the accident on “the liberal grace period” that Tesla permits before a disengagement.

Asked to define “grace period,” he explained, “You get about two minutes from the time Autopilot is engaged until the system starts prompting you with warnings to grab the wheel.”

He added, “If you do not grab the wheel, the alerts get more pronounced until the system eventually disengages and you are presented with a message that says you may no longer use Autopilot for the duration of the trip.”

So, the question here is: why would the Autopilot system give the driver as long as two minutes to stay hands-free in the first place?

Magney suggests the real solution: “Take out the grace period and require the torque-sensing steering wheel to require hands on from the moment (or within a few seconds) of Autopilot engagement.”
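To illustrate what Magney is proposing, here is a minimal sketch in Python of an escalating hands-on-wheel monitor with no start-up grace period. The thresholds, names, and structure are hypothetical assumptions for illustration, not Tesla's actual implementation.

    from dataclasses import dataclass

    # Hypothetical thresholds (seconds) -- not Tesla's actual values.
    VISUAL_ALERT_S = 5.0    # prompt the driver almost immediately
    AUDIBLE_ALERT_S = 10.0  # escalate to an audible warning
    DISENGAGE_S = 15.0      # lock out Autopilot for the rest of the trip

    @dataclass
    class HandsOnMonitor:
        """Escalating driver-engagement check that starts at engagement."""
        time_since_hands_on: float = 0.0
        locked_out: bool = False

        def update(self, dt: float, torque_detected: bool) -> str:
            """Advance the monitor by dt seconds and return the required action."""
            if self.locked_out:
                return "autopilot_unavailable"
            if torque_detected:
                self.time_since_hands_on = 0.0  # driver is holding the wheel
                return "ok"
            self.time_since_hands_on += dt
            if self.time_since_hands_on >= DISENGAGE_S:
                self.locked_out = True          # no Autopilot for the rest of the trip
                return "disengage"
            if self.time_since_hands_on >= AUDIBLE_ALERT_S:
                return "audible_alert"
            if self.time_since_hands_on >= VISUAL_ALERT_S:
                return "visual_alert"
            return "ok"

    # Example: with no torque detected, the driver is prompted within seconds.
    monitor = HandsOnMonitor()
    print(monitor.update(dt=6.0, torque_detected=False))  # -> "visual_alert"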

Sensors aren’t perfect
Also, let’s not forget that sensors such as cameras and radar aren’t infallible.

“The camera/radar solution will get into trouble from time to time,” Magney acknowledged. “This is why the driver must stay in the loop.”

What sort of trouble do they get into?

 “The system relies exclusively on the camera for lane keeping. When lines are out of the ordinary the system can easily get confused.” In this case, Magney explained, “The vehicle misinterpreted the lines, and got caught between the two lanes where it hit the barrier.”

Asked how exactly the two lanes confused the camera, Magney explained, “Basically you had one lane that splits in two. The Tesla got confused and thought the area between the two lanes was in fact another lane. I suspect the lane markings were messed up or not properly applied.”

In the Mountain View accident, a likely contributing factor is the change in road surface: the dark surface is asphalt, while the light surface is concrete. Autopilot may have misinterpreted the change of surfaces as a lane line, leading to the improper trajectory. If Autopilot had a lane model and were localizing against that lane model, this type of accident could be prevented. This is why Autopilot (and any other L2 system) requires constant driver attention and engagement. (Source: VSI Labs)

Indeed, the NTSB’s preliminary report noted:

As the Tesla approached the paved gore area dividing the main travel lanes of US-101 from the SH-85 exit ramp, it moved to the left and entered the gore area. The Tesla continued traveling through the gore area and struck a previously damaged crash attenuator at a speed of about 71 mph.

In a footnote, the NTSB described the “gore area” as “a triangular-shaped boundary created by white lines marking an area of pavement formed by the convergence or divergence of a mainline travel lane and an exit/entrance lane.”

Sensors don’t know right from wrong
Radars aren’t perfect, either. Magney added, “Radar does a poor job on static objects. It has to filter out most of them, because if it did not [filter them out], there would be too many false positives. This creates hazards.”
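To make that filtering point concrete, here is a minimal sketch in Python of one common heuristic (the numbers and function name are made up for illustration): returns whose range rate roughly cancels the ego vehicle's speed look stationary relative to the ground and are suppressed, and a fixed crash attenuator falls into exactly that bucket.

    def filter_stationary_returns(returns, ego_speed_mps, tol_mps=1.0):
        """Drop radar returns that appear stationary relative to the ground.

        Each return is (range_m, range_rate_mps). A fixed object straight
        ahead closes at roughly -ego_speed_mps, so signs, overpasses and
        barriers all look alike here, which is why many systems suppress
        them to avoid a flood of false positives.
        """
        kept = []
        for rng, range_rate in returns:
            ground_speed = range_rate + ego_speed_mps  # ~0 for stationary objects
            if abs(ground_speed) > tol_mps:
                kept.append((rng, range_rate))         # keep only moving objects
        return kept

    # Example: ego at 29 m/s (~65 mph); a slower lead car and a fixed barrier.
    returns = [(40.0, -2.0),    # lead car closing slowly -> kept
               (80.0, -29.0)]   # stationary barrier -> filtered out
    print(filter_stationary_returns(returns, ego_speed_mps=29.0))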

Magney concluded: “The camera based lane-keep algorithms really don’t know the difference between right and wrong. No ground truth to go by.”

So, the real solution? “Use a map based localization method so the vehicle understands the correct lanes and proper trajectories,” according to Magney. “Again, a map based localization would substantially [keep] this from happening.”
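As a rough illustration of that idea, here is a minimal sketch in Python of a map-based sanity check. The lane geometry, coordinates, and thresholds are invented for the example; the point is only that a camera-derived lane center which matches no mapped lane (such as the gore area between two diverging lanes) should not be followed.

    import math

    # Hypothetical HD-map lane model: centerline points (x, y) in metres.
    MAPPED_LANES = {
        "US-101 lane 1": [(0.0, 0.0), (50.0, 0.0), (100.0, 0.0)],
        "SH-85 exit":    [(0.0, 3.7), (50.0, 5.5), (100.0, 9.0)],
    }

    def lateral_offset(point, centerline):
        """Distance (m) from a point to the nearest sampled centerline point."""
        return min(math.dist(point, p) for p in centerline)

    def check_against_map(camera_lane_center, max_offset_m=1.0):
        """Return the mapped lane the camera's lane estimate agrees with, or None.

        If the camera-derived lane center matches no mapped lane, the system
        should fall back to the map or hand control to the driver rather than
        follow the camera.
        """
        name, centerline = min(MAPPED_LANES.items(),
                               key=lambda kv: lateral_offset(camera_lane_center, kv[1]))
        if lateral_offset(camera_lane_center, centerline) <= max_offset_m:
            return name
        return None

    # Camera estimate drifting into the gore area, between the two mapped lanes:
    print(check_against_map((50.0, 2.8)))  # -> None: disagreement, don't follow it

In a production system the check would be keyed off a fused GPS/IMU pose and a far denser lane model, but the principle is the same: the map supplies the ground truth the camera alone lacks.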

Magney remains optimistic, noting that these accidents are “addressable.” His first suggested measure, “to take out the grace period and require the torque sensing steering wheel,” should be relatively easy, he said. The second suggestion, “a map-based localization method,” might “require a bit more effort but it is feasible,” said Magney.

In summary, Magney noted that both methods could be “enabled with software updates,” and he would “expect to see them in Tesla vehicles within the next year.” 

Tesla is consistently vocal about the safety of its vehicles. However, as a company spokeswoman noted in March in the company’s blog, “Tesla Autopilot does not prevent all accidents — such a standard would be impossible — but it makes them much less likely to occur.” 

The company stressed that [Autopilot] “unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.”

Nevertheless, drivers’ misuse of Tesla’s Autopilot system has caused at least one too many fatal accidents.

Tesla has yet to respond to the NTSB preliminary report. But the world is watching.

— Junko Yoshida, Chief International Correspondent, EE Times
