A Tesla Model S that crashed into a fire truck. Photo: South Jordan, Utah, Police Department

As more vehicles debut driver-assistance features that automatically accelerate, brake, and steer, another crash has drawn attention to how these new technologies work—and in what situations they may fail.

Drivers should not view driver-assistance systems as "self-driving," and automakers should take care not to brand them as such, according to Consumer Reports experts.

"It's important that drivers realize the limitations of the new technology and not be overconfident," says Kelly Funkhouser, program manager for vehicle interface at CR.

The latest crash took place earlier this week in Utah, when a Tesla Model S driver who said she was using Tesla's Autopilot crashed into the rear of a fire truck that was stopped for a red light.

Autopilot is a suite of features that can help steer and brake, and is not meant to be a replacement for an attentive driver.

According to a statement from the South Jordan, Utah, Police Department, the driver "admitted that she was looking at her phone prior to the collision," and witnesses said she "did not brake or take any action to avoid the collision." The Model S was traveling at 60 mph before the crash, police said.

The driver of the Model S, identified only as a 28-year-old woman from Lehi, Utah, had minor injuries. The driver of the fire truck was not injured.

"This is a problem that's not unique to Tesla," Funkhouser says. "Many drivers may be surprised to find out that these systems aren't designed to completely stop from 60 mph when facing a stationary or stopped car." Indeed, the owner's manuals for driver-assist systems such as Cadillac Super Cruise, Tesla Autopilot, and Volvo Pilot Assist II all state that the systems may not be able to avoid stationary vehicles or obstacles.

On May 16, South Jordan police released a follow-up statement that included information that Tesla technicians recovered from the vehicle and shared with investigators. About 1 minute and 22 seconds before the crash, the driver "re-enabled Autosteer and Cruise Control, and then, within two seconds, took her hands off the steering wheel again," the report stated. "She did not touch the steering wheel for the next 80 seconds until the crash happened."

Additionally, the driver had her hands off the wheel more than a dozen times in the drive cycle before the crash. "On two such occasions, she had her hands off the wheel for more than one minute each time and her hands came back on only after a visual alert was provided," the report said. "Each time she put her hands back on the wheel, she took them back off the wheel after a few seconds."

The report shows that requiring drivers to keep their hands on the steering wheel is an insufficient way to confirm that they're paying attention, says CR's Funkhouser.

"The design of the Autopilot system almost encourages the driver to keep their hands off the wheel lest they accidentally bump the steering wheel too hard, which instantly turns off Autopilot," she said.

Funkhouser also points to the many how-to videos and ready-made devices designed to trick Autopilot into thinking that a driver's hands are on the wheel, including "Autopilot Buddy," a $179 device that attaches to a Tesla's steering wheel.

"If people can so easily find a way to bypass that system then there is a serious problem with that method," Funkhouser said.

However, driver monitoring may be able to reduce distraction while driver-assistance technology is in use. "Monitoring driver behavior through eye tracking or other biometrics may be a partial solution to the issue," she says.

That's what Cadillac's Super Cruise system does. Like Autopilot, it can steer and brake. But unlike Autopilot, Super Cruise uses sensors to track where the driver's eyes are looking, and will give audible and visual alerts if it detects that the driver isn't paying attention.

On Twitter on Monday, Tesla CEO Elon Musk responded to a Wall Street Journal report saying that Tesla chose not to include eye-tracking technology in its vehicles because the technology cost too much and would not be effective.

"Eyetracking rejected for being ineffective, not for cost," Musk tweeted.

Tesla's Autopilot gives audible and visual warnings if it does not detect a driver's hands on the wheel, and it shuts the system off after too many warnings go unheeded. While the Model S owner's manual says drivers should not use Autopilot on city streets, the car does not prevent them from doing so. By comparison, Cadillac's Super Cruise uses map data to determine what kind of road the car is on and will work only on a divided, limited-access highway.
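
To make the contrast between these two approaches concrete, here is a minimal sketch of an engagement monitor that combines a hands-off timer with escalating warnings and map-based road gating. Everything in it is assumed for illustration: the class name, the 60-second timer, the three-warning limit, and the status strings are all invented, and neither Tesla nor Cadillac has published its actual logic.

```python
# Hypothetical sketch only: a simplified model of the two engagement
# checks described above. Not either automaker's actual code; all
# thresholds and names are invented for illustration.

from dataclasses import dataclass, field


@dataclass
class EngagementMonitor:
    hands_off_limit_s: float = 60.0  # assumed seconds allowed hands-off
    max_warnings: int = 3            # assumed warnings before lockout
    warnings: int = field(default=0, init=False)
    hands_off_s: float = field(default=0.0, init=False)
    active: bool = field(default=True, init=False)

    def update(self, dt_s: float, hands_on_wheel: bool,
               on_divided_highway: bool) -> str:
        """Advance the monitor by dt_s seconds; return a status string."""
        if not self.active:
            return "disengaged"
        # Map-based gating, as the article describes for Super Cruise:
        # refuse to operate off divided, limited-access highways.
        if not on_divided_highway:
            self.active = False
            return "disengaged: unsupported road type"
        if hands_on_wheel:
            self.hands_off_s = 0.0  # wheel contact resets the timer
            return "engaged"
        self.hands_off_s += dt_s
        if self.hands_off_s >= self.hands_off_limit_s:
            self.warnings += 1
            self.hands_off_s = 0.0
            if self.warnings >= self.max_warnings:
                self.active = False  # too many ignored warnings: shut off
                return "disengaged: inattentive driver"
            return "warning: place hands on wheel"
        return "engaged: hands off"
```

Under these invented numbers, the 80 hands-off seconds police described would draw a single warning at the 60-second mark, while an eye-tracking check could notice a driver looking down at a phone within seconds, even with a hand resting on the wheel.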

"These cars can't drive themselves, but it's too easy for consumers to think they can," says William Wallace, senior policy analyst for Consumers Union, the advocacy division of Consumer Reports. "Drivers need to pay attention, and companies need to take responsibility for the fact that greater automation opens the door to dangerous, and foreseeable, distractions. From clearer names and sensible system limitations to effective driver monitoring, there's a lot more automakers can and must do."

There have been two reported fatal crashes in which the driver of a Tesla vehicle was using Autopilot. Joshua Brown died in a May 2016 crash in Florida with Autopilot activated in his 2015 Model S, a crash that led the automaker to release software updates designed to keep drivers more engaged. In March, Wei Huang was using Autopilot when he was killed in a single-vehicle crash into a concrete barrier in California.

From the 'Consumer 101' TV Show

Today's cars come with cutting-edge safety technology that can stop working because of one simple thing: dirt. On the 'Consumer 101' TV show, a Consumer Reports expert shows where these important safety sensors are located on a car and how to keep them clean.

Editor's Note: This article has been updated to include new information released by Tesla and the South Jordan Police Department.