Tesla Autopilot Isn’t Safe. Most Assisted Driving Systems Aren’t Either.
- Partial driving automation systems are falling short on safety, according to a new report.
- Tesla’s Full Self Driving system earned the worst marks, but most of the 14 systems tested scored poorly.
- Lawsuits allege that autonomous and assisted driving systems are responsible for fatal crashes.
You may want to think twice before flipping on your car’s assisted-driving system on the highway.
The vast majority of partial driving automation systems, which typically pair adaptive cruise control with lane-centering steering assistance, earned dismal ratings in a series of tests conducted by the Insurance Institute for Highway Safety, an independent nonprofit research group funded by auto insurers.
“Some drivers may feel that partial automation makes long drives easier, but there is little evidence it makes driving safer,” said IIHS president David Harkey in a statement. “As many high-profile crashes have illustrated, it can introduce new risks when systems lack the appropriate safeguards.”
The institute tested 14 systems, 11 of which received a “poor” overall safety rating. Among those 11 were Tesla’s Autopilot (the EV maker’s more traditional highway system) and its “Full Self Driving” system, which works on city and suburban streets.
The tests found that “Full Self Driving” — a feature that, as of last month, was available in beta to about 400,000 Tesla drivers — was the most unsafe, with only two categories rated “acceptable” and none rated “good.”
The system performed inadequately in driver monitoring, which covers detecting when drivers take their hands off the wheel or their eyes off the road, and in adaptive cruise control, where the tests checked whether cruise control stayed active while the driver’s attention was off the road.
That’s despite Tesla’s in-cabin camera, which is supposed to make sure drivers are paying attention to the road.
Tesla also earned “poor” marks for how it handles cooperative steering (how easily drivers can steer around obstacles or potholes without disengaging the system) and for how it handles the disengagement of safety features.
Tesla did not respond to a request for comment before publication.
Last year, the company recalled over 2 million vehicles due to similar concerns over Autopilot, stemming from an ongoing investigation by the National Highway Traffic Safety Administration. The tests in the safety report were conducted before the recall, CNN reported.
Multiple lawsuits against Tesla claim that Autopilot is responsible for various crashes, some of them fatal.
Despite the controversy surrounding its automated driving features, Tesla was far from the only automaker to receive a dismal safety score.
Automated driving systems from Ford, Genesis, and Mercedes-Benz also received “poor” ratings. Two systems, from Nissan and General Motors, received a “marginal” rating, while Lexus’s Teammate system (available in the 2022-24 LS) was the sole “acceptable” system.
The institute did not assign any system a “good” overall safety score.
A Genesis spokesman said the company was “aware” of its rating and pointed to the company’s record as “among the leaders” in Insurance Institute safety designations. He added that Genesis will introduce an in-cabin camera to its upcoming models.
A spokeswoman for Ford described the automaker’s Blue Cruise technology as a “highly effective driver monitoring system,” adding that “while we do not agree with IIHS’s findings, we will take their feedback into consideration as we continue to evaluate future updates.”
The other companies did not respond to requests for comment before publication.
Despite the safety concerns, self-driving cars are becoming more of a reality. Earlier this month, California regulators approved Waymo, a self-driving taxi company, to expand its operations in Los Angeles and San Francisco.
Waymo, owned by Google parent Alphabet, claims that data shows its vehicles are statistically safer than human drivers.