Is Tesla Autopilot Safe? MIT Study Finds Drivers Become Inattentive
A study of Tesla's Autopilot system and its effects on driver behavior has found that the feature makes those at the wheel more inattentive.
The Massachusetts Institute of Technology (MIT) study, published in the journal Accident Analysis & Prevention and conducted over more than a year in the Boston area, examined whether automation changed how drivers looked at the road.
The MIT study examined data from 290 human-initiated Autopilot (AP) disengagements and replicated the observed glance patterns across a range of drivers using Bayesian Generalized Linear Mixed models.
It found that drivers' "off-road glances were longer with AP active than without and that their frequency characteristics changed."
The study concluded: "Visual behavior patterns change before and after AP disengagement. Before disengagement, drivers looked less on road and focused more on non-driving related areas compared to after the transition to manual driving. The higher proportion of off-road glances before disengagement to manual driving were not compensated by longer glances ahead."
TechCrunch noted that, despite its name, Tesla's "Full Self-Driving" (FSD) system is not fully autonomous but rather an advanced driver assistance system (ADAS), and the vehicles still require driver attention.
The study will add to the debate over the impact automation has on road users. In August, the National Highway Traffic Safety Administration (NHTSA) launched an investigation into the company's Autopilot system after Tesla vehicles using it were involved in 11 crashes with parked emergency vehicles.
Most of the crashes took place in conditions of limited visibility, such as after dark or in glaring sunlight, Reuters reported, raising questions about how Autopilot handles challenging driving conditions.
The NHTSA probe, covering an estimated 765,000 vehicles across the Model S, Model X, Model 3 and Model Y ranges, is the agency's farthest-reaching since Tesla introduced the semi-autonomous system.
"We will act when we detect an unreasonable risk to public safety," a NHTSA spokesperson told Reuters last month.

A number of Tesla owners could soon be testing the newest version of the FSD beta software, version 10.0.1, on public roads, TechCrunch reported.
However, Tesla CEO Elon Musk has said not everyone who has paid for the FSD software can access the beta version.
Tesla will use telemetry data to capture personal driving metrics over a seven-day period to ensure drivers remain attentive enough at the wheel, the publication noted.
A number of drivers have reported that cars with the FSD software enabled have had trouble identifying emergency vehicles parked in the road.
In August, Musk acknowledged the safety issues with the FSD software, tweeting: "FSD Beta 9.2 is actually not great imo, but Autopilot/AI team is rallying to improve as fast as possible."
Newsweek has contacted Tesla for comment.