A tweet from Elon Musk indicating that Tesla might allow some owners who are testing a “Full Self-Driving” system to disable an alert that reminds them to keep their hands on the steering wheel has drawn attention from U.S. safety regulators.
The National Highway Traffic Safety Administration says it asked Tesla for more information about the tweet. Last week, the agency said the issue is now part of a broader investigation into at least 14 Teslas that have crashed into emergency vehicles while using the Autopilot driver assist system.
Since 2021, Tesla has been beta-testing “Full Self-Driving” with owners who haven’t been trained on the system but are actively monitored by the company. Earlier this year, Tesla said about 160,000 vehicles, roughly 15% of Teslas on U.S. roads, were participating. A wider rollout of the software was planned for late 2022.
Despite the name, Tesla still says on its website that the cars can’t drive themselves. Teslas using “Full Self-Driving” can navigate roads themselves in many cases, but experts say the system can make mistakes. “We’re not saying it’s quite ready to have no one behind the wheel,” CEO Musk said in October.
On New Year’s Eve, one of Musk’s most ardent fans posted on Twitter that drivers with more than 10,000 miles of “Full Self-Driving” testing should have the option to turn off the “steering wheel nag,” an alert that tells drivers to keep hands on the wheel.
Musk replied: “Agreed, update coming in Jan.”
It’s not clear from the tweets exactly what Tesla will do. But disabling a driver monitoring system on any vehicle that automates speed and steering would pose a danger to other drivers on the road, said Jake Fisher, senior director of auto testing for Consumer Reports.
“Using FSD beta, you’re kind of part of an experiment,” Fisher said. “The problem is the other road users adjacent to you haven’t signed up to be part of that experiment.”
Tesla didn’t respond to a message seeking comment about the tweet or its driver monitoring.
Auto safety advocates and government investigators have long criticized Tesla’s monitoring system as inadequate. Three years ago, the National Transportation Safety Board listed poor monitoring as a contributing factor in a fatal 2018 Tesla crash in California. The board recommended a better system but said Tesla did not respond.
Tesla’s system measures torque on the steering wheel to try to ensure that drivers are paying attention. Many Teslas have cameras that monitor a driver’s gaze. But Fisher says those cameras aren’t infrared like those of some competitors’ driver assistance systems, so they can’t see at night or if a driver is wearing sunglasses.
Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University, argued that Tesla is contradicting itself in a way that could confuse drivers. “They’re trying to make customers happy by letting them take their hands off the wheel, even while the (owner’s) manual says ‘don’t do that.’”
Indeed, Tesla’s website says Autopilot and the more sophisticated “Full Self-Driving” system are intended for use by a “fully attentive driver who has their hands on the wheel and is prepared to take over at any moment.” It says the systems are not fully autonomous.
NHTSA has noted in documents that numerous Tesla crashes have occurred in which drivers had their hands on the wheel but still weren’t paying attention. The agency has said that Autopilot is being used in areas where its capabilities are limited and that many drivers aren’t taking action to avoid crashes despite warnings from the vehicle.
Tesla’s partially automated systems have been under investigation by NHTSA since June 2016, when a driver using Autopilot was killed after his Tesla went under a tractor-trailer crossing its path in Florida. The separate probe into Teslas that were using Autopilot when they crashed into emergency vehicles began in August 2021.
Including the Florida crash, NHTSA has sent investigators to 35 Tesla crashes in which automated systems are suspected of being used. Nineteen people have died in those crashes.
Consumer Reports has tested Tesla’s monitoring system, which changes often with online software updates. Initially, the system didn’t warn a driver whose hands were off the wheel for three minutes. Recently, the warnings have come in as little as 15 seconds. Fisher said he isn’t sure, though, how long a driver’s hands could be off the wheel before the system would slow the car down or shut off completely.
In shutting off the “steering wheel nag,” Fisher said, Tesla could be switching to the camera to monitor drivers, but that’s unclear.
Despite names that imply Autopilot and “Full Self-Driving” can drive themselves, Fisher said, it’s clear that Tesla expects owners to remain drivers. But the NTSB says human drivers can end up dropping their guard and relying too much on the systems while looking elsewhere or doing other tasks.
Those who use “Full Self-Driving,” Fisher said, are likely to be more vigilant in taking control because the system makes mistakes.
“I wouldn’t dream of taking my hands off the wheel using that system, just because it can do things unexpectedly,” he said.
Koopman said he doesn’t see a great safety risk from disabling the steering wheel nag because the Tesla monitoring system is so flawed that disabling it doesn’t necessarily make Teslas any more dangerous.
NHTSA, he said, has enough evidence to take action to force Tesla to install a better monitoring system.
The agency says it doesn’t comment on open investigations.
Tom Krisher, The Associated Press