Since Tesla launched its Full Self-Driving (FSD) feature in beta in 2020, the company’s owner’s manual has been clear: Contrary to the name, cars using the feature can’t drive themselves.

Tesla’s driver assistance system is built to handle plenty of road situations—stopping at stop lights, changing lanes, steering, braking, turning. Still, “Full Self-Driving (Supervised) requires you to pay attention to the road and be ready to take over at all times,” the manual states. “Failure to follow these instructions could cause damage, serious injury or death.”

Now, however, new in-car messaging urges drivers who are drifting between lanes or feeling drowsy to turn on FSD, a prompt that experts say could confuse drivers and encourage them to use the feature in an unsafe way. “Lane drift detected. Let FSD assist so you can stay focused,” reads the first message, which was included in a software update and spotted earlier this month by a hacker who tracks Tesla development.

“Drowsiness detected. Stay focused with FSD,” reads the other message. Online, drivers have since posted that they’ve seen similar messages on their in-car screens. Tesla did not respond to a request for comment, and WIRED has not been able to independently confirm the message appearing on a Tesla in-car screen.

The problem, researchers say, is that moments of driver inattention are exactly when safety-minded driver assistance features should demand drivers get ultra-focused on the road—not suggest they depend on a developing system to compensate for their distraction or fatigue. At worst, such a prompt could lead to a crash.

“This messaging puts the drivers in a very difficult situation,” says Alexandra Mueller, a senior research scientist at the Insurance Institute for Highway Safety who studies driver assistance technologies. She believes that “Tesla is basically giving a series of conflicting instructions.”

Plenty of research examines how humans interact with computer systems built to help them accomplish tasks, and it generally finds the same thing: people are terrible passive supervisors of systems that are pretty good most of the time, but not perfect. Humans need something to keep them engaged.

Aviation researchers call this the “out-of-the-loop performance problem”: pilots relying on fully automated systems can fail to adequately monitor for malfunctions because complacency sets in after extended periods of operation. This lack of active engagement, also known as vigilance decrement, can reduce a pilot’s ability to understand and regain control of a malfunctioning automated system.

“When you suspect the driver is becoming drowsy, to remove even more of their physical engagement—that seems extremely counterproductive,” Mueller says.

“As humans, as we get tired or we get fatigued, taking away more things that we need to do could actually backfire,” says Charlie Klauer, a research scientist and engineer who studies drivers and driving performance at the Virginia Tech Transportation Institute. “It’s tricky.”

Over the years, Tesla has made changes to its technology to make it more difficult for inattentive drivers to use FSD. In 2021, the automaker began using in-car driver monitoring cameras to determine whether drivers were paying sufficient attention while using FSD; a series of alerts warns drivers if they’re not looking at the road. Tesla also uses a “strike system” that can lock drivers out of the driver assistance feature for a week if they repeatedly fail to respond to its prompts.

