Regarding the Feb. 17 news article “Musk claims driver in 2022 fatal Tesla crash did not have Full Self-Driving”:
Tesla CEO Elon Musk tweeted about a man who died in a crash while driving a Tesla, “He was not on FSD. The software had unfortunately never been downloaded.” Though there were likely multiple contributing factors to this crash, one that was not mentioned is driver misunderstanding of the capabilities of these advanced technologies. They have been given misleading names, such as Full Self-Driving, which is a component of Tesla’s Autopilot suite of driver-assistance features.
In my mind, an autopilot feature would mean that I could take my attention off the road and let the car take me to my destination automatically. As research by the MIT AgeLab, the Insurance Institute for Highway Safety and others has pointed out, drivers’ understanding of advanced vehicle technology can be influenced by the names these systems are given. “Autopilot” has been criticized as one of the most misleading. Tesla points out that Autopilot is “intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment.” The Tesla Model 3 owner’s manual contains 43 additional warnings and cautions associated with Autopilot technologies. Unfortunately, many drivers feel that life is too short to read their owner’s manual.
Though a better name might not tell drivers all they need to know, a bad one can mislead them about the capabilities and limitations of the technology.
Michael Perel, Annandale