Drivers just can't seem to understand that Teslas are not capable of driving themselves. One after another, they keep engaging Autopilot and then falling asleep – or passing out – behind the wheel, seemingly confident that the car will get them where they need to go in one piece.
But it seems the drivers aren't the only ones to blame.
On Thursday, the Insurance Institute for Highway Safety (IIHS) published a study focused on automated driving systems, including Tesla's Autopilot – and based on the research, which specifically called out Tesla, carmakers are at least partially responsible for the disconnect between what drivers think these systems can do and what they can actually do.
The takeaway: the names carmakers give their automated driving systems are causing confusion about the systems' actual abilities.
"Despite the limitations of today's systems, some of their names seem to overpromise," the IIHS wrote, noting that its survey found that "one name in particular" – Autopilot – stood out as especially misleading.
Meanwhile, twice as many respondents – a full 6 percent – thought it would be okay to take a nap with a system called "Autopilot" engaged, compared to the other systems, which drew just 3 percent.
Even the most advanced of today's automated driving systems are only capable of Level 2 autonomy – meaning the driver must remain attentive and ready to take over at all times – so we still have a long road ahead before cars can truly drive themselves.
If we want to safely take advantage of the technology as it stands, we clearly need to find some way to better educate the driving public about the abilities and limitations of current self-driving systems – and based on the results of this study, a good place to start might be regulating what carmakers can and cannot call them.
READ MORE: New studies highlight driver confusion about automated systems [The Insurance Institute for Highway Safety]
More on Autopilot: Tesla Crash Drivers Are Confused by "Autonomous" vs. "Autopilot"