Autopilot “is not a self-driving technology and does not replace the driver,” Tesla said in response to a 2020 case filed in Florida. “The driver can and must still brake, accelerate and steer just as if the system is not engaged.”
Tesla’s terminology is so confusing. If “Autopilot” isn’t self-driving technology, does that mean it’s different from “Full Self Driving”? And if so, is “Full Self Driving” also not a self-driving technology?
I heard Elon Musk call it “assisted full self driving,” which doesn’t make any sense. LOL
it’s called “Full Self Driving (Supervised)” now
The term autopilot comes from aviation, where the only kind of problem resolution an autopilot does is turning itself off.
Other than that, it just flies from checkpoint to checkpoint.
If only we could implement testing protocols similar to the aviation version to validate its safety!
A full NTSB investigation for every single crash? I’m all for it!
Autopilot is a more basic driver assist system than FSD. FSD is what will eventually become what the name suggests, but it’s obviously not there yet and everyone knows this. It’s just the name of the system.
FSD is just a lie because it’s a description of a product they intend to develop, not something that exists on the car you are buying now.
What’s your definition of self-driving system then if the current one doesn’t qualify?
The one where Tesla is responsible if there is an accident (but this user blocks people critical of Tesla, so probably won’t see this message).
Specifically, Autopilot is lane keeping and traffic-aware cruise control (it will slow down if you’re going faster than the car in front). FSD adds automatic lane changes (it can do them by itself, or the driver can initiate them with the turn signals) and makes the turns necessary to follow navigation. It does a pretty decent job on freeways.
What they are working on now is getting FSD to work better on city streets and secondary highways.
You can’t call something Full Self Driving or Autopilot and then blame the driver. If you want to blame the driver, then call it driver assist.
Right! That’s why you have the FSD turn it over to the driver the moment a crash is unavoidable to make the driver liable.
“at the time of the crash, the driver was in full control”
(but not a couple seconds before)
I think Tesla should rename Auto Pilot to Darwin Award Mode.
And improve motorcycle detection as well as use LIDAR.
It’s not that Teslas are killing their owners. Teslas are killing first responders to road accidents, kids getting off buses and motorcyclists. We’re all exposed to the problems caused by Musk cutting out testing to save some money.
The customers pay extra in order to be beta testers. Best deal ever!
That’s just the price we have to pay for this wonderful capitalist system. Worth it!
I like calling it cruise control with extra fatalities.
You’re also responsible for what you do when you’re drunk! Guess what. You cannot purchase ethical excuses. That’s YOUR Tesla. You own it. You’re in charge of it regardless of whether or not Tesla makes it impossible to access the controls.
Buyer beware. Stop buying proprietary garbage, ya idiots.
I would say it depends. If the user was using the feature correctly, then Tesla should have some liability.
In most of the crashes I’ve seen, the people were not using the feature correctly.