
Tesla Apologists: The judge/jury agreed that Tesla was "Full Self Driving" all the way to the scene of the crash.



If I read the article it says autopilot, not FSD.

> If I read the article it says autopilot, not FSD.

What's the difference? And does it matter?

Both are misleadingly named, per the OP:

> In December 2025, a California judge ruled that Tesla’s use of “Autopilot” in its marketing was misleading and violated state law, calling “Full Self-Driving” a name that is “actually, unambiguously false.”

> Just this week, Tesla avoided a 30-day California sales suspension only by agreeing to drop the “Autopilot” branding entirely. Tesla has since discontinued Autopilot as a standalone product in the U.S. and Canada.

> This lends weight to one of the main arguments used in lawsuits since the landmark case: Tesla has been misleading customers into thinking that its driver assist features (Autopilot and FSD) are more capable than they are – leading drivers to pay less attention.


Autopilot is similar to cruise control that is aware of other cars, and lane keeping. I would fully expect the sort of accident that happened to happen (drop phone, stop controlling vehicle, it continues through an intersection).

FSD has much more sophisticated features, explicitly handling traffic stops and lights. I would not expect the sort of accident to happen with FSD.

The fact that Tesla misleads consumers is a different issue from Autopilot and FSD being different.


> Autopilot is similar to cruise control that is aware of other cars, and lane keeping.

Thanks for explaining why labeling it "Autopilot" is misleading and deceptive.


Is anyone actually being deceived? When you’re buying a Tesla, they definitely carefully explain these options to you.

This is not even funny anymore. You reap what you sow.

> FSD has much more sophisticated features, explicitly handling traffic stops and lights. I would not expect the sort of accident to happen with FSD.

FSD at one point had settings for whether it could roll through stop signs, or how much it could exceed the speed limit by. I've watched it interpret a railroad crossing as a weirdly malfunctioning red light with a convoy of intermittent trucks rolling by. It took the clearly delineated lanes of a roundabout as mere suggestions and has tried to barrel through them in a straight line.

I'd love to know where your confidence stems from.


My confidence comes only from what I hear people doing with the system. I have zero experience with it and consider most of the PR from Tesla to be junk.

"would not expect" is the way a cautious person demonstrates a lack of confidence.


I remember having this argument with a friend.

My argument was that the idea that the name Autopilot is misleading comes not from Tesla naming it wrong, but from what most people think "autopilots" on an aircraft do. (And that is probably good enough to argue in court: it doesn't matter what's factually correct, it matters what people understand based on their knowledge.)

Autopilot on a Tesla historically did two things - traffic aware cruise control (keeps a gap from the car in front of you) and stays in its lane. If you tell it to, it can suggest and change lanes. In some cases, it'll also take an exit ramp. (which was called Navigate on Autopilot)

Autopilots on planes roughly do the same. They keep speed and heading, and will also change heading to follow a GPS flight plan. Pilots still take off and land the plane, just as Tesla drivers still get the car onto and off the highway.

Full Self Driving (to which they've now added the word "Supervised," probably because of the court cases, though it was always quite obviously supervised: you had to keep shaking the steering wheel to prove you were alert, same as with Autopilot, by the way) is a different AI model that even stops at traffic lights, navigates parking lots, everything. That's the true "summon my car from LA to NY" dream, at least.

So to answer your question, "What's the difference" – it's huge. And I think they've covered that in earlier court cases.

But one could argue that maybe they should've restricted it to highways only (fewer traffic lights, no intersections). I don't know the details of each recent crash, though.


Autopilots do a lot more than that because flying an aircraft safely is a lot more complicated than turning a steering wheel left and right and accelerating or braking.

Tesla’s Autopilot being unable to swap from one road to another makes it way less capable than decades-old civilian autopilots, which will get you to any arbitrary location as long as you have fuel. Calling the current FSD "Autopilot" would be overstating its capabilities, but reasonably fitting.


> "Autopilots do a lot more than that because flying an aircraft safely is a lot more complicated than turning a steering wheel left and right and accelerating or braking."

Can you elaborate? My knowledge is very limited, but the very real airplane autopilots in little Cessnas and Pipers are in fact far simpler than cars: they are a simple control feedback loop that maintains altitude and heading, and that's it. You can crash into the ground, a mountain, or other traffic quite cheerfully. I would not be surprised to find that adaptive cruise in cars is a far more complex system than a basic aircraft "autopilot".
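For what it's worth, the kind of feedback loop described here fits in a few lines. This is a toy illustration only, not real avionics code: the PD gains, the one-second timestep, and the crude "aircraft" dynamics are all invented for the example.

```python
# Toy altitude-hold feedback loop, in the spirit of the simple Cessna/Piper
# autopilots described above. Gains, limits, and the simulated "aircraft"
# dynamics are invented for illustration; real avionics are certified systems.

def altitude_hold_step(target_alt, current_alt, climb_rate, kp=0.02, kd=0.5):
    """Return an elevator command in [-1, 1] from a simple PD feedback law."""
    error = target_alt - current_alt
    command = kp * error - kd * climb_rate   # proportional term + rate damping
    return max(-1.0, min(1.0, command))      # clamp to available control authority

# Crude simulation: the elevator command changes the climb rate, and the
# climb rate changes the altitude, once per one-second step.
alt, rate = 9500.0, 0.0
for _ in range(600):
    cmd = altitude_hold_step(10_000.0, alt, rate)
    rate += 2.0 * cmd   # command accelerates the climb or descent
    alt += rate

print(round(alt))  # → 10000: the loop settles at the commanded altitude
```

Note there is nothing in the loop about terrain, traffic, or anything else outside its two sensor inputs, which is exactly the "crash into a mountain quite cheerfully" point.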


Doesn’t basic airplane autopilot just maintain flight level, speed, and heading? What are some other things it can do?

Recovering from upsets is the big thing. Maintaining flight level, speed, and heading while upside down isn’t acceptable.

Levels of safety are another consideration: car autopilots don’t use multiple levels of redundancy on everything because cars can stop without falling out of the sky.


That's still massively simpler than making a self-driving car.

It's trivially easy to fly a plane in straight level flight, to the extent that you don't actually need any automation at all to do it. You simply trim the aircraft to fly in the attitude you want and over a reasonable timescale it will do just that.


> It's trivially easy to fly a plane in straight level flight, to the extent that you don't actually need any automation at all to do it. You simply trim the aircraft to fly in the attitude

That seemingly shifts the difficulty from the autopilot to the airframe. But that’s not actually good enough: it doesn’t keep an aircraft flying when it’s missing a large chunk of wing, for example. https://taskandpurpose.com/tech-tactics/1983-negev-mid-air-c...

Instead, you’re talking about the happy path, and if we accept the happy path as enough, there are weekend-project equivalents of self-driving cars built with minimal effort. Being production-worthy is about more than being occasionally useful, however.

Autopilot is difficult because you need to do several things well or people will definitely die. Self-driving cars are far more forgiving of occasional mistakes, but again it’s the "or people die" bits that make it difficult. Tesla isn’t actually ahead of the game; they are just willing to take more risks with their customers’ and the general public’s lives.


> Self driving cars are far more forgiving of occasional mistakes

I would say not, no.

It's almost impossible to crash a plane. There's nothing to hit except the ground, and you stay away from that unless you really really mean to get close.

It's very easy to crash a car, and if you do that most of the time you'll kill people outside the car, often quite a lot of them.

There are no production aircraft fitted with autopilots that can correct for breaking a wing off.


Autopilots have contributed to a significant number of crashes and that’s with a very safety conscious industry.

In a hypothetical Tesla-style "let’s take more risk" approach, buggy autopilots can surprisingly quickly get into a situation at cruising altitude that isn’t recoverable before hitting the ground. Asking "what is the worst possible thing an autopilot could do in this situation?" is eye-opening here.

> There are no production aircraft fitted with autopilots that can correct for breaking a wing off.

That was a production aircraft still in service. https://simpleflying.com/how-many-f-15-eagles-are-still-in-s...

Granted, that specific case depends on the aircraft being a lifting body etc., so it obviously doesn’t extend to commercial aviation. But my point was that a lack of aerodynamic stability on its own doesn’t make giving up OK.


> Autopilots have contributed to a significant number of crashes and that’s with a very safety conscious industry.

"Contributed to", in the sense that the pilots decided to just blindly trust the autopilot and let it make a developing situation worse rather than, oh I don't know, maybe FLYING THE DAMN PLANE.

> buggy autopilots can surprisingly quickly get into a situation at cruising altitude which isn’t recoverable before hitting the ground

If you allow the autopilot to fly the plane into the ground, yes. If you're paying attention you ought to be able to recover just about anything, if most of the plane is still working. The vast majority of incidents where aircraft have departed controlled flight and crashed are because the pilots lost sight of the important thing - FLYING THE DAMN PLANE.

> But my point was that a lack of aerodynamic stability on its own doesn’t make giving up OK.

It's got nothing to do with aerodynamic stability. If you adjust the steering and suspension in a car correctly, it'll drive in a perfectly straight line with no user input for a surprisingly long way. With modern electronic power steering and throttle-by-wire systems it's actually surprisingly easy to turn an off-the-shelf car (even something cheap, secondhand, and quite old like a 2010s Vauxhall Corsa) into a simple line-following robot like we used to build at uni in the 80s and 90s in robotics class. Sure, you need a disused aerodrome to play with it, but it'll work.
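The line-follower described here can be sketched the same way: steer in proportion to the lateral offset from the line. This is a toy kinematic sketch under invented gains, a made-up 2.5 m wheelbase, and a crude simulated vehicle, not anything resembling a production system.

```python
import math

# Toy line-following steering loop: steer in proportion to the lateral
# offset from the line, plus a heading-error damping term. Gains, steering
# clamp, wheelbase, and the simulated vehicle are invented for illustration.

def steering_command(offset_m, heading_err, kp=1.5, kh=2.0):
    """Steering angle in radians, clamped to +/-0.5 rad."""
    return max(-0.5, min(0.5, -kp * offset_m - kh * heading_err))

# Crude bicycle-model simulation: the car starts 1 m to the side of the line.
offset, heading = 1.0, 0.0
speed, wheelbase, dt = 5.0, 2.5, 0.05
for _ in range(400):
    steer = steering_command(offset, heading)
    heading += speed * math.tan(steer) / wheelbase * dt
    offset += speed * math.sin(heading) * dt

print(abs(offset) < 0.05)  # → True: the car converges back onto the line
```

Like the disused-aerodrome caveat above, the sketch assumes the line is already known and the road is empty; everything hard about self-driving lives outside this loop.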

There is the far greater problem that self-driving cars have to cope with a far more rapidly changing environment than an aircraft. A self-flying plane would be far easier to get right than a self-driving car.

A human driver can't just react, painfully slowly, in the way that current "self-driving" cars do, they have to anticipate and be "reacting" before the problem even begins to start. You do it yourself, even if you don't realise it. You hang back from that car because you know they're going to - there, right across two lanes, not so much as a glance in their mirror, what did I tell you? - they're going to do something boneheaded. That car's just pulled in, the passenger in the back is about to open their door right into your - nicely done, you moved out to the line and missed them by 50cm at least.

Self-driving cars can't do that, and probably never will. Self-flying aircraft won't need to do that.

And an autopilot is a surprisingly simple device that responds in simple and predictable ways to sensor inputs.


> "Contributed to", in the sense that the pilots decided to just blindly trust the autopilot and let it make a developing situation worse rather than, oh I don't know, maybe FLYING THE DAMN PLANE.

Excuses don’t save lives. You can’t trust pilots or drivers to always make the correct decision instantly. Any system designed in such a manner will get people killed.

> If you allow the autopilot to fly the plane into the ground, yes.

Things can be unrecoverable a full minute before impact. There are some seriously harrowing NTSB reports, and that’s just what has already happened; the possible failure modes are practically endless.


> Excuses don’t save lives. You can’t trust pilots or drivers to always make the correct decision instantly. Any system designed in such a manner will get people killed.

Okay, so what's your answer? Stick yet another computer in to go wrong and fly the plane into the ground when it gets the wrong idea about a situation? Add yet more sensors to the car to prevent the driver steering away from an obstacle because it thinks they're not using their indicators yet?

> Things can be unrecoverable a full minute before impact.

Can you find an example of one that isn't down to gross mechanical failure, or just plain Operator Idiocy?


> Okay, so what's your answer? Stick yet another computer in to go wrong and fly the plane into the ground when it gets the wrong idea about a situation? Add yet more sensors to the car to prevent the driver steering away from an obstacle because it thinks they're not using their indicators yet?

I’m not condemning the airline industry here, the safety conscious approach has done a good job over time especially in terms of redundancy. A major area of improvement is the way autopilots are communicating with pilots, but that’s a hard process.

The car industry isn’t doing nearly as well in terms of redundancy etc so there’s many obvious areas of improvement through solid engineering without changing anything fundamental. That said, communication is again lacking.

> Can you find an example of one that isn't down to gross mechanical failure, or just plain Operator Idiocy?

Operator Idiocy isn’t some clearly defined line; an aircraft with a moderate fuel leak can look like idiocy after the fact, but it’s an easy mistake to make. That’s exactly the kind of thing autopilots could catch, not just from fuel sensors but from how the flight characteristics change as the aircraft gets lighter, yet aircraft have happily flown into trouble over the ocean.


Airplane "autoland" goes back a ways:

https://en.wikipedia.org/wiki/Autoland


"Full Self Driving" hadn't even been released at the time of the crash.

Well, the other person in the comments said the guy literally held the accelerator to the floor the entire time. Is that actually a reasonable standard, or are you preemptively out for blood because you would never let reality get in the way of a good agenda? Ironic, given that you go out of your way to accuse others of this. Methinks thou dost protest too much?

The article says 'he dropped his phone and bent down to retrieve it'

And that’s somehow okay while you’re driving? I didn’t realize there was a "but if you drop your phone, it’s okay to cause a crash" clause in the statutes.

Of course not. I was simply reacting to the comments saying he kept his foot on the gas pedal, meaning he intended to beat/run the light.

Reaching down to grab your phone is stupid and clearly negligent, but quite different from making the decision to intentionally run a light.


Hope he sees this, bro



