By Marc Dobin

Introduction: Tesla’s Autonomy Claims vs. Engineering Reality

In the world of autonomous driving, terminology matters. Tesla markets its driver assistance package as “Full Self-Driving” (FSD), which strongly suggests that the vehicle can drive itself. But according to the Society of Automotive Engineers (SAE), which sets the gold standard for defining vehicle automation, Tesla’s system is nowhere near full autonomy. So how does the engineering community view Tesla’s technology, and why is the term “Full Self-Driving” misleading? Let’s unpack this.

SAE Levels 0 Through 5: A Clear Framework

The SAE defines six levels of vehicle automation:

  • Level 0: No automation. The vehicle may assist with warnings or emergency braking, but you are driving.
  • Level 1: A single automated function, such as adaptive cruise control or lane-keeping. The driver must remain fully engaged.
  • Level 2: The vehicle can steer and control speed simultaneously, but the driver must supervise at all times.
  • Level 3: Conditional automation. The system can drive, but the driver must be ready to intervene when prompted.
  • Level 4: High automation. The vehicle can drive itself in limited conditions and doesn’t require human intervention.
  • Level 5: Full automation. The car drives itself under all conditions with no need for pedals or steering wheels.

The SAE makes an important distinction: At Levels 0 through 2, the human is driving. At Level 3 and above, the system is driving.
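
For readers who like things explicit, here is a minimal sketch of that bright line in Python. The level names are my own shorthand, not SAE's official labels:

    from enum import IntEnum

    class SAELevel(IntEnum):
        """SAE J3016 automation levels, 0 through 5 (names are my shorthand)."""
        NO_AUTOMATION = 0           # warnings only; you drive
        DRIVER_ASSISTANCE = 1       # one assist feature at a time
        PARTIAL_AUTOMATION = 2      # steers and controls speed; human supervises
        CONDITIONAL_AUTOMATION = 3  # system drives; human on standby
        HIGH_AUTOMATION = 4         # system drives within set limits
        FULL_AUTOMATION = 5         # system drives everywhere

    def who_is_driving(level: SAELevel) -> str:
        # The SAE dividing line: at Levels 0 through 2 the human is driving;
        # at Level 3 and above, the system is.
        return "human" if level <= SAELevel.PARTIAL_AUTOMATION else "system"

    print(who_is_driving(SAELevel.PARTIAL_AUTOMATION))  # Tesla FSD today -> "human"
    print(who_is_driving(SAELevel.HIGH_AUTOMATION))     # Waymo robotaxi  -> "system"

Everything that follows in this article turns on which side of that one-line conditional a given system lands.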

Where Tesla Really Falls: Solidly Level 2

Tesla’s FSD today is still Level 2. The vehicle can manage speed and lane positioning, but the driver must keep their hands on the wheel and eyes on the road. I owned a 2013 Model S P85 with plain cruise control, the same feature my dad’s 1973 Buick Riviera had, and no autosteer. My 2016 Model S 90D had Autopilot 1.0, which steered and slowed for traffic. That was Level 2, just like Volvo’s Pilot Assist.

Despite this, Tesla markets FSD in a way that invites consumers to assume Level 4 or 5 capability, even though the system is classified at Level 2. Elon Musk frequently touts a coming robotaxi fleet. Meanwhile, today’s buyers are left with a system that, by SAE standards, still requires active human supervision.

The Legal Grey Zone of Level 3

This is where things get tricky. SAE says that once a system reaches Level 3, the human is no longer the driver. But they must still be ready to take over. So what happens if the car fails to detect a road hazard and alerts the human too late?

Let’s say the car’s camera system gets obscured and it demands you take over. You’re technically not driving—until you are, seconds before impact. So who’s liable? You, for not responding in time? Or the manufacturer, for designing a system that handed you the wheel too late? As a lawyer, I can tell you: everybody gets sued.

Level 4 and the Manufacturer’s Liability

At Level 4, the manufacturer assumes far more risk. There is no hand-off to the driver. Vehicles like Waymo’s robotaxis are good examples—they drive themselves in geo-fenced areas with no expectation of human takeover.

If a Waymo car strikes a pedestrian, the company can’t point to the passenger. The system, and by extension the manufacturer, is in control. But what happens when these vehicles are sold to private parties? Will manufacturers build in indemnification clauses? Require specific insurance? Things are going to get messy.

Level 5: The Pod of the Future—and the Legal Nightmare

Level 5 is the dream—a car with no driver controls, capable of navigating any road. It’s also a liability minefield. If something goes wrong, the manufacturer can’t point to a driver. There isn’t one.

Imagine a Level 5 pod slamming into a school bus. Injuries, property damage, and lawsuits follow. The owner gets sued because they own the car. The manufacturer gets sued because they are the driver. And there’s no one else to blame.

Why Tesla Wants to Stay at Level 2

Despite all its hype, Tesla benefits from staying at Level 2. It gets to:

  • Sell a high-margin software package branded as “Full Self-Driving” (currently marketed as “Full Self-Driving (Supervised)”).
  • Avoid the regulatory scrutiny and insurance complications tied to Level 3 and above.
  • Blame the dumbass behind the wheel when something goes wrong.

In a recent case, Tesla’s own expert argued that the system had been “overridden,” implying that the human was still in control. That defense only works if the system isn’t actually driving.

Conclusion: Words Matter

The engineering world has clear definitions for automation. Tesla has blurred them. By marketing a Level 2 system as “Full Self-Driving,” it risks misleading buyers (a criticism voiced by engineers, consumer advocates, and legal commentators alike) and undermining trust in real progress toward autonomy.

If we’re going to share the road with software behind the wheel, we need to agree on who is actually driving. Because when the lawsuits come—and they will—the answer won’t just be a technical detail. It will determine who pays.

