A 2025 Tesla Model 3 in Full Self-Driving mode drives off a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, able to post about it on social media.
I just don’t see how this technology could possibly be ready to power an autonomous taxi service by the end of next week.
There's just no footage from the interior camera, no proof that FSD was being used.
Others have pointed out critical holes in his story - namely that he claims he was on a version of FSD that had not been released at the time of his crash.
The link I gave you is where he posted this, and you can see what version he says he was using:
https://www.notateslaapp.com/fsd-beta/
So you are parroting bullshit; the current version is 13.2.9.
Funny how the people in the thread I linked, who drive Teslas themselves, don't question this?
Some people believe FSD saw the shadow of the pole as a curb in the road, or maybe even the base of a wall, and that's why it decided to "evade".
There are plenty of examples in the comments from people who drive Teslas themselves about how it steers into oncoming traffic: one describes how his followed black skid marks on the road, weaving wildly left to right; another describes how his made an evasive maneuver because of a patch in the road. It just goes on and on about how faulty FSD is.
IDK which Tesla models have which cameras. But I've seen plenty of reporting on Tesla FSD, and none of it is good.
So why do you believe it's more likely to be human error? If it was a human not paying attention, it would be much more likely to veer slowly rather than make an abrupt, idiotic maneuver.
To me it seems you are the one who lacks evidence for your claims.
And the problem with Tesla's logging is that it's a proprietary system that only Tesla has access to; that system needs to be open for everybody to examine.