Tesla reports another Robotaxi crash
Posted by hjouneau 1 day ago
Comments
Comment by Veserv 1 day ago
Yet they claim that old software versions, on old hardware, driving arbitrary roads with untrained customers as safety drivers, somehow average 2.9 million miles per collision in non-highway environments [2], a ~72.5x difference in collision frequency, and 5.1 million miles per collision in all environments, a ~175x(!) difference in collision frequency, when their reporting and data are not scrutinized.
I guess their most advanced software and hardware and professional safety drivers just make it 175x more dangerous.
[1] https://techcrunch.com/2025/05/20/musk-says-teslas-self-driv...
[2] https://www.tesla.com/fsd/safety
[3] https://www.forbes.com/sites/alanohnsman/2025/08/20/elon-mus...
[3.a] Tesla's own attorneys have argued that statements by Tesla executives are such nonsense that no reasonable person would believe them.
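For anyone checking the arithmetic, here's a quick sketch. Two assumptions: the ~40,000-miles-per-collision Robotaxi baseline is the figure quoted from the article elsewhere in this thread, and the all-environments baseline is back-solved from the ~175x multiplier rather than taken from a source.

  # Rough arithmetic behind the multipliers above; inputs are Tesla's claimed
  # figures [2] plus an assumed Robotaxi baseline of ~40k miles per collision.
  claimed_nonhighway = 2_900_000   # claimed miles per collision, non-highway
  claimed_all        = 5_100_000   # claimed miles per collision, all environments
  robotaxi_observed  =    40_000   # assumed miles per reported collision, Robotaxi pilot

  print(claimed_nonhighway / robotaxi_observed)  # -> 72.5, the ~72.5x figure
  print(claimed_all / 175)                       # -> ~29,000 miles/collision implied by ~175x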
Comment by randoglando 1 day ago
Comment by AnthonyMouse 1 day ago
People often don't report minor accidents. If someone scrapes a pole without causing enough damage to exceed their insurance deductible, are they going to file a police report? Mostly not. And the older number had those incidents in it while the newer one doesn't.
But the number for human drivers works like the old number. They're dividing miles driven by reported accidents. On top of that, they're using the average -- by miles -- which isn't the same as the median, and in particular that will over-represent drivers who drive the most miles, who are disproportionately professional drivers.
Comment by fragmede 11 hours ago
Comment by tim333 21 hours ago
I'm not sure what the guys in the taxis with their hands on the arm rest do. I guess they have a button that either stops the car or connects it to a remote control operator?
Comment by kstenerud 1 day ago
Ultimately, Tesla has two problems going on here:
1. Their crash rate is 2x that of Waymo.
2. They redact a lot of key information, which complicates safety assessments of their fleet.
The redactions actually hurt Tesla, because the nature of each road incident really matters: EVERY traffic incident must be reported, regardless of fault (even if it's a speeding car from the other direction that hits another vehicle which then hits the robotaxi - yes, that's actually in one of the Waymo NHTSA incident reports). When Tesla redacts the way they've been doing, it makes it very difficult to do studies like https://www.tandfonline.com/doi/full/10.1080/15389588.2025.2... which show how much safer Waymo vehicles are compared to humans WHEN IT COMES TO ACTUAL DAMAGE DONE.
We can't get that quality of info from Tesla due to their redaction practices. All we can reliably glean is that Tesla vehicles are involved in 2x the incidents per mile compared to Waymo. https://ilovetesla.com/teslas-robotaxi-dilemma-navigating-cr...
Comment by mmooss 1 day ago
Comment by orwin 1 day ago
Comment by TheAmazingRace 1 day ago
Comment by tim333 21 hours ago
Comment by fragmede 11 hours ago
Comment by rich_sasha 1 day ago
Every time he has the choice to do something conservative or bold, he goes for the latter, and as long as he has a bit of luck, that is very much a winning strategy. To most people, I guess, the stress of always betting everything on red would be unbearable. I mean, the guy got a $300m cash payout in 1999! Hands up, who would keep working 100-hour weeks for 26 years after that?
I'm not saying it is either bad or good. He clearly did well out of it for himself financially. But I guess the whole cameras/lidar thing is similar: it's big, bold, from the outset unlikely to work, and a massive "fake it till you make it" thing.
But if he can crack it, again I guess he hits the jackpot. Never mind cars; they are expensive enough that Lidar cost is a rounding error. But if he can then stick 3D vision into any old cheap cameras, surely that is worth a lot. In fact, wasn't this part of Tesla's great vision - to diversify away from cars and into robots etc.? I'm sure the military would order millions of cheapo cameras that work 90% as well as a fancy Lidar while being fully solid state etc.
That he is using his clients as lab rats for it is yet another reason why I'm not buying one. But to me this is totally in character for Musk.
Comment by gitaarik 6 hours ago
Comment by BrenBarn 1 day ago
Comment by randoglando 1 day ago
Comment by cn-watch 1 day ago
How could the richest man in the world give so little back?
Comment by cosmicgadget 14 hours ago
Comment by AnthonyMouse 1 day ago
Comment by Zigurd 20 hours ago
Effective altruism and other New Age garbage pseudo philosophy can't hold a candle to that.
Comment by oska 8 hours ago
To reveal my own bias / worldview, I loathe and detest Bill Gates in nearly every way and have done so for over three decades. I think he has had a massively negative impact on humanity, mainly by making the computer industry so shitty for 4+ decades but in other more controversial areas as well.
With Elon Musk, while perceiving a number of big faults in the man, I also acknowledge that he has helped advance some very beneficial technologies (like electric vehicles and battery storage). So I have a mixed opinion on him, while with Gates, he is almost all evil and has had a massive negative impact on the planet.
Comment by oska 8 hours ago
Comment by rich_sasha 5 hours ago
The way I see it, he converted his cars' greenness into other people's fumes. So not a net win after all.
Comment by johnthewise 22 hours ago
Comment by jauntywundrkind 1 day ago
The data is just not there for us outsiders to make any kind of case, and that's the crucial baseline they're skimping out on.
Comment by DoesntMatter22 1 day ago
Comment by TheAmazingRace 1 day ago
Comment by natch 1 day ago
Comment by dang 1 day ago
Comment by themafia 1 day ago
They've been seen doing this at crime scenes and in the middle of police traffic stops. That speaks volumes too.
Comment by daheza 1 day ago
Comment by themafia 1 day ago
> presented with a set of options and they choose one
> they send a physical human to drive the car.
Those all sound like "controls" to me.
"Fleet response can influence the Waymo Driver's path, whether indirectly through indicating lane closures, explicitly requesting the AV use a particular lane, or, in the most complex scenarios, explicitly proposing a path for the vehicle to consider. "
https://waymo.com/blog/2024/05/fleet-response/
So they built new controls that typical vehicles don't have. Then they use them. I fail to see how any of this is "incorrect." It is, in fact, _built in_ to the system from the ground up.
Semantic games aside, it is obviously more incorrect to call them "completely self driving" especially when they "ask for help." Do human drivers do this while driving?
Comment by Mawr 1 day ago
Comment by mmooss 1 day ago
Comment by tasty_freeze 1 day ago
Comment by guywithahat 1 day ago
The only real test will be who creates the best product, and while Waymo seems to have the lead, it's arguably too soon to tell.
Comment by _aavaa_ 1 day ago
Comment by fpoling 1 day ago
Comment by _aavaa_ 1 day ago
But also, if you didn’t get the right result, I don’t care how quickly you didn’t get it.
Comment by MadnessASAP 1 day ago
Comment by Zigurd 20 hours ago
It's analogous to communications latency. High latencies are very annoying to humans, but below a threshold they stop mattering.
Comment by guywithahat 1 day ago
I will also add that in my personal experience, while some filters work best together (like IMU/GNSS), we usually used either lidar or cameras, not both. Part of the reason was that combining them started requiring a lot more overhead and cross-sensor experts, and it took away from the actual problems we were trying to solve. While I suppose one could argue this is a cost issue (just hire more engineers!), I do think there's value in simplifying your tech stack whenever possible. The fewer independent parts you have, the faster you can move and the more people can become experts on one thing.
Again Waymo's lead suggests this logic might be wrong but I think there is a solid engineering defense for moving towards just computer vision. Cameras are by far the best sensor, and there are tangible benefits other than just cost.
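As a rough illustration of the IMU/GNSS point: a toy 1-D sketch, not anything from a production stack, with all noise parameters invented.

  # Toy 1-D Kalman filter fusing high-rate IMU acceleration (drifts when
  # integrated) with low-rate GNSS position fixes (noisy but absolute).
  import numpy as np

  class ImuGnssFilter:
      def __init__(self, dt=0.01):                      # 100 Hz IMU
          self.F = np.array([[1.0, dt], [0.0, 1.0]])    # [pos, vel] transition
          self.B = np.array([[0.5 * dt**2], [dt]])      # how acceleration enters the state
          self.H = np.array([[1.0, 0.0]])               # GNSS observes position only
          self.Q = np.diag([1e-4, 1e-3])                # process noise (made up)
          self.R = np.array([[4.0]])                    # GNSS variance, ~2 m std dev (made up)
          self.x = np.zeros((2, 1))                     # state estimate [pos; vel]
          self.P = np.eye(2)                            # estimate covariance

      def predict(self, accel):
          # IMU step: integrate acceleration; uncertainty (drift) grows in P.
          self.x = self.F @ self.x + self.B * accel
          self.P = self.F @ self.P @ self.F.T + self.Q

      def update(self, gnss_pos):
          # GNSS step: an absolute fix pulls the drifting estimate back.
          y = np.array([[gnss_pos]]) - self.H @ self.x
          S = self.H @ self.P @ self.H.T + self.R
          K = self.P @ self.H.T @ np.linalg.inv(S)
          self.x += K @ y
          self.P = (np.eye(2) - K @ self.H) @ self.P

  # Usage: call predict() at the IMU rate, update() whenever a GNSS fix arrives.

The point is that each sensor covers the other's weakness with a tiny amount of shared state, whereas camera/lidar fusion needs calibration, synchronization, and association logic, which is the overhead the parent comment describes.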
Comment by karlgkk 1 day ago
> solid engineering defense for moving towards just computer vision
COUNTERPOINT: WAYMO
Comment by guywithahat 15 hours ago
> Again Waymo's lead suggests this logic might be wrong but I think there is a solid engineering defense for moving towards just computer vision. Cameras are by far the best sensor, and there are tangible benefits other than just cost.
Comment by fragmede 1 day ago
Comment by karlgkk 12 hours ago
Comment by fragmede 8 hours ago
Comment by bulbar 1 day ago
Comment by ambicapter 1 day ago
Comment by guywithahat 15 hours ago
It's not that we ran into problems, it's that the tech didn't deliver what we hoped when we could have used the time to build something better.
Comment by solfox 1 day ago
Comment by bulbar 1 day ago
It didn't work out, though, and now multi-sensor systems are eating their lunch.
Comment by gitaarik 6 hours ago
Simplifying things doesn't always make things easier.
Comment by cameldrv 1 day ago
The radar they had really couldn't detect stationary objects. It relied on the Doppler effect to look for moving objects. That would work most of the time, but sometimes there would be a stationary object in the road, and then the computer vision system would have to make a decision, and unfortunately in unusual situations, like a firetruck parked at an angle to block off a crash site, the Tesla would plow into the firetruck.
Given that the radar couldn't really ever be reliable enough to create a self driving vehicle, after he hired Karpathy, Elon became convinced that the only way to meet the promise was to just ignore the radar and get the computer vision up to enough reliability to do FSD. By Tesla's own admission now, the hardware on those 2016+ vehicles is not adequate to do the job.
All of that is to say that, IMO, Elon's primary reason for his opinions about Lidar is simply that those older cars didn't have one, and he had promised to deliver FSD on that hardware, so it couldn't be necessary, or he'd go broke paying out lawsuits. We will see what happens with the lawsuits.
Comment by xnx 1 day ago
He "painted himself into a corner" is the more accurate expression when one is the cause of their own problem
Comment by vjvjvjvjghv 1 day ago
Comment by fpoling 1 day ago
Comment by jsnell 1 day ago
Comment by vjvjvjvjghv 1 day ago
Comment by guywithahat 15 hours ago
Comment by bsder 1 day ago
Actually, we know that vision alone doesn't work.
Sun glare. Fog. Whiteouts. Intense downpours. All of them cause humans to get into accidents, and electronic cameras aren't even as good as human eyes due to dynamic range limitations.
Dead reckoning with GPS and maps is a huge advantage that autonomous cars have over humans. No matter what the conditions are, an autonomous car knows where it is and where the road is. No sliding off the road because you missed a turn.
Being able to control and sense the electric motors at each wheel is a big advantage over "driving feel" from the steering wheel and your inbuilt sense of acceleration.
Radar/lidar is just all upside above and beyond what humans can do.
Comment by dyauspitr 1 day ago
Comment by goosejuice 1 day ago
https://www.npr.org/2025/12/06/nx-s1-5635614/waymo-school-bu...
Comment by xp84 1 day ago
Comment by goosejuice 1 day ago
Engineering problems aren't limited to a single solution anyhow. Anyone ruling out a camera based solution has very little imagination.
Comment by dyauspitr 12 hours ago
Comment by goosejuice 8 hours ago
If that's your definition of solved, be my guest, but it's a pretty silly one.
Comment by dyauspitr 7 hours ago
Detroit, Denver, Philadelphia and DC this year.
Comment by goosejuice 7 hours ago
Comment by xnx 1 day ago
Comment by dyauspitr 1 day ago
Comment by guywithahat 11 hours ago
Comment by dyauspitr 9 hours ago
Comment by lotsofpulp 1 day ago
Price is a factor. I’ve been using the free self driving promo month on my model Y (hardware version 4), and it’s pretty nice 99% of the time.
I wouldn’t pay for it, but I can see a person with more limited faculties, perhaps due to age, finding it worthwhile. And it is available now in a $40k vehicle.
It’s not full self driving, and Waymo is obviously technologically better, but I don’t think anyone is beating Tesla’s utility to price ratio right now.
Comment by jmpman 1 day ago
Comment by hnburnsy 10 hours ago
https://www.yahoo.com/news/articles/two-self-driving-waymos-...
Comment by koinedad 1 day ago
Comment by phyzome 1 day ago
(The one thing I would like to see done differently here is including an error interval.)
Comment by jsight 1 day ago
Of the Tesla accidents, five involved collisions with fixed objects, animals, or a non-injured cyclist. Extremely minor versions of these often go unreported when human drivers are involved.
Unfortunately, without the details, this comparison will end up being a comparison between two rates with very different measurement approaches.
Comment by phyzome 15 hours ago
Comment by Rebelgecko 1 day ago
Austin has relatively few miles, so the confidence intervals are wider, but they're not too far from what they show for other cities.
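For a sense of scale, here's a hedged sketch of how wide an exact Poisson interval gets with only a handful of events. The 7 crashes / ~280,000 miles inputs are the figures quoted elsewhere in this thread, not taken from the dashboard itself.

  # Exact (Garwood) 95% CI for a Poisson count, converted to miles-per-crash.
  from scipy.stats import chi2

  def poisson_ci(k, alpha=0.05):
      lower = 0.5 * chi2.ppf(alpha / 2, 2 * k) if k > 0 else 0.0
      upper = 0.5 * chi2.ppf(1 - alpha / 2, 2 * (k + 1))
      return lower, upper

  crashes, miles = 7, 280_000
  lo, hi = poisson_ci(crashes)
  print(f"point: {miles / crashes:,.0f} miles/crash, "
        f"95% CI roughly {miles / hi:,.0f} to {miles / lo:,.0f}")
  # -> point ~40,000, CI roughly ~19,000 to ~100,000 miles/crash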
Comment by tomhow 1 day ago
Please use the original title, unless it is misleading or linkbait; don't editorialize.
Comment by rsynnott 21 hours ago
> and other self driving technologies
I mean, this isn't self-driving. It has a safety driver.
Comment by altairprime 1 day ago
Comment by kevin_thibedeau 1 day ago
Comment by altairprime 15 hours ago
Comment by 0_____0 1 day ago
Comment by dylan604 1 day ago
Something like this: https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh...
Comment by dzhiurgis 12 hours ago
Comment by altairprime 8 hours ago
Comment by jsight 1 day ago
We need far higher quality data than this to reach meaningful conclusions. Implying conclusions based upon this extrapolation is irresponsible.
Comment by mmooss 1 day ago
Comment by jsight 16 hours ago
Comment by mmooss 12 hours ago
I don't see why you say that.
> My point is that the data naturally has error margins that are clearly large enough to make drawing concrete conclusions impossible.
I don't understand this one either.
Comment by bparsons 1 day ago
Comment by themafia 1 day ago
Comment by davidw 1 day ago
Comment by cosmicgadget 14 hours ago
Comment by Zigurd 13 hours ago
Comment by cosmicgadget 13 hours ago
Comment by Zigurd 11 hours ago
Nevertheless, getting big money donations out of politics would be a big improvement.
Comment by narrator 1 day ago
Comment by dubeye 22 hours ago
Comment by ndsipa_pomu 1 day ago
Comment by rsynnott 21 hours ago
Comment by 93po 1 day ago
This one is misleading both because 8 "crashes" is too small a sample to draw conclusions about its safety compared to humans, and because these 'crashes' are not all actually crashes but a variety of things, including hitting a wild animal of unknown size or potentially minor contact with other objects of unspecified impact severity.
They make other unsubstantiated and likely just wrong claims:
> The most critical detail that gets lost in the noise is that these crashes are happening with a human safety supervisor in the driver’s seat (for highway trips) or passenger seat, with a finger on a kill switch.
The robotaxi supervisors are overwhelmingly in the passenger seat only - I've never actually seen any video footage of them in the driver's seat, and Electrek assuredly has zero evidence of how many of the reported incidents involved someone in the driver's seat. Additionally, these supervisors in the passenger seat are not instructed to prevent every single incident (they aren't going to emergency brake for a squirrel), and characterizing them as "babysitting to prevent accidents" is just wrong.
This article is full of other glaring problems, lies, and mistruths, but it's genuinely not worth the effort to write 5 pages on it.
If you want some insight into why Fred Lambert might be doing this, look no further than the bottom of the page: Fred gives (sells?) "investment tips" which, you guessed it, are perpetually trying to convince people to sell and short Tesla: https://x.com/FredLambert/status/1831731982868369419
Feel free to look at his other posts: it's 95% trying to convince people that Tesla is going bankrupt tomorrow, and trying to slam Elon as much as possible - sometimes for good reasons (transphobia) but sometimes in ways that really harm his credibility, if he actually had any.
Lambert has also been accused of astroturfing in lawsuits, and had to go through a settlement that required him to retract all the libel he had spread: https://www.thedrive.com/tech/21838/the-truth-behind-electre...
That same source also touches on Fred and Seth's long history of swinging either side of the bandwagon in attempts to maximize personal gain off bullshit reporting. And basically being a massive joke in automotive reporting.
The owner of Electrek, Seth Weintraub, also notably does the same thing: https://x.com/llsethj/status/1217198837212884993
Comment by Mawr 23 hours ago
Comment by 93po 16 hours ago
Comment by Veserv 12 hours ago
Despite that, the article and the public (the real targets of the hit piece, which encourages people to endanger themselves with a system that has not been demonstrated to be safe, with the direct intent of enriching the owners of Tesla) directly refute Tesla's ridiculous claims, demonstrating they are off by multiple orders of magnitude using the basic mandatory data reporting from their Robotaxi program, a program using systems that are more advanced, fine-tuned, and geofenced, with professional safety drivers (thus we can only reasonably assume that their normal system is worse), but which actually has scrutinized reporting requirements.
And yet now you argue that the entity fabricating ridiculous claims for its own enrichment, Tesla, is not only not responsible but is the target of a hit piece, while the ones who clearly debunked Tesla's claims as deceptive are not only responsible for refuting them but must demonstrate an unimpeachable level of rigor, even though the original fabricated claim lacks even the elements of rigor we expect out of your average middle school science fair, let alone a literal trillion dollar company.
Talk about double standards.
Comment by smoovb 11 hours ago
Comment by perrohunter 1 day ago
Comment by panarky 1 day ago
Comment by wizardforhire 1 day ago
In the past it took a lot less to get the situation fixed… and these were horrendous situations! [1][2] And yet Tesla is a factor of 10 worse!
[1] https://en.wikipedia.org/wiki/Ford_Pinto
[2] https://en.wikipedia.org/wiki/Firestone_and_Ford_tire_contro...
Comment by 7e 1 day ago
Comment by platevoltage 1 day ago
Comment by NedF 1 day ago
Comment by thomassmith65 1 day ago
> With 7 reported crashes at the time, Tesla’s Robotaxi was crashing roughly once every 40,000 miles [...]. For comparison, the average human driver in the US crashes about once every 500,000 miles.
> This means Tesla’s “autonomous” vehicle, which is supposed to be the future of safety, is crashing 10x more often than a human driver.
That is a possible explanation for why Musk believes in people having 10x as many children. /s
Comment by nodesocket 1 day ago
Comment by jsight 1 day ago
Comment by DoesntMatter22 1 day ago
Comment by jsight 1 day ago
There are real, legitimate concerns here, but they are hidden behind a lot of assumptions.
Comment by boringg 1 day ago
Comment by bryanlarsen 1 day ago
Comment by Tagbert 1 day ago
Comment by 93po 1 day ago
Comment by youarentrightjr 1 day ago
Comment by mmooss 1 day ago
The claim is that Musk is a victim.
Comment by ajross 1 day ago
Does anyone know what the cite for this might be? I'm coming up empty. To my knowledge, no one (except maybe insurance companies) tallies numbers for fender-bender-style accidents. This seems like a weirdly high number to me; it's very rare to find any vehicle that reaches 100k miles without at least a bump or two requiring repair.
My suspicion is that this is a count of accidents involving emergency vehicle or law enforcement involvement? In which case it's a pretty terrible apples/oranges comparison.
Comment by pavon 1 day ago
[1] https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/...
[2] https://www.nhtsa.gov/crash-data-systems/crash-report-sampli...
Comment by ajross 14 hours ago
[1] Source: I've hit a garbage can and told no one. Until this moment.
Comment by habosa 1 day ago
Comment by bryanlarsen 1 day ago
Comment by ajross 9 hours ago
Also I don't think that's correct; that's a ton of driving! I strongly suspect the number you're citing is the number of miles an average American spends in a road vehicle, not actually driving it. But that counts the same "car-mile" multiple times for all the occupants, when the statistic we're arguing about right now is about the vehicle, not the occupants.
Comment by furyofantares 1 day ago
It does seem like a high number to me - in 30 years of pretty heavy driving I've probably done about 500k miles and I've definitely had more than one incident. But not THAT many more than one, and I've put 100k miles on a few vehicles with zero incidents. Most of my incidents were when I was a newer driver who drove fairly recklessly.
Comment by jsight 1 day ago
A rate of 1 per 500k miles that includes interstate driving will be very different from the rate for an urban environment.
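A toy illustration of that baseline problem (every number here is invented, just to show the mechanism):

  # Mixing low-crash-rate highway miles with high-crash-rate urban miles
  # inflates the blended miles-per-crash figure well above the urban-only rate,
  # so a nationwide average is a poor baseline for an urban-only service.
  highway_miles, highway_mpc = 600_000, 1_000_000   # assumed
  urban_miles,   urban_mpc   = 400_000,   200_000   # assumed

  crashes = highway_miles / highway_mpc + urban_miles / urban_mpc   # 0.6 + 2.0
  blended = (highway_miles + urban_miles) / crashes
  print(f"blended: ~{blended:,.0f} miles/crash vs urban-only: {urban_mpc:,}")
  # -> blended ~385,000 vs urban-only 200,000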
Comment by senordevnyc 1 day ago
Comment by natch 1 day ago
A responsible journalist with half a clue would mention that, and tell us how that distorts the numbers. If we correct for this distortion, it’s clear that the truth would come out in Tesla’s favor here.
Instead the writer embraces the distortion, trying to make Tesla look bad, and one is left to wonder if they are intentionally pushing a biased narrative.
Comment by bryanlarsen 1 day ago
Using your own personal experience, it should be obvious that trivial fender benders are more common than once per lifetime but significantly less common than one every couple of years.
Comment by natch 3 hours ago
Comment by tigranbs 1 day ago
Waymo has a huge head start, and it is evident that the "fully autonomous" robotaxi date is far later than what Elon is saying publicly. They will do it, but it is not as close as the hype suggests.
Comment by verteu 1 day ago
Comment by 93po 1 day ago
Comment by phyzome 1 day ago
Comment by dylan604 1 day ago
Comment by phyzome 15 hours ago
Comment by Rebelgecko 1 day ago