Waymo robotaxi hits a child near an elementary school in Santa Monica
Posted by voxadam 12 hours ago
Comments
Comment by BugsJustFindMe 11 hours ago
> the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle's path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under 6 mph before contact was made.
> Following contact, the pedestrian stood up immediately, walked to the sidewalk, and we called 911. The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene.
> Following the event, we voluntarily contacted the National Highway Traffic Safety Administration (NHTSA) that same day.
I honestly cannot imagine a better outcome or handling of the situation.
Comment by jobs_throwaway 11 hours ago
> Waymo said in its blog post that its “peer-reviewed model” shows a “fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.”
It's likely that a fully-attentive human driver would have done worse. With a distracted driver (a huge portion of human drivers) it could've been catastrophic.
Comment by belorn 3 hours ago
One approach is to merge the two directions of a road into a single lane, forcing drivers to cooperate and take turns passing through, one car at a time.
For a combined car and pedestrian road (max speed of 7 km/h) near where I live, they intentionally added large view-obstructing objects to the road that limit visibility and make it harder to navigate. This forces drivers to drive very slowly, even when alone on the road, since they can't see whether a car or person may be behind the next object.
On another road they added several tight S-curves in a row, where if you drive anything faster than 20 km/h you will fail the turns and drive onto the artificially constructed curbs.
On other roads they put a sign in the middle of the two-way road while drastically limiting the width to the curb, forcing drivers to slow down in order to center the car in the lane and squeeze through.
The common thread in each of these is that a human driver's fear of crashing makes them pay extra attention and slow down.
Comment by Habgdnv 2 hours ago
Comment by throwaway7783 1 hour ago
Comment by Noumenon72 3 hours ago
Comment by consumer451 5 minutes ago
> An evaluation of 20 mph zones in the UK demonstrated that the zones were effective both in reducing traffic speed and in reducing RTIs. In particular child pedestrian injuries were reduced by 70 per cent from 1.24 per year in each area before to 0.37 per year after the zones were introduced
https://www.rospa.com/siteassets/images/road-safety/road-saf...
The "Vision Zero" program was started in Sweden, and is becoming more widely adopted.
Comment by datsci_est_2015 7 minutes ago
Comment by stouset 13 minutes ago
Comment by pastel8739 16 minutes ago
this sounds like exactly the right tradeoff, especially since these decisions actually increase convenience for those not in cars
Comment by causalscience 2 hours ago
Comment by dkarbayev 3 hours ago
Comment by tombert 3 hours ago
Comment by reddalo 3 hours ago
Roundabouts are safer because they prevent everybody from speeding through the intersection, and even in the case of an accident, no head-on collisions happen in a roundabout.
Comment by mikepurvis 3 hours ago
Roundabouts are worse for land use though, which impacts walkability, and the safety story for pedestrians and bike users is decidedly not great either.
Comment by Twirrim 2 hours ago
They're much safer for pedestrians than intersections. You're only crossing and dealing with traffic coming from one direction, stopping at a median, and then crossing further over.
Unlike trying to navigate a crosswalk, where you have to play guessing games about which direction some vehicle ignoring the lights is going to come at you from (people do the stupidest things, and roundabouts are a physical barrier that prevents a bunch of that).
Comment by mikepurvis 44 minutes ago
I could handle it as an adult just walking my bike but it would be a nightmare for someone pushing a stroller or dependent on a mobility device.
Comment by kortilla 1 hour ago
This assumes a median, which is not present at most smaller roundabouts in the US.
Comment by ndr42 1 hour ago
Comment by darepublic 2 hours ago
Comment by mattlondon 2 hours ago
Many roads in London have parked cars on either side so only one car can get through at a time. Instead of people cooperating you have people fighting: speeding as fast as they can to get through before someone else appears, or racing oncoming cars to a gap in the parked cars, etc. So when they should be doing 30mph, they are more likely doing 40-45. Especially with EVs you have near-instant power to quickly accelerate to get to a gap first.
And putting obstacles in the road so you can't see if someone is there? That sounds really dangerous and exactly the sort of thing that caused the accident in the story here.
Madness.
Comment by lmm 2 hours ago
Yes. They have made steady progress over the previous decades to the point where they can now have years with zero road fatalities.
> And putting obstacles in the road so you cant see if someone is there? That sounds really dangerous and exactly the sort of thing that caused the accident in the story here.
Counterintuitive perhaps, but it's what works. Humans adjust their behaviour to the level of perceived risk; the single most important thing is to make driving feel as dangerous as it is.
Comment by mattlondon 2 hours ago
From experience they will adjust their behaviour to reduce their total travel time as much as possible (i.e. speed to "make up" for lost time waiting etc) and/or "win" against other drivers.
I guess it is a cultural thing. But I cannot agree that making it harder to see people in the road is going to make anything safer. Even a robot fucking taxi with lidar and instant reaction times hit a kid because they were obscured by something.
Comment by stouset 10 minutes ago
Comment by lmm 1 hour ago
The evidence is that they do though. E.g. the Exhibition Road remodelling (removing curbs/signs/etc.) has been a great success and effectively reduced vehicle speeds, e.g. https://www.rbkc.gov.uk/sites/default/files/media/documents/...
Comment by jasonfarnon 2 hours ago
Comment by jjav 8 hours ago
We'd have to see video of the full scene to have a better judgement, but I wouldn't call it likely.
The car reacted quickly once it saw the child. Is that enough?
But most humans would have been aware of the big-picture scenario much earlier. Are there multiple kids milling around on the sidewalk? Near a school? Is there a big truck/SUV parked there?
If that's the scenario, there is a real probability that a child might appear, so I'm going to slow way down pre-emptively even though I haven't seen anyone, just in case.
The car only slows down after seeing someone. The car can react faster than I can after seeing someone, but as a human I can pre-react much earlier based on the big picture, which is much better.
Comment by oakesm9 8 hours ago
While in theory human drivers should be situationally aware of the higher risks of children being around, the reality is that the majority will be in their own bubble of being late to drop their kid off and searching for the first free spot they can find.
Comment by trhway 6 hours ago
The autonomous cars have really gotten more aggressive recently, as I mentioned before:
https://news.ycombinator.com/item?id=46199294
Also Waymo handling road visibility issue:
Comment by stouset 6 minutes ago
I certainly do this. But asserting that most humans would usually do this? Have you ever actually seen humans drive cars? This is absolutely not what they do. On top of that, they run stop signs, routinely miss pedestrians in blind spots, respond to texts on their phone, or scroll around on their display to find the next song they want to put on.
Comment by nostrademons 5 hours ago
I have a similar school drop-off, and can confirm that the cars are typically going around 17-20mph around the school when they're moving. Also that yes, human drivers usually do stay much closer to the centerline.
However, Waymo was recently cleared to operate in my city, and I actually saw one in the drop-off line about a week ago. I pulled out right in front of it after dropping my kid off. And it was following the line of cars near the centerline of the road. Honestly its behavior was basically indistinguishable from a human other than being slightly more polite and letting me pull out after I put my blinker on.
Comment by cardiffspaceman 7 hours ago
Comment by loeg 4 hours ago
Comment by elzbardico 3 hours ago
Comment by eszed 3 hours ago
Comment by loeg 2 hours ago
Comment by Bud 1 hour ago
Comment by jobs_throwaway 7 hours ago
I wouldn't call it likely. Sure, there are definitely human drivers who are better than Waymo, but IME they're few and far between. Much more common to be distracted or careless.
Comment by wat10000 54 minutes ago
It's amazing how much nonsense we let slide with human drivers, and then get uptight about with anything else. You see the same attitude with bicycles. Cars run stop signs and red lights all day long and nobody bats an eye, but a cyclist does it and suddenly they're a menace.
Comment by almosthere 7 hours ago
Consider this scenario:
5 kids are walking on the sidewalk while you're driving past them. But suddenly a large dumpster blocks your view of them just as you pass. You saw them before the dumpster, but now your car's position and the dumpster completely block the view.
Does a human brain carry some worry that they might suddenly decide to run and try to cross the street after the dumpster? Does Waymo carry that worry, or does it just continue to drive at the exact same speed?
Again, it's not like every driver will think about this, but many drivers will (even the bad ones).
Comment by jobs_throwaway 7 hours ago
I don't think this is true. There are infinitely many scenarios in a complex situation like a road with traffic, cars parked, pedestrians about, weather, etc. My brain might be able to quickly assess a handful, but certainly not all.
Comment by xorbax 4 hours ago
Not all of those need to be done "quickly". That's where LLMs fail.
You note the weather when you leave. You understand the traffic five minutes ahead. You recognize pedestrians far ahead of time.
Computers can process a lot in fractions of a second. Humans can recognize context over many minutes.
The Waymo may have done better in the fraction of a second, but humans can avoid being in that situation to begin with.
Comment by jobs_throwaway 2 hours ago
Comment by robotresearcher 42 minutes ago
Comment by jjav 5 hours ago
If there are ten kids nearby, that's basically ten path scenarios, and that might be reduced if you have great visibility into some of them.
> My brain might be able to quickly assess a handful, but certainly not all.
What would you do if you can't assess all of them? Just keep driving same speed?
If the situation is too overwhelming you'll almost certainly back off, I know I would. If I'm approaching that school block and there's like 50 small kids running around in all directions, I have no idea what's going on and who is going where, so I'm going to just stop entirely until I can make some sense of it.
Comment by robotresearcher 35 minutes ago
There are a very, very large number of scenarios. Every single possible different state the robot can perceive, and every possible near future they can be projected to.
Ten kids is not 10 path scenarios. Every kid could do a vast number of different things, and each additional kid raises the number of joint states to another power.
This is trivially true. The game that makes driving possible for humans and robots is that all these scenarios are not equally likely.
But even with that insight, it’s not easy. Consider a simple case of three cars about to arrive at an all-way stop. Tiny differences in their acceleration - potentially smaller differences than the robot can measure - will result in a different ordering of cars taking turns through the intersection.
It’s a really interesting problem.
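A toy illustration of the blowup (the per-kid action count is made up for illustration, not anything a real planner uses):

    # Each pedestrian could do any of a handful of things over the next second;
    # the joint space is exponential in the number of pedestrians.
    actions_per_kid = 5  # e.g. keep walking, stop, turn back, dart left, dart right
    for kids in (1, 3, 10):
        print(kids, "kids ->", actions_per_kid ** kids, "joint scenarios")
    # 1 -> 5, 3 -> 125, 10 -> 9765625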
Comment by jobs_throwaway 2 hours ago
Comment by lillecarl 4 hours ago
Safe driving starts with speed; lowering speed and informing the passengers seems like a no-brainer.
Comment by almosthere 6 hours ago
Comment by estearum 4 hours ago
Comment by dmd 6 hours ago
Comment by shadowgovt 7 hours ago
Comment by IAmBroom 6 hours ago
Patently, obviously false. A human brain will automatically think of SOME scenarios. For instance, if a collision seems imminent, and the driver is holding a cup of coffee, these ideas are likely to occur to the driver:
IF I GRAB THE STEERING WHEEL AND BRAKE HARD, I MIGHT NOT HIT THAT PEDESTRIAN IN FRONT OF ME.
IF I DON'T CONTINUE HOLDING THE COFFEE CAREFULLY, I MIGHT GET SCALDED.
THIS SONG ON MY RADIO IS REALLY ROCKING!
IF I YANK MY WHEEL TO THE LEFT, I MIGHT HIT A CAR INSTEAD OF A HUMAN.
IF I BRAKE HARD OR SWERVE AT ANY TIME IN TRAFFIC, I CAN CAUSE AN ACCIDENT.
Experiments with callosal patients (who have damaged the connective bridge between the halves of their brains) demonstrate that this is a realistic picture of how the brain makes decisions. It offers up a set of possible actions, and attempts to choose the optimal one and discard all others.
A computer program would do likewise, EXCEPT it won't care about the coffee cup or the radio (removing two bad choices from consideration).
It still has one bad choice (do nothing), but the SNR is much improved.
I'm not being hyperbolic; self-preservation (focusing on keeping that coffee in my hand) is a vital factor in decision-making for a human.
> ...where Waymo has pre-programmed ones (and some NN based ones).
Yes. And as time goes on, more and better-refined scenarios will be added to its programming. Eventually, it's reasonable to believe the car software will constantly reassess how many humans are within HUMAN_RUN_DISTANCE + CAR_TRAVEL_DISTANCE in the next block, and begin tracking any that are within an unsafe margin. No human on Earth does that, continually, without fail.
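A rough sketch of what such a check could look like (names, thresholds, and structure here are all hypothetical, not Waymo's actual logic):

    from dataclasses import dataclass

    @dataclass
    class Pedestrian:
        distance_ahead_m: float  # how far down the road they are
        offset_m: float          # lateral distance from the car's planned path

    def worth_tracking(ped: Pedestrian, car_speed_mps: float, run_speed_mps: float = 3.0) -> bool:
        # Time until the car reaches the pedestrian's position along the road...
        time_to_reach = ped.distance_ahead_m / car_speed_mps
        # ...versus how far they could run toward the car's path in that time.
        return ped.offset_m <= run_speed_mps * time_to_reach

    peds = [Pedestrian(20.0, 5.0), Pedestrian(10.0, 6.0)]
    print([worth_tracking(p, car_speed_mps=7.6) for p in peds])  # [True, False]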
> Does a human brain carry some worry that they suddenly decide to run and try to cross the street after the dumpster? Does Waymo carry that worry or just continue to drive at the exact same speed.
You continue to imply that Waymo cannot ever improve on its current programming. Does it currently consider this situation? Probably not. Will it? Probably.
Comment by almosthere 6 hours ago
Comment by stouset 9 minutes ago
Comment by mattlondon 2 hours ago
I have no idea what happened here, but in my experience of taking Waymos in SF, they are very cautious and I'd struggle to imagine them speeding through an area with lots of pedestrians milling around. The fact that it was going 17mph at the time makes me think it was already in "caution mode". Sounds like this was something of a worst-case scenario, and another meter or two and it would have stopped in time.
I think with humans, even if the driver is 100% paying attention and their eyes were looking in exactly the right place where the child emerged at the right time, there are still reaction times, both in cognition and in physically moving the leg to press the pedal. I suspect that a Waymo will out-react a human basically 100% of the time, and apply full braking force within a few tens of milliseconds, well before a human has even begun to move their leg.
Comment by throw__away7391 1 hour ago
Comment by 2b3o4o 8 hours ago
Comment by recursive 8 hours ago
Here's my problem. If you follow the instructions on the sign, it still says to slow down. There's no threshold for slow enough. No matter how slow you're going, the sign says "Slow Down". So once you become ensnared in the visual cone of this sign, you'll be forced to sit stationary for all eternity.
But maybe there's a loop-hole. It doesn't say how fast you must decelerate. So if you come into the zone going fast enough, and decelerate slowly enough, you can make it past the sign with some remaining non-zero momentum.
You know, I've never been diagnosed on the spectrum, but I have some of the tendencies. lol.
Comment by GenerocUsername 7 hours ago
Comment by recursive 6 hours ago
Comment by estearum 4 hours ago
Comment by Dylan16807 2 hours ago
Much better to be specific than a vague "slow down". There's a road near me with two tight turns a couple blocks apart. One advises 25mph and the other advises 10mph.
Comment by recursive 4 hours ago
Comment by alistairSH 4 hours ago
Comment by Nition 6 hours ago
Everyone's replying to you as if you truly don't understand the sign's intention but I'm sure you do. It's just annoying to be doing everything right and the signs and headlines are still telling you you're wrong.
There was a driving safety ad campaign here: "Drive to the conditions. If they change, reduce your speed." You can imagine how slow we'd all be going if the weather kept changing.
We might have OCPD.
Comment by recursive 6 hours ago
In advertising: "Treat yourself. You deserve it!"
Me: What if someone who didn't deserve it heard this message. How can you possibly know what I deserve? Do all people deserve to treat themselves? Is the notion of deserving or treating really so vacuous?
Normies: jfc
Comment by Nition 6 hours ago
Comment by elzbardico 3 hours ago
I hate when people pretend to be smarter than everyone else by pointing at this kind of utterance and insisting that someone, somehow, will parse those statements in the most literal and stupid manner.
Then there are the ignorant misanthropes who can't pass up a chance to repeat their reductionist speculations about human cognition. Just like the idiot Elon Musk, who wasted billions on an irrecoverably fucked self-driving system based on computer vision because he underestimated the human visual cortex.
Fucking annoying midwits.
Comment by recursive 1 hour ago
Comment by throwway120385 7 hours ago
Comment by aembleton 3 hours ago
Comment by thfuran 2 hours ago
Comment by recursive 2 hours ago
This is idle XKCD-style musing.
Comment by soulofmischief 7 hours ago
Comment by sandworm101 7 hours ago
FYI, unless you are a commercial truck, a cop, or a racer, your speedometer will read slightly fast, sometimes by as much as 5 to 10%. This is normal practice for cars as it limits manufacturer liability. You can check this using an independent GPS, i.e. not an in-dash unit. (Just imagine the court cases if a speedo read slower than the actual speed and you can understand why this started.)
Comment by lkbm 7 hours ago
[0] https://www.dmv.ca.gov/portal/handbook/california-driver-han...
Comment by 0xffff2 7 hours ago
Edit: However, elsewhere in the thread someone linked this Streetview image that shows that this particular school zone is 15mph: https://maps.app.goo.gl/7PcB2zskuKyYB56W8?g_st=ac
Comment by thfuran 2 hours ago
Comment by tastyfreeze 7 hours ago
Comment by sandworm101 4 hours ago
Comment by TylerE 3 hours ago
Comment by loeg 4 hours ago
Comment by cardiffspaceman 7 hours ago
Comment by dekhn 6 hours ago
I think any fair evaluation of this (once the data was available) would conclude that Waymo was taking reasonable precautions.
Comment by jjav 5 hours ago
That's exactly part of the problem. If it is programmed to be over-cautious and go 17 in a 25 zone, that feels like it is safe. Is it?
It takes human judgment of the entire big picture to say meaningfully whether that is too slow or too fast. Taking the speed limit literally is too rigid, something a computer would do.
Need to take into account the flow of the kids (all walking in line vs. milling around going in all directions), their age (younger ones are a lot more likely to randomly run off in an unsafe direction), what are they doing (e.g. just walking, vs. maybe holding a ball that might bounce and make them run off after it), their clustering and so on.
Driving past a high school with groups of kids chatting on the sidewalk, sure 20mph is safe enough. Driving past an elementary school with a mass of kids with toys moving in different directions on the same sidewalk, 17mph is too fast.
And if I'm watching some smaller kids disappear behind a visual obstruction that makes me nervous they might pop up ahead of it on the street, I slow down to a crawl until I can clearly see that won't happen.
None of this context is encoded in the "25mph when children are present" sign, but for most humans it is quite normal context to consider.
But it would be great to see video of the Waymo scene to see whether any of these factors were present.
Comment by bee_rider 1 hour ago
Anyway, from the article,
> According to the NHTSA, the accident occurred “within two blocks” of the elementary school “during normal school drop off hours.” The safety regulator said “there were other children, a crossing guard, and several double-parked vehicles in the vicinity.”
So I mean, it is hard to speculate. Probably Waymo was being reasonably prudent. But we should note that this description isn't incompatible with being literally in an area where the kids are leaving their parents' cars (the presence of "several double-parked vehicles" brings this to mind). If that's the case, it might make sense to consider an even-safer mode for active student unloading areas. This seems like the sort of social context that humans might have and cars might be missing.
But this is speculation. It would be good to see a video.
Comment by Aloisius 8 hours ago
To put it another way. If an autonomous vehicle has a reaction time of 0.3 seconds, the stopping distance from 17 mph is about the same as a fully alert human driver (1 second reaction time) driving 10.33 mph.
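Rough check of that equivalence (the ~0.8 g hard-braking deceleration is my assumption; the reaction times are the ones above):

    MPH = 0.44704  # metres per second per mph

    def stopping_distance(speed_mph, reaction_s, decel_mps2=8.0):
        v = speed_mph * MPH
        return v * reaction_s + v**2 / (2 * decel_mps2)  # reaction distance + braking distance

    print(round(stopping_distance(17, 0.3), 1))    # ~5.9 m for the autonomous vehicle
    print(round(stopping_distance(10.33, 1.0), 1)) # ~6.0 m for the fully alert human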
Comment by linsomniac 4 hours ago
There's a case to be made that it wasn't slow enough.
Comment by supern0va 3 hours ago
Comment by cortesoft 4 hours ago
Comment by Sohcahtoa82 2 hours ago
Nobody was injured.
Comment by loeg 4 hours ago
Comment by robotresearcher 49 minutes ago
How do you know that? The article says it slowed from 17 mph. That’s cautious progress speed, not cruising speed.
Comment by eszed 3 hours ago
I've read studies saying that most drivers don't brake at max effort, even to avoid a collision. This may be at least one of the reasons that Waymo predicted that an attentive human would likely have been going faster than their car at the moment of impact. I've got a good idea of my fun-car's braking performance, because I drive it hard sometimes, but after reading that I started practicing a bit with my wife's car on the school run, and... yeah: it's got a lot more braking power than I realized. (Don't worry: I brake hard on a long, straight exit ramp when no one's behind me, where a fast slow-down is perfectly safe, and the kiddo loves it.) I've now got an intuitive feel for where the ABS will kick in, and exactly what kind of stopping distance I have to work with, which makes me feel like a safer driver.
Second, going off my experience of hundreds and hundreds of ride-share rides, and maybe thirty Waymo journeys, I'd call the best 10-15% of humans better drivers than Waymo. Like, they're looking further up the road to predict which lane to be in, based on, say, that bus two blocks away. They also drive faster than Waymos do, without a perceptual decrease in safety. (I realize "perceptual" is doing some work in that sentence!) That's the type of defensive and anticipatory urban driver I try to be, so I notice when it's done well. Waymo, though, is flat-out better, in every way, than the vast majority of the ride-share drivers I see. I'm at the point where I'll choose a Waymo any time it'll go where I'm headed. This story reinforces that choice for me.
Comment by tialaramex 13 minutes ago
Going early means you slow early, which means you also take longer to reach the child, but you're braking for all of that extra time, so you're slowing down even more.
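A quick sketch of why those fractions of a second matter so much (all numbers assumed for illustration, not taken from this incident):

    MPH = 0.44704
    v0 = 17 * MPH          # approach speed, m/s
    decel = 8.0            # assumed hard-braking deceleration, m/s^2
    dist_to_child = 6.0    # assumed distance at which the child becomes visible, m

    for reaction_s in (0.3, 0.6, 1.0):
        braking_dist = max(dist_to_child - v0 * reaction_s, 0)
        v_sq = max(v0**2 - 2 * decel * braking_dist, 0)
        impact_mph = v_sq**0.5 / MPH
        print(f"brake after {reaction_s}s -> impact at ~{impact_mph:.0f} mph")
    # 0.3s -> 0 mph (stops short), 0.6s -> ~13 mph, 1.0s -> ~17 mph (no braking distance left)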
Comment by netsharc 6 hours ago
Comment by Aloisius 5 hours ago
They even wrote a blog post about it:
https://waymo.com/blog/2023/07/past-the-limit-studying-how-o...
Comment by jerlam 6 hours ago
Comment by superfrank 3 hours ago
Waymos do this and have for years. They know where the people are around them and will take precautionary action based on that.
Here's a video from 2019 of one recognizing that a car in the bike lane means cyclists may dart out into its lane, and taking action based on that. https://waymo.com/blog/2019/05/safety-at-waymo-self-driving-...
That video is nearly 7 years old at this point and they've gotten much, much better since then.
If you think a fully-attentive human driver would have done better, I think you're kidding yourself.
I know you didn't make this point, but if anyone thinks the average LA driver would have done better than this, I've got a bridge to sell you, and that comparison is really what matters more. (I say that as someone who used to live like half a mile from where this happened)
Comment by tim333 3 hours ago
Comment by Sparkle-san 8 hours ago
Comment by kcrwfrd_ 7 hours ago
Comment by estimator7292 5 hours ago
Comment by tialaramex 6 hours ago
https://www.gov.uk/theory-test/hazard-perception-test
... could in some circumstances know that there's a likelihood that a child will emerge suddenly and reduce their speed in anticipation where circumstances allow.
Note that if you cut speed but other drivers can't see why, they may overtake, even unsafely, because you are a nuisance to them. Slowing in anticipation that a child will run out from behind the SUV, only for a car behind you to accelerate around you and smack straight into the child at even higher speed, is not the desired outcome even though you didn't hurt anybody...
And yes, we'd need to see the video to know. It's like that Sully scenario. In a prepared test skilled pilots were indeed able to divert and land, but Sully wasn't prepared for a test. You're trained to expect engine failure in an aeroplane - it will happen sometimes so you must assume that, but for a jet liner you don't anticipate losing both engines, that doesn't happen. There's "Obviously that child is going in the road" and "Where the fuck did they come from?" and a lot in between and we're unlikely to ever know for sure.
Comment by Natsu 6 hours ago
Waymos constantly track pedestrians nearby, you can see it on the status screen if you ride in one. So it would be both better able to find pedestrians and react as soon as one was on a collision course. They have a bit more visibility than humans do due to the sensor placement, so they also can see things that aren't that visible to a person inside the car, not to mention being constantly aware of all 360 degrees.
While I suppose that in theory, a sufficiently paranoid human might outdo the robot, it looks to me like it's already well above the median here.
Comment by bee_rider 1 hour ago
Comment by dekhn 47 minutes ago
Comment by apitman 7 hours ago
Comment by trhway 3 hours ago
and that can potentially allow the internal planning algorithm to choose riskier, more aggressive trajectories/behavior, say to reach the target destination faster and thus deliver higher satisfaction to the passengers.
Comment by smohare 6 hours ago
Comment by usefulposter 8 hours ago
Note the weaselly "immediately detected the individual as soon as they began to emerge" in the puff piece from Waymo Comms. No indication that they intend to account for environmental context going forward.
If they already do this, why isn't it factored in the model?
Comment by jefftk 8 hours ago
Comment by jjav 5 hours ago
And I completely agree that from that instant forward, the car did everything correctly.
But if I was the accident investigator for this, I would be far more interested in what happened in the 30 seconds before the car saw the kid.
Was the kid visible earlier, only to disappear behind an obstruction? Or did the kid arrive from the side and was never visible at all? These are the more important questions.
Comment by chaboud 11 hours ago
Most human drivers (not all) know to nose out carefully rather than to gun it in that situation.
So, while I'm very supportive of where Waymo is trying to go for transport, we should be constructively critical and not just assume that humans would have been in the same situation if driving defensively.
Comment by jobs_throwaway 11 hours ago
Comment by themafia 7 hours ago
I feel like you have to say this out loud because many people in these discussions don't share this view. Billion dollar corporate experiments conducted in public are sacrosanct for some reason.
> I just think it's important to consider the counterfactual
More than 50% of roadway fatalities involve drugs or alcohol. If you want to spend your efforts improving safety _anywhere_ it's right here. Self driving cars do not stand a chance of improving outcomes as much as sensible policy does. Europe leads the US here by a wide margin.
Comment by jobs_throwaway 7 hours ago
Yes, and I find it annoying that some people do seem to think Waymo should never be criticized. That said, we already have an astounding amount of data, and that data clearly shows that the experiment is successful in reducing crashes. Waymos are absolutely, without question already making streets safer than if humans were driving those cars.
> If you want to spend your efforts improving safety _anywhere_ it's right here.
We can and should do both. And as your comment seems to imply but does not explicitly state, we should also improve road design to be safer, which Europe absolutely kicks America's ass on.
Comment by themafia 7 hours ago
I disagree. You need way more data, like orders of magnitude more. There are trillions of miles driven in the US every year. Those miles often include driving in inclement weather which is something Waymo hasn't even scraped the surface of yet.
> without question
There are _tons_ of questions. This is not even a simple problem. I cannot understand this confidence. It's far too eager or hopeful.
> We can and should do both
Well Google is operating Waymo and "we" control road policy. One of these things we can act on today and the other relies on huge amounts of investments paying off in scenarios that haven't even been tested successfully yet. I see an environment forming where we ignore the hard problems and pray these corporate overlords solve the problem on their own. It's madness.
Comment by jobs_throwaway 5 hours ago
Absurd, reductive, and non-empirical. Waymos crash and cause injury/fatality far less frequently than human drivers, full stop. You are simply out of your mind if you believe otherwise, and you should re-evaluate the data.
> Those miles often include driving in inclement weather which is something Waymo hasn't even scraped the surface of yet.
Yes. No one is claiming that Waymos are better drivers than humans in inclement weather, because they don't operate in those conditions. That does not mean Waymos are not able to outperform human drivers in the conditions in which they do operate.
> I see an environment forming where we ignore the hard problems and pray these corporate overlords solve the problem on their own. It's madness.
What's madness is your attitude that Waymos' track record does not show they are effective at reducing crashes. And again, working on policy does not prevent us from also improving technology, as you seem to believe it does.
Comment by decimalenough 6 hours ago
Yeah, I'm sure Waymos would struggle in a blizzard in Duluth, but a) so would a human and b) Waymos aren't driving there. (Yet.)
Comment by themafia 6 hours ago
No. I'm not. I'm being realistic about the technology. You're artificially limiting the scope.
> so would a human
This is goalpost moving 101. The question isn't whether a human driver would also struggle, but whether the Waymo _would be better_. You have zero data.
Comment by jobs_throwaway 5 hours ago
It is not moving the goalpost to say "so would a human". Comparison to human drivers is exactly the stated goalpost (and it should be).
> You have zero data.
Outrageously uninformed take. We have mountains of data that show Waymos in aggregate are safer drivers than humans.
Comment by ufmace 7 hours ago
Could you spell out exactly what "sensible" policy changes you were thinking of? Driving under the influence of drugs and/or alcohol is already illegal in every state. Are you advocating for drastically more severe enforcement, regardless of which race the person driving is, or what it does to the national prison population? Or perhaps for "improved transit access", which is a nice idea, but will take many decades to make a real difference?
Comment by bragr 6 hours ago
FWIW, your first OWI in Wisconsin, with no aggravating factors, is a civil offense, not a crime, and in most states it is rare to do any time or completely lose your license for the first offense. I'm not sure exactly what OP is getting at, but DUI/OWI limits and enforcement are pretty lax in the US compared to other countries. Our standard .08 BAC limit is a lot higher than many other countries.
Comment by ufmace 4 hours ago
To be a bit snarkier, and not directed at you, but I wish these supposedly superior Europeans would tell us what they actually want us to do. Should we enforce OWI laws more strictly, or lower the prison population? We can't do both!
Comment by Marsymars 8 minutes ago
Comment by veltas 11 hours ago
Comment by mlyle 11 hours ago
There are kinds of human sensing that are better when humans are maximally attentive (seeing through windows/reflections). But there's also the seeing-in-all-directions, radar, superhuman reaction time, etc, on the side of the Waymo.
Comment by jobs_throwaway 7 hours ago
Comment by onetokeoverthe 11 hours ago
Comment by JKCalhoun 9 hours ago
An excellent driver would have already seen that possible scenario and would have already slowed to 10 MPH or less to begin with.
(It's how I taught my daughters to drive "defensively": look for "red flags" and be prepared for the worst-case scenario. SUV near a school and I cannot see behind it? Red flag: slow the fuck down.)
Comment by coryrc 9 hours ago
At least it was already slowed down to 17 mph to start. Remember that viral video of some Australian in a pickup ragdolling a girl across the road? Most every comment is "well he was going the speed limit no fault for him!" No asshole, you hit someone. It's your fault. He got zero charges and the girl was seriously injured.
Comment by WheatMillington 8 hours ago
Comment by coryrc 4 hours ago
But you hit a kid in daytime? It's your fault. Period.
Comment by Dylan16807 2 hours ago
Comment by lurking_swe 4 hours ago
It’s possible a driver turns a corner (not wearing sunglasses) and suddenly the sun briefly blinds them, while a kid darts into the street.
I've seen kids (and ADULTS!) walk on the side of the street at night in all-black or very, very dark clothing. It's extra amusing when they happen to be black (are they trying to get themselves killed?). It's not the driver's fault if they genuinely can't see a camouflaged person. I've had numerous close calls like this on rural and suburban roads, and I think I'm a cautious driver. Make sure you are visible at night.
Or if a kid is riding a bicycle down a hill and flies into the middle of an intersection (dumb? brakes failed? etc). very possible to accidentally mow down the child.
HOWEVER, I do agree that 95% of the time it's the driver's fault if they hit a kid. Poor awareness and speed are the biggest factors. It is certainly not 100% of the time the driver's fault though. That's absurd. You really misunderstand how dumb some pedestrians (and parents) are.
But it's all beside the point. A child that doesn't understand the importance of crosswalks and looking both ways is too young to be walking alone, period. Yes, even if they're "right". Being right isn't helpful if you're dead.
Comment by lelanthran 5 hours ago
It is absurd, but that doesn't mean that the attitude can't be useful!
In teaching my teenager to drive, I drilled into him the fact that, in every accident, regardless of who is "at fault", there is almost always something that the other party could have done to mitigate it. I gave him plenty of situations as examples...
You're going down a street that has kids on the sidewalk? You better be prepared to have one of those kids come out in front of the car while rough-housing, playing, whatever.
You had right of way? Maybe you did, but did you even look at the opposing traffic to see if it was safe to proceed or did you just look at the traffic light?
I've driven, thus far in my life, roughly 600000km (maybe more) with 2x non-trivial accidents, both ruled not my fault. In hindsight, I could have avoided both of them (I was young and not so self-aware).
I'm paranoid when driving, and my stats are much much better than Waymo's (have never injured anyone - even my 2x accidents only had me injured), even though I drive in all sorts of conditions, and on all sorts of roads (many rural, some without markings).
Most people don't drive like this though (although their accident rate is still better than Waymo's).
Comment by throwway120385 7 hours ago
You have a responsibility to be cautious in heavy equipment no matter what the signage on the road says, and that includes keeping a speed at which you can stop safely if a person suddenly steps onto the road in situations where people are around. If you are driving past a busy bar in downtown, a drunk person might step out and you have a responsibility to assume that might happen. If you have to go slower sometimes, tough.
Comment by Aloisius 6 hours ago
For instance, a sailboat must alter course if a collision can't be avoided by the give-way vessel alone:
Rule 17(b):
> When, from any cause, the vessel required to keep her course and speed finds herself so close that collision cannot be avoided by the action of the give-way vessel alone, she shall take such action as will best aid to avoid collision.
So if you sail your boat into the path of a container ship and it tries to give way, but doesn't have the ability to do so quickly enough to prevent a collision, you're violating the rules if you don't alter course as well.
Plus, if we're going to connect this to a pedestrian, if a sailboat suddenly cut in front of a container ship with zero concern for its limited maneuverability/ability to stop, the sailboat would also violate Rule 2 by neglecting precaution required by seamen and failing to consider the limitations of the vessels involved.
Comment by khat 7 hours ago
Comment by cardiffspaceman 7 hours ago
In the Coast Guard Auxiliary “Sailing and Seamanship” class that I attended, targeting would-be sailboat skippers, we were told the USS Ranger nuclear-powered aircraft carrier had the right-of-way.
Comment by fennecbutt 8 hours ago
That logic is utter bs. If someone jumps out when you're travelling at an appropriate speed and you do your best to stop, then that's all that can be done. Otherwise, by your logic, the only safe speed is 0.
Comment by coryrc 4 hours ago
Comment by dilyevsky 7 hours ago
Comment by foxglacier 4 hours ago
Comment by aembleton 3 hours ago
Comment by jakewins 9 hours ago
Stopped buses are similar: people get off the bus, whip around the front of it, and step straight into the street. So many times I've spotted someone's feet under the front before they come around into the street.
Not to take away from Waymo here; I agree with the thread sentiment that they seem to have handled this in an exemplary way.
Comment by fennecbutt 8 hours ago
Comment by spockz 8 hours ago
Or you look for reflections in the cars parked around it. This is what I was taught as “defensive“ driving.
Comment by WheatMillington 8 hours ago
Comment by anthonyrstevens 2 hours ago
Comment by mattlondon 2 hours ago
Obviously the distances are different at that speed, but if the person steps out so close that you cannot react in time, you're fucked at any speed.
10mph will still do serious damage, so for the sake of the children please slow yourself and your daughters' driving down to 0.5mph where there are pedestrians or parked cars.
But seriously, I think you'd be safer to both slow down and put more space between the parked cars and your car, so that you are not scooting along with 30cm of clearance: move out and leave lots of room so there are better sight-lines for both you and pedestrians.
Comment by yibg 8 hours ago
Near my house, almost the entire trip from the freeway to my house is via a single lane with parked cars on the side. I would have to drive 10 MPH the entire way (speed limit is 25, so 2.5x as long).
Comment by coryrc 4 hours ago
Remove the free parking if that's making the road unsafe. Or drive 10 mph. Done.
Comment by jjav 5 hours ago
Comment by rsch 3 hours ago
- Parked cars on the street.
- Drive somewhat fast.
- Avoid killing people.
Pick two.
Comment by hombre_fatal 8 hours ago
A single lane residential street with zero visibility seems like an obvious time to slow down. And that's what the Waymo did.
Comment by yibg 8 hours ago
Comment by thewebguyd 4 hours ago
It is a drivers responsibility to drive for the conditions. If conditions are calling for driving 40% slower, then that's what you do and suck it up.
If too many roads have conditions that require that, take that up with your municipality to fix the situation. Or, even better, advocate for better public transit and trains, and designing cities to move people, not move cars.
Comment by jeffbee 8 hours ago
Comment by thewebguyd 4 hours ago
Car culture in the US is toxic, and a lot of accidents and fatalities are a result of how poorly designed our infrastructure is. We design for cars, not for people (just one more lane bro, will totally fix traffic. Nevermind that a train can move double the capacity of that entire line of traffic).
Cars are the wrong solution, particularly in urban areas. A self driving car is still a car, and comes along with all the same problems that cars cause.
Comment by fennecbutt 8 hours ago
Lmao most drivers I see on the roads aren't even capable of slowing down for a pedestrian crossing when the view of the second half of the crossing is blocked by traffic (ie they cannot see if someone is about to step out, especially a child).
Humans are utterly terrible drivers.
Comment by Sparkle-san 8 hours ago
Comment by kakacik 9 hours ago
In low traffic of course it can be different. But it's unrealistic to expect anybody to drive as though behind every single car they pass there may be a child about to jump right in front of them. That can easily be thousands of cars, every day, for a whole life. Impossible.
We don't read about the 99.9% of cases where even a semi-decent driver handles it safely, but the rare cases make the news.
Comment by jsrozner 8 hours ago
Comment by JKCalhoun 9 hours ago
Comment by jobs_throwaway 7 hours ago
Does Waymo claim that? If so I haven't seen it. That should of course be the goal, but "better than the average human driver" should be the bar.
Comment by JKCalhoun 5 hours ago
Comment by aembleton 3 hours ago
Comment by jobs_throwaway 2 hours ago
Comment by insane_dreamer 7 hours ago
They don't look far enough ahead to anticipate what might happen and already put themselves in a position to prepare for that possibility. I'm not sure they benefit from accumulated knowledge? (Maybe Waymo does, that's an interesting question.) I.e., I know that my son's elementary school is around the corner so as I turn I'm already anticipating the school zone (that starts a block away) rather than only detecting it once I've made the turn.
Comment by loeg 4 hours ago
Comment by tmostak 2 hours ago
This certainly may have been true of older Teslas with HW3 and older FSD builds (I had one, and yes you couldn't trust it).
Comment by ajross 23 minutes ago
It's much more of a competition than I suspect a lot of people realize.
Comment by hn_user82179 6 hours ago
> The woman told police she was “eating yogurt” before she turned onto the road and that she was late for an appointment. She said she handed her phone to her son and asked him to make a call “but could not remember if she had held it so face recognition could … open the phone,” according to the probable cause statement.
> The police investigation found that she was traveling 50 mph in a 40 mph zone when she hit the boys. She told police she didn’t realize she had hit anything until she saw the boys in her rearview mirror.
The Waymo report is being generous in comparing to a fully-attentive driver. I'm a bit annoyed at the headline choice here (from OP and the original journalist) as it is fully burying the lede.
Comment by torginus 11 hours ago
This is a context that humans automatically have and consider. I'm sure Waymo engineers can mark spots on the map where the car needs to drive very conservatively.
Comment by mikkupikku 9 hours ago
Yep. Driving safe isn't just about paying attention to what you can see, but also paying attention to what you can't see. Being always vigilant and aware of things like "I can't see behind that truck."
Honestly I don't think sensor-first approaches are cut out to tackle this; it probably requires something more akin to AGI, to allow inferring possible risks from incomplete or absent data.
Comment by ndsipa_pomu 10 hours ago
When reading the article, my first thought was that only going at 17mph was due to it being a robotaxi whereas UK drivers tend to be strongly opposed to 20mph speed limits outside schools.
Comment by zdragnar 9 hours ago
I'm not sure how much of that Waymo's cars take into account, as the law technically factors in line-of-sight things that a person could see but Waymo's sensors might not, such as children present on a sidewalk.
Comment by jefftk 8 hours ago
Are you sure? The ones I've seen have usually been 20 or 25mph.
Looking on Image Search (https://www.google.com/search?q=school+zone+speed+limit+sign) and limiting just to the ones that are photos of real signs by the side of the road, the first 10 are: 25, 30, 25, 20, 35, 15, 20, 55, 20, 20. So only one of these was 15.
Comment by enlightens 6 hours ago
Comment by jobs_throwaway 7 hours ago
Comment by cucumber3732842 8 hours ago
Comment by linsomniac 4 hours ago
Maybe. Depends on the position of the sun and shadows, I'm teaching my kids how to drive now and showing them that shadows can reveal human activity that is otherwise hidden by vehicles. I wonder if Waymo or other self-driving picks up on that.
Comment by accidc 6 hours ago
Does Waymo have object permanence and trajectory prediction (combined) comparable to that of a human?
Once the video evidence is out, it might become clear.
Generally Waymo seems to be a responsible actor so maybe that is the case and this can help demonstrate potential benefits of autonomous vehicles.
Alternatively, if even they can't get this right then it may cast doubts about the maturity of the entire ecosystem
Comment by minimaltom 38 minutes ago
On this note specifically I've actually been impressed, e.g. when driving down Oak St in SF (fast road, tightly parked cars) I've often observed it slow if someone on a scooter on the sidewalk turns to look toward oncoming traffic (as if to start riding), or slow when passing parked box trucks (which block the view of potential pedestrians).
Comment by kyleee 6 hours ago
Good technical question
Comment by rpdillon 6 hours ago
Comment by drunner 8 hours ago
As described by the NHTSA brief:
"within two blocks of a Santa Monica, CA elementary school during normal school drop off hours; that there were other children, a crossing guard, and several double-parked vehicles in the vicinity"
The fact that "there were other children, a crossing guard, and several double-parked vehicles in the vicinity" means that Waymo was driving recklessly by merely obeying the speed limit here (assuming it was 20mph), in a way that many humans would not.
Comment by Rebelgecko 7 hours ago
You will get honked at by aggro drivers if you slow down to the school zone speed limit of 25mph. Most cars go 40ish.
And ofc a decent chunk of those drivers are on tiktok, tinder, Instagram, etc
Comment by jobs_throwaway 7 hours ago
Your median human driver? Sadly, I think not. Most would be rushing, or distracted, or careless.
> waymo is driving recklessly by obeying the speed limit here (assuming it was 20mph) in a way that many humans would not.
I don't think we can say at all that the Waymo was driving recklessly with the data we currently have
Comment by tintor 8 hours ago
Comment by jobs_throwaway 7 hours ago
Comment by mmooss 3 hours ago
Why is it likely? Are we taking the vendor's claims in a blog post as truth?
Comment by GuinansEyebrows 7 hours ago
Comment by jobs_throwaway 7 hours ago
Comment by micromacrofoot 11 hours ago
Please please remember that any data from Waymo will inherently support their position and can not be taken at face value. They have significant investment in making this look more favorable for them. They have billions of dollars riding on the appearance of being safe.
Comment by almosthere 7 hours ago
It is also crazy that this happened 6 days ago at this point and video was NOT part of the press releases. LOL
Comment by Bud 1 hour ago
Comment by IncreasePosts 9 hours ago
Personally, I slow down and get extra cautious when I know I am near a place where lots of kids are and sight lines are poor. Even if the area is signed for 20 I might only be doing 14 to begin with, and also driving more towards the center of the road if possible with traffic.
Comment by jobs_throwaway 7 hours ago
Comment by shaky-carrousel 8 hours ago
Comment by jobs_throwaway 7 hours ago
Comment by ahahahahah 8 hours ago
Comment by themafia 7 hours ago
> a huge portion of human drivers
What are you basing any of these blind assertions off of? They are not at all borne out by the massive amounts of data we have surrounding driving in the US. Of course Waymo is going to sell you a self-serving line, but here on Hacker News you should absolutely challenge that, in particular because it's very far out of line with real-world data provided by the government.
Comment by jobs_throwaway 7 hours ago
>It's likely that a fully-attentive human driver would have done worse.
Is based off the source I gave in my comment, the peer-reviewed model
> a huge portion of human drivers
Is based on my experience and bits of data like 30% of fatal accidents involving alcohol
Like I said, if you have better data I'm glad to see it
Comment by themafia 7 hours ago
The data completely disagrees with you.
> Like I said, if you have better data I'm glad to see it
We all have better data. It's been here the entire time:
https://www.nhtsa.gov/research-data/fatality-analysis-report...
Comment by jobs_throwaway 5 hours ago
Again, I welcome you to point to data that contradicts my claims, but it seems you are unable
Comment by yesfitz 6 hours ago
The "Persons Killed, by Highest Driver Blood Alcohol Concentration (BAC) in the Crash"[1] report shows that in 2023, 30% of fatal crashes involved at least one driver with a BAC > 0.08 (the legal limit), and 36% involved a BAC > 0.01.
Interesting that "Non-motorist" fatalities have dropped dramatically for everyone under the age of 21, but increased for everyone between 21 and 74.[2] Those are raw numbers, so it'd be even more interesting to display them as a ratio of each group's size. Are fewer children being killed by drivers because there are fewer children generally? Changes in parents' habits? Backup cameras?
1: https://www-fars.nhtsa.dot.gov/Trends/TrendsAlcohol.aspx 2: https://www-fars.nhtsa.dot.gov/Trends/TrendsNonMotorist.aspx
Comment by drewda 7 hours ago
- Their "peer-reviewed model" compares Waymo vehicles against only "Level 0" vehicles. However, even my decade-old vehicle is considered "Level 1" because it has an automated emergency braking system. No doubt my Subaru's camera-based EBS performs worse than Waymo's, but it's still not being included in their "peer-reviewed model." That comparison intentionally pits Waymo's performance against the oldest vehicles on the road, not against the majority of cars sold currently.
- This incident happened during school dropoff. There was a double-parked SUV that occluded the view of the student. This crash was the fault of that double-parked driver. But why was the uncrewed Waymo driving at 17 mph to begin with? Do they not have enough situational awareness to slow the f*ck down around dropoff time immediately near an elementary school?
Automotive sensor/control packages are very useful and will be even more useful over time -- but Waymo is intentionally making their current offering look comparatively better than it actually is.
Comment by ajross 21 minutes ago
Comment by scarmig 11 hours ago
Comment by kilotaras 9 hours ago
The UK driving theory test has a part called Hazard Perception: not reacting to children milling around would be considered a fail.
[0] https://www.safedrivingforlife.info/free-practice-tests/haza...
Comment by mlyle 9 hours ago
> No person shall drive a vehicle upon a highway at a speed greater than is reasonable or prudent having due regard for weather, visibility, the traffic on, and the surface and width of, the highway, and in no event at a speed which endangers the safety of persons or property.
The speed limit isn't supposed to be a carte blanche to drive at that speed no matter what; the basic speed law is supposed to "win." In practice, enforcement is a lot more clear cut at the posted speed limit and officers don't want to write tickets that are hard to argue in court.
Comment by throwway120385 7 hours ago
Comment by toast0 6 hours ago
And at the same time, if you were traveling at some speed and no damage was caused, it's harder to say that persons or property were endangered.
Comment by matt-attack 10 hours ago
Having an understanding of the density and makeup of an obstacle that blew in front of you, because it was just a cardboard box. Seeing how it tumbles lightly through the wind, and forming a complete model of its mass and structure in your mind instantaneously. Recognizing that that flimsy fragment, though large, will do no damage and doesn't justify a swerve.
Getting into the mind of the driver in front of you, by seeing subtle hints that they're looking down, and recognizing that they're not fully paying attention. Seeing them sort of inch over because you can tell they want to change lanes, but they're not quite there yet.
Or in this case, perhaps hearing the sounds of children playing, recognizing that it's 3:20 PM and school is out, seeing other cars double-parked as you mentioned, all of it instantly screaming to a human driver to be extremely cautious because kids could be jumping out from anywhere.
Comment by Bratmon 4 hours ago
Comment by aleksiy123 4 hours ago
Comment by webdood90 9 hours ago
IMO, the bar should be that the technology is a significant improvement over the average performance of human drivers (which I don't think is that hard), not necessarily perfect.
Comment by mlyle 8 hours ago
How many humans drivers would pass it, and what proportion of the time? Even the best drivers do not constantly maintain peak vigilance, because they are human.
> IMO, the bar should be that the technology is a significant improvement over the average performance of human drivers (which I don't think is that hard), not necessarily perfect.
In practice, this isn't reasonable, because "hey we're slightly better than a population that includes the drunks, the inattentive, and the infirm" is not going to win public trust. And, of course, a system that is barely better than average humans might worsen safety, if it ends up replacing driving by those who would normally drive especially safe.
I think "better than the average performance of a 75th or 90th percentile human driver" might be a good way to look at things.
It's going to be a weird thing, because odds are the distribution of accidents that do happen won't look much like human ones. It will have superhuman saves (like that scooter one), but it will also crash in situations that we can't really picture humans doing.
I'm reminded of airbags; even first generation airbags made things much safer overall, but they occasionally decapitated a short person or child in a 5MPH parking lot fender bender. This was hard for the public to stomach, and if it's your kid who is internally decapitated by the airbag in a small accident, I don't think you'll really accept "it's safer on average to have an airbag!"
Comment by nearbuy 5 hours ago
Then you said, "this isn't reasonable", and the bar shouldn't be "slightly better" or "barely better". It should be at least better than the 75th percentile driver.
It sounds like you either misread the parent comment or you're phrasing your response as disagreement despite proposing roughly the same thing as the parent comment.
Comment by mlyle 5 hours ago
A 20% lower fatal crash rate compared to the average might be a significant improvement-- from a public health standpoint, this is huge if you could reduce traffic deaths by 20%.
But if you don't get the worst drivers to replace their driving with autonomous, that "20% less than average" might actually make things worse. That's my point. The bar has to be pretty dang high to be sure that you will actually make things better.
Comment by lkbm 7 hours ago
Sadly, you're right, but as rational people, we can acknowledge that it should. I care about reducing injuries and deaths, and the %tile of human performance needed for that is probably something like 30%ile. It's definitely well below 75%ile.
Comment by mlyle 7 hours ago
> > And, of course, a system that is barely better than average humans might worsen safety, if it ends up replacing driving by those who would normally drive especially safe.
It's only if you get the habitually drunk (a group that is overall impoverished), the very old, etc, to ride Waymo that you reap this benefit. And they're probably not early adopters.
Comment by lkbm 7 hours ago
You also solve for people texting (or otherwise using their phones) while driving, which is pretty common among young, tech-adopting people.
Comment by mlyle 7 hours ago
Yes, but the drivers who are 5th percentile drivers who cause a huge share of the most severe accidents are "special" in various ways. Most of them are probably not autonomy early adopters.
The guy who decided to drive on the wrong side of a double yellow on a windy mountain road and hit our family car in a probable suicide attempt was not going to replace that trip with Waymo.
Comment by chasd00 6 hours ago
Comment by mlyle 10 hours ago
Hey, I'd agree with this-- and it's worth noting that 17^2 - 5^2 > 16^2, so even 1MPH slower would likely have resulted in no contact in this scenario.
But, I'd say the majority of the time it's OK to pass an elementary school at 20-25MPH. Anything carries a certain level of risk, of course. So we really need to know more about the situation to judge the Waymo's speed. I will say that generally Waymo seems to be on the conservative end in the scenarios I've seen.
(My back of napkin math says an attentive human driver going at 12MPH would hit the pedestrian at the same speed if what we've been told is accurate).
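(For anyone who wants to check the v^2 arithmetic, here is a quick sketch in Python; constant deceleration and the 17 and ~5 mph figures above are the assumptions, nothing here is from the incident report:)

    # Under constant deceleration, braking distance is proportional to (v0^2 - v_end^2).
    v0, v_contact = 17.0, 5.0            # mph; the figures used in the comment above
    used_budget = v0**2 - v_contact**2   # proportional to the distance actually used while braking
    print(used_budget)                   # 264.0
    print(16.0**2)                       # 256.0 -> smaller, so from 16 mph the same distance
                                         #          is enough to stop completely before contact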
Comment by Aloisius 4 hours ago
Only with instant reaction time and linear deceleration.
Neither of those is the case. It takes time even for a Waymo to recognize a dangerous situation and apply the brakes, and vehicle deceleration is not actually linear.
Comment by mlyle 3 hours ago
Reaction time makes the math even better here. You travel v1 * reaction_time no matter what, before entering the deceleration regime. So if v1 gets smaller, you get to spend a greater proportion of time in the deceleration regime.
> linear deceleration.
After the reaction time, stopping distance is pretty close to proportional to v^2. There are weird effects at high speed (contribution from drag) and at very low speed, but they make pretty modest contributions.
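(A minimal sketch of that kinematics in Python, assuming constant deceleration after a fixed reaction time; the gap, reaction times, and deceleration below are illustrative guesses, not figures from this incident:)

    import math

    MPH_TO_MS = 0.44704

    def impact_speed(v0_mph, gap_m, reaction_s, decel_ms2=7.0):
        # Speed at contact (mph) for initial speed v0_mph, distance gap_m to the
        # pedestrian, a fixed reaction time, and constant deceleration afterwards.
        v0 = v0_mph * MPH_TO_MS
        braking_dist = gap_m - v0 * reaction_s   # distance left once braking actually starts
        if braking_dist <= 0:
            return v0_mph                        # still at full speed at contact
        v_sq = max(0.0, v0**2 - 2 * decel_ms2 * braking_dist)
        return math.sqrt(v_sq) / MPH_TO_MS

    # Illustrative comparison over the same 12 m gap: a fast reactor vs a slow one.
    for reaction in (0.3, 1.5):
        print(reaction, round(impact_speed(17, 12, reaction), 1))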
Comment by Aloisius 2 hours ago
Without that these vehicles could only start braking when certainty crossed some arbitrary threshold.
Comment by pastage 10 hours ago
Comment by mlyle 10 hours ago
There are definitely times and situations where the right speed is 7 MPH and even that feels "fast", too.
Comment by drcongo 10 hours ago
Comment by recursive 9 hours ago
Comment by something765478 9 hours ago
Comment by acdha 9 hours ago
Comment by dekhn 6 hours ago
Comment by dboreham 9 hours ago
Comment by trollbridge 8 hours ago
I would not race at 17 MPH through such an area. Of course, Waymo will find a way to describe themselves as the heroes of this situation.
Comment by mlsu 5 hours ago
These giant SUVs really are the worst when it comes to child safety
Comment by calchris42 11 hours ago
The question will be whether 17 mph was a reasonably cautious speed for this specific scenario. Many school zones have 15 mph limits, and when there are kids about, people may go even slower. At the same time, the general rule in CA for school zones is 25 mph. Clearly the car had some level of caution, which is good.
Comment by dcanelhas 10 hours ago
What I would like to see is a full-scale vehicle simulator where humans are tested against virtual scenarios that faithfully recreate autonomous driving accidents to see how "most people" would have acted in the minutes leading up to the event as well as the accident itself
Comment by aaomidi 10 hours ago
Comment by dcanelhas 10 hours ago
Comment by Teknoman117 9 hours ago
Comment by gerdesj 2 hours ago
I look for shadows underneath stationary vehicles. I might also notice pedestrians "vanishing". I have a rather larger "context" than any robot effort.
However, I am just one example of human. My experience of never managing to run someone over is just an anecdote ... so far. The population of humans as a whole manages to run each other over rather regularly.
A pretty cheap instant human sensor might be Bluetooth/BLE noting phones/devices in near range. Pop a sensor in each wing mirror and on the top and bottom. The thing would need some processing power but probably nothing that the built in Android dash screen couldn't handle.
There are lots more sensors that car manufacturers are trying to avoid for cost reasons, that would make a car way better at understanding the context of the world around it.
I gather that Tesla insists on optical (cameras) only and won't do LIDAR. My EV has four cameras and I find it quite hard to see what is going on when it is pissing down with rain, in the same way I do if I don't clean my specs.
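(For what it's worth, here is a very rough sketch of what that "phones nearby" idea could look like in Python with the bleak BLE library; this is purely an illustration of the concept, not anything a car actually ships, and a real sensor would also need signal-strength filtering to estimate range:)

    import asyncio
    from bleak import BleakScanner

    async def nearby_ble_count(scan_seconds: float = 3.0) -> int:
        # Count devices currently advertising over BLE; a crude "people are probably
        # around" proxy, since most phones and wearables advertise periodically.
        devices = await BleakScanner.discover(timeout=scan_seconds)
        return len(devices)

    if __name__ == "__main__":
        print("BLE devices advertising nearby:", asyncio.run(nearby_ble_count()))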
Comment by barbazoo 8 hours ago
Comment by mholt 9 hours ago
This is the fault of the software and company implementing it.
Comment by BugsJustFindMe 8 hours ago
Some do, some of the time. I'm always surprised by how much credence other people give to the idea that humans aren't on average very bad at things, including perception.
Comment by navigate8310 1 hour ago
Comment by recursive 8 hours ago
Comment by random_duck 11 hours ago
Comment by direwolf20 11 hours ago
Comment by dylan604 10 hours ago
Comment by direwolf20 10 hours ago
What else do you expect them to do, only run in grade-separated areas that children can't access? Blare sirens so children get scared away from roads? Shouldn't human-driven cars do the same thing then?
Comment by recursive 9 hours ago
Comment by gruez 8 hours ago
So by that logic, if we cured cancer but the treatment came with terrible side effects it wouldn't be considered a "success"? Does everything have to perfect to be a success?
Comment by orwin 9 hours ago
Comment by seanmcdirmid 9 hours ago
The real killer here is the crazy American on street parking, which limits visibility of both pedestrians and oncoming vehicles. Every school should be a no street parking zone. But parents are going to whine they can't load and unload their kids close to the school.
Comment by jerlam 8 hours ago
Comment by JumpCrisscross 8 hours ago
Plenty of American cities regulate, or have even eliminated to various degrees, on-street parking.
Comment by seanmcdirmid 8 hours ago
Comment by jerlam 6 hours ago
Comment by trollbridge 8 hours ago
I guess you could keep doing that until kids just walk to and from school?
Comment by seanmcdirmid 8 hours ago
Comment by the_other 8 hours ago
It doesn't stop all on street parking beside the school, but it cuts it down a noticeable amount.
Comment by dboreham 9 hours ago
Comment by dylan604 7 hours ago
Comment by trollbridge 8 hours ago
Comment by parl_match 9 hours ago
> What else do you expect them to do, only run in grade-separated areas that children can't access?
no, i expect them to slow down when children may be present
Comment by direwolf20 9 hours ago
Comment by parl_match 3 hours ago
Comment by direwolf20 2 hours ago
Comment by parl_match 1 hour ago
The simple fact is that it hit a child and even though it wasn't a serious issue due to their safety policies, there's still room for improvement in these technologies.
And since it's a robot, and not a human, you can actually make changes and have them stick. For example, routing away from schools during certain hours.
Comment by dylan604 8 hours ago
Comment by direwolf20 8 hours ago
Comment by autoexec 9 hours ago
https://www.theverge.com/2022/1/28/22906513/waymo-lawsuit-ca...
Comment by BugsJustFindMe 10 hours ago
Comment by red75prime 7 hours ago
Comment by voidUpdate 11 hours ago
Comment by TeMPOraL 11 hours ago
Comment by xnx 9 hours ago
Indeed. Waymo is a much more thoughtful and responsible company than Cruise, Uber, or Tesla.
"Cruise admits to criminal cover-up of pedestrian dragging in SF, will pay $500K penalty" https://www.sfgate.com/tech/article/cruise-fine-criminal-cov...
Comment by direwolf20 11 hours ago
Comment by mitthrowaway2 7 hours ago
Comment by micromacrofoot 11 hours ago
Comment by flutas 6 hours ago
The Waymo blog post refused to say the word "child", instead using the phrase "young pedestrian" once.
The Waymo blog post switches to "the pedestrian" and "the individual" for the rest of the post.
The Waymo blog post also consistently uses the word "contact" instead of hit, struck, or collision.
The Waymo blog post makes no mention of the injuries the child sustained.
The Waymo blog post makes no mention of the school being in close proximity.
The Waymo blog post makes no mention of other children or the crossing guard.
The Waymo blog post makes no mention of the car going over the school zone speed limit (17 in 15).
Comment by SauntSolaire 2 hours ago
Comment by boh 8 hours ago
Comment by moomoo11 8 hours ago
That’s why they purchase goods and services (from others) and then cry about things they don’t and probably never will understand.
And why they can be ignored and just fed some slop to feel better.
I could lie but that’s the cold truth.
Edit: I'm not sure if the repliers are being dense (highly likely), or you just skipped over context (you can click the "context" link if you're new here)
> So the TechCrunch headline should be "Waymo hits child better than a human driver would"? Not sure if the details reflect how the general public actually interprets this story (see the actual TC headline for exhibit A).
That is the general public sentiment I was referring to.
Comment by boh 7 hours ago
Comment by moomoo11 7 hours ago
It isn't me vs them. It is just me being self-aware. Clearly, you had a problem with what I said so I must have struck a nerve.
Welcome to the real world bro.
Comment by butlike 7 hours ago
Comment by butlike 7 hours ago
Comment by moomoo11 5 hours ago
look at what I was replying to. if you still don't get it, then yeah I'm just proving my point and you can keep crying about it.
> So the TechCrunch headline should be "Waymo hits child better than a human driver would"? Not sure if the details reflect how the general public actually interprets this story (see the actual TC headline for exhibit A).
That is the general public sentiment I was referring to.
The fact that you go around asking dumb questions in bad faith to people is enough for me, last time I engage with you.
Have a good life!
Comment by ajdude 4 hours ago
Isn't the speed limit normally 15 mph or less in a school zone? Was the robotaxi speeding?
Comment by chmod775 3 hours ago
It's the "best outcome" if you're trying to go as fast as possible without breaking any laws or ending up liable for any damage.
German perspective, but if I told people I've been going 30km/h next to a school with poor visibility as children are dropped off around me, I would be met with contempt for that kind of behavior. I'd also at least face some partial civil liability if I hit anyone.
There's certainly better handling of the situation possible, it's just that US traffic laws and attitudes around driving do not encourage it.
I suspect many human drivers would've driven slower, law or no law.
Comment by ChrisMarshallNY 3 hours ago
Human reaction times are terrible, and lots of kids get seriously injured, or killed, when they run out from between cars.
Comment by aucisson_masque 4 hours ago
Where is the video recording ?
Comment by behringer 1 hour ago
Comment by alphazard 9 hours ago
Comment by WheatMillington 8 hours ago
Comment by fragmede 3 hours ago
Love him or hate him, releasing the video is something I can see Elon doing, because, assuming a human driver would have done worse, it speaks for itself. Release a web video game where the child sometimes jumps out in front of the car and see how fast humans respond, like the "land the Starship" game. Assuming humans would do worse, that is. If the child was clearly visible through the car or somehow otherwise avoidable by humans, then I'd be hiding the video too.
Comment by rafram 1 hour ago
Comment by WalterBright 6 hours ago
I was very very lucky.
Comment by socalgal2 4 hours ago
Those kinds of neighborhoods, where the outer houses face the fast, large roads, are I think less common now, but lots of them are left over from 50+ years ago.
Comment by WalterBright 2 hours ago
That incident still gives me the willies.
Comment by rdudek 11 hours ago
Comment by socalgal2 4 hours ago
what about all the traffic violations though?
Comment by zx8080 3 hours ago
Stopped or moved? Is it allowed in CA to move the car at all after a serious accident?
Comment by rapind 3 hours ago
Comment by dyauspitr 11 hours ago
Comment by dust42 10 hours ago
Comment by butlike 7 hours ago
Comment by pizzafeelsright 7 hours ago
I am personally a fan of entirely automated but slow traffic. 10mph limit with zero traffic is fast enough for any metro area.
Comment by croes 9 hours ago
Comment by dfxm12 6 hours ago
Importantly, Waymo takes full ownership for something they write positively: Our technology immediately detected the individual.... But Waymo weasels out of taking responsibility for something they write about negatively.
Comment by packetslave 4 hours ago
the "Waymo Driver" is how they refer to the self-driving platform (hardware and software). They've been pretty consistent with that branding, so it's not surprising that they used it here.
> Importantly, Waymo takes full ownership for something they write positively [...] But Waymo weasels out of taking responsibility for something they write about negatively
Pretty standard for corporate Public Relations writing, unfortunately.
Comment by veltas 11 hours ago
Comment by chaboud 11 hours ago
Comment by seanmcdirmid 11 hours ago
Comment by AndrewKemendo 5 hours ago
The vast vast vast majority of human drivers would not have been able to accomplish that braking procedure that quickly, and then would not have been able to manage the follow-up so quickly.
I have watched other parent drivers in the car pick-up line at public schools for the last 16 years, and people are absolutely trash at navigating that whole process; parents drive so poorly it's absurd. At least half the parents I see are on their phones while literally feet away from hitting some kid.
Comment by mmooss 3 hours ago
> The vast vast vast majority of human drivers ... would not have been able to manage the follow up so quickly
You are saying the "vast vast vast majority of human drivers" wouldn't pull over after hitting a child?
I remember similar blind faith in and unlimited advocacy for anything Tesla and Musk said, and look how that has turned out. These are serious issues for the people in our communities, not a sporting event with sides.
Comment by raincole 5 hours ago
> From the Waymo blog
Yeah, like, no shit Sherlock. We'd better wait for some videos before making our opinions.
Comment by lostlogin 9 hours ago
If it can yell at the kid and send a grumpy email to the parents and school, the automation is complete.
Comment by anovikov 11 hours ago
Comment by jayd16 11 hours ago
Comment by gensym 10 hours ago
The road design there was the real problem, combined with the size and shape of modern vehicles that impede visibility.
Comment by pastage 10 hours ago
Comment by lokar 11 hours ago
Comment by aanet 8 hours ago
Having performed this exact test on 3 dozen vehicles (L2/L3/L4) for several AV companies in the Bay Area [1], I would say that Waymo's response, per their blog post [2] has been textbook compliance. (I'm not defending their performance... just their response to the collision). This test / protocol is hard for any driver (including human driven vehicles), let alone ADAS/L3/L4 vehicles, for various reasons, including: pedestrian occlusion, late ped detection, late braking, slick roads, not enough braking, etc. etc.
Having said all that, full collision avoidance would have been the best outcome, which, in this case, it wasn't. Wherever the legal fault may lie -- and there will be a big debate here -- Waymo will still have to accept some responsibility, given how aggressively they are rolling out their commercial services.
This only puts more onus on their team to demonstrate a far higher standard of driving than human drivers. Sorry, that's just the way societal acceptance is. We expect more from our robots than from our fellow humans.
[1] Yes, I'm an AV safety expert
[2] https://waymo.com/blog/2026/01/a-commitment-to-transparency-...
(edit: verbiage)
Comment by ssteeper 1 hour ago
Comment by aanet 1 hour ago
That's a difficult question to answer, and the devil really is in the details, as you may have guessed. What I can say is that Waymo is, by far, the most prolific publisher of research on AV safety on public roads. (Yes, those are my qualifiers...)
Here's their main stash [1] but notably, three papers talk about comparison of Waymo's rider-only (i.e. no safety driver) performance vis-a-vis human driver, at 7.1 million miles [2], 25 million miles [3], 56 million miles [4]. Waymo has also been a big contributor to various AV safety standards as one would expect (FWIW, I was also a contributor to 3 of the standards... the process is sausage-making at its finest, tbh).
I haven't read thru all their papers, but some notable ones talk about the difficulty of comparing AV vs human drivers [5], and various research on characterising uncertainty / risk of collision, comparing AVs to non-impaired, eyes-on human driver [6]
As one may expect, at least one of the challenges is that human-driven collisions are almost always very _lagging indicators_ of safety (i.e. collision happened: lost property, lost limbs, lost lives, etc.)
So, net-net, Waymo still has a VERY LONG WAY to go (obviously) to demonstrate better than human driving behavior, but they are showing that their AVs are better-than-humans on certain high-risk (potential) collisions.
As somebody remarked, the last 1% takes 90% of time/effort. That's where we are...
---
[1] https://waymo.com/safety/research
[2] https://waymo.com/research/comparison-of-waymo-rider-only-cr...
[3] https://waymo.com/research/do-autonomous-vehicles-outperform...
[4] https://waymo.com/research/comparison-of-waymo-rider-only-cr...
[5] https://waymo.com/research/comparative-safety-performance-of...
[6] https://waymo.com/blog/2022/09/benchmarking-av-safety/
[edit: reference]
Comment by okdood64 6 hours ago
Why? This is only true if they weren't supposed to be on the road in the first place. Which is not true.
Comment by GoatInGrey 4 hours ago
If I program a machine and it goes out into the world and hurts someone who did not voluntarily release my liability, that's on me.
Comment by lmm 2 hours ago
Comment by danpalmer 5 hours ago
Comment by aanet 5 hours ago
Indeed, it is, and that is exactly why Waymo will have to accept some responsibility. I can bet that internally Waymo's PR and Legal teams are working overtime to coordinate the details with NHTSA. We, the general public, may or may not know the details at all, if ever. However, Waymo's technical teams (Safety, etc) will also be working overtime to figure out what they could have done better.
As I mentioned, this is a standard test, and Waymo likely has 1000s of variations of this test in their simulation platforms; they will sweep across all possible parameters to make this test tighter, including the MER (minimum expected response from the AV) and perhaps raise the bar on MER (e.g. brake at max deceleration in some cases, trading off comfort metrics in those cases; etc.) and calculate the effects on local traffic (e.g. "did we endanger the rear vehicles by braking too hard? If so, by how much??" etc). All these are expected actions which the general public will never know (except, perhaps via some technical papers).
Regardless, the PR effects of this collision do not look good, especially as Waymo is expanding their service to other cities (Miami just announced; London by EOY2026). This PR coverage has the potential to do more damage to the company than the actual physical damage to the poor traumatized kid and his family. THAT is the responsibility only the company will pay for.
To be sure, my intuition tells me this is not the last such collision. Expect to see some more, by other companies, as they commercialize their own services. It's a matter of statistics.
Comment by mmooss 3 hours ago
> I would say that Waymo's response, per their blog post [2] has been textbook compliance.
Remember Tesla's blog posts? Of course Waymo knows textbook compliance just like you do, and of course that's what they would claim.
Comment by aanet 3 hours ago
Most likely, yes, the NHTSA investigation will be a credible source of info for this case. HOWEVER, Waymo will likely fight tooth and nail to keep it from being made public. They will likely cite "proprietary algorithms / design", etc. to protect it from being released. So, net-net, I dunno... Will have to wait and see :shrug.gif:
But meanwhile, personally I would read reports from experts like Phil Koopman [1] and Missy Cummings [2] to see their take.
> Remember Tesla's blog posts?
You, Sir, cite two companies that are diametrically opposite on the safety spectrum, as far as good behavior is concerned. Admittedly, one would have less confidence in Waymo's own public postings about this (and I'd be mighty surprised if they actually made public their investigation data, which would be a welcome and pioneering move).
On the other hand, the other company you mentioned, the less said the better.
Comment by aanet 2 hours ago
As I did suspect, legal scholars are already calling for "voluntary disclosure" from Waymo re: its annotated videos of the collision [2]. FWIW, my skepticism about Waymo actually releasing it remains...
[1] https://static.nhtsa.gov/odi/inv/2026/INOA-PE26001-10005.pdf
[2] https://www.linkedin.com/posts/matthew-wansley-62b5b9126_a-w...
Comment by maerF0x0 10 hours ago
I think the overall picture is a pretty fantastic outcome -- even a single event is a newsworthy moment _because it's so rare_.
> The NHTSA’s Office of Defects Investigation is investigating “whether the Waymo AV exercised appropriate caution given, among other things, its proximity to the elementary school during drop off hours, and the presence of young pedestrians and other potential vulnerable road users.”
Meanwhile, in my area of the world, parents are busy, stressed, and on their phones, pressing the accelerator hard because they're time-pressured and feel like that will make up for the 5 minutes late they are on a 15 minute drive... The truth is this technology is, as far as I can tell, superior to humans in a high number of situations, if only for a lack of emotionality (and inability to text and drive / drink and drive)... but for some reason the world wants to keep nitpicking it.
A story: my grandpa drove for longer than he should have. Yes, him losing his license would have been the optimal case. But, pragmatically, that didn't happen... him being in and using a Waymo (or Cruise, RIP) car would have been a marginal improvement on the situation.
Comment by Veserv 8 hours ago
However, the child pedestrian injury rate is only an official estimate (it may be undercounting relative to highly scrutinized Waymo vehicle-miles) and is a whole-US average (it might not be a comparable operational domain), but absent more precise and better information, we should default to the calculation of 2-4x the rate.
[1] https://afdc.energy.gov/data/10315
[2] https://crashstats.nhtsa.dot.gov/Api/Public/Publication/8137...
Comment by 10000truths 5 hours ago
Comment by Veserv 5 hours ago
Until then, it is only prudent to defer snap judgements, but increase caution, insist on rigor and transparency, and demand more accurate information.
Comment by smarnach 4 hours ago
No we should not. We should accept that we don't have any statistically meaningful number at all, since we only have a single incident.
Let's assume we roll a standard die once and it shows a six. Statistically, we only expect a six in one sixth of the cases. But we already got one on a single roll! Concluding Waymo vehicles hit 2 to 4 times as many children as human drivers is like concluding the die in the example is six times as likely to show a six as a fair die.
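(To put a rough number on that, here is a sketch in Python using the per-mile figures quoted elsewhere in this thread, roughly 1 child pedestrian injury per 470 million human-driven miles and 100-200 million Waymo miles; under a simple Poisson model, a single such event is not very surprising even if Waymo matched the human rate:)

    import math

    human_rate = 1 / 470e6   # child pedestrian injuries per mile (figure quoted in this thread)
    for waymo_miles in (100e6, 150e6, 200e6):
        expected = human_rate * waymo_miles          # expected count at the human per-mile rate
        p_at_least_one = 1 - math.exp(-expected)     # Poisson P(>= 1 event)
        print(f"{waymo_miles/1e6:.0f}M miles: expected {expected:.2f}, P(>=1) = {p_at_least_one:.0%}")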
Comment by akoboldfrying 3 hours ago
Comment by smarnach 2 hours ago
Comment by NewJazz 4 hours ago
Comment by smarnach 4 hours ago
Comment by akoboldfrying 3 hours ago
Comment by Jblx2 6 hours ago
Comment by maerF0x0 8 hours ago
Comment by Spivak 7 hours ago
And for not totally irrational reasons like machine follows programming and does not fear death, or with 100% certainty machine has bugs which will eventually end up killing someone for a really stupid reason—and nobody wants that to be them. Then there's just the general https://xkcd.com/2030/ problem of people rightfully not trusting technology because we are really bad at it, and our systems are set up in such a way that once you reach critical mass of money consequences become other people's problem.
Washington banned automatic subway train operation for 15 years after one incident that wasn't the computer's fault, and they still make a human sit in the cab. That's the bar. In that light it's hard not to see these cars as playing fast and loose with people's safety by comparison.
Comment by sebzim4500 6 hours ago
Are they? It is now clear that Tesla FSD is much worse than a human driver and yet there has been basically no attempt by anyone in government to stop them.
Comment by deceptionatd 3 hours ago
No one in the _US_ government. Note that European governments and China haven't approved it in the first place.
Comment by tmostak 2 hours ago
Comment by fragmede 3 hours ago
Comment by naet 2 hours ago
Surprised at how many comments here seem eager to praise Waymo based on their PR statement. Sure, it sounds great if you read that the Waymo slowed down faster than a human would have. But would a human truly have hit the child here? Two blocks from a school with tons of kids, crossing guards, double-parked cars, etc.? The same Waymo that is under investigation for passing school buses illegally? It may have been entirely avoidable for the average human in this situation, but the robotaxi had a blind spot that it couldn't reason around and drove negligently.
Maybe the robotaxi did prevent some harm by braking with superhuman speed. But I am personally unconvinced it was a completely unavoidable freak accident type of situation without seeing more evidence than a blog post by a company with a heavily vested interest in the situation. I have anecdotally seen Waymo in my area drive poorly in various situations, and I'm sure I'm not the only one.
There's the classic "humans are bad drivers" but I don't think that is an excuse to not look critically into robotaxi accidents. A human driver who hit a child next to a school would have a personal responsibility and might face real jail time or at the least be put on trial and investigated. Who at Waymo will face similar consequences or risk for the same outcome?
Comment by tokioyoyo 1 hour ago
Comment by bhewes 7 hours ago
Comment by phainopepla2 5 hours ago
Comment by Analemma_ 6 hours ago
Comment by dlg 8 hours ago
While it's third-hand, word on the local parent chat is that the parent dropped their kid off on the opposite side of the street from Grant. Even though there was a crossing guard, the kid ran behind a car and ran right out into the street.
If those rumors are correct, I'd say it's the kid's/family's fault. That said, I think autonomous vehicles should probably go extra slowly near schools, especially during pickup and dropoff.
Comment by sowbug 8 hours ago
They got the point.
Comment by joefarish 8 hours ago
Comment by doctorpangloss 7 hours ago
Comment by dlg 7 hours ago
Comment by trollbridge 8 hours ago
It is never a 6 year old's fault if they get struck by a robot.
Comment by blell 8 hours ago
Comment by altairprime 5 hours ago
If you want to see an end to this nonsensical behavior by parents, pressure your local city into having strict traffic enforcement and ticketing during school hours at every local school, so that the parent networks can’t share news with each other of which school is being ‘harassed’ today. Give license points to vehicles that drop a child across the street, issue parking tickets to double parkers, and boot vehicles whose drivers refuse to move when asked. Demand they do this for the children, to protect them from the robots, if you like.
But.
It'll protect them much more from the humans than from the robots, and after a few thousand tickets are issued to parents behaving badly, you'll find that the true threat to children's safety on school roads is children's parents, just as the schools have known for decades. And that's not a war you'll win arguing against robots. (It's a war you'll win arguing against child-killing urban roadway design, though!)
Comment by altairprime 3 hours ago
Comment by IAmBroom 6 hours ago
Comment by scottbez1 1 hour ago
There are always systemic factors that can be improved, for example working on street design to separate dangerous cars from children, or transportation policy by shifting transportation to buses, bikes, and walking where the consequences of mistakes are significantly reduced.
Cars are the #2 killer of children in the US, and it’s largely because of attitudes like this that ignore the extreme harm that is caused by preventable “accidents”
Comment by Zigurd 8 hours ago
Comment by rsch 3 hours ago
And it is not the child’s or their parents’ fault either:
Once you accept elementary school aged children exist, you have to accept they will sometimes run out like this. Children just don’t have the same impulse control as adults. And honestly even for adults stepping out a bit from behind an obstacle in the path of a car is an easy mistake to make. Don’t forget that for children an SUV is well above head height so it isn’t even possible for them to totally avoid stepping out a bit before looking. (And I don’t think stepping out vs. running out changes the outcome a lot)
This is why low speed limits around schools exist.
So I would say the Waymo did pretty well here, it travelled at a speed where it was still able to avoid not only a fatality but also major injury.
Comment by calibas 1 hour ago
Not sure where this is coming from, and it's directly contradicted by the article:
> Waymo said in its blog post that its “peer-reviewed model” shows a “fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.” The company did not release a specific analysis of this crash.
Comment by seanmcdirmid 3 hours ago
I get what you are trying to say and I definitely agree in spirit, but I tell my kid (now 9) "it doesn't matter if it isn't your fault, you'll still get hurt or be dead." I spent a lot of time teaching him how to cross the street safely before I let him do it on his own: not to trust cars to do the right thing, not to trust them to see you, not to trust some idiot not to park right next to a crosswalk in a huge van that cars have no chance of seeing over.
If only we had a Dutch culture of pedestrian and road safety here.
Comment by bikamonki 5 minutes ago
Comment by Zopieux 7 hours ago
Yes, kids in developed countries have the autonomy to go to school by themselves from a very young age, provided the correct mindset and a safe environment. That's a combination of:
* high-trust society: commuting alone or in a small group is the norm, soccer moms a rare exception,
* safe, separated lanes for biking/walking when that's an option.
Comment by luses 1 hour ago
Most commenters here are ignoring the structural incentives. The long-term threat of Waymo isn't safety, it's the enclosure of public infrastructure. These companies are building a permission structure to lobby personal vehicles and public transit off the road.
Transportation demand is inelastic. If we allow a transition where mobility is captured by private platforms, the consumer loses all leverage. The endgame is the American healthcare model: capture the market, kill alternatives, and extract maximum rent because the user has no choice. We need dense urban cores and mass transit, not a dependency on rent-seeking oligopolies.
Comment by aimor 8 hours ago
https://www.yahoo.com/news/articles/child-struck-waymo-near-...
Comment by JumpCrisscross 8 hours ago
Comment by toast0 6 hours ago
Comment by JumpCrisscross 2 hours ago
+/- 2 mph is within acceptable speedometer and other error. (15 mph doesn't mean "never exceed" under any legal interpretation I know.)
It's reasonable to say Waymo would reduce its speed in a 12 mph zone versus a 15 in a way most humans would not.
Comment by nkrisc 8 hours ago
Comment by saalweachter 4 minutes ago
Comment by mmooss 3 hours ago
Comment by lmm 2 hours ago
We're not though. Drivers are allowed to kill as many people as they like as long as they're apologetic and weren't drinking; at most they pay a small fine.
Comment by mmooss 1 hour ago
Also, where I live that's manslaughter, a serious felony that can put you in jail.
Comment by xvector 3 hours ago
Comment by jsrozner 8 hours ago
Oh also, that video says the "kid ran out from a double parked SUV". Can you imagine being dumb enough to drive over the speed limit around a double-parked SUV in a school zone?
Comment by Aloisius 7 hours ago
The 15 mph speed limit starts on the block the school is on. The article says the Waymo was within two blocks of the school, so it's possible they were in a 25 mph zone.
Comment by cucumber3732842 8 hours ago
Can you imagine being dumb enough to think that exceeding a one-size-fits-all number on a sign by <10% is the main failing here?
As if 2mph would have fundamentally changed this. Pfft.
A double-parked car, in an area chock full of street parking (hence the double park), next to "something" that's a magnet for pedestrians, and probably with a bunch of pedestrians around, should be a "severe caution" situation for any driver who "gets it". You shouldn't need a sign to tell you that this is a particular zone and that it warrants a particular magic number.
The proper reaction to a given set of indicators that indicate hazards depends on the situation. If this were easy to put in a formula Waymo would have and we wouldn't be discussing this accident because it wouldn't have happened.
Comment by jsrozner 2 hours ago
The fact that it’s hard to turn this into a formula is exactly why robot drivers are bad.
Comment by BugsJustFindMe 7 hours ago
According to https://news.ycombinator.com/item?id=46812226 1mph slower might have entirely avoided contact in this particular case.
Comment by fwip 8 hours ago
In a school zone, when in a situation of low visibility, the car should likely be going significantly below the speed limit.
So, it's not a case of 17mph vs 15mph, but more like 17mph vs 10mph or 5mph.
Comment by AuthAuth 3 hours ago
Please pass this message on to the 99.999% of human drivers who think the speed limit is the minimum speed.
Comment by cucumber3732842 7 hours ago
The car clearly failed to identify that this was a situation it needed to be going slower. The fact that it was going 17 instead of 15 is basically irrelevant here except as fodder for moral posturing. If the car is incapable of identifying those situations no amount of "muh magic number on sign" is going to fix it. You'll just have the same exact accident again in a 20 school zone.
Comment by fwip 6 hours ago
If the car is going slower than the speed limit in this scenario, it is difficult to tell over the internet if that speed was appropriate. If the car is going over the speed limit, it is obviously inappropriate.
Comment by CaliforniaKarl 9 hours ago
Comment by Bukhmanizer 11 hours ago
Comment by jayd16 11 hours ago
Comment by pengaru 9 hours ago
The main saving grace is they all occurred at low enough speeds that the consequences were little more than frustrating/delaying for everyone present - pedestrians and drivers alike, as nobody knew what to expect next.
They are very far from perfect drivers. And what's especially problematic is the nature of their mistakes seem totally bizarre vs. the kinds of mistakes human drivers make.
Comment by simojo 11 hours ago
This will continue to be the debate—whether human performance would have exceeded that of the autonomous system.
Comment by energy123 11 hours ago
Comment by Dlanv 9 hours ago
About 5x more kinetic energy.
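(Presumably this compares the ~14 mph modeled human contact speed with the ~6 mph actual one: kinetic energy scales with the square of speed, so the ratio is roughly (14/6)^2 ≈ 5.4, i.e. about 5x.)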
Comment by margalabargala 7 hours ago
So if we're going to have cars drive irresponsibly fast near schools, it's better that they be piloted by robots.
But there may be a better solution...
Comment by samrus 8 hours ago
Comment by JumpCrisscross 8 hours ago
In my experience in California, always and yes.
Comment by margalabargala 7 hours ago
Comment by samrus 7 hours ago
But that depends on reliability, especially in unforeseen (and untrained-upon) circumstances. We'll have to see how they do, but they have been doing better than expected.
Comment by cucumber3732842 8 hours ago
Comment by cucumber3732842 7 hours ago
Jumping out of a plane wearing a parachute vs jumping off a building without one.
But acceleration is hard to calculate without knowing time or distance (assuming it's even linear), and you don't get that exponent over velocity yielding a big number that's great for heartstring-grabbing and appealing to emotion, hence why nobody ever uses it.
Comment by Veserv 7 hours ago
US human drivers average ~3.3 trillion miles per year [1]. US human drivers cause ~7,000 child pedestrian injuries per year [2]. That amounts to an average of 1 child pedestrian injury per ~470 million miles. Waymo has done ~100-200 million fully autonomous miles [3][4]. That means they average 1 child pedestrian injury per ~100-200 million miles. That is an injury rate ~2-4x higher than the human average.
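A quick sketch of that arithmetic in Python, using only the figures cited above:

    human_miles_per_year = 3.3e12          # US human vehicle-miles per year [1]
    child_injuries_per_year = 7_000        # US child pedestrian injuries per year [2]
    miles_per_injury = human_miles_per_year / child_injuries_per_year
    print(f"human average: 1 per {miles_per_injury/1e6:.0f}M miles")   # ~1 per 470M miles

    for waymo_miles in (100e6, 200e6):     # Waymo's fully autonomous miles [3][4]
        ratio = miles_per_injury / waymo_miles   # implied rate multiple, given 1 injury so far
        print(f"{waymo_miles/1e6:.0f}M miles -> ~{ratio:.1f}x the human per-mile rate")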
However, the child pedestrian injury rate is only an official estimate (possible undercounting relative to highly scrutinized Waymo miles) and is a whole-US average (the operational domain might not be comparable, though this could easily swing either way), but absent more precise and better information, we should default to the calculated 2-4x higher injury rate; it is up to Waymo to robustly demonstrate otherwise.
Furthermore, Waymo has published reasonably robust claims arguing they achieve ~90% crash reduction [5] in total. The most likely new hypotheses in light of this crash are:
A. Their systems are not actually robustly 10x better than human drivers. Waymo's claims are incorrect or non-comparable.
B. There are child-specific risk factors that humans account for that Waymo does not that cause a 20-40x differential risk around children relative to normal Waymo driving.
C. This is a fluke child pedestrian injury. Time will tell. Given their relatively robustly claimed 90% crash reduction, it is likely prudent to allow further operation in general, though possibly not in certain contexts.
[1] https://afdc.energy.gov/data/10315
[2] https://crashstats.nhtsa.dot.gov/Api/Public/Publication/8137...
[3] https://www.therobotreport.com/waymo-reaches-100m-fully-auto...
[4] https://waymo.com/blog/2025/12/demonstrably-safe-ai-for-auto...
Comment by nearbuy 4 hours ago
1. The NHTSA data is based on police-reported crash data, which reports far fewer injuries than the CDC reports based on ED visits. The child in this case appeared mostly unharmed and situations like this would likely not be counted in the NHTSA data.
2. Waymo taxis operate primarily in densely populated urban environments, while human driver mileage includes highways and rural roads where you're much less likely to collide with pedestrians per mile driven.
Waymo's 90% crash reduction claim is at least an apples-to-apples comparison.
Comment by shawabawa3 4 hours ago
If this incident had happened with a human driven vehicle would it even have been reported?
I don't know exactly what a 6 mph collision looks like, but I think it's likely the child had nothing more than some bruises, and if a human had done it they would have just said sorry, made sure they were OK, and left.
Comment by ufmace 4 hours ago
Comment by moktonar 5 hours ago
Comment by ra7 4 hours ago
See past examples:
https://youtube.com/watch?v=hubWIuuz-e4 — first save is a child emerging from a parked car. Notice how Waymo slows down preemptively before the child starts moving.
https://www.reddit.com/r/waymo/s/ivQPuExwNW — detects foot movement from under the bus.
https://www.reddit.com/r/waymo/s/LURJ8isQJ6 — stops for dogs and children running onto the street at night.
Comment by null_deref 5 hours ago
Comment by andsoitis 53 minutes ago
> The Waymo Driver braked hard...
By Waymo Driver, they don't mean a human, do they?
Comment by opinion-is-bad 27 minutes ago
Comment by WarmWash 11 hours ago
Waymo hits a kid? Ban the tech immediately, obviously it needs more work.
Waymo hits a kid? Well if it was a human driver the kid might well have been dead rather than bruised.
Comment by Filligree 11 hours ago
> Waymo hits a kid? Well if it was a human driver the kid might well have been dead rather than bruised.
These can be true at the same time. Waymo is held to a significantly higher standard than human drivers.
Comment by micromacrofoot 11 hours ago
They have to be, as a machine cannot be held accountable for a decision.
Comment by pjscott 6 hours ago
Failure to acknowledge the existence of tradeoffs tends to lead to people making really lousy trades, in the same way that running around with your eyes closed tends to result in running into walls and tripping over unseen furniture.
Comment by dragonwriter 8 hours ago
Comment by TeMPOraL 10 hours ago
Comment by myrmidon 9 hours ago
Comment by micromacrofoot 9 hours ago
Comment by TeMPOraL 6 hours ago
In fact, if you substitute "company providing self-driving solution (integrated software + hardware)" for "company renting out commercial drivers" (or machine operators), then self-driving cars already fit well into existing legal framework. The way I see it, the only change self-driving cars introduce here is that there is no individual operator we could blame for the accident, no specific human we could heavily fine or jail, and then feel good about ourselves because we've issued retributive justice and everything is whole now. Everything else has already long been worked out.
Comment by JumpCrisscross 8 hours ago
This logic applies equally to all cars, which are machines. Waymo has its decision makers one more step removed than human drivers. But it’s not a good axiom to base any theory of liability on.
Comment by ycui1986 55 minutes ago
Comment by aucisson_masque 4 hours ago
The issue is that I don't trust a private company's word. You can't even trust the president of the USA nowadays… release the video footage or get lost.
Comment by Dlanv 9 hours ago
Had any other car been there, probably including a Tesla, the poor kid would have been hit with 4-10x more force.
Comment by Petersipoi 8 hours ago
Comment by namuol 2 hours ago
Comment by NoGravitas 9 hours ago
Comment by elzbardico 3 hours ago
Comment by pmontra 9 hours ago
Comment by ssl-3 8 hours ago
In that particular instance, I was cited myself -- after the fact, at the hospital -- and eventually went before a judge. In that hearing, it was established that I was guilty of failing to yield at an intersection.
(That was a rather long time ago and I don't remember the nature of the punishment that resulted. It may have been as little as a stern talking-to by the judge.)
Comment by hiddencost 9 hours ago
Waymo is liable in a civil sense and pays whatever monetary amount is negotiated or awarded.
For a criminal case, some kind of willful negligence would have to be shown. That can pierce corporate veils. But as a result Waymo is being extremely careful to follow the law and establish processes which shield their employees from negligence claims.
Comment by trollbridge 8 hours ago
Comment by jeffbee 9 hours ago
Comment by Archio 8 hours ago
The argument that questions "would a human be driving 17mph in a school zone" feels absurd to the point of being potentially disingenuous. I've walked and driven through many school zones before, and human drivers routinely drive above 17mph (in some cases, over the typical 20mph or 25mph legal limit). It feels like in deconstructing some of these incidents, critics imagine a hypothetical scenario in which they are driving a car and it's their only job to avoid a specific accident that they know will happen in advance, rather than facing the reality of what human drivers are actually like on the road.
Comment by wackget 2 hours ago
Comment by voxadam 44 minutes ago
Comment by gooseyard 5 hours ago
Comment by RomanPushkin 4 hours ago
> In October 2025, a Waymo autonomous robotaxi struck and killed KitKat, a well-known bodega cat at Randa's Market in San Francisco's Mission District, sparking debates over self-driving car safety
It's a child now. All I wanna ask: what should happen so they stop killing pets and people?
Comment by GoatInGrey 4 hours ago
Comment by NewJazz 4 hours ago
Comment by fragmede 3 hours ago
Comment by cryptoegorophy 3 hours ago
Comment by seanmcdirmid 3 hours ago
I can't tell if you are using sarcasm here or are serious. I guess it depends on your definition of normal person (obviously not average, but an idealized driver maybe?).
Comment by anon115 2 hours ago
Comment by koolba 8 hours ago
As this is based on detection of the child, what happens on Halloween when kids are all over the place and do not necessarily look like kids?
Comment by sweezyjeezy 8 hours ago
Comment by tintor 8 hours ago
The object could be a paper bag flying in the wind, or leaves falling from the tree.
Comment by sowbug 8 hours ago
Comment by Rudybega 8 hours ago
Comment by sowbug 7 hours ago
Autonomous vehicles won't be perfect. They'll surely make different mistakes from the ones humans currently make. People will die who wouldn't have died at the hands of human drivers. But the overall number of mistakes will be smaller.
Suppose you could wave your magic wand and have a The Purge-style situation where AVs had a perfect safety record 364 days of the year, but for some reason had a tricky bug that caused them to run over tiny Spidermen and princesses on Halloween. The number of fatalities in the US would drop from 40,000 annually to 40. Would you wave that wand?
Comment by rullelito 8 hours ago
Comment by Rudybega 8 hours ago
I suspect the cars are trying to avoid running into anything, as that's generally considered bad.
Comment by metalman 2 hours ago
Comment by mrcwinn 2 hours ago
"Our car hits better" is a win, I guess?
Glad the child is okay.
Comment by IAmBroom 6 hours ago
If Waymo has fewer accidents where a pedestrian is hit than humans do, Waymo is safer. Period.
A lot of people are conjecturing how safe a human is in certain complicated scenarios (pedestrian emerging from behind a bus, driver holds cup of coffee, the sun is in their eyes, blah blah blah). These scenarios are distractions from the actual facts.
Is Waymo statistically safer? (spoiler: yes)
Comment by gjm11 5 hours ago
Imagine that there are only 10 Waymo journeys per year, and every year one of them hits a child near an elementary school, while there are 1000000 non-Waymo journeys per year, and every year two of them hit children near elementary schools. In this scenario Waymo has half as many accidents but is clearly much more dangerous.
Here in the real world, obviously the figures aren't anywhere near so extreme, but it's still the case that the great majority of cars on the road are not Waymos, so after counting how many human drivers have had similar accidents you need to scale that figure in proportion to the ratio of human to Waymo car-miles.
(Also, you need to consider the severity of the accidents. That comparison probably favours Waymo; at any rate, they're arguing that it does in this case, that a human driver in the same situation would have hit the child at a much higher and hence more damaging speed.)
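(The hypothetical above, as trivial arithmetic in Python:)

    waymo_journeys, waymo_hits = 10, 1
    other_journeys, other_hits = 1_000_000, 2
    print(waymo_hits / waymo_journeys)    # 0.1 incidents per journey
    print(other_hits / other_journeys)    # 0.000002 incidents per journey, i.e. 50,000x lower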
Comment by confidantlake 1 hour ago
Comment by jeffrallen 4 hours ago
Comment by fortran77 8 hours ago
> The young pedestrian “suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle’s path,” the company said in its blog post. Waymo said its vehicle “immediately detected the individual as soon as they began to emerge from behind the stopped vehicle.”
BUT! As a human driver, I avoid driving near the schools when school's letting out. There's a high school on my way home and kids saunter and jaywalk across the street, and they're all 'too cool' to press the button that turns on the blinking crosswalk. So I go a block out of my way to bypass the whole school area when I'm heading home that way.
Waymos should use the same rationale. If you can avoid going past a school zone when kids are likely to be there, do it!
Comment by chasd00 6 hours ago
I can see that, prioritize obstacle predictability over transit time. A school zone at certain times of day is very unpredictable with respect to obstacles but a more car congested area would be easier to navigate but slower. Same goes for residential areas during Halloween.
Comment by trollbridge 8 hours ago
Comment by insane_dreamer 7 hours ago
But for a human driver with FSD on, are they liable if FSD fails? My understanding is yes, they are. Tesla doesn't want that liability. And to me this helps explain why FSD adoption is difficult. I don't want to hand control over to a probabilistic system that might fail but leave me at fault. In other words, I trust my own driving more than FSD (I could be right or wrong, but I think most people will feel the same way).
Comment by 0xffff2 7 hours ago
Comment by bpodgursky 11 hours ago
Comment by toast0 9 hours ago
> As far as fatalities were concerned, pedestrians struck at 20 mph had only a 1% chance of dying from their injuries
Certainly, being struck at 6 mph rather than 17 mph is likely to result in a much better outcome for the pedestrian, and that should not be minimized; although it is valuable to consider the situation (when we have sufficient information) and validate Waymo's suggestion that the average human driver would also have struck the pedestrian, and at greater speed. That may or may not be accurate, given the context of a busy school dropoff situation... many human drivers are extra cautious in that context and may not have reached that speed; depending on the end-to-end route, some human drivers would have avoided the street with the school altogether based on the time, etc. It certainly seems like a good result for the premise (child unexpectedly appears from between large parked vehicles), but maybe it should have been expected.
[1] https://www.iihs.org/news/detail/vehicle-height-compounds-da...
Comment by xnx 8 hours ago
Comment by globular-toast 5 hours ago
Comment by toast0 4 hours ago
Comment by globular-toast 4 hours ago
But really, did you seriously read my post as meaning people literally can't go slower than 20 so just plough into whatever is in the way? I'm obviously talking about an open road situation. Hardly any human drivers go under 20 by choice.
Comment by toast0 1 hour ago
Comment by thatswrong0 8 hours ago
A child is probably more likely than an adult to die in a collision at the same speed.
Comment by gortok 11 hours ago
Comment by jobs_throwaway 11 hours ago
Comment by entuno 11 hours ago
Comment by bpodgursky 11 hours ago
Comment by rationalist 11 hours ago
Comment by bpodgursky 11 hours ago
Comment by rationalist 11 hours ago
That still doesn't excuse trying to make them look bad.
Comment by bpodgursky 10 hours ago
Comment by emptybits 9 hours ago
John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, effective financial feedback loop. Real skin in the game.
NOW ... add criminal fault due to driving decision or state of vehicle ... John goes to jail. Waymo? Still making money in the large. I'd like to see more skin in their game.
Comment by seanmcdirmid 9 hours ago
John probably (at least where I live) does not have insurance; maybe I could sue him, but he has no assets to speak of (especially if he is living out of his car), so I'm just going to pay a bunch of legal fees for nothing. He doesn't care, because he has no skin in the game. The state doesn't care; they aren't going to throw him in jail or even take away his license (if he has one), and they aren't even going to impound his car.
Honestly, I'd much rather be hit by a Waymo than John.
Comment by xnx 8 hours ago
Comment by emptybits 8 hours ago
If you are hit by an underinsured driver, the government steps in, and additional underinsured-motorist protection (e.g. for being hit by an out-of-province/country motorist) is available to all and is not expensive.
Jail time for an at-fault driver here is very uncommon but can be applied if serious injury or death results from a driver's conduct. This is quite conceivable with humans or AI, IMO. Who will face jail time as a human driver would in the same scenario?
Hit and run, leaving the scene, is also a criminal offence with potential jail time that a human motorist faces. You would hope this is unlikely with AI, but if it happens a small percentage of the time, who at Waymo faces jail as a human driver would?
I'm talking about edge cases here, not the usual fender bender. But this thread was about policy/regs and that needs to consider crazy edge cases before there are tens of millions of AI drivers on the road.
Comment by seanmcdirmid 8 hours ago
Waymo has deep pockets, so everyone is going to try and sue them, even if they don't have a legitimate grievance. Where I live, the city/state would totally milk each incident from a BigCo for all it was worth. "Hit and run" by a drunk waymo? The state is just salivating thinking about the possibility.
I don't agree with you that BigCorp doesn't have any skin in the game. They are basically playing the game in a bikini.
Comment by Sohcahtoa82 2 hours ago
You do know that insurance being mandatory doesn't stop people from driving without insurance, right?
> If you are hit by an underinsured driver, the government steps in and additional underinsured motorist protection (e.g. hit by an out of province/country motorist) is available to all and not expensive.
Jolly good for you.
If I don't carry underinsured coverage, and someone totals my car or injures me with theirs, I'm basically fucked.
Comment by asystole 8 hours ago
Ah great, so there's a lower chance of that specific John Smith hitting me again in the future!
Comment by emptybits 8 hours ago
The general deterrence effect we observe in society is that punishment of one person has an effect on others who observe it, making them more cautious and less likely to offend.
Comment by boothby 9 hours ago
Comment by direwolf20 9 hours ago
Comment by boothby 7 hours ago
Comment by direwolf20 5 hours ago
Comment by Aloisius 4 hours ago
Comment by direwolf20 2 hours ago
Comment by NoGravitas 9 hours ago
Comment by frankharv 11 hours ago
Most humans would be halfway into the other lane after seeing kids near the street.
Apologists see something different than I do.
Perception.
Comment by axus 11 hours ago
Better reporting would have asked real people the name of the elementary school, so we could see some pictures of the area. The link to NHTSA didn't point to the investigation, but it's under https://www.nhtsa.gov/search-safety-issues
"NHTSA is aware that the incident occurred within two blocks of a Santa Monica, CA elementary school during normal school drop off hours; that there were other children, a crossing guard, and several double-parked vehicles in the vicinity; and that the child ran across the street from behind a double parked SUV towards the school and was struck by the Waymo AV. Waymo reported that the child sustained minor injuries."
Comment by AnotherGoodName 10 hours ago
We're impatient, emotional creatures. Sometimes when I'm on a bike, the bike lane merges onto the road for a stretch and there's no choice but to take up a lane. I've had people accelerate behind me and screech their tyres, stopping just short of my back wheel in a threatening manner, then do it again repeatedly as I rode the short distance in the lane before the bike lane re-opened.
To say "human drivers would notice they are near an elementary school" completely disregards the fuckwits that are out there on the road today. It disregards human nature. We've all seen people do shit like I describe above. It also disregards that every time I see an automated taxi it seems to drive on the cautious side already.
Give me the unemotional, infinitely patient, cautious-by-default automated taxi over humans any day.
Comment by qwertyuiop_ 6 hours ago
> “Following contact, the pedestrian stood up immediately, walked to the sidewalk, and we called 911. The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene,” Waymo wrote in the post.
Comment by xnx 9 hours ago
Comment by recursive 8 hours ago
Saved child from what? From themselves. You can't take full credit for partially solving a problem that you, yourself, created.
Comment by tekno45 8 hours ago
Big vehicles that demand respect and aren't expected to turn on a dime, known stops.
Comment by henning 11 hours ago
A: It thought it saw a child on the other side.
Comment by direwolf20 11 hours ago
Comment by whynotminot 11 hours ago
Any accident is bad. But accidents involving children are especially bad.
Comment by dylan604 10 hours ago
Comment by whynotminot 10 hours ago
But I know when I drive, if it’s a route I’m familiar with, I’ll personally avoid school zones for this very reason: higher risk of catastrophe. But also it’s annoying to have to slow down so much.
Maybe this personal decision doesn’t really scale to all situations, but I’m surprised Waymo doesn’t attempt this. (Maybe they do and in this specific scenario it just wasn’t feasible)
Comment by dylan604 10 hours ago
Comment by whynotminot 10 hours ago
You also have to drive much more slowly in a school zone than you do on other routes, so depending on the detour, it may not even be that much longer of a drive.
At worst, maybe Waymo eats the cost difference involved in choosing a more expensive route. That hits the bottom line, but there’s also a business and reputational cost to “child hit by Waymo in school zone” in the headlines.
Again, this all seems very solvable.
Comment by trollbridge 8 hours ago
Comment by ripped_britches 9 hours ago
Comment by alkonaut 11 hours ago
I'm willing to accept robotaxis, and accidents in robotaxis, but there need to be some solid figures showing they are way _way_ safer than human drivers.
Comment by jillesvangurp 10 hours ago
As for more data, there is a chicken-and-egg problem. A phased rollout of Waymo over several years has revealed many potential issues, but it is also remarkable for how few incidents involving fatalities there have been. The benefit of a gradual approach is that it builds confidence over time.
Tesla has some ways to go here. Though arguably, with many hundreds of thousands of paying users, if it were really unsafe, there would be some numbers on that. The US baseline is roughly 17 deaths per 100K drivers per year, 40K+ fatalities overall. FSD, for all its faults and failings, isn't killing dozens of people per year. Nor is Waymo. It's a bit of an apples-and-oranges comparison, of course. But the bar for safety is pretty low as soon as you include human drivers.
Liability weighs higher for companies than safety. It's fine to them if people die, as long as they aren't liable. That's why the status quo is tolerated. Normalized for miles driven with and without autonomy, there's very little doubt that autonomous driving is already much safer. We can get more data, at the price of more deaths, by simply dragging out the testing phase.
Perfect is the enemy of good here. We can wait another few years (times ~40K deaths), or we can allow the technology to start lowering the number of traffic deaths now. Every year we wait means more deaths. Waiting here literally costs lives.
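A quick back-of-envelope check on that baseline (just a sketch; the ~230 million licensed-driver count is my assumption, not a figure from this thread):

    # Sanity check: does ~17 deaths per 100K drivers line up with "40K+ fatalities overall"?
    # Assumption: roughly 230 million licensed drivers in the US.
    licensed_drivers = 230_000_000
    deaths_per_100k = 17
    annual_deaths = licensed_drivers / 100_000 * deaths_per_100k
    print(f"{annual_deaths:,.0f}")  # ~39,100 per year, consistent with the 40K+ figure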
Comment by alkonaut 10 hours ago
I also think one needs to remember those are _abysmal_ numbers, so while the current discourse is US-centric (because that's where the companies and their testing are), I don't think it's representative of the risks of driving in general. Naturally, robotaxis will benefit from better infrastructure outside the US (e.g. better separation of pedestrians), but they'll also have to clear a higher safety bar, e.g. fewer drunk drivers.
Comment by jillesvangurp 3 hours ago
Comment by trillic 9 hours ago
Comment by jerlam 8 hours ago
I know Tesla FSD is its own thing, but crowdsourced results show that FSD updates often increase the number of disengagements (errors):
https://electrek.co/2025/03/23/tesla-full-self-driving-stagn...
Comment by sowbug 8 hours ago
Comment by jerlam 6 hours ago
Comment by trollbridge 8 hours ago
Comment by jonas21 11 hours ago
Do you mean like this?
Comment by alkonaut 11 hours ago
Comment by xnx 8 hours ago
Comment by trollbridge 8 hours ago
Comment by xnx 7 hours ago
Comment by Archio 8 hours ago
It seems very strange to defend a system that is drastically less safe because when an accident happens, at least a human will be "liable". Does a human suffering consequences (paying a fine? losing their license? going to jail?) make an injury/death more acceptable, if it wouldn't have happened with a Waymo driver in the first place?
Comment by trollbridge 8 hours ago
In fact, I could see Google working on a highly complex algorithm to figure out cost savings from reducing safety and balancing that against the cost of spending more on marketing and lobbyists. We will have zero leverage to do anything if Waymo gradually becomes more and more dangerous.
Comment by fragmede 3 hours ago
> You take the population of vehicles in the field (A) and multiply it by the probable rate of failure (B), then multiply the result by the average cost of an out-of-court settlement (C). A times B times C equals X. This is what it will cost if we don't initiate a recall. If X is greater than the cost of a recall, we recall the cars and no one gets hurt. If X is less than the cost of a recall, then we don't recall.
-Chuck Palahniuk, Fight Club
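The quoted formula as a minimal sketch (all numbers below are hypothetical placeholders, not anything from Waymo or this thread):

    # Fight Club recall math: X = A * B * C, compared against the cost of a recall.
    A = 500_000          # vehicles in the field (hypothetical)
    B = 0.0001           # probable rate of failure (hypothetical)
    C = 2_000_000        # average out-of-court settlement in dollars (hypothetical)
    recall_cost = 150_000_000

    X = A * B * C        # expected settlement exposure if nothing is done
    print(f"X = ${X:,.0f}; recall: {X > recall_cost}")  # X = $100,000,000; recall: False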
Comment by sowbug 8 hours ago
Comment by alkonaut 4 hours ago
Comment by alkonaut 4 hours ago
Comment by WarmWash 11 hours ago
I don't think we will ever see the video, as any contact is overall viewed negatively by the general public, but for non-hyperbolic types it would probably be pretty impressive.
Comment by recursive 9 hours ago
Comment by crazygringo 9 hours ago
Comment by mhast 8 hours ago
If that's too annoying, then ban parking near school areas so the situation doesn't happen.
Comment by crazygringo 8 hours ago
And why would you make Waymos go slower than human drivers, when it's the human drivers who have worse reaction times? I had interpreted the suggestion as applying to all drivers.
Comment by alkonaut 11 hours ago
Comment by maerF0x0 10 hours ago
Nuanced disagree (I agree with your physics), in that an element of the issue is design. Kids run out between cars on streets that stack building --> yard --> sidewalk --> parked cars --> driving cars.
One simple change could be adding a chain-link fence / boundary between parked cars and driving cars, increasing visibility and reaction time.
Comment by toast0 9 hours ago
Comment by maerF0x0 9 hours ago
Also the point isn't the specifics, the point is that the current design is not optimal, it's just the incumbent.
Comment by toast0 8 hours ago
We would really need to see the site to have an idea of the constraints; Santa Monica has some places where additional roadway can be accommodated and some places where that's not really an option.
Comment by criddell 11 hours ago
Comment by alkonaut 10 hours ago
Comment by criddell 9 hours ago
Comment by alkonaut 4 hours ago
Comment by crazygringo 9 hours ago
Comment by zamadatix 8 hours ago
Comment by simianwords 6 hours ago
So would you pick situation 1 or 2?
I would personally pick 1.
Comment by renewiltord 9 hours ago
That is not my experience here in the Bay Area. In fact, here is a pretty typical recent example: https://www.nbcbayarea.com/news/local/community-members-mour...
The driver cuts in front of a person on an e-bike so fast that the rider can't react and hits the car. Then, after being hit, the driver steps on the accelerator and goes over the sidewalk on the other side of the road, killing a 4-year-old. No charges filed.
This driver will be back on the street right away.
Comment by xnx 8 hours ago
Comment by jtrueb 11 hours ago
Comment by JumpCrisscross 8 hours ago
It’s already accepted. It’s already here. And Waymo is the safest in the set—we’re accepting objectively less-safe systems, too.
Comment by xnx 8 hours ago
All data indicates that Waymo is ~10x safer so far.
"90% Fewer serious injury or worse crashes"
Comment by lokar 11 hours ago
But, human drivers often face very little accountability. Even drunk and reckless drivers are often let off with a slap on the wrist. Even killing someone results in minimal consequences.
There is a very strong bias here. Everyone has to drive (in most of America), and people tend to see themselves in the driver. Revoking a license often means someone can’t get to work.
Comment by cameldrv 11 hours ago
Comment by joshribakoff 11 hours ago
How do you remain stopped but also move to the side of the road? That's a contradiction. Just like Cruise.
Comment by callumgare 11 hours ago
Comment by lokar 11 hours ago
I also assume a human took over (called the police, moved the car, etc) once it hit the kid.
Comment by BugsJustFindMe 11 hours ago
Comment by 1vuio0pswjnm7 1 hour ago
It was formerly known as the Google Self Driving Car Project
Comment by jsrozner 8 hours ago
If you drive a car, you have a responsibility to do it safely. The fact that I am usually better than the bottom 50% of drivers, or that I am better than a drunk driver does not mean that when I hit someone it's less bad. A car is a giant weapon. If you drive the weapon, you need to do it safely. Most people these days are incredibly inconsiderate - probably because there's little economic value in being considerate. The fact that lots of drivers suck doesn't mean that waymo gets a pass.
Waymos have definitely become more aggressive as they've been successful. They drive the speed limit down my local street. I see them and I think, wtf, that's too fast. It's one thing when there are no cars around, but if you've got cars or people around, the appropriate speed changes. Let's audit Waymo. They certainly have an aggressiveness setting. Let's see the data on how it's changing. Let's see how safety buffers have decreased as they've changed the aggressiveness setting.
The real solution? Get rid of cars. Self-driving individually owned vehicles were always the wrong solution. Public transit and shared infra is always the right choice.