Waymo Is Having a Hard Time Stopping for School Buses
Waymo's robotaxis have racked up at least 24 safety violations involving school buses in Austin since the start of the 2025 school year, and a voluntary software recall the company issued in December after a federal investigation has not fixed the problem.
Austin Independent School District initially reported at least 19 incidents of Waymo vehicles failing to stop for buses during loading and unloading -- illegal in all 50 states -- prompting NHTSA to open a probe. At least four more violations have occurred since the software update, including a January 19th incident where a robotaxi drove past a bus as children waited to cross the street and the stop arm was extended.
Waymo also acknowledged that one of its vehicles struck a child outside a Santa Monica elementary school on January 23rd, causing minor injuries. Austin ISD has asked Waymo to stop operating near schools during bus hours until the issue is resolved. Waymo refused. Three federal investigations have been opened in three months.
3 comments
Re: Kind of weird (Score: 5, Insightful)
by Mr. Dollar Ton ( 5495648 ) on Saturday February 07, 2026 @12:06AM (#65974042)
It is obvious if you understand the concept of driving, instead of mimicking it statistically with some probability.
A simple difference that the "AI" proponents and "investors" can't seem to grasp or acknowledge.
Re: Kind of weird (Score: 5, Insightful)
by Mr. Dollar Ton ( 5495648 ) on Saturday February 07, 2026 @12:36AM (#65974072)
I'm sure someone thought of it. What's obvious from the failures is that model training isn't a substitute for understanding, which the model lacks. So it will always have a nonzero chance to fuck up an obvious situation, which is mostly what we deal with.
Of course you'll have people arguing it's no different with people, on account of the outcomes (people are slower, get tired, etc.), but the fundamental difference is the understanding, and the model doesn't have it.
Hence Agrdaaeelbal instead of America.
Re: Kind of weird (Score: 5, Interesting)
by Mspangler ( 770054 ) on Saturday February 07, 2026 @09:31AM (#65974510)
Computers are deterministic; software based on statistical models is not.
My dissertation, "Data Requirements to Train Neural Network Controllers for Use in Process Industries" (1997), demonstrated that.
The short version is that the mapping through the layers is non-linear, so the failure mode is non-linear as well. In other words, you don't know what it's going to do.
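The non-linearity point can be made concrete with a toy sketch. This is a hypothetical 2-2-1 ReLU network with hand-picked weights (nothing to do with Waymo's actual models or any real controller): a 2% nudge on one input flips the sign of the output, because a hidden unit crosses its activation threshold.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

# Hand-picked weights for a tiny 2-2-1 ReLU network (purely illustrative).
W1 = np.array([[ 1.0, 1.0],
               [50.0, 0.0]])
b1 = np.array([0.0, -25.5])
W2 = np.array([-2.0, 10.0])
b2 = 0.0

def predict(x):
    h = relu(W1 @ x + b1)      # non-linear hidden layer
    return float(W2 @ h + b2)  # sign of output = the "decision"

a = np.array([0.50, 0.50])  # nominal input
b = np.array([0.52, 0.50])  # 2% nudge on one feature

print(predict(a))  # -2.0  (hidden unit 2 is off: 50*0.50 - 25.5 < 0)
print(predict(b))  #  2.96 (hidden unit 2 switches on and dominates)
```

Near such a threshold, the output is not a smooth function of the input: two inputs a human would call identical land on opposite sides of the decision.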
Apparently 25 years of work has failed to solve the problem or we wouldn't be seeing all these hallucinations.