An officer was eventually able to manually drive the Waymo vehicle out of the way, but it took a few minutes. Officials were careful to note the delay didn't significantly hinder emergency response. But that's almost beside the point. Three days later, at a public meeting at Austin City Hall, police, fire, and EMS officials spent an hour telling council members exactly how many ways autonomous vehicles — led by Waymo — were making their jobs harder, more dangerous, and more unpredictable. KUT Radio
"I believe the technology was deployed too quickly in too vast amounts, with hundreds of vehicles, when it wasn't really ready," one police official told federal regulators last month. Wilson's Media
That quote should keep Waymo's executive team up at night. Not because it's new — variations of this complaint have been lodged against Waymo in San Francisco, Phoenix, and Los Angeles for years — but because the incidents keep coming despite every software update, every apology, every promise to do better.
What's Actually Happening in Austin
Austin has become Waymo's most contentious operating environment, and the data makes clear why. A KXAN investigation found Waymo vehicles failed to stop for Austin Independent School District buses more than 24 times from August 2025 to March 2026. The National Transportation Safety Board opened an investigation in January after the outlet reported on it. A preliminary NTSB report subsequently revealed something deeply uncomfortable: in one incident, a Waymo asked a remote assistance worker whether a nearby school bus was loading or unloading students — with the stop sign deployed and lights flashing. The worker said "no." The Waymo drove past. Six other human drivers did too, for what it's worth, but the Waymo's failure had a specific cause: a human-in-the-loop who made the wrong call in a remote call centre. KXAN
In at least six instances identified by TechCrunch, first responders have had to take control of Waymo vehicles and move them out of traffic during emergency situations, including one in which an officer was in the middle of responding to a mass shooting. That's not software glitching in a controlled test. Those are real officers, in real emergencies, physically climbing into a robotaxi while someone else is dying nearby. TechCrunch
The Austin Police Department logged at least 236 complaints about driverless vehicles with the city between July 2023 and mid-March 2026, including Waymos blocking traffic, creating safety risks, and ignoring APD directions more than 170 times. Despite hundreds of complaints, police issued fewer than a dozen tickets in nearly three years, largely because the process for citing autonomous vehicles is cumbersome and time-consuming. An officer has to fill out paperwork designed for human drivers, then work out exactly who to cite for a machine that has no driver's licence. KXAN
The incident record tells its own story:
| Incident | Date | Location | Outcome |
|---|---|---|---|
| 1,500 Waymos stall during power outage | Dec. 20, 2025 | San Francisco, CA | 911 dispatcher on hold 53 min |
| 24+ illegal school bus passes | Aug. 2025–Mar. 2026 | Austin, TX | NHTSA + NTSB investigations |
| Child struck outside elementary school | Jan. 2026 | Santa Monica, CA | Federal investigation opened |
| Ambulance blocked at mass shooting | Mar. 1, 2026 | Austin, TX | 3 dead, Waymo manually moved |
| Waymo blocks first responders at shooting | Mar. 2026 | Austin, TX | City Council emergency hearing |
The San Francisco Precedent
Before Austin became the focal point, San Francisco was Waymo's most visible headache — and the December 20, 2025 power outage remains the most instructive case study in what fleet-scale failure actually looks like.
As a fire at a PG&E substation plunged a third of San Francisco into darkness, one of the city's 911 dispatchers sat on hold with Waymo's first responder hotline for 53 minutes. The company's systems had been overwhelmed with requests from more than 1,500 confused robotaxis trying to navigate intersections without functioning traffic signals; many of the vehicles sat frozen and inoperable for two minutes or more. KQED
Two minutes of paralysis in 1,500 vehicles spread across a city during a blackout isn't a software bug. It's a systems design failure at scale. When a city's 911 dispatcher has to sit on hold with a private company's hotline to get a stalled robotaxi out of an intersection, something has gone badly wrong with the basic premise of how the product operates.
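The shape of that failure is easy to reproduce with a back-of-the-envelope queueing sketch. Everything in the snippet below is a hypothetical assumption: the operator count, the average handle time, and the share of the 1,500 stalled vehicles that actually generated hotline calls are invented for illustration, not reported figures. The only point it makes is that a hotline staffed for uncorrelated, one-at-a-time failures collapses when failures arrive all at once.

```python
# Back-of-the-envelope model of a support hotline hit by a correlated burst.
# All staffing and handle-time figures are hypothetical; only the rough
# scale of the burst (hundreds of simultaneous stalls) echoes the reporting.

def worst_case_hold_minutes(burst_calls: int, operators: int, handle_min: float) -> float:
    """Minutes the last caller in a FIFO queue waits to reach an operator."""
    # Each operator works through about burst_calls / operators calls in
    # sequence; the last caller waits behind every call ahead of it.
    calls_ahead = burst_calls / operators - 1
    return max(calls_ahead, 0) * handle_min

# Hypothetical staffing: 20 concurrent operators, 4 minutes per call.
print(worst_case_hold_minutes(burst_calls=300, operators=20, handle_min=4.0))  # 56.0
print(worst_case_hold_minutes(burst_calls=20, operators=20, handle_min=4.0))   # 0.0
```

Under those invented numbers, a burst fifteen times the hotline's concurrent capacity produces holds in the same ballpark as the 53 minutes KQED reported, while the same staffing absorbs a steady trickle with no queue at all.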
At a tense San Francisco Board of Supervisors hearing on March 2, 2026 — notably, one day after the Austin mass shooting incident — Waymo apologised for the power outage chaos and outlined remediation steps. Sam Cooper, the program manager for incident response at Waymo, said: "I want to be very clear that Waymo takes full responsibility for the communication gaps that occurred that evening." Supervisors, unimpressed, pressed on why first responders were having to move stalled Waymos out of intersections at all. KQED
The Counterintuitive Truth About Scale
Here's what every founder building an autonomous system needs to understand, and what the AV industry has systematically undersold: more vehicles don't necessarily mean a safer product. They mean more edge cases.
Waymo's core safety argument — that its vehicles have twelve times fewer injury crashes involving pedestrians than human drivers — is probably true in aggregate, and it should be taken seriously. But aggregate statistics obscure the nature of the incidents that are actually occurring. The fleet learns as it scales, and Waymo can issue software patches that, in theory, prevent the same error from occurring twice. But every new city, every new traffic pattern, every new emergency scenario generates new errors that haven't been patched yet. As the fleet grows from hundreds to thousands to tens of thousands of vehicles, the raw number of novel incidents grows with it, even if the per-vehicle rate is improving. TechCrunch
That's the uncomfortable arithmetic of AV scaling. The industry has framed it as a virtuous cycle: more miles, more data, better models, fewer incidents. The reality first responders are living is different: more vehicles, more incidents in absolute terms, more strain on systems designed for human-driver behaviour.
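A toy calculation makes that arithmetic concrete. Every number below is invented for illustration; none are Waymo figures. The sketch only shows that a per-vehicle incident rate can fall steadily while the absolute incident count keeps climbing, so long as the fleet grows faster than the rate improves.

```python
# Hypothetical fleet sizes and per-vehicle incident rates. The rate improves
# at every step, yet absolute incidents rise because the fleet grows faster.
fleet_sizes = [500, 1_500, 5_000, 15_000]   # vehicles (invented)
incident_rates = [40, 30, 22, 16]           # incidents per 1,000 vehicles per year (invented)

for vehicles, rate in zip(fleet_sizes, incident_rates):
    absolute = vehicles * rate / 1_000
    print(f"{vehicles:>6,} vehicles at {rate}/1,000 -> {absolute:>5.0f} incidents/yr")

# Output:
#    500 vehicles at 40/1,000 ->    20 incidents/yr
#  1,500 vehicles at 30/1,000 ->    45 incidents/yr
#  5,000 vehicles at 22/1,000 ->   110 incidents/yr
# 15,000 vehicles at 16/1,000 ->   240 incidents/yr
```

In this made-up series, a 2.5x improvement in the rate is swamped by a 30x growth in the fleet. Regulators and first responders experience the right-hand column, not the improving ratio.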
"Our first responders should not be AAA." — District Supervisor Alan Wong, San Francisco, March 2, 2026
That line landed hard at the San Francisco hearing, and it's the cleanest articulation of the structural problem. Waymo has built a product that, in edge cases, needs public safety infrastructure to bail it out. Police officers are manually driving stalled robotaxis. Fire departments are coordinating with Waymo's remote call centres. EMS is navigating around frozen vehicles in active emergency zones. None of this was in anyone's deployment plan.
The Global Regulatory Dimension
Austin and San Francisco are the current flashpoints, but the regulatory and reputational questions land differently in other markets that Waymo and its competitors are watching closely. Waymo vehicles appeared on London streets for the first time in March 2026, a sign that the company has European ambitions even as its US operating environment grows more fraught.
The UK's approach to AV regulation, codified in the Automated Vehicles Act 2024, puts formal liability on the "authorised self-driving entity" — the company, not a driver — for incidents during autonomous operation. That framework was designed precisely for scenarios like the ones unfolding in Austin: clear corporate accountability when an AV causes harm. The EU's equivalent framework under the updated Product Liability Directive similarly tightens the liability chain for autonomous systems.
In Australia, where Waymo has no operations but competitors like Zoox and Cruise (before its collapse) have been watched closely, state transport authorities have been building permitting frameworks with first-responder interaction protocols baked in. What's happening in Austin is actively informing those policy discussions. The incidents aren't just a US problem — they're reference cases for every regulator globally trying to decide how much latitude to give AV operators before pulling the plug, as California's DMV did with Cruise in October 2023.
What Founders Should Watch
Three signals worth tracking closely:
Texas Senate Bill 2807's enforcement teeth. Texas will launch a program in May 2026 allowing citizens, first responders, and law enforcement to submit complaints about driverless vehicles directly to the DMV, which will have authority to restrict or revoke an AV company's ability to operate on Texas roads if it finds an imminent danger to the public. If APD starts funnelling its 236-plus complaints into this new system, Waymo's Austin operations could face real operational restrictions, not just political pressure. KXAN
Waymo's no-show at the April 29 Austin Public Safety Commission meeting. The company was invited but did not attend, saying via a spokesperson that it had already had "substantive conversations" with city and state leaders. Skipping a public safety meeting and sending a written statement instead is either a calculated legal strategy or a tone-deaf PR misstep, and probably both. Founders should watch whether this posture hardens or softens as the regulatory pressure builds. CBS Austin
The NHTSA investigation's trajectory. The school bus probe hasn't concluded. A child was struck in Santa Monica in January. The federal agency is accumulating a file on Waymo that, if it results in a formal safety defect determination rather than a voluntary recall, could set a precedent for how the entire AV industry handles ongoing incidents at scale.
The Waymo story isn't simple. Its aggregate safety data is genuinely impressive. The engineering problem it's solving is genuinely hard. But the incident pattern emerging across Austin, San Francisco, Atlanta, and Santa Monica suggests something specific: the product was scaled into operational contexts it wasn't fully prepared for, and the public safety infrastructure is paying the cost of that optimism. Until the company can demonstrate that its performance in edge cases — power outages, active shootings, school zones — matches its performance in normal conditions, the trust deficit with the people who respond to emergencies will keep widening. And that trust deficit has a way of becoming regulatory reality faster than any safety statistic can compensate for.