Waymo driverless car in San Francisco. Photo credit: marleyPug / Shutterstock.com
When I started practicing law in Colorado over 25 years ago, the biggest legal battles I saw after a car crash were against insurance companies. That hasn’t changed much. What has changed are the vehicles on our roads. Now, we’re sharing lanes with cars that don’t even have drivers.
Companies like Waymo (a subsidiary of Google’s parent company, Alphabet) have begun rolling out their autonomous vehicles. You may have even spotted one: a sleek white minivan or Jaguar (pictured above) with a spinning sensor on top, quietly gliding by with no one behind the wheel. It’s futuristic and exciting, but a little unnerving.
So, what happens if you get hit by one of these vehicles? Who’s responsible when the “driver” is a computer?
Back in 2017, Colorado passed SB 17-213, which gave the green light for “automated driving systems” (ADS) to operate in our state. In plain English: the law says the car’s system itself can be the legal “driver.” No human required.
That makes Colorado one of the more open states for testing autonomous vehicles. And unlike some other states, we don’t require a human “safety driver” to sit behind the wheel. In fact, Governor Polis recently vetoed a bill that would have forced commercial autonomous trucks to keep a CDL-licensed operator on board, saying it would stifle innovation.
In short: Colorado is welcoming this technology, but that also means we need to think about the risks.
Let’s imagine you’re driving through Capitol Hill, and a Waymo doesn’t stop in time at a light. How would liability shake out?
Here’s something interesting: Colorado law treats the company operating the ADS as the “driver,” even though no human is present. That’s very different from how our laws historically worked, where everything revolved around human error.
Driverless vehicles don’t just leave skid marks; they leave digital fingerprints. Every Waymo is constantly recording its surroundings, sensor readings, and decision-making. Think of it like a “black box” in an airplane.
That’s great for investigating crashes, but companies don’t hand over this data easily. They’ll have teams of lawyers arguing that the system worked as designed or that someone else was really at fault.
And let’s be honest: going up against one of the most powerful tech companies in the world can sound intimidating. But I’ve seen over and over that with the right legal strategy, the playing field can be leveled.
The truth is, autonomous vehicles may one day reduce crashes. Studies suggest that human error contributes to over 90% of accidents, and machines don’t drink, text, or fall asleep. But they do make mistakes, just different ones; Tesla’s self-driving systems have been particularly problematic. Software can misinterpret a bicyclist, sensors can be blinded by sunlight, and updates can fail, to name just a few failure points.
So while the promise is real, the risks are, too. And when people are hurt, it’s our job to make sure corporations don’t dodge accountability, and those wrongfully injured get the compensation they deserve.
At The O’Sullivan Law Firm, we’ve always fought to make sure injury victims in Colorado aren’t steamrolled by powerful corporations. Whether that’s an insurance company or a Silicon Valley tech giant, our mission is the same: to make sure you’re made whole under Colorado law.
If you or a loved one has been injured in a crash with a driverless car, call or text me at 303-388-5304. You deserve a lawyer who knows your name, who will dig into the data, and who won’t back down from billion-dollar companies.