Drunk driving a self-driving car sounds like an oxymoron. Yet last month, in two separate incidents in California, both drivers blamed their vehicles for the crashes. In both cases, the drivers told authorities that they were not driving; the cars were. It can feel like a leap of faith to take your feet off the pedals and let the car continue moving on its own. But although self-driving vehicles are supposed to reduce human error, the person behind the wheel is still on the hook.

Blame the car

In the first incident, police came upon a Tesla stopped on the Bay Bridge. They found a man passed out in the driver’s seat, with a blood-alcohol level over the legal limit. He gamely contended that the car was driving, but he was nonetheless arrested for drunk driving.

In the second incident, a man had activated the autopilot on his Tesla, which subsequently slammed into a fire truck parked on the side of the road. Miraculously, no one was seriously injured. The driver could be found both civilly and criminally negligent; he, in turn, could sue Tesla for product liability.

Even if your car can drive itself, that does not mean you can pin the blame on the vehicle alone. Tesla cautions its customers that “autopilot is intended for use only with a fully attentive driver.” The human driver must remain alert and ready to assume the controls, because they are ultimately responsible for the vehicle and its actions. Even if you could claim that you were not driving, laws governing physical control of a vehicle could still apply. Those laws reach any driver who is able to take control of the vehicle from the auto-drive system.

Dependent on technology

Partially autonomous driving systems can give drivers a false sense of security. As the industry continues to grow and the technology advances, drivers may become overconfident in their vehicle’s abilities, making it all the easier to forget that these systems are not foolproof. The DMV says it is committed to requiring the highest safety standards on the roads, but it is difficult to keep pace with ever-changing technology.