A woman is walking her bicycle across a street at night. She is wearing light colored clothing. A car approaches. Its headlights shine upon her. The car does not slow down. It is traveling at 40 miles per hour. The car does not brake. Inside the car, another woman sits behind the wheel. She does not steer. She does not brake. She just sits there. At the last second, just before the collision, the woman behind the wheel shrieks. But it is too late to react. The pedestrian is down. She is dead.
In the traditional 20th-century car accident case, there was no question about who was responsible: the driver. But this case is different. There was no driver. The car was driving itself. The car is owned by Uber. Uber’s engineers designed the car to be driverless. The woman sitting behind the wheel was not driving. She is an Uber employee who was supposed to be “monitoring” the vehicle, just in case it made a mistake.
This collision, which occurred Sunday night in Tempe, Arizona, was a major setback for Uber, and for the entire self-driving car industry. It is believed to be the first pedestrian death caused by a self-driving car.
Portions of the dashboard camera footage show the exterior and interior of the car in the moments before the collision. Watch them here if you dare. Warning: you may find them disturbing.
What did you notice? Did you notice how bored the Uber “backup driver” was? Did you notice her hands were not near the wheel? Her hands should have been on the wheel, just in case. Did you notice how visible the pedestrian with her bicycle was?
This is one of the problems with self-driving car “backup drivers”. They get bored. And then they get distracted. And then they do not act like “backup drivers” at all. They act like dummies.
Why should you care? That “backup driver” is likely to be YOU one day, once self-driving cars become the norm. Instead of driving yourself to work, your car will drive you to work, but you will be the “backup driver” behind the self-steering wheel. Will you be attentive? Or will you get bored and distracted? Be honest with yourself.
And what if you get struck by a driverless car, like the woman in Tempe? Whom does a New York car accident lawyer like me sue? Was it the “driver’s” fault for not paying attention? Maybe in part. But remember, the car itself was the main driver. The car made the first mistake. Which means the engineers who designed the car made the mistake. Or maybe it was the company executives who decided to put the car on the road knowing full well it had not been adequately tested. Finding the culpable person becomes complicated. It’s not like the 20th century, when you could almost always blame driver error. Now it is the “backup driver” but also the company, Uber, that designed and tested the vehicle for the road.
And the 21st century auto accident case is likely to be much more expensive to prosecute. I may have to hire an engineer to figure out what mistake the vehicle’s manufacturer made in its design.
To make matters worse, our laws have not caught up with self-driving cars. They operate in a legal vacuum: there are few federal rules governing their testing or use on our roadways, and only a handful of states have any rules at all to regulate them.
This recent driverless car accident in Tempe was a reminder that self-driving technology is still in its infancy. The technology still has plenty of bugs and unanswered questions. For example: Can we really expect the backup driver (eventually you) to stay attentive if he or she is just sitting like a dummy in the “driver’s” seat with nothing to do but watch the road? Will laws be passed to make the car manufacturer or designer “automatically” (strictly) liable for technology failures that cause serious injury? Or will lawyers have to hire engineers to prove liability?
Hang around for another decade or so and you’ll find out!