Self-driving or driverless cars are marketed as the safest way to travel. And while autonomous cars may become the safest mode of transportation a few decades from now, for the moment they remain a relatively unsafe and understudied technology.
Removing the human operator and allowing a vehicle to rely on artificial intelligence, sensors, and onboard computers may sound like a good idea, but how safe are self-driving cars really? “After all, driverless vehicles can eliminate human error, but they also eliminate critical thinking, good judgment, and decision-making abilities, all inherent to human beings, not computers,” reminds our Kansas City self-driving car accident attorney from Mayer & Rosenberg, P.C.
Surveys show that Americans are slowly warming to the idea of riding in autonomous vehicles rather than driving themselves. One survey found that 55 percent of U.S. drivers say they would be willing to switch to self-driving vehicles, but only once those vehicles have a proven safety record better than that of human-operated vehicles.
Indeed, the idea of riding in a driverless car sounds appealing. After all, human error is estimated to cause up to 95 percent of car accidents, including crashes caused by drunk driving, drugged driving, distracted driving, drowsy driving, aggressive driving, and falling asleep behind the wheel.
“However, let’s not forget that an autonomous car is a computer, and computers do not always function as intended,” warns our driverless vehicle accident attorney in Kansas City. This has been borne out in several crashes involving autonomous vehicles. In May 2016, a Tesla operating in self-driving mode crashed after its brakes failed to engage, even though a truck ahead was clearly signaling a turn.
The occupant of the Tesla died in that crash because the Autopilot system’s cameras and sensors failed to function properly. Incidents like this are not unheard of, and they make many Americans reluctant to believe that driverless vehicles can be a safer alternative to human-operated cars.
Clearly, riding in an autonomous vehicle will remain a distant prospect as long as accidents involving self-driving cars continue to make headlines. That said, it would be naïve to demand that a system like Autopilot perform flawlessly 100 percent of the time before adopting it.
Consider this: if self-driving cars got into only one collision per 50 trips, while human-driven vehicles got into ten over the same number of trips, would it still make sense to keep driving ourselves? Well, you get the point.
Another question that remains unclear is how liability should be determined and assigned in a car accident involving a driverless car. In an ordinary accident between two vehicles driven by humans, one of the drivers is most often responsible for the crash. But what about self-driving car accidents?
Because self-driving vehicles are still a relatively new phenomenon, specific laws and regulations governing driverless car accidents and liability have yet to be established in Kansas City, Missouri, and across the U.S. More likely than not, an injured party may be able to sue the manufacturer of the autonomous vehicle or its parts and components, a mechanic responsible for inspecting and maintaining the vehicle, or the vehicle’s owner if he or she failed to maintain it properly.
Determining liability in an accident involving a driverless car is anything but easy. That is why consulting with a Kansas City autonomous car accident lawyer is always a good idea. Contact Mayer & Rosenberg, P.C., to schedule a free consultation.
The information on this website is for general information purposes only. Nothing on this site should be taken as legal advice for any individual case or situation. This information is not intended to create, and receipt or viewing does not constitute, an attorney-client relationship.