On behalf of Restovich Braun & Associates posted in fatal motor vehicle accidents on Friday, January 22, 2016.
Self-driving cars are likely to reach public roadways en masse before long. Some are already on the road today, but each carries a human driver who can take over if necessary. Current predictions suggest that fully autonomous cars, with no driver and no way for a human to take control, may eventually join them.
A great deal of work remains before self-driving cars can be considered safe. Some of the hardest problems are moral ones, such as deciding how a car should react when a collision is unavoidable.
Self-driving cars can be programmed to weigh a number of factors when reacting to a situation. One research group has posed the question of whether the cars should be programmed to minimize the total loss of life in an accident or to protect the car's occupants above all.
Some have noted that this dilemma is a catch-22. On the one hand, people are more apt to buy a car that will protect them, even if that means others are harmed. On the other hand, the cars may never be widely accepted if they don't minimize the overall loss of life.
The dilemma raises another important question: who will be held liable if a self-driving car kills someone, the manufacturer or the owner? That question must be answered before self-driving cars are sold to the public.