Death by Driverless Car

A number of news outlets are already reporting the first pedestrian death involving an autonomous vehicle, which struck and killed a woman in Tempe, Arizona.  According to the news reports, the vehicle, a Volvo XC90, was operated by Uber with a human safety driver on board.  The woman was crossing a street, at a point other than a crosswalk, with her bicycle.  It should be pointed out that jaywalking is neither recommended nor legal in most municipalities.  The vehicle failed to detect her presence.

First and foremost, this is a prime example of how far we, as a society, are from having autonomous vehicles take us where we want to go.  Second, there was a human being in the vehicle when the accident occurred, supposedly for safety purposes.  So why didn't he take the wheel and try to avoid hitting the woman?  Did the event present itself so suddenly that there wasn't time for him to react?  Or was he so surprised by the woman's appearance that he was stunned and couldn't move?  It is possible that the woman walked into traffic in front of the Volvo with no thought about what she was doing.  In that case, she was most likely at fault.

However, we are constantly told that autonomous vehicles will be much safer and that the accident rate will go down once these vehicles are in common use.  But what about when the car's computer has to make an ethical decision?  What does the vehicle do when it must choose between killing a pedestrian and crashing into another vehicle, risking the lives of some or all of its occupants?  Speaking from a forensic standpoint, remember that the vehicle's movements are preprogrammed at the manufacturer's factory based on the various scenarios drivers face every day.  Are you ready to forfeit your life on the basis of what some programmer thinks is the right thing to do?
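To make that last point concrete, consider a deliberately simplified, hypothetical sketch of what "preprogrammed at the factory" can mean in practice.  Nothing here is drawn from any real vehicle's software; the maneuver names, risk numbers, and weights are all invented for illustration.  The point is that someone chooses the weights in advance, and the car simply minimizes the resulting score:

```python
# Hypothetical illustration only -- not from any real autonomous-vehicle codebase.
# A factory-set policy scores each available maneuver and picks the "cheapest"
# one, which is exactly the kind of tradeoff a programmer decides in advance.

from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    pedestrian_risk: float   # estimated probability of harming a pedestrian (0-1)
    occupant_risk: float     # estimated probability of harming the occupants (0-1)

# These weights encode the ethical tradeoff ahead of time.
# The values are invented for this sketch.
PEDESTRIAN_WEIGHT = 1.0
OCCUPANT_WEIGHT = 1.0

def cost(m: Maneuver) -> float:
    """Combine the two risks into a single number the planner can minimize."""
    return PEDESTRIAN_WEIGHT * m.pedestrian_risk + OCCUPANT_WEIGHT * m.occupant_risk

def choose(maneuvers: list[Maneuver]) -> Maneuver:
    """Pick the maneuver with the lowest preprogrammed cost."""
    return min(maneuvers, key=cost)

if __name__ == "__main__":
    options = [
        Maneuver("brake hard, stay in lane", pedestrian_risk=0.7, occupant_risk=0.1),
        Maneuver("swerve into oncoming traffic", pedestrian_risk=0.1, occupant_risk=0.6),
    ]
    print("Chosen maneuver:", choose(options).name)
```

Whoever sets those weights has, in effect, decided in advance whose safety counts for how much.  That decision is made long before the car ever encounters a pedestrian.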
