
Pro Choice Driving

By Ryan Thomas
Self-driving cars might have seemed like a futuristic dream back when the
movie I, Robot came out in 2004, but today they seem just around the corner.
Business Insider predicts that by 2020, there will be over 10 million self-driving cars
on the road (Greenough). Each of these cars will be making complex decisions,
many of which will have serious moral implications. But autonomous vehicles are
still machines, and their decisions are merely the result of calculations based on
algorithms written by humans. That means that the weight of these decisions
currently falls on the shoulders of the engineers and programmers telling the cars
how to act.
If you're wondering what sort of philosophical questions cars would think
about, consider this: a car carrying a single passenger is driving through a
neighborhood at 35 mph when a child chasing a ball runs out in front of it. A
large van is traveling in the opposite lane at the same speed as the autonomous
car. The car does not have time to stop before hitting the child. There is a large
tree between the car and the sidewalk, so the car cannot swerve safely in that
direction. To summarize, the car's choices are hitting the child, hitting the
tree, or hitting the oncoming van.
How would you act if you had control of the car in these circumstances? With
lives at risk, would you be willing to hand over a decision of such magnitude to
some far-removed engineer? I don't think so. I posit that if it becomes possible for
autonomous vehicles to predict the number of fatalities in a given collision,
engineers should let customers decide for themselves the threshold of potential
lives saved beyond which the car will consider options that have a high probability of
killing the driver.
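To make the proposal concrete, here is one way such a threshold could work. This is purely an illustrative sketch; the function name, the option format, and the decision rule are my own assumptions, not an actual vehicle system.

```python
# Hypothetical sketch of the owner-set threshold proposed above.
# Each option is (name, predicted_fatalities, driver_likely_dies).

def choose_maneuver(options, owner_threshold):
    """Pick the maneuver with the fewest predicted fatalities, but only
    consider options likely to kill the driver when they would save at
    least `owner_threshold` lives compared to the best driver-sparing
    option."""
    # Best outcome among maneuvers that spare the driver, if any exist.
    safest_for_driver = min(
        (o for o in options if not o[2]),
        key=lambda o: o[1],
        default=None,
    )
    candidates = list(options)
    if safest_for_driver is not None:
        # Keep a self-sacrificing option only if the lives it saves
        # meet the owner's chosen threshold.
        candidates = [
            o for o in options
            if not o[2] or safest_for_driver[1] - o[1] >= owner_threshold
        ]
    return min(candidates, key=lambda o: o[1])
```

For example, given the options `[("hit pedestrians", 3, False), ("hit van", 2, True)]`, an owner threshold of 1 selects hitting the van (one net life saved), while a threshold of 2 rejects the self-sacrificing option and hits the pedestrians. The owner, not a far-removed engineer, sets that number.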
Now come back to our scenario. Let's ignore, for the moment, that
realistically you wouldn't even have time to make an informed decision; we're
thinking from the perspective of an autonomous car, which can make these
calculations far more quickly than a human. Also ignore the fact that under current
law you would have to pay for the damages to the oncoming van and the injuries to its
driver: vehicular and traffic laws will need to be updated with the advent of
autonomous vehicles, and are therefore beyond the scope of this essay.
Hitting the oncoming van will injure both you and the driver of the van, and
the combined speeds of the two vehicles mean that each of you will suffer
worse injuries than you alone would if you hit the tree. This eliminates the choice
of hitting the van, but you are still left with the choice between hitting the child and
killing her or hitting the tree and putting yourself at risk. Since your chances of
dying in a collision with a tree at this speed aren't too bad, the moral choice in this
scenario is to prioritize the child and hit the tree.
Now let's raise the stakes. Instead of cruising through a suburb, your car is
racing down a two-lane highway at 60 mph. There is no divider between the two
lanes, and the van from the previous scenario is in the opposite lane, traveling at the
same speed as you. A crowd of people is walking by the side of the road when one
of them falls into your lane. Two members of the crowd rush over to pull their companion
out of the way, but your car knows it's too late. No more swerving into trees and
saving everyone; this time, someone is going to die. Your options are now hitting
the oncoming van (killing both you and the other driver), hitting the three people in
the road (killing all three of them but causing little injury to you), or swerving into
the crowd, killing more pedestrians than you would have had you stayed on
course. Swerving into the crowd can be eliminated immediately, as it
would make no sense to take deliberate action that causes a greater loss of
life when simply staying on course would put you (the driver) at the same amount
of risk.
So what would you do? Would you sacrifice yourself and the hapless driver of
the van to prevent a greater loss of life? Or would you take the selfish action and
mow down the three pedestrians to save yourself?
Look back on the decisions you made in the first portion of this essay.
Wouldn't you feel more comfortable buying a self-driving car if you were the one
making these decisions for yourself, rather than some far-removed power handing
down a unilateral mandate? After all, it's your life that's at stake here.
It is unnecessary to impose a single standard that takes away drivers' freedom
to choose. For situations like the first one, in which a life could be saved
with only the car being damaged, the engineers could implement an opt-out system
in which drivers must specifically mandate that their vehicle refuse to sustain even
minimal damage in order to save a life. It's almost certain that virtually every
driver will make the humane choice. The cost of every regulation and law we
impose is the freedom of the people, and sometimes the benefit of the law is worth the
cost. For example, the Food and Drug Administration restricts the freedoms of
pharmaceutical corporations in order to protect consumers from purchasing the
wrong drugs. Were it not able to do this, customers might suffer serious health
complications or even die from taking the wrong medications. But if we force drivers
to risk damage to their cars, we would be imposing a choice that consumers would
all but surely make for themselves. Since forcing that choice offers no tangible
benefit, it's easy to see why consumers should control how their car behaves in
situations where a life can be saved.

It's also necessary that altruistic drivers be given the choice to have their
cars put the lives of others before their own in situations where some loss of life is
guaranteed. This option will persuade drivers with stronger consciences to adopt
self-driving cars, as they might not want to drive a car that they think would kill
others on their behalf. Having more people in self-driving cars benefits society
as a whole, as these vehicles are proven to be less accident-prone than vehicles
under human operation. Reducing loss of life has always been imperative to the
automotive industry, as can be seen with advancements in adaptive cruise control,
occupant-sensitive airbags, and emergency brake assist, and the option to let
your car sacrifice you in order to prevent a greater loss of life can be seen as just
another advancement in this chain of progress.
There is, however, a moral objection to this claim that warrants rebuttal. If
engineers pass the choice between prioritizing the life of the driver and
minimizing overall loss of life on to the consumer, they still need to create the
algorithm that chooses the few over the many. In doing this, it is arguable that they
are partly responsible for the increased loss of life. If they had simply designed one
option, by which the car always tried to save the most lives even if that meant
sacrificing the driver, then fewer people would die in accidents involving
self-driving cars.
The critical flaw in this argument is that it presents the choice as a binary
decision in which the engineer either makes the morally responsible decision or fails
to do so. What it does not take into consideration is that consumers are not
obligated to purchase a self-driving car at all. A 2015 study found that 30 percent
of people would not choose to own a car that swerved into oncoming traffic to
save the lives of 10 pedestrians if it meant that the owner would die as a
result. In the US alone, there are 253 million cars (Hirsh). On average, 30,800
people are killed per year in car accidents ("Fatality Analysis Reporting System
Encyclopedia"). Researchers have estimated that self-driving cars could reduce the
number of accidents per year by 90 percent, a total of roughly 27,700 lives saved. But if
engineers only build self-sacrificing cars, the 30 percent mentioned earlier could
cause an increase of approximately 8,300 vehicular fatalities per year.
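The figures above can be reproduced with a back-of-the-envelope calculation. The sketch below assumes (as the argument implies) that the 30 percent who refuse a self-sacrificing car keep today's fatality rate, while everyone who switches sees the full 90 percent reduction.

```python
# Reconstructing the essay's estimate of extra deaths caused by
# refusers sticking with human-driven cars.

annual_deaths = 30_800   # average US vehicular fatalities per year
reduction = 0.90         # estimated drop in fatalities with self-driving cars
refusers = 0.30          # share who would not own a self-sacrificing car

# If everyone switched, ~27,700 lives would be saved each year.
lives_saved_if_all_switch = annual_deaths * reduction

# Deaths if everyone switches vs. if 30 percent refuse to switch.
deaths_if_all_switch = annual_deaths * (1 - reduction)
deaths_with_refusers = (refusers * annual_deaths
                        + (1 - refusers) * annual_deaths * (1 - reduction))

extra_deaths = deaths_with_refusers - deaths_if_all_switch
print(round(lives_saved_if_all_switch), round(extra_deaths))  # 27720 8316
```

Rounded, these are the 27,700 and 8,300 figures cited above.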
Granted, these calculations assume that all drivers would otherwise switch
to autonomous vehicles, but it is still evident that the additional lives lost in
collisions involving driver-prioritizing autonomous vehicles would be far outweighed
by the lives lost to drivers refusing to purchase such cars at all. Even if you disagree
with the individualistic and altruistic arguments made earlier in this essay, you should
still be swayed by the utilitarian conclusion that more people will die if consumers
aren't given control over the behavior of their autonomous vehicles.

Works Cited
"Fatality Analysis Reporting System Encyclopedia." National Highway Traffic Safety
Administration, www-fars.nhtsa.dot.gov. Web. 26 Oct. 2015.
Greenough, John. "The Self-Driving Car Report: Forecasts, Tech Timelines, and
the Benefits and Barriers That Will Impact Adoption." Business Insider,
29 July 2015. Web. 27 Oct. 2015.
Hirsh, Jerry. "253 Million Cars and Trucks on U.S. Roads; Average Age Is 11.4 Years."
Los Angeles Times, 9 June 2014. Web. 1 Nov. 2015.
