Car and Driver USA, July 2017
sh!t rest of the world. Here at C/D, we're already on metric time and will be transitioning to a metric alphabet in the coming months.

Massaging seats—except for the Ford/Lincoln ones that really dig into your rear end.

Your tired jokes about Subarus being driven by lesbians.

"Pedal misapplication." An inexplicably gentle euphemism for unforgivable stupidity.

Bumper stickers that are already implied by the vehicle: "Go vegan!" on a Prius, or promoting gun rights on a pickup. Not Bullshit: "Driving a hybrid leaves me more money for ammo."

How little karting we and you do. Everybody could use more.
Autonomy: The Trolley Problem Is Not the Problem
A common discussion around autonomous cars ties into an old hypothetical scenario in ethics called "the trolley problem." In this thought experiment, a trolley is barreling out of control toward five people, whom it will surely kill. You could flip a switch and divert the trolley onto a sidetrack, where it will kill only one person. Do you flip the switch, thereby taking an active role in one death, or do nothing, allowing five deaths without any responsibility?

The autonomous equivalent is a self-driving car making decisions about hitting people in a crosswalk versus at an outdoor café, hitting pedestrians versus hitting solid objects and endangering its own driver and occupants, plowing over Earth's last western lowland gorilla rather than running through the rest of the zoo, etc. The hand-wringers wonder how we can possibly program a car to make these decisions. They wonder if this isn't our Ian Malcolm could/should moment. But the flaw in these scenarios is the assumption that a human driver makes a decision at all. It's proved daily that we just hit whatever we're pointed at when we panic. A machine couldn't possibly do worse.
Head-up displays. Station wagons. The Prius. Lewis Hamilton. Formula 1. Stability control. Bicycles. Turn signals.