
The Ethical Dilemma: Self-Driving Cars & The Laws of Robotics

Experts believe self-driving cars will become commonplace by 2025. While this amazing technological progress demands celebration, it also raises concerns over ethics and morality.

THE LAWS OF ROBOTICS

Isaac Asimov, sci-fi author and professor, introduced the Three Laws of Robotics in 1942.

Law 1: A robot may not injure a human being or, through inaction, allow a human being to come to harm.

Law 2: A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.

Law 3: A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

He later added a fourth law, also called the Zeroth Law: A robot may not harm humanity, or, by inaction, allow humanity to come to harm.

Can the clear, rules-based code of a computer handle the nuances of ethical dilemmas? Let's take a look at a few hypothetical scenarios.

THE TROLLEY PROBLEM

"Hey, have you ever heard of the Trolley Problem?"
"Nope. What is it?"
"A trolley with broken brakes is speeding towards five helpless people. Fortunately, you have the ability to steer the trolley onto an alternate track..."
"Obviously I'd do that..."
"Wait, I'm not done. The other track has one person on it as well. Which way will you go?"
"Can't I just tell them to get off the tracks?"
"No, they're all tied down."
"Let me think about this for a minute..."

90% of people say they would steer the trolley onto the alternate track. Moral intuition tells them that it's better to kill one person than five.

"I'd still steer the trolley onto the other track, since that would kill fewer people."
"Okay, makes sense. So let's add a little twist to see if you'd always choose to kill fewer people."
"Why would I ever choose to kill more people?"
"Just hear me out. The runaway trolley is again about to hit five people. Except this time, you're on a bridge that the trolley is about to pass under. The only thing that could stop the trolley is something heavy. It just so happens that you're standing next to a very large man. Your only hope to save the five people on the tracks would be to push the large man over the bridge and onto the track. What would you do?"
"Noooo! The large man doesn't deserve that!"
"But throwing him onto the track would save more people."

Most people strongly oppose this version of the problem – even those who previously said they would rather kill one person than five. These two scenarios reveal the complexity of moral principles.

THE TUNNEL PROBLEM

"Okay, fine, forget the large man. Have you heard of the Tunnel Problem?"
"Leave me alone..."
"Just humor me. This one doesn't involve killing anyone."
"Fine."
"You're traveling on a single-lane mountain road in a self-driving car that is quickly approaching a narrow tunnel. Suddenly, a child tries to run across the road but trips, blocking the entrance to the tunnel. The car only has two options:
1. Hit and kill the child
2. Swerve into the wall, killing you
How should the car react?"
"You said no one had to die!!"
"I'm asking what the car should do! This is an important question!"

Hypothetical scenarios like the Tunnel Problem present some of the real difficulties of programming ethics into autonomous vehicles. This is how people responded when asked how the car should react in the Tunnel Problem:

64% would continue straight and kill the child
36% would swerve and kill the passenger

But who should get to decide? Here are the results of one survey:

44% Passenger
33% Lawmakers
12% Manufacturer / Designer
11% Other

Ethics is a matter of sharing a world with others, so building ethics into autonomous cars is a lot more complex than just formulating the "correct" response to a set of data inputs.

THE INFINITE TROLLEY PROBLEM

"Okay, I have one more scenario for you, and this time I really promise you don't have to kill anyone."
"Yeah right."
"What if you were the conductor of the trolley, rattling down a single track, heading towards a single victim? But this time you can hit the brakes to stop and save them."
"Well that's easy. I'd just brake to a halt – no one needs to die or feel guilty."
"...but there's a caveat. Your trolley is infinitely long. The number of passengers on board is as many as it takes to make you reconsider stopping the train. Billions, even."
"I'd still... stop the trolley if I could save the person's life..."
"But every passenger on your trolley has their own plans – all of which will be thrown off track (pun intended) if you halted their trip. So what's your price? How long does your trolley need to be for your passengers' convenience to outweigh the life of just one person?"

Given the current number of vehicular fatalities, waiting for self-driving cars to be 99% (if not 100%) safe disregards the fact that many of these accidents could be prevented once the fatality rate for self-driving vehicles merely dips below that of physically-manned vehicles. By slowing down progress with self-driving cars, we are prioritizing the lives of the few over the lives of the many.

[Chart: "Lives Lost From Self-Driving Vehicles as Technology Improves" versus "Lives Lost From Manned Vehicles," with the gap between the two curves labeled "Opportunity for Lives to be Saved"]

Is waiting for perfection worth it?

Sources: io9 | Medium Backchannel | Open Roboethics Initiative | Robohub.org | Time Magazine | The Yale Law Journal


shared by anumyoon on Jan 16

Publisher

CJ Pony Parts

Category

Transportation
