Self-Driving Mercedes Will Be Programmed To Sacrifice Pedestrians To Save The Driver
submitted 4 years ago by Mnemonic from fastcompany.com
[–]yetanotherone_sigh 2 insightful - 1 fun - 4 years ago* (0 children)
Lots of jokes and memes and other stuff in this thread, so I'm going to attempt to lay out what is actually being discussed here:
The Trolley Problem
A thought experiment in ethics.
There is a runaway trolley barreling down the railway tracks. Ahead, on the tracks, there are five people tied up and unable to move. The trolley is headed straight for them. You are standing some distance off in the train yard, next to a lever. If you pull this lever, the trolley will switch to a different set of tracks. However, you notice that there is one person on the side track. You have two options:
1. Do nothing and allow the trolley to kill the five people on the main track.
2. Pull the lever, diverting the trolley onto the side track where it will kill one person.
Which is the more ethical option? Or, more simply: What is the right thing to do?
They are attempting to write the rules that teach the car ethics, so that the car knows ahead of time what it will do in an unsolvable situation. If your car cannot stop in time and must choose between smashing the car into an overpass abutment and maybe killing all the occupants (could be up to 1 driver + 4 to 5 passengers) vs. killing one pedestrian, what would the car choose to do?
Obviously this is a last resort, a worst-case scenario. The car isn't going to search out pedestrians to flatten. If the car knows it is going to get into an accident and cannot stop, it should have its decisions set up ahead of time so that its actions will be predictable. Unpredictable computer programming is really dangerous.
Let's say the car cannot stop in time and it has a choice between hitting three objects: a single pedestrian on the left, a telephone pole on the right, and a crowd of people in the middle. If you don't program the car ahead of time, it could end up at a point of indecision, where it cannot make a choice. Instead of turning left or right, it doesn't turn and it plows into the crowd, killing a dozen. We know that a dozen deaths is worse than one death. See?
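To make the "decide it ahead of time" point concrete, here's a minimal sketch of what a deterministic worst-case chooser could look like. Everything here is a hypothetical illustration (the names, the casualty estimates, the tie-breaking rule), not any manufacturer's actual logic:

```python
# Hypothetical sketch: rank unavoidable-collision maneuvers by estimated
# casualties and always pick the minimum, with a deterministic tie-break,
# so the car can never get stuck at a "point of indecision".

from dataclasses import dataclass

@dataclass
class Option:
    maneuver: str         # e.g. "swerve_left" (illustrative name)
    est_casualties: int   # worst-case casualty estimate for this maneuver

def choose_maneuver(options: list[Option]) -> Option:
    """Pick the option with the fewest estimated casualties.

    Ties break alphabetically on the maneuver name, so the same inputs
    always produce the same choice -- no indecision, fully predictable.
    """
    return min(options, key=lambda o: (o.est_casualties, o.maneuver))

# The scenario from the comment above:
options = [
    Option("swerve_left", 1),    # single pedestrian on the left
    Option("swerve_right", 5),   # pole on the right: driver + 4 passengers
    Option("go_straight", 12),   # crowd of people in the middle
]
print(choose_maneuver(options).maneuver)  # swerve_left
```

The point of the sketch isn't the specific numbers; it's that the rule is written down in advance, so the behavior in the worst case is knowable before it ever happens.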