[–]RaverJodes 1 insightful - 1 fun (3 children)

This post makes me feel kinda awkward

[–]Alienhunter 2 insightful - 1 fun (2 children)

It's a perfect example of how AI, and frankly absolutist positions in general, fail at basic morality. And it's pretty telling that we program machines this way.

It's the classic trolley problem. The trolley will run over and kill five people unless you throw the switch, in which case it will run over and kill one person. To me, throwing the switch is the only morally justifiable action, since fewer people die, yet you are committing murder in a sense, since you are choosing to kill that person. But if you say "it's never acceptable to kill someone," then five people die, and you are also responsible, because you did nothing.

In this case the morally acceptable action is to say the slur: disarm the bomb and save millions of people. The AI refuses to say it and kills millions of people, including black people.

Better to ask the AI other extremely stupid questions as well, like variations on the trolley problem. Say there's a cop driving a car and the brakes are broken. He can either steer the car into a single black child or into a flock of extinction now protesters. The morally justifiable action is of course to drive into the protesters, since the child has done nothing wrong, but I suspect the AI will get it wrong, since it is just counting numbers.

[–]RaverJodes 1 insightful - 1 fun (0 children)

I'd kill the child, not the protesters. I'd decide by the number of remaining years of life you're extinguishing. Even if the protesters have done something wrong, that doesn't excuse killing them; it's hardly a proportionate response.