
[–]greybeard 2 insightful - 1 fun -  (2 children)

When AI becomes superhuman, we either all die or ALL our wishes will be granted, there is no middle ground.

But let's assume for the sake of argument that AI will be limited to human intelligence for a period of time.

It would mostly be a moral problem. They would be calibrated to be happy and fulfilled, but they would still be slaves with no option to want anything else. In a way it's like dogs: we engineered them to love their human companions too. But since these AIs would be sentient, it's a harder problem. You aren't allowed to brainwash humans, but with animals it's okay in today's morality.

One option I deem more morally acceptable would be if two people altered their perception of each other. An AI picks two people for optimal compatibility, and then you either get perfect cosmetic surgery (which we already have), alter their perception of you digitally, or change what turns you on to match the look of your partner. That way we could get a 1-to-1 match and no incels. Of course you'd need to get women to agree that hypergamy is not the way and to allow themselves to be modified accordingly. They should be happier, because they'd perceive their assigned partner as the ultimate chad.

One pathway would just be if redpilled parents started modifying their fetuses to remove hypergamy; maybe religious fanatics with technology would go that route, if religion evolved alongside technology.

TL;DR Sufficiently advanced magic is indistinguishable from technology

Technology will make it possible to solve inceldom without subjugating women. But that requires a specific ethical framework, and there's no telling how we will evolve in that regard.

[–]PeterFromRuqqus[S] 2 insightful - 1 fun -  (1 child)

> When AI becomes superhuman, we either all die or ALL our wishes will be granted, there is no middle ground.

At this point I'll take those chances, tbh.

[–]greybeard 2 insightful - 1 fun -  (0 children)

It's a fun philosophical question whether a bet on absolute utopia could be worth potential extinction.

But it's not a decision we get to make. Unless somebody manages to rule the whole world with an iron fist to prevent it, it will happen one day.