[–]Alduin 3 insightful - 1 fun - (2 children)

I've not yet heard a convincing argument that we shouldn't make genetically enhanced humans. It basically all comes down to "they'll be better than us, and only the rich will be able to afford it." So it's not really different from any other technological or medical advancement.

[–]Vigte[S] 1 insightful - 1 fun - (1 child)

[–]Alduin 1 insightful - 1 fun - (0 children)

It's a decent argument. Here's my counter.

Firstly, you presume that an AI will reach the same conclusion you did about what's best or most efficient. Not only do I think that's extremely unlikely, but if it were true, you might as well take AI out of the equation, because evidently you, a human, reached that conclusion on your own.

Second, I don't think the replacement of one species with a superior one is necessarily a bad thing.

Third, and this relates to the first point, I think you are vastly underestimating AI. By all indications I've seen, we're about 10 years from Artificial General Intelligence, and then a matter of hours or days from Artificial Super Intelligence (might as well say they're one and the same). An AGI is as smart as a human, with the ability to absorb information, learn, and make decisions based on available data. An ASI is not just smarter than the smartest human, but smarter than all of humanity combined, with capabilities we haven't even thought of. No amount of genetic editing can put humanity at that level for many, many generations.

The two are really separate topics, but AI is every bit as likely to supplant the human race on its own as it is to supplant genetically enhanced humans. Not that THAT is even necessarily a bad thing. It could be the best or worst thing ever to happen to humanity, and we have no idea which.