MSFT needs so much power to train AI that it's considering small nuclear reactors
submitted 7 months ago by [deleted] from (futurism.com)
[–]In-the-clouds 1 insightful - 1 fun - 7 months ago (0 children)
Training large language models is an incredibly power-intensive process that has an immense carbon footprint.
Here in the city where I live, our electricity is not carbon-based: a hydroelectric generator produces the power from water flowing through turbines.
"Carbon footprint" is one of those propaganda phrases used to misdirect people away from the source of our problems: our sin. People don't pray and ask the Lord, what have I done? Forgive me.
For those who like true science, water vapor is the greatest greenhouse gas. But the controllers don't remind the public of this.