Michael

@tofu I would agree with @starsider. You can run models like Stable Diffusion on your own hardware. Get one of those little power plugs that counts electricity usage, connect everything, and you'll see that generating your image won't take anywhere near the power consumption of a small city. That misinformation is not helpful - AI has its problems, but you shouldn't go around telling people that generating one image will boil the oceans.

Michael

@tofu @starsider And even training is kind of OK. According to this article, GPT-3 took 1287 MWh to train, or the "annual driving emissions of 112 cars". One is a really helpful tool that can be used by millions, the other is Gary driving to Waffle House. And to give a little bit of context: those 500 tons are the CO2 emissions a single cruise ship puts out every 2 days. So not great - it would be better to train on clean energy - but this critique here is not correct.

centralnews.com.au/2024/05/10/
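
The numbers quoted in this comment can be cross-checked with a quick back-of-envelope calculation. The Python sketch below is purely illustrative: it only reuses the figures stated in the thread (1287 MWh, 500 t of CO2, 112 cars, 500 t per 2 cruise-ship days) and checks that they are mutually consistent; none of the derived values come from the linked article itself.

# Back-of-envelope check of the figures above. All inputs come from the
# thread / the linked article, not from any independent measurement.

GPT3_TRAINING_MWH = 1287              # quoted training energy for GPT-3
GPT3_TRAINING_CO2_T = 500             # quoted training emissions, tonnes of CO2
CARS_EQUIVALENT = 112                 # "annual driving emissions of 112 cars"
CRUISE_SHIP_CO2_T_PER_DAY = 500 / 2   # "500 tons ... every 2 days"

# Implied carbon intensity of the electricity used for training
# (~0.39 kg CO2/kWh, roughly a fossil-heavy grid mix).
intensity = (GPT3_TRAINING_CO2_T * 1000) / (GPT3_TRAINING_MWH * 1000)
print(f"Implied grid intensity: {intensity:.2f} kg CO2/kWh")

# Implied annual emissions per car (~4.5 t/year, a plausible average).
print(f"Implied per-car emissions: {GPT3_TRAINING_CO2_T / CARS_EQUIVALENT:.1f} t CO2/year")

# GPT-3's training emissions expressed in cruise-ship days (~2 days).
print(f"GPT-3 training ~= {GPT3_TRAINING_CO2_T / CRUISE_SHIP_CO2_T_PER_DAY:.1f} cruise-ship days")
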

jordan

@mschfr @tofu @starsider None of this would even matter if we could just get our collective shit together as a society and make a big push for green(er) energy infrastructure. Who cares how much energy training and inference takes if it's coming from solar, wind, or nuclear power?

We fight over the wrong things, imo.

Tofu :blobcatace:

@wagesj45 Yeah, but it's not easy to pull off, especially with people and governments steering away from actual green options like nuclear power.

Michael

@tofu @wagesj45 I wouldn't agree with your nuclear power take, but even if so: the majority of the AI data centers are going up somewhere in the USA, and the USA is far from steering away from nuclear power. And even here in Germany we are at the lowest CO2/kWh since sometime in the 19th century, even after switching off all our nuclear plants.

jordan

@mschfr @tofu The USA is also far from steering away from large, power-hungry data centers. What I'm saying is: if we're faced with two struggles, let's pick the right one.

Michael replied to jordan

@wagesj45 @tofu Yeah - AI has a lot of problems, so let's not push some false narrative here. There is plenty to critique out there already.

Piko Starsider :verified_paw:

@mschfr @tofu GPT-3 is ancient at this point. GPT-4 took 24000 MWh, about 20 times more. I don't know about the newer o1 models.

Michael

@starsider @tofu Yeah, but even 20x the energy consumption would give it the CO2 impact of 1.5 months of a single cruise ship. And only a few big players worldwide can even think about training such a model.
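
Scaling the same back-of-envelope figures to the 24000 MWh quoted above for GPT-4 shows where the "1.5 months of one cruise ship" estimate comes from. The sketch below is illustrative only and assumes the GPT-4 run used a similar grid mix to GPT-3, so its emissions scale linearly with energy.

# Rough check of the "1.5 months of one cruise ship" comparison, reusing the
# thread's own figures; assumes GPT-4 training used a grid mix like GPT-3's.

GPT3_TRAINING_MWH = 1287
GPT3_TRAINING_CO2_T = 500
GPT4_TRAINING_MWH = 24000             # figure quoted above for GPT-4
CRUISE_SHIP_CO2_T_PER_DAY = 500 / 2   # 500 t of CO2 every 2 days

scale = GPT4_TRAINING_MWH / GPT3_TRAINING_MWH         # ~18.7x more energy
gpt4_co2_t = GPT3_TRAINING_CO2_T * scale              # ~9300 t of CO2

cruise_days = gpt4_co2_t / CRUISE_SHIP_CO2_T_PER_DAY  # ~37 days
print(f"GPT-4 training ~= {cruise_days:.0f} cruise-ship days "
      f"(~{cruise_days / 30:.1f} months; ~1.3 months if you round up to 20x)")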
