Tofu :blobcatace:

@catsalad I think this is blown out of proportion, especially considering that genAI models for image generation aren't that big and can run on consumer grade hardware.

Piko Starsider :verified_paw:

@tofu @catsalad Yes, but the energy it takes to train those models is still huge. And they will keep training models as long as people keep using AI "art" services.

Tofu :blobcatace:

@starsider Yes, but my point is that this prompt is kind of blowing unrelated things out of proportion. An image at this size wouldn't even take a second to generate on a 4090, so the amount of energy consumed is going to be pretty low.
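
(A rough back-of-the-envelope sketch of that claim. The ~1 second generation time is the figure claimed above; the ~450 W full-load draw is an assumed ballpark for an RTX 4090, not a number from this thread.)

```python
# Back-of-the-envelope energy estimate for one image on an RTX 4090.
# The 450 W full-load draw is an assumption; the 1 s generation time is the claim above.
gpu_power_watts = 450          # assumed full-load power draw of the card
seconds_per_image = 1.0        # claimed generation time for a small image

energy_joules = gpu_power_watts * seconds_per_image   # watts * seconds = joules
energy_kwh = energy_joules / 3_600_000                # 1 kWh = 3.6 million joules
print(f"~{energy_joules:.0f} J ≈ {energy_kwh:.6f} kWh per image")  # ~450 J ≈ 0.000125 kWh
```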

Piko Starsider :verified_paw:

@tofu Yes, but we should keep all the costs in mind. If there's high enough demand for local models, there will also be demand for online services and for training new models. Making AI art should be frowned upon, and this is just one of the reasons.

Michael

@tofu I would agree with @starsider . You can run models like Stable Diffusion on your own hardware. Get one of those little power plugs that measures electricity usage, connect everything, and you'll see that generating your image won't take anywhere near the power consumption of a small city. That misinformation is not helpful - AI has its problems, but you shouldn't go around telling people that generating one image will boil the oceans.

Michael

@tofu @starsider And even training is kind of OK. According to this article, GPT-3 took 1287 MWh to train, or the "annual driving emissions of 112 cars". One is a really helpful tool that can be used by millions, the other is Gary driving to Waffle House. And to give a little bit of context: those 500 tons are the CO2 emissions a single cruise ship puts out every 2 days. So not great, and it would be better to train on clean energy, but this critique here is not correct.

centralnews.com.au/2024/05/10/

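(A quick sanity check of those quoted figures, taking the thread's numbers at face value; the per-car and per-ship rates below are only what the quoted comparisons imply, not independent data.)

```python
# Sanity check using only the figures quoted above (rough, secondhand numbers).
training_co2_tonnes = 500            # reported CO2 for training GPT-3
cars = 112                           # "annual driving emissions of 112 cars"
ship_tonnes_per_day = 500 / 2        # "every 2 days" implies ~250 t/day per ship

print(f"{training_co2_tonnes / cars:.1f} t CO2 per car per year")                  # ~4.5 t, plausible for an average car
print(f"{training_co2_tonnes / ship_tonnes_per_day:.0f} days of one cruise ship")  # 2 days, as stated
```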

jordan

@mschfr @tofu @starsider None of this would even matter if we could just get our collective shit together as a society and make a big push for green(er) energy infrastructure. Who cares how much energy training and inference takes if it's coming from solar, wind, or nuclear power?

We fight over the wrong things, imo.

Tofu :blobcatace:

@wagesj45 Yeah, but it's not easy to pull off, especially with people and governments steering away from actual green options like nuclear power.

Michael

@tofu @wagesj45 I wouldn't agree with your nuclear power take, but even if so: the majority of the AI data centers are going up somewhere in the USA, and the USA is far from steering away from nuclear power. And even we here in Germany are at the lowest CO2/kWh since sometime in the 19th century, even after switching off all nuclear plants.

jordan

@mschfr @tofu The USA is also far from steering away from large, power-hungry data centers. What I'm saying is, if we're faced with two struggles, let's pick the right one.

Michael replied to jordan

@wagesj45 @tofu Yeah - AI has a lot of problems, so let's not push some false narrative here. There is a lot to critique out there.

Piko Starsider :verified_paw:

@mschfr @tofu GPT-3 is ancient at this point. GPT-4 took 24000 MWh, about 20 times more. I don't know about the newer o1 models.

Michael

@starsider @tofu Yeah, but even 20x the energy consumption would give it the CO2 impact of about 1.5 months of a single cruise ship. And only a few big players worldwide can even think about training such a model.
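
(Scaling the earlier figures by 20x as a rough check, assuming CO2 scales linearly with training energy; the cruise-ship rate is the one implied upthread, not an independent number.)

```python
# Rough check of the "about 1.5 months of one cruise ship" claim,
# assuming CO2 emissions scale linearly with training energy.
gpt3_co2_tonnes = 500                 # figure quoted earlier in the thread
ship_tonnes_per_day = 500 / 2         # ~250 t/day, implied by "500 tons every 2 days"

gpt4_co2_tonnes = 20 * gpt3_co2_tonnes           # ~10,000 t for a 20x larger run
days = gpt4_co2_tonnes / ship_tonnes_per_day     # ~40 days
print(f"≈ {days:.0f} days ≈ {days / 30:.1f} months of one cruise ship")  # ~1.3 months
```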
