
[–] i_love_FFT@lemmy.ml 6 points 1 year ago (6 children)

I'd like all AI services to publish the energy used in training the model and in performing inference.

"Queries uses an average of X kWh of power. A model training run requires X MWh, and the development of this model over the years required X TWh of power."

Then we could judge companies by that metric. Of course, rich people would seek out the most power-hungry model just for the sake of it.

[–] FrenchThrowAway@jlai.lu 7 points 1 year ago* (last edited 1 year ago) (4 children)

That's already something that Meta is doing for their Llama models:

Source

You can extrapolate OpenAI models' consumption from these, I guess.
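For instance, a rough back-of-the-envelope sketch of that kind of extrapolation (the Llama GPU-hours figure is roughly the order of magnitude Meta reports in its model cards; everything on the OpenAI side is a pure placeholder guess, since they don't publish anything comparable):

```python
# Back-of-the-envelope extrapolation of training energy from published GPU-hours.
# Only the Llama figure is the kind of number Meta actually reports; the GPT-side
# multiplier is a placeholder assumption, since OpenAI discloses nothing comparable.

GPU_AVG_DRAW_KW = 0.7   # assumed average draw of an H100-class GPU, in kW
PUE = 1.1               # assumed datacenter power usage effectiveness overhead

def training_energy_mwh(gpu_hours: float) -> float:
    """Approximate training energy (MWh) from reported GPU-hours."""
    return gpu_hours * GPU_AVG_DRAW_KW * PUE / 1_000

llama3_gpu_hours = 7_700_000                 # roughly what the Llama 3 model card reports
gpt_gpu_hours_guess = 5 * llama3_gpu_hours   # pure guess: "several times larger"

print(f"Llama-scale run:          ~{training_energy_mwh(llama3_gpu_hours):,.0f} MWh")
print(f"Guessed frontier GPT run: ~{training_energy_mwh(gpt_gpu_hours_guess):,.0f} MWh")
```

The point isn't the exact numbers, just that once GPU-hours are disclosed, the rest is simple arithmetic.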

[–] Skullgrid@lemmy.world 8 points 1 year ago (3 children)

ok, but

  1. Is it still bad if they use renewables? In that case it's not horrendous, is it?

  2. What about the rest of their servers?

  3. Fuck Facebook

[–] FrenchThrowAway@jlai.lu 0 points 1 year ago

  1. Power consumption is still power consumption, so 2,290,000 kg of CO2 is a lot, even if it's far lower than it would have been with coal plants (rough numbers sketched below)
  2. They only talk about power consumption and not the hardware footprint of the servers, because power consumption is the easier one to offset
  3. Yes
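To put rough numbers behind point 1, here's a sketch of how much the grid mix matters (the intensities are commonly cited ballpark figures, not anything from Meta's disclosure):

```python
# Sketch of "power consumption is still power consumption": the same training
# energy maps to very different CO2 totals depending on the grid mix.
# All values are rough ballpark assumptions, not figures from Meta's report.

TRAINING_ENERGY_KWH = 5_900_000  # assumed energy of a Llama-scale training run

# Approximate carbon intensities in grams of CO2 per kWh (ballpark values).
GRID_INTENSITY_G_PER_KWH = {
    "coal-heavy grid":   900,
    "average US grid":   390,
    "mostly renewables":  50,
}

for grid, intensity in GRID_INTENSITY_G_PER_KWH.items():
    tonnes_co2 = TRAINING_ENERGY_KWH * intensity / 1_000_000  # grams -> tonnes
    print(f"{grid:>18}: ~{tonnes_co2:,.0f} t CO2")
```

On an average grid this lands near the 2,290 t CO2 figure above; on coal it would be roughly double, and on mostly renewables an order of magnitude lower, even though the kWh drawn are identical.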