this post was submitted on 12 Aug 2025
-9 points (37.1% liked)
Controversial - the place to discuss controversial topics
you are viewing a single comment's thread
Hmm, after running an LLM locally I'm not too surprised by the low energy use per query (though 3 Wh is very low; I was getting around 5 Wh per query running DeepSeek locally, but datacenter GPUs will be more efficient at this). That said, this doesn't go into much depth about the impact of training.
It doesn't go into the impact of training, because that isn't what these AI-phobes complain about.
Even just using it will result in calling it "slop" and complaining about its energy usage.
If they complained about the training, they would have a point. But they don't. They complain about anything to do with AI, even when it has revolutionized things like protein folding in medicine.
My sources account for the training.
Even if you add a zero to AI's CO2 emissions and water usage, it's still a drop in the bucket compared to eating beef or taking a flight for a holiday.
Check the sources, that does account for training.
Even if we went with 10 Wh, it's still a drop in the bucket compared to other lifestyle choices we make.
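A rough back-of-the-envelope sketch of that comparison, using assumed round numbers (the grid carbon intensity, beef, and flight figures below are illustrative assumptions, not sourced from this thread):

```python
# Compare a pessimistic 10 Wh/query against other lifestyle choices.
# All constants are assumed round numbers for illustration only.

QUERY_WH = 10                # pessimistic per-query energy from the thread
GRID_G_CO2_PER_KWH = 400     # assumed average grid carbon intensity

# Grams of CO2 attributable to one query at that grid intensity.
query_g_co2 = QUERY_WH / 1000 * GRID_G_CO2_PER_KWH  # 4 g CO2

BEEF_MEAL_G_CO2 = 7_000      # assumed footprint of one beef-based meal
FLIGHT_G_CO2 = 250_000       # assumed short-haul round-trip flight

print(f"One query:      {query_g_co2:.0f} g CO2")
print(f"One beef meal:  {BEEF_MEAL_G_CO2 / query_g_co2:,.0f} queries")
print(f"One flight:     {FLIGHT_G_CO2 / query_g_co2:,.0f} queries")
```

Under these assumptions a single beef meal is worth on the order of a couple thousand queries, and a short flight tens of thousands, which is the "drop in the bucket" point.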
The 3 Wh figure is pretty much "pulled out of the ass". But looking at the other sources listed, it does seem true that similar-sized models use ~2-3 Wh per query, amortized training included. So yeah, I concede the point.
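The "amortized training included" arithmetic can be sketched like this, with assumed round numbers (the training energy, lifetime query count, and marginal inference energy below are illustrative assumptions, not figures from the sources):

```python
# Amortize a one-off training cost over the queries served in a model's lifetime.
# All values are assumed round numbers for illustration only.

TRAINING_MWH = 1_000              # assumed total training energy (1 GWh)
LIFETIME_QUERIES = 1_000_000_000  # assumed queries served over the model's lifetime
INFERENCE_WH = 2.0                # assumed marginal energy per query

# Convert MWh to Wh and spread it across every query ever served.
training_wh_per_query = TRAINING_MWH * 1_000_000 / LIFETIME_QUERIES
total_wh_per_query = INFERENCE_WH + training_wh_per_query

print(f"Training share:  {training_wh_per_query:.2f} Wh/query")
print(f"Total amortized: {total_wh_per_query:.2f} Wh/query")
```

With these numbers the training overhead adds about 1 Wh per query on top of inference, which is how a ~2 Wh inference cost ends up in the ~3 Wh range once training is amortized in.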