there’s this type of reply guy on fedi lately who does the “well actually querying LLMs only happens in bursts and training is much more efficient than you’d think and nvidia says their gpus are energy-efficient” thing whenever the topic comes up
and meanwhile a bunch of major companies have violated their climate pledges and say it’s due to AI, they’re planning power plants specifically for data centers expanded for the push into AI, and large GPUs are notoriously the part of a computer that consumes the most power and emits a ton of heat (which then has to be cooled in a way that wastes and pollutes a fuckton of clean water)
but the companies don’t publish smoking gun energy usage statistics on LLMs and generative AI specifically so who can say
“It only uses 5x as much energy as a regular search! Think of how much energy YOU’RE using with searches!” Okay, so you’re just using 5x as much energy for worse results? And also probably doing it more often than people who just use a normal search engine, because they don’t expect the search engine to talk to them. I’ve never understood how that was supposed to be an exoneration for it, even without taking into account that nobody ever seems to know whether or not that figure includes energy spent on training.
AI bros use literally the same whatabout excuses for their ghastly power consumption that I heard for years from bitcoin bros
like, at least christmas lights bring joy