this post was submitted on 15 Sep 2025
442 points (87.3% liked)
Technology
A lot of the studies they list are already years outdated and irrelevant. The models are much more efficient now, and it's mainly the Musk-owned AI data centers that are high-pollution. Most of the pollution from the majority of data centers is not from AI but from other uses.
The old room-sized ENIAC used 150-200 kW of power and couldn't do even a fraction of what your smartphone can. The anti-AI people are taking advantage of most people's ignorance, intentionally using outdated studies and implying that the power usage will continue to grow, when in fact it has already shrunk dramatically.
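The ENIAC comparison can be made concrete with a quick back-of-envelope calculation. A minimal sketch, assuming a ~5 W smartphone SoC under load (that figure is an illustrative assumption, not from the thread; the ~150 kW ENIAC figure is from the comment above):

```python
# Back-of-envelope: ENIAC power draw vs. a modern smartphone SoC.
eniac_watts = 150_000  # ~150 kW, low end of the figure cited above
phone_watts = 5        # assumed typical smartphone SoC under load

ratio = eniac_watts / phone_watts
print(f"ENIAC drew roughly {ratio:,.0f}x the power of a phone SoC")
```

And that's before accounting for ENIAC performing only a few hundred operations per second, while a phone does billions.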
A phone can't do anything on its own. It sends and receives, and the data center does the work. Surely everyone understands this.
Modern AI data centers have already shot right past 200 terawatt-hours and are on track to double again in the next two years.
People can't be this blind.

A phone can do a lot. Much, much more than an ENIAC-era supercomputer (I think you'd have to get pretty close to the end of the previous century to find a supercomputer more powerful than a modern smartphone).
What a phone can't do is run a large LLM. Even powerful gaming PCs struggle with that: they can only run the less capable models, and queries that feel instant on service-based LLMs would take minutes, or at least tens of seconds, on a single consumer GPU. Phones certainly can't handle that, but that doesn't mean they "can't do anything".
I've run small models (a few GB in size) on my Steam Deck. It gives reasonably fast responses (faster than a person would type).
I know they're far from state-of-the-art, but they do work, and the Steam Deck is not going to be using much power.
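The "faster than a person would type" observation lines up with a common rule of thumb: token generation on consumer hardware is usually memory-bandwidth-bound, since every generated token requires reading all the model weights once. A rough sketch, where the ~88 GB/s bandwidth figure and the 4 GB model size are illustrative assumptions:

```python
# Rule of thumb: LLM generation speed on consumer hardware is roughly
# memory bandwidth divided by model size, since each token requires
# streaming all weights through the processor once.
bandwidth_gb_s = 88.0  # assumed Steam Deck LPDDR5 bandwidth (illustrative)
model_size_gb = 4.0    # "a few GB" quantized model, as in the comment above

tokens_per_sec = bandwidth_gb_s / model_size_gb
print(f"upper bound: ~{tokens_per_sec:.0f} tokens/sec")

# A fast typist manages maybe 2-3 tokens/sec equivalent, so even this
# modest ceiling comfortably beats human typing speed.
```

Real throughput lands below that ceiling, but the estimate shows why small quantized models feel usable even on handheld hardware.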
Lol. Guess I can just get rid of my phone's processor then, huh?
And again, you link an image from an outdated study, because the new data shows usage declining, and that wouldn't help your fear-mongering.
Reality is "fear-mongering", is it? I agree.
If it were reality, you’d have some recent data. Might as well make projections on computer power use by starting with the ENIAC, and then you can claim computers are consuming more than our current energy output.