[–] a_wild_mimic_appears@lemmy.dbzer0.com 1 points 1 day ago (1 children)

The energy costs are overblown. A response costs about 3 Wh, which is about 1 minute of runtime for a 200 W PC, or 10 seconds of a 1000 W microwave. See the calculations made here and below for the energy costs. If you want to save energy, go vegan and ditch your car; shutting down ChatGPT entirely amounts to 0.0017% of the CO2 reduction during COVID in 2020 (this guy gave the numbers, but had an error in magnitude, which I fixed in my reply; calculator output is attached). It would help climate activists if they concentrated on something that is actually worth criticizing.
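
If anyone wants to sanity-check those runtime comparisons, here's a minimal Python sketch of the conversion. The 3 Wh per response comes from the linked estimates; the device wattages are just the round numbers used above.

```python
# Back-of-the-envelope check of the runtime comparisons above.
RESPONSE_WH = 3.0          # estimated energy per ChatGPT response, in watt-hours (from the linked estimates)
PC_WATTS = 200.0           # typical PC draw used in the comparison
MICROWAVE_WATTS = 1000.0   # typical microwave draw used in the comparison

def runtime_seconds(energy_wh: float, power_w: float) -> float:
    """Seconds a device drawing `power_w` watts runs on `energy_wh` watt-hours."""
    return energy_wh / power_w * 3600

print(f"200 W PC:         {runtime_seconds(RESPONSE_WH, PC_WATTS):.0f} s")        # ~54 s, roughly a minute
print(f"1000 W microwave: {runtime_seconds(RESPONSE_WH, MICROWAVE_WATTS):.0f} s")  # ~11 s
```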

If I read a book and use phrases from it in my own communication, that's covered under fair use; the same should apply to scraping the web, or else we may as well shut down the Internet Archive next. Since LLM output isn't copyrightable, I see no issue with it. And copyright law in the US is an abomination that is mostly useful as a weapon for big companies; small artists don't really profit from it.

[–] verdigris@lemmy.ml 1 points 1 day ago (1 children)

The costs for responses are overblown, but the costs for training are not.

Adding the cost of training, which is a one-time cost, raises ChatGPT's per-response energy from about 3 Wh to 4 Wh. That's the high-end calculation, btw.
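
As a rough illustration of how that amortization works: the training-energy and lifetime-response figures below are hypothetical placeholders (chosen only so the arithmetic lands on the 3 Wh → 4 Wh figure above), not numbers from the linked calculation.

```python
# Illustrative amortization of a one-time training cost over responses served.
# NOTE: TRAINING_ENERGY_WH and LIFETIME_RESPONSES are hypothetical placeholders.
TRAINING_ENERGY_WH = 50e9   # assume ~50 GWh of one-time training energy (hypothetical)
LIFETIME_RESPONSES = 50e9   # assume ~50 billion responses over the model's lifetime (hypothetical)
INFERENCE_WH = 3.0          # per-response inference estimate from the thread

amortized_training_wh = TRAINING_ENERGY_WH / LIFETIME_RESPONSES
total_per_response_wh = INFERENCE_WH + amortized_training_wh

print(f"Training share per response: {amortized_training_wh:.1f} Wh")   # 1.0 Wh
print(f"Total per response:          {total_per_response_wh:.1f} Wh")   # 4.0 Wh
```

The point of the sketch is just that a one-time cost spread over tens of billions of responses adds on the order of an extra watt-hour each, which is what moves the estimate from 3 Wh to 4 Wh.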