this post was submitted on 14 Aug 2025
795 points (98.5% liked)

Technology

74055 readers
5741 users here now

This is a most excellent place for technology news and articles.


Our Rules


  1. Follow the lemmy.world rules.
  2. Only tech related news or articles.
  3. Be excellent to each other!
  4. Mod approved content bots can post up to 10 articles per day.
  5. Threads asking for personal tech support may be deleted.
  6. Politics threads may be removed.
  7. No memes allowed as posts, OK to post as comments.
  8. Only approved bots from the list below, this includes using AI responses and summaries. To ask if your bot can be added please contact a mod.
  9. Check for duplicates before posting, duplicates may be removed
  10. Accounts 7 days and younger will have their posts automatically removed.

Approved Bots


founded 2 years ago
MODERATORS
you are viewing a single comment's thread
view the rest of the comments
[–] redsunrise@programming.dev 299 points 1 day ago (5 children)

Obviously it's higher. If it was any lower, they would've made a huge announcement out of it to prove they're better than the competition.

[–] ChaoticEntropy@feddit.uk 23 points 1 day ago

I get the distinct impression that most of the focus for GPT-5 was making it easier to divert their overflowing volume of queries to less expensive inference routes.

[–] Ugurcan@lemmy.world 32 points 1 day ago* (last edited 1 day ago) (3 children)

I’m thinking otherwise. I think GPT-5 is a much smaller model, with some fallback to previous models if required.

Since it’s running on the exact same hardware with a mostly similar algorithm, using less energy would directly mean it’s a “less intense” model, which translates to “inferior quality” in American Investor Language (AIL).

And 2025’s investors don’t give a flying fuck about energy efficiency.
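The "same hardware, less energy implies smaller model" argument can be made concrete with a back-of-the-envelope estimate: a dense decoder spends roughly 2 FLOPs per parameter per generated token, so on fixed hardware, energy per query scales with parameters × tokens. All numbers below are illustrative assumptions, not measured figures for any OpenAI model:

```python
# Rough sketch: on fixed hardware, inference energy per query scales
# with (active parameters x generated tokens). Numbers are invented
# for illustration only.

def joules_per_query(params: float, tokens: int,
                     joules_per_flop: float = 1e-11) -> float:
    """~2 FLOPs per parameter per generated token (dense decoder)."""
    flops = 2 * params * tokens
    return flops * joules_per_flop

big = joules_per_query(1.8e12, 500)    # hypothetical large model
small = joules_per_query(2e11, 500)    # hypothetical smaller model

# Same hardware, same response length: the energy ratio tracks
# the parameter ratio (here 9x).
print(big / small)
```

Under this approximation, a disclosed per-query energy figure would leak the effective model size, which is one plausible reason to keep it quiet.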

[–] Sl00k@programming.dev 1 points 19 hours ago

It also has a very flexible "thinking" nature, which means far, far fewer tokens spent on most people's responses.
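Since generated tokens dominate inference cost, a router that skips long "thinking" traces on easy queries cuts per-query cost sharply. A minimal sketch with invented per-token prices, purely to illustrate the point:

```python
# Illustrative only: response cost is roughly proportional to tokens
# generated, so adaptively omitting a long reasoning trace is cheap.
# The per-token price is a made-up placeholder.

def response_cost(visible_tokens: int, thinking_tokens: int,
                  cost_per_token: float = 1e-5) -> float:
    return (visible_tokens + thinking_tokens) * cost_per_token

casual = response_cost(300, 0)      # router decides no thinking needed
hard = response_cost(300, 5000)     # full reasoning trace

# The hard query costs over 10x the casual one at the same visible length.
print(casual, hard)
```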

[–] PostaL@lemmy.world 27 points 1 day ago (2 children)

And they don't want to disclose the energy efficiency becaaaause ... ?

[–] AnarchistArtificer@slrpnk.net 12 points 1 day ago

Because the AI industry is a bubble that exists to sell more GPUs and drive fossil fuel demand

[–] Hobo@lemmy.world 1 points 1 day ago* (last edited 1 day ago)

Because, uhhh, whoa what's that? ducks behind the podium

[–] RobotZap10000@feddit.nl 20 points 1 day ago* (last edited 1 day ago) (1 children)

They probably wouldn't really care how efficient it is, but they certainly would care that the costs are lower.

[–] Ugurcan@lemmy.world 7 points 1 day ago (1 children)

I’m almost sure they’re keeping that for the Earnings call.

[–] panda_abyss@lemmy.ca 2 points 1 day ago (1 children)

Do they do earnings calls? They’re not public.

[–] Tollana1234567@lemmy.today 1 points 1 day ago

Probably VC money; the investors are going to want some answers.

[–] thatcrow@ttrpg.network 1 points 1 day ago

It warms me heart to see y'all finally tune in to the scumbag tactics our abusers constantly employ.

[–] T156@lemmy.world 2 points 1 day ago

Unless it wasn't as low as they wanted it to be. It's at least cheap enough to run that they can afford to drop the API pricing compared to their older models.