this post was submitted on 26 Aug 2025
430 points (93.7% liked)

Science Memes

[–] Rhaedas@fedia.io -2 points 1 day ago

An LLM isn't imagining anything; it's sorting through the enormous collection of "imaginations" put out by humans to find the best match for "your" imagination. And the power is spent in the training, not in each generation. Lastly, the training yields much more than just that one image you can't stop thinking about, and you'd find the best ones if you could prompt better with your little brain.

I'm curious whose feelings I hurt. The anti-AI crowd? Surely they'd agree with my point that LLMs don't think. Users of LLMs? I hope most of you understand how the tool works. Maybe it's the meme crowd, who just wanted everyone to chuckle and not think about it too much.
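As a rough illustration of the amortization argument above, here is a minimal back-of-envelope sketch; every figure in it is an assumption invented for the example, not a measurement of any real model:

```python
# Back-of-envelope: amortizing a model's one-time training energy
# across all images ever generated with it.
# Both figures below are assumptions for this sketch, not measurements.

TRAINING_ENERGY_KWH = 500_000         # assumed total energy to train the model
LIFETIME_GENERATIONS = 1_000_000_000  # assumed images generated over its lifetime

amortized_wh = TRAINING_ENERGY_KWH * 1_000 / LIFETIME_GENERATIONS
print(f"Amortized training energy: {amortized_wh:.2f} Wh per image")
# -> 0.50 Wh per image under these assumptions
```

Under these made-up numbers the training cost comes out to about half a watt-hour per image; the reply below adds the inference-side cost on top of that.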

[–] DoctorPress@lemmy.zip 3 points 1 day ago

"prompt better" in the context: "Make no mistakes" a truly engineering power!

[–] NewOldGuard@lemmy.ml 1 points 20 hours ago

The training is a huge power sink, but so is inference (i.e., generating the images). You are absolutely spinning up a bunch of silicon that sucks down hundreds of watts for each image that's output, on top of the impact of training the model.
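To put "hundreds of watts" into per-image energy terms, here is a similar sketch with assumed figures (GPU draw and generation time are guesses, not measurements):

```python
# Back-of-envelope: energy drawn by the inference step itself.
# Power draw and generation time are assumed values for this sketch.

GPU_POWER_W = 350       # assumed draw of one accelerator while generating
SECONDS_PER_IMAGE = 5   # assumed wall-clock time per image

inference_wh = GPU_POWER_W * SECONDS_PER_IMAGE / 3600
print(f"Inference energy: {inference_wh:.2f} Wh per image")
# -> 0.49 Wh per image, roughly the same order as the amortized
# training cost sketched earlier, so inference is not negligible
```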
