this post was submitted on 03 Aug 2025

Fuck AI

Source (Bluesky)

[–] KeenFlame@feddit.nu 0 points 6 days ago (1 children)

Where is your source? It sounds unbelievable

[–] ysjet@lemmy.world -1 points 5 days ago (1 children)

Source is the commercial and academic use I've personally seen as an academic-adjacent professional who's had to deal with this sort of stuff at my job.

[–] KeenFlame@feddit.nu 0 points 4 days ago (1 children)

What data did you see on the volume of requests to non-LLM models and how they relate to utility? I can't figure out what profession would have access to that kind of statistic. It would be very useful to know, thanks.

[–] ysjet@lemmy.world 1 points 4 days ago* (last edited 4 days ago) (1 children)

I think you've misunderstood what I was saying: I don't have spreadsheets of statistics on requests to LLM AIs vs non-LLM AIs. What I have is exposure to a significant number of AI users, each running different kinds of AIs, and I see what kind of AI they're using, for what purposes, and how well it works or doesn't.

Generally, LLM-based stuff is really only returning 'useful' results for language-based statistical analysis, which classical NLP handles better, faster, and far more cheaply. For everything else, they don't really seem to be returning useful results; I typically see a LOT of frustration.
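To give a sense of what "better, faster, vastly cheaper" can look like in practice, here is a rough, hypothetical sketch of one classical NLP approach (TF-IDF features plus logistic regression, using scikit-learn); the data, labels, and task are made up for illustration, not anything from the commenter's workplace.

```python
# A rough, hypothetical sketch of a "classical NLP" pipeline of the kind
# contrasted with LLMs above: TF-IDF features + logistic regression.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training set: support-ticket text -> category (entirely fabricated)
texts = [
    "the cluster job failed with an out-of-memory error",
    "please reset my password for the login node",
    "scheduler killed my job after the walltime limit",
    "I cannot log in with my new credentials",
]
labels = ["job-failure", "account", "job-failure", "account"]

# TF-IDF turns each document into a sparse word-weight vector;
# logistic regression then fits a linear classifier over those vectors.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["my job was cancelled when it hit the time limit"]))
# Trains and predicts in milliseconds on a laptop CPU -- no GPU required.
```

A pipeline like this does one narrow job, but it does it cheaply and predictably, which is the contrast being drawn with general-purpose LLMs.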

I'm not about to give out any information that could dox me, but the reason I see so much of this is that I'm professionally adjacent to some supercomputers. As you can imagine, those tend to be useful for AI research :P

[–] KeenFlame@feddit.nu 0 points 3 days ago (1 children)

Ah ok, that's too bad. Supercomputers typically don't have tensor cores though, and most LLM use is presumably client-side use of already-trained models, which desktop or mobile CPUs can manage now, so it will be impossible to know then.

[–] ysjet@lemmy.world 1 points 2 days ago (1 children)

yyyyes they do have tensor cores? Where did you get such an absurd idea from?

[–] KeenFlame@feddit.nu 1 points 1 day ago (1 children)

Supercomputers to me refer to the mainframe-style, room-sized monstrosities that were focused on stuff like calculating ballistic trajectories in the Cold War.

[–] ysjet@lemmy.world 1 points 1 day ago

These days, they're usually racks and racks and racks of specialized rackmount servers with all kinds of hardware, hilarious amounts of RAM, networked storage, tensor cores, etc. stuffed inside, all networked together via fiber optics to run in parallel as one big PC with many CPUs.
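For a feel of the "one big PC with many CPUs" part, here is a rough sketch of the message-passing model typically used on such clusters, using mpi4py purely as one illustrative example (it assumes an MPI installation and a launcher like mpirun; the workload and process counts are made up).

```python
# A rough sketch of many networked nodes acting as one machine, using mpi4py.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # this process's index across all participating nodes
size = comm.Get_size()   # total number of cooperating processes

# Each rank handles its own slice of the work...
local_sum = sum(range(rank * 1_000_000, (rank + 1) * 1_000_000))

# ...and the interconnect combines the partial results into one answer.
total = comm.reduce(local_sum, op=MPI.SUM, root=0)

if rank == 0:
    print(f"{size} ranks computed a combined sum of {total}")

# Launched across the machine with something like: mpirun -n 256 python demo.py
```

Each server runs its own copy of the program, and the fast interconnect is what lets the whole room behave like a single parallel computer.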