this post was submitted on 09 Jun 2025
211 points (98.6% liked)

Fuck AI


[OpenAI CEO Sam] Altman brags about ChatGPT-4.5's improved "emotional intelligence," which he says makes users feel like they're "talking to a thoughtful person." Dario Amodei, the CEO of the AI company Anthropic, argued last year that the next generation of artificial intelligence will be "smarter than a Nobel Prize winner." Demis Hassabis, the CEO of Google's DeepMind, said the goal is to create "models that are able to understand the world around us." These statements betray a conceptual error: Large language models do not, cannot, and will not "understand" anything at all. They are not emotionally intelligent or smart in any meaningful or recognizably human sense of the word. LLMs are impressive probability gadgets that have been fed nearly the entire internet, and produce writing not by thinking but by making statistically informed guesses about which lexical item is likely to follow another.
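To make the "probability gadget" point concrete, here is a toy sketch (a simple bigram model, nowhere near the scale or architecture of ChatGPT, Claude, or Gemini, and purely illustrative) of what producing text by "statistically informed guesses" looks like: each next word is sampled from the words that followed it in the training text, with no understanding anywhere in the loop.

```python
import random
from collections import defaultdict, Counter

# Tiny "training corpus" for the toy model.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_word(prev):
    """Sample the next word in proportion to how often it followed `prev`."""
    counts = follows[prev]
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights)[0]

# "Write" a sentence: no understanding involved, just weighted dice rolls.
word = "the"
out = [word]
for _ in range(6):
    if not follows[word]:  # dead end: this word was never followed by anything
        break
    word = next_word(word)
    out.append(word)
print(" ".join(out))
```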

OP: https://slashdot.org/story/25/06/09/062257/ai-is-not-intelligent-the-atlantic-criticizes-scam-underlying-the-ai-industry

Primary source: https://www.msn.com/en-us/technology/artificial-intelligence/artificial-intelligence-is-not-intelligent/ar-AA1GcZBz

Secondary source: https://bookshop.org/a/12476/9780063418561

[–] Psaldorn@lemmy.world 18 points 1 day ago (2 children)

If you ever tried to use AI for code, you'd know how dumb it is.

Generally it's OK, it'll get some stuff done, but if it thinks something is a certain way you can't convince it otherwise. It hallucinates documentation, admits it made it up, then carries on telling you to use the made-up parts of the code.

Infuriating.

Like I said though, it's generally pretty good at helping you learn a new language if you have some knowledge to start with.

People learning from scratch are cooked; it sometimes makes crazy decisions that compound over time and leave you with trash.

[–] Kolanaki@pawb.social 18 points 1 day ago (1 children)

If you ever tried to use AI for code, you'd know how dumb it is.

If you ever tried using it for anything you're pretty familiar with, you'd know how dumb it is.

That's the only reason I think people still think AI is great: they don't know shit, so they think the AI is giving them good info when it's not.

[–] Pringles@sopuli.xyz 1 points 8 hours ago

I actually started finding use cases for Copilot in Excel. It is still dumb, but it can mass-process data quickly, so if you write your prompt well, it can save you a lot of time.

And that is exactly how "AI" should be used. It can boost productivity if used as the tool that it is.

[–] danielquinn@lemmy.ca 9 points 1 day ago (1 children)

I've actually tried to use these things to learn both Go and Rust (I've been writing Python for 17 years) and the experience was terrible. In both cases, it would generate code that referenced packages that didn't exist, used patterns that aren't used anymore, and didn't even compile. It was wholly useless as a learning tool.

In the end what worked was what always works: I got a book and started on page 1. It was hard, but I started actually learning after a few hours.

[–] Psaldorn@lemmy.world 3 points 1 day ago

I used Gemini for Go and was pleasantly surprised. It might be important to note that I don't ask it to generate a whole thing; it's more like "in Go, how do I ..." and I sort of build up from there myself.

ChatGPT and DeepSeek were a lot more failure-prone.

As an aside, I found Gemini very good at debugging Blender issues, where the UI is very complex and unforgiving and problems are super hard to search for (different versions, similarly named things, etc.).

But as soon as you hit something it will not accept has changed, it's basically useless. Often, though, that got me to the point where I could find forum posts about "where did functionality x move to?"

Just like VR, I think the bubble will burst and it will remain a niche technology that can be fine-tuned for certain professions or situations.

People getting excited about ways for AI to control their PCs are probably going to be in for a bad time.