this post was submitted on 03 May 2025
11 points (70.4% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

founded 1 year ago
[–] BigMikeInAustin@lemmy.world 31 points 6 days ago (4 children)

It isn't actually smart, or thinking. It's just statistics.
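The "just statistics" claim can be made concrete with a toy bigram model: it "predicts" the next word purely from co-occurrence counts, with no understanding involved. This is a deliberately tiny, made-up corpus, not how any real LLM is trained, but the principle (pick the statistically likeliest continuation) is the same.

```python
# Toy illustration of "just statistics": a bigram model picks the next
# word from raw co-occurrence counts in its (made-up) training text.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1  # tally which word follows which

def predict(word):
    # most frequent follower of `word` in the corpus
    return counts[word].most_common(1)[0][0]

print(predict("the"))  # "cat" (follows "the" twice; "mat"/"fish" once each)
```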

[–] metaStatic@kbin.earth 18 points 6 days ago (1 children)

right? AI didn't pass the Turing test, Humans fucking failed it.

[–] technocrit@lemmy.dbzer0.com 2 points 6 days ago (1 children)

Not that the Turing test is meaningful or scientific at all...

[–] metaStatic@kbin.earth 2 points 5 days ago

just like the vast majority of people

[–] Valmond@lemmy.world 3 points 6 days ago

Soon to be built upon sarcastics.

[–] BussyGyatt@feddit.org -4 points 6 days ago (1 children)

a meat brain is also a statistical inference engine.

[–] MotoAsh@lemmy.world 1 points 5 days ago (1 children)

Nah, biology has a ton of systems that all interconnect. Pain feedback itself is a tiny fraction of what makes a real brain tick, and "AI" doesn't have a fraction of an equivalent of even one of those solitary systems.

No, brains are far, far more than statistical inference. Not that they cannot be reproduced, but they are far, far more than math machines.

[–] BussyGyatt@feddit.org 1 points 5 days ago

negative feedback reinforcement systems are one of the key features of machine learning algorithms.

they are far, far more than math machines.

can you be more specific?
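The "negative feedback reinforcement" point above can be sketched in a few lines: gradient descent is a feedback loop where the error signal (the gradient of a loss) feeds back to correct the parameter. This is a minimal illustrative sketch, not any particular model's training code.

```python
# Minimal sketch: gradient descent as a negative-feedback loop.
# The error signal (gradient of the squared loss) pushes the
# parameter back toward the target on every step.
def train(target=3.0, lr=0.1, steps=100):
    w = 0.0  # parameter, starts wrong
    for _ in range(steps):
        error = w - target   # feedback: how far off we are
        grad = 2 * error     # gradient of (w - target) ** 2
        w -= lr * grad       # correction opposes the error
    return w

print(round(train(), 4))  # converges to ~3.0
```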

[–] Ummdustry@sh.itjust.works -3 points 6 days ago (3 children)

I'm not sure why that's a relevant distinction to make here. A statistical model is just as capable of designing (for instance) an atom bomb as a human mind is. If anything, I would much rather the machines destined to supplant me actually could think and have internal worlds of their own, that is far less depressing.

[–] ChairmanMeow@programming.dev 5 points 6 days ago* (last edited 6 days ago)

It's relevant in the sense of its capability of actually becoming smarter. The way these models are set up at the moment puts a mathematical upper limit on what they can achieve. We don't know exactly where that limit lies, but we know that each step forward will take significantly more effort and data than the last.

Without some kind of breakthrough w.r.t. how we model these things (so something other than LLMs), we're not going to see AI intelligence skyrocket.
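The diminishing-returns point is often described with a power law (loss falling as compute raised to a small negative exponent). The constants below are made up for illustration, not measurements from any real model; the shape is what matters: each 1000x increase in compute buys a smaller absolute improvement than the last.

```python
# Illustrative power-law scaling: loss ~ a * compute**(-alpha).
# `a` and `alpha` are invented numbers, chosen only to show the
# diminishing-returns shape, not fit to any real system.
def loss(compute, a=10.0, alpha=0.05):
    return a * compute ** (-alpha)

for c in [1e3, 1e6, 1e9, 1e12]:
    print(f"compute={c:.0e}  loss={loss(c):.3f}")
# Each 1000x of compute improves the loss by less than the previous 1000x.
```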

[–] technocrit@lemmy.dbzer0.com 0 points 6 days ago* (last edited 6 days ago)

A statistical model is just as capable of designing (for instance) an atom bomb as a human mind is.

No. A statistical model is designed by a human mind. It doesn't design anything on its own.

[–] sneezycat@sopuli.xyz 0 points 6 days ago

If it got smarter it could tell you step by step how an AI would take control of the world, but it wouldn't have the consciousness or volition to actually do it.

Humans are the dangerous part of the equation in this case.