They didn't just start calling it AI recently. It's literally the academic term that has been used for almost 70 years.
i mean...by that definition nothing currently in existence deserves to be called "AI".
none of the current systems do anything remotely approaching "perceptual learning, memory organization, and critical reasoning".
they all require pre-processed inputs and/or external inputs for training/learning (so the opposite of perceptual), none of them really do memory organization, and none are capable of critical reasoning.
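to make the "pre-processed inputs" point concrete, here's a toy sketch (the dataset, labels, and encoding are all invented for illustration) of how much human curation happens before any "learning" even starts:

```python
# a "learning" system never sees the world; it sees whatever
# humans have already cleaned, labeled, and encoded for it.

# raw perception would be photons and sound waves; what the model
# actually gets is hand-prepared data like this:
labeled_examples = [
    ("the movie was great", 1),   # label chosen by a human: positive
    ("what a waste of time", 0),  # label chosen by a human: negative
]

# humans also pick the vocabulary and the encoding scheme
vocab = sorted({word for text, _ in labeled_examples for word in text.split()})

def encode(text: str) -> list[int]:
    # bag-of-words counts: yet another human-designed preprocessing step
    words = text.split()
    return [words.count(v) for v in vocab]

# only after all of that curation does "training" even begin
dataset = [(encode(text), label) for text, label in labeled_examples]
print(dataset)
```

nothing in there perceives anything: a human chose the examples, the labels, the vocabulary, and the encoding.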
so OP's original question remains:
why is it called "AI", when it plainly is not?
(my bet is on the faceless suits deciding it makes them money to call everything "AI", even though it's a straight up lie)
Because a bunch of professors defined it like that 70 years ago, before the AI winter set in. Why is that so hard to grasp? Not everything is a conspiracy.
I had a class at uni called AI, and no one thought we were gonna be learning how to make thinking machines. In fact, compared to most of the stuff we did learn to make then, modern AI looks godlike.
Honestly you all sound like the people that snidely complain how it's called "global warming" when it's freezing outside.
just because the marketing idiots keep calling it AI doesn't mean it IS AI.
words have meaning; i hope we agree on that.
what's around nowadays cannot be called AI, because it's not intelligence by any definition.
imagine if you were looking to buy a wheel, and the salesperson sold you a square piece of wood and said:
"this is an artificial wheel! it works exactly like a real wheel! this is the future of wheels! if you spin it in the air it can go much faster!"
would you go:
"oh, wow, i guess i need to reconsider what a wheel is, because that's what the salesperson said is the future!"
or would you go:
"that's idiotic. this obviously isn't a wheel and this guy's a scammer."
if you need to redefine what intelligence is in order to sell a fancy statistical model, then you haven't invented intelligence, you're just lying to people. that's all it is.
the current mess of calling every fancy spreadsheet an "AI" is purely idiots in fancy suits buying shit they don't understand from other fancy suits exploiting that ignorance.
there is no conspiracy here, because it doesn't require a conspiracy; only idiocy.
p.s.: you're not the only one here with university credentials...i don't really want to bring those up, because it feels like devolving into a dick measuring contest. let's just say I've done programming on industrial ML systems during my bachelor's, and leave it at that.
These arguments are so tired and so cyclical that AI researchers coined a name for them decades ago: the AI effect. Or, more succinctly: "AI is whatever hasn't been done yet."
i looked it over and ... holy mother of strawman.
that's so NOT related to what I've been saying at all.
i never said anything about the advances in AI, or how it's not really AI because it's just a computer program, or anything of the sort.
my entire argument is that the definition you are using for intelligence, artificial or otherwise, is wrong.
my argument isn't even related to algorithms, programs, or machines.
what these tools do is not intelligence: it's mimicry.
that's the correct word for what these systems are capable of. mimicry.
intelligence has properties that are simply not exhibited by these systems, THAT'S why it's not AI.
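here's a deliberately tiny illustration of what i mean by mimicry (a toy bigram sampler i just made up; real LLMs are vastly more sophisticated neural networks, but the core move of "predict the next token from observed statistics" is the same):

```python
import random
from collections import defaultdict

# toy "language model": pure frequency counting over a corpus.
corpus = "the cat sat on the mat and the cat ate the fish".split()

# record which word follows which in the training text
following = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev].append(nxt)

def mimic(word: str, length: int = 6) -> str:
    # generate "new" text by replaying the statistics of the old text
    out = [word]
    for _ in range(length):
        if word not in following:
            break
        word = random.choice(following[word])
        out.append(word)
    return " ".join(out)

print(mimic("the"))  # fluent-looking output, zero comprehension behind it
```

it reproduces the shape of its input without anything you could call understanding in the loop. scale that idea up by a few billion parameters and you get fluency, not intelligence.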
call it what it is, not what it could become, might become, will become. because that's what the wiki article you linked bases its arguments on: future development, instead of current achievement, which is an incredibly shitty argument.
the wiki talks about people using shifting goal posts in order to "dismiss the advances in AI development", but that's not what this is. i haven't changed what intelligence means; you did! you moved the goal posts!
I'm not denying progress, I'm denying the claim that the goal has been reached!
that's an entirely different argument!
all of the current systems, ML, LLM, DNN, etc., exhibit a massive advancement in computational statistics, and possibly, eventually, in AI.
calling what we have currently AI is wrong, by definition; it's like saying a single neuron is a brain, or that a drop of water is an ocean!
just because two things share some characteristics, some traits, or because one is a subset of the other, doesn't mean that they are the exact same thing! that's ridiculous!
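to put a number on the neuron analogy: a single artificial "neuron" is literally this much code (the inputs and weights below are made-up values, just to show the scale):

```python
# an entire artificial "neuron": a weighted sum plus a threshold.
def neuron(inputs: list[float], weights: list[float], bias: float) -> int:
    activation = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if activation > 0 else 0

print(neuron([0.5, -1.0, 0.2], [0.4, 0.7, -0.3], bias=0.1))  # -> 0 (doesn't fire)
```

stack millions of those and you get something genuinely impressive, but "contains neurons" still doesn't make it a brain, any more than a drop of water is an ocean.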
the definition of AI hasn't changed, people like you have simply dismissed it because its meaning has been eroded by people trying to sell you their products. that's not ME moving goal posts, it's you.
you said a definition from 70 years ago is "old" and therefore irrelevant, but that's a laughably weak argument for anything, and it's even weaker in a scientific context.
is the Pythagorean Theorem suddenly wrong because it's ~2500 years old?
ridiculous.