I remember the term AI being in use long before the current wave of LLMs. When I was a child, it described the code behind the behaviour of NPCs in computer games, and I think that usage persists today. So no, I don't get agitated when I hear it, and I don't think it's a marketing buzzword invented by capitalist a-holes. I do think that using "intelligence" in AI is far too generous, whichever context it's used in, but we needed some word to describe computers pretending to think, and someone, a long time ago, came up with "artificial intelligence".
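The game "AI" mentioned above is a good illustration: it is usually nothing more than a handful of hand-written rules, a small state machine, and nobody blinks at calling it AI. A toy sketch (behaviour thresholds and names invented for illustration):

```python
# Toy sketch of classic game "AI": an NPC picking a behaviour from
# hand-tuned rules. No learning, no reasoning -- yet we've always
# called this AI.

def npc_behaviour(player_distance: float) -> str:
    """Choose the NPC's behaviour from simple distance thresholds."""
    if player_distance < 2.0:
        return "attack"
    if player_distance < 10.0:
        return "chase"
    return "patrol"

for distance in [15.0, 8.0, 1.5]:
    print(distance, "->", npc_behaviour(distance))
```

Decades of game development shipped under exactly this definition of the word.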
Ask Lemmy
A Fediverse community for open-ended, thought-provoking questions
Thank you for reminding me about NPCs; we have indeed been calling them AI for years, even though they are not capable of reasoning on their own. Perhaps we need a new term, e.g. AC (Artificial Consciousness), which does not exist yet.

The term AI still agitates me though, since most of these systems are not intelligent. For example, earlier this week I saw a post on Lemmy where an LLM suggested that a user uninstall a package, which would definitely have broken his Linux distro. Or my co-workers, who fed development questions I had to the LLMs they use, which have yet to generate anything useful for me / anything that actually works.

To me it feels like they are pushing their bad beta products on us, in the hopes that we pay to use them, so they can use our feedback to improve them. To me they feel neither intelligent nor conscious.
You are misunderstanding what AI means, probably due to its overuse in pop culture. What you are thinking of is a subcategory of AI. It goes: AI > Machine Learning > Artificial Life
Stop down voting me I'm right.
When I was doing my applied math PhD, the vast majority of people in my discipline used "machine learning", "statistical learning", or "deep learning", but almost never "AI" (at least not in a paper or at a conference). Once I finished my PhD and took my first quant job at a bank, management insisted that I use the word AI more in my communications. I make a neural network that simply interpolates between prices? That's AI.
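To make the absurdity concrete: the "AI" in that slide deck can be as mundane as drawing a line through known quotes. A minimal sketch of price interpolation (quoted strikes and prices are made-up numbers; a real model would be fancier, but the rebranding works the same way):

```python
# Plain linear interpolation between quoted prices -- the kind of
# routine numerics that gets rebranded as "AI" for shareholders.

def interpolate_price(known: dict, strike: float) -> float:
    """Interpolate a price at `strike` from the surrounding quotes."""
    xs = sorted(known)
    if strike <= xs[0]:
        return known[xs[0]]
    if strike >= xs[-1]:
        return known[xs[-1]]
    for lo, hi in zip(xs, xs[1:]):
        if lo <= strike <= hi:
            w = (strike - lo) / (hi - lo)
            return known[lo] * (1 - w) + known[hi] * w

quotes = {90.0: 12.5, 100.0: 7.0, 110.0: 3.2}
print(interpolate_price(quotes, 105.0))  # midway between 7.0 and 3.2
```

Swap the linear weights for a tiny neural net and nothing about the business use changes, only the word on the slide.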
The point is that top management and shareholders don't want the accurate terminology, they want to hear that you're implementing AI and that the company is investing in it, because that's what pumps the company's stock as long as we're in the current AI bubble.
People keep saying this, but AI has been used for subroutines nowhere near actual artificial intelligence for at LEAST as long as video games have existed.
Much much longer than that. The term has been used since AI began as a field of study in the 50s. And it's never referred to human level intelligence. Sure, that was the goal, but all of the different sub branches of AI are still AI. Whether it's expert systems, LLMs, decision trees, etc, etc, etc. AI is a broad term that covers the entire spectrum, and always has been. People that complain about it just want AI to only refer to AGI, which already has a term. AGI.
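The expert systems named above make the point well: an expert system in the classic tradition is just hand-coded rules, no learning at all, and it sits squarely under the AI umbrella in every textbook. A toy sketch (the rules are invented for illustration, not medical advice):

```python
# Toy "expert system": hard-coded if/then rules, zero learning.
# This is textbook AI, and has been since the field's early decades.

def diagnose(symptoms: set) -> str:
    """Apply hand-written rules to a set of observed symptoms."""
    if {"fever", "cough"} <= symptoms:
        return "possible flu"
    if "rash" in symptoms:
        return "possible allergy"
    return "no rule matched"

print(diagnose({"fever", "cough"}))
```

If a nest of if-statements has always counted as AI, insisting the word can only mean human-level minds is rewriting history.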
I'm willing to bet a good 60%+ are complaining about the word now because they're regurgitating anti-AI talking points they've heard and think sound good.
My coworker just gave me this rant the other day about AI.
Of course we have “real” AI. We can literally be surprised while talking to these things.
People who claim it’s not general AI consistently, 100% of the time, fail to answer this question: what can a human mind do that these cannot?
In precise terms. You say “a human mind can understand” then I need a precise technical definition of “understand”. Because the people making this claim that “it’s not general AI” are always trying to wave their own flag of technical expertise. So, in technical terms, what can a general AI do, that an LLM cannot?
Go and tell your LLM to click a button, or log into your Amazon account, or send an email, or do literally anything that's an action. I'm waiting.
A 4 year old has more agency than your "AI" nowadays. LLMs are awesome at spitting out text, but they aren't true AI.
Edit: I should add, LLMs only work with input. If there's no input there is no output. So whatever you put in there, it will just sit there forever doing nothing until you give it an input again. It's much closer to a mathematical function than any kind of intelligence that has its own motivation and can act on its own.
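The "function, not agent" point above can be made concrete: stripped of its serving infrastructure, inference is just a mapping from input text to output text, and nothing runs until something calls it. A toy stand-in (`fake_llm` is obviously not a real model, just the shape of one):

```python
# An LLM with no input produces no output: it is a mapping, not an
# agent. `fake_llm` stands in for real inference; between calls it
# does nothing, wants nothing, and has no state of its own.

def fake_llm(prompt: str) -> str:
    """Pretend model: a plain function from prompt to completion."""
    return f"echo: {prompt.upper()}"

# Output exists only per call; there is no loop running on its own.
print(fake_llm("hello"))  # echo: HELLO
```

Any "agency" you see in deployed systems is an outer loop someone wrote around this function, not something inside it.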
send an email
ChatGPT can explain to me what to do in a CLI to send an e-mail. Give it access to a CLI and an internet connection and it will be able to do it itself.
Which again is literally just text and nothing more.
No matter how sophisticated ChatGPT gets, it will never be able to send the email itself. Of course you could pipe the output of ChatGPT into a CLI, then tell ChatGPT to only write bash commands (or whatever you use) with every single detail involved, and then it could possibly send an email (if you're lucky and it only emits valid commands and literally no other text in the output).
But you can never just tell it: Send an email about x, here is my login and password, send it to whatever@email.com with the subject y.
Not going to work.
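For what it's worth, the piping described in this exchange is only a few lines of glue; the real problems are the reliability and safety caveats above. A sketch with the model call stubbed out (`ask_model` is hypothetical; running untrusted generated commands like this is exactly as dangerous as it looks):

```python
import subprocess

def ask_model(prompt: str) -> str:
    """Stub standing in for an LLM call that returns a shell command."""
    return "echo sent"  # a real model's output would be far less predictable

command = ask_model("Write one shell command that sends an email.")
# Piping model text straight into a shell: trivial glue, but it only
# works if the model emits a valid command and nothing else.
result = subprocess.run(command, shell=True, capture_output=True, text=True)
print(result.stdout.strip())
```

So the disagreement is really about whether "needs glue code" means "can't do it".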
All it lacks is an API that allows it to send commands. This is not a limitation of its intelligence, if it "knows" when to put text in a bash codebox, it will know when to send an API call.
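That "knows when to send an API call" step is exactly what tool-calling glue does: scan the model's output for a structured action and dispatch it, otherwise treat it as plain text. A minimal dispatch loop (the JSON format and the `send_email` tool are invented for illustration, not any vendor's actual API):

```python
import json

def dispatch(model_output: str, tools: dict) -> str:
    """If the model emitted a JSON tool call, run it; else return text."""
    try:
        call = json.loads(model_output)
        return tools[call["tool"]](**call["args"])
    except (json.JSONDecodeError, KeyError, TypeError):
        return model_output  # plain text, nothing to execute

tools = {"send_email": lambda to, subject: f"sent to {to}: {subject}"}
print(dispatch('{"tool": "send_email", "args": {"to": "a@b.c", "subject": "hi"}}', tools))
print(dispatch("just some prose", tools))
```

The model only ever produces text; whether that text becomes an action is decided entirely by the harness around it.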
Ask your brain to click a button; it cannot either. All it does is send and receive electric signals. Fortunately, it is surrounded by a body that reacts to those signals.
No, it's just a buzzword, just saw a joke today that AI means "absent indian".
Not even driverless cars are actually driverless: https://www.jwz.org/blog/2024/01/driverless-cars-always-have-a-driver/
You are not alone; it annoys me to no end, and I keep correcting & explaining to people who have no clue how computers and LLMs work.
NFT
Humans possess an esoteric ability to create new ideas out of nowhere, never before thought of. Humans are also capable of inspiration, which may appear similar to the way that AIs remix old inputs into "new" outputs, but the rules of creativity aren't bound by any set parameters the way an LLM is. I'm going to risk making a comment that ages like milk and just spitball: true artificial intelligence that matches a human is impossible.
We do have A.I. The Turing test is there for a reason. We just don't have what movies told us A.I. would be like. Corporations don't need an A.I. that can think for itself to replace you. In fact, that's one of the reasons to replace you.
Title: Unpopular Opinion: The Term "AI" is Just a Marketing Buzzword!
Hey fellow Redditors, let's talk about the elephant in the room: AI. 🤖💬
I can't be the only one feeling a bit agitated by how the term "Artificial Intelligence" gets thrown around, right? Real AI seems like a distant dream, and what we have right now are these Large Language Models (LLMs). They're good at passing Turing tests, but let's be real – they're not thinking on their own.
Am I the only one who thinks "AI" is just a fancy label created by those rich, capitalistic individuals already knee-deep in LLM stocks? It feels like a slick way to boost investments and make us believe these machines are more intelligent than they really are. Thoughts? 🔍🧠💭