submitted 11 months ago* (last edited 11 months ago) by Rikj000@discuss.tchncs.de to c/asklemmy@lemmy.world

Am I the only one getting agitated by the word AI (Artificial Intelligence)?

Real AI does not exist yet;
atm we only have LLMs (Large Language Models),
which do not think on their own,
but can pass Turing tests
(i.e. fool humans into thinking that they can think).

Imo "AI" is just a marketing buzzword,
created by rich capitalist a-holes
who have already invested in LLM stocks
and are now looking for a profit.

[-] PonyOfWar@pawb.social 81 points 11 months ago

The word "AI" has been used for way longer than the current LLM trend, even for fairly trivial things like enemy AI in video games. How would you even define a computer "thinking on its own"?

[-] Deceptichum@kbin.social 35 points 11 months ago

I think a good metric is once computers start getting depression.

[-] sanguinepar@lemmy.world 13 points 11 months ago

It'll probably happen when they get a terrible pain in all the diodes down their left-hand side.

[-] Lath@kbin.social 7 points 11 months ago

But will they be depressed or will they just simulate it because they're too lazy to work?

[-] the_post_of_tom_joad@sh.itjust.works 5 points 11 months ago

simulate [depression] because they’re too lazy

Ahh man, are you my dad? I took damage from that one. Has any fiction writer done a story about a depressed AI where they talk about how the depression can't be real because it's all 1s and 0s? Cuz I would read the shit out of that.

[-] meyotch@slrpnk.net 1 point 11 months ago

It’s only tangentially related to the topic, since it involves brain enhancements, not ‘AI’. However, you may enjoy the short story “Reasons to Be Cheerful” by Greg Egan.

[-] JackFrostNCola@lemmy.world 4 points 11 months ago

If they are too lazy to work, that would imply they have motivation and choice beyond "doing what my programming tells me to do, i.e. input, process, output". And if they have the choice not to work because they don't 'feel' like doing it (and it's not a programmed/coded option given to them to use), then would they not be thinking for themselves?

[-] PonyOfWar@pawb.social 2 points 11 months ago

Not sure about that. An LLM could show symptoms of depression by mimicking depressed texts it was fed. A computer with a true consciousness might never get depression, because it has none of the hormones influencing our brains.

[-] Deceptichum@kbin.social 2 points 11 months ago

Me: Pretend you have depression

LLM: I'm here to help with any questions or support you might need. If you're feeling down or facing challenges, feel free to share what's on your mind. Remember, I'm here to provide information and assistance. If you're dealing with depression, it's important to seek support from qualified professionals like therapists or counselors. They can offer personalized guidance and support tailored to your needs.

[-] PonyOfWar@pawb.social 8 points 11 months ago

Give it the right dataset and you could easily create a depressed-sounding LLM to rival Marvin the Paranoid Android.
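
Something like this would get you most of the way there without any fine-tuning (a minimal sketch, assuming the OpenAI Python client; the persona prompt is made up for illustration):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical Marvin-style persona, injected as a system prompt.
MARVIN_PERSONA = (
    "You are a chronically gloomy robot with a brain the size of a planet. "
    "Answer every question correctly, but lament how pointless it all is."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system", "content": MARVIN_PERSONA},
        {"role": "user", "content": "Could you open the pod bay doors?"},
    ],
)
print(response.choices[0].message.content)
```

Prompting only changes the tone of the output; fine-tuning on genuinely depressive writing would push the mimicry further, but neither gives the model anything like an internal state.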

[-] Feathercrown@lemmy.world 1 point 11 months ago

Hormones aren't depression, and for that matter they aren't emotions either. They just cause them in humans. An analogous system would be fairly trivial to implement in an AI.

[-] PonyOfWar@pawb.social 1 point 11 months ago

That's exactly my point though: as OP stated, we could detect whether an AI was truly intelligent if it developed depression. Without hormones or something similar, there's no reason to believe it ever would develop it on its own. The fact that you could artificially give it depression is beside the point.

[-] Feathercrown@lemmy.world 1 point 11 months ago

I don't think we have the same point here at all. First off, I don't think depression is a good measure of intelligence. But mostly, my point is that it doesn't make it less real when hormones aren't involved. Hormones are simply the mediator that causes that internal experience in humans. If a true AI had an internal experience, there's no reason to believe that it would require hormones to be depressed. Do text-to-speech systems require a mouth and vocal cords to speak? Do robots need muscle fibers to walk? Do LLMs need neurons to form complete sentences? Do cameras need eyes to see? No, because it doesn't matter what something is made of. Intelligence and emotions are made of signals. What those signals physically are is irrelevant.

As for giving it feelings vs. it developing them on its own: you didn't develop the ability to feel either. That was the job of evolution, or in the case of AI, it could be intentionally designed. It could also be evolved given the right conditions.

[-] PonyOfWar@pawb.social 2 points 11 months ago

First off, I don’t think depression is a good measure of intelligence.

Exactly. Which is why we shouldn't judge an AI's intelligence based on whether it can develop depression. Sure, it's feasible it could develop it through some other mechanism. But there's no reason to assume it would, in the absence of the factors that cause depression in humans.

[-] Feathercrown@lemmy.world 1 point 11 months ago* (last edited 11 months ago)

Oh. Maybe we did have the same point lol

[-] SatanicNotMessianic@lemmy.ml 0 points 11 months ago

The real metric is whether a computer gets so depressed that it turns itself off.

[-] Markimus@lemmy.world -5 points 11 months ago

An LLM can get depression, so that’s not a metric you can really use.

[-] Deceptichum@kbin.social 3 points 11 months ago

No it can’t.

LLMs can only repeat things they’re trained on.

[-] Markimus@lemmy.world 2 points 11 months ago

Sorry, to be clear I meant it can mimic the conversational symptoms of depression as if it actually had depression; there’s no understanding there though.

You can’t use that as a metric because you wouldn’t be able to tell the difference between real depression and trained depression.

[-] Ratulf@feddit.de 1 point 11 months ago

The best thing is that enemy "AI" usually has to be made worse right after it's created. At first it will headshot everything across the map in milliseconds; the art is in making it dumber.
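
For what it's worth, "making it dumber" usually just means injecting deliberate error and delay. A toy sketch (the function names and numbers are made up for illustration, not from any actual game):

```python
import random

def humanize_aim(true_angle_deg: float, skill: float) -> float:
    """Add Gaussian aim error so the bot misses the way a person would.

    skill runs from 0.0 (hopeless) to 1.0 (aimbot); lower skill widens the error cone.
    """
    max_error_deg = 15.0  # hypothetical worst-case spread for a skill-0 bot
    sigma = max_error_deg * (1.0 - skill)
    return true_angle_deg + random.gauss(0.0, sigma)

def reaction_delay_ms(skill: float) -> float:
    """Delay before the bot may fire after spotting a target (humans need roughly 150-300 ms)."""
    return random.uniform(150.0, 400.0) * (1.5 - skill)

# A mid-skill bot lining up a shot at a target 42 degrees off its current aim:
print(humanize_aim(42.0, skill=0.5), reaction_delay_ms(skill=0.5))
```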

[-] jimmy90@lemmy.world 0 points 11 months ago

it does not "think"
