[-] nickwitha_k@lemmy.sdf.org 4 points 1 month ago

Current mainstream AI has no possible path to AGI. I am supportive of AGI to make the known universe less lonely but LLMs ain't it.

[-] MossyFeathers@pawb.social 2 points 1 month ago

Okay, and? What are you trying to say?

[-] nickwitha_k@lemmy.sdf.org 3 points 1 month ago

There's a vocal group of people who seem to think that LLMs can achieve consciousness, despite the fact that it is not possible given how LLMs fundamentally work. They have largely been duped by advanced LLMs' ability to sound convincing (as well as by a certain conman executive officer). These people often also seem to believe that by dedicating more and more resources to running these models, they will achieve actual general intelligence, and that an AGI can save the world, relieving them of the responsibility to attempt to fix anything.

That's my point. AGI isn't going to save us and LLMs (by themselves), regardless of how much energy is pumped into them, will not ever achieve actual intelligence.

[-] MossyFeathers@pawb.social 2 points 1 month ago

But an AGI isn't an LLM. That's what's confusing me about your statement. If anything I feel like I already covered that, so I'm not sure why you're telling me this. There's no reason why you can't recreate the human brain on silicon, and eventually someone's gonna do it. Maybe it's one of our current companies, maybe it's a future company. Who knows. Except that a true AGI would turn everything upside down and inside out.

[-] nickwitha_k@lemmy.sdf.org 2 points 1 month ago

I think, possibly, my tired brain at the time thought that you are implying LLM -> AGI. And I do agree that there's no reason, beyond time and available technology, that a model of a brain cannot be made. I would question whether digital computers are capable of accurately simulating neurons, at least without requiring more components (more bits of resolution).
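
A minimal sketch of the resolution concern (my own illustration, not from the thread): even a toy leaky integrate-and-fire neuron, stepped with Euler integration, discretizes both time and state, and a coarser time step can change the spike count the simulation produces. All names and parameter values here are made up for the example.

```python
def simulate_lif(input_current=1.5, dt=0.1, t_max=100.0,
                 tau=10.0, v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Count spikes from a leaky integrate-and-fire neuron over t_max ms.

    The continuous dynamics dv/dt = (-(v - v_rest) + I) / tau are
    approximated with discrete Euler steps of size dt, using finite-
    precision floats for the membrane potential v.
    """
    v = v_rest
    spikes = 0
    for _ in range(int(t_max / dt)):
        # Euler step: the smaller dt is, the closer this tracks the
        # continuous equation -- at the cost of more computation.
        v += dt * (-(v - v_rest) + input_current) / tau
        if v >= v_thresh:
            spikes += 1
            v = v_reset  # fire and reset
    return spikes

# Same neuron, same duration; only the temporal resolution differs.
fine = simulate_lif(dt=0.1)
coarse = simulate_lif(dt=2.0)
```

Real neurons are far messier than this toy model (ion channels, dendritic geometry, neuromodulators), so the resolution question only gets harder from here.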

For full disclosure, I am supportive of increasing the types of sentience in the known universe. Though, not at the expense of biosphere habitability. Whether electronic or biological, sharing the world with more types of sentients would make it a more interesting place.

Except that a true AGI would turn everything upside down and inside out.

Very likely. Especially if "human rights" aren't pre-emptively extended to cover non-human sentients. But, the existence of AGI, alone, is not likely to cause either doomsday or save us from it, which seem to be the most popularly envisaged scenarios.

[-] MossyFeathers@pawb.social 1 points 1 month ago

I think, possibly, my tired brain at the time thought that you are implying LLM -> AGI.

Ah, okay. I've been there lol. I hope I didn't come off as confrontational, I was very confused and concerned that I had badly explained myself. My apologies if I did.

[-] nickwitha_k@lemmy.sdf.org 2 points 1 month ago

No, you're good. I hope that I didn't come off as aggressive, myself.

this post was submitted on 05 Oct 2024
494 points (89.6% liked)

solarpunk memes
