What's your such opinion (discuss.tchncs.de)
submitted 11 months ago by cryptix@discuss.tchncs.de to c/asklemmy@lemmy.ml
[-] KeenFlame@feddit.nu -2 points 11 months ago

The machine learning models and developments of these last years, called "AI" for some reason, are as big as, if not bigger than, the IT and internet revolution, and have applications across a broader spectrum than anyone can currently imagine.

[-] soggy_kitty@sopuli.xyz 7 points 11 months ago

This theory is not nearly rare enough to match the image

[-] KeenFlame@feddit.nu 0 points 11 months ago

As you can see I get downvoted and hated for it so

[-] soggy_kitty@sopuli.xyz 1 points 11 months ago* (last edited 11 months ago)

Don't get it confused. Downvotes are not a measure of hatred. People upvote and downvote depending on whether they agree, and even that measure does not represent the "correctness" of a comment.

Honestly, try to ignore upvote scores; people who obsess over them and want to get upvoted are sheep who repeat popular phrases for internet points.

[-] KeenFlame@feddit.nu 0 points 10 months ago

I am not confused

I am very certain I receive hate for it

[-] MayonnaiseArch@beehaw.org -1 points 11 months ago

It doesn't help that it's dumb as shit

[-] bipmi@beehaw.org 5 points 11 months ago

That is such a cold take. People say this all the time. I've literally seen and heard people compare this to the industrial revolution before.

[-] KeenFlame@feddit.nu 1 points 11 months ago

As you can see, the opposite is true here

[-] KeenFlame@feddit.nu 1 points 11 months ago* (last edited 11 months ago)

Idiotic to downvote AND hate my opinion in a place where we should post hot takes, then say it's not a hot take. And still nobody agrees.

[-] BudgetBandit@sh.itjust.works 1 points 11 months ago

IMHO as long as no new random "neurons" form, it’s not AI as in Artificial Intelligence, just "a lot of ifs"

[-] 31337@sh.itjust.works 1 points 11 months ago

I think the human brain works kind of the opposite of that. Babies are born with a shitload of neural connections, then the connections decrease over a person's lifetime. ANNs typically do something similar to that while training (many connection weights will be pushed toward zero, having little or no effect).
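The pruning effect described above can be sketched with a toy example. L1-style regularization ("soft thresholding") is one common way training pushes small connection weights to exactly zero while merely shrinking large ones; everything here is illustrative, not code from any real training framework.

```python
# Hypothetical sketch of L1-style weight pruning: each update step
# shrinks every weight toward zero by a fixed amount (soft
# thresholding). Small weights collapse to exactly 0.0, mimicking the
# connection thinning described above; large weights survive, shrunken.

def soft_threshold(w, lam):
    """Proximal step for L1 regularization: shrink w toward zero by lam."""
    if w > lam:
        return w - lam
    if w < -lam:
        return w + lam
    return 0.0  # weights inside [-lam, lam] are pruned to exactly zero

weights = [0.9, -0.05, 0.3, 0.02, -0.6]
for step in range(10):
    weights = [soft_threshold(w, 0.04) for w in weights]

print(weights)  # the three small weights are now exactly 0.0
```

After ten steps only the two large weights remain nonzero, which is the "many connection weights pushed toward zero" behavior in miniature.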

But yeah, these LLMs are typically trained once, and frozen during use. "Online learning" is a type of training that continually learns, but current online methods typically lead to worse models (ANNs "forget" old things they've "learned" when learning new things).
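The "forgetting" problem mentioned above can be shown with the smallest possible model: a single linear weight trained by SGD on one task, then trained further on a conflicting task. This is a deliberately toy illustration of catastrophic forgetting, not any real online-learning method.

```python
# Hypothetical toy demo of catastrophic forgetting: a one-weight linear
# model learns task A (y = 2x), then continues training on task B
# (y = -2x). The weight moves to fit task B, so its error on task A
# grows sharply -- the model "forgets" what it learned first.

def sgd_fit(w, data, lr=0.1, epochs=200):
    """Minimise squared error (w*x - y)^2 by stochastic gradient descent."""
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x
            w -= lr * grad
    return w

def task_error(w, data):
    return sum((w * x - y) ** 2 for x, y in data)

task_a = [(1.0, 2.0), (2.0, 4.0)]    # task A: y = 2x
task_b = [(1.0, -2.0), (2.0, -4.0)]  # task B: y = -2x

w = sgd_fit(0.0, task_a)             # learn task A: w converges to ~2
err_a_before = task_error(w, task_a)
w = sgd_fit(w, task_b)               # keep training, now on task B: w -> ~-2
err_a_after = task_error(w, task_a)

print(err_a_before, err_a_after)     # task A error: ~0 before, large after
```

Real continual-learning methods try to limit exactly this drift; without them, the later task simply overwrites the earlier one.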

this post was submitted on 07 Dec 2023
538 points (87.7% liked)

Asklemmy


A loosely moderated place to ask open-ended questions
