Old comic, more relevant than ever
(www.smbc-comics.com)
Machine learning is a subset of artificial intelligence, so I don't see anything wrong here. The character's using a more generic term when talking to a layperson.
I think the point that they're making is that they used the latest buzz word for the people dishing out the dough.
Yes, and I'm saying there's nothing wrong with that "buzz word." It's accurate, just more generic.
I see a lot of people these days objecting that LLMs and whatnot "aren't really artificial intelligence," because they're operating from the definition of artificial intelligence they got from science fiction TV shows, where it's not AI unless it replicates or exceeds human intelligence in all meaningful ways. The term has been in wide use in computer science for 70 years, though, covering a broad range of subjects. Machine learning is clearly within that range.
There's a distinction between "narrow AI" and "artificial general intelligence" (AGI).
AGI is that sci-fi AI, whereas narrow AI is only intelligent within one task, like a pocket calculator or a robot arm or an LLM.
And as you point out, saying that you're doing narrow AI isn't interesting. So I think it's fair enough that people assume that when "AI" is used as a buzzword, it doesn't mean the pocket-calculator kind.
Not to mention that e.g. OpenAI explicitly states that they're working towards AGI.
If I built a robot pigeon that can fly, scavenge for crumbs, sing mating calls, and approximate sex with other pigeons, is that an AGI? It can't read or write or talk or compose music or draw or paint or do math or use the scientific method or debate philosophy. But it can do everything a pigeon can. Is it general or not? And if it's not, what makes human intelligence general in a way that pigeon intelligence isn't?