this post was submitted on 12 Jul 2023
16 points (63.8% liked)
World News
And monkeys could fly out of my ass. Before we start hand-wringing about AI, someone would probably need to actually invent one. We're probably closer to room-temperature fusion at this point than we are to an actual general-purpose AI.
Instead of wasting time worrying about a thing that doesn't even exist and probably won't in any of our lifetimes, we should probably do something about the things actually killing us, like global warming and unchecked corporate greed.
Exactly. There was an article floating around just a couple of days ago which, from what I recall, said that billionaires were funding these AI-scare studies at top universities, I presume to distract the public from the very real and near threats of climate disaster, economic inequality, etc. Here, unfortunately paywalled: https://www.washingtonpost.com/technology/2023/07/05/ai-apocalypse-college-students/
There is a concept called "criti-hype". It's a type of marketing masquerading as criticism. "Careful, AI might become too powerful" is exactly that.
A lot of the folks worried about AI x-risk are also worried about climate, and pandemics, and lots of other things too. It's not like there's only one threat.
It's all about risk: if you worry about being run over, OK, that's reasonable, but if you worry about shark attacks when you live in a forest, it's ludicrous and a waste of time.
@fubo @xapr I don't doubt that, but that raises the question of whether the unrealistic concerns raised by those folks outweigh the realistic ones that need more actual attention and funding. For example, how much money are the billionaires and top elites putting in to solve climate change and past/future pandemics compared to studying AI-driven doom? I don't know the answer, and I welcome you to find out.
Right, the attention being paid to AI risk just seems vastly disproportionate compared to other much more serious imminent threats.
@xapr @orclev here’s an archive link for the article for those who want it https://archive.is/K1TUL
I absolutely hate this craze. Most of the questions I get about AI just make me facepalm, because everyone is feeding off each other with these absurd things that could hypothetically happen. Clearly that's because actually explaining it doesn't generate clicks or controversy.
Solving real problems is hard because if it wasn't they would be solved already, but making up fake problems is really easy.
Amen. The "AI" everyone is freaking out about is good at a narrow range of things, but either dumb as shit or completely incapable otherwise.
And the unnecessary cruelty @orclev@lemmy.world puts poor monkeys through.
Come on man let the poor things out. No matter what they did to you. They don't deserve that.