this post was submitted on 14 Apr 2024
28 points (100.0% liked)
Is artificial intelligence the great filter that makes advanced technical civilisations rare in the universe?
This professor is arguing we need to regulate AI because we haven't found any space aliens yet, and the most conceivable explanation is that they all wiped themselves out with killer AIs.
And it hits some of the greatest hits:
Zero mentions of global warming, of course.
I kinda want to think that the author has just been reading some weird ideas. At least he put himself out there and wrote a paper with human sentences! The paper is all aboard the AI hype train for sure and constantly makes huge logical leaps, but it somehow doesn't make me feel as skeezy as some of the other stuff on here.
I hate that you can't mention the Fermi paradox anymore without someone throwing AI into the mix. There are so many more interesting discussions to have about this than the idea that we're all gonna be paperclipped by some future iteration of spicy autocomplete.
But what's even worse is that those munted dickheads will then claim to have also found the solution to the Fermi paradox, which is, of course, to give them more money so they can make their shitty products ~~even worse~~ safer.
Also:
Somehow Clippy 9000, which is clever enough to outsmart the entirety of the human race because it's playing 4D chess with multiverse time travel, is at the same time too stupid to come up with any plan that doesn't kill itself in the end?
Yeah, the Fermi paradox really doesn't work here; an AI that was motivated and smart enough to wipe out humanity would be unlikely to just immediately off itself. Most of the doomerism relies on "tile the universe" scenarios, which would be extremely noticeable.