this post was submitted on 14 Oct 2025
I agree with both this guy and Yudkowsky.
The first half of the video is great. Completely agree, stop the harm now. Stop the CEOs, stop the naive techno-optimists.
The second half has a lot of strawmen. Let me give you an example:
He takes the 2027 report's solar-powered robot maintenance crew (which is very futuristic and far-fetched), then in the next statement takes the "a blade of grass is a self-replicating factory" line and acts as if Yudkowsky says the grass maintains the machines.
Yudkowsky's argument is that if an AI wanted to kill us, it wouldn't need the robots. Biology perfected self-replication long ago. Make some humans do your CRISPR gene tech for you (thanks, gig economy). Engineering a stronger mosquito with a neurotoxin is not something I understand to be impossible. Our biology is very capable of folding all kinds of proteins if given the correct instructions. You don't need complex robot arms to (accidentally) kill humans. Our society is extremely fragile in thousands of ways.
If you manage to kill specific insects, we die. If you accelerate algae growth enough, we die. If you get enough carcinogens into the food chain, we die. If you engineer super-resistant bacteria, we die. If you rapidly deplete the ozone layer, we die. If you disable the Earth's magnetic field, we die.
The point is: if you build something that is smarter than you (in any way) and that has some form of agency, it will outsmart you and beat you. Don't build it.
This won't work, because humans have a perverse, morbid curiosity about all sorts of things. I'd wager that if it can be built, it will be built.
Dude! I figured out a way to build the torment nexus at a third of the cost!