I agree with both this guy and Yudkowsky.
The first half of the video is great. Completely agree, stop the harm now. Stop the CEOs, stop the naive techno-optimists.
The second half has a lot of strawmen. Let me give you an example:
He takes the AI 2027 report's solar-powered robot maintenance crew (which is very futuristic and far-fetched), then in the next statement takes the "a blade of grass is a self-replicating factory" line and acts like Yudkowsky says the grass maintains the machines.
Yudkowsky's argument is that if an AI wanted to kill us, it wouldn't need the robots. Biology perfected self-replication long ago. Make some humans do your CRISPR gene tech for you (thanks, gig economy). Engineering a stronger mosquito carrying a neurotoxin doesn't strike me as impossible. Our biology is very capable of folding all kinds of proteins if given the correct instructions. You don't need complex robot arms to (accidentally) kill humans. Our society is extremely fragile in thousands of ways.
If you manage to kill specific insects, we die. If you accelerate algae growth enough, we die. If you get enough carcinogens into the food chain, we die. If you engineer super-resistant bacteria, we die. If you rapidly deplete the ozone layer, we die. If you disable the earth's magnetic field, we die.
The point is: if you build something that is smarter than you (in any relevant way) and that has some form of agency, it will outsmart you and beat you. Don't build it.