2026-01-07
Disclaimer
Quick Note
Suppose I knew for a fact that a billion people were going to die, but that I and my immediate circle would not be among them. One practical way this could happen is a nuclear war that the country I'm living in happens to stay out of.
I would care about that, and I would probably work full-time to prevent it. I would probably be willing to take personal risks to prevent it, as I am doing now. But I don't think it would affect me as intensely and personally as ASI risk does. (It's hard to guess my emotional state in a hypothetical situation, but that's my weak guess.)
ASI risk affects me more because if human extinction happens, I too will die. If AI is used to create a permanent dictatorship, the people around me, my future family, and I will have to live in that dictatorship, with no hope that anything will ever get better, even multiple generations in the future.