tl;dr “I honestly can’t predict what the state of AI will be in 5-10 years. Therefore there’s a decent chance the outcome is the end of humankind.”
Does this say anything about “AI”, or does it say something about humans?
Humans don’t do well when they don’t know what’s going to happen next.