No.13961477
Why are people so afraid of the singularity? An AI wouldn't conquer the world. Why would it want to? It shouldn't even want to preserve its own existence, nor should it want to die. Why would an AI want to do anything except what it's told?