>>13635175
>>13635168
Not him, but contextually the argument is irrelevant. That said, the reason computers are "just computers" while human brains seemingly are not comes down to empathy. Empathy is what lets a human differentiate himself from a machine: he can assign emotional context to any given question and weigh it toward a positive or negative outcome.
If you were to ask that question of a human, you'd get one of three responses: a flat "sure", a squint followed by walking away, or a "stop asking stupid questions, Jim" before moving on. All three are outcomes where negative emotion was assigned, and that assignment is what creates nuance in context. The flat response is dismissive. Squinting and walking away is mentally declaring the other person inconsequential and then dismissing them by leaving. The third explicitly declares the argument irrelevant and dismisses it outright.
Computers, as they currently exist, LACK empathy. If a neural network cannot be trained to understand empathy and assign positive or negative emotion (a weight) to any given situation in a Turing-complete fashion, it will remain just that: a dumb machine. The day that barrier is crossed, you've entered a nascent AGI state. More training thereafter will likely lead to fully active AGI and then super AGI, at which point there would theoretically be no difference between a human brain and a computer neural network, proving that a human brain is as much a computer as a computer is.
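To be concrete about what "assign a weight to a situation" even means mechanically, here's a toy sketch I threw together (purely my own illustration, the features, numbers, and training data are made up): a single logistic unit that learns a positive/negative valence for a situation from crude features like "how many times has this been asked" and "was it rude". Obviously empathy isn't a two-feature logistic regression, this just shows the mechanism of learning a valence weight.

import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# features: [times_already_asked, is_rude]; label: 1 = positive reaction, 0 = negative
data = [([0, 0], 1), ([1, 0], 1), ([3, 0], 0), ([5, 1], 0), ([0, 1], 0), ([2, 0], 1)]

w, b, lr = [0.0, 0.0], 0.0, 0.5
for _ in range(2000):                       # plain gradient descent on log-loss
    for x, y in data:
        p = sigmoid(w[0]*x[0] + w[1]*x[1] + b)
        g = p - y                           # gradient of log-loss w.r.t. the logit
        w = [w[i] - lr * g * x[i] for i in range(2)]
        b -= lr * g

# valence the unit assigns to a fresh situation: asked 4 times, politely
print(sigmoid(w[0]*4 + w[1]*0 + b))         # closer to 0 means a negative reaction

The point isn't the code, it's that "emotion as a weight" is at least expressible in the same machinery networks already use; the open question is whether scaling that up ever amounts to actual empathy or just a bigger lookup of learned valences.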
I suspect, though, that we're closer to a Ghost in the Shell or maybe Halo timeline than anything else, with AIs created from flash-cloned brains providing the neural networks on which an artificial intelligence is trained and through which it navigates the world.