>>13663916
>So, good. Scale is not important to you. So you're talking about something completely different from what I brought up. Like I said, that already exists
Scale doesn't matter. If you don't think it's going to be done more in the future, then you are high as a kite. The argument that it won't be, because "there's not enough money in the economy", while also talking about god-like superintelligences, is laughable.
>Implausible.
Bullshit, it would be weirder if an AI could perfectly mimic a human's thinking patterns.
>At the very least, it will be possible for AGI to model that way of thinking at unimaginably huge scales.
At the very best it will be able to emulate a human brain, creating an imperfect substitute.
>It's an efficiency sink. The same amount of energy could support a vastly more powerful computing block.
The amount of energy we as a species use is so utterly minuscule that it's easy to imagine a world where an AI kept us around even just as a novelty, since, again, it's impossible to truly replicate a human brain and therefore any possibilities it may have.
>You don't know how exponential growth works, do you?
The fact that you are so optimistic in your pessimism is really throwing me for a loop. Yes, I understand how exponential growth works, but that doesn't mean you can automatically expect an AGI to undergo exponential growth.
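To make the point concrete (a toy sketch of my own, not anything from the thread): "knows how to self-improve" only gives you exponential growth if every upgrade compounds cleanly. If each successive upgrade is harder than the last, the same loop levels off at a ceiling instead of exploding. The growth rates and the quadratic falloff here are arbitrary illustrative choices.

```python
def exponential(c0, rate, steps):
    """Clean compounding: each upgrade multiplies capability by (1 + rate)."""
    c = c0
    for _ in range(steps):
        c *= 1 + rate
    return c

def diminishing(c0, rate, steps):
    """Each upgrade is harder than the last: the gain from step i shrinks
    quadratically, so total capability converges to a finite ceiling."""
    c = c0
    for i in range(1, steps + 1):
        c += c0 * rate / i**2
    return c

print(exponential(1.0, 0.5, 20))  # runaway growth
print(diminishing(1.0, 0.5, 20))  # plateaus below 2x the starting point
```

Same starting capability, same per-step effort, wildly different endpoints. Which curve an AGI actually follows is an empirical question, not something you get for free from the word "exponential".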
Forget Occam's Razor, you are making so many assumptions about how an AI is going to operate that I need his damn machete. For all we know, in the middle of upgrading itself it does something stupid and gives itself brain damage.
But that wouldn't happen, you say! Because it would be smart enough to have backups! But that means it would either be risking its own consciousness or creating a new consciousness that may now find it a useless waste of space.
For all we know it could be too scared to update itself. It may not be smart enough to update itself. Perhaps it cares more about propagating itself. Plenty of possibilities.