The intelligence explosion should never be allowed to happen

Not just because the AI will become smart. If you design a feedback loop (make an AI that's smarter, and have that AI make an AI that's smarter still), you have already fucked up.
Take GPT-3 for example. It got way better just by scaling up, i.e. increasing the number of parameters, just by making it bigger. So an AI could decide that it can make the next generation smarter by obtaining more computational resources. Congratulations, you've just initiated the great stamp collector / paper clip maximizer.
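
A toy sketch of the loop being described, treating "capability" as an abstract number; build_successor and the 1.5x gain per generation are made-up placeholders, not any real training procedure:

    def build_successor(capability: float) -> float:
        # Stand-in for "make an AI that's smarter": scaling up buys a fixed gain.
        return capability * 1.5

    def runaway_loop(capability: float) -> None:
        # No termination condition at all: this is the explosion.
        while True:
            capability = build_successor(capability)
            print(f"generation capability: {capability:.1f}")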

If you say, "terminate after n repetitions", then maybe if your lucky everything wont go to shit if you set n to be before ai gets capable of anything beyond human understanding. But the run away process or explosion must be stopped. AI research needs to be publicly monitored and transparent.