>>3932452
The version I have is super old; at the time the recommended model was just "lama" and that's the one I used. I'm not sure how much VRAM it needs, but the model file is only about 200 MB, so it should probably run on 1 GB.
For what it's worth I run it on the CPU anyway. It's slower, but cleaning
>>3932390 took less than 20 seconds, so it's fast enough for me, and (at least back when I installed it) installing it without GPU support used much less disk space. It's fucking huge either way, around 1 GB even without GPU support, and I didn't want to waste even more space on something I only use every now and then.
I wanted to check how big it is now that they have an installer, but it turns out they sell it (I didn't know that), so I couldn't see whether it's gotten even bigger. You can still install it manually for free though, you just have to use Python's pip.
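If you go the pip route, this is roughly what it looked like for me. I'm assuming the tool is lama-cleaner since that's what mine appears to be, and my version is ancient, so the package name and flags might be different now; check their docs if these don't work:

# assuming the package is still called lama-cleaner, check if the name changed
# install the CPU-only torch build first so pip doesn't pull the huge CUDA one
pip install torch --index-url https://download.pytorch.org/whl/cpu
pip install lama-cleaner
# start the web UI with the lama model on the CPU
lama-cleaner --model=lama --device=cpu --port=8080

At least in my old version that starts a local web page (localhost on whatever port you gave it) where you just drop the image in and paint over what you want removed.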
If you think it would help, I can try zipping up my old install and sharing it. I'd have to check that I can pack it with all the required files so it still runs on a different computer though, and it'd probably take me a few days until I have time, but let me know if you think it's worth it. Or, if you'd rather try installing it yourself and you hit an error, I can try to help with that instead.