Note that the only hard requirement to run models is memory size. Theoretically you could buy the cheapest/slowest GPU out there, as long as it has sufficient memory for whatever models you want to use. Performance is just about how long you're willing to wait for one image, which can vary anywhere between a few seconds and several minutes.
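As a rough rule of thumb, you can estimate whether a model fits in VRAM from its parameter count and precision. This is just a back-of-envelope sketch (the function name and the ~20% overhead factor are illustrative assumptions, not from any particular tool):

```python
def estimate_vram_gb(num_params_billions: float,
                     bytes_per_param: float,
                     overhead_factor: float = 1.2) -> float:
    """Rough VRAM estimate: model weights plus ~20% headroom
    for activations and framework buffers (assumed overhead)."""
    weights_gb = num_params_billions * bytes_per_param  # 1e9 params * N bytes ≈ N GB per billion
    return weights_gb * overhead_factor

# e.g. a 12B-parameter model at fp16 (2 bytes per parameter):
print(f"{estimate_vram_gb(12, 2):.1f} GB")  # roughly 28.8 GB -> won't fit in 16 GB
```

Quantized formats (e.g. 1 byte or fewer per parameter) shrink that footprint, which is how larger models get squeezed onto smaller cards.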
I’ve heard of people just buying Mac minis for their (unified) memory size; even though the performance isn’t great, you can run larger models than on an RTX 4060 Ti.
As a poor easterner, the best I could do is an RTX 4060 Ti with 16 GB of VRAM, advertised for AI.
Glad to hear I don’t need the latest and greatest to generate images. My budget is still far from affording even that, so I will have to wait. :-)