Currently we do not have plans to develop a quantized model of Ranni. But for the LLM part, it should be easy to incorporate quantization tools from the community for the Llama 2 model used here.
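For example, here is a minimal sketch of what that could look like, assuming the Llama 2 weights can be loaded separately from the diffusion pipeline and using Hugging Face transformers with bitsandbytes 4-bit loading. The model id below is a placeholder, not necessarily the checkpoint Ranni ships:

```python
# Sketch only: load the LLM in 4-bit via a community quantization tool
# (bitsandbytes through Hugging Face transformers). Not part of the Ranni codebase.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

MODEL_ID = "meta-llama/Llama-2-7b-hf"  # placeholder; substitute the checkpoint Ranni actually uses

# NF4 4-bit weights with fp16 compute cut weight memory roughly 4x vs fp16
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    quantization_config=bnb_config,
    device_map="auto",  # place the quantized weights on the available GPU
)
```

Whether this fits a consumer GPU end to end still depends on how much VRAM the diffusion part of Ranni needs alongside the quantized LLM.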
Is there a way to use quantized models? The current version is really out of reach for most people with consumer-grade GPUs.
Also, will you train or release models for SD1.5 and SDXL?
Thanks!