Request for support of AMD 8gb vram gpus +ubuntu +ROCm without using LLaVA CLIP to reduce required amount of vram to run SUPIR #81
Comments
We made it work with 8-bit on Nvidia GPUs. Not published yet, but working to implement it, hopefully soon. Sadly, nothing for AMD yet.
You mean 8 GB VRAM Nvidia GPUs? Hope we have AMD GPU support soon. By the way, do you have a SUPIR Google Colab notebook or Kaggle notebook? Thanks a lot for your response and all of your great contributions 💯💯💯💯
Yes, we made it work on Kaggle as well, but it's not published yet. Coming soon, hopefully.
Will it be able to run with 16 GB RAM too? I mean, I have an 8 GB VRAM Nvidia GPU and 16 GB of system RAM, so will the 8-bit model be able to run? Also, will there be any decrease in upscale quality with 8-bit?
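For a rough sense of why 8-bit roughly halves the VRAM needed for model weights, here is a back-of-envelope calculation. The parameter count below is an illustrative assumption, not SUPIR's actual size, and it ignores activations and other runtime overhead:

```python
def weight_vram_gb(num_params: float, bytes_per_param: float) -> float:
    """Approximate VRAM needed just for model weights (ignores activations, buffers, etc.)."""
    return num_params * bytes_per_param / 1024**3

# Hypothetical 2.6B-parameter model (illustrative only):
params = 2.6e9
fp16 = weight_vram_gb(params, 2)  # 16-bit floats: 2 bytes per weight
int8 = weight_vram_gb(params, 1)  # 8-bit quantized: 1 byte per weight
print(f"fp16 ≈ {fp16:.1f} GB, int8 ≈ {int8:.1f} GB")  # prints "fp16 ≈ 4.8 GB, int8 ≈ 2.4 GB"
```

This is only the weight footprint; whether an 8 GB card actually suffices also depends on image resolution, tiling, and what gets offloaded to system RAM.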
It is not fully implemented yet, but join our Discord and you can ask me once we have it: https://discord.com/servers/software-engineering-courses-secourses-772774097734074388
Made it work with an AMD 7800 XT without xformers, on Linux. I am not sure if 8 GB VRAM will work, because 16 GB VRAM barely works.
Let's hope SUPIR gets more optimized soon 🤞🤞🤞🤞 because it does real magic 💯💯💯💯
8 GB NVIDIA support is here; we also now have a Kaggle notebook that supports Gradio and SUPIR.
Great news, thanks.
Request for support of AMD 8 GB VRAM GPUs + Ubuntu + ROCm, without using LLaVA CLIP, to reduce the amount of VRAM required to run SUPIR. Instead, we could use online CLIP models to caption images and use those captions in the program. I also hope you add an extension to use SUPIR in Automatic1111 🤞🤞🤞🤞 Thanks a lot for this great program 💯💯💯💯 Hope you add the online demo soon 🤞🤞🤞🤞
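The caption-offloading idea above could be sketched as follows: the restoration step ultimately needs only a text prompt, so the caption can come from any source (a user-supplied string, or a remote captioning API) instead of a local LLaVA model held in VRAM. The function and parameter names here are hypothetical, not SUPIR's actual API:

```python
from typing import Callable, Optional

def get_caption(image_path: str,
                user_caption: Optional[str] = None,
                remote_captioner: Optional[Callable[[str], str]] = None) -> str:
    """Pick a caption without loading a local LLaVA model into VRAM.

    Preference order: an explicit user-supplied caption, then a remote
    captioning service, then an empty prompt (restorers like SUPIR can
    often run with a generic/empty prompt at some quality cost).
    """
    if user_caption:
        return user_caption
    if remote_captioner is not None:
        return remote_captioner(image_path)  # e.g. a hosted captioning endpoint
    return ""  # fall back to an empty prompt

# Usage: a stub standing in for a real remote captioning API call.
fake_api = lambda path: "a photo of a cat on a sofa"
print(get_caption("img.png", remote_captioner=fake_api))  # prints "a photo of a cat on a sofa"
```

The design point is just that captioning and restoration are separable stages, so the VRAM-heavy captioner does not have to live on the same GPU as the upscaler.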