Can it work on AMD GPUs? #5

Open
Philomorph opened this issue Dec 5, 2023 · 1 comment

Comments

@Philomorph

I've been using various versions of Stable Diffusion with my RX 7800.

One popular method is lshqqytiger's SD fork, which uses DirectML. Is something like that possible here? SD doesn't run great on AMD - it's kind of a hack, and even 16 GB of VRAM isn't always enough. The low-memory version of this could be a game changer for AMD owners.

@IronGem

IronGem commented Jul 13, 2024

It can. I was able to get it to use my AMD GPU by using pip to install the ROCm build of torch instead of the cuda118 one:

`pip install torch --index-url https://download.pytorch.org/whl/rocm6.0` (see pytorch.org)

You will need to run `pip uninstall torch` in your venv first to remove the build you currently have.

Unfortunately, my GPU only has 4 GB of memory, so even on the lowest settings I could never get an image to render before hitting an 'out of memory' error. So I have been trying to get DemoFusion to render on my CPU instead; I know it would be slower, but at least I would get something. I have already installed the 'CPU' build of torch, but I have not managed a successful run yet. If anyone can tell me what to tweak to make it use the CPU, I would appreciate the help.
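For what it's worth, DemoFusion builds on Hugging Face diffusers, where the usual way to force CPU inference is to keep weights in fp32 and move the pipeline to the `cpu` device. A minimal sketch of that pattern follows; the pipeline class and checkpoint name here are illustrative of the common diffusers workflow, not DemoFusion's exact entry point:

```python
# Sketch: select a torch device with a CPU fallback, then move a
# diffusers-style pipeline onto it. Assumes the CPU build of torch is
# installed; the pipeline/checkpoint below are illustrative only.

def pick_device(cuda_available: bool) -> str:
    """Return 'cuda' when a usable GPU is present, otherwise fall back to 'cpu'."""
    return "cuda" if cuda_available else "cpu"

def run_on_cpu_sketch():
    # Not executed here; requires torch and diffusers to be installed.
    import torch
    from diffusers import DiffusionPipeline

    device = pick_device(torch.cuda.is_available())
    pipe = DiffusionPipeline.from_pretrained(
        "stabilityai/stable-diffusion-xl-base-1.0",
        torch_dtype=torch.float32,  # fp16 kernels are GPU-oriented; stay in fp32 on CPU
    )
    pipe.to(device)  # the key line: with device == "cpu", inference runs on the CPU
    return pipe
```

With the CPU-only wheel of torch installed, `torch.cuda.is_available()` returns `False`, so the pipeline lands on the CPU automatically; expect generation to be very slow compared to a GPU run.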
