Conversation
This works for me on my 6750 XT, but I have to add
RUN . /clone.sh generative-models https://github.com/Stability-AI/generative-models 45c443b316737a4ab6e40413d7794a7f5657c19f
FROM rocm/pytorch:rocm6.0.2_ubuntu22.04_py3.10_pytorch_2.1.2
Is this the only difference between this Dockerfile and the NVIDIA one?
Additionally, all references to CodeFormer were removed, as well as the env variable NVIDIA_VISIBLE_DEVICES.
I think the environment variable would not make a difference in this case, right?
I am wondering if we can configure this different image as a build arg instead of duplicating the entire Dockerfile.
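Something like the following could work. This is only a rough sketch: the image tags, paths, and service names below are placeholders, not necessarily what this repo actually uses.

```Dockerfile
# Sketch only: accept the base image as a build argument so a single
# Dockerfile can serve both the CUDA and the ROCm variant.
ARG BASE_IMAGE=pytorch/pytorch:2.1.2-cuda12.1-cudnn8-runtime
FROM ${BASE_IMAGE}
# ...the rest of the build steps stay identical...
```

```yaml
# docker-compose.yml (sketch): two profiles sharing one Dockerfile,
# differing only in the BASE_IMAGE build arg. Paths and service names
# are placeholders.
services:
  auto:
    build:
      context: ./services/AUTOMATIC1111
      args:
        BASE_IMAGE: pytorch/pytorch:2.1.2-cuda12.1-cudnn8-runtime
    profiles: ["auto"]
  auto-rocm:
    build:
      context: ./services/AUTOMATIC1111
      args:
        BASE_IMAGE: rocm/pytorch:rocm6.0.2_ubuntu22.04_py3.10_pytorch_2.1.2
    profiles: ["auto-rocm"]
```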
Hey man, I tried testing your branch, but are RX 550-580 cards supported? I got this error; any idea how to make it work? I think maybe the PyTorch version is too new, or the ROCm version, but I can't find which version to install.
I tried changing from ROCm 6 to ROCm 5.7.
Got it to run, but it still uses my CPU instead of my GPU.
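For reference, a hedged sketch of what that base-image swap looks like in the Dockerfile; the exact 5.7 tag below is an assumption, so check the available rocm/pytorch tags on Docker Hub.

```Dockerfile
# Sketch: pin an older ROCm 5.7 base instead of the ROCm 6 one.
# The tag below is an assumption; pick a real rocm/pytorch 5.7 tag from Docker Hub.
FROM rocm/pytorch:rocm5.7_ubuntu22.04_py3.10_pytorch_2.0.1
```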
Do you think it would be possible to merge this feature, @AbdBarho? I would love to take advantage of the updates to your repo while still using my AMD GPU.
Confirmed on Ubuntu 24.04 with a 7800 XT.
Confirmed working on Manjaro with a 7800 GRE. I had to add
Where did you add that? I have issues getting it working with my RX 570.
I can't check the file right now, but I think it was in the base service at the top of the file, something like this (taken from the current docker-compose.yml):
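(The exact snippet did not survive here; below is a hedged sketch of the kind of base-service addition usually meant for AMD cards. The anchor name and the GFX override value are guesses and depend on the repo layout and the specific GPU.)

```yaml
# Hedged sketch, not the exact snippet from the comment above: expose the AMD
# GPU devices to the container and override the reported GFX version.
x-base_service: &base_service
  devices:
    - /dev/kfd        # ROCm compute interface
    - /dev/dri        # GPU render nodes
  environment:
    - HSA_OVERRIDE_GFX_VERSION=11.0.0   # value depends on the card (guess for RDNA3)
```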
Closes issue #63
Added a new Docker profile for AUTOMATIC1111 / AMD ROCm support.
docker compose --profile auto-rocm up
Tried on my 7900 XTX, works okay.
xFormers has experimental AMD support in 0.0.25, but I could not get it to work. facebookresearch/xformers@44b0d07