No generation with VULKAN backend for some parameters #518

Open
olivbrau opened this issue Dec 10, 2024 · 5 comments

@olivbrau

Hi,
I've tried the VULKAN backend and encountered strange results.
(I used the SD 2.1 model, but the issues also arise with other models.)

1) No image without the --vae-on-cpu parameter
sd -m .\v2-1_768-nonema-pruned.safetensors --sampling-method euler --steps 20 -H 512 -W 512 --type q8_0 -p "a cute birman cat"
--> the generation phase completes, but during the "decoding latents" phase the process suddenly exits and no image is written
--> if I add --vae-on-cpu, it works

2) Black image for some image sizes
sd -m .\v2-1_768-nonema-pruned.safetensors --sampling-method euler --steps 20 -H 512 -W 512 --type q8_0 -p "a cute birman cat" --vae-on-cpu --> OK
sd -m .\v2-1_768-nonema-pruned.safetensors --sampling-method euler --steps 20 -H 768 -W 768 --type q8_0 -p "a cute birman cat" --vae-on-cpu --> KO: the "generating image" phase hangs and the process exits suddenly
sd -m .\v2-1_768-nonema-pruned.safetensors --sampling-method euler --steps 20 -H 512 -W 832 --type q8_0 -p "a cute birman cat" --vae-on-cpu --> KO: the process runs to completion, but the image is completely black
For the last two examples, the same commands work fine with the CPU backend.

@stduhpf
Contributor

stduhpf commented Dec 10, 2024

  • Does it work if you use --vae-tiling instead of --vae-on-cpu? (see the command sketched below)
  • Can you try sync: update ggml #509 to see if it fixes anything? (rough build steps below)
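
For reference, the first check would just be your command with the flag swapped:

sd -m .\v2-1_768-nonema-pruned.safetensors --sampling-method euler --steps 20 -H 512 -W 512 --type q8_0 -p "a cute birman cat" --vae-tiling

And to try #509 locally, something along these lines should work, assuming a standard CMake build of stable-diffusion.cpp with the Vulkan backend enabled (the SD_VULKAN option name is from memory, so double-check it against the README):

git clone --recursive https://github.com/leejet/stable-diffusion.cpp
cd stable-diffusion.cpp
git fetch origin pull/509/head:pr-509     # fetch the PR branch
git checkout pr-509
git submodule update --init --recursive   # pick up the updated ggml submodule
cmake -B build -DSD_VULKAN=ON             # assumed option name for the Vulkan backend
cmake --build build --config Release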

@olivbrau
Author

With --vae-tiling instead of --vae-on-cpu, it worked.

For the ggml update, I don't know how to try it; I haven't yet set up a build environment for the stable-diffusion.cpp project.

@stduhpf
Contributor

stduhpf commented Dec 10, 2024

The ggml update might fix the black image. I'm not sure what's causing the crash when generating at 768x768; maybe you're just running out of memory?

@olivbrau
Author

olivbrau commented Dec 10, 2024

So I'm waiting for the next release! :-)

For the crash, I think you're right: at 512x512 it needs 2.4 GB of VRAM out of the 4 GB on my card, and 768x768 has 2.25 times as many pixels as 512x512, so it won't fit in my case. What is strange is that sometimes, if I leave the type at f32, it crashes with an explicit out-of-memory message that reports the amount it is trying to allocate on the GPU.
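
To put rough numbers on it (assuming VRAM use scales with the pixel count):

768*768 / (512*512) = 2.25
2.4 GB * 2.25 ≈ 5.4 GB, which is well above the 4 GB available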

@stduhpf
Contributor

stduhpf commented Dec 10, 2024

Btw, I think SD 2.1 support is broken (unrelated to Vulkan). I don't get black images like you do, but the results are still very bad (see #122 (comment)).
