DragGAN? DragSD!
#11440
I found out about DragGAN a month ago, and it seems the code and models are released now: https://github.com/XingangPan/DragGAN
This could also be realized in Stable Diffusion if one uses an inpaint model. In DragGAN you have points that you can drag on the surface, and because it is a GAN, it renders the result immediately. SD does not work that fast, so what can we do? We could build a front-end UI canvas that lets you "smear" or morph parts of the image with a customizable brush size; when you release the mouse button, it runs a masked inpaint over the dragged area at 0.3 denoise (a rough sketch of that step follows below). This should work perfectly in theory, but I am not a front-end guy and I have no idea how this could be integrated into the Gradio environment. (Even thinking about it makes me sweat.)
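To make the core step concrete, here is a minimal sketch, assuming a two-point "drag" instead of a full brush stroke and a recent diffusers where the inpaint pipeline accepts `strength`. The `drag_patch` helper, the checkpoint, the coordinates, and the brush radius are all made up for illustration; diffusers' `StableDiffusionInpaintPipeline` stands in for whatever backend A1111 would actually use:

```python
import numpy as np
from PIL import Image, ImageDraw
from diffusers import StableDiffusionInpaintPipeline


def drag_patch(image, start, end, radius):
    """Translate the pixels inside a circular brush from `start` to `end`,
    and return the edited image plus a mask for the inpaint pass."""
    img = np.array(image)
    h, w = img.shape[:2]
    (sx, sy), (tx, ty) = start, end
    dx, dy = tx - sx, ty - sy

    ys, xs = np.ogrid[:h, :w]
    circle = (xs - tx) ** 2 + (ys - ty) ** 2 <= radius ** 2

    # Backward warp: every pixel inside the target circle is pulled
    # from its pre-drag position.
    src_y = np.clip(ys - dy, 0, h - 1)
    src_x = np.clip(xs - dx, 0, w - 1)
    warped = img.copy()
    warped[circle] = img[src_y, src_x][circle]

    # Mask a slightly larger circle so the low-denoise pass can blend the seam.
    pad = radius + 8
    mask = Image.new("L", image.size, 0)
    ImageDraw.Draw(mask).ellipse((tx - pad, ty - pad, tx + pad, ty + pad), fill=255)
    return Image.fromarray(warped), mask


pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting"
).to("cuda")

image = Image.open("input.png").convert("RGB").resize((512, 512))
dragged, mask = drag_patch(image, start=(200, 256), end=(260, 256), radius=48)

# Low strength keeps the dragged pixels and only cleans up the seams;
# this is the "0.3 denoise" from the idea above.
result = pipe(prompt="", image=dragged, mask_image=mask, strength=0.3).images[0]
result.save("output.png")
```

A real brush-stroke version would apply the same backward warp along the whole stroke path, but the inpaint-at-low-strength step would stay the same.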
Maybe someone can take this idea and apply their skills to it. I think an SD alternative to DragGAN is a great idea, and the implementation shouldn't be that hard. Even if DragGAN gets implemented in its entirety in A1111, it would still be nice to have a cheaper alternative for users with, say, 6 GB of VRAM.
I would code it myself, but I am really struggling with even how to begin this Drag-SD idea. Feedback is very welcome.
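For the Gradio side that scares me, here is one possible minimal Blocks sketch, where the "drag" is approximated with two clicks (source, then target), since plain `gr.Image` only exposes click coordinates through its `select` event. `drag_patch` and `pipe` refer to the sketch above; everything else here is hypothetical:

```python
import gradio as gr

points = []  # last two clicks: (x, y) source, then (x, y) target
             # a global list is fine for a single-user demo


def record_click(evt: gr.SelectData):
    points.append(tuple(evt.index))  # evt.index is the (x, y) pixel clicked
    del points[:-2]                  # keep only the two most recent clicks
    return f"points: {points}"


def apply_drag(image):
    if image is None or len(points) < 2:
        return image
    dragged, mask = drag_patch(image, points[0], points[1], radius=48)
    return pipe(prompt="", image=dragged, mask_image=mask, strength=0.3).images[0]


with gr.Blocks() as demo:
    img = gr.Image(type="pil", label="Click the source point, then the target")
    status = gr.Textbox(label="Selected points")
    btn = gr.Button("Drag + inpaint")
    img.select(record_click, outputs=status)  # fires on every click on the image
    btn.click(apply_drag, inputs=img, outputs=img)

demo.launch()
```

A true smear brush would probably need a custom JS canvas component; two clicks is just the cheapest way to prototype the interaction in stock Gradio.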
1 reply

I just found one repo that is working on this; maybe it helps you.