
Multi-GPU Support #4139

Answered by ltdrdata
laetokang asked this question in Q&A
Jul 30, 2024 · 7 comments · 2 replies

SwarmUI provides a UI that can handle multiple ComfyUI instances as backends at once.
https://github.com/mcmonkeyprojects/SwarmUI

Currently, ComfyUI does not provide a method to execute workflows in parallel.

If you are a developer and want to implement multi-GPU inference, I think modifying the KSampler would be the most effective approach.

If I had a multi-GPU environment I would experiment with this myself, but I'm not sure whether PyTorch can properly handle this scenario.

It's important to note that several custom nodes hijack the sampling function in their implementations, so your modifications might cause issues with those nodes.
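
For illustration only, here is a minimal PyTorch-only sketch of the general pattern: pin each sampling job to its own GPU and run the jobs in parallel threads. This is not ComfyUI's actual KSampler code; `DummyDenoiser` and `sample_on_gpu` are hypothetical placeholders standing in for the real model and sampling loop.

```python
import threading
import torch
import torch.nn as nn


class DummyDenoiser(nn.Module):
    """Hypothetical stand-in for a diffusion model; real code would load a checkpoint."""
    def __init__(self):
        super().__init__()
        self.net = nn.Conv2d(4, 4, kernel_size=3, padding=1)

    def forward(self, x):
        return self.net(x)


def sample_on_gpu(device_id, latent, results, slot):
    # Pin this job's model and latent to one specific GPU.
    device = torch.device(f"cuda:{device_id}")
    model = DummyDenoiser().to(device)
    x = latent.to(device)
    with torch.no_grad():
        for _ in range(20):          # toy stand-in for a real sampling loop
            x = x - 0.05 * model(x)
    results[slot] = x.cpu()


if __name__ == "__main__":
    num_gpus = torch.cuda.device_count()
    latents = [torch.randn(1, 4, 64, 64) for _ in range(num_gpus)]
    results = [None] * num_gpus
    threads = [
        threading.Thread(target=sample_on_gpu, args=(i, lat, results, i))
        for i, lat in enumerate(latents)
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # Each entry in `results` now holds the output produced on its own GPU.
```

Whether this pattern holds up inside ComfyUI depends on how model management and the sampling hijacks mentioned above behave when multiple devices are in play.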
