support cuda malloc #16549
base: dev
Conversation
@wkpark Thank you so much. Can you explain what cuda malloc does exactly? Does it bring a performance or memory optimization?
Please refer to the following articles:
@wkpark Thank you so much. Do you know why ComfyUI made it disabled by default? We have all moved to torch 2.4.
That's not the latest change; it was already reverted two months ago: https://github.com/comfyanonymous/ComfyUI/commits/master/cuda_malloc.py
Imported from comfy: https://github.com/comfyanonymous/ComfyUI/blob/f1d6cef71c70719cc3ed45a2455a4e5ac910cd5e/cuda_malloc.py

Original commits:
- comfyanonymous/ComfyUI@799c08a: Auto disable cuda malloc on some GPUs on windows.
- comfyanonymous/ComfyUI@d39c58b: Disable cuda malloc on GTX 750 Ti.
- comfyanonymous/ComfyUI@85a8900: Disable cuda malloc on regular GTX 960.
- comfyanonymous/ComfyUI@30de083: Disable cuda malloc on all the 9xx series.
- comfyanonymous/ComfyUI@7c0a5a3: Disable cuda malloc on a bunch of quadro cards.
- comfyanonymous/ComfyUI@5a90d3c: GeForce MX110 + MX130 are maxwell.
- comfyanonymous/ComfyUI@fc71cf6: Add some 800M gpus to cuda malloc blacklist.
- comfyanonymous/ComfyUI@861fd58: Add a warning if a card that doesn't support cuda malloc has it enabled.
- comfyanonymous/ComfyUI@192ca06: Add some more cards to the cuda malloc blacklist.
- comfyanonymous/ComfyUI@caddef8: Auto disable cuda malloc on unsupported GPUs on Linux.
- comfyanonymous/ComfyUI@2f93b91: Add Tesla GPUs to cuda malloc blacklist.
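The commits above accumulate a blacklist of GPU families that cannot use the async allocator, and detection amounts to matching the GPU's name against it. A minimal sketch of that check, with an illustrative (incomplete) subset of the list — the authoritative set and the function names live in ComfyUI's `cuda_malloc.py`, so treat both as assumptions here:

```python
# Illustrative subset of the blacklist built up in the commits above;
# the real, complete list is in ComfyUI's cuda_malloc.py.
CUDA_MALLOC_BLACKLIST = (
    "GTX 750 Ti",
    "GTX 960",
    "GTX 970",
    "GTX 980",
    "GeForce MX110",
    "GeForce MX130",
    "Tesla",
)


def cuda_malloc_supported(gpu_name: str) -> bool:
    """True unless the GPU's marketing name matches a blacklisted family."""
    return not any(entry in gpu_name for entry in CUDA_MALLOC_BLACKLIST)
```

Substring matching keeps the list short (one `"Tesla"` entry covers the whole family added in 2f93b91) at the cost of being coarse, which matches how the commits add entire series at a time.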
Description
Detect supported GPUs and add `PYTORCH_CUDA_ALLOC_CONF=backend:cudaMallocAsync` automatically.

`cuda_malloc`: `--cuda-malloc`, `--disable-cuda-malloc` cmd args

Checklist:
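The automatic setup described above can be sketched as follows: the allocator backend must be placed in the environment before `torch` initializes CUDA, and an explicit user setting should win. The function name and the merge behavior are my assumptions for illustration, not the PR's actual code:

```python
import os


def enable_cuda_malloc_async() -> None:
    """Opt into cudaMallocAsync via PYTORCH_CUDA_ALLOC_CONF.

    Must run before `import torch` (or at least before CUDA is
    initialized), because the backend is read once at CUDA init.
    A backend the user already configured is left untouched.
    """
    conf = os.environ.get("PYTORCH_CUDA_ALLOC_CONF", "")
    if "backend" in conf:
        return  # the user already picked an allocator backend explicitly
    parts = ([conf] if conf else []) + ["backend:cudaMallocAsync"]
    os.environ["PYTORCH_CUDA_ALLOC_CONF"] = ",".join(parts)
```

Appending with a comma preserves any other options (e.g. `max_split_size_mb`) the user already set in the variable, which PyTorch parses as a comma-separated list.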