Docker-compose for LocalAI CUDA 12 autoinstall WITHOUT MODELS #2167
antoniostasi1987 started this conversation in General

Hi all,
can someone help me build a docker-compose.yaml file for a LocalAI deployment that supports CUDA 12 WITHOUT pre-installed models? Thanks a lot to all!
- Pre-installed models are only present in the All-in-One images. You can use the standard images, which do not include any pre-installed models; see https://localai.io/docs/reference/container-images/ for a list depending on your HW.
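To illustrate the standard-image route, here is a minimal compose sketch, assuming the `localai/localai:latest-gpu-nvidia-cuda-12` tag from the container-images page, the default API port 8080, and a local `./models` directory; verify the tag and the models mount path against the docs for your LocalAI version:

```yaml
# Minimal sketch: LocalAI on CUDA 12 with no pre-installed models.
# Assumptions: the non-AIO tag `latest-gpu-nvidia-cuda-12`, default API port 8080,
# and the models path used in the docs' docker examples -- adjust to your setup.
services:
  localai:
    image: localai/localai:latest-gpu-nvidia-cuda-12
    ports:
      - "8080:8080"
    environment:
      - DEBUG=true
    volumes:
      - ./models:/build/models   # models you install later end up here
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
```

The `deploy.resources.reservations.devices` block is the Compose-spec way to request an NVIDIA GPU; it requires the NVIDIA Container Toolkit on the host.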
- Seconding this for @mudler, but what about adding SOTA models like LLaMA3 or maybe CodeQwen + DeepSeekCoder directly from HuggingFace?
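One way this might work, going by the run examples in the LocalAI docs (an assumption, not a confirmed answer): the image entrypoint accepts model names or `huggingface://` URIs to download and install at startup, so the compose file above could gain a `command:` entry like the following, where the URI is an illustrative placeholder:

```yaml
services:
  localai:
    image: localai/localai:latest-gpu-nvidia-cuda-12
    # Assumption: URIs passed as arguments are fetched and installed on startup,
    # mirroring the docs' `docker run ... huggingface://...` examples.
    # Replace the placeholder below with the actual repo/file you want.
    command:
      - huggingface://TheBloke/phi-2-GGUF/phi-2.Q8_0.gguf
```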