Environment, CPU architecture, OS, and Version:
Linux server 6.8.0-47-generic #47-Ubuntu SMP PREEMPT_DYNAMIC Fri Sep 27 21:40:26 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
OS Version: Ubuntu 24.04
Portainer: 2.19.5-CE
CPU: Intel 13900K
RAM: 32 GB
GPU: 4090
Describe the bug
(Suspected issue: "bugged" llama.cpp builds on later versions)
Reported as: "builds everything for about 2 hours and ends with this"
Upon reproducing the "bug", the build appears to be waiting for input from the user:
patching file examples/llava/clip.cpp
patch unexpectedly ends in middle of line
Reversed (or previously applied) patch detected! Assume -R? [n]
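For context, GNU patch emits exactly this prompt when it thinks a patch was already applied and stdin is interactive; in a non-interactive Docker build the prompt just hangs. Passing `-N`/`--forward` (or `--batch`/`--force`) makes it non-interactive. A minimal sketch of the behavior using throwaway files (the file and patch names are illustrative, not from the LocalAI tree):

```shell
# Build a tiny patch, apply it twice, and show that -N (--forward)
# skips the second, already-applied run instead of prompting.
set -u
workdir=$(mktemp -d)
cd "$workdir"
mkdir a b src
printf 'old line\n' > a/clip.cpp
printf 'new line\n' > b/clip.cpp
diff -u a/clip.cpp b/clip.cpp > fix.patch || true   # diff exits 1 when files differ
cp a/clip.cpp src/
cd src
patch -p1 -N < ../fix.patch          # first run: "patching file clip.cpp"
patch -p1 -N < ../fix.patch \
  || echo "second run skipped (already applied), no interactive prompt"
```

Without `-N`, the second invocation would stop at the "Assume -R? [n]" prompt seen in the log above.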
To Reproduce
deploy docker using the following docker-compose.yaml
Additional context
I am not the one who personally experienced this issue; it was reported in the #Help Discord channel (creating this issue as requested). However, I have reproduced the issue on another image.
I just ran into this same issue trying to start v2.23.0 for the first time. Is there a solution for this, or can you tell me what the last working version is, to pull instead?
Thanks!
Currently, the only solution I know of is to use image v2.20.1.
This is the latest version that works for me, as I require a rebuild. If you do not require a rebuild, the fix is simply to disable it.
I believe this is an issue with a change in llama.cpp rather than LocalAI, but I am not sure whether there is a way to auto-confirm ("-y") the patch command.
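A sketch of that workaround (REBUILD is LocalAI's rebuild switch; the exact image tag and service layout below are illustrative, since the original docker-compose.yaml is not included here):

```yaml
# docker-compose.yaml sketch: pin the last known-good image and/or
# disable the in-container source rebuild that triggers the patch step.
services:
  localai:
    image: localai/localai:v2.20.1   # illustrative tag; append the GPU variant suffix you normally use
    environment:
      - REBUILD=false                # skip the rebuild entirely if you do not need it
    ports:
      - "8080:8080"
```

With REBUILD=false the container uses the prebuilt binaries, so the llama.cpp patch step that hangs is never reached.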
LocalAI version:
localai/localai:latest-gpu-nvidia-cuda-12 (SHA ff0b3e63d517)
(Also occurs on v2.22.1 container image)
Expected behavior
Build and launch of server
Logs
Log Provided by user:
Log from v2.22.1 container: