
Ollama local models detected but unable to get responses. It gives error "There was an error processing your request: No details were returned" #985

Open
oliverbarreto opened this issue Jan 3, 2025 · 11 comments

Comments

@oliverbarreto

Describe the bug

I followed the setup instructions to install bolt.diy with Docker and the native Ollama Mac app. I spin up the Bolt container and access the app, select Ollama as the provider and llama3.2 as the model, and write a simple prompt: "create a simple todo app". It sends the message, but the app ALWAYS shows the same error in the notification pop-up in the lower right corner: "There was an error processing your request: No details were returned".

Captura de pantalla 2025-01-03 a las 14 41 22

Link to the Bolt URL that caused the error

http://localhost:5173

Steps to reproduce

  1. I got Ollama running locally on a MacBook Pro M1: standard setup with the environment variable OLLAMA_HOST=0.0.0.0:11434 set.
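    For reference, the native Mac app reads this variable via launchctl rather than the shell profile (per Ollama's macOS docs); a minimal sketch to set it and verify:

    # make the native macOS Ollama app listen on all interfaces, then restart the app
    launchctl setenv OLLAMA_HOST "0.0.0.0:11434"
    # should return the list of local models as JSON
    curl http://localhost:11434/api/tags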

  2. Installed bolt.diy according to the docs, using Docker. Node version:
    node -v
    v22.12.0

  3. Checked that the Node install is on the PATH:
    echo $PATH
    :/usr/local/bin:

  4. Copied .env.example to .env.local and added the required variables.

(After spinning up the container, I inspected the Docker container info and both vars are loaded correctly in the environment.)
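For illustration, the key Ollama entry would look like this (the value is an assumption, not copied from the original file; from inside a Docker Desktop container the Mac host is reached via host.docker.internal, not localhost):

    # .env.local sketch: points the container at Ollama on the macOS host
    OLLAMA_API_BASE_URL=http://host.docker.internal:11434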

  5. Downloaded version 0.0.5 from GitHub, copied it locally, and ran the instructions to build the Docker image:

Using npm script:

npm run dockerbuild

OR using direct Docker command:

docker build . --target bolt-ai-development
Run the Container:

docker-compose --profile development up

You can see in the terminal that the logged info says ALL env vars are EMPTY, so they default to blank strings:

docker compose --profile development up -d
WARN[0000] The "GROQ_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "HuggingFace_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "OPENAI_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "ANTHROPIC_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "OPEN_ROUTER_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "GOOGLE_GENERATIVE_AI_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "OLLAMA_API_BASE_URL" variable is not set. Defaulting to a blank string.
WARN[0000] The "TOGETHER_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "TOGETHER_API_BASE_URL" variable is not set. Defaulting to a blank string.
WARN[0000] The "GROQ_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "HuggingFace_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "OPENAI_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "ANTHROPIC_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "OPEN_ROUTER_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "TOGETHER_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "TOGETHER_API_BASE_URL" variable is not set. Defaulting to a blank string.
[+] Running 1/1
✔ Container boltdiy-005-app-dev-1 Started
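Two quick checks at this point (the container name is taken from the log above; --env-file is a standard docker compose flag):

# point compose at the env file explicitly instead of relying on defaults
docker compose --env-file .env.local --profile development up -d
# inspect what actually reached the container's environment
docker exec boltdiy-005-app-dev-1 printenv | grep -i ollama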

  6. I opened Brave or Safari and tried "create a simple todo app", setting Ollama as the provider and llama3.2 as the model, which is ready in my Ollama setup (the combo lists all of my currently available models in the Ollama instance):

ollama list
NAME                          ID              SIZE      MODIFIED
llama3-groq-tool-use:latest   36211dad2b15    4.7 GB    2 months ago
llama3.2:latest               a80c4f17acd5    2.0 GB    2 months ago
mistral:latest                f974a74358d6    4.1 GB    2 months ago
llama3.1:latest               42182419e950    4.7 GB    2 months ago
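A reachability sketch from inside the container (assuming curl is present in the image): localhost inside the container is the container itself, not the Mac, so the first call is expected to fail and the second to succeed on Docker Desktop:

docker exec boltdiy-005-app-dev-1 curl -s http://localhost:11434/api/tags
docker exec boltdiy-005-app-dev-1 curl -s http://host.docker.internal:11434/api/tags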

NOW I HAVE AN ERROR: the page ALWAYS shows the notification "There was an error processing your request: No details were returned".

Here are the Docker logs, all the way from a fresh start:
2025-01-03 14:32:39 ★═══════════════════════════════════════★
2025-01-03 14:32:39 B O L T . D I Y
2025-01-03 14:32:39 ⚡️ Welcome ⚡️
2025-01-03 14:32:39 ★═══════════════════════════════════════★
2025-01-03 14:32:39
2025-01-03 14:32:39 📍 Current Version Tag: v"0.0.5"
2025-01-03 14:32:39 📍 Current Commit Version: "no-git-info"
2025-01-03 14:32:39 Please wait until the URL appears here
2025-01-03 14:32:39 ★═══════════════════════════════════════★
2025-01-03 14:32:46 ➜ Local: http://localhost:5173/
2025-01-03 14:32:46 ➜ Network: http://172.22.0.2:5173/
2025-01-03 14:33:36 INFO LLMManager Registering Provider: Anthropic
2025-01-03 14:33:36 INFO LLMManager Registering Provider: Cohere
2025-01-03 14:33:36 INFO LLMManager Registering Provider: Deepseek
2025-01-03 14:33:36 INFO LLMManager Registering Provider: Google
2025-01-03 14:33:36 INFO LLMManager Registering Provider: Groq
2025-01-03 14:33:36 INFO LLMManager Registering Provider: HuggingFace
2025-01-03 14:33:36 INFO LLMManager Registering Provider: Hyperbolic
2025-01-03 14:33:36 INFO LLMManager Registering Provider: Mistral
2025-01-03 14:33:36 INFO LLMManager Registering Provider: Ollama
2025-01-03 14:33:36 INFO LLMManager Registering Provider: OpenAI
2025-01-03 14:33:36 INFO LLMManager Registering Provider: OpenRouter
2025-01-03 14:33:36 INFO LLMManager Registering Provider: OpenAILike
2025-01-03 14:33:36 INFO LLMManager Registering Provider: Perplexity
2025-01-03 14:33:36 INFO LLMManager Registering Provider: xAI
2025-01-03 14:33:36 INFO LLMManager Registering Provider: Together
2025-01-03 14:33:36 INFO LLMManager Registering Provider: LMStudio
2025-01-03 14:32:39 fatal: not a git repository (or any parent up to mount point /)
2025-01-03 14:32:39 Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).
2025-01-03 14:32:41 fatal: not a git repository (or any parent up to mount point /)
2025-01-03 14:32:41 Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).
2025-01-03 14:32:41 [warn] Data fetching is changing to a single fetch in React Router v7
2025-01-03 14:32:41 ┃ You can use the v3_singleFetch future flag to opt-in early.
2025-01-03 14:32:41 ┃ -> https://remix.run/docs/en/2.13.1/start/future-flags#v3_singleFetch
2025-01-03 14:32:41 ┗
2025-01-03 14:32:41 fatal: not a git repository (or any parent up to mount point /)
2025-01-03 14:32:41 Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).
2025-01-03 14:33:40 2:33:40 PM [vite] ✨ new dependencies optimized: vite-plugin-node-polyfills/shims/buffer, vite-plugin-node-polyfills/shims/global, vite-plugin-node-polyfills/shims/process, nanostores, js-cookie, chalk, @remix-run/cloudflare, remix-utils/client-only, @radix-ui/react-tooltip, react-toastify, @nanostores/react, ignore, framer-motion, isomorphic-git, isomorphic-git/http/web, @ai-sdk/openai, @webcontainer/api, @ai-sdk/anthropic, @ai-sdk/cohere, @ai-sdk/google, @ai-sdk/mistral, ollama-ai-provider, @openrouter/ai-sdk-provider, remix-island, ai/react, node:path, jszip, file-saver, @octokit/rest, diff, istextorbinary, date-fns, @radix-ui/react-dialog, react-resizable-panels, @codemirror/autocomplete, @codemirror/commands, @codemirror/language, @codemirror/search, @codemirror/state, @codemirror/view, @radix-ui/react-dropdown-menu, @radix-ui/react-context-menu, react-markdown, @uiw/codemirror-theme-vscode, @codemirror/lang-vue, @codemirror/lang-javascript, @codemirror/lang-html, @codemirror/lang-css, @codemirror/lang-sass, @codemirror/lang-json, @codemirror/lang-markdown, @codemirror/lang-wast, @codemirror/lang-python, @codemirror/lang-cpp, rehype-raw, remark-gfm, rehype-sanitize, unist-util-visit, @xterm/addon-fit, @xterm/addon-web-links, @xterm/xterm, shiki, @radix-ui/react-switch
2025-01-03 14:33:40 2:33:40 PM [vite] ✨ optimized dependencies changed. reloading
2025-01-03 14:33:44 INFO LLMManager Registering Provider: Anthropic
2025-01-03 14:33:44 INFO LLMManager Registering Provider: Cohere
2025-01-03 14:33:44 INFO LLMManager Registering Provider: Deepseek
2025-01-03 14:33:44 INFO LLMManager Registering Provider: Google
2025-01-03 14:33:44 INFO LLMManager Registering Provider: Groq
2025-01-03 14:33:44 INFO LLMManager Registering Provider: HuggingFace
2025-01-03 14:33:44 INFO LLMManager Registering Provider: Hyperbolic
2025-01-03 14:33:44 INFO LLMManager Registering Provider: Mistral
2025-01-03 14:33:44 INFO LLMManager Registering Provider: Ollama
2025-01-03 14:33:44 INFO LLMManager Registering Provider: OpenAI
2025-01-03 14:33:44 INFO LLMManager Registering Provider: OpenRouter
2025-01-03 14:33:44 INFO LLMManager Registering Provider: OpenAILike
2025-01-03 14:33:44 INFO LLMManager Registering Provider: Perplexity
2025-01-03 14:33:44 INFO LLMManager Registering Provider: xAI
2025-01-03 14:33:44 INFO LLMManager Registering Provider: Together
2025-01-03 14:33:44 INFO LLMManager Registering Provider: LMStudio
2025-01-03 14:37:58 INFO LLMManager Getting dynamic models for Ollama
2025-01-03 14:37:58 ERROR LLMManager Error getting dynamic models Ollama : TypeError: fetch failed
2025-01-03 14:37:59 ERROR api.chat Error: No models found for provider Ollama
2025-01-03 14:37:59 at Module.streamText (/app/app/lib/.server/llm/stream-text.ts:132:13)
2025-01-03 14:37:59 at processTicksAndRejections (node:internal/process/task_queues:95:5)
2025-01-03 14:37:59 at chatAction (/app/app/routes/api.chat.ts:116:20)
2025-01-03 14:37:59 at Object.callRouteAction (/app/node_modules/.pnpm/@remix-run[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/data.js:36:16)
2025-01-03 14:37:59 at /app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4899:19
2025-01-03 14:37:59 at callLoaderOrAction (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4963:16)
2025-01-03 14:37:59 at async Promise.all (index 0)
2025-01-03 14:37:59 at defaultDataStrategy (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4772:17)
2025-01-03 14:37:59 at callDataStrategyImpl (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4835:17)
2025-01-03 14:37:59 at callDataStrategy (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:3992:19)
2025-01-03 14:37:59 at submit (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:3755:21)
2025-01-03 14:37:59 at queryImpl (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:3684:22)
2025-01-03 14:37:59 at Object.queryRoute (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:3629:18)
2025-01-03 14:37:59 at handleResourceRequest (/app/node_modules/.pnpm/@remix-run[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:402:20)
2025-01-03 14:37:59 at requestHandler (/app/node_modules/.pnpm/@remix-run[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:156:18)
2025-01-03 14:37:59 at /app/node_modules/.pnpm/@remix-run+dev@2.15.0_@remix-run[email protected][email protected]_react@[email protected]_typ_3djlhh3t6jbfog2cydlrvgreoy/node_modules/@remix-run/dev/dist/vite/cloudflare-proxy-plugin.js:70:25
2025-01-03 14:41:19 INFO LLMManager Getting dynamic models for Ollama
2025-01-03 14:41:19 ERROR LLMManager Error getting dynamic models Ollama : TypeError: fetch failed
2025-01-03 14:41:19 ERROR api.chat Error: No models found for provider Ollama
2025-01-03 14:41:19 at Module.streamText (/app/app/lib/.server/llm/stream-text.ts:132:13)
2025-01-03 14:41:19 at processTicksAndRejections (node:internal/process/task_queues:95:5)
2025-01-03 14:41:19 at chatAction (/app/app/routes/api.chat.ts:116:20)
2025-01-03 14:41:19 at Object.callRouteAction (/app/node_modules/.pnpm/@remix-run[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/data.js:36:16)
2025-01-03 14:41:19 at /app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4899:19
2025-01-03 14:41:19 at callLoaderOrAction (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4963:16)
2025-01-03 14:41:19 at async Promise.all (index 0)
2025-01-03 14:41:19 at defaultDataStrategy (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4772:17)
2025-01-03 14:41:19 at callDataStrategyImpl (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:4835:17)
2025-01-03 14:41:19 at callDataStrategy (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:3992:19)
2025-01-03 14:41:19 at submit (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:3755:21)
2025-01-03 14:41:19 at queryImpl (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:3684:22)
2025-01-03 14:41:19 at Object.queryRoute (/app/node_modules/.pnpm/@remix-run[email protected]/node_modules/@remix-run/router/router.ts:3629:18)
2025-01-03 14:41:19 at handleResourceRequest (/app/node_modules/.pnpm/@remix-run[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:402:20)
2025-01-03 14:41:19 at requestHandler (/app/node_modules/.pnpm/@remix-run[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:156:18)
2025-01-03 14:41:19 at /app/node_modules/.pnpm/@remix-run+dev@2.15.0_@remix-run[email protected][email protected]_react@[email protected]_typ_3djlhh3t6jbfog2cydlrvgreoy/node_modules/@remix-run/dev/dist/vite/cloudflare-proxy-plugin.js:70:25

Expected behavior

I would like to be able to send the request and have it work.

Screen Recording / Screenshot

Captura de pantalla 2025-01-03 a las 14 11 04
Captura de pantalla 2025-01-03 a las 14 41 22

Platform

  • OS:
    macOS 15.1 (24B83)
  • Browser:
    Brave Version 1.73.104 Chromium: 131.0.6778.204 (Official Build) (arm64)
    & Safari Version 18.1 (20619.2.8.11.10)

Provider Used

Ollama

Model Used

llama3.2

Additional context

When I use the app with Google Gemini, it works... it sends the requests and works fine...
Captura de pantalla 2025-01-03 a las 15 05 28

@Roninos

Roninos commented Jan 3, 2025

Hi! I have the same thing. It seems to me that those who participate in the development do not want others to use this product. You need to notify the GitHub administration. Earlier versions worked badly, but now they don't work at all.

@rblana

rblana commented Jan 3, 2025

+1

@inaggesDaruom

I had this problem as well. In the docker-compose file, only the production service uses .env.local.
That's why these warnings appear in the console:

WARN[0000] The "GROQ_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "HuggingFace_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "OPENAI_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "ANTHROPIC_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "OPEN_ROUTER_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "GOOGLE_GENERATIVE_AI_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "OLLAMA_API_BASE_URL" variable is not set. Defaulting to a blank string.
WARN[0000] The "TOGETHER_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "TOGETHER_API_BASE_URL" variable is not set. Defaulting to a blank string.
WARN[0000] The "GROQ_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "HuggingFace_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "OPENAI_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "ANTHROPIC_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "OPEN_ROUTER_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "TOGETHER_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "TOGETHER_API_BASE_URL" variable is not set. Defaulting to a blank string.

You need a .env file instead of .env.local. This fixed the problem for me.
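A minimal sketch of that fix, assuming .env.local is already populated:

cp .env.local .env
docker compose --profile development up -d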

@frenchcharly

frenchcharly commented Jan 3, 2025

I'm encountering the same problem

  1. I managed to make it work with pnpm after creating a .env file with OLLAMA_API_BASE_URL=http://127.0.0.1:11434
  2. I haven't been able to make it work with Docker; I get the container running, but I get the same error as OP
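For completeness, a sketch of that working local (non-Docker) setup; note the echo would overwrite an existing .env:

echo 'OLLAMA_API_BASE_URL=http://127.0.0.1:11434' > .env
pnpm install
pnpm run dev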

Also:

  1. when running via pnpm run dev, the created app cannot run via the UI's terminal, as seen below:
image

Ollama installed and up to date with the latest version (tried with Homebrew and the desktop app, same result).

Configuration

  • macOS 14.7.1
  • node v20.10.0
  • npm v10.9.0
  • pnpm v9.15.2
  • Docker v27.3.1
  • Ollama v0.5.4

@rblana

rblana commented Jan 3, 2025

I managed to run bolt.diy with my local llama after I created a .env file with OLLAMA_API_BASE_URL set.
The problem now is that I get a ChatGPT-like response; the code is embedded inline in the response.
I suspect it is a limitation with the context size. I believe I need to add a custom Modelfile or set DEFAULT_NUM_CTX=32768, but I am not sure how to do this.

@juanmcampos

I managed to run bolt.diy with my local llama after I created a .env file with OLLAMA_API_BASE_URL set. The problem now is that I get a ChatGPT-like response; the code is embedded inline in the response. I suspect it is a limitation with the context size. I believe I need to add a custom Modelfile or set DEFAULT_NUM_CTX=32768, but I am not sure how to do this.

Make a file with any name, like modelx, containing:

FROM qwen2.5-coder:14b
PARAMETER num_ctx 32768

then save it and run:

ollama create -f modelx qwen2.5-coder-ctx:14b

After that you will have a model in Ollama named qwen2.5-coder-ctx:14b.
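To verify the parameter took effect, ollama show can print the stored Modelfile, including the PARAMETER line:

ollama show --modelfile qwen2.5-coder-ctx:14b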

@rblana

rblana commented Jan 3, 2025

I remember I heard it here: https://youtu.be/8ommGcs_-VU?si=t2wzC4fOkQ7S6Md-&t=182

@rblana

rblana commented Jan 3, 2025

yup, this fixes it:

Create a new file Modelfile-qwen2.5-coder_32b:

FROM qwen2.5-coder:32b
PARAMETER num_ctx 32768

then run:
ollama create -f Modelfile-qwen2.5-coder_32b qwen2.5-coder-ctx:32b
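As an alternative to baking num_ctx into a new model, bolt.diy's env file exposes a context-size override (the DEFAULT_NUM_CTX name comes from the comment above; a sketch, not verified against every version):

DEFAULT_NUM_CTX=32768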

@sosimtechvaibhav

yup, this fixes it:

Create a new file Modelfile-qwen2.5-coder_32b:

FROM qwen2.5-coder:32b
PARAMETER num_ctx 32768

then run ollama create -f Modelfile-qwen2.5-coder_32b qwen2.5-coder-ctx:32b

Getting the error even after doing this.

@rajinder-yadav

I'm getting the same error. I also tried creating a model using a Modelfile; still the same error.

app-dev-1  |  INFO   LLMManager  Getting dynamic models for Ollama
app-dev-1  |  ERROR   LLMManager  Error getting dynamic models Ollama : TypeError: fetch failed
app-dev-1  |  ERROR   api.chat  Error: No models found for provider Ollama
app-dev-1  |     at Module.streamText (/app/app/lib/.server/llm/stream-text.ts:132:13)
app-dev-1  |     at processTicksAndRejections (node:internal/process/task_queues:95:5)
app-dev-1  |     at chatAction (/app/app/routes/api.chat.ts:116:20)
app-dev-1  |     at Object.callRouteAction (/app/node_modules/.pnpm/@[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/data.js:36:16)
app-dev-1  |     at /app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/router.ts:4899:19
app-dev-1  |     at callLoaderOrAction (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/router.ts:4963:16)
app-dev-1  |     at async Promise.all (index 0)
app-dev-1  |     at defaultDataStrategy (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/router.ts:4772:17)
app-dev-1  |     at callDataStrategyImpl (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/router.ts:4835:17)
app-dev-1  |     at callDataStrategy (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/router.ts:3992:19)
app-dev-1  |     at submit (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/router.ts:3755:21)
app-dev-1  |     at queryImpl (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/router.ts:3684:22)
app-dev-1  |     at Object.queryRoute (/app/node_modules/.pnpm/@[email protected]/node_modules/@remix-run/router/router.ts:3629:18)
app-dev-1  |     at handleResourceRequest (/app/node_modules/.pnpm/@[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:402:20)
app-dev-1  |     at requestHandler (/app/node_modules/.pnpm/@[email protected][email protected]/node_modules/@remix-run/server-runtime/dist/server.js:156:18)
app-dev-1  |     at /app/node_modules/.pnpm/@[email protected]_@[email protected][email protected][email protected][email protected]_typ_3djlhh3t6jbfog2cydlrvgreoy/node_modules/@remix-run/dev/dist/vite/cloudflare-proxy-plugin.js:70:25

The model I created is running and working from the terminal just fine.

$ ollama ps
NAME                    ID              SIZE      PROCESSOR          UNTIL                   
llama3.2-bolt:latest    5f8672eff6ca    8.4 GB    61%/39% CPU/GPU    About a minute from now    

I am on openSUSE TW Linux.
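One Linux-specific note (an assumption from standard Docker behavior, not something confirmed in this thread): host.docker.internal does not resolve by default on Linux, so if OLLAMA_API_BASE_URL points there, the dev service needs a host-gateway mapping in docker-compose.yaml:

extra_hosts:
  - "host.docker.internal:host-gateway"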

@sosimtechvaibhav

After doing all the steps, this worked:

Remove node modules and lock files:
rm -rf node_modules pnpm-lock.yaml

Clear the pnpm cache:
pnpm store prune
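Then reinstall; a sketch, assuming the pnpm flow used elsewhere in this thread:

pnpm install
pnpm run dev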

Hope it helps.
