
Deepseek not working on existing projects #956

Open
clickway opened this issue Dec 31, 2024 · 21 comments

@clickway

clickway commented Dec 31, 2024

Describe the bug

I initially thought it was a problem with the API key, but after running some tests I can see the issue comes from the existing project I'm working on. If I start a new project from scratch, I don't encounter this issue.


A bit of background: I started the project on Bolt.new, moved to the local version of Bolt.diy, and continued the work using the Google Gemini 2.0 API, which worked relatively well. Today I tried the newest DeepSeek v3 (both coder and chat) and it doesn't work; I get this error when sending a prompt:

(screenshot of the error attached)

Link to the Bolt URL that caused the error

localhost app

Steps to reproduce

  1. Load a current project you are working on via Bolt.diy
  2. Change the LLM to DeepSeek API
  3. Type any request, or simply 'test'
  4. Error shown via the toast message (see screenshot)

Expected behavior

to work?

Screen Recording / Screenshot

deepseek_error

Platform

  • OS: [macOS]
  • Browser: [Chrome]
  • Version: [latest]

Provider Used

No response

Model Used

DeepSeek v3 Chat / Coder, both via direct API and via OpenRouter

Additional context

No response

@tasktuner

Same issue here

@swatchie-1

yes facing the same issue. Thanks for bringing it up.

@electroheadfx

electroheadfx commented Jan 1, 2025

Same issue. Is it a problem with the API URL? I tried https://api.deepseek.com/beta and also https://api.deepseek.com

@dev1ender

> Same issue. Is it a problem with the API URL? I tried https://api.deepseek.com/beta and also https://api.deepseek.com

I don't think that works either; I tried this method but the error remains.

@mbenhard

mbenhard commented Jan 1, 2025

Same issue here. I think it might be due to the low input context window (64K when using their API). I might be wrong though.

Also noticed that by default DeepSeek caps the maximum output length at 4K, and the max_tokens value needs to be set explicitly; not sure how it is configured in bolt.diy.

@dev1ender

dev1ender commented Jan 1, 2025

Reply @mbenhard

Actually the configuration seems right: the DeepSeek docs say the output context length is 8K, not 4K, and the bolt.diy configuration is correctly set to 8K.

path: app/lib/modules/llm/providers/deepseek.ts

  staticModels: ModelInfo[] = [
    { name: 'deepseek-coder', label: 'Deepseek-Coder', provider: 'Deepseek', maxTokenAllowed: 8000 },
    { name: 'deepseek-chat', label: 'Deepseek-Chat', provider: 'Deepseek', maxTokenAllowed: 8000 },
  ];

Not sure if it's because of that.
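For what it's worth, a `maxTokenAllowed`-style ceiling would normally surface as the `max_tokens` field of an OpenAI-compatible chat-completions request, which is the API shape DeepSeek exposes. A minimal sketch of that mapping — `buildDeepseekRequest` is a hypothetical helper for illustration, not bolt.diy code:

```typescript
// Hedged sketch: how a configured output ceiling maps onto the
// `max_tokens` field of an OpenAI-compatible DeepSeek request body.
// `buildDeepseekRequest` is hypothetical, not part of bolt.diy.

interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

function buildDeepseekRequest(
  model: string,
  messages: ChatMessage[],
  maxTokenAllowed: number,
) {
  return {
    // DeepSeek's OpenAI-compatible chat-completions endpoint.
    url: 'https://api.deepseek.com/chat/completions',
    body: {
      model,
      messages,
      // Set explicitly: per the discussion above, the default output
      // cap is lower (4K) than the documented maximum (8K).
      max_tokens: maxTokenAllowed,
      stream: true,
    },
  };
}

const req = buildDeepseekRequest(
  'deepseek-chat',
  [{ role: 'user', content: 'test' }],
  8000,
);
console.log(req.body.max_tokens); // 8000
```

If bolt.diy ever dropped or mis-forwarded this field, the provider would silently fall back to its lower default, which is worth checking in the actual request payload via the browser dev tools.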

@dwlynch055

same issue here.

@clickway

clickway commented Jan 2, 2025

> Reply @mbenhard
>
> Actually the configuration seems right: the DeepSeek docs say the output context length is 8K, not 4K, and the bolt.diy configuration is correctly set to 8K.
>
> path: app/lib/modules/llm/providers/deepseek.ts
>
>       staticModels: ModelInfo[] = [
>         { name: 'deepseek-coder', label: 'Deepseek-Coder', provider: 'Deepseek', maxTokenAllowed: 8000 },
>         { name: 'deepseek-chat', label: 'Deepseek-Chat', provider: 'Deepseek', maxTokenAllowed: 8000 },
>       ];
>
> Not sure if it's because of that.

Look, it works if you start a new prompt chat from scratch, and it's quite fast compared to Google Gemini. But no matter what you try, on an existing project — or a new project where you import a folder — it fails to do anything.

@tasktuner

Yep, same here @clickway. I think that's the main issue — it doesn't happen with new code.

@electroheadfx

> new prompt chat from scratch

It didn't work for me even from a new prompt chat from scratch.

@elivonai91

same issue here

@Mistes974

Is there any workaround?

@bilalinamdar

bilalinamdar commented Jan 3, 2025

I also purchased a top-up from DeepSeek, and after entering the API key I receive the same error. Not able to bypass it.
(screenshot attached)

@komax74

komax74 commented Jan 3, 2025

I am in the same situation: if I create a new DeepSeek project, it works perfectly. However, I continuously receive a timeout error if I try to modify a project created on bolt.new and exported (it works perfectly in preview, but I get an error with everything I write when trying to edit it). I also tried ChatGPT-4, same issue; only Gemini works.
Please help us solve this problem — I have spent a lot of tokens on bolt.new and it would be great to continue the project.

@kyo-1

kyo-1 commented Jan 3, 2025

Only a new DeepSeek project works properly.

@electroheadfx

> I am in the same situation: if I create a new DeepSeek project, it works perfectly. However, I continuously receive a timeout error if I try to modify a project created on bolt.new and exported (it works perfectly in preview, but I get an error with everything I write when trying to edit it). I also tried ChatGPT-4, same issue; only Gemini works. Please help us solve this problem — I have spent a lot of tokens on bolt.new and it would be great to continue the project.

@komax74 @kyo-1 How do you do that? A new DeepSeek project gives me the error too.

@shivansh488

In my case it does not work in any of the cases specified above.

@Kingjam99

Same issue

@3assemm

3assemm commented Jan 5, 2025

I am facing the same problem with all other models except Google. Even when using Google, after exceeding the 8K token limit it throws that error and tries to switch to an Anthropic LLM:

    app-dev-1 | INFO api.chat Reached max token limit (8000): Continuing message (1 switches left)
    2025-01-05 16:27:59 app-dev-1 | INFO stream-text Sending llm call to Anthropic with model claude-3-5-sonnet-latest

(screenshot attached)
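The pattern across this thread — fresh chats work, imported or existing projects fail or time out — would be consistent with the project's full file context overflowing the provider's input window before the output limit is even reached. A rough sanity check of that hypothesis, assuming DeepSeek's ~64K input window mentioned above and a crude ~4-characters-per-token heuristic (both are assumptions, not bolt.diy internals):

```typescript
// Crude heuristic sketch (not bolt.diy code): estimate whether a project
// snapshot fits a provider's context window. The 4-chars-per-token ratio
// is a rough rule of thumb, not a real tokenizer.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

function fitsContextWindow(
  files: string[],
  contextWindow: number,
  reservedForOutput: number,
): boolean {
  // Sum the estimated prompt tokens over all files sent as context,
  // then leave room for the reserved output budget.
  const promptTokens = files.reduce((sum, f) => sum + estimateTokens(f), 0);
  return promptTokens + reservedForOutput <= contextWindow;
}

// A tiny new chat easily fits a 64K window with 8K reserved for output...
console.log(fitsContextWindow(['console.log("test")'], 64000, 8000)); // true

// ...while a large imported project may not (~75K estimated tokens here).
const bigFile = 'x'.repeat(300000);
console.log(fitsContextWindow([bigFile], 64000, 8000)); // false
```

If this is the cause, providers with much larger input windows (like Gemini) succeeding while DeepSeek and others fail on the same imported project is exactly what you would expect to see.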

@RaphPi

RaphPi commented Jan 5, 2025

Is this linked to issue #820?

@abdellatiff

Same issue here, but I get no error message at all. I import a project either from a file or from git (the project was originally on bolt.new), but nothing happens when I write any prompt — it keeps loading forever. It works in a new chat, though.
