Cannot Set Model Tokens for Local Model with OpenAI API Format #810
Comments
Is qwen from OpenAI?
Nope, but if I have a local model loaded with vLLM, which uses the OpenAI format, is that the right way to load it? I am referring to #721
graph_config = { Update: tested this configuration and it cannot generate output; it fails with openai.BadRequestError: Error code: 400 - {'error': {'code': 400, 'message': 'the request exceeds the available context size. try increasing the context size or enable context shift', 'type': 'invalid_request_error'}}
client = ChatOpenAI( graph_config = { result = smart_scraper_graph.run() Update: tested this and it cannot generate output either.
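The 400 error above indicates the serving backend's context window is smaller than the prompt being sent. Assuming the model is served with vLLM as described earlier in the thread, the window can be raised at launch time (a sketch, not a verified fix for this issue; the model name, port, and length below are illustrative placeholders, and `--max-model-len` cannot exceed what the model actually supports):

```shell
# Serve a local model behind vLLM's OpenAI-compatible API with a
# larger context window. Model name and length are placeholders.
vllm serve Qwen/Qwen2-7B-Instruct --max-model-len 32768 --port 8000
```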
Describe the bug
I have my own local model served with vLLM, so it uses the OpenAI API format. When I tried it with
SmartScraperGraph
, I got the error: TypeError: Completions.create() got an unexpected keyword argument 'model_tokens'
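For context, a config along these lines is what such setups typically look like (a minimal sketch; the `base_url`, model name, and token limit are assumed placeholder values, not taken from this issue, and the filtering workaround is a hypothetical mitigation, not a confirmed fix):

```python
# Hypothetical graph_config for a local vLLM server exposing an
# OpenAI-compatible API. All values below are illustrative placeholders.
graph_config = {
    "llm": {
        "api_key": "EMPTY",                      # vLLM ignores the key by default
        "model": "openai/Qwen2-7B-Instruct",     # assumed local model name
        "base_url": "http://localhost:8000/v1",  # vLLM's default OpenAI endpoint
        "model_tokens": 8192,                    # the kwarg the traceback complains about
    },
    "verbose": True,
}

# The TypeError suggests 'model_tokens' is being forwarded verbatim to
# Completions.create(), which does not accept it. One workaround is to
# strip it before the dict reaches the OpenAI client:
llm_kwargs = {k: v for k, v in graph_config["llm"].items() if k != "model_tokens"}
```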
To Reproduce
Here is the error
Expected behavior
The result return as expected