
Failure in Calling Tool to Create Document with AWS Bedrock as LLM Provider #659

Open
Henderson11 opened this issue Dec 26, 2024 · 1 comment

Comments

@Henderson11

Using AWS Bedrock as the LLM provider, calling the tool to create a document fails, and the browser's network response shows the message `An error occurred`. The problem does not occur when using OpenAI as the provider. Screenshot of the actual scenario:

[Screenshot: the document is created and rendered in the UI, but the follow-up response fails with "An error occurred"]

It is also worth mentioning that, apart from switching `/lib/ai/index.ts` to use the provider registry as the model provider, there are no other code changes; the back-end code is also unchanged. From the screenshot, the model does respond to the tool call to create a document and renders the document content in the UI, but a bug seems to occur in the second stage: sending the tool-call results back to the LLM to generate a further summary response. The bug is largely reproducible as long as you also use the same latest AWS Bedrock Claude 3.5 model as the LLM provider:
`/lib/ai/index.ts`:

```ts
import { experimental_wrapLanguageModel as wrapLanguageModel } from 'ai';

import { customMiddleware } from './custom-middleware';
import { models } from './models';
import { registry } from './registry';

export const customModel = (apiIdentifier: string) => {
  const model = models.find((m) => m.apiIdentifier === apiIdentifier);
  if (!model) {
    throw new Error(`Model with apiIdentifier ${apiIdentifier} not found`);
  }

  return wrapLanguageModel({
    model: registry.languageModel(`${model.provider}:${model.apiIdentifier}`),
    middleware: customMiddleware,
  });
};
```
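One detail worth noting about the `${model.provider}:${model.apiIdentifier}` key passed to `registry.languageModel(...)`: the Bedrock model identifiers themselves contain colons (e.g. `...-v2:0`), so the registry must split the key on the first colon only. The stand-in below is a minimal self-contained sketch of that parsing, not the AI SDK's actual registry implementation; the function name `parseModelKey` is hypothetical:

```typescript
// Hypothetical stand-in illustrating how a "provider:modelId" key must be
// parsed. Bedrock model IDs contain colons (e.g. "...-v2:0"), so splitting
// on every colon would truncate the model ID; split on the FIRST colon only.
type ModelKey = { provider: string; modelId: string };

function parseModelKey(key: string): ModelKey {
  const sep = key.indexOf(':');
  if (sep === -1) {
    throw new Error(`Invalid model key: ${key}`);
  }
  return {
    provider: key.slice(0, sep),
    modelId: key.slice(sep + 1),
  };
}

console.log(
  parseModelKey('bedrock:us.anthropic.claude-3-5-sonnet-20241022-v2:0'),
);
```

If this splitting were done with `key.split(':')` and only the first two pieces kept, the `:0` suffix of the Bedrock model ID would be lost, which is one easy way a Bedrock setup can break while an OpenAI setup (whose model IDs contain no colons) keeps working.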

@Henderson11
Author

```ts
// Define your models here.

export interface Model {
  id: string;
  label: string;
  apiIdentifier: string;
  description: string;
  provider: 'openai' | 'bedrock';
}

export const models: Array<Model> = [
  {
    id: 'gpt-4o-mini',
    label: 'GPT 4o mini',
    apiIdentifier: 'gpt-4o-mini',
    provider: 'openai',
    description: 'Small model for fast, lightweight tasks',
  },
  {
    id: 'gpt-4o',
    label: 'GPT 4o',
    apiIdentifier: 'gpt-4o',
    provider: 'openai',
    description: 'For complex, multi-step tasks',
  },
  {
    id: 'claude-3.5-sonnet',
    label: 'Claude 3.5 Sonnet',
    apiIdentifier: 'us.anthropic.claude-3-5-sonnet-20241022-v2:0',
    provider: 'bedrock',
    description: "Anthropic's advanced model for complex tasks",
  },
  {
    id: 'claude-3.5-haiku',
    label: 'Claude 3.5 Haiku',
    apiIdentifier: 'us.anthropic.claude-3-5-haiku-20241022-v1:0',
    provider: 'bedrock',
    description: 'Fast-responding lightweight model',
  },
] as const;

export const DEFAULT_MODEL_NAME: string = 'gpt-4o-mini';
```
The code above is the content of `/lib/ai/models.ts`.
