Using AWS Bedrock as the LLM provider, the tool call that creates a document fails, and the response body of the browser network request shows the message `An error occurred`.

I don't have this problem when using OpenAI as the provider. Screenshots of the actual usage scenario are attached.

It is also worth mentioning that, apart from switching to the provider registry in /lib/ai/index.ts, there are no other code changes, and the backend code is unchanged. From the screenshots, the model does respond to the tool call that creates a document and renders the document content in the UI, but a bug seems to occur in the second stage: sending the tool-call results back to the LLM to generate the follow-up summary response. The bug is largely reproducible as long as you also use the same latest AWS Bedrock Claude 3.5 model as the LLM provider.
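For context, the registry side of the change looks roughly like this (a minimal sketch, not the exact file — the provider construction, credential wiring, and model id are assumptions):

```ts
// /lib/ai/registry.ts — minimal sketch; credential wiring and model id are assumptions.
import { experimental_createProviderRegistry as createProviderRegistry } from 'ai';
import { createAmazonBedrock } from '@ai-sdk/amazon-bedrock';
import { openai } from '@ai-sdk/openai';

const bedrock = createAmazonBedrock({
  region: process.env.AWS_REGION,
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
});

export const registry = createProviderRegistry({
  bedrock, // resolved via ids like 'bedrock:anthropic.claude-3-5-sonnet-20241022-v2:0'
  openai,
});
```

The `provider` field in models.ts has to match the registry key (`bedrock` here), since customModel below builds the lookup id as `${model.provider}:${model.apiIdentifier}`.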
/lib/ai/index.ts:

```ts
import { experimental_wrapLanguageModel as wrapLanguageModel } from 'ai';

import { customMiddleware } from './custom-middleware';
import { models } from './models';
import { registry } from './registry';

export const customModel = (apiIdentifier: string) => {
  const model = models.find((m) => m.apiIdentifier === apiIdentifier);

  if (!model) {
    throw new Error(`Model with apiIdentifier ${apiIdentifier} not found`);
  }

  return wrapLanguageModel({
    model: registry.languageModel(`${model.provider}:${model.apiIdentifier}`),
    middleware: customMiddleware,
  });
};
```
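For anyone trying to reproduce: the failure happens at the point where the tool result is fed back to the model, i.e. the multi-step part of the streamText call in the chat route. A rough sketch of that flow (not the actual template code — the route path, tool definition, and model id here are assumptions):

```ts
// app/(chat)/api/chat/route.ts — rough sketch; tool definition and model id are assumptions.
import { convertToCoreMessages, streamText, tool } from 'ai';
import { z } from 'zod';
import { customModel } from '@/lib/ai';

export async function POST(request: Request) {
  const { messages } = await request.json();

  const result = await streamText({
    model: customModel('anthropic.claude-3-5-sonnet-20241022-v2:0'),
    messages: convertToCoreMessages(messages),
    tools: {
      createDocument: tool({
        description: 'Create a document with the given title',
        parameters: z.object({ title: z.string() }),
        execute: async ({ title }) => ({ id: crypto.randomUUID(), title }),
      }),
    },
    // After execute() returns, the tool result is sent back to the model for the
    // follow-up summary response — the second stage where the Bedrock run errors out.
    maxSteps: 5,
  });

  return result.toDataStreamResponse();
}
```

With OpenAI this round trip completes normally; with the Bedrock Claude 3.5 model the stream ends with `An error occurred` instead of the summary.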