Describe the bug
When full file upload is enabled in an agent flow, I can upload a file and ask questions about it.
However, if the resulting context is too large, nothing is surfaced in the chat, i.e. no error and no message. The chat just stops with no response. In the logs, however, I can see the error.
To Reproduce
Steps to reproduce the behavior:
Create an agent flow with a supervisor, a worker, and their dependencies
Configure the agent flow to enable full file upload
Upload a large text file to the chat
Ask a question about it
Expected behavior
If an error is raised internally, it should also be surfaced in the chat so that the user is aware there is a technical issue.
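One way to achieve this, sketched below under the assumption that the flow execution can be wrapped at a single call site, is to catch any internal error and convert it into a chat-visible message instead of silently aborting. The `runAgentFlow` wrapper name is hypothetical, not an existing Flowise API:

```typescript
// Hypothetical wrapper: surface internal agent-flow errors to the chat
// instead of swallowing them. `invoke` stands in for whatever function
// actually executes the LangGraph/supervisor flow.
async function runAgentFlow(invoke: () => Promise<string>): Promise<string> {
  try {
    return await invoke();
  } catch (err) {
    // Forward the error text as the chat response so the user sees
    // that something went wrong, rather than getting no reply at all.
    const message = err instanceof Error ? err.message : String(err);
    return `An internal error occurred: ${message}`;
  }
}
```

With such a wrapper, the "string too long" error from the logs would appear in the chat as the assistant's reply rather than leaving the chat empty.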
Screenshots
As seen in the chat, there is no response:
But there is a "string too long" error in the logs:
2024-12-17 15:42:42 [VERBOSE]: [llm/error] [1:chain:LangGraph > 3:chain:supervisor > 4:chain:RunnableLambda > 5:chain:RunnableSequence > 8:llm:ChatOpenAI] [3.12s] LLM run errored with error: "400 Invalid 'messages[1].content': string too long. Expected a string with maximum length 1048576, but got a string with length 8045636 instead.\n\nError: 400 Invalid 'messages[1].content': string too long. Expected a string with maximum length 1048576, but got a string with length 8045636 instead.\n at APIError.generate (/usr/local/lib/node_modules/flowise/node_modules/openai/error.js:45:20)\n at OpenAI.makeStatusError (/usr/local/lib/node_modules/flowise/node_modules/openai/core.js:293:33)\n at OpenAI.makeRequest (/usr/local/lib/node_modules/flowise/node_modules/openai/core.js:337:30)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async /usr/local/lib/node_modules/flowise/node_modules/@langchain/openai/dist/chat_models.cjs:1558:29\n at async RetryOperation._fn (/usr/local/lib/node_modules/flowise/node_modules/p-retry/index.js:50:12)"
2024-12-17 15:42:42 [VERBOSE]: [chain/error] [1:chain:LangGraph > 3:chain:supervisor > 4:chain:RunnableLambda > 5:chain:RunnableSequence] [3.56s] Chain run errored with error: "400 Invalid 'messages[1].content': string too long. Expected a string with maximum length 1048576, but got a string with length 8045636 instead.\n\nError: 400 Invalid 'messages[1].content': string too long. Expected a string with maximum length 1048576, but got a string with length 8045636 instead.\n at APIError.generate (/usr/local/lib/node_modules/flowise/node_modules/openai/error.js:45:20)\n at OpenAI.makeStatusError (/usr/local/lib/node_modules/flowise/node_modules/openai/core.js:293:33)\n at OpenAI.makeRequest (/usr/local/lib/node_modules/flowise/node_modules/openai/core.js:337:30)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async /usr/local/lib/node_modules/flowise/node_modules/@langchain/openai/dist/chat_models.cjs:1558:29\n at async RetryOperation._fn (/usr/local/lib/node_modules/flowise/node_modules/p-retry/index.js:50:12)"
2024-12-17 15:42:42 [VERBOSE]: [chain/error] [1:chain:LangGraph > 3:chain:supervisor > 4:chain:RunnableLambda] [3.61s] Chain run errored with error: "Aborted!\n\nError: Aborted!\n at agentNode (/usr/local/lib/node_modules/flowise/node_modules/flowise-components/dist/nodes/multiagents/Supervisor/Supervisor.js:614:15)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async RunnableLambda.supervisorNode [as func] (/usr/local/lib/node_modules/flowise/node_modules/flowise-components/dist/nodes/multiagents/Supervisor/Supervisor.js:582:57)\n at async /usr/local/lib/node_modules/flowise/node_modules/@langchain/langgraph/node_modules/@langchain/core/dist/runnables/base.cjs:1531:34"
2024-12-17 15:42:42 [VERBOSE]: [chain/error] [1:chain:LangGraph > 3:chain:supervisor] [3.69s] Chain run errored with error: "Aborted!\n\nError: Aborted!\n at agentNode (/usr/local/lib/node_modules/flowise/node_modules/flowise-components/dist/nodes/multiagents/Supervisor/Supervisor.js:614:15)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async RunnableLambda.supervisorNode [as func] (/usr/local/lib/node_modules/flowise/node_modules/flowise-components/dist/nodes/multiagents/Supervisor/Supervisor.js:582:57)\n at async /usr/local/lib/node_modules/flowise/node_modules/@langchain/langgraph/node_modules/@langchain/core/dist/runnables/base.cjs:1531:34"
2024-12-17 15:42:42 [VERBOSE]: [chain/error] [1:chain:LangGraph] [3.97s] Chain run errored with error: "Aborted!\n\nError: Aborted!\n at agentNode (/usr/local/lib/node_modules/flowise/node_modules/flowise-components/dist/nodes/multiagents/Supervisor/Supervisor.js:614:15)\n at process.processTicksAndRejections (node:internal/process/task_queues:95:5)\n at async RunnableLambda.supervisorNode [as func] (/usr/local/lib/node_modules/flowise/node_modules/flowise-components/dist/nodes/multiagents/Supervisor/Supervisor.js:582:57)\n at async /usr/local/lib/node_modules/flowise/node_modules/@langchain/langgraph/node_modules/@langchain/core/dist/runnables/base.cjs:1531:34"
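The logs show OpenAI rejecting the request because a message's content exceeds its 1,048,576-character limit (the uploaded file produced 8,045,636 characters). A minimal pre-flight guard, sketched here as an assumption about how one might validate the content before calling the model (not existing Flowise code), could detect this before the request is sent and report it clearly:

```typescript
// Character limit observed in the OpenAI 400 error above.
const MAX_CONTENT_LENGTH = 1_048_576;

// Hypothetical guard: check the assembled message content before sending
// it to the model, so an oversized upload fails fast with a clear message.
function checkContentLength(content: string, maxLen: number = MAX_CONTENT_LENGTH): void {
  if (content.length > maxLen) {
    throw new Error(
      `Uploaded content is too long for the model: ${content.length} characters ` +
      `(maximum ${maxLen}). Try a smaller file or a document-retrieval flow instead.`
    );
  }
}
```

Combined with error forwarding to the chat, this would turn the silent failure into an actionable message for the user.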
bodzebod changed the title from "[BUG] uploading a too large file is not raised in the chat as an error" to "[BUG] uploading a too large file as full file upload is not raised in the chat as an error" on Dec 17, 2024.
Flow
Test Agent Agents (1).json
Setup
Additional context