
Fix not working with Gemini models #2021

Open
wants to merge 5 commits into main

Conversation


lh0x00 commented Nov 9, 2024

Description

This PR does the following:

  • Fixed the missing gemini entry in the LLM provider check.
  • Handled the case where the model returns JSON wrapped in a code block.
  • Fixed the Gemini model not working: removed the unnecessary noop tool and updated the function that parses function calls returned by Gemini (based on their source code/docs).

All changes were tested; my app works properly again.
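The second bullet can be sketched as follows. This is a minimal, hypothetical helper (the name `extract_json` and the regex are assumptions, not the PR's actual code): Gemini often wraps its JSON answer in a Markdown code fence, so the fence must be stripped before `json.loads` is called.

```python
import json
import re

def extract_json(text: str) -> dict:
    """Parse JSON from a model response, stripping an optional
    Markdown code fence (```json ... ```) first.

    Hypothetical illustration of the fix; the PR's actual
    implementation may differ.
    """
    match = re.search(r"```(?:json)?\s*(.*?)\s*```", text, re.DOTALL)
    if match:
        text = match.group(1)
    return json.loads(text)
```

Without such handling, `json.loads` fails with a `JSONDecodeError` on the literal backticks.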

Type of change

Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)

@CLAassistant

CLAassistant commented Nov 9, 2024

CLA assistant check
All committers have signed the CLA.

@lh0x00
Author

lh0x00 commented Nov 9, 2024

@Dev-Khant Without this fix, Gemini is not usable at the moment; I hope you can review and merge it soon if possible.

@spike-spiegel-21 spike-spiegel-21 self-assigned this Nov 9, 2024
Review thread on mem0/llms/configs.py (outdated, resolved)
@spike-spiegel-21 spike-spiegel-21 added the lgtm This PR has been approved by a maintainer label Nov 9, 2024
@lh0x00 changed the title from "Add missing LLM providers" to "Fix not working with Gemini models" Nov 9, 2024
@lh0x00
Author

lh0x00 commented Nov 9, 2024

@spike-spiegel-21 @Dev-Khant While working with the library I also resolved some related bugs so that Gemini works with mem0 as documented.

Review threads on mem0/llms/gemini.py (outdated, resolved)
elif self.llm_provider == "gemini":
    # The `noop` function should be removed because it is unnecessary
    # and causes the error: "should be non-empty for OBJECT type"
    _tools = [UPDATE_MEMORY_TOOL_GRAPH, ADD_MEMORY_TOOL_GRAPH]
Author


Adding a noop function here seems unnecessary, and it triggers an error from the Gemini client library saying that the object's properties cannot be empty.
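To illustrate the failure mode: a parameterless noop tool declares an OBJECT-typed `parameters` schema with an empty `properties` map, which Gemini's schema validation rejects ("should be non-empty for OBJECT type"). The dict shape and the validator below are a simplified stand-in for the SDK's actual check, not its real code.

```python
# Hypothetical illustration of the tool declaration that Gemini rejects.
NOOP_TOOL = {
    "name": "noop",
    "description": "Do nothing.",
    "parameters": {"type": "object", "properties": {}},  # empty -> rejected
}

def is_valid_for_gemini(tool: dict) -> bool:
    """Minimal stand-in for the validation Gemini's SDK performs:
    an OBJECT-typed parameters schema must have non-empty properties."""
    params = tool.get("parameters", {})
    if params.get("type") == "object" and not params.get("properties"):
        return False
    return True
```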

Member


Hey @lh0x00, we want to keep the noop operation, since it is present for all the LLMs. Is there a workaround for this? Removing it would be inconsistent.
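One possible workaround (an assumption on my part, not what this PR implements) would be to keep the noop tool but give it a single dummy parameter, so the OBJECT schema is non-empty and passes Gemini's validation while remaining a no-op:

```python
# Hypothetical Gemini-compatible noop declaration: the "reason" parameter
# exists only to make the properties map non-empty for schema validation.
NOOP_TOOL_GEMINI = {
    "name": "noop",
    "description": "No operation; call when no memory change is needed.",
    "parameters": {
        "type": "object",
        "properties": {
            "reason": {
                "type": "string",
                "description": "Why no action is taken.",
            },
        },
    },
}
```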

@lh0x00
Author

lh0x00 commented Nov 15, 2024

@Dev-Khant I pushed an update to fix the failing tests.

@lh0x00
Author

lh0x00 commented Dec 31, 2024

Is this PR still under consideration, my friends? @Dev-Khant @spike-spiegel-21

Member

@Dev-Khant left a comment


@lh0x00 Why is there a need to strip the code block from the generated response? Can you please elaborate on this?
