Tool usage with locally served LLMs #8675
Unanswered
pramodatre
asked this question in
Questions
Replies: 0 comments
I'm trying to follow this notebook for using tools with Haystack: https://colab.research.google.com/github/deepset-ai/haystack-cookbook/blob/main/notebooks/tools_support.ipynb
I'm using a locally served LLM, Mistral 7B (served using SGLang) instead of OpenAI models. The pipelines work but the tools are not getting invoked as expected.
This may be a naive question, but I was wondering whether I can get this example working with a locally served LLM.
Any inputs will be very helpful! Thanks a lot in advance!
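For context, here is a minimal sketch of what I'm trying to do. Since SGLang exposes an OpenAI-compatible API, my understanding is that I should be able to send an OpenAI-style `tools` array to the local `/v1/chat/completions` endpoint. The base URL, port, model id, and the `get_weather` tool are placeholders from my setup, not from the notebook:

```python
import json

# Assumption: SGLang's OpenAI-compatible server is running locally;
# the port is whatever was passed to `python -m sglang.launch_server`.
BASE_URL = "http://localhost:30000/v1"

# OpenAI-style tool schema for a toy weather tool (placeholder example).
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

# Request body in the OpenAI chat-completions format.
payload = {
    "model": "mistral-7b",  # assumption: the model id the server was launched with
    "messages": [{"role": "user", "content": "What's the weather in Berlin?"}],
    "tools": tools,
    "tool_choice": "auto",  # let the model decide whether to call the tool
}

body = json.dumps(payload)

# To actually send the request (requires the SGLang server to be running):
# import urllib.request
# req = urllib.request.Request(
#     f"{BASE_URL}/chat/completions",
#     data=body.encode("utf-8"),
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode("utf-8"))
```

The pipeline runs, but the model's responses come back as plain text rather than `tool_calls`, so the tool is never invoked.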