Implement settings file for Ollama agent #307
Conversation
@StevenBucher98 I have just discovered the OllamaSharp NuGet package, which would simplify the agent code and provide an easy streaming service. Would you consider switching to it? This is how easy it is to set up a session and start streaming:

var _ollama = new OllamaApiClient("http://localhost:11434");
var _models = await _ollama.ListLocalModelsAsync();
// Guard against an empty model list to avoid a null reference exception.
_ollama.SelectedModel = _models.FirstOrDefault()?.Name
    ?? throw new InvalidOperationException("No local Ollama models found.");
await foreach (var stream in _ollama.GenerateAsync("How to list files with PowerShell?"))
{
    Console.Write(stream?.Response);
}

Edit: I have the PR in the works, with streaming and context support.
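For multi-turn context, OllamaSharp also ships a Chat helper that keeps conversation state between prompts; a minimal sketch using the same client (the prompts here are illustrative):

var chat = new Chat(_ollama);
await foreach (var token in chat.SendAsync("How to list files with PowerShell?"))
{
    Console.Write(token);
}
// A follow-up question reuses the accumulated conversation context.
await foreach (var token in chat.SendAsync("Only hidden files, please."))
{
    Console.Write(token);
}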
OllamaSharp uses the MIT license, so it's totally fine to use it for this agent. I will close this PR in favor of the other one.
Important:
I created another PR that utilizes the OllamaSharp library, featuring context, streaming, and settings support. I strongly recommend reviewing and merging #310. However, if you are unable to use third-party libraries for any reason, please consider this PR instead.
PR Summary
This PR implements a basic settings file for the Ollama agent:
$HOME\.aish\agent-config\ollama\ollama.config.json
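The schema isn't spelled out in this thread; a plausible sketch of the file's contents, assuming one key each for the model and the endpoint (both key names and the model value are hypothetical):

{
  "Model": "phi3",
  "Endpoint": "http://localhost:11434"
}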
PR Context
This PR allows the user to specify a custom Ollama model and endpoint location instead of the hardcoded values.
This partially implements #155 (no history or streaming support).
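As a rough illustration of how an agent could read such a file with System.Text.Json (the OllamaConfig type, property names, and defaults below are assumptions, not the PR's actual code):

using System;
using System.IO;
using System.Text.Json;

// Hypothetical settings shape; property names are assumptions.
public sealed class OllamaConfig
{
    public string Model { get; set; } = "phi3";
    public string Endpoint { get; set; } = "http://localhost:11434";
}

public static class OllamaConfigLoader
{
    public static OllamaConfig Load()
    {
        // Resolves to $HOME\.aish\agent-config\ollama\ollama.config.json.
        string path = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.UserProfile),
            ".aish", "agent-config", "ollama", "ollama.config.json");

        // Fall back to defaults when no settings file exists yet.
        if (!File.Exists(path))
            return new OllamaConfig();

        return JsonSerializer.Deserialize<OllamaConfig>(File.ReadAllText(path))
               ?? new OllamaConfig();
    }
}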
@StevenBucher98: This is quick and dirty, but I needed the settings file ASAP to test different models without constantly recompiling the agent. I'm open to suggestions.