Use your locally running AI models to assist you in your web browsing
Updated Dec 29, 2024 - TypeScript
A generalized information-seeking agent system with Large Language Models (LLMs).
[ICML 2024] SqueezeLLM: Dense-and-Sparse Quantization
[NeurIPS 2024] KVQuant: Towards 10 Million Context Length LLM Inference with KV Cache Quantization
MVP of an idea using multiple local LLM models to simulate and play D&D
Chat with your PDF using your local LLM via the Ollama client. (Incomplete.)
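A PDF-chat tool like the one above typically splits the extracted document text into overlapping chunks before feeding context to a local LLM. The sketch below shows only that generic chunking step; the function and parameter names are illustrative and not taken from any of these repositories.

```python
# Minimal sketch of the text-chunking step a local PDF-chat tool
# commonly performs so each piece fits the model's context window.
# chunk_size/overlap values are illustrative defaults, not from the repo.

def chunk_text(text: str, chunk_size: int = 500, overlap: int = 100) -> list[str]:
    """Split text into overlapping chunks of at most chunk_size characters."""
    chunks = []
    step = chunk_size - overlap  # advance less than chunk_size to keep overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
        if start + chunk_size >= len(text):
            break
    return chunks

# Example: 1200 characters with 500-char chunks and 100-char overlap
# yields three chunks covering the whole text.
parts = chunk_text("x" * 1200)
```

The overlap keeps sentences that straddle a chunk boundary visible in both neighboring chunks, which helps retrieval quality when the chunks are later embedded and searched.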
A local chatbot for managing docs
The client for the Symmetry peer-to-peer inference network, enabling users to connect with each other, share computational resources, and collect valuable machine learning data.
📚 NovelGenerator - AI-powered fiction book generator that uses Ollama's LLMs to create complete novels with coherent plot structures, well-developed characters, and multiple writing styles.
Run GGUF LLM models in the latest version of TextGen-webui
Alacritty + Fish + Zellij + Starship + Neovim + i3 + Supermaven + Ollama 🦙 = 🚀
Read your local files and answer your queries