A command line utility that queries websites for answers using a local LLM
💬 Discord AI chatbot built on the new Ollama API
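Several of the projects in this list talk to a local Ollama server over its HTTP API. As a minimal sketch (assuming a default Ollama install listening on localhost:11434 and its `/api/chat` endpoint; the helper names here are illustrative, not taken from any listed repo), a single non-streaming chat turn might look like this:

```python
import json
from urllib import request

# Default local Ollama endpoint (assumption: stock install, no auth).
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_payload(model: str, user_text: str) -> dict:
    """Build a non-streaming /api/chat request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_text}],
        "stream": False,
    }

def extract_reply(response_body: dict) -> str:
    """Pull the assistant text out of a /api/chat JSON response."""
    return response_body.get("message", {}).get("content", "")

def ask(model: str, user_text: str) -> str:
    """Send one chat turn to a local Ollama server (requires Ollama running)."""
    data = json.dumps(build_chat_payload(model, user_text)).encode()
    req = request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return extract_reply(json.load(resp))

# Usage (only works with a running Ollama server and a pulled model):
#   reply = ask("llama3", "Why is the sky blue?")
```

Keeping payload construction and response parsing as pure functions makes the client easy to test without a server, which is likely how clients such as the Discord bot above structure their request handling.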
ollama plugin for asdf version manager
Simple TUI client for ollama
Building a Chain of Thought RAG Model with DSPy, Qdrant and Ollama
An AI Toolbox for Simplified Access to AWS Bedrock and Ollama from Rust
Language Server Protocol for accessing Large Language Models
A DBMS project with a Streamlit frontend for stock management simulation with backtesting.
C program for interacting with Ollama server from a Linux terminal
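Clients like the C terminal program above typically consume Ollama's streaming output: the `/api/generate` endpoint returns newline-delimited JSON, each chunk carrying a text fragment in its `"response"` field, with `"done": true` on the final chunk. A hedged sketch of reassembling such a stream (the sample chunks are fabricated for illustration, not captured from a real server):

```python
import json
from typing import Iterable

def assemble_stream(ndjson_lines: Iterable[str]) -> str:
    """Concatenate the "response" fragments from an Ollama /api/generate
    NDJSON stream, stopping at the chunk marked "done": true."""
    parts = []
    for line in ndjson_lines:
        if not line.strip():
            continue  # skip blank keep-alive lines
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Illustrative chunks shaped like Ollama's streaming output:
sample = [
    '{"model":"llama3","response":"The sky ","done":false}',
    '{"model":"llama3","response":"is blue.","done":false}',
    '{"model":"llama3","response":"","done":true}',
]
```

The same line-by-line parse works whether the lines come from a socket, `urllib`, or a subprocess pipe, which is what makes thin terminal clients for Ollama straightforward to write.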
llamachan is a project that realises the idea of a dead internet as an imageboard
Create the prompts you need to write your novel using AI
Ollama Chat is a GUI for Ollama designed for macOS.
Desktop UI for Ollama made with PyQt
CrewAI Local LLM is a GitHub repository for locally hosting a large language model (LLM), enabling private, offline AI model usage and experimentation.
Chat with local LLM about your PDF and text documents, privacy ensured [llamaindex and llama3]