AICI: Prompts as (Wasm) Programs
Updated Jul 11, 2024 - Rust
ChatFlameBackend is a backend for chat applications, leveraging the Candle AI framework with a focus on the Mistral model.
An LLM inference engine written in pure Rust and CUDA (still under development).
A lightweight and extensible LLM inference serving benchmark tool written in Rust.
A terminal-style user interface for chatting with AI characters, using llama LLMs for locally processed AI.
A minimalistic LLM-powered Telegram assistant written in Rust that uses a self-contained SQLite database and is very easy to install.