This is the repository holding code and data for "FrugalML: How to Use ML Prediction APIs More Accurately and Cheaply".
Updated Jan 12, 2021 - Python
Bringing local LLMs to a Minecraft front-end through commands.
LLM Kit - a Python large language model toolkit for generating data of your choice
Large Multi-Language Models for News Translation
Simple chat interface for local AI using llama-cpp-python and llama-cpp-agent
How to stream LLM responses using AWS API Gateway Websockets and Lambda
AccIo - Enterprise LLM: Unifying intelligence at your command!
Python-based WebSocket server for CLI LLaVA inference.
Effortlessly create and manage your own AI infrastructure with Radiantloom AI. Privacy, security, and flexibility meet ease-of-use in this innovative open-source platform.
Mamba for Vision, Perception and Action
Detailed code explanation of Google's Gemini LLM
In this workshop, we demonstrate how to choose the right container and instance types, optimize container parameters, set up the right autoscaling policies, and use APIs to get recommendations with Amazon SageMaker.
Generate any Tattoo using AI
ASP.NET Core API for document processing using local LLMs. Features include summarization, analysis, sentiment detection, and document comparison. Compatible with OpenAI API-standard LLM servers.
EmbeddedLLM: API server for Embedded Device Deployment. Currently supports IpexLLM/DirectML/CPU.
The project was undertaken as part of the Intel Unnati Industrial Training program for the year 2024. The primary objective of this project aligns with Problem Statement PS-04: Introduction to GenAI LLM Inference on CPUs and subsequent LLM Model Finetuning for the development of a Custom Chatbot.
LLM inference engine written in pure Rust and CUDA (still under development)
This Python app generates a NIST 800-53 control implementation for each control and writes the output to a CSV file.
Code to benchmark APIs available from LLM vendors and demonstrate how they work