LangChain, DSPy, etc. make it easy to abstract away the actual LLM so that you can easily try different LLMs. But I don't want the whole framework. Is there anything out there that _just_ abstracts away the LLM so that I can swap out an Anthropic model for OpenAI, etc.?
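For reference, the kind of thin abstraction being asked for can be sketched in a few lines of plain Python: one protocol, one adapter per provider, and a registry to swap them. The classes below are hypothetical stand-ins (the real SDK calls are stubbed out), not any particular library's API.

```python
# Minimal sketch of "just the LLM abstraction": one Protocol, one adapter per
# provider, a registry to swap them. Adapters are stubs for illustration only.
from typing import Protocol


class ChatModel(Protocol):
    def complete(self, prompt: str) -> str: ...


class OpenAIChat:
    """Adapter that would wrap an OpenAI client (stubbed here)."""
    def complete(self, prompt: str) -> str:
        # Real code would call the OpenAI SDK here.
        return f"[openai] {prompt}"


class AnthropicChat:
    """Adapter that would wrap an Anthropic client (stubbed here)."""
    def complete(self, prompt: str) -> str:
        # Real code would call the Anthropic SDK here.
        return f"[anthropic] {prompt}"


MODELS: dict = {
    "openai": OpenAIChat(),
    "anthropic": AnthropicChat(),
}


def ask(provider: str, prompt: str) -> str:
    # Swapping providers is one string change at the call site.
    return MODELS[provider].complete(prompt)
```

The point is that the swap surface is tiny; the hard part (as the replies note) is that the prompts themselves rarely transfer unchanged.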
I'm skeptical, since prompts don't transfer that easily; you'll likely need to switch both the prompt and the model. In that case, "feature flagging" between implementations is easy with https://github.com/dagworks-inc/hamilton , see https://hamilton.dagworks.io/en/latest/code-comparisons/langchain/#switch-to-using-anthropic
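If prompts don't transfer, a feature flag should flip the prompt and the model together, as one unit. Here's a hedged sketch of that pattern in plain Python (illustrative names only, not Hamilton's actual API):

```python
# Sketch of feature-flagging a (prompt, model) pair together, since a prompt
# tuned for one provider usually needs rewording for another.
# Names are illustrative; this is not Hamilton's API. Clients are stubbed.
from dataclasses import dataclass
from typing import Callable


@dataclass
class LLMImpl:
    render_prompt: Callable  # provider-specific prompt template
    call: Callable           # provider-specific client (stubbed here)


IMPLS = {
    "openai": LLMImpl(
        render_prompt=lambda q: f"You are a helpful assistant. Question: {q}",
        call=lambda p: f"[openai answered: {p!r}]",
    ),
    "anthropic": LLMImpl(
        render_prompt=lambda q: f"Please answer concisely.\n\nQuestion: {q}",
        call=lambda p: f"[anthropic answered: {p!r}]",
    ),
}


def answer(question: str, flag: str = "openai") -> str:
    impl = IMPLS[flag]  # one flag flips both the prompt and the client
    return impl.call(impl.render_prompt(question))
```

The design point: the flag selects an implementation, never just a model name, so a provider switch can't silently reuse a prompt tuned for the other provider.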
Have you checked out the llm package that Simon Willison built for the command line? It swaps out models pretty conveniently. Great demo here: https://simonwillison.net/2024/Jun/17/cli-language-models/
I have been using Ollama, which lets me run models locally and also lets me add API keys to run inference on OpenAI models, etc.
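Worth noting: Ollama also exposes an OpenAI-compatible HTTP API (by default at http://localhost:11434/v1), so switching between local and hosted inference can reduce to swapping a base URL and key. A sketch of that, building the request only (no network call; the key values are placeholders):

```python
# Ollama serves an OpenAI-compatible chat completions API at
# http://localhost:11434/v1 by default, so local vs. hosted can be just a
# base-URL/key swap. This sketch only builds the request; no network call.
import json

ENDPOINTS = {
    "ollama": ("http://localhost:11434/v1", "ollama"),  # key is ignored locally
    "openai": ("https://api.openai.com/v1", "YOUR_API_KEY"),  # placeholder
}


def build_chat_request(provider: str, model: str, prompt: str) -> dict:
    base_url, api_key = ENDPOINTS[provider]
    return {
        "url": f"{base_url}/chat/completions",
        "headers": {"Authorization": f"Bearer {api_key}"},
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }),
    }
```

With a real HTTP client, the same request shape works against either endpoint; only the model name, URL, and key change.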
I've been working on a CLI tool: https://github.com/o3-cloud/cllm. It only supports models compatible with the OpenAI chat completions API, but I'm looking to add support for as many models as possible. My goal is to keep the main cllm command as simple as possible and instead provide external CLI tools that work with cllm's output to create more complex processes.
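The "simple core, compose externally" idea can be illustrated like this: if every stage reads and writes JSON, tools can be chained freely. This is a hypothetical in-process sketch of the pattern, not cllm's actual interface:

```python
# Hypothetical illustration of "keep the core command simple, compose with
# external tools": every stage maps JSON text to JSON text, so stages chain
# like a shell pipeline. Not cllm's actual interface; the LLM is a stub.
import json


def core_llm(json_in: str) -> str:
    """Stand-in for the core command: attaches a canned 'completion'."""
    payload = json.loads(json_in)
    payload["completion"] = f"echo: {payload['prompt']}"
    return json.dumps(payload)


def uppercase_tool(json_in: str) -> str:
    """Stand-in for an external post-processing tool."""
    payload = json.loads(json_in)
    payload["completion"] = payload["completion"].upper()
    return json.dumps(payload)


def pipeline(json_in: str, *stages) -> str:
    out = json_in
    for stage in stages:
        out = stage(out)
    return out
```

On the command line the same idea is just `toolA | toolB`; the JSON contract is what keeps the core small.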
I've been experimenting with this to load balance across multiple LLMs and avoid those 429s; maybe something here could help you: https://docs.litellm.ai/docs/routing
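To make the load-balancing idea concrete, here's a toy round-robin router that falls through to the next deployment whenever one answers with a rate limit. The clients are stubs and this is not litellm's API, just the pattern its router implements:

```python
# Toy round-robin router with fallback on rate limits (HTTP 429s), the pattern
# behind litellm's routing. Clients are stubs; this is not litellm's API.
class RateLimited(Exception):
    """Stand-in for an HTTP 429 from a provider."""


def make_client(name: str, rate_limited: bool):
    def call(prompt: str) -> str:
        if rate_limited:
            raise RateLimited(name)
        return f"[{name}] {prompt}"
    return call


class Router:
    def __init__(self, clients):
        self.clients = clients
        self._next = 0

    def complete(self, prompt: str) -> str:
        # Start round-robin from self._next, then fall through to the next
        # deployment whenever one raises a rate-limit error.
        n = len(self.clients)
        for i in range(n):
            client = self.clients[(self._next + i) % n]
            try:
                result = client(prompt)
                self._next = (self._next + i + 1) % n
                return result
            except RateLimited:
                continue
        raise RateLimited("all deployments are rate limited")
```

Usage: give the router several deployments of the same model; requests spread across them, and a 429 on one just shifts traffic to the next.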
John Berryman LangChain has different components, like PromptTemplate, chat model wrappers, and pipelines, so you can pick and choose the subset you need. Here is a blog post for illustration; it includes an example of how you can swap an OpenAI call for Anthropic: https://medium.com/state-of-the-art-technology/taming-the-llm-zoo-with-langchain-part-ii-2d744e8fa0cc
Trying to understand: can you share what your use cases were for integrating multiple LLMs in production?
Are you looking for a nice multi-backend client library, or something like https://github.com/jxnl/instructor to help with the data structuring?
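For anyone unfamiliar, the instructor-style idea is: ask the model for JSON, validate it against a schema, and retry on failure. A self-contained sketch of that loop (the "model" below is a fake callable; instructor's real API patches an OpenAI client and uses Pydantic models):

```python
# Sketch of the instructor-style pattern: request JSON, validate against a
# schema, retry on failure. The model is a fake callable for illustration;
# instructor's real API wraps an OpenAI client and uses Pydantic models.
import json
from dataclasses import dataclass


@dataclass
class User:
    name: str
    age: int


def parse_user(raw: str) -> User:
    data = json.loads(raw)            # may raise JSONDecodeError
    user = User(name=data["name"], age=data["age"])  # may raise KeyError
    if not isinstance(user.age, int):
        raise ValueError("age must be an int")
    return user


def structured_call(model, prompt: str, retries: int = 2) -> User:
    last_err = None
    for _ in range(retries + 1):
        try:
            return parse_user(model(prompt))
        except (json.JSONDecodeError, KeyError, ValueError) as err:
            # Real code would feed the validation error back into the prompt.
            last_err = err
    raise last_err
```

The payoff is that callers get a typed object or an exception, never a half-parsed string, regardless of which backend produced the JSON.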