John Berryman’s Post

Consultant in Large Language Model Application Development

LangChain, DSPy, etc. make it easy to abstract away the actual LLM so that you can easily try different LLMs. But I don't want the whole framework. Is there anything out there that _just_ abstracts away the LLM so that I can swap out an Anthropic model for OpenAI, etc.?
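Roughly, the thinnest version of what I mean is just a dispatch layer. A minimal sketch — the `provider/model` naming convention and all function names here are made up for illustration, not from any existing library:

```python
# Hypothetical sketch of "just the LLM abstraction". The provider-prefix
# convention ("anthropic/claude-...") and these function names are
# invented for illustration; the commented-out SDK calls show where the
# real OpenAI / Anthropic clients would slot in.

def parse_model(model: str) -> tuple[str, str]:
    """Split 'provider/name' into parts; bare names default to openai."""
    provider, _, name = model.partition("/")
    return (provider, name) if name else ("openai", provider)

def complete(model: str, prompt: str) -> str:
    provider, name = parse_model(model)
    if provider == "openai":
        # from openai import OpenAI
        # resp = OpenAI().chat.completions.create(
        #     model=name, messages=[{"role": "user", "content": prompt}])
        # return resp.choices[0].message.content
        raise NotImplementedError
    if provider == "anthropic":
        # from anthropic import Anthropic
        # resp = Anthropic().messages.create(
        #     model=name, max_tokens=1024,
        #     messages=[{"role": "user", "content": prompt}])
        # return resp.content[0].text
        raise NotImplementedError
    raise ValueError(f"unknown provider: {provider!r}")
```

Swapping models then means changing one string, e.g. `complete("anthropic/claude-3-5-sonnet", ...)` instead of `complete("gpt-4o", ...)`.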

Rand Fitzpatrick

making tech more useful and humane

4w

Are you looking for a nice multi-backend client library, or something like https://github.com/jxnl/instructor to help with the data structuring?

Stefan Krawczyk

CEO @ DAGWorks Inc. | Co-creator of Hamilton & Burr | Pipelines: Data, Data Science, Machine Learning, & LLMs

3w

I'm skeptical, since prompts don't transfer that easily; you'll therefore need to switch both. In which case, "feature flagging" between implementations is easy with https://github.com/dagworks-inc/hamilton , see https://hamilton.dagworks.io/en/latest/code-comparisons/langchain/#switch-to-using-anthropic
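The "switch both prompt and model" point above can be sketched generically — note this is not Hamilton's API (its actual switching mechanism is in the linked code comparison); the names here are invented, and each backend carries its own prompt since prompts rarely transfer unchanged:

```python
import os

# Generic feature flag between two backend implementations. Each backend
# bundles its own prompt with its own model call, so flipping the flag
# switches both together. Invented names; real API calls are stubbed.

def answer_via_openai(question: str) -> str:
    prompt = f"Answer concisely.\n\nQ: {question}"
    return f"[openai] {prompt}"      # stand-in for a real OpenAI call

def answer_via_anthropic(question: str) -> str:
    prompt = f"You are a concise assistant. Question: {question}"
    return f"[anthropic] {prompt}"   # stand-in for a real Anthropic call

BACKENDS = {"openai": answer_via_openai, "anthropic": answer_via_anthropic}

def answer(question: str) -> str:
    backend = os.environ.get("LLM_BACKEND", "openai")
    return BACKENDS[backend](question)
```

Setting `LLM_BACKEND=anthropic` in the environment then flips both the prompt and the model in one place.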

Brian Dailey

Advisor / Founder / Fractional CTO

4w

Have you checked out the llm package that Simon Willison built for the command line? Swaps out models pretty conveniently. Great demo here: https://simonwillison.net/2024/Jun/17/cli-language-models/

Nishant Gandhi

Sway AI | Ex-DataRobot | IIT Patna | Public Speaker

4w

I have been using Ollama, which lets me run models locally, and I add API keys to run inference on OpenAI etc. models.

Owen Zanzal

Principal Engineer at Everactive

4w

I've been working on a CLI tool, https://github.com/o3-cloud/cllm. It only supports OpenAI chat-completions-compatible models, but I'm looking to add support for as many models as possible. My goal is to keep the main cllm command as simple as possible and instead provide external CLI tools that work with cllm's output to create more complex processes.

I'm experimenting with this to load-balance across multiple LLMs and avoid those 429s; maybe something here could help you: https://docs.litellm.ai/docs/routing
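The routing config in those docs is essentially a list of deployments sharing one alias. A sketch of the shape (the deployment names are made up, and the `Router` call is left commented out so this stands alone):

```python
# Sketch of a LiteLLM Router-style model_list: two deployments behind
# one alias ("chat"), following the shape shown in the routing docs.
# Model/deployment choices here are made-up examples.
model_list = [
    {
        "model_name": "chat",  # shared alias the app calls
        "litellm_params": {"model": "openai/gpt-4o"},
    },
    {
        "model_name": "chat",  # same alias, different backend
        "litellm_params": {"model": "anthropic/claude-3-5-sonnet-20240620"},
    },
]
# from litellm import Router
# router = Router(model_list=model_list)
# router.completion(model="chat", messages=[...])  # balanced across both
```

Because both entries share the `"chat"` alias, the router can spread traffic across providers and retry on a 429 from either one.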

Ravindra Sadaphule

Director of Engineering | LLM, RAG, Generative AI | Stanford GSB | M.S. (Artificial Intelligence) from Johns Hopkins | MBA (Information Technology) from American Public University | AI Technology Advisor

3w

John Berryman LangChain has different components like PromptTemplate, ChatOpenAI, and Pipeline. You can pick and choose the subset of components you need. Here is a blog post for illustration; it includes an example of how you can swap an OpenAI call for Anthropic: https://medium.com/state-of-the-art-technology/taming-the-llm-zoo-with-langchain-part-ii-2d744e8fa0cc

Julia Neagu

CEO & Co-Founder | Quotient AI

4w

yo.

Ravi Somepalli

Hands on Engineering Manager| Founder at Lakumbra| Engineer| Application Architect

4w

Trying to understand: can you share what your use cases were for integrating with multiple LLMs in production?
