IBM

This page covers the LangChain integrations for the IBM watsonx.ai platform.

IBM® watsonx.ai™ AI studio is part of the IBM watsonx™ AI and data platform, bringing together new generative AI capabilities powered by foundation models and traditional machine learning (ML) into a powerful studio spanning the AI lifecycle. Tune and guide models with your enterprise data to meet your needs with easy-to-use tools for building and refining performant prompts. With watsonx.ai, you can build AI applications in a fraction of the time and with a fraction of the data. Watsonx.ai offers:

  • Multi-model variety and flexibility: Choose from IBM-developed, open-source and third-party models, or build your own model.
  • Differentiated client protection: IBM stands behind IBM-developed models and indemnifies the client against third-party IP claims.
  • End-to-end AI governance: Enterprises can scale and accelerate the impact of AI with trusted data across the business, using data wherever it resides.
  • Hybrid, multi-cloud deployments: IBM provides the flexibility to integrate and deploy your AI workloads into your hybrid-cloud stack of choice.

Installation and Setup

Install the integration package with:

pip install -qU langchain-ibm

Get an IBM watsonx.ai API key and set it as the WATSONX_APIKEY environment variable:

import os

os.environ["WATSONX_APIKEY"] = "your IBM watsonx.ai API key"
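
Alternatively, to avoid hard-coding the key, you can prompt for it with the standard-library getpass module (a minimal sketch):

import getpass
import os

os.environ["WATSONX_APIKEY"] = getpass.getpass("Enter your IBM watsonx.ai API key: ")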

Chat Models

ChatWatsonx

See a usage example.

from langchain_ibm import ChatWatsonx
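
A minimal usage sketch, building on the import above; the model ID, endpoint URL, and project ID below are placeholder values to replace with your own:

chat = ChatWatsonx(
    model_id="ibm/granite-13b-chat-v2",       # example chat model ID
    url="https://us-south.ml.cloud.ibm.com",  # your watsonx.ai region endpoint
    project_id="YOUR_PROJECT_ID",
)

chat.invoke("What is IBM watsonx.ai?")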

LLMs

WatsonxLLM

See a usage example.

from langchain_ibm import WatsonxLLM
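
A minimal usage sketch, building on the import above; the model ID, endpoint URL, project ID, and generation parameters are placeholders:

parameters = {
    "decoding_method": "greedy",  # optional text-generation parameters
    "max_new_tokens": 100,
}

llm = WatsonxLLM(
    model_id="ibm/granite-13b-instruct-v2",   # example completion model ID
    url="https://us-south.ml.cloud.ibm.com",  # your watsonx.ai region endpoint
    project_id="YOUR_PROJECT_ID",
    params=parameters,
)

llm.invoke("Explain the difference between fine-tuning and prompt tuning.")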

Embedding Models

WatsonxEmbeddings

See a usage example.

from langchain_ibm import WatsonxEmbeddings
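
A minimal usage sketch, building on the import above; the model ID, endpoint URL, and project ID are placeholders:

embeddings = WatsonxEmbeddings(
    model_id="ibm/slate-125m-english-rtrvr",  # example embedding model ID
    url="https://us-south.ml.cloud.ibm.com",  # your watsonx.ai region endpoint
    project_id="YOUR_PROJECT_ID",
)

vector = embeddings.embed_query("What is IBM watsonx.ai?")
vectors = embeddings.embed_documents(["watsonx.ai is an AI studio.", "LangChain integrates with it."])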
