This repository provides LangChain components for various AWS services. It aims to replace and expand upon the existing LangChain AWS components found in the langchain-community package in the LangChain repository.
- LLMs: Includes LLM classes for AWS services like Bedrock and SageMaker Endpoints, allowing you to leverage their language models within LangChain (a short sketch follows this list).
- Retrievers: Supports retrievers for services like Amazon Kendra and Knowledge Bases for Amazon Bedrock, enabling efficient retrieval of relevant information in your RAG applications (see the retriever sketch after this list).
- Graphs: Provides components for working with AWS Neptune graphs within LangChain.
- Agents: Includes Runnables to support Amazon Bedrock Agents, allowing you to leverage Bedrock Agents within LangChain and LangGraph.
- More to come: This repository will continue to expand and offer additional components for various AWS services as development progresses.
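The completion-style LLM classes follow the same pattern as the chat example further below. Here is a minimal sketch, assuming your AWS credentials are configured and your account has been granted access to the referenced Bedrock model (the model ID is only an example):

```python
from langchain_aws import BedrockLLM

# Text-completion model backed by Amazon Bedrock; the model_id is an
# example and must be a model your account has access to.
llm = BedrockLLM(model_id="anthropic.claude-v2")

print(llm.invoke("Summarize the benefits of serverless architectures."))
```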
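For retrieval, a minimal sketch using the Amazon Kendra retriever; the index ID below is a placeholder for an existing Kendra index in your account:

```python
from langchain_aws import AmazonKendraRetriever

# Retrieve the most relevant documents for a query from an existing
# Kendra index; index_id is a placeholder value.
retriever = AmazonKendraRetriever(
    index_id="<your-kendra-index-id>",
    top_k=4,
)

docs = retriever.invoke("How do I rotate my access keys?")
for doc in docs:
    print(doc.page_content)
```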
Note: This repository will replace all AWS integrations currently present in the langchain-community package. Users are encouraged to migrate to this repository as soon as possible.
You can install the langchain-aws package from PyPI.

```bash
pip install langchain-aws
```
Here's a simple example of how to use the langchain-aws package.
```python
from langchain_aws import ChatBedrock

# Initialize the Bedrock chat model
llm = ChatBedrock(
    model="anthropic.claude-3-sonnet-20240229-v1:0",
    beta_use_converse_api=True
)

# Invoke the LLM
response = llm.invoke("Hello! How are you today?")
print(response)
```
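Since ChatBedrock is a chat model, invoke returns an AIMessage; the generated text itself is available as response.content.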
For more detailed usage examples and documentation, please refer to the LangChain docs.
We welcome contributions to this project! Please follow the contribution guide for instructions on setting up the project for development and guidance on how to contribute effectively.
This project is licensed under the MIT License.