Bedrock
Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Stability AI, and Amazon via a single API, along with a broad set of capabilities you need to build generative AI applications with security, privacy, and responsible AI. Using Amazon Bedrock, you can easily experiment with and evaluate top FMs for your use case, privately customize them with your data using techniques such as fine-tuning and Retrieval Augmented Generation (RAG), and build agents that execute tasks using your enterprise systems and data sources. Since Amazon Bedrock is serverless, you don't have to manage any infrastructure, and you can securely integrate and deploy generative AI capabilities into your applications using the AWS services you are already familiar with.
%pip install boto3
from langchain.llms import Bedrock
llm = Bedrock(
credentials_profile_name="bedrock-admin", model_id="amazon.titan-text-express-v1"
)
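The configured LLM can be invoked directly with a prompt string. Provider-specific inference parameters can also be supplied via model_kwargs; the keys shown below (temperature, maxTokenCount) are a sketch assuming the Titan Text models, and other providers use different parameter names.
llm("What is the difference between fine-tuning and RAG?")

llm_with_params = Bedrock(
    credentials_profile_name="bedrock-admin",
    model_id="amazon.titan-text-express-v1",
    # Assumed Titan Text inference keys; adjust for other model providers
    model_kwargs={"temperature": 0.5, "maxTokenCount": 512},
)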
Using in a conversation chain
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory
conversation = ConversationChain(
llm=llm, verbose=True, memory=ConversationBufferMemory()
)
conversation.predict(input="Hi there!")
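Because the chain stores its history in ConversationBufferMemory, a follow-up call includes the earlier exchange in the prompt sent to Bedrock:
conversation.predict(input="What did I just say to you?")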
Conversation Chain With Streaming
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from langchain.llms import Bedrock
llm = Bedrock(
credentials_profile_name="bedrock-admin",
model_id="amazon.titan-text-express-v1",
streaming=True,
callbacks=[StreamingStdOutCallbackHandler()],
)
conversation = ConversationChain(
llm=llm, verbose=True, memory=ConversationBufferMemory()
)
conversation.predict(input="Hi there!")
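StreamingStdOutCallbackHandler simply prints each token to stdout as it is generated. To route tokens elsewhere (a websocket, a log, a UI), you can implement on_llm_new_token on a custom callback handler; the sketch below is a minimal example that just collects tokens into a list.
from langchain.callbacks.base import BaseCallbackHandler

class TokenCollector(BaseCallbackHandler):
    """Collects streamed tokens instead of printing them."""

    def __init__(self):
        self.tokens = []

    def on_llm_new_token(self, token: str, **kwargs) -> None:
        self.tokens.append(token)

collector = TokenCollector()
streaming_llm = Bedrock(
    credentials_profile_name="bedrock-admin",
    model_id="amazon.titan-text-express-v1",
    streaming=True,
    callbacks=[collector],
)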