
DeepInfra

This page covers how to use the DeepInfra ecosystem within LangChain. It is broken into two parts: installation and setup, and then references to specific DeepInfra wrappers.

Installation and Setup

  • Get a DeepInfra API key from the DeepInfra website.
  • Set it as an environment variable (DEEPINFRA_API_TOKEN), as in the sketch below.
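
A minimal setup sketch, assuming you prefer to set the token from Python rather than in your shell; the key value is a placeholder, not a real token:

import os

# Provide the DeepInfra API token to LangChain via the environment.
# Replace the placeholder with your own key from DeepInfra.
os.environ["DEEPINFRA_API_TOKEN"] = "YOUR_DEEPINFRA_API_TOKEN"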

Available Models

DeepInfra provides a range of open-source LLMs ready for deployment. You can browse the supported models for text generation and embeddings, including the google/flan* family, on the DeepInfra website.

You can view a list of request and response parameters.

Wrappers

LLM

There exists a DeepInfra LLM wrapper, which you can access with

from langchain.llms import DeepInfra
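
A minimal usage sketch, assuming the wrapper accepts a model_id argument and that google/flan-t5-xl is available on your DeepInfra account (both the parameter name and the model choice are illustrative assumptions here):

from langchain.llms import DeepInfra

# model_id selects a text-generation model hosted on DeepInfra;
# "google/flan-t5-xl" is used purely as an example.
llm = DeepInfra(model_id="google/flan-t5-xl")

# The wrapper reads DEEPINFRA_API_TOKEN from the environment by default.
print(llm("What is the capital of France?"))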

Embeddings

There is also a DeepInfra Embeddings wrapper, which you can access with

from langchain.embeddings import DeepInfraEmbeddings
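
A minimal usage sketch, assuming the wrapper accepts a model_id argument; the sentence-transformers/clip-ViT-B-32 model name is only an illustrative choice:

from langchain.embeddings import DeepInfraEmbeddings

# model_id selects an embedding model hosted on DeepInfra (example choice).
embeddings = DeepInfraEmbeddings(model_id="sentence-transformers/clip-ViT-B-32")

# Embed a single query string and a small batch of documents.
query_vector = embeddings.embed_query("What is DeepInfra?")
doc_vectors = embeddings.embed_documents(["DeepInfra hosts open-source models."])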