The default way to self-host Literal AI is to use the public Docker image from the Docker registry. The image can be pulled and run on any machine with Docker installed.
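As a sketch, pulling and running the image looks like the following. The image name and port are illustrative assumptions, not the actual published values; use the image reference and port given in the official instructions.

```shell
# Hypothetical image name and port -- substitute the values from the
# official Literal AI instructions.
docker pull literalai/platform:latest

# Run the platform in the background, exposing it on port 3000.
docker run -d -p 3000:3000 --name literal-ai literalai/platform:latest
```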

While this is the easiest way to get started, it requires the platform to authenticate with our servers. In the future, it may include lightweight telemetry to help us improve the platform, but this will never include the content of the conversations between your users and your LLM application.

If you need to run the platform in an air-gapped environment, you can contact us here to get access to the private Docker image. This version of the platform can be run without needing to authenticate with our servers.

Registering your self-hosted platform

To register your self-hosted platform, please fill out the form here. You will then receive an email containing the following:

  • Your Literal Client ID
  • Your Literal Authorization Token
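Once you have these credentials, one common pattern is to supply them to the container as environment variables. The variable names below are illustrative assumptions; use the names specified in your registration email.

```shell
# Illustrative variable names -- use the exact names from your
# registration email.
export LITERAL_CLIENT_ID="your-client-id"
export LITERAL_AUTHORIZATION_TOKEN="your-authorization-token"

# They can then be forwarded to the container with -e flags, e.g.:
# docker run -e LITERAL_CLIENT_ID -e LITERAL_AUTHORIZATION_TOKEN ...
```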

Deploying the Image

You have several options for hosting the platform: any machine or managed service that can run a Docker container will work.

Update the SDKs

By default, the Literal AI SDKs point to the cloud-hosted version of the platform. To point the SDKs to your self-hosted platform, update the url parameter when instantiating the SDK:

from literalai.client import LiteralClient

# Point the SDK at your self-hosted instance instead of the cloud platform
MY_LITERAL_API_URL = "http://localhost:3000"

sdk = LiteralClient(api_key="YOUR_API_KEY", url=MY_LITERAL_API_URL)

Update Chainlit

If you are using Chainlit, you will need to update the LITERAL_API_URL environment variable to point to your self-hosted platform.
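For example, you might export the variable before starting your Chainlit app. The localhost URL is an example value for a locally running self-hosted platform, and LITERAL_API_KEY is the variable Chainlit's Literal AI integration typically reads alongside it.

```shell
# Point Chainlit's Literal AI integration at the self-hosted platform.
# The URL below is an example value for a local deployment.
export LITERAL_API_URL="http://localhost:3000"
export LITERAL_API_KEY="your-api-key"

# Then start your Chainlit app as usual:
# chainlit run app.py
```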