Get Started
The default option for self-hosting Literal AI is to use the public Docker image on the Docker registry. This image can be pulled and run on any machine with Docker installed.
While this is the easiest way to get started, it does require the platform to authenticate with our servers. In the future, the platform may send some light telemetry to help us improve it, but this will never include the content of the conversations between your users and your LLM application.
Registering your self-hosted platform
To register your self-hosted platform, please fill out the form here. You will then receive an email containing the following:
- Your Literal Client ID
- Your Literal Authorization Token
Deploying the Image
You have several options for hosting the platform:
Azure
Deploy the Literal AI platform on Azure in a few minutes with the Azure Developer CLI.
AWS
Deploy the Literal AI platform on AWS in a few minutes with the CDK.
Google Cloud
Coming soon.
Manual Deployment
Deploy the Literal AI platform anywhere. You will have to provision the infrastructure and deploy the Docker image.
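For a manual deployment, the run command looks roughly like the sketch below. The image name, port, and environment variable names are placeholders, not the documented values: substitute the actual image reference and the Client ID and Authorization Token from your registration email.

```shell
# Hypothetical sketch — <literal-ai-image>, the port, and the variable
# names below are placeholders; use the values from your registration email.
docker pull <literal-ai-image>
docker run -d \
  -p 3000:3000 \
  -e LITERAL_CLIENT_ID="YOUR_CLIENT_ID" \
  -e LITERAL_AUTHORIZATION_TOKEN="YOUR_AUTH_TOKEN" \
  <literal-ai-image>
```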
Update the SDKs
By default, the Literal AI SDKs point to the cloud-hosted version of the platform. To point the SDKs to your self-hosted platform, update the url parameter when instantiating the SDK:
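As a sketch, assuming the Python SDK and that your instance is reachable at a URL like the placeholder below, the instantiation looks like this:

```python
from literalai import LiteralClient  # assumes the Python SDK package name

# "https://literal.your-company.com" is a placeholder for your own
# self-hosted platform URL.
client = LiteralClient(
    api_key="YOUR_API_KEY",
    url="https://literal.your-company.com",
)
```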
Update Chainlit
If you are using Chainlit, you will need to update the LITERAL_API_URL environment variable to point to your self-hosted platform.
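For example, before starting your Chainlit app (the URL below is a placeholder for your own deployment):

```shell
# Point Chainlit's Literal AI integration at your self-hosted platform.
# "https://literal.your-company.com" is a placeholder URL.
export LITERAL_API_URL="https://literal.your-company.com"
export LITERAL_API_KEY="YOUR_API_KEY"
```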