Via the Python and TypeScript SDKs, you can access the Prompt API. With the API you can retrieve a prompt to use in your code and fill in variable values in a conversation flow. You cannot draft and save prompt templates via the API; use the Literal AI Prompt Playground for that instead.

Get a Prompt

Getting a prompt is the first step to using it in your code. You can get a prompt by its name. Prompt names serve as their IDs, so you cannot change a prompt's name afterwards.
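In the Python SDK this is a single call on the client. A minimal sketch, assuming the LITERAL_API_KEY environment variable is set and using a hypothetical prompt name; the import is deferred into the helper so the sketch stays self-contained:

```python
def fetch_prompt(name: str):
    """Retrieve a prompt by name (the name doubles as its ID)."""
    # Deferred import: requires the `literalai` package.
    from literalai import LiteralClient

    # The client reads LITERAL_API_KEY from the environment by default.
    client = LiteralClient()
    return client.api.get_prompt(name=name)

# prompt = fetch_prompt("RAG prompt")  # "RAG prompt" is a hypothetical name
```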

Create a Prompt

You can create a new prompt on the Literal AI platform using the Prompt Playground, but you can also create one via the API. The name of the prompt is its ID, so it cannot be changed afterwards.
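A sketch of creating a prompt from code, assuming the SDK's get_or_create_prompt helper; the prompt name and template content are hypothetical, and the template uses Mustache-style {{variables}}:

```python
def create_prompt():
    # Deferred import: requires the `literalai` package and LITERAL_API_KEY.
    from literalai import LiteralClient

    client = LiteralClient()
    # Pick the name carefully: it becomes the prompt's ID and cannot change.
    return client.api.get_or_create_prompt(
        name="customer-support",  # hypothetical prompt name
        template_messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "{{question}}"},  # Mustache variable
        ],
    )
```

If a prompt with that name and template already exists, get_or_create_prompt returns the existing version rather than creating a duplicate.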

Format a Prompt

Once you have retrieved your prompt, you can format it to get messages in the OpenAI format.

Combining prompts with integrations (like the OpenAI integration) allows you to log the generations and to track which prompt versions were used to generate them.

You format messages like this:
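With the SDK you call prompt.format_messages(...) directly on the retrieved prompt. The local helper below is only a sketch of what that call returns: template messages with Mustache-style {{variables}} resolved, in the OpenAI chat format.

```python
def format_messages(template_messages, **variables):
    """Sketch of Prompt.format_messages: resolve {{variables}} in each
    template message and return messages in the OpenAI chat format."""
    formatted = []
    for message in template_messages:
        content = message["content"]
        for key, value in variables.items():
            content = content.replace("{{" + key + "}}", str(value))
        formatted.append({"role": message["role"], "content": content})
    return formatted

template = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "{{question}}"},
]
messages = format_messages(template, question="What is Literal AI?")
# messages is now ready to pass to the OpenAI chat completions API
```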

Make sure to always prepare your messages with Prompt.format_messages (or its TypeScript equivalent), even if you have no variables to resolve. This ensures messages are logged properly when using our OpenAI instrumentation.

Langchain Chat Template

Since Langchain has a different format for prompts, you can convert a Literal AI prompt to a Langchain Chat Template.

You can combine the prompt with the Langchain integration to log the generations and to track which prompt versions were used to generate them.
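A sketch of wiring the converted prompt into a Langchain chain, assuming the SDK's to_langchain_chat_prompt_template helper and the langchain-openai package; the model name is an illustrative choice:

```python
def run_with_langchain(prompt, question: str) -> str:
    # Deferred import: requires the `langchain-openai` package.
    from langchain_openai import ChatOpenAI

    # Convert the Literal AI prompt to a Langchain chat prompt template,
    # then compose it with a model using Langchain's pipe syntax.
    lc_prompt = prompt.to_langchain_chat_prompt_template()
    chain = lc_prompt | ChatOpenAI(model="gpt-4o-mini")  # illustrative model
    return chain.invoke({"question": question}).content
```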