POST /prompt-templates/{prompt_name}
curl --request POST \
  --url https://api.promptlayer.com/prompt-templates/{prompt_name} \
  --header 'Content-Type: application/json' \
  --data '{
  "version": 1,
  "workspace_id": 123,
  "label": "<string>",
  "provider": "openai",
  "input_variables": {}
}'
{
  "id": 123,
  "prompt_name": "<string>",
  "prompt_template": {
    "content": [
      {
        "type": "<any>",
        "text": "<string>"
      }
    ],
    "input_variables": [],
    "template_format": "f-string",
    "type": "<any>"
  },
  "metadata": {
    "model": {
      "provider": "<string>",
      "name": "<string>",
      "parameters": {}
    },
    "customField": "<string>"
  },
  "commit_message": "<string>",
  "llm_kwargs": {}
}

Retrieve a prompt template by its prompt_name. Optionally, specify version (a version number) or label (a release label such as “prod”) to retrieve a specific version. If neither is specified, the latest version is returned.

You can also specify a provider to receive LLM-specific arguments that can be passed directly to your LLM client. To render the template with values for its variables, supply them in input_variables.
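As a minimal sketch of calling this endpoint from Python using only the standard library: the `build_payload` and `get_prompt_template` helper names are hypothetical, and the `X-API-KEY` header is an assumption about how the API key is passed. Omitting `version` and `label` from the request body retrieves the latest version, as described above.

```python
import json
import urllib.request

BASE_URL = "https://api.promptlayer.com"  # base URL from the curl example above

def build_payload(version=None, label=None, provider=None, input_variables=None):
    """Include only the optional fields the caller sets, so leaving out
    version and label requests the latest version of the template."""
    payload = {}
    if version is not None:
        payload["version"] = version
    if label is not None:
        payload["label"] = label
    if provider is not None:
        payload["provider"] = provider
    if input_variables is not None:
        payload["input_variables"] = input_variables
    return payload

def get_prompt_template(prompt_name, api_key, **opts):
    """POST /prompt-templates/{prompt_name} and return the parsed JSON response.
    Passing the key in an X-API-KEY header is an assumption, not confirmed above."""
    req = urllib.request.Request(
        f"{BASE_URL}/prompt-templates/{prompt_name}",
        data=json.dumps(build_payload(**opts)).encode("utf-8"),
        headers={"Content-Type": "application/json", "X-API-KEY": api_key},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

For example, `get_prompt_template("my-template", api_key, label="prod", provider="openai")` would request the “prod” release with OpenAI-formatted llm_kwargs in the response.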

Path Parameters

prompt_name
string
required

Body

application/json

Response

200
application/json

Successful Response

The response is of type object.