POST /lemur/v3/generate/task

Run a task using LeMUR
curl --request POST \
  --url https://api.assemblyai.com/lemur/v3/generate/task \
  --header 'Authorization: <api-key>' \
  --header 'Content-Type: application/json' \
  --data '{
  "transcript_ids": [
    "64nygnr62k-405c-4ae8-8a6b-d90b40ff3cce"
  ],
  "context": "This is an interview about wildfires.",
  "final_model": "anthropic/claude-3-5-sonnet",
  "temperature": 0,
  "max_output_size": 3000,
  "prompt": "List all the locations affected by wildfires."
}'
{
  "request_id": "5e1b27c2-691f-4414-8bc5-f14678442f9e",
  "usage": {
    "input_tokens": 27,
    "output_tokens": 3
  },
  "response": "Based on the transcript, the following locations were mentioned as being affected by wildfire smoke from Canada:\n\n- Maine\n- Maryland\n- Minnesota\n- Mid Atlantic region\n- Northeast region\n- New York City\n- Baltimore\n"
}
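The cURL request above can be reproduced in Python with only the standard library. This is a minimal sketch: the payload mirrors the example body, and the `<api-key>` placeholder must be replaced with a real AssemblyAI key before the commented-out send will succeed.

```python
import json
import urllib.request

# Placeholder, not a real key; replace with your AssemblyAI API key.
API_KEY = "<api-key>"

# Same request body as the cURL example above.
payload = {
    "transcript_ids": ["64nygnr62k-405c-4ae8-8a6b-d90b40ff3cce"],
    "context": "This is an interview about wildfires.",
    "final_model": "anthropic/claude-3-5-sonnet",
    "temperature": 0,
    "max_output_size": 3000,
    "prompt": "List all the locations affected by wildfires.",
}

request = urllib.request.Request(
    "https://api.assemblyai.com/lemur/v3/generate/task",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": API_KEY,
        "Content-Type": "application/json",
    },
    method="POST",
)

# Sending requires a valid key:
# with urllib.request.urlopen(request) as resp:
#     result = json.loads(resp.read())
#     print(result["response"])
```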

Authorizations

Authorization
string
header
required

Body

application/json

Params to run the task

prompt
string
required

Your text to prompt the model to produce a desired output, including any context you want to pass into the model.

transcript_ids
string<uuid>[]

A list of completed transcripts with text. Up to a maximum of 100 files or 100 hours, whichever is lower. Use either transcript_ids or input_text as input into LeMUR.

input_text
string

Custom formatted transcript data. Maximum size is the context limit of the selected model, which defaults to 100000. Use either transcript_ids or input_text as input into LeMUR.

context
string or object

Context to provide the model. This can be a string or a free-form JSON value.

final_model
default:default

The model that is used for the final prompt after compression is performed.

Available options:
anthropic/claude-3-5-sonnet,
anthropic/claude-3-opus,
anthropic/claude-3-haiku,
anthropic/claude-3-sonnet,
anthropic/claude-2-1,
anthropic/claude-2,
default,
anthropic/claude-instant-1-2,
basic,
assemblyai/mistral-7b

max_output_size
integer
default:2000

Max output size in tokens, up to 4000.

temperature
number
default:0

The temperature to use for the model. Higher values produce more creative answers; lower values are more conservative. Can be any value between 0.0 and 1.0 inclusive.

Required range: 0 <= x <= 1
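The body constraints above can be checked client-side before sending a request. The validator below is an illustrative sketch, not part of the API: it enforces the required `prompt`, the rule that exactly one of `transcript_ids` or `input_text` is used, the 0–1 temperature range, and the 4000-token cap on `max_output_size`.

```python
# Hypothetical helper; the function name and error messages are not part
# of the AssemblyAI API, only the constraints it checks are documented.
def validate_lemur_task(body: dict) -> None:
    if not body.get("prompt"):
        raise ValueError("prompt is required")

    # Use either transcript_ids or input_text as input into LeMUR.
    has_ids = bool(body.get("transcript_ids"))
    has_text = bool(body.get("input_text"))
    if has_ids == has_text:
        raise ValueError("use exactly one of transcript_ids or input_text")

    # Required range: 0 <= temperature <= 1 (default 0).
    temperature = body.get("temperature", 0)
    if not 0 <= temperature <= 1:
        raise ValueError("temperature must be between 0.0 and 1.0 inclusive")

    # Max output size in tokens, up to 4000 (default 2000).
    max_output = body.get("max_output_size", 2000)
    if not 0 < max_output <= 4000:
        raise ValueError("max_output_size is limited to 4000 tokens")
```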

Response

LeMUR task response

response
string
required

The response generated by LeMUR.

request_id
string<uuid>
required

The ID of the LeMUR request.

usage
object
required

The usage numbers for the LeMUR request.
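A successful response is a JSON object with the three required fields documented above. The snippet below parses the sample response from the top of this page (the `response` text is truncated here for brevity) and sums the token usage.

```python
import json

# Sample response body, as shown in the example at the top of the page.
raw = """{
  "request_id": "5e1b27c2-691f-4414-8bc5-f14678442f9e",
  "usage": {
    "input_tokens": 27,
    "output_tokens": 3
  },
  "response": "Based on the transcript, the following locations were mentioned..."
}"""

result = json.loads(raw)

# usage holds separate input/output token counts; total is their sum.
total_tokens = result["usage"]["input_tokens"] + result["usage"]["output_tokens"]
```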