POST /lemur/v3/generate/question-answer

Ask questions using LeMUR
curl --request POST \
  --url https://api.assemblyai.com/lemur/v3/generate/question-answer \
  --header 'Authorization: <api-key>' \
  --header 'Content-Type: application/json' \
  --data '{
  "transcript_ids": [
    "64nygnr62k-405c-4ae8-8a6b-d90b40ff3cce"
  ],
  "context": "This is an interview about wildfires.",
  "final_model": "anthropic/claude-3-5-sonnet",
  "temperature": 0,
  "max_output_size": 3000,
  "questions": [
    {
      "question": "Where are there wildfires?",
      "answer_format": "List of countries in ISO 3166-1 alpha-2 format",
      "answer_options": [
        "US",
        "CA"
      ]
    },
    {
      "question": "Is global warming affecting wildfires?",
      "answer_options": [
        "yes",
        "no"
      ]
    }
  ]
}'
{
  "request_id": "5e1b27c2-691f-4414-8bc5-f14678442f9e",
  "usage": {
    "input_tokens": 27,
    "output_tokens": 3
  },
  "response": [
    {
      "answer": "CA, US",
      "question": "Where are there wildfires?"
    },
    {
      "answer": "yes",
      "question": "Is global warming affecting wildfires?"
    }
  ]
}
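
The curl request above can also be issued from Python. A minimal sketch using only the standard library; the API key and transcript ID are placeholders, and `ask_lemur` is a hypothetical helper name, not part of any SDK:

```python
import json
import urllib.request

API_KEY = "<api-key>"  # placeholder: your AssemblyAI API key
URL = "https://api.assemblyai.com/lemur/v3/generate/question-answer"

# Same body as the curl example above.
payload = {
    "transcript_ids": ["64nygnr62k-405c-4ae8-8a6b-d90b40ff3cce"],
    "context": "This is an interview about wildfires.",
    "final_model": "anthropic/claude-3-5-sonnet",
    "temperature": 0,
    "max_output_size": 3000,
    "questions": [
        {
            "question": "Where are there wildfires?",
            "answer_format": "List of countries in ISO 3166-1 alpha-2 format",
            "answer_options": ["US", "CA"],
        },
        {
            "question": "Is global warming affecting wildfires?",
            "answer_options": ["yes", "no"],
        },
    ],
}

def ask_lemur(body: dict) -> dict:
    """POST the request body and return the parsed JSON response."""
    req = urllib.request.Request(
        URL,
        data=json.dumps(body).encode("utf-8"),
        headers={"Authorization": API_KEY, "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

Calling `ask_lemur(payload)` would return the JSON body shown in the sample response above.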

Authorizations

Authorization
string
header
required

Body

application/json

Params to ask questions about the transcripts

questions
object[]
required

A list of questions to ask

transcript_ids
string<uuid>[]

A list of completed transcripts with text. Up to a maximum of 100 files or 100 hours, whichever is lower. Use either transcript_ids or input_text as input into LeMUR.

input_text
string

Custom formatted transcript data. Maximum size is the context limit of the selected model, which defaults to 100000. Use either transcript_ids or input_text as input into LeMUR.
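
Since `transcript_ids` and `input_text` are mutually exclusive input modes, a request built on `input_text` replaces the ID list with raw transcript text. A sketch of such a body (the transcript text here is invented for illustration):

```python
# Hypothetical request body using input_text instead of transcript_ids.
# Only one of the two input fields should be present in a request.
payload = {
    "input_text": "Speaker A: The fires in California are spreading.\n"
                  "Speaker B: Canada is seeing large fires as well.",
    "questions": [
        {"question": "Where are there wildfires?", "answer_options": ["US", "CA"]},
    ],
}
```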

context

Context to provide the model. This can be a string or a free-form JSON value.

final_model
default:default

The model that is used for the final prompt after compression is performed.

Available options:
anthropic/claude-3-5-sonnet,
anthropic/claude-3-opus,
anthropic/claude-3-haiku,
anthropic/claude-3-sonnet,
anthropic/claude-2-1,
anthropic/claude-2,
default,
anthropic/claude-instant-1-2,
basic,
assemblyai/mistral-7b
max_output_size
integer
default:2000

Max output size in tokens, up to 4000

temperature
number
default:0

The temperature to use for the model. Higher values produce more creative answers; lower values produce more conservative ones. Can be any value between 0.0 and 1.0 inclusive.

Required range: 0 <= x <= 1

Response

LeMUR question & answer response

request_id
string<uuid>
required

The ID of the LeMUR request

usage
object
required

The usage numbers for the LeMUR request

response
object[]
required

The answers generated by LeMUR and their questions
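
Each element of the `response` array pairs an answer with the question that produced it. A small sketch that folds the sample response shown earlier into a question-to-answer lookup:

```python
# Sample response body from the example at the top of this page.
sample_response = {
    "request_id": "5e1b27c2-691f-4414-8bc5-f14678442f9e",
    "usage": {"input_tokens": 27, "output_tokens": 3},
    "response": [
        {"answer": "CA, US", "question": "Where are there wildfires?"},
        {"answer": "yes", "question": "Is global warming affecting wildfires?"},
    ],
}

# Index answers by question text for easy lookup.
answers = {item["question"]: item["answer"] for item in sample_response["response"]}
print(answers["Where are there wildfires?"])  # → CA, US
```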