POST /lemur/v3/generate/action-items
Extract action items
curl --request POST \
  --url https://api.assemblyai.com/lemur/v3/generate/action-items \
  --header 'Authorization: <api-key>' \
  --header 'Content-Type: application/json' \
  --data '{
  "transcript_ids": [
    "64nygnr62k-405c-4ae8-8a6b-d90b40ff3cce"
  ],
  "context": "This is an interview about wildfires.",
  "final_model": "anthropic/claude-3-5-sonnet",
  "temperature": 0,
  "max_output_size": 3000,
  "answer_format": "Bullet Points"
}'
{
  "request_id": "5e1b27c2-691f-4414-8bc5-f14678442f9e",
  "usage": {
    "input_tokens": 27,
    "output_tokens": 3
  },
  "response": "Here are some potential action items based on the transcript:\n\n- Monitor air quality levels in affected areas and issue warnings as needed.\n\n- Advise vulnerable populations like children, the elderly, and those with respiratory conditions to limit time outdoors.\n\n- Have schools cancel outdoor activities when air quality is poor.\n\n- Educate the public on health impacts of smoke inhalation and precautions to take.\n\n- Track progression of smoke plumes using weather and air quality monitoring systems.\n\n- Coordinate cross-regionally to manage smoke exposure as air masses shift.\n\n- Plan for likely increase in such events due to climate change. Expand monitoring and forecasting capabilities.\n\n- Conduct research to better understand health impacts of wildfire smoke and mitigation strategies.\n\n- Develop strategies to prevent and manage wildfires to limit air quality impacts.\n"
}
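The same request can be issued from Python with only the standard library. This is a minimal sketch mirroring the curl example above: `build_action_items_request` is a hypothetical helper name, not part of any SDK, and the API key value is a placeholder.

```python
import json
import urllib.request

API_URL = "https://api.assemblyai.com/lemur/v3/generate/action-items"

def build_action_items_request(api_key, transcript_ids, context=None,
                               final_model="default", temperature=0.0,
                               max_output_size=2000,
                               answer_format="Bullet Points"):
    """Assemble a POST request for the action-items endpoint,
    mirroring the curl example above."""
    payload = {
        "transcript_ids": transcript_ids,
        "final_model": final_model,
        "temperature": temperature,
        "max_output_size": max_output_size,
        "answer_format": answer_format,
    }
    if context is not None:
        payload["context"] = context
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": api_key,  # API key goes in this header
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Actually sending it needs a valid key and a completed transcript:
# with urllib.request.urlopen(build_action_items_request("<api-key>", [tid])) as resp:
#     print(json.load(resp)["response"])
```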

Authorizations

Authorization
string
header
required

Body

application/json

Params to generate action items from transcripts

transcript_ids
string<uuid>[]

A list of completed transcripts with text. Up to a maximum of 100 files or 100 hours, whichever is lower. Use either transcript_ids or input_text as input into LeMUR.

input_text
string

Custom formatted transcript data. Maximum size is the context limit of the selected model, which defaults to 100000. Use either transcript_ids or input_text as input into LeMUR.
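Since `transcript_ids` and `input_text` are mutually exclusive inputs, a client can check the body before sending. A small sketch (`validate_lemur_input` is a hypothetical helper, not part of the API):

```python
def validate_lemur_input(params: dict) -> None:
    """Raise if the body does not contain exactly one of
    transcript_ids or input_text (they are mutually exclusive)."""
    has_ids = bool(params.get("transcript_ids"))
    has_text = bool(params.get("input_text"))
    if has_ids == has_text:  # both present, or both missing
        raise ValueError(
            "Provide either transcript_ids or input_text, not both or neither."
        )

# Passes silently for a well-formed body:
validate_lemur_input({"transcript_ids": ["64nygnr62k-405c-4ae8-8a6b-d90b40ff3cce"]})
```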

context

Context to provide the model. This can be a string or a free-form JSON value.

final_model
default:default

The model that is used for the final prompt after compression is performed.

Available options:
anthropic/claude-3-5-sonnet,
anthropic/claude-3-opus,
anthropic/claude-3-haiku,
anthropic/claude-3-sonnet,
anthropic/claude-2-1,
anthropic/claude-2,
default,
anthropic/claude-instant-1-2,
basic,
assemblyai/mistral-7b

max_output_size
integer
default:2000

Maximum output size in tokens, up to 4000.

temperature
number
default:0

The temperature to use for the model. Higher values produce more creative answers; lower values produce more conservative ones. Can be any value between 0.0 and 1.0 inclusive.

Required range: 0 <= x <= 1
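The two numeric parameters can be range-checked client-side before the request is sent. A sketch under the limits stated above (the helper names are hypothetical):

```python
def check_temperature(t: float) -> float:
    """temperature must lie in the inclusive range [0.0, 1.0]."""
    if not 0.0 <= t <= 1.0:
        raise ValueError(f"temperature out of range [0, 1]: {t}")
    return t

def check_max_output_size(n: int) -> int:
    """max_output_size is capped at 4000 tokens (lower bound of 1
    is an assumption; the docs only state the upper limit)."""
    if not 1 <= n <= 4000:
        raise ValueError(f"max_output_size out of range [1, 4000]: {n}")
    return n
```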

answer_format
string
default:Bullet Points

How you want the action items to be returned. This can be any text. Defaults to "Bullet Points".

Response

LeMUR action items response

response
string
required

The response generated by LeMUR.

request_id
string<uuid>
required

The ID of the LeMUR request

usage
object
required

The usage numbers for the LeMUR request
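A successful response unpacks as plain JSON with the three fields above. The raw string below is hypothetical sample data shaped like the example response at the top of the page:

```python
import json

# Hypothetical response body, shaped like the example response above.
raw = (
    '{"request_id": "5e1b27c2-691f-4414-8bc5-f14678442f9e",'
    ' "usage": {"input_tokens": 27, "output_tokens": 3},'
    ' "response": "- Monitor air quality levels in affected areas."}'
)

result = json.loads(raw)
action_items = result["response"]                    # the generated text
total_tokens = (result["usage"]["input_tokens"]
                + result["usage"]["output_tokens"])  # 27 + 3 = 30
```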