How to use Fast Large Language Model (Fast LLM)

The Fast Large Language Model (Fast LLM) integration is used when the AI agent needs to quickly generate short, formal, or interim responses. This includes:

  • Filler phrases to keep the conversation going while processing a request
  • Templates or boilerplate responses used before the main reply
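The filler-phrase pattern above can be sketched in code: a fast model returns a short interim message almost immediately while the main model finishes the full reply in the background. This is only an illustration; `fast_llm_filler`, `main_llm_answer`, and `handle_message` are hypothetical stand-ins (simulated with delays), not part of any real platform API.

```python
import asyncio

# Hypothetical stand-ins for the two models: a fast LLM that returns a
# short filler phrase, and the main LLM that produces the full answer.
# Both are simulated with sleeps purely for illustration.
async def fast_llm_filler(user_message: str) -> str:
    await asyncio.sleep(0.05)  # fast model replies almost immediately
    return "One moment, I'm looking into that..."

async def main_llm_answer(user_message: str) -> str:
    await asyncio.sleep(0.5)  # main model takes noticeably longer
    return f"Here is the full answer to: {user_message!r}"

async def handle_message(user_message: str) -> list[str]:
    """Send a filler phrase first, then the main reply."""
    replies = []
    # Start the slow main generation in the background.
    main_task = asyncio.create_task(main_llm_answer(user_message))
    # Meanwhile, the fast LLM keeps the conversation going.
    replies.append(await fast_llm_filler(user_message))
    # Deliver the main reply once it is ready.
    replies.append(await main_task)
    return replies

replies = asyncio.run(handle_message("What are your opening hours?"))
for reply in replies:
    print(reply)
```

The key design point is that the filler is sent before the main generation completes, so the user is never left staring at a silent chat window.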

Limitations

  • Fast LLM is not intended for primary generation and should not be used for complex tasks
  • It typically does not require manual configuration; it is linked automatically based on the AI agent type

For configuration details, see the section: How to Use LLM (Large Language Models)