LLM cells let you generate responses from a large language model (LLM).

Prerequisites

You will need a connection of type OpenAI or Anthropic to use LLM cells.

Using Variables

Reference variables from other cells in your prompt using the {{variable_name}} syntax.

Example

Highlight all errors from the following log output and create a table with a human-readable summary.

<log_output>
{{splunk_1}}
</log_output>
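
Conceptually, each {{variable_name}} reference is expanded to the referenced cell's full output before the prompt is sent to the model. The Python sketch below illustrates that substitution; it is an illustration only, not the product's implementation, and the render_prompt function and sample log line are hypothetical.

```python
import re

def render_prompt(template: str, cell_outputs: dict[str, str]) -> str:
    """Replace each {{cell_name}} reference with that cell's full output."""
    def substitute(match: re.Match) -> str:
        cell_name = match.group(1)
        # Fall back to the raw placeholder if the cell name is unknown.
        return cell_outputs.get(cell_name, match.group(0))
    return re.sub(r"\{\{\s*([\w.-]+)\s*\}\}", substitute, template)

# Hypothetical example: the output of a Splunk cell named splunk_1.
prompt = render_prompt(
    "Highlight all errors from the following log output and create a table "
    "with a human-readable summary.\n<log_output>\n{{splunk_1}}\n</log_output>",
    {"splunk_1": "2024-05-01T12:00:01Z ERROR payment-service request timed out"},
)
print(prompt)
```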
Note: The full output of each referenced cell is included in the prompt sent to the LLM, which may consume a large number of tokens.
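
Because the entire output is interpolated, large query results can dominate your token budget. Below is a minimal sketch for estimating that cost up front, assuming the tiktoken library (an OpenAI tokenizer; Anthropic models tokenize differently, so treat the count as an approximation) and a hypothetical local copy of the cell's output.

```python
import tiktoken

def estimate_tokens(text: str, encoding_name: str = "cl100k_base") -> int:
    """Roughly estimate how many tokens a block of text adds to the prompt."""
    encoding = tiktoken.get_encoding(encoding_name)
    return len(encoding.encode(text))

# Hypothetical file holding the exported output of the splunk_1 cell.
with open("splunk_1_output.txt") as f:
    cell_output = f.read()

print(f"splunk_1 adds roughly {estimate_tokens(cell_output)} tokens to the prompt")
```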