AI Gate

Intelligent content selection

What is AI Gate?

AI Gate (#ai_gate) uses an LLM to evaluate conditions that can't be expressed with simple operators. It asks a yes/no question about a variable's content and branches accordingly.

Basic Usage

ai-gate-basic.echo
[#IF {{userMessage}} #ai_gate(Is this message about a sensitive topic?)]
I'll handle this topic with extra care and empathy.
[END IF]

[#IF {{productDescription}} #ai_gate(Does this description mention competing products?)]
[#INCLUDE competitive_response_guidelines]
[END IF]

How It Works

When a template is rendered:

  • All #ai_gate conditions are collected from the template
  • For each condition, the LLM receives the variable's value and your question, and answers “yes” or “no”
  • All conditions are evaluated in parallel using Promise.all()
  • Results are cached, so identical conditions won't call the LLM twice
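
The steps above can be sketched as follows. This is a minimal illustration, not Echo's actual implementation: `askLLM` is a stand-in for the real provider call, and caching the pending promise (rather than the resolved answer) is one way to deduplicate identical conditions even within a single parallel batch.

```javascript
// In-memory cache of pending/resolved LLM answers, keyed by (value, question).
const cache = new Map();

// Stand-in for the real provider call; a real implementation would send
// the value and question to the configured LLM and parse "yes"/"no".
async function askLLM(value, question) {
  return question.includes("sensitive") ? "yes" : "no";
}

// Evaluate all collected gate conditions concurrently, reusing cached
// promises so identical conditions trigger only one LLM call.
async function evaluateGates(gates) {
  const answers = await Promise.all(
    gates.map(({ value, question }) => {
      const key = `${value}\u0000${question}`;
      if (!cache.has(key)) cache.set(key, askLLM(value, question));
      return cache.get(key);
    })
  );
  return answers.map((answer) => answer === "yes");
}
```

Storing the promise itself means a second occurrence of the same condition joins the in-flight request instead of starting a new one.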

Example: Content Moderation

moderation.echo
You are a content moderator for a family-friendly platform.

A user submitted this content: "{{user_submission}}"

[#IF {{user_submission}} #ai_gate(Does this contain hate speech or harassment?)]
This content has been flagged for review. Do not approve it.
Explain to the user why it violates our community guidelines.
[ELSE IF {{user_submission}} #ai_gate(Does this contain adult or explicit content?)]
This content is not suitable for our family-friendly platform.
Politely ask the user to modify their submission.
[ELSE]
This content appears appropriate. You may approve it.
[END IF]

Example: Intelligent Routing

triage.echo
You are a customer support triage agent.

Customer message: "{{message}}"

[#IF {{message}} #ai_gate(Is the customer expressing frustration or anger?)]
**Priority: HIGH** - Frustrated customer detected.
Acknowledge their frustration first. Use empathetic language.
Offer immediate solutions or escalation to a manager.
[ELSE IF {{message}} #ai_gate(Is this a billing or payment related issue?)]
Route to billing department. Verify account details before discussing charges.
[ELSE IF {{message}} #ai_gate(Is this a technical problem or bug report?)]
Route to technical support. Gather device/browser info and steps to reproduce.
[ELSE]
General inquiry. Handle with standard support procedures.
[END IF]

Configuration

AI Gate requires an AI provider configuration:

const echo = createEcho({
  aiProvider: {
    type: "openai",       // or "anthropic"
    apiKey: "sk-...",
    model: "gpt-4o-mini", // optional
    timeout: 10000         // optional, milliseconds
  }
})
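
In practice the API key usually comes from an environment variable rather than a hard-coded string. A sketch using the variable names listed under Supported Providers (same `createEcho` options as above; nothing here beyond what the doc specifies):

```javascript
const echo = createEcho({
  aiProvider: {
    type: "anthropic",
    apiKey: process.env.ANTHROPIC_API_KEY, // set in the environment, not committed to source
    model: "claude-sonnet-4-5-20250929",   // optional override
    timeout: 10000                         // optional, milliseconds
  }
})
```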

Caching

Results are cached for 5 minutes using a hash of the value, question, provider type, and model. Identical conditions within the same render won't call the LLM twice.

Performance

All AI Gate conditions run in parallel. A template with 5 gates that each take 500ms will complete in ~500ms total, not 2500ms. Use deterministic operators (#equals, #contains, etc.) when possible — AI Gate adds latency and cost.
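
The latency claim is the standard Promise.all property; a quick self-contained demonstration with simulated 100 ms calls in place of real LLM requests:

```javascript
// Simulate one gate evaluation taking `ms` milliseconds.
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Run n simulated gates concurrently and report wall-clock time:
// total time tracks the slowest call, not the sum of all calls.
async function timedParallel(n, ms) {
  const start = Date.now();
  await Promise.all(Array.from({ length: n }, () => delay(ms)));
  return Date.now() - start;
}
```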

Supported Providers

  • OpenAI — default model: gpt-4o, env: OPENAI_API_KEY
  • Anthropic — default model: claude-sonnet-4-5-20250929, env: ANTHROPIC_API_KEY