Prompts deserve a language.

Echostash is prompt infrastructure powered by Echo PDK — the first DSL for dynamic prompts. Variables, conditionals, roles, tools, schemas. Write prompts that think, not just fill blanks.

75% token reduction
<50ms render latency
5 LLM providers
14+ eval assertions
Visual prompt editor with role blocks and variable badges
Code editor with Echo DSL syntax highlighting
Analytics dashboard with render metrics and error rates
Playground with tool definitions and mock responses
Evaluation runs with quality gates and test results
CI/CD integration with automated prompt testing

Your prompts are still strings?

Everyone else copies prompt text into a GUI. You write real code.

The old way
prompt = f"""You are a support agent.

The customer is {name} with a {tier} plan.

They might be premium or free, we send
everything either way.

They asked about: {issue}

Here are ALL our policies (even irrelevant):
{all_policies}

Here are ALL product docs (5000 tokens):
{all_docs}
"""
With Echo PDK
[#ROLE system]
You are a support agent for {{company}}.
[END ROLE]

[#IF {{tier}} #equals(premium)]
You have priority support and extended tools.
[END IF]

[#IF {{issue_type}} #one_of(billing,refund)]
[#INCLUDE billing_policies]
[END IF]

[#ROLE user]
{{question}}
[END ROLE]

Same prompt. 75% fewer tokens. Only relevant context reaches the LLM.

Powered by Echo PDK

A real language for prompts

Not a string builder. Not a GUI. A domain-specific language with variables, conditionals, roles, tools, and schemas — all resolved at render time.

Send only what matters

Variables & Conditionals

Dynamic variables with defaults, nested access, and typed values. Conditionals filter out irrelevant sections — 75% fewer tokens on average.

[#IF {{tier}} #equals(premium)]
Priority support enabled.
Extended tools: {{tools}}
[ELSE]
Standard support. Upgrade for more.
[END IF]
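To make the token savings concrete, here is a toy renderer in TypeScript (illustrative only, not the Echo PDK engine): sections whose condition is false are dropped before anything is sent, which is where the reduction comes from.

```typescript
// Toy illustration of conditional rendering. Sections guarded by a predicate
// are dropped entirely when it is false, so their tokens never reach the LLM.
type Section = { text: string; when?: (vars: Record<string, string>) => boolean };

function render(sections: Section[], vars: Record<string, string>): string {
  return sections
    .filter((s) => !s.when || s.when(vars))
    // Substitute {{name}} placeholders from the vars map.
    .map((s) => s.text.replace(/\{\{(\w+)\}\}/g, (_, k) => vars[k] ?? ""))
    .join("\n");
}

const sections: Section[] = [
  {
    text: "Priority support enabled.\nExtended tools: {{tools}}",
    when: (v) => v.tier === "premium",
  },
  {
    text: "Standard support. Upgrade for more.",
    when: (v) => v.tier !== "premium",
  },
];

render(sections, { tier: "free" });
// → "Standard support. Upgrade for more."
```

The premium branch, its tool list, and anything else behind a false condition simply never exist in the rendered prompt.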
Structure prompts as conversations

Roles & Messages

Define multi-turn conversations with system, user, and assistant roles. Conditionally include messages based on context.

[#ROLE system]
You are a {{role}} for {{company}}.
[END ROLE]

[#ROLE user]
{{question}}
[END ROLE]
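Rendered with variables like `role`, `company`, and `question`, the role blocks above resolve to a standard messages array, roughly this shape (the values here are illustrative):

```typescript
// Illustrative only: the shape a [#ROLE] template resolves to after rendering
// with vars { role: "support agent", company: "Acme", question: "How do I get a refund?" }.
const messages = [
  { role: "system", content: "You are a support agent for Acme." },
  { role: "user", content: "How do I get a refund?" },
];
```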
Function calling in the prompt

Tools & Schema

Declare tool definitions and JSON schemas directly in your template. Expose tools conditionally based on user permissions or context.

[#TOOL search_docs]
description: Search knowledge base
parameters:
  query:
    type: string
    required: true
[END TOOL]

[#SCHEMA]
answer:
  type: string
  required: true
confidence:
  type: number
[END SCHEMA]
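One plausible mapping of the tool block above into OpenAI's function-calling format looks like this; the real converter output may differ in detail:

```typescript
// Hypothetical mapping of the [#TOOL search_docs] block to OpenAI's
// function-calling format. The converter's exact output may differ.
const tools = [
  {
    type: "function",
    function: {
      name: "search_docs",
      description: "Search knowledge base",
      parameters: {
        type: "object",
        properties: { query: { type: "string" } },
        required: ["query"],
      },
    },
  },
];
```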
Dynamic model config

Meta Templates

Choose the model, temperature, and provider based on the input. Creative tasks get GPT-4o, extraction gets GPT-4o-mini — all in the prompt.

provider: openai
[#IF {{task}} #equals(creative)]
model: gpt-4o
temperature: 0.9
[ELSE]
model: gpt-4o-mini
temperature: 0.2
[END IF]
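The branching above amounts to a small config resolver. As a sketch (not the actual engine):

```typescript
// Toy sketch of meta-template resolution: model config chosen from the input,
// mirroring the [#IF {{task}} #equals(creative)] block above.
function resolveConfig(task: string) {
  return task === "creative"
    ? { provider: "openai", model: "gpt-4o", temperature: 0.9 }
    : { provider: "openai", model: "gpt-4o-mini", temperature: 0.2 };
}

resolveConfig("creative");
// → { provider: "openai", model: "gpt-4o", temperature: 0.9 }
```

The difference is that with meta templates this logic lives in the prompt itself, versioned alongside the text it configures.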

The platform to ship them

Echo PDK is the language. Echostash is the infrastructure. Version, test, deploy, and serve prompts to any LLM provider.

Evals & Quality Gates

14 assertion types, A/B testing, and automated pass/fail thresholds. Block bad deploys before they ship.

Version Control

Every change creates an immutable version. Commit, diff, tag, publish, and deploy to named targets.

Playground

Visual editor with variable badges, code mode with Monaco, render preview, and LLM run with tool mocking.

SDKs & API

JavaScript and Python SDKs with a fluent API. Fetch, render, and convert prompts in one line.

5 Provider Converters

One prompt, any LLM. Convert to OpenAI, Anthropic, Google, Vercel AI, or LangChain format instantly.

Server-Side Render

Render prompts on the server with variables. Get back resolved messages, tools, and model config. Batch up to 50.
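A batch render request might be shaped like this; the field names are assumptions for illustration, so check the API reference for the real payload:

```typescript
// Hypothetical batch render payload (field names are illustrative, not the
// documented API shape). The stated limit is 50 renders per batch.
const batch = {
  renders: [
    { prompt: "welcome-message", vars: { userName: "Alice" } },
    { prompt: "welcome-message", vars: { userName: "Bob" } },
  ],
};

const withinLimit = batch.renders.length <= 50;
```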

One prompt, any provider

Fetch, render, convert — in one line. Spread the result directly into your LLM call.

import { Echostash } from "@goreal-ai/echostash"

// Initialize the client
const es = new Echostash({
  apiKey: process.env.ECHOSTASH_API_KEY
})

// Fetch, render, and convert to OpenAI format
const messages = await es.prompt("welcome-message")
  .vars({ userName: "Alice", language: "English" })
  .openai()

Works with your stack

Built-in converters for OpenAI, Anthropic, Google, Vercel AI SDK, and LangChain. Same prompt, any model.

OpenAI
Anthropic
Google
Meta
Mistral
example.ts
const result = await es
  .prompt("support-agent")
  .vars({ name: "Alice" })
  .openai()

// { messages, tools, model, temperature }
// Spread directly into your LLM call
await openai.chat.completions.create(result)

The language is open source

Echo PDK is MIT licensed. Echostash is the platform built on top. Use the DSL standalone, or get the full infrastructure.

Ship prompts like you ship code.

A real language. Version control. Evals. Quality gates. Five provider converters. Start free.

Read the Docs