Open Protocol

The open protocol for prompt sharing.

Publish, discover, and deploy prompts across tools and teams. Now with eval integration and one-call deploy.

Standards-based. Developer-friendly.

Built on OpenAPI 3.0 for automatic client generation and comprehensive documentation.

openapi.yaml
# PLP OpenAPI Specification (excerpt)
openapi: "3.0.0"
info:
  title: "Prompt Library Protocol"
  version: "1.1.0"

paths:
  /prompts:
    get:
      summary: "List prompts"
      parameters:
        - name: search
          in: query
        - name: tags
          in: query

  /prompts/{id}:
    get:
      summary: "Get a prompt"
    post:
      summary: "Render a prompt"

  /prompts/{id}/deploy:
    post:
      summary: "Deploy to environment"

  /prompts/{id}/eval:
    post:
      summary: "Run evaluation suite"
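The paths in the excerpt map directly to HTTP requests. A minimal sketch of a render call against POST /prompts/{id}; the server URL, prompt id, and the "variables" request field are assumptions, since the excerpt does not define request schemas:

```python
import json
from urllib.parse import quote

BASE_URL = "https://plp.example.com"   # hypothetical PLP server
PROMPT_ID = "summarize-v2"             # hypothetical prompt id

# POST /prompts/{id} renders a prompt. The "variables" field is an
# assumed request shape; the spec excerpt omits request bodies.
url = f"{BASE_URL}/prompts/{quote(PROMPT_ID)}"
body = json.dumps({"variables": {"text": "An article to summarize."}})

# Send `body` to `url` with any HTTP client, e.g. requests.post(url, data=body)
print(url)
```

Because PLP is plain OpenAPI over HTTP, any client that can POST JSON can render a prompt; no SDK is required.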

One-Call Deploy


Deploy prompts to any environment with a single API call. POST /prompts/{id}/deploy — staging, production, or custom targets.
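A one-call deploy might look like the following sketch; the server URL, prompt id, and the "target" field are assumptions, as the spec excerpt does not show the deploy request schema:

```python
import json
from urllib.parse import quote

BASE_URL = "https://plp.example.com"   # hypothetical PLP server
PROMPT_ID = "summarize-v2"             # hypothetical prompt id

# One call: POST /prompts/{id}/deploy. Naming the environment via a
# "target" field (staging, production, or a custom target) is an assumption.
deploy_url = f"{BASE_URL}/prompts/{quote(PROMPT_ID)}/deploy"
deploy_body = json.dumps({"target": "staging"})

print(deploy_url)
```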

Eval Integration


Run eval suites over the open protocol. POST /prompts/{id}/eval with test cases and assertions to validate prompt quality before shipping.
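A sketch of an eval request body with test cases and assertions; the payload shape below ("cases", "input", "assertions", and the assertion types) is an assumption, since the spec excerpt does not define the eval request schema:

```python
import json

# POST /prompts/{id}/eval. All field names below are assumed, not
# taken from the spec excerpt, which omits request bodies.
eval_body = {
    "cases": [
        {
            "input": {"text": "PLP is an open protocol for prompt sharing."},
            "assertions": [
                {"type": "contains", "value": "open protocol"},
                {"type": "max_length", "value": 200},
            ],
        }
    ],
}

payload = json.dumps(eval_body)
print(payload)
```

Running evals over the same protocol as deploys means a CI job can gate a deploy call on a passing eval response.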

JS & Python SDKs

First-party JavaScript and Python SDKs with full TypeScript support. Fetch, render, deploy, and eval — all from code.

Open Standard

PLP is fully open source under the MIT license. Build, extend, and integrate without restrictions.

OpenAPI Specification

Complete OpenAPI 3.0 spec for easy integration. Generate clients in any language automatically.

Interoperability

Any PLP-compatible server works with any PLP client. Mix and match implementations freely.

Discovery API

Search, filter, and discover prompts programmatically. Build your own registry or marketplace.
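Discovery uses the search and tags query parameters shown in the spec excerpt. A minimal sketch, assuming a hypothetical server URL and example parameter values:

```python
from urllib.parse import urlencode

BASE_URL = "https://plp.example.com"   # hypothetical PLP server

# GET /prompts with the "search" and "tags" query parameters from the
# spec excerpt; the parameter values here are illustrative.
query = urlencode({"search": "summarization", "tags": "nlp,production"})
list_url = f"{BASE_URL}/prompts?{query}"

print(list_url)
```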

Secure by Design

Built-in authentication patterns, rate limiting specs, and security best practices.

Deploy and eval over an open protocol.

PLP is MIT licensed and community-driven. One API call to deploy, one to eval. Build without restrictions.