Public API & SDK
Endpoint reference
Every `/api/v1` endpoint with method, parameters, and an example request.
Updated 2026-04-13
List prompts
GET /api/v1/prompts
Query params:
page (integer, default 1)
per_page (integer, default 20, max 100)
folder_id (uuid, filter to one folder)
tag (string, filter to one tag slug)
type (system|user|multi-turn|template)
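An example request, sketched with Python's standard library only. The base URL is an assumption (this reference does not state one), and no authentication header is shown:

```python
from urllib.parse import urlencode

# Assumption: the real base URL is not given in this reference.
BASE = "https://api.promptassay.example"

# Second page of template prompts, 50 per page (max is 100).
params = {"page": 2, "per_page": 50, "type": "template"}
url = f"{BASE}/api/v1/prompts?{urlencode(params)}"

# Fetch with any HTTP client, e.g.:
#   import json, urllib.request
#   body = json.load(urllib.request.urlopen(url))
#   prompts, meta = body["data"], body["meta"]
```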
Response: { data: PromptSummary[], meta: { page, per_page, total } }

Get one prompt
GET /api/v1/prompts/{id}
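A request sketch for this endpoint; the base URL and prompt id are hypothetical placeholders:

```python
BASE = "https://api.promptassay.example"   # assumption: base URL not stated in this reference
prompt_id = "0b8e-example"                 # hypothetical prompt id

url = f"{BASE}/api/v1/prompts/{prompt_id}"

# e.g.:
#   import json, urllib.request
#   prompt = json.load(urllib.request.urlopen(url))["data"]
```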
Response: { data: Prompt }
Prompt includes: id, title, description, current_content,
prompt_type, target_model, intent, tags[], folder_id,
current_version, created_at, updated_at

Get resolved content
GET /api/v1/prompts/{id}/resolved
Query params:
fragment_vars (JSON object, overrides for fragment variable bindings)
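Because fragment_vars is a JSON object passed in the query string, it must be serialized and URL-encoded. A sketch with Python's stdlib; the base URL, prompt id, and variable name are assumptions:

```python
import json
from urllib.parse import urlencode

BASE = "https://api.promptassay.example"   # assumption: base URL not stated in this reference
prompt_id = "0b8e-example"                 # hypothetical prompt id

# Hypothetical fragment variable override; sent as one JSON object.
overrides = {"audience": "developers"}
qs = urlencode({"fragment_vars": json.dumps(overrides)})
url = f"{BASE}/api/v1/prompts/{prompt_id}/resolved?{qs}"
```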
Response: { data: { content: string, fragments_used: string[] } }
The content field has every ${fragment:id} reference substituted
and is ready to pass to an LLM provider directly.

List versions
GET /api/v1/prompts/{id}/versions
Query params:
page, per_page
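A paginated request sketch (base URL and prompt id are assumptions; summaries do not include the content itself):

```python
from urllib.parse import urlencode

BASE = "https://api.promptassay.example"   # assumption: base URL not stated in this reference
prompt_id = "0b8e-example"                 # hypothetical prompt id

qs = urlencode({"page": 1, "per_page": 20})
url = f"{BASE}/api/v1/prompts/{prompt_id}/versions?{qs}"

# Each item in data is a VersionSummary; fetch a single version
# to retrieve the full content.
```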
Response: { data: VersionSummary[], meta }
VersionSummary has: version_number, change_summary, change_source,
parent_version_id, created_at, content_length (NOT content)

Get one version
GET /api/v1/prompts/{id}/versions/{version}
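A request sketch; the base URL, prompt id, and version number are hypothetical:

```python
BASE = "https://api.promptassay.example"   # assumption: base URL not stated in this reference
prompt_id = "0b8e-example"                 # hypothetical prompt id
version = 3                                # hypothetical version_number

url = f"{BASE}/api/v1/prompts/{prompt_id}/versions/{version}"
```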
Response: { data: Version }
Version includes everything from VersionSummary plus the content.

Read-only
The public API is read-only in this release: no mutations are accepted
under /api/v1. If you need to create or update prompts, use the
PromptAssay UI, or file a feature request for programmatic write access.