cURL

```shell
curl --request POST \
  --url https://api.prompt-wall.com/v1/chat \
  --header 'Authorization: Bearer <token>' \
  --header 'Content-Type: application/json' \
  --data '{
    "prompt": "<string>",
    "messages": [
      {
        "role": "system",
        "content": "<string>"
      }
    ],
    "model": "gpt-4o-mini",
    "provider": "openai",
    "temperature": 0.2
  }'
```
```json
{
  "ok": true,
  "answer": "<string>",
  "confidence": "<string>",
  "governance_action": "<string>",
  "verified_source_used": true,
  "policy": "<string>",
  "latency_ms": 123,
  "tokens": {
    "prompt": 123,
    "completion": 123,
    "total": 123
  },
  "matches": [
    "<string>"
  ],
  "request_id": "<string>"
}
```
Runs the full pipeline: scanner → policy → grounding → tool → LLM → judge → enforcement. The LLM provider is determined by your API key's mode (BYOK or Managed).
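As a sketch, a request to this endpoint can be assembled in Python as shown below. The `build_chat_request` helper and its defaults are illustrative, not part of the API; the header and body fields mirror the cURL example above.

```python
import json

API_URL = "https://api.prompt-wall.com/v1/chat"  # endpoint from the example above

def build_chat_request(token, prompt, messages=None,
                       model="gpt-4o-mini", provider="openai",
                       temperature=0.2):
    """Assemble the headers and JSON body for POST /v1/chat."""
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = {
        "prompt": prompt,
        "messages": messages or [],
        "model": model,
        "provider": provider,
        "temperature": temperature,
    }
    return headers, json.dumps(body)

# Example: a system message plus a user prompt
headers, payload = build_chat_request(
    "my-token",
    "Summarize our refund policy.",
    messages=[{"role": "system", "content": "Answer only from verified sources."}],
)
```

Any HTTP client can then POST `payload` to `API_URL` with `headers`.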
`Authorization` (header): Bearer authentication header of the form `Bearer <token>`, where `<token>` is your auth token.
`messages`: Optional conversation history.
`model`: Override the default model. Default: `"gpt-4o-mini"`.
`provider`: One of `openai`, `anthropic`, `google`, `azure`, `bedrock`.
Response: Governed chat response.
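A minimal sketch of handling the governed response, assuming only the fields shown in the response example above (`summarize_response` and the sample dict are illustrative, not part of the API):

```python
def summarize_response(resp: dict) -> str:
    """Pick out the key fields from a /v1/chat response dict."""
    if not resp.get("ok"):
        # request_id is useful when reporting failures
        return f"request {resp.get('request_id')} failed"
    grounding = "grounded" if resp.get("verified_source_used") else "ungrounded"
    return f"{resp['answer']} [{grounding}, {resp['tokens']['total']} tokens]"

# Hypothetical response payload following the schema above
sample = {
    "ok": True,
    "answer": "Refunds are issued within 14 days.",
    "verified_source_used": True,
    "tokens": {"prompt": 42, "completion": 10, "total": 52},
    "request_id": "req_123",
}
```

Fields like `governance_action` and `policy` can be inspected the same way once their possible values are known.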