Why BYOK
- Keep your existing LLM provider contracts
- No PromptWall markup on token costs (you pay your provider directly)
- Full control over model selection, temperature, etc.
Setup
- Dashboard → Apps → Create new → Mode: Webhook BYOK
- Paste your LLM provider API key — encrypted with your KMS key at rest
- Select provider (openai / anthropic / google / azure / bedrock) and default model
- Copy the generated PromptWall API key and use it in your app in place of your provider key
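A minimal sketch of the client-side call after setup. The endpoint URL, key format, and payload shape below are assumptions (check your dashboard for the real values); the point is that your app authenticates with the PromptWall key, never the provider key:

```python
import json

# Hypothetical endpoint and key — substitute the values from your dashboard.
PROMPTWALL_URL = "https://api.promptwall.example/v1/chat/completions"
PROMPTWALL_API_KEY = "pw_live_..."  # the PromptWall key, NOT your provider key

headers = {
    "Authorization": f"Bearer {PROMPTWALL_API_KEY}",
    "Content-Type": "application/json",
}
payload = {
    # If "model" is omitted, the default model configured in the dashboard applies.
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello"}],
}
body = json.dumps(payload)
# Send with your HTTP client of choice, e.g.:
# urllib.request.Request(PROMPTWALL_URL, data=body.encode(), headers=headers)
```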
Flow
- Your app calls the PromptWall endpoint using the PromptWall API key
- PromptWall runs its checks, then forwards the request to your configured provider using your stored provider key
- The provider's response is returned to your app through PromptWall
Credentials storage
- Encrypted at rest with Fernet using a KMS-derived key per tenant
- Never logged, never visible in dashboards
- Rotated with one click from the dashboard
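The per-tenant encryption scheme above can be sketched as follows. This is illustrative, not the actual implementation: it assumes the KMS-derived key is produced by running the KMS master key through HKDF with the tenant ID as context, using the `cryptography` package's Fernet and HKDF primitives:

```python
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF


def tenant_fernet(master_key: bytes, tenant_id: str) -> Fernet:
    """Derive a per-tenant Fernet key from a KMS master key via HKDF."""
    derived = HKDF(
        algorithm=hashes.SHA256(),
        length=32,                  # Fernet requires a 32-byte key
        salt=None,
        info=tenant_id.encode(),    # binds the derived key to this tenant
    ).derive(master_key)
    return Fernet(base64.urlsafe_b64encode(derived))


master = os.urandom(32)  # stand-in for a key fetched from your KMS
f = tenant_fernet(master, "tenant-123")
token = f.encrypt(b"sk-provider-api-key")  # what gets stored at rest
assert f.decrypt(token) == b"sk-provider-api-key"
```

Because the tenant ID feeds the derivation, a ciphertext encrypted for one tenant cannot be decrypted with another tenant's key, even under the same master key.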