Why BYOK

  • Keep your existing LLM provider contracts
  • No PromptWall markup on tokens
  • Full control over model selection, temperature, etc.

Setup

  1. Dashboard → Apps → Create new → Mode: Webhook BYOK
  2. Paste your LLM provider API key; it is encrypted at rest with your KMS key
  3. Select provider (openai / anthropic / google / azure / bedrock) and default model
  4. Copy the PromptWall API key and use it from your app
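Once setup is complete, your app talks only to PromptWall. A minimal request sketch follows; the `/v1/chat` path comes from the flow below, while the host name, header shape, and payload fields are assumptions for illustration:

```python
import json

PROMPTWALL_API_KEY = "pw_live_example"  # the key copied in step 4 (placeholder value)

def build_chat_request(prompt: str, model: str = "gpt-4o") -> dict:
    """Build a POST /v1/chat request; PromptWall forwards it to your provider.

    The base URL and field names here are hypothetical, not a published schema.
    """
    return {
        "url": "https://api.promptwall.example/v1/chat",
        "headers": {
            "Authorization": f"Bearer {PROMPTWALL_API_KEY}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({"prompt": prompt, "model": model}),
    }

req = build_chat_request("Summarize our refund policy.")
print(req["headers"]["Authorization"])
```

Note that your provider API key never appears in the request: PromptWall holds it server-side and uses it on your behalf.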

Flow

  1. Your app POSTs to /v1/chat with the prompt
  2. PromptWall runs the scanner, policy checks, and grounding
  3. PromptWall decrypts your LLM key and calls the provider
  4. PromptWall judges the answer against tool results
  5. PromptWall returns the governed response
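The flow above can be sketched as a pipeline. Every stage body here is a toy stand-in for illustration only, not PromptWall's actual internals:

```python
def scan(prompt: str) -> bool:
    """Stand-in injection scanner: block one obvious override phrase."""
    return "ignore previous instructions" not in prompt.lower()

def judge(answer: str, tool_results: list) -> bool:
    """Stand-in grounding judge: require the answer to echo a tool result."""
    return not tool_results or any(r in answer for r in tool_results)

def governed_chat(prompt: str, call_provider) -> dict:
    """Run the governed pipeline: scan, call the provider, judge, return."""
    if not scan(prompt):
        return {"blocked": True, "reason": "scanner"}
    # In the real flow this call is made with your decrypted provider key.
    answer = call_provider(prompt)
    if not judge(answer, tool_results=[]):
        return {"blocked": True, "reason": "judge"}
    return {"blocked": False, "answer": answer}

print(governed_chat("Hello", lambda p: "Hi there"))
# → {'blocked': False, 'answer': 'Hi there'}
```

The point of the sketch is the ordering: the prompt is screened before your LLM key is ever used, and the answer is judged before it reaches your app.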

Credentials storage

  • Encrypted at rest with Fernet using a KMS-derived key per tenant
  • Never logged, never visible in dashboards
  • Rotated with one click from the dashboard
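The per-tenant encryption scheme can be sketched with the `cryptography` package's Fernet recipe. In production the 32-byte key material would come from your KMS; the hash-based derivation and the master secret below are stand-ins for illustration:

```python
import base64
import hashlib
from cryptography.fernet import Fernet

def tenant_fernet(master_secret: bytes, tenant_id: str) -> Fernet:
    """Derive a per-tenant Fernet key (stand-in for a KMS-derived key).

    Fernet requires 32 url-safe base64-encoded bytes, which a SHA-256
    digest provides. A real deployment would use the KMS, not SHA-256.
    """
    raw = hashlib.sha256(master_secret + tenant_id.encode()).digest()
    return Fernet(base64.urlsafe_b64encode(raw))

f = tenant_fernet(b"kms-master-secret", "tenant-42")
token = f.encrypt(b"sk-your-provider-key")   # stored at rest
assert f.decrypt(token) == b"sk-your-provider-key"
```

Because each tenant's key is derived independently, a token encrypted for one tenant cannot be decrypted with another tenant's Fernet instance.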