- **Verify Only** — You already have an answer. We validate it.
- **Webhook BYOK** — You bring your LLM API key. We run the full pipeline.
- **Webhook Managed** — We provide the LLM. You just send prompts.
1. Verify Only
Use when:
- You already call the LLM yourself and want an independent sanity check
- You’re adding governance to an existing pipeline with minimal surface-area change
- You want the lowest latency (~150 ms p95)
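As a sketch, a Verify Only request carries both the prompt and the answer your own LLM call already produced, so PromptWall only validates. The endpoint path, auth header, and field names below are assumptions, not the documented API:

```python
import json

# Hypothetical Verify Only request body: you already called your own LLM,
# so PromptWall receives the prompt/answer pair for validation only.
payload = {
    "prompt": "Summarize the Q3 report in two sentences.",
    "answer": "Revenue grew 12% QoQ, driven by the new enterprise tier.",
}

# The request itself would be something like (endpoint and header assumed):
#   POST https://api.promptwall.example/v1/verify
#   Authorization: Bearer <verify-mode API key>
body = json.dumps(payload)
```

Because no LLM call happens on PromptWall's side, this is the mode behind the ~150 ms p95 figure.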
2. Webhook BYOK (Bring Your Own Key)
Use when:
- You already have provider contracts (OpenAI enterprise, Azure, etc.)
- You want the full pipeline: scan → LLM → verify → audit
- You don’t want PromptWall markup on tokens
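A BYOK request might additionally carry your provider credentials so PromptWall can run the scan → LLM → verify → audit pipeline against your own contract. The field names and environment variable below are hypothetical:

```python
import json
import os

# Hypothetical Webhook BYOK request body: PromptWall runs the full
# pipeline but calls the provider with YOUR key, so token spend stays
# on your existing contract with no PromptWall markup.
payload = {
    "prompt": "Draft a refund-policy reply for order #1182.",
    "provider": "openai",                                        # assumed field
    "provider_key": os.environ.get("OPENAI_API_KEY", "sk-demo"), # your key
    "model": "gpt-4o",                                           # assumed field
}
body = json.dumps(payload)
```

Keeping the key in an environment variable (rather than the source) is the usual practice; PromptWall would forward it to the provider for that request only.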
3. Webhook Managed
Use when:
- You want the simplest integration (no LLM provider setup)
- Your load is variable, so a fixed commitment doesn’t make sense
- Your team doesn’t want to manage multiple vendor relationships
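In Managed mode the request reduces to just a prompt, since PromptWall supplies the LLM. Again, the shape of the body is a sketch, not the documented schema:

```python
import json

# Hypothetical Webhook Managed request body: no provider setup at all.
# You send only the prompt; PromptWall supplies the LLM and bills the
# token usage back to you.
payload = {
    "prompt": "Classify this support ticket: 'My card was charged twice.'",
}
body = json.dumps(payload)
```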
Comparison
| | Verify | BYOK | Managed |
|---|---|---|---|
| LLM tokens | n/a | Customer pays provider | PromptWall pays |
| Integration complexity | Low | Medium | Low |
| Latency (p95) | ~150 ms | ~800 ms | ~800 ms |
| Our cost per request | $0.0001 | $0.001 | $0.001 + tokens |
| Best for | Existing pipelines | Enterprises with provider contracts | Startups, variable load |
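To make the per-request figures concrete, here is a back-of-envelope comparison at an assumed monthly volume, using only the numbers from the table (LLM token spend excluded, since it varies by provider and usage):

```python
# Monthly platform cost at 1M requests/month, from the per-request
# figures in the comparison table. Token charges are excluded.
requests_per_month = 1_000_000

verify_cost = requests_per_month * 0.0001   # Verify Only
webhook_cost = requests_per_month * 0.001   # BYOK and Managed, before tokens

print(f"Verify Only: ${verify_cost:,.0f}")   # → Verify Only: $100
print(f"Webhook:     ${webhook_cost:,.0f}")  # → Webhook:     $1,000
```

At this volume the webhook modes cost 10× the Verify Only platform fee, before any token charges.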
Switching modes
A single tenant can use multiple modes via different API keys:
- Create a new “app” in the dashboard
- Choose the mode per-app
- Point your code at the appropriate API key
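The per-app setup above can be sketched as simple key routing in client code, since the API key alone determines the mode. The environment-variable names here are hypothetical:

```python
import os

# Hypothetical per-app API keys, one per mode. Each "app" created in the
# dashboard gets its own key, and the key determines the mode.
KEYS = {
    "verify":  os.environ.get("PW_VERIFY_KEY", "pw_verify_demo"),
    "byok":    os.environ.get("PW_BYOK_KEY", "pw_byok_demo"),
    "managed": os.environ.get("PW_MANAGED_KEY", "pw_managed_demo"),
}

def auth_header(mode: str) -> dict:
    """Select the API key for the desired mode; no other code change needed."""
    return {"Authorization": f"Bearer {KEYS[mode]}"}
```

Switching a call site from one mode to another is then a one-line change of the `mode` argument.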