v29 enclave live · PCR0 published · source public
The guardrail layer you can prove.
Drop-in OpenAI/Anthropic proxy. Safety classification runs inside an AWS Nitro Enclave whose binary you can rebuild from public source and attest yourself. Your prompts, responses, and provider keys never leave the trust boundary.
Verifiable, not just promised
Every request is classified inside an AWS Nitro Enclave whose binary you can rebuild from public source and check against the PCR0 we publish.
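The verification step reduces to comparing the measurement you rebuilt against the published one. A minimal sketch, assuming `nitro-cli build-enclave` has already emitted PCR0 for your rebuilt image (the published value below is a placeholder, not a real measurement):

```python
# Sketch: compare a rebuilt PCR0 against the published one.
# PUBLISHED_PCR0 is a placeholder; the real value is a 96-hex-char
# SHA-384 measurement emitted when you rebuild the enclave image.
PUBLISHED_PCR0 = "<published-pcr0-hex>"

def attestation_ok(rebuilt_pcr0: str) -> bool:
    # Case-insensitive compare of the two hex measurements.
    return rebuilt_pcr0.strip().lower() == PUBLISHED_PCR0.strip().lower()
```

If the two values match, the binary serving your traffic is the binary you built from the public source.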
Zero data retention by construction
The events table that leaves the enclave has no fields for prompts or responses; the record type enforces that invariant at type-check time, not by policy.
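A minimal sketch of what such a record might look like (field names here are illustrative, not the actual schema):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ClassificationEvent:
    # Metadata only: there is no field that could carry a prompt or a
    # response, so payload leakage is a type error, not a policy violation.
    request_id: str
    endpoint: str
    verdict: str      # e.g. "allow" / "block"
    latency_ms: int
```

Because the type simply has nowhere to put a prompt, "zero retention" doesn't depend on anyone remembering to redact.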
We never see your provider keys
Your OpenAI / Anthropic key is wrapped under a KMS key whose policy releases plaintext only to an attested enclave. The parent process never holds it.
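One way this is typically expressed is a condition on the KMS `Decrypt` call that pins the caller's attestation to a specific PCR0. A hedged sketch of such a key-policy statement (account ID, role name, and PCR0 value are placeholders; the condition key is based on AWS's Nitro Enclaves attestation support for KMS):

```python
# Sketch of a KMS key-policy statement gating Decrypt on enclave attestation.
# All identifiers below are placeholders, not real values.
key_policy_statement = {
    "Sid": "AllowDecryptOnlyFromAttestedEnclave",
    "Effect": "Allow",
    "Principal": {"AWS": "arn:aws:iam::111122223333:role/enclave-parent"},
    "Action": "kms:Decrypt",
    "Resource": "*",
    "Condition": {
        "StringEqualsIgnoreCase": {
            # KMS checks this against the attestation document the enclave
            # sends with the request; a non-attested caller gets nothing.
            "kms:RecipientAttestation:PCR0": "<published-pcr0-hex>"
        }
    },
}
```

With a policy shaped like this, even an operator with full access to the host cannot decrypt the wrapped key outside the enclave.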
Best-in-class classifiers
Dispatch across IBM Granite Guardian 4.1 (BYOC custom criteria) and Qwen3Guard-Gen 0.6B / 4B / 8B. Choose latency vs. accuracy per endpoint.
Drop-in proxy
Point your OpenAI/Anthropic SDK at our endpoint. Same wire format, same response shape. No SDK rewrites.
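As a concrete sketch (the proxy URL is a placeholder, not the real endpoint): the official `openai` Python SDK reads `OPENAI_BASE_URL`, so the switch is a configuration change rather than a code change:

```python
import os

# Point the SDK at the proxy. Same wire format, same response shape, so no
# application code changes. "proxy.example.com" is a placeholder URL.
os.environ["OPENAI_BASE_URL"] = "https://proxy.example.com/v1"
os.environ["OPENAI_API_KEY"] = "sk-placeholder"  # your existing key, unchanged

# from openai import OpenAI
# client = OpenAI()  # picks up both variables automatically
```

The same pattern applies to the Anthropic SDK via its base-URL setting.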
Custom rules without retraining
BYOC: ship plain-English rules (“no internal codenames”) per endpoint. Granite Guardian evaluates each criterion in declared order.
Pricing
Simple, usage-based.
One plan. First 500 requests every month included; pay only for what you use beyond that. No seats, no minimums, cancel anytime.
Standard
Free tier + pay-as-you-go overage.
- First 500 requests every month — free
- $1.99 per additional 500 requests (billed per request, prorated fractionally)
- Unlimited endpoints, custom policies, and team members
- All classifier tiers (fast / expert / heavy) + BYOC
- Live attestation, archived per-request COSE docs (opt-in)
A card is required even on the $0 tier so that usage above 500 requests can be billed.
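The overage math above works out as follows (a sketch; the rounding behavior is an assumption, not a billing guarantee):

```python
# First 500 requests per month are free; beyond that, $1.99 per 500,
# prorated per individual request.
FREE = 500
RATE_PER_500 = 1.99

def monthly_cost(requests: int) -> float:
    billable = max(0, requests - FREE)
    return round(billable * RATE_PER_500 / 500, 2)
```

For example, 1,700 requests in a month means 1,200 billable requests, or about $4.78; 400 requests costs nothing.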
Trust
Read the code that classifies your traffic.
The exact source baked into every running enclave is public. Build it yourself, compare PCR0 to the value we publish, and verify the trust chain end-to-end.