Deployment

Ship the website and the Guardrails API. Keep evidence portable, controls verifiable, and policy enforcement centralized—regardless of where you host.

Architecture overview

The website proxies API requests through /__guardrails/*, so the frontend code works identically in local, staging, and production.
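
To sanity-check the proxy during local development, a minimal sketch; it assumes the Astro dev server is on its default port (4321) and the Guardrails API is listening on 8099:

# Run npm run dev in website/ and start the API, then request the proxy
# prefix. A response from the Guardrails API (rather than a static 404)
# confirms the rewrite is wired up.
curl -i http://localhost:4321/__guardrails/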

Deploy the website

From website/:

npm install
npm run build

This produces a dist/ folder. Deploy it to your static host of choice.
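
If you serve the site from your own web server, deployment can be as simple as syncing dist/ to the web root; a sketch with placeholder user, host, and path (managed static hosts have their own upload or Git-based flows):

# Sync the build output over SSH. User, host, and path are placeholders.
rsync -avz --delete dist/ deploy@your-host:/var/www/guardrails-site/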

Environment note: The proxy path (/__guardrails/*) is configured in astro.config.mjs. In production, your hosting platform or reverse proxy performs the same rewrite to your Guardrails API (see API proxy configuration below).

Deploy the Guardrails API

The Guardrails API is a FastAPI application in hexarch-guardrails-py/.
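
A minimal environment sketch, assuming the package in hexarch-guardrails-py/ is pip-installable (adjust to the project's actual packaging and dependency files):

# Create an isolated environment and install the Guardrails API package.
cd hexarch-guardrails-py
python -m venv .venv && source .venv/bin/activate
pip install .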

Minimum requirements:

Example production startup:

python -m hexarch_cli serve api \
  --host 0.0.0.0 \
  --port 8099 \
  --init-db \
  --enable-docs \
  --cors-origins https://yourdomain.com \
  --database-url postgresql://user:pass@host:5432/hexarch
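
Once the process is up, a quick smoke test; this assumes --enable-docs exposes FastAPI's interactive docs at the default /docs path:

# Expect HTTP 200 if the API started and the docs are enabled.
curl -I http://localhost:8099/docs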

API proxy configuration

In production, configure your reverse proxy (NGINX, Caddy, Cloudflare) or hosting platform to route:

/__guardrails/* → https://your-guardrails-api/*

This keeps the frontend code consistent. The Proof Demo and all API calls use the same relative paths in every environment.
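
As one concrete option, a hedged NGINX sketch that serves the static build and forwards the prefix to the API; the server name, web root, and upstream address are placeholders, and Caddy or platform-level rewrite rules work just as well:

# Write a minimal site config, then validate and reload NGINX.
# TLS termination is omitted for brevity.
sudo tee /etc/nginx/conf.d/guardrails.conf > /dev/null <<'EOF'
server {
    listen 80;
    server_name yourdomain.com;

    # Static website build (the dist/ output from website/).
    root /var/www/guardrails-site;

    # /__guardrails/* -> Guardrails API; trailing slashes strip the prefix.
    location /__guardrails/ {
        proxy_pass http://127.0.0.1:8099/;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
EOF
sudo nginx -t && sudo systemctl reload nginx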

Production considerations

Database

Audit chain integrity

The tamper-evident audit chain is only as trustworthy as your deployment. Consider:

Scaling

The Guardrails API itself is stateless; all persistent state lives in the database. You can run multiple instances behind a load balancer. For high-throughput deployments, use Postgres with connection pooling.
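
A sketch of that layout using the CLI shown above; the ports are arbitrary, --init-db is assumed to be one-time schema setup (so only the first instance runs it), and in production each instance would run under a process manager or container rather than as a shell background job:

# Start identical instances sharing one Postgres database, then register
# both ports as upstream targets in your load balancer.
python -m hexarch_cli serve api --host 0.0.0.0 --port 8099 --init-db \
  --database-url postgresql://user:pass@host:5432/hexarch &
python -m hexarch_cli serve api --host 0.0.0.0 --port 8100 \
  --database-url postgresql://user:pass@host:5432/hexarch &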

Next steps