Monitoring · Live dashboard · refreshed hourly (Redis TTL)

🇺🇸 US Gaming Revenue Dashboard

US-only executive dashboard covering — states where gaming/sports-betting revenue is publicly reported and parsed (loading…), — operators, — months of history, and — all-time GGR. All values in USD (no FX).

Executive Snapshot

GGR Latest Month
—
GGR YTD
—
All jurisdictions · all verticals
Tax Paid YTD
—
State + federal contributions
All-Time GGR
—

Pipeline Health

File-collection scope. Unlike the Executive Snapshot above, which counts only jurisdictions with parsed dollar data, Pipeline Health counts every regulator the collector attempted, including those that return only a landing page (anti-bot WAFs, e.g. CO/AZ/MI) or no data at all.

Regulators Tracked
—
17 US states + 10 intl.
Collection Runs
—
cron + ad-hoc backfills
Files Mirrored
—
success / attempted
Total Archived
—
SHA-256 verified bytes

Top operators by GGR

last 12 mo · brand-level only

GGR by jurisdiction

last 12 months · USD

Monthly GGR by jurisdiction

last 24 months · USD

Files mirrored over time

last 90 days

Per-state × vertical

latest run

Format mix

latest run

Cadence mix

latest run

How the dashboard stays fresh

Every Monday at 12:00 UTC a Python collector streams the latest reports from each US state regulator, verifies each file with SHA-256, parses preview rows with openpyxl / pdfplumber, and writes both a static JSON snapshot (Reports page) and a time-series row in Postgres. A FastAPI service reads from Postgres, caches the JSON response in Redis (TTL 1 hour), and serves it at /api/regulator-reports/*. Nginx reverse-proxies that path to the API container on 127.0.0.1:8200. This page fetches those endpoints on load and renders them with Chart.js 4.
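The collect step above (httpx streaming plus SHA-256) can be sketched roughly as follows. This is an illustrative sketch, not the project's actual collector: the function names, the destination path handling, and the chunked-hash helper are all assumptions.

```python
# Sketch of step 01 (Collect): stream a regulator report with httpx while
# computing SHA-256 on the fly, so large PDF/XLSX files are never fully
# buffered in memory. Names and structure are illustrative.
import hashlib


def sha256_of_chunks(chunks) -> str:
    """Fold an iterable of byte chunks into a SHA-256 hex digest."""
    digest = hashlib.sha256()
    for chunk in chunks:
        digest.update(chunk)
    return digest.hexdigest()


def mirror_report(url: str, dest: str) -> str:
    """Download `url` to `dest`, returning the SHA-256 of the bytes written."""
    import httpx  # third-party; imported lazily so the hash helper above
                  # works even where httpx is not installed

    digest = hashlib.sha256()
    with httpx.stream("GET", url) as response:
        response.raise_for_status()
        with open(dest, "wb") as fh:
            for chunk in response.iter_bytes():
                digest.update(chunk)  # hash and write in one pass
                fh.write(chunk)
    return digest.hexdigest()
```

The single-pass design matters for the "SHA-256 verified bytes" counter: the digest is computed from exactly the bytes that landed on disk, so a truncated download cannot be mistaken for a verified mirror.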

01
Collect
httpx streaming + SHA-256 + pdfplumber
02
Store
Postgres 15 time-series (runs, snapshots)
03
Serve
FastAPI + asyncpg + SQLAlchemy 2
04
Cache
Redis LRU, 128 MB, 1 h TTL
05
Render
Chart.js 4, this page
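Steps 03-04 (Serve + Cache) boil down to a read-through cache in front of the Postgres query. A minimal sketch of that pattern, with an in-memory stand-in for Redis so it runs without a server; the class and function names are illustrative, not the service's actual code:

```python
# Read-through cache sketch: return cached JSON if fresh, otherwise run the
# database query, cache the result with a TTL, and return it. Redis's
# `SET key value EX 3600` maps onto the same get/set interface used here.
import json
import time
from typing import Callable


class TTLCache:
    """Minimal get/set-with-expiry store mirroring Redis GET and SET ... EX."""

    def __init__(self):
        self._data = {}  # key -> (expires_at, value)

    def get(self, key):
        entry = self._data.get(key)
        if entry is None:
            return None
        expires_at, value = entry
        if time.monotonic() >= expires_at:
            del self._data[key]  # lazy expiry, like Redis checking on access
            return None
        return value

    def set(self, key, value, ex):
        self._data[key] = (time.monotonic() + ex, value)


def cached_json(cache, key: str, ttl: int, query: Callable[[], dict]) -> dict:
    """Return the cached payload for `key`, falling back to `query()` on a miss."""
    hit = cache.get(key)
    if hit is not None:
        return json.loads(hit)  # cache hit: Postgres is never touched
    payload = query()
    cache.set(key, json.dumps(payload), ex=ttl)
    return payload
```

In the real service the `TTLCache` would be a redis-py client (`redis.Redis(...)` supports the same `get`/`set(..., ex=...)` calls) and `query` the asyncpg/SQLAlchemy read; with a 1 h TTL, at most one database round-trip per endpoint per hour reaches Postgres regardless of page traffic.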