Case study · Production

Enterprise AI governance platform

An AI-adoption governance system spanning submission, review, tool approval, and policy enforcement across a global workforce — built so adoption tracking and compliance review run against the same surface.

The problem

A global organization's employees were experimenting with AI tools faster than the policy team could track them. Some experimentation was harmless. Some involved customer data going into consumer chat interfaces. The existing process — an email chain to a shared mailbox — produced exactly zero usable records and no signal about what was actually happening.

The brief was to build a governance surface that didn't get in the way of adoption — fast enough that people would use it, structured enough that the governance council could actually review what was submitted.

Architecture

Users → Azure Static Web Apps → Azure Functions → Cosmos DB
                                      ↳ Azure Communication Services (notify)
                                      ↳ Azure Key Vault (secrets)

RBAC tiers:   End Users  →  AI Champions  →  AI Council  →  System Admins
             (submit)      (review)         (approve)        (operate)

The architecture is deliberately boring. Static frontend, serverless backend, document store. The interesting decisions are upstream of the stack — in the role model, the workflow shape, and where the platform meets the existing Microsoft 365 surface.

Four-tier RBAC, by Entra ID security group

The role model maps directly to existing organizational reality: end users submit, AI champions review, the AI council approves, and system admins operate.

Membership is owned by Entra ID security groups, not the application database. Adding someone to the AI council is a security-group change, not a database edit. This is the first thing IT asks for and the last thing most platforms get right.
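Group-driven roles can be resolved with a small pure function over the token's `groups` claim. This is a sketch, not the platform's actual code: the group IDs are placeholders, and the real mapping would come from configuration, not literals.

```typescript
// Sketch: resolving a user's platform role from Entra ID group membership.
// Group IDs below are hypothetical placeholders; real IDs come from the tenant.
type Role = "endUser" | "champion" | "council" | "admin";

// Ordered highest tier first, so the first match wins when a user
// belongs to more than one of the security groups.
const GROUP_ROLES: ReadonlyArray<[groupId: string, role: Role]> = [
  ["00000000-0000-0000-0000-0000000000a4", "admin"],
  ["00000000-0000-0000-0000-0000000000a3", "council"],
  ["00000000-0000-0000-0000-0000000000a2", "champion"],
];

function roleFor(groups: string[]): Role {
  for (const [groupId, role] of GROUP_ROLES) {
    if (groups.includes(groupId)) return role; // highest matching tier
  }
  return "endUser"; // everyone in the tenant can at least submit
}
```

The application never writes membership: promoting someone to the council is an Entra ID group change, and the next token they present already carries the new role.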

Embedded where users already work

The platform ships as a native Microsoft Teams app and a SharePoint web part. Users don't have to learn a new surface; they file submissions from where they were already working. The same React app boots inside Teams' iframe with MSAL acquiring a Teams-context token, and inside SharePoint via SPFx.

This is the difference between a tool people grudgingly use and a tool people actually use. The compliance surface meets users at their desk, not at a separate URL with a separate login.

Audit-first data model

Every state transition on a submission produces an event: who, when, what changed, and why. The submission itself is a projection over the event stream. Council review reads the events; reports read the events; nothing reconstructs history from a current-state snapshot.
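The projection idea can be shown as a pure fold over the event log. The event names and fields here are illustrative, not the platform's real schema; the point is that current state is derived, never stored as the source of truth.

```typescript
// Sketch of the audit-first model: current state is a fold over events.
// Event shapes are assumptions for illustration, not the actual schema.
type SubmissionEvent =
  | { type: "submitted"; by: string; at: string; tool: string }
  | { type: "statusChanged"; by: string; at: string; to: string; reason: string };

interface SubmissionView {
  tool: string;
  status: string;
  lastActor: string;
  history: SubmissionEvent[]; // the full who/when/what/why trail
}

function project(events: SubmissionEvent[]): SubmissionView {
  return events.reduce<SubmissionView>(
    (view, ev) => {
      switch (ev.type) {
        case "submitted":
          return { ...view, tool: ev.tool, status: "pending", lastActor: ev.by, history: [...view.history, ev] };
        case "statusChanged":
          return { ...view, status: ev.to, lastActor: ev.by, history: [...view.history, ev] };
        default:
          return view;
      }
    },
    { tool: "", status: "none", lastActor: "", history: [] }
  );
}
```

Council review, reporting, and the submission detail page all call the same fold, so there is no second code path that could drift from the audit trail.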

Decisions and trade-offs

Cosmos DB over relational

Submissions are document-shaped: a top-level record plus an event log plus a comments thread. The query patterns are mostly per-submission and per-user. A document store fits the shape. The cost model also fits — bursty usage with quiet periods.
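The document shape might look like the sketch below — one record carrying its own event log and comment thread, so the dominant per-submission read is a single point read. Field names and the partition-key choice are assumptions, not the platform's real schema.

```typescript
// Illustrative Cosmos DB document shape for a submission. All field names
// are assumptions for this sketch, not the platform's actual schema.
interface SubmissionDoc {
  id: string;      // Cosmos item id
  userId: string;  // partition key candidate: keeps per-user queries single-partition
  tool: string;
  status: string;
  events: Array<{ type: string; by: string; at: string }>;
  comments: Array<{ by: string; at: string; text: string }>;
}

const example: SubmissionDoc = {
  id: "sub-001",
  userId: "alice@contoso.example",
  tool: "SummarizerBot",
  status: "pending",
  events: [{ type: "submitted", by: "alice@contoso.example", at: "2024-05-01T09:00:00Z" }],
  comments: [],
};
```

Partitioning by user means "my submissions" is a cheap single-partition query, while council-wide reporting fans out — a reasonable trade when submission volume is modest and bursty.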

Bicep + per-branch preview environments

Every PR spins up its own static web app slot for design review. Reviewers click a link, see the change in the actual UI, comment on the PR. The cycle time on UX feedback dropped from days to hours.
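A trimmed workflow for this pattern might look like the fragment below. Azure Static Web Apps creates a staging environment automatically when its deploy action runs on a `pull_request` trigger; the job name, paths, and secret name are placeholders.

```yaml
# Sketch: per-PR preview environments via Azure Static Web Apps.
# Paths and secret names are placeholders, not this project's actual config.
name: preview
on:
  pull_request:
    types: [opened, synchronize, reopened]
jobs:
  deploy-preview:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: Azure/static-web-apps-deploy@v1
        with:
          azure_static_web_apps_api_token: ${{ secrets.AZURE_STATIC_WEB_APPS_API_TOKEN }}
          action: upload
          app_location: "/"        # adjust to the app root
          output_location: "dist"  # Vite build output
```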

No secrets in source

Every secret is a Key Vault reference resolved at runtime via managed identity. No app settings panel with API keys. No .env file in a repo. The first audit question — "where do secrets live?" — has a one-sentence answer.
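The app-settings side of this can be sketched with Azure's Key Vault reference syntax. The vault and secret names are placeholders; the Function App's managed identity needs read access to the vault for the reference to resolve.

```json
{
  "DB_CONNECTION": "@Microsoft.KeyVault(SecretUri=https://example-vault.vault.azure.net/secrets/db-connection/)"
}
```

The application code reads `DB_CONNECTION` like any other setting; the platform resolves the reference at runtime, so the secret value never appears in source, pipeline logs, or the portal's settings blade.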

Stack

React 18 · TypeScript · Vite · Tailwind CSS · React Query · MSAL.js · Azure Functions · Azure Static Web Apps · Cosmos DB · Entra ID · Bicep IaC · GitHub Actions · Playwright

What I'd do again

What I'd do differently
