Resources for enterprise AI buyers.
Practical tools, checklists, and guidance for teams scoping production AI work — written by operators who've shipped these systems and lived with them.
AI Implementation Readiness Checklist.
Twenty-five questions across five pillars. If you can confidently answer most of them before kickoff, your project is ready to be scoped. If you can't, that's where scoping should start.
The 25-question readiness checklist
1. Workflow & ROI (5)
- Have you named the single workflow this AI system will change — not a category, a workflow?
- Is there one metric that will tell you the system is working (e.g. time-to-resolve, deflection rate, hours saved)?
- Do you know the current baseline of that metric, measured today, not estimated?
- Have you sized the value (annual hours, dollars, or revenue) of moving that metric by a realistic amount?
- Is there a named executive sponsor who owns the metric, not just the AI initiative?
2. Data & Inputs (5)
- Does the data the model needs exist today, in a system you can access programmatically?
- Is the data permissioned correctly — for users, for the AI service, and for any sub-processors?
- Do you have a representative sample (50–500 examples) you could hand a vendor on day one?
- Is there a "ground truth" anywhere (resolved tickets, signed contracts, reviewed documents) that can power evals?
- How fresh does data need to be in production — minutes, hours, or daily-batch?
3. Evals & Quality (5)
- Have you defined what "good enough" looks like, in a way an engineer could write a test for?
- Who will judge outputs during the build phase — a domain expert, an ops lead, or end users?
- Will the system fail gracefully (escalate, ask, or refuse) when it isn't confident?
- Have you decided what kinds of errors are unacceptable (e.g. wrong dollar amounts, wrong customer)?
- Will you track quality continuously after launch, or only at sprint review?
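The "good enough an engineer could write a test for" question above can be made concrete. This is a minimal, illustrative sketch only — the labels, the `classify` stand-in, and the 90% threshold are hypothetical placeholders, not a prescribed eval framework:

```python
# Minimal eval sketch: score outputs against a small ground-truth set
# and gate on an accuracy threshold. All names here are illustrative;
# classify() is a stand-in for your actual model call.

GROUND_TRUTH = [
    {"input": "refund request, order #1042", "expected_label": "refund"},
    {"input": "where is my package?",        "expected_label": "shipping"},
    {"input": "cancel my subscription",      "expected_label": "cancellation"},
]

def classify(text: str) -> str:
    """Placeholder for the model call; replace with your provider's API."""
    for keyword in ("refund", "cancel", "package", "shipping"):
        if keyword in text:
            return {"cancel": "cancellation", "package": "shipping"}.get(keyword, keyword)
    return "other"

def run_eval(cases, threshold=0.9):
    """Return (accuracy, passed) so CI can fail the build below threshold."""
    correct = sum(classify(c["input"]) == c["expected_label"] for c in cases)
    accuracy = correct / len(cases)
    return accuracy, accuracy >= threshold
```

The point is the shape, not the code: a fixed dataset, a single pass/fail number, and a threshold that can block a release — the same structure scales to hundreds of examples and model-graded judges.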
4. Governance & Risk (5)
- Do you know which compliance frameworks (SOC 2, HIPAA, GDPR, sector-specific) the system must satisfy?
- Have you decided whether customer or regulated data can be sent to a third-party model provider?
- Is there an audit-trail requirement — every prompt, response, retrieval, and decision logged?
- Has security review been pre-briefed, with a path to approval, or is it a post-build surprise?
- Do you have a documented escalation and incident protocol for AI-driven errors?
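The audit-trail question above is cheap to answer early. A minimal sketch, assuming an append-only JSON Lines file is acceptable for your compliance posture (field names and the file path are illustrative, not a standard):

```python
import json
import time
import uuid

def log_interaction(path, *, prompt, response, retrieved_ids, decision):
    """Append one audit record per model interaction (JSON Lines).

    Captures the four things an auditor typically asks for:
    what went in, what came out, what was retrieved, and what was decided.
    """
    record = {
        "id": str(uuid.uuid4()),      # stable handle for incident review
        "ts": time.time(),            # when it happened
        "prompt": prompt,
        "response": response,
        "retrieved_ids": retrieved_ids,
        "decision": decision,         # e.g. "auto-resolved" or "escalated"
    }
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
    return record["id"]
```

In production you would write to a tamper-evident store rather than a local file, but deciding the record shape up front is what keeps security review from becoming a post-build surprise.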
5. Change Management & Ops (5)
- Will end users be trained, paid, or measured differently because of this system?
- Is there an internal champion close to the workflow who will adopt and defend the system early?
- Is there a rollback plan if quality drops or behaviour drifts?
- Who owns the system in production — engineering, ops, or a shared on-call?
- Have you budgeted for ongoing tuning and evals, not just the build?
Most teams pass on three pillars and have real gaps in two. That's normal. The work in scoping is to either close those gaps fast or design the engagement around them.
Coming soon.
We're publishing additional resources as engagements turn into reusable templates and patterns.
Enterprise AI Scoping Guide
How we run a 30-minute scoping conversation: the questions we ask, the outputs we produce, and the decisions you should be ready to make.
AI Governance Starter Pack
Policy template, vendor review template, eval-gating template, and an executive briefing format. Pulled from real governance engagements.
Eval Patterns for Production AI
Patterns we use for ground-truth datasets, judge prompts, regression suites, and shadow-mode rollouts.
Production AI Architecture Reference
Reference architecture for retrieval, agent orchestration, observability, and rollback in regulated enterprise environments.
We can run the checklist with you in 30 minutes.
Bring a workflow you'd like to AI-enable. We'll walk the 25 questions live and tell you whether it's a sprint, a retainer, a governance engagement, or not yet ready.