A rigorous assessment.
A clear score.
An actionable plan.
The assessment isn't a survey. It's a structured engagement that combines document review, stakeholder interviews, and practitioner analysis to produce a scored, evidence-backed picture of your organization's AI readiness.
Five phases. Scoped to your organization. One complete picture.
The engagement is designed to move with urgency without cutting corners. From initial discovery to final findings, the pace is shaped by your organization's complexity and stakeholder availability, not an arbitrary clock.
Discovery & Scoping
An initial conversation to understand your organization, your current AI state, and what you're hoping to achieve. We confirm scope, identify key stakeholders, and schedule the engagement. If there's not a strong fit, I'll tell you directly.
- Understand your organizational context and current AI posture
- Identify the right stakeholders for interviews
- Agree on scope, timeline, and what "success" looks like
Document Review
Before any interviews, I review existing documentation: architecture diagrams, data governance policies, org charts, current AI initiatives, vendor contracts, and any prior AI audit or strategy work. This grounds the interviews in evidence rather than impressions.
- Technical architecture and infrastructure documentation
- Data governance and quality policies
- Existing AI/ML initiatives, tools, and vendor relationships
- Organizational structure and relevant team compositions
Stakeholder Interviews
Structured 45–60 minute interviews with key stakeholders across technical and non-technical functions. I ask the same core questions across roles to triangulate honest answers: what leadership says about AI literacy often differs sharply from what the team experiences.
- CTO / CIO / Head of Engineering: technical foundation and infrastructure readiness
- Head of Data / Data Engineering: data maturity and pipeline reliability
- Lead Engineers / Senior ICs: ground-level capability and tooling reality
- Business Unit Leaders: AI adoption, use case clarity, and change readiness
- CISO / Legal / Compliance: governance, risk, and regulatory posture
Scoring & Analysis
Each of the eight dimensions is scored 0–100 based on the evidence collected, not on self-reported answers alone. I weight qualitative and quantitative signals to produce a composite score that reflects organizational reality rather than optimistic estimates.
- Per-dimension scoring with documented evidence for each rating
- Gap identification: what's missing and how critical it is
- Blocker analysis: what will stop progress if left unaddressed
- Effort-to-impact assessment for each identified gap
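The composite score described above can be pictured as a weighted average across the eight dimensions. The sketch below is illustrative only: the dimension names come from this framework, but the weights and example scores are invented, and the real weighting blends qualitative and quantitative signals rather than a single number per dimension.

```python
# Illustrative sketch of a weighted composite readiness score.
# Dimension names mirror the framework; weights and scores are invented.

def composite_score(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average of per-dimension scores (each 0-100)."""
    total_weight = sum(weights[d] for d in scores)
    return sum(scores[d] * weights[d] for d in scores) / total_weight

scores = {
    "Data Infrastructure": 62,
    "Technology & Cloud": 70,
    "Engineering Capability": 55,
    "AI Literacy": 40,
    "Strategy & Sponsorship": 48,
    "Use Case Clarity": 35,
    "Governance & Risk": 30,
    "Culture & Change": 50,
}
weights = {d: 1.0 for d in scores}  # equal weights here; real weighting varies

print(round(composite_score(scores, weights), 1))  # prints 48.8
```

A composite like this is only as meaningful as the per-dimension evidence behind it, which is why each rating is documented rather than asserted.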
Findings, Guidance & Strategy Session
Findings are delivered after the analysis phase: scored, documented, and structured around what your organization needs to understand and act on. The format is calibrated to your situation: some organizations need a detailed written report; others need a tighter executive presentation. We agree on the right shape during scoping, then walk through the findings together in a strategy session with your leadership team, focused on what to do next.
- AI Readiness Score with per-dimension breakdown and tier classification
- Written findings with evidence citations and gap analysis
- Strategic guidance on where to move first, sequenced by effort and impact
- Executive-ready summary of findings
- Leadership walkthrough session
- Follow-up access as you begin acting on findings
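"Sequenced by effort and impact" can be sketched as a simple prioritization: quick, consequential fixes come first. The gap names and scores below are invented for illustration; the real assessment grounds both values in evidence.

```python
# Illustrative sketch of effort-to-impact sequencing for identified gaps.
# Gap names and the 1-10 effort/impact scores are invented examples.

gaps = [
    {"gap": "No data catalog", "effort": 3, "impact": 8},
    {"gap": "No AI policy", "effort": 2, "impact": 7},
    {"gap": "No MLOps pipeline", "effort": 8, "impact": 9},
    {"gap": "Low AI literacy", "effort": 4, "impact": 6},
]

# Highest impact per unit of effort first: cheap, high-leverage gaps lead.
sequenced = sorted(gaps, key=lambda g: g["impact"] / g["effort"], reverse=True)

for g in sequenced:
    print(g["gap"])
```

Under this ordering, a low-effort policy gap outranks a high-impact but expensive infrastructure rebuild, which is the point of sequencing rather than simply ranking by impact.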
What gets measured, and why it matters.
The eight dimensions were developed from direct experience leading AI engineering teams, not from academic literature or vendor frameworks. They represent the categories where organizations consistently either succeed or fail.
Data Infrastructure
The foundation of everything. AI systems are only as reliable as the data that powers them. Organizations routinely overestimate their data readiness.
- Data accessibility and discoverability
- Data quality, freshness, and completeness
- Pipeline reliability and observability
- Data governance and cataloging maturity
Technology & Cloud
Your stack determines which AI patterns are feasible without major infrastructure investment. Many organizations have significant hidden rework ahead.
- Cloud platform maturity and strategy
- API-first architecture readiness
- MLOps and model deployment infrastructure
- Compute capacity and scaling capability
Engineering Capability
Building a proof of concept is not the same as productionizing AI. Most teams can prototype; far fewer can maintain, monitor, and improve models in production.
- ML/AI skills depth across the engineering team
- Ability to ship AI features reliably
- Model monitoring and maintenance capability
- Speed from experiment to production
AI Literacy
Non-technical staff who can't use AI tools confidently are a drag on transformation. Adoption is the last mile, and organizations consistently underinvest in it.
- AI tool usage across non-technical functions
- Prompt engineering and workflow integration skills
- Trust, confidence, and feedback quality
- AI training and onboarding programs
Strategy & Sponsorship
Without clear executive alignment and dedicated ownership, AI initiatives stall at the first meaningful obstacle, and every initiative meets one.
- Executive alignment and AI ownership clarity
- Dedicated AI budget and investment horizon
- Existence and quality of an AI strategy or roadmap
- Accountability structures for AI outcomes
Use Case Clarity
AI investments without prioritized, validated use cases produce sprawl, not results. Organizations need a framework for knowing where to start and how to evaluate return.
- Identified and validated AI use case pipeline
- ROI framework and measurement methodology
- Prioritization criteria and decision process
- Active pilots and their evaluation rigor
Governance & Risk
Organizations routinely deploy AI without policies in place. A single public incident around bias, data use, or model behavior can do lasting damage.
- AI policy and responsible use framework
- Bias detection and mitigation practices
- Data privacy and compliance controls
- AI security and adversarial risk awareness
Culture & Change
AI transformation requires people to change how they work. Organizations that underestimate the change management dimension consistently struggle to realize AI's potential, even after strong technical foundations are in place.
- Organizational appetite and trust for AI change
- Experimentation culture and failure tolerance
- Change management capability and track record
- Leadership communication on AI vision
What this is not.
There are a lot of AI assessments on the market. Most share the same limitations.
Practical details.
Timeline
The engagement moves with urgency, but the actual timeline depends on your organization's size, stakeholder availability, and the depth of documentation review required. We'll set a realistic schedule during the discovery call.
Your Team's Time
Each stakeholder needs 45–60 minutes for their interview. One person should be available to coordinate document sharing and scheduling. Total ask on your team: 5–8 hours across 5–6 people.
Format
Conducted entirely virtually. Interviews are one-on-one over video, document review is handled via secure file share, and the strategy session includes your full leadership team.
A complete picture of your AI readiness, designed to move fast.
The assessment is structured to deliver clarity quickly without sacrificing rigor. You'll have the findings and a clear direction before most organizations have finished debating whether to start.
Book an Assessment →