- 01. Define the scope: business units, geographies, and functions to be assessed
- 02. Identify and map key stakeholders: C-suite, IT, operations, data, legal, and compliance
- 03. Align on assessment goals: efficiency gains, risk reduction, competitive positioning, or full transformation
- 04. Set expectations on timeline, outputs, and how results will be used internally
- 05. Schedule discovery interviews across all key stakeholder groups

Deliverable: Scoping document + stakeholder map + interview schedule
Tip: Ask in the first meeting: "What would success look like 12 months after acting on this assessment?"
- 01. Audit data assets: quality, volume, labelling, and accessibility across the organisation
- 02. Review the current data architecture: cloud, on-premise, or hybrid
- 03. Assess data governance policies, ownership structures, and lineage tracking
- 04. Identify integration points: APIs, data pipelines, warehouses, and data lakes
- 05. Flag data privacy and regulatory constraints (GDPR, HIPAA, sector-specific rules)

Deliverable: Data maturity scorecard (1–5 scale across 6 dimensions)
Tip: Poor data quality is the #1 reason AI projects fail; surface it early to set realistic client expectations.
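The data maturity scorecard above can be sketched as a simple average of per-dimension ratings. A minimal sketch, assuming six illustrative dimensions (the source specifies six but does not name them) and equal weighting:

```python
# Minimal data maturity scorecard sketch. The six dimension names below are
# illustrative assumptions, not a fixed taxonomy from the assessment framework.
DIMENSIONS = ["quality", "volume", "labelling", "accessibility", "governance", "privacy"]

def maturity_score(ratings: dict[str, int]) -> float:
    """Equal-weight average of the 1-5 ratings across all six dimensions."""
    for dim in DIMENSIONS:
        if not 1 <= ratings[dim] <= 5:
            raise ValueError(f"{dim} rating must be between 1 and 5")
    return round(sum(ratings[d] for d in DIMENSIONS) / len(DIMENSIONS), 1)

# Example client: strong on volume and privacy, weak on labelling.
ratings = {"quality": 2, "volume": 4, "labelling": 1,
           "accessibility": 3, "governance": 2, "privacy": 4}
print(maturity_score(ratings))  # 2.7
```

In practice the dimensions and weights would be agreed with the client; the point is that a single composite number hides which dimension drags the score down, so report the per-dimension ratings alongside it.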
- 01. Survey employees on AI literacy and comfort levels across all departments
- 02. Map existing technical talent: data engineers, analysts, and ML engineers
- 03. Assess leadership's AI vision, appetite for change, and decision-making speed
- 04. Identify AI champions and potential resistors within the organisation
- 05. Review training programmes, L&D budget, and the hiring pipeline for AI roles

Deliverable: Skills gap analysis + culture readiness heat map
Tip: Use a 3-tier segmentation (AI-ready, AI-aware, and AI-unaware employees) to guide training priorities.
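The 3-tier segmentation can be driven directly off the literacy survey. A minimal sketch, assuming a 0–100 survey score and illustrative tier thresholds (the cut-offs are assumptions to be calibrated against the actual survey instrument):

```python
# Sketch of the 3-tier AI literacy segmentation. The 0-100 scale and the
# thresholds of 70 and 40 are illustrative assumptions, not fixed cut-offs.
def segment(score: int) -> str:
    """Map a literacy survey score to a training-priority tier."""
    if score >= 70:
        return "AI-ready"
    if score >= 40:
        return "AI-aware"
    return "AI-unaware"

scores = {"alice": 82, "bob": 55, "carol": 20}
tiers = {name: segment(s) for name, s in scores.items()}
print(tiers)  # {'alice': 'AI-ready', 'bob': 'AI-aware', 'carol': 'AI-unaware'}
```

Aggregating tier counts per department gives the raw data for the culture readiness heat map.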
- 01. Map high-volume, repetitive, or decision-heavy processes across the business
- 02. Run structured workshops to surface AI use case ideas from frontline staff
- 03. Score each use case on value potential vs. implementation feasibility
- 04. Prioritise a shortlist of 5–10 use cases for deeper analysis
- 05. Identify quick wins (3–6 months) vs. strategic bets (12–24 months)

Deliverable: Use case prioritisation matrix (2×2: value vs. effort)
Tip: Start with pain ("What takes too long or has too many errors?") before asking "Where can AI help?"
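The 2×2 scoring in steps 03–05 can be sketched as quadrant assignment. Assumptions in this sketch: 1–5 scales for value and effort, a midpoint threshold of 3, and the two lower quadrant labels ("fill-in", "deprioritise"), none of which the playbook fixes:

```python
# Sketch of the value-vs-effort prioritisation matrix (2x2). The 1-5 scales,
# the midpoint threshold, and the lower-quadrant labels are assumptions.
def quadrant(value: int, effort: int) -> str:
    """Place a use case in the 2x2 matrix from its value and effort scores."""
    high_value = value >= 3
    low_effort = effort <= 3
    if high_value and low_effort:
        return "quick win"        # high value, low effort: 3-6 month horizon
    if high_value:
        return "strategic bet"    # high value, high effort: 12-24 month horizon
    if low_effort:
        return "fill-in"
    return "deprioritise"

use_cases = [("invoice triage", 5, 2), ("demand forecasting", 4, 5),
             ("chatbot FAQ", 2, 2), ("custom LLM", 2, 5)]
for name, value, effort in use_cases:
    print(f"{name}: {quadrant(value, effort)}")
```

The use case names here are hypothetical examples; the real inputs come from the workshop scoring in step 03.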
- 01. Inventory the current technology stack and identify AI-adjacent tools already in use
- 02. Assess build vs. buy vs. partner options for each priority use case
- 03. Review existing vendor relationships and enterprise software AI roadmaps
- 04. Evaluate cloud provider AI services and pre-built model capabilities
- 05. Assess the security posture and deployment requirements for AI systems

Deliverable: Technology readiness matrix + recommended vendor shortlist
Tip: Many clients already have AI features in Salesforce, Microsoft 365, and SAP; starting there reduces risk and cost.
- 01. Consolidate findings into an overall AI readiness score across 5 domains
- 02. Benchmark against industry peers where sector data is available
- 03. Draft a phased AI roadmap: foundation → pilot → scale
- 04. Define success metrics and KPIs for each recommended initiative
- 05. Present findings and facilitate an executive prioritisation workshop
- 06. Produce the final report with investment estimates and a risk register

Deliverable: AI Readiness Report + executive presentation + 90-day action plan
Tip: Frame the roadmap in business outcomes, not technology milestones; boards fund revenue and risk reduction.
Each domain is scored 1–5 in the final readiness report, producing an overall composite score for the client.
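The composite score can be sketched the same way as the per-domain scorecards. Assumptions here: the five domains map to the assessment phases (stakeholders, data, people, use cases, technology) and carry equal weight; both would be tuned per engagement:

```python
# Sketch of the overall composite readiness score. The domain names and the
# equal weighting are illustrative assumptions, not fixed by the framework.
DOMAINS = ["stakeholders", "data", "people", "use_cases", "technology"]

def composite_score(domain_scores: dict[str, float]) -> float:
    """Equal-weight average of the five 1-5 domain scores."""
    return round(sum(domain_scores[d] for d in DOMAINS) / len(DOMAINS), 1)

# Hypothetical client: strong stakeholder alignment, weak data foundations.
scores = {"stakeholders": 4, "data": 2, "people": 3, "use_cases": 4, "technology": 3}
print(composite_score(scores))  # 3.2
```

As with the data scorecard, present the per-domain scores alongside the composite: a 3.2 built on a data score of 2 implies a very different roadmap than a uniform 3.2.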