Most UK organisations now “dabble” with AI, but few build everyday competence across roles. Government and industry have signalled a major push on workforce skills, with plans involving leading tech firms to train millions of workers. If you’re a director, operations lead or trustee, the question is no longer “should we train?” but “how do we upskill the whole organisation quickly, safely and measurably?” gov.uk
This article gives you a simple, non-technical 30–60–90 day plan to take your people from curiosity to competence. It includes roles, KPIs, costs, a skills map, procurement questions and practical risks to manage. Where helpful, we signpost credible UK resources and programmes you can plug in immediately.
What “good” looks like within 90 days
- Capability: at least 70% of target staff complete foundation AI training; 20% become “superusers” able to coach peers.
- Adoption: three live, measurable use cases per team (e.g. drafting, analysis, support), with saved hours evidenced in your KPIs.
- Quality: outputs pass basic acceptance tests (accuracy checks, sourcing, tone, brand, privacy), and are reviewed by humans before release.
- Guardrails: your light-touch policy and training cover safe data use and explainability; managers know when to escalate risks.
- Momentum: a rolling monthly “show-and-tell” keeps good practice moving across teams; your training budget and time allocation are clear.
Charities making steady progress with digital skills show similar patterns—clear priorities, visible leadership and structured peer learning—yet many still report gaps in AI skills at senior level. That’s why an organisation-wide plan matters. charitydigitalskills.co.uk
Who does what: lightweight roles you can assign this week
1) Executive Sponsor
- Sets the goal (“free 5 hours per person per month by January”); approves budget/time.
- Chairs the monthly review; removes blockers across departments.
2) AI Lead (0.2–0.4 FTE)
- Owns the plan, quality tests and dashboard; runs short clinics and show-and-tells.
- Coordinates with HR, DPO/legal and IT on policies and tool access.
3) Team Superusers (10–20% of staff)
- Model good practice, pair with colleagues and capture before/after examples.
- Escalate tricky edge cases (personal data, confidentiality) early.
4) HR and DPO/legal advisors
- Keep training and policies aligned to current UK guidance on fairness, transparency and security.
- Help managers explain AI-assisted decisions appropriately to staff and service users.
For non-technical teams, aligning training with clear explainability and data-protection basics keeps trust high while avoiding heavy legalese. The ICO’s practical guidance series remains a good compass for HR, DPOs and managers. ico.org.uk
The essential skills map (plain English)
- Task framing: turn vague goals into steps, success criteria and examples.
- Prompting for outcomes: specify audience, tone, format, and “what to ignore”.
- Fact checking and sourcing: cross-check claims and ask the model to show sources you can verify.
- Privacy and confidentiality: know what not to paste; prefer approved tools/workflows for anything sensitive.
- Data hygiene: use current templates, naming and storage so your outputs are reusable (and searchable).
- Quality review: use a short checklist before sharing; apply human judgement for anything public or high risk.
- Peer coaching: short “over-the-shoulder” sessions; capture repeatable tips in a shared playbook.
These skills are teachable to any role. Pair them with a few simple acceptance tests and your organisation will see fast, compounding gains in productivity and quality.
Your 30–60–90 day plan
Days 1–30: Start small, prove value, set guardrails
- Pick 2–3 teams with repetitive drafting or analysis (sales, bids, comms, service desk, fundraising).
- Kick-off workshop (90 minutes): demonstrate three tasks, set shared quality tests, agree KPIs and baselines.
- Safety and policy: ship a two-page, plain-English policy and a 30-minute microlearning module for all staff. If you need a head start, adapt our AI Policy Pack templates.
- Quality tests: use 8–10 simple checks (accuracy, source, tone, reading age, privacy, brand). See our practical evaluation guide: The 10 Tests that Predict AI Quality.
- Baseline time and errors on 3 priority tasks per team; capture “before/after” samples.
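The shared quality tests in the steps above can live in a spreadsheet, but teams who prefer something scriptable can express them as a tiny checklist function. A minimal sketch, assuming illustrative check names and thresholds (these are not a fixed standard; adapt them to your own policy):

```python
# Minimal acceptance-test checklist for an AI-assisted draft.
# The specific checks and thresholds are illustrative assumptions,
# not a standard -- replace them with your organisation's own tests.

def run_acceptance_tests(draft: str, sources_cited: bool, reviewed_by_human: bool) -> dict:
    """Return a pass/fail result per check for one draft."""
    results = {
        # Not a stub, not a wall of text (word-count bounds are an example)
        "length_reasonable": 50 <= len(draft.split()) <= 2000,
        # Factual claims link to sources a reviewer can verify
        "sources_cited": sources_cited,
        # A person signed off before anything was released
        "human_reviewed": reviewed_by_human,
        # No leftover placeholder text from templates
        "no_placeholder_text": "[TODO]" not in draft and "lorem ipsum" not in draft.lower(),
    }
    results["passed"] = all(results.values())
    return results
```

Recording the per-check results (not just pass/fail) gives you the “pass rate on quality tests” number the dashboard in Days 31–60 asks for.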
Days 31–60: Role-based training and peer coaching
- Train by role (frontline, managers, specialists). Keep modules under 60 minutes and anchored on real tasks.
- Superuser clinics every fortnight; rotate presenters; record 5-minute demos.
- Usage and quality dashboard: weekly view of active users, tasks completed, time saved, and pass rates on quality tests.
- Content teams can run a fast sprint to improve output quality and consistency—see The 14‑Day AI Content Sprint.
Days 61–90: Scale, certify and stabilise costs
- Internal “bronze” certificate for superusers; managers nominate the next cohort.
- Expand use cases to 1–2 higher-skill tasks per team; apply the same quality tests.
- Cost governance: set monthly token/seat budgets and a change control for new tools; review ROI with finance.
- Plan production handover for one or two use cases—our 12‑week launch plan outlines a safe route.
Training routes you can use tomorrow
- Help to Grow: Management is a government-backed 12‑week leadership programme with mentoring and practical modules—useful for embedding change, not just tools. Essentials content is free online; the full course is 90% subsidised with a £750 fee. gov.uk
- Skills for Careers offers free courses in priority areas (including digital) depending on eligibility—useful to support staff who need broader foundations alongside AI training. skillsforcareers.education.gov.uk
- Sector insights: the Charity Digital Skills Report tracks adoption and leadership gaps—use its findings to brief trustees and target training. charitydigitalskills.co.uk
Note: the DSIT AI Upskilling Fund pilot that match‑funded SME training closed to applications in August 2024, but new government–industry initiatives aim to reach a far wider slice of the workforce. gov.uk
What to train by role (and how much time to book)
| Role | Focus areas | Time in 90 days |
|---|---|---|
| Frontline staff | Task framing, safe prompting, fact‑checking, templates, hand‑offs | 3–4 hours self‑paced + three 45‑min clinics |
| Managers | Use‑case selection, acceptance tests, KPI tracking, change coaching | 2 hours workshops + three 30‑min reviews |
| Specialists (legal, finance, data) | High‑risk scenarios, sampling for errors, documentation, approvals | 2–3 hours targeted sessions |
| Superusers | Peer coaching, reusable playbooks, quality assurance, cost control | 6–8 hours including facilitation |
Optional: align managers and HR/DPOs to UK guidance on transparency, fairness and security to reduce rework later. ico.org.uk
KPIs that prove progress (use these verbatim)
- Adoption: % of target staff who complete training; % who use AI weekly.
- Productivity: median minutes saved per task; total hours saved per month; backlog reduction.
- Quality: pass rate on acceptance tests; rework rate; number of escalations.
- Risk: number of privacy escalations; % of outputs with sources; spot‑check error rate.
- Cost: spend per active user; spend per approved use case; forecast vs budget variance.
Capture a 4‑line “win story” each week: task, old time, new time, what changed. These make adoption contagious.
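Every KPI above is simple arithmetic over data you likely already track. A sketch with made-up sample figures (all numbers are illustrative, not benchmarks):

```python
# Compute the adoption, productivity and cost KPIs listed above
# from basic monthly tracking data. All figures are made-up examples.

target_staff = 40
trained = 30
weekly_users = 22
minutes_saved_per_task = [18, 25, 12, 30, 20]   # sampled before/after timings
tasks_per_month = 400
monthly_spend_gbp = 600.0

adoption_pct = 100 * trained / target_staff        # % who completed training
weekly_use_pct = 100 * weekly_users / target_staff # % using AI weekly
median_saved = sorted(minutes_saved_per_task)[len(minutes_saved_per_task) // 2]
hours_saved_per_month = median_saved * tasks_per_month / 60
spend_per_active_user = monthly_spend_gbp / weekly_users

print(f"Adoption: {adoption_pct:.0f}%, weekly use: {weekly_use_pct:.0f}%")
print(f"Median minutes saved per task: {median_saved}")
print(f"Hours saved per month: {hours_saved_per_month:.1f}")
print(f"Spend per active user: £{spend_per_active_user:.2f}")
```

With these sample numbers the sketch reports 75% adoption, 55% weekly use, a median of 20 minutes saved per task, and roughly 133 hours saved per month at about £27 per active user.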
Procurement questions for training providers (or internal L&D)
- Evidence of impact: can you show before/after metrics for similar UK SMEs/charities?
- Role‑based design: do you tailor modules to frontline, managers and specialists?
- Quality assurance: what acceptance tests do you teach, and how are results tracked?
- Data handling: how do you avoid exposing personal or confidential data in exercises?
- Train‑the‑trainer: can you equip our superusers to continue peer coaching?
- Compatibility: can content adapt to our approved tools and brand templates?
- Cost control: do you include guidance on budgeting seats/tokens and monitoring usage?
- Leadership support: do you offer a short manager track (coaching, KPIs, change)?
- Accessibility and inclusion: are materials inclusive and device‑agnostic?
- Credentials: can learners earn a certificate or CPD credit?
For leadership teams, consider blending practical AI training with management courses like Help to Grow to embed change across planning, finance and people. gov.uk
Costs and realistic budgets
| Item | Indicative range (ex VAT) | Notes |
|---|---|---|
| Foundation training (self‑paced + clinics) | £80–£250 per person | Use internal superusers to reduce costs over time. |
| Manager track + change coaching | £300–£700 per manager | Blended sessions; include KPI dashboards. |
| Train‑the‑trainer (superusers) | £1,500–£4,000 per cohort | Creates internal capability; pays back within one quarter. |
| Leadership programme (optional) | £750 per leader | Help to Grow: Management (90% subsidised, £750 fee). gov.uk |
| Tooling (AI seats/tokens) | £10–£40 per user/month | Start with a small, approved set; monitor usage weekly. |
Tip: combine leadership training with role‑based AI modules to accelerate behaviour change, not just awareness. Free or subsidised options exist for wider skills uplift beyond AI. skillsforcareers.education.gov.uk
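To sanity-check the ranges in the table, here is a quick 90-day budget estimate for a hypothetical 40-person organisation. The headcounts and the mid-point unit costs are assumptions for illustration only:

```python
# Rough 90-day training budget for a hypothetical 40-person organisation,
# using illustrative mid-points of the indicative ranges above (ex VAT).

staff = 40
managers = 6
superuser_cohorts = 1

foundation = staff * 165                      # £80-£250 pp -> ~£165 mid-point
manager_track = managers * 500                # £300-£700 per manager
train_the_trainer = superuser_cohorts * 2750  # £1,500-£4,000 per cohort
tooling = staff * 25 * 3                      # £10-£40 per user/month, 3 months

total = foundation + manager_track + train_the_trainer + tooling
print(f"Estimated 90-day budget: £{total:,}")
```

Under these assumptions the estimate lands around £15,000, which is a useful order-of-magnitude figure to take to finance before requesting quotes.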
Risks to manage (without derailing momentum)
- Data exposure: never paste personal or confidential data into unapproved tools; teach redaction and use approved environments.
- Over‑trusting outputs: require sources for factual claims; use a second pair of eyes for public or client‑facing content.
- Fairness and explainability: for AI‑assisted decisions that affect people, ensure clear explanations and meaningful human oversight; document your approach. ico.org.uk
- Shadow tools: publish a short list of approved tools and a simple request process for new ones; monitor spend per active user monthly.
- Digital inclusion: some staff or volunteers may need extra support (devices, connectivity, confidence) to benefit fully. theguardian.com
Governance, reporting and stories that move hearts and minds
At your monthly review, show three things: the numbers (adoption, time saved, quality), the risks (escalations and how you resolved them), and one story from a team that gained real hours back. These are what sustain momentum with boards, staff and funders. For charities, recent data shows adoption rising quickly—but leadership skills often lag, so bring trustees into the journey early. charitydigitalskills.co.uk
If you’d like to deepen the evaluation side alongside training, borrow ideas from our 5‑Day AI Evaluation Sprint to keep quality and costs predictable from day one.
Useful links (save for your next team meeting)
- Help to Grow: Management (leadership and management course). gov.uk
- Help to Grow: Management Essentials (free online modules). helptogrow.campaign.gov.uk
- AI Upskilling Fund pilot (closed, background). gov.uk
- Charity Digital Skills Report 2025 (sector snapshot). charitydigitalskills.co.uk