Why you need an AI contract addendum now
AI pricing and terms are still moving targets. What looks like a bargain in a demo can become a budget risk once usage ramps, a model is deprecated, or export fees appear at exit. The UK Competition and Markets Authority (CMA) has warned about risks to choice, switching and fair dealing in foundation model markets. For buyers, that translates into being precise about price changes, portability and support if the vendor pivots or partners change. An addendum lets you lock these into plain‑English, buyer‑friendly clauses without rewriting a supplier’s standard terms. gov.uk
This playbook gives you 18 contract clauses, the business reason behind each, and the proof points you can request. It’s written for non‑technical leaders and fits neatly alongside the Two‑Week AI Vendor Bake‑Off and the Vendor Due Diligence Pack you may already be using.
How to use this addendum
- Run a 30‑minute internal triage: What are you buying (API, SaaS app, fine‑tuning, services)? Who will use it? What data goes in/out? What does “good” look like in 90 days?
- Attach this addendum to the order form, not the main MSA, so it clearly applies only to this AI service.
- Prioritise the clauses in the order below: price certainty, data rights, service levels, exit/switching.
- Ask for 2–3 objective proofs: a price sheet with unit definitions, a certification link, and a one‑page exit plan outline.
Part 1 — Pricing and usage: 8 clauses that cap spend and remove surprises
1) Clear units and rounding
Define the billable unit (for example: “per 1,000 input tokens”, “per model call”, “per active named user per month”) and disallow rounding up beyond the next billable unit. Require the price sheet as an appendix with worked examples at your expected volumes.
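The rounding rule is worth pinning down with a worked example. The sketch below uses an illustrative rate of £0.003 per 1,000 input tokens (not any vendor's actual price) and shows usage rounding up to the next billable unit, and no further:

```python
from math import ceil

def billable_cost(input_tokens: int, rate_per_1k: float) -> float:
    """Round usage up to the next 1,000-token unit, and no further."""
    units = ceil(input_tokens / 1_000)
    return round(units * rate_per_1k, 4)

# 1,001 tokens bills as 2 units of 1,000, not as a larger rounded figure:
billable_cost(1_001, 0.003)    # 0.006
billable_cost(250_000, 0.003)  # 0.75
```

Asking the vendor to confirm two such worked examples at your forecast volumes turns an abstract unit definition into a checkable number.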
2) Annual uplift cap
Cap annual price increases (for example CPI (UK) + 2% with a ceiling of 6%) and ban mid‑term unilateral changes to unit definitions. Any new model family or feature can be added by mutual agreement, not auto‑opt‑in.
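As a sanity check on the cap, the arithmetic is just "CPI plus margin, subject to a ceiling". A minimal sketch, with figures matching the example in the clause:

```python
def capped_uplift(cpi_percent: float, margin: float = 2.0, ceiling: float = 6.0) -> float:
    """Annual price uplift: CPI (UK) plus the agreed margin, never above the ceiling."""
    return min(cpi_percent + margin, ceiling)

capped_uplift(3.0)  # 5.0 (CPI + 2% applies)
capped_uplift(7.5)  # 6.0 (the ceiling bites in a high-inflation year)
```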
3) Volume tiers with floors
Agree transparent volume tiers, with a floor so discounts don’t fall away if usage dips in a slow quarter. Lock the tier for the first 12 months once crossed.
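The floor mechanism reduces to a simple lookup: the discount is based on the higher of actual usage and the locked tier, so a slow quarter does not wipe out the discount. Tier thresholds and percentages below are illustrative:

```python
def tier_discount(monthly_units: int, locked_floor: int,
                  tiers: list[tuple[int, float]]) -> float:
    """Return the discount for the higher of actual usage and the locked floor.

    `tiers` is a list of (threshold, discount) pairs sorted by ascending threshold.
    """
    effective = max(monthly_units, locked_floor)
    discount = 0.0
    for threshold, pct in tiers:
        if effective >= threshold:
            discount = pct
    return discount

tiers = [(0, 0.0), (100_000, 0.05), (500_000, 0.10)]
# Usage dips to 80k, but the 500k tier was crossed and locked:
tier_discount(80_000, locked_floor=500_000, tiers=tiers)  # 0.10
```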
4) Throughput and concurrency SLA
For APIs, set a minimum sustained throughput (for example tokens per second or requests per minute) and concurrency limits that match busy periods. If breached, trigger service credits or burst capacity at the discounted tier.
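A credit trigger for this clause reduces to one comparison. The sketch below assumes a flat 5% monthly credit on breach; your negotiated percentage and measurement window will differ:

```python
def sla_credit(achieved_tps: float, committed_tps: float,
               monthly_fee: float, credit_pct: float = 0.05) -> float:
    """Service credit due when sustained throughput falls below the committed floor."""
    return monthly_fee * credit_pct if achieved_tps < committed_tps else 0.0

sla_credit(achieved_tps=40.0, committed_tps=50.0, monthly_fee=1_000.0)  # 50.0
sla_credit(achieved_tps=60.0, committed_tps=50.0, monthly_fee=1_000.0)  # 0.0
```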
5) Safety and content filtering costs
Specify whether moderation/safety calls are included or billed separately. If separate, cap the ratio (for example no more than one moderation call per model call unless an incident is active).
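To make the ratio cap auditable from your invoice, the check is a single inequality. A sketch, with the 1:1 default mirroring the example in the clause:

```python
def moderation_within_cap(model_calls: int, moderation_calls: int,
                          max_ratio: float = 1.0, incident_active: bool = False) -> bool:
    """True if billed moderation calls stay within the agreed per-call ratio."""
    if incident_active:
        return True  # cap is suspended during an active incident, per the clause
    return moderation_calls <= model_calls * max_ratio

moderation_within_cap(100, 100)  # True
moderation_within_cap(100, 150)  # False: query the invoice
```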
6) Caching, embeddings and storage
List all storage‑based charges (vector databases, fine‑tune checkpoints, logs) and set retention defaults (for example 30 days for logs) unless you consent to longer. This aligns with secure‑by‑design guidance to minimise unnecessary retention. cisa.gov
7) Model changes and deprecation notice
Require 180 days’ notice for breaking changes or model retirements, with side‑by‑side access to the replacement model during migration, and no price uplift during that period. This protects choice and reduces lock‑in. gov.uk
8) Exit data export and egress costs
Fix a cost‑free export window at termination (for example 30 days), including schema, prompts, system instructions, fine‑tune artefacts and vector data, plus reasonable vendor assistance. Government buying frameworks routinely include exit management schedules; borrow that language even in private contracts. crowncommercial.gov.uk
Part 2 — Data rights and security: 5 clauses that protect what matters
9) No training on your data, by default
State clearly: the supplier must not use your inputs or outputs for model training or product improvement without explicit, written opt‑in. Include a warranty that this setting is technically enforced across subprocessors. This aligns with secure deployment guidance on controlling data flows. cisa.gov
10) Data residency and subprocessor transparency
Pin data residency to the UK or EEA where required, and attach the current subprocessor list with 30 days’ notice for changes. Require a right to object if a change materially increases risk or cost.
11) Retention and deletion SLA
Define log and dataset retention (for example 30 days operational logs, 0 copies in developer sandboxes), with a deletion certificate within 14 days of request and at exit. This follows the “minimise and control” theme in secure AI operations. cisa.gov
12) Baseline assurance (Cyber Essentials, ISO/IEC 42001)
Ask suppliers processing sensitive data to hold Cyber Essentials (or Plus) and show a plan for ISO/IEC 42001 (the AI management system standard) or equivalent alignment. Cyber Essentials is the UK’s baseline in many public procurements, and ISO/IEC 42001 demonstrates structured AI governance. gov.uk
13) Risk management framework alignment
Ask for a short mapping to the NIST AI Risk Management Framework functions (Govern, Map, Measure, Manage). This gives your board comfort that risks are tracked in a recognised structure without drowning in paperwork. nist.gov
Part 3 — Service levels, support and incident response: 3 clauses that keep service dependable
14) Uptime and accuracy service levels
For platforms, seek 99.9% monthly uptime with clear exclusions. For critical tasks (for example contract drafting assist), include an “operable quality” commitment such as maximum failed response rate or timeout rate, with credits if breached.
15) Support responsiveness
Set support SLAs by severity (for example P1 within 1 hour, P2 within 4 hours, business hours for P3). Tie P1/P2 breaches to additional service credits.
16) Safety incidents
Define an “AI safety incident” (for example, model produces prohibited content without filters or ignores guardrails). Require 24‑hour notification, a mitigations plan, and a temporary rollback to the last safe model. This mirrors secure‑by‑design guidance on detect and respond. cisa.gov
Part 4 — Switching and exit: 2 clauses that guarantee an orderly handover
17) Exit management plan from day one
Attach a one‑page exit plan now. Minimums: named exit owner, inventory of artefacts to hand over (prompts, templates, vector indexes, fine‑tune weights where applicable), export format, and 10 days of reasonable assistance. This borrows best practice from government call‑off schedules. crowncommercial.gov.uk
18) Interoperability and open standards
Require widely used, open formats for exports (for example JSON/CSV/Parquet for data; documented embeddings shape; prompt templates in plain text/Markdown). Reference the UK’s Open Standards principles to reduce future lock‑in. gov.uk
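A one‑page export manifest makes this clause testable before exit day. The field names below are illustrative, not a standard; the point is that every artefact has an open format and a named file:

```json
{
  "export_version": "1.0",
  "data": { "format": "parquet", "files": ["conversations.parquet", "usage_logs.csv"] },
  "embeddings": { "format": "parquet", "dimensions": 1536, "model_id": "supplier-embed-v1" },
  "prompts": { "format": "markdown", "files": ["system_prompt.md", "templates/"] },
  "fine_tunes": { "format": "safetensors", "files": ["adapter_v3.safetensors"] }
}
```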
The 10‑minute buyer’s checklist (print and take to the call)
- Show me the price sheet with units and two worked examples at our forecast volumes.
- Confirm your maximum annual price uplift and when it can apply.
- What are the concurrency and throughput limits? What are typical values for customers like us?
- Are moderation/safety calls included? If separate, what’s the expected ratio?
- List all storage‑based fees (embeddings, logs, fine‑tunes). Default retention?
- How much notice do you give for model changes/deprecations? What support do you provide?
- Provide your subprocessor list and data residency options.
- Confirm “no training on our data” is enforced by default across subprocessors.
- Provide current Cyber Essentials (or Plus) and any AI governance assurance (for example ISO/IEC 42001 roadmap). gov.uk
- Share a one‑page exit plan and export formats. crowncommercial.gov.uk
KPIs to track post‑signature
- Cost per completed task (baseline vs month 2 vs month 4).
- Throughput at peak (requests per minute or tokens per second achieved vs SLA).
- Deflection or time‑saved (for example % of email replies drafted by AI, minutes saved per document).
- Safety and quality (failed response rate, moderation false positives, human edits required).
- Support responsiveness (median time to first response for P1/P2).
- Switching readiness (export tested, exit artefact inventory maintained quarterly).
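The first KPI above is simple enough to write into the contract schedule: total AI spend divided by the tasks the AI actually completed in the period. A sketch, with illustrative figures:

```python
def cost_per_task(monthly_spend: float, tasks_completed: int) -> float:
    """Cost per completed task; infinite when nothing completed (a red flag in itself)."""
    if tasks_completed == 0:
        return float("inf")
    return round(monthly_spend / tasks_completed, 2)

cost_per_task(1_200.0, 400)  # 3.0; compare baseline vs month 2 vs month 4
```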
If you need a cost lens to attach to these, pair this article with our 90‑Day AI Cost Guardrail.
What “good” looks like from a supplier
Strong suppliers increasingly map their practices to recognised standards and guidance. You’re looking for:
- A short mapping to the NIST AI RMF functions (Govern, Map, Measure, Manage), showing how risks are identified and mitigated in their service. nist.gov
- Evidence of secure‑by‑design deployment: clear boundaries, least‑privilege access, tested incident response for AI‑specific threats (prompt injection, data poisoning). cisa.gov
- Open standards for interoperability and exports, to support future switching. gov.uk
- Transparency on partnerships and model changes, respecting the CMA’s emphasis on choice and switching. gov.uk
Public sector lessons SMEs can borrow
Even if you’re not buying through a framework, it’s sensible to copy two ideas from government procurement:
- Exit management is a standard schedule. Ask for it up front, with roles, artefacts and timeboxed assistance. crowncommercial.gov.uk
- Baseline cyber assurance is proportionate, not blanket. Cyber Essentials (or equivalent) is often requested where suppliers process personal or sensitive data; it’s a reasonable bar for many SME AI workflows. gov.uk
If you are in a regulated environment or working with the public sector, the UK’s AI procurement guidance and the WEF “AI Procurement in a Box” materials offer practical prompts on avoiding black‑box systems and lock‑in. gov.uk
For interoperability, the government’s Open Standards principles are a useful reference to set expectations with vendors on formats and APIs. gov.uk
Red flags worth walking away from
- Unilateral rights to change pricing or unit definitions with immediate effect.
- No commitment on model deprecation notice or migration support.
- Refusal to state “no training on your data” by default.
- Storage‑based fees without clear retention limits.
- No exit plan, or exports only in proprietary formats.
- Zero transparency on subprocessors or data residency options.
If you see more than two of the above, consider a portable stack approach and revisit the shortlist.
Putting it into practice this month
- Shortlist two vendors and run a structured demo week using the vendor bake‑off checklist.
- Send this addendum with your use‑case brief and KPIs. Ask vendors to redline.
- Score price sheets, assurance proofs (Cyber Essentials link, AI governance statement), and exit plan drafts.
- Run a 30‑minute risk review against secure deployment guidance, focusing on data flows and retention. cisa.gov
- Decide, sign a 60‑day proof‑of‑value with the addendum attached, and review after eight weeks.