UK AI Regulation: The FCA, ICO, and the Principles-Based Approach After Brexit

· 5 min read
David Sanker
Lawyer, Legal Knowledge Engineer & UAPK Inventor

The UK made a deliberate choice not to copy the EU AI Act. After Brexit, the government opted for a cross-regulator, sector-specific, principles-based approach to AI regulation — lighter-touch by design, aiming to position the UK as a pro-innovation AI jurisdiction.

In practice, "lighter-touch" doesn't mean "ungoverned." It means the rules live inside sector regulators — the FCA, ICO, PRA, CMA — rather than in a single prescriptive statute. For AI teams building products for the UK market, understanding this distributed regulatory structure is essential.

The Cross-Regulator Framework

In 2023, the UK government published a pro-innovation AI regulation framework built around five core principles that regulators are expected to apply to AI in their sectors:

  1. Safety, security and robustness
  2. Appropriate transparency and explainability
  3. Fairness
  4. Accountability and governance
  5. Contestability and redress

These principles don't have legal force on their own — they're implemented through each regulator's existing powers. The FCA applies them to AI in financial services. The ICO applies them to AI processing personal data. The CMA applies them to AI in competition contexts.

The FCA's AI Position

The FCA has been more active on AI than most realize. Key publications:

AI and Machine Learning (CP22/24, DP5/22): The FCA has consulted on AI use in financial services and clarified that existing rules — the Consumer Duty, SYSC, MAR — apply to AI systems.

Consumer Duty: Effective from July 2023, the Consumer Duty requires firms to deliver good outcomes for retail customers. For AI: if an AI system produces poor outcomes (unfair pricing, discriminatory decisions, unsuitable recommendations), the firm is liable under the Duty regardless of whether an algorithm produced the output.

Model Risk Management (SS1/23): The PRA's supervisory statement on Model Risk Management applies to AI and ML models used in financial decisions. It requires model validation, ongoing monitoring, and documentation of model governance.

Senior Managers and Certification Regime (SMCR): Named individuals are personally accountable for AI systems within their remit. If an AI trading system causes harm, the relevant senior manager (for example, the Head of Algorithmic Trading) is the accountable person. This is more direct personal liability than the EU AI Act creates.

For FCA-regulated firms deploying AI agents: the Consumer Duty + Model Risk Management + SMCR creates a governance requirement even without specific AI legislation.
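The shape of that governance requirement can be sketched in code. This is an illustrative sketch, not an FCA-mandated implementation: the owner mapping, class, and function names are all hypothetical, standing in for the idea that every AI action type needs a named SMCR owner and human sign-off before execution.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical SMCR mapping: each AI action type has a named
# accountable individual (roles here are examples from the article).
ACCOUNTABLE_OWNERS = {
    "trade:execute": "Head of Algorithmic Trading",
    "decision:advisory": "Head of Retail Advice",
}

@dataclass
class AIAction:
    action_type: str
    approved_by: Optional[str] = None  # name of the human who signed off

def may_execute(action: AIAction) -> bool:
    """An action runs only if it has a known SMCR owner and a human approval."""
    return action.action_type in ACCOUNTABLE_OWNERS and action.approved_by is not None
```

The point of the sketch is that the gate is structural: an action with no mapped owner or no recorded approver simply cannot run, which is the evidence posture the Consumer Duty and SMCR reward.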

The ICO's AI Guidance

The ICO (Information Commissioner's Office) has published substantial AI guidance under UK GDPR (the retained EU GDPR post-Brexit):

Explaining AI decisions: Guidance on automated decision-making rights under UK GDPR Article 22, clarifying that "solely automated" decisions include situations where humans play only a nominal role. The ICO's interpretation is that a rubber-stamp human doesn't break the "solely automated" chain.
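The ICO's "rubber-stamp" point can be made concrete. A hedged sketch, assuming (as the guidance suggests) that a human review only counts if the reviewer has real authority to override and actually engages with the inputs; the function and parameter names are illustrative:

```python
def is_solely_automated(human_reviewed: bool,
                        human_can_override: bool,
                        human_considered_inputs: bool) -> bool:
    """Illustrative Article 22 test: a review that lacks authority or
    genuine engagement is a rubber stamp and does NOT break the
    'solely automated' chain."""
    meaningful_review = (human_reviewed
                         and human_can_override
                         and human_considered_inputs)
    return not meaningful_review
```

Under this reading, a workflow where a human clicks "confirm" on every AI output without the power (or time) to change it remains solely automated for Article 22 purposes.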

AI Auditing Framework: The ICO has issued a framework for auditing AI systems for data protection compliance, covering: purpose limitation, data minimisation, accuracy, storage limitation, security, and accountability.
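A simple way to operationalise that coverage list is a gap check against recorded evidence. A minimal sketch, assuming a flat evidence record keyed by the framework's areas (the list contents come from the article; the function is hypothetical):

```python
# Coverage areas from the ICO's AI auditing framework, as listed above.
ICO_AUDIT_AREAS = [
    "purpose_limitation",
    "data_minimisation",
    "accuracy",
    "storage_limitation",
    "security",
    "accountability",
]

def audit_gaps(evidence: dict) -> list:
    """Return the framework areas with no supporting evidence recorded."""
    return [area for area in ICO_AUDIT_AREAS if not evidence.get(area, False)]
```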

Biometric Data: The ICO has taken an aggressive stance on biometric AI — facial recognition in public spaces, emotion recognition, behavioral biometrics. Multiple enforcement actions in 2023-2024.

UK AI Bill and Future Legislation

The AI Bill is expected in the 2025-2026 parliamentary session. Current indications:

  • Principles-based, not prescriptive
  • Liability framework for AI developers and deployers
  • Possibly mandatory incident reporting for high-risk AI
  • Potentially a "responsible AI" certification scheme

Unlike the EU AI Act, UK legislation is unlikely to include mandatory conformity assessments or a prohibited AI practices list. The emphasis will remain on outcomes rather than process.

Post-Brexit Divergence in Practice

For firms operating in both the EU and UK:

| Topic | EU AI Act | UK Approach |
| --- | --- | --- |
| High-risk AI | Mandatory conformity assessment, CE marking | Sector regulator oversight (no mandatory assessment) |
| Prohibited AI | Explicit list (real-time biometrics, social scoring) | No statutory list; FCA/ICO discretion |
| Transparency | Mandatory disclosure requirements | Consumer Duty, outcomes-based |
| Personal liability | Company fines; national enforcement | SMCR: named individual accountability |
| General-purpose AI | GPAI code of practice | No equivalent (yet) |

The practical divergence: EU compliance requires documenting what you did (conformity assessment, technical documentation, CE marking). UK compliance requires demonstrating what outcomes you produced (Consumer Duty evidence, FCA supervision response, ICO audit cooperation).

The UAPK UK Configuration

For a UK financial services AI agent:

{
  "constraints": {
    "require_human_approval": ["trade:execute", "decision:advisory", "payment:execute"],
    "audit_retention_days": 2555
  },
  "policy": {
    "jurisdiction_allowlist": ["UK", "IE"],
    "require_capability_token": true,
    "approval_thresholds": {
      "action_types": ["trade:execute"],
      "amount": 10000,
      "currency": "GBP"
    }
  }
}

require_human_approval on advisory and execution actions satisfies both the Consumer Duty (humans accountable for outcomes) and SMCR (a named individual responsible for what the system does). audit_retention_days: 2555 works out to seven years of records, comfortably covering typical FCA record-keeping periods.

The audit log provides the Consumer Duty evidence: for any retail customer interaction, you can demonstrate what the AI assessed, what it decided, and whether a human reviewed it.
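One plausible shape for such an audit-log entry, sketched under the assumption that "what the AI assessed, what it decided, and whether a human reviewed it" maps to three fields per interaction (the field names and function are illustrative, not a UAPK schema):

```python
import json
from datetime import datetime, timezone

def log_entry(inputs: dict, decision: str, human_reviewer) -> str:
    """Serialise one retail-customer interaction as Consumer Duty evidence:
    what the AI assessed, what it decided, and who (if anyone) reviewed it."""
    return json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "inputs_assessed": inputs,
        "decision": decision,
        "human_reviewed": human_reviewer is not None,
        "reviewer": human_reviewer,
    })
```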

jurisdiction_allowlist: ["UK", "IE"] includes Ireland because many UK financial firms use Dublin entities for EU market access post-Brexit. Where the regulatory requirements differ significantly, separate manifests or jurisdiction-specific agents are the cleaner option.
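To show how the approval_thresholds block might be enforced at runtime, here is a hedged sketch. It assumes the semantics are "a matching action type above the amount limit, in the stated currency, requires human approval"; the function name and enforcement logic are illustrative, not part of the UAPK specification.

```python
# The approval_thresholds values mirror the manifest above.
POLICY = {
    "approval_thresholds": {
        "action_types": ["trade:execute"],
        "amount": 10000,
        "currency": "GBP",
    }
}

def needs_approval(action_type: str, amount: float, currency: str) -> bool:
    """True when the action matches a thresholded type and exceeds the limit."""
    t = POLICY["approval_thresholds"]
    return (action_type in t["action_types"]
            and currency == t["currency"]
            and amount > t["amount"])
```

A £25,000 trade execution would be held for sign-off; a £5,000 one, or an advisory action of any size, would pass straight through this particular gate (advisory actions are instead caught by require_human_approval).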

UK GDPR vs. EU GDPR

UK GDPR (retained post-Brexit) is currently nearly identical to EU GDPR. The practical differences are small:

  • ICO enforcement rather than EU DPA enforcement
  • UK International Data Transfer Agreements (IDTAs) instead of EU SCCs for transfers out of the UK
  • Potentially diverging amendments as UK updates its data protection law

For AI agents: build for EU GDPR standard and you're compliant with UK GDPR. The divergence risk is future legislation, not current requirements.