UAE Central Bank issues AI guidelines, reshaping how banks and advisors deploy algorithms in retail and wealth services.
- The CBUAE published a comprehensive AI guidance note on 22 February 2026, covering all licensed financial institutions under its supervision.
- Five core principles govern AI use: governance and accountability, fairness, transparency and explainability, human oversight, and data management.
- Customers must be informed when AI drives key decisions and given access to human review or opt-out options where appropriate.
- Boards and senior management are directly accountable for AI systems and outcomes, including model selection, deployment and monitoring.
- Third-party AI vendors must meet the same governance, explainability and consumer-protection standards as regulated institutions themselves.
- The guidance sits alongside SCA robo-advisory regulations, creating a layered compliance framework for AI-driven investment services.
A New AI Governance Framework for UAE Financial Services
The Central Bank of the UAE (CBUAE) has issued a comprehensive consumer protection framework governing artificial intelligence and machine learning use across the financial sector. Published on 22 February 2026, the guidance note establishes clear expectations for all institutions under CBUAE supervision - banks, insurers, exchange houses and finance companies alike. It addresses the full range of AI applications in finance, from credit scoring and fraud detection to algorithmic pricing and customer-engagement tools.
The guidance forms part of a broader push towards responsible AI deployment across UAE financial markets. It sits alongside existing rules from the Securities and Commodities Authority (SCA) on robo-advisory services, creating a layered regulatory environment for AI-driven advisory and investment tools. Together, these frameworks signal that UAE regulators expect both innovation and clear accountability from licensed institutions deploying AI at scale.
Scope and Who Is Affected
The guidance applies to every institution within the CBUAE's supervisory ecosystem. That includes commercial and investment banks, finance companies, exchange houses, insurance firms and other regulated entities. It covers AI and machine learning use cases across both retail and wholesale financial services, with explicit reference to suitability assessments, credit underwriting, product pricing, fraud and anti-money laundering monitoring, robo-advisory tools and other algorithmic decisioning systems.
While framed as principles-based guidance rather than detailed technical rules, the CBUAE makes clear that institutions are expected to align their internal frameworks accordingly. Compliance will be assessed as part of ongoing supervisory engagement, including routine examinations and thematic reviews. The move is described in reporting by Gulf News and Khaleej Times as reflecting the Central Bank's "proactive supervisory approach" to digital transformation risks.
Five Core Principles
The guidance sets out five principles that must underpin any AI or machine learning deployment. The first is governance and accountability: boards of directors and senior management are explicitly made responsible for AI systems and outcomes. Institutions must maintain detailed inventories of all AI models in use, covering their purpose, data inputs, key assumptions, performance metrics and known limitations. Regular reporting to senior leadership on model performance and compliance issues is required, and staff must be trained on AI-related risks.
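As a rough illustration of the inventory requirement, a model register entry could capture the attributes the guidance names. This is a minimal sketch only: the CBUAE lists the attributes (purpose, data inputs, assumptions, performance metrics, limitations) but prescribes no particular schema, and every field value below is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ModelInventoryEntry:
    """One record in an institution's AI model inventory.

    Illustrative sketch: the guidance names these attributes but
    does not mandate any specific format or schema.
    """
    model_id: str
    purpose: str                            # e.g. "retail credit scoring"
    data_inputs: list[str]                  # data sources feeding the model
    key_assumptions: list[str]              # documented modelling assumptions
    performance_metrics: dict[str, float]   # latest validation metrics
    known_limitations: list[str]            # documented weaknesses
    owner: str = "unassigned"               # accountable business owner

# Hypothetical example entry
entry = ModelInventoryEntry(
    model_id="CS-2026-001",
    purpose="retail credit scoring",
    data_inputs=["bureau data", "transaction history"],
    key_assumptions=["applicant pool resembles 2023-2025 training data"],
    performance_metrics={"auc": 0.81, "gini": 0.62},
    known_limitations=["limited data for thin-file applicants"],
    owner="Head of Retail Credit Risk",
)
```

In practice such a register would feed the regular reporting to senior leadership that the guidance requires.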
The second principle is fairness and non-discrimination. AI systems must not produce discriminatory or manipulative outcomes, particularly in lending decisions, insurance underwriting, pricing and access to advisory services. Institutions must test models regularly to identify and correct biases - including those arising from unrepresentative training data. The CBUAE explicitly warns against using AI for pressure-selling or targeting customers with unsuitable financial products.
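The guidance requires regular bias testing but does not mandate a specific metric. One common, simple measure an institution might apply is the demographic parity gap between customer groups; the sketch below, with made-up decision data and an assumed internal escalation threshold, shows the idea.

```python
def approval_rate(decisions: list[int]) -> float:
    """Share of positive (1) decisions in a group."""
    return sum(decisions) / len(decisions)

def demographic_parity_gap(group_a: list[int], group_b: list[int]) -> float:
    """Absolute difference in approval rates between two customer groups.

    A large gap flags a potential fairness issue for human review.
    """
    return abs(approval_rate(group_a) - approval_rate(group_b))

# Hypothetical loan decisions (1 = approved, 0 = declined)
group_a = [1, 1, 0, 1, 1, 0, 1, 1]   # approval rate 0.75
group_b = [1, 0, 0, 1, 0, 0, 1, 0]   # approval rate 0.375

gap = demographic_parity_gap(group_a, group_b)
# The 0.10 threshold is an assumed internal policy, not a CBUAE figure
flagged = gap > 0.10
```

A flagged gap would not by itself prove discrimination, but it is the kind of signal that should trigger investigation of the model and its training data.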
The remaining three principles address transparency and explainability, effective human oversight, and data management and privacy. On transparency, institutions must disclose to customers when AI is driving key decisions - such as credit approvals, claim outcomes or portfolio allocations - and provide understandable explanations of how those decisions are reached. On oversight, the Central Bank stresses that "human in the loop" mechanisms must be designed into processes where AI outputs influence material decisions, and management must be able to challenge AI systems rather than treating them as black boxes. On data, institutions must comply with UAE personal data protection laws, maintain robust data-governance controls, and apply cybersecurity measures commensurate with the sensitivity of the data and services involved.
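One way to design a "human in the loop" mechanism is a routing gate that sends material AI outputs to a human reviewer rather than applying them automatically. The routing rules and threshold below are illustrative assumptions, not requirements taken from the guidance.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    customer_id: str
    outcome: str       # e.g. "approve" or "decline"
    amount_aed: float  # exposure affected by the decision

def route(decision: Decision, materiality_threshold_aed: float = 50_000) -> str:
    """Decide whether an AI output is auto-applied or sent for human review.

    Assumed internal policy: adverse outcomes and high-value approvals
    always go to a human; only low-value approvals are auto-applied.
    """
    if decision.outcome == "decline":
        return "human_review"   # adverse outcomes always reviewed
    if decision.amount_aed >= materiality_threshold_aed:
        return "human_review"   # material exposures reviewed
    return "auto_apply"
```

The point of such a gate is that management retains the ability to challenge and override the system, rather than treating its outputs as final.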
Consumer Rights Under the New Guidance
Consumer protection is the explicit centrepiece of the guidance. Customers must be told when AI drives a key decision and must be able to request a human review. Institutions are also expected to offer alternative arrangements for customers who prefer not to be subject to AI-based decisioning. Reporting by Zawya notes that the CBUAE discourages practices where AI could exploit behavioural biases - for example, hyper-personalised product targeting aimed at financially vulnerable customers.
CBUAE Governor Khaled Mohamed Balama stated the guidance "aims to establish a clear framework for the responsible use of artificial intelligence and machine learning in the financial sector, in a way that enhances consumer protection, reinforces governance and transparency principles, and emphasises the importance of human oversight and data protection requirements." Regional analysis from PwC Middle East, cited in Zawya, links the guidance to a wider industry shift from AI experimentation to full-scale integration - where governance and accountability structures have become critical.
Outsourcing and Third-Party AI Vendors
Institutions remain fully responsible for AI functions they outsource to third parties - a point the CBUAE emphasises clearly. Thorough due diligence is required on AI vendors, covering their data-protection standards, model governance, security controls and ability to support explainability and audit requirements. Contracts must include safeguards for data protection, confidentiality, cybersecurity, regulatory access and the right to terminate or suspend services where risk thresholds are breached.
Ongoing monitoring of vendor performance and model changes is also required. Crucially, the guidance states that institutions must be able to suspend the use of any AI system immediately if material risks or failures emerge, including where a third-party system is involved. This will directly affect technology vendors and fintechs pitching AI-driven tools into the UAE market, as licensed institutions will need to hold them to the same standards.
Robo-Advisory Services and the SCA Framework
The CBUAE guidance intersects directly with robo-advisory regulations issued by the Securities and Commodities Authority (SCA) in 2025. Those SCA rules allow licensed portfolio management firms - including banks and digital wealth managers - to provide AI-powered automated investment recommendations, subject to strict governance, independent IT audits, cybersecurity controls and transparent fee and risk disclosures.
As noted in a February 2026 analysis published by UAE Advisor Guide, the two frameworks together create a layered compliance obligation for firms operating robo-advisory platforms. Such firms must show that their algorithms produce suitable investment portfolios and respect client risk profiles, as the SCA requires. They must also demonstrate that those algorithms are explainable, monitored for bias and subject to strong data-governance controls, in line with CBUAE expectations where the Central Bank is the supervising authority.
Supervisory Implications for Institutions
Although principles-based, the guidance signals closer scrutiny of AI tools during CBUAE examinations. Institutions should prepare to demonstrate how their AI governance frameworks operate in practice - including decision-making processes, escalation routes and board oversight. Testing, validation and monitoring methodologies for AI models will also be assessed, as will consumer-facing disclosures and records of human reviews.
The framework is expected to shape internal audit plans, risk-management practices and procurement standards for AI solutions across the sector. Latham & Watkins, in their overview of AI regulation in the UAE, note that financial institutions should align internal governance frameworks with emerging norms on transparency, accountability and data protection. The Central Bank's February 2026 note is described by Middle East AI News as "forward-looking" - setting expectations proactively rather than reacting to AI-related incidents after the fact.
What Clients Are Asking Their Advisors
What does the CBUAE AI guidance require banks to tell customers about automated decisions?
Banks and other licensed institutions must disclose when AI is driving key decisions such as credit approvals, pricing or portfolio recommendations. Customers have the right to request a human review of those decisions and, in certain circumstances, to opt out of AI-driven processes entirely.
How does the CBUAE AI guidance affect robo-advisory platforms in the UAE?
Robo-advisory platforms regulated by the SCA already face strict governance, cybersecurity and disclosure requirements under 2025 rules. The CBUAE guidance adds a further layer for firms under its supervision, requiring that AI-driven investment tools are also explainable, tested for bias, and subject to robust data-governance and human oversight controls.
Are UAE banks responsible for AI tools supplied by third-party vendors?
Yes. The CBUAE makes clear that licensed institutions remain fully responsible for outsourced AI functions and must conduct thorough due diligence on vendors. Contracts must include data-protection safeguards, audit rights and the ability to suspend or terminate services if compliance issues arise.
What is the difference between the CBUAE AI guidance and formal regulations?
The guidance note sets out principles rather than detailed binding technical rules, but it carries real supervisory weight. Institutions under CBUAE oversight are expected to align their internal frameworks with these principles and will be assessed on them during routine examinations and thematic reviews.
Further Reading
CBUAE Guidance Note - Responsible Use of AI in the Financial Sector (Official PDF)
The National - UAE Central Bank issues new guidelines on the use of AI in financial sector
UAE Advisor Guide - UAE Approves Robo-Advisory Investment Platforms
All content for information only. Not endorsement or recommendation.