Category: Data Protection

  • AI Accountability: How UAE Data Laws Are Being Updated to Control the Next Wave of Artificial Intelligence

    Recent legislative developments in the United Arab Emirates, specifically in Dubai and in financial free zones such as the Dubai International Financial Centre (DIFC), show a clear drive to integrate AI governance into existing data protection frameworks rather than create entirely separate laws. This approach reflects a deliberate strategy: encourage technological innovation while preserving individual trust and protecting individuals’ rights.

    1. The Legal Framework as the Data Gateway Guard
      Federal Decree-Law No. 45 of 2021 on the Protection of Personal Data in the UAE, and the updated DIFC Data Protection Law No. 5 of 2020 (amended in July 2025), serve as the first line of defense against the misuse of AI technologies.
    • Convergence with AI: AI systems depend heavily on personal data (such as health or behavioural data). The requirements for explicit consent, individuals’ rights to rectify their data, and the rights to request cessation of processing and erasure (the “right to be forgotten”) therefore apply directly to companies that use this data to train AI models or make automated decisions.
    • Tightening Accountability: The 2025 DIFC amendments introduced a private right of action for data subjects harmed by data breaches, significantly increasing the litigation risk for companies using AI irresponsibly. This mandates that organizations conduct more rigorous Data Protection Impact Assessments (DPIAs), particularly when dealing with high-risk AI systems.
    2. Proactive Governance to Encourage “Ethical AI”
      The UAE and Dubai are moving beyond the concept of merely penalizing harm after the fact, towards establishing a proactive, guiding framework centered on ethics and transparency.
    • The Charter for the Development and Use of Artificial Intelligence: This Charter (issued in June 2024) sets key principles that require AI developers and users to adhere to transparency, accountability, and fairness. These principles, also emphasized by the Dubai AI Principles, compel companies to explain how AI reaches its decisions (explainability), a critical point in the field of data protection.
    • Synthetic Data: Digital Dubai launched a framework on the use of synthetic data, meaning data generated by algorithms to mimic real data without including any personal identifiers. This approach allows AI companies to train large models without violating individual privacy, striking a practical balance between innovation and privacy.
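    To make the idea concrete, here is a deliberately simplified sketch of what synthetic data means in principle: new records are drawn from the statistical shape of real data, so no actual individual's record is ever shared. This is an illustration only, not Digital Dubai's framework; the function name and sample values are invented, and production systems use far more sophisticated generative models.

```python
import random
import statistics

def synthesise(real_ages, n, seed=0):
    """Generate n synthetic ages that mimic the mean and spread of the
    real data without carrying over any actual record."""
    rng = random.Random(seed)
    mu = statistics.mean(real_ages)
    sigma = statistics.stdev(real_ages)
    # Each synthetic value is drawn from a distribution fitted to the
    # real data, so it resembles the population, not any one person.
    return [round(rng.gauss(mu, sigma)) for _ in range(n)]

real = [34, 41, 29, 52, 38, 45]   # hypothetical survey ages
fake = synthesise(real, 1000)     # 1000 synthetic ages, no real record reused
```

    A model trained on `fake` learns roughly the same age distribution as one trained on `real`, but no synthetic record can be traced back to an individual.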
    3. Legislative Evolution through AI Tools Themselves
      The UAE government is adopting AI at the core of the legislative process, accelerating its ability to issue laws that keep pace with technological advancement.
    • The Smart Legislative System: The Cabinet launched the first integrated smart legislative system for developing laws, which uses AI to analyze international and local legislation and propose amendments. This step reduces the “time gap” between innovation (such as the emergence of generative AI models) and the issuance of necessary governance legislation, ensuring that data protection rules are always current and relevant to the new challenges posed by AI.
      Conclusion:
      The amendments to the data protection rules in the UAE and Dubai serve as an executive roadmap for AI governance. Instead of focusing on general AI legislation, existing data protection mechanisms are being used to enforce transparency and accountability on AI models, while adopting innovative solutions like synthetic data to enable safe innovation. This balance between enablement and protection boosts the confidence of international businesses and the public in the nation’s digital environment.
  • Personal Data is Relative: CJEU Confirms GDPR Definition Depends on the Recipient’s Perspective

    The Court of Justice of the European Union (CJEU) has delivered a significant judgment in the case European Data Protection Supervisor (EDPS) v Single Resolution Board (SRB) (C-413/23 P), fundamentally clarifying the scope of the term “personal data” under EU law. The ruling confirms that the definition is relative and context-specific, depending on the recipient’s ability to re-identify the data subject, offering both clarity and complex new compliance requirements for organisations sharing pseudonymised data.


    Background: The Conflict Over Pseudonymized Data
    The dispute stemmed from the resolution of the Spanish bank, Banco Popular Español, by the Single Resolution Board (SRB), an EU agency. The SRB collected numerous comments and personal opinions from affected former shareholders and creditors. The SRB then engaged the consulting firm Deloitte to evaluate over a thousand of these submissions.
    To protect privacy, the SRB pseudonymised the comments by replacing direct identifiers with a unique alphanumeric code. Critically, the SRB retained the re-identification key, while Deloitte did not have access to it.
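    The mechanism at issue can be sketched in a few lines. This is a hypothetical illustration (the names, codes, and function below are invented, not drawn from the case): direct identifiers are swapped for random alphanumeric codes, and the code-to-identity key remains with the controller, never with the recipient.

```python
import secrets

def pseudonymise(comments):
    """Replace each author's name with a random alphanumeric code.

    Returns the pseudonymised records (what a recipient like a consultant
    would see) and the re-identification key (retained by the controller).
    """
    key = {}      # code -> original identity, kept only by the controller
    shared = []   # records passed to the third-party recipient
    for author, text in comments:
        code = secrets.token_hex(4)   # e.g. 'a3f19c02'
        key[code] = author
        shared.append({"respondent": code, "comment": text})
    return shared, key

comments = [("Ana Ruiz", "The valuation was too low."),
            ("Luis Gil", "I object to the resolution.")]
shared, key = pseudonymise(comments)
```

    The recipient sees only codes and comment text; without `key`, re-identifying a respondent requires means it simply does not possess, which is the factual premise behind the SRB's argument.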
    The conflict arose when individuals complained to the European Data Protection Supervisor (EDPS), arguing that the SRB had failed to inform them that their data would be transferred to a third party (Deloitte)—a breach of transparency obligations. The EDPS initially sided with the complainants, arguing the data was still personal because the SRB held the key. However, the case progressed based on the SRB’s core argument: since Deloitte could not reasonably re-identify the individuals, the data should be treated as anonymous in the consultant’s hands.
    The New Standard: Relativity Over Absolutism
    The CJEU ultimately rejected the “absolute” view that data remains personal data for everyone simply because one party (the original controller) holds a key.
    The Court established that pseudonymised data “must not be regarded as constituting, in all cases and for every person, personal data.”
    The key takeaway is the application of the “means reasonably likely to be used” test, assessed from the perspective of the recipient. If a third party receives pseudonymised data and lacks the legal or technical means, or if the effort required is disproportionate, to link the data back to an individual, the data may be considered anonymous in their hands. For that specific recipient, the full scope of the GDPR obligations would not apply.
    Dual Responsibility: The Controller’s Absolute Burden
    While the recipient benefits from the “relative” test, the judgment simultaneously reinforces a strict, absolute obligation on the original data controller (the disclosing party).
    The CJEU confirmed that the SRB was in breach of its obligations, ruling that the controller’s duty to inform data subjects about the recipients or categories of recipients of their data (the right to be informed) must be assessed at the time of data collection and from the controller’s own perspective.
    Since the SRB retained the key, the data remained personal data for the SRB. Consequently, the SRB was obliged to inform the individuals of the data sharing, even if the data was considered anonymous in Deloitte’s hands. Pseudonymisation is thus a risk mitigation tool, not a loophole to circumvent transparency.
    Practical Implications for Businesses

    • Risk Assessment is Essential: Organisations engaging in data sharing must perform a robust, case-by-case assessment of the recipient’s capabilities to re-identify the data. This analysis determines the data’s status for the recipient.
    • Transparency First: Controllers cannot rely on the possibility of anonymity at the recipient level to skip transparency obligations. Privacy notices must clearly disclose all data recipients, even if the data shared is pseudonymised.
    • Personal Opinions Confirmed: The CJEU also confirmed that an individual’s personal opinions or views are inherently linked to their author and automatically qualify as personal data.
      In conclusion, the EDPS v SRB ruling provides both clarity and complexity. It empowers organisations to use pseudonymisation as a means of reducing regulatory overhead for recipients, but it places a clear, unwavering responsibility on the original data controller to maintain high standards of transparency and control over the identifying information.