Data Protection in 2025: Key Rulings and Strategic Takeaways for 2026

The year 2025 confirmed the growing momentum of an increasingly structured body of global case law in the field of personal data.
These decisions are reshaping corporate obligations and shedding light on the grey areas left by existing laws. 2025 also saw a marked rise in litigation involving artificial intelligence in the context of personal data.
In this issue, we offer a clear and practical overview of the most significant decisions from around the world, to help you anticipate risks and integrate these developments into your practices. An essential read to understand the latest trends in a constantly evolving regulatory environment.
The Lexing® network members provide a snapshot of the current state of play worldwide. The following countries have contributed to this issue: Australia, Brazil, Finland, Greece, Ivory Coast, Hong Kong, Mexico, New Zealand, Nigeria, Portugal, Romania, Singapore, South Africa, Sweden, USA (East).
FREDERIC FORSTER
Executive VP of Lexing® network and Head of Telecommunications and Digital Communications at Lexing
In 2025, we saw a noticeable shift in the development of data protection by the courts and the Information Regulator (South Africa’s data protection authority). Global platforms also tested the limits of the Protection of Personal Information Act 4 of 2013 (POPIA) in real disputes. These landmark cases continue to define what privacy means in a constitutional democracy, how organisations must treat personal information, and where the limits of processing lie.
In this insight, I focus on landmark cases that have recently influenced data protection in South Africa, showing how courts balance privacy, freedom of expression, business interests and regulatory compliance (1).
Botha v Smuts (2)
In this case, a farmer (Botha) challenged a social media activist (Smuts) who published photos of alleged animal trapping on his farm along with Botha’s name, farm name, business address and other identifying information. The dispute forced the courts to weigh the right to privacy against the right to freedom of expression.
The Constitutional Court, the apex court of South Africa, ultimately held that even when certain facts are publicly recorded, such as farm ownership and a business address, individuals may retain a reasonable expectation of privacy in how their personal or business information is used online. The court’s decision set a strong precedent for protecting personal information on social media, underlining that access to public records does not automatically strip away all privacy rights.
De Jager v Netcare (3)
Early this year, a patient (De Jager) challenged a private hospital (Netcare) for conducting covert surveillance to dispute his claim of disability. De Jager argued that the surveillance violated his constitutional right to privacy. The court applied the lawful processing conditions in POPIA and held that Netcare’s processing of his information was lawful.
The ruling signals that surveillance for data collection for civil litigation or defence is only allowed if it meets POPIA’s lawful processing conditions.
Digital Law Company v Meta (4)
In mid-2025, the Johannesburg High Court ordered global tech giant Meta Platforms (owner of WhatsApp and Instagram) to permanently disable multiple Instagram handles and WhatsApp channels distributing graphic sexual abuse material involving South African school children. In addition, Meta had to provide identifying data, such as IP addresses, names, and phone numbers, of those responsible.
The ruling is important for data protection law because it asserts that POPIA and South African constitutional rights bind international online platforms, especially regarding sensitive data involving children.
Information Regulator v Minister of Basic Education (5)
This case has placed government bodies under increased scrutiny regarding how they publish personal information. In 2025, the court heard the matter again, with the Information Regulator asking it to consider the long-standing practice of publishing final high school (matric) results with identifying details such as learners’ exam numbers and school details. The Information Regulator argues that this practice violates POPIA, particularly because it involves the personal data of minors. The Department of Basic Education defends the practice on the basis of public interest and the spirit of transparency.
The court has not yet delivered final judgment. The case has exposed the tension between administrative transparency and the right to privacy. Its outcome is likely to influence how government bodies handle large-scale data disclosures in the future, and it may force a fundamental shift in how examination results and other public education data are published in South Africa.
Looking ahead
South Africa’s data protection law is evolving rapidly. These landmark cases show that courts will actively protect personal information while balancing freedom of expression, public interest, and business needs. They also provide clear signals for businesses: compliance with POPIA is not optional, and privacy cannot be treated as an afterthought.
For global companies operating in South Africa, these cases illustrate that local courts take both corporate and individual privacy seriously, and that South Africa’s legal landscape increasingly mirrors the challenges of regulating privacy in a connected, digital world.
*****
(1) You can read all privacy and data protection judgment summaries on Lexing South Africa (Michalsons) website: https://www.michalsons.com/blog/tag/data-protection-judgments-and-cases
(2) Botha v Smuts and Another [2024] ZACC 22. Available at: https://www.saflii.org/za/cases/ZACC/2024/22.html
(3) De Jager v Netcare: surveillance evidence and POPIA, published on 28 February 2025. Available at: https://www.michalsons.com/blog/de-jager-v-netcare-surveillance-evidence-and-popia/77164
(4) Digital Law Company v Meta: Extraterritorial application of South African law, published on 25 August 2025. Available at: https://www.michalsons.com/blog/digital-law-company-v-meta-extraterritorial-application-of-south-african-law/79114
(5) Department of Basic Education (DBE) enforcement action consent, published on 8 October 2025. Available at: https://www.michalsons.com/blog/department-of-basic-education-dbe-enforcement-action-consent/76394
JOHN GILES
southafrica@lexing.network
The Federal Court of Australia has issued the first civil penalty under the Privacy Act (1) in Australian Information Commissioner v Australian Clinical Labs (No 2) [2025] FCA 1224 (2).
The landmark decision comes as a warning to APP entities (3) – it is more imperative than ever that they are proactive and cautious in their handling of personal information.
Background
In 2022, Australian Clinical Labs (ACL) suffered a ransomware attack on technology assets it had purchased from Medlab Pathology only three months prior. The attack resulted in the disclosure of the highly sensitive data of over 223,000 individuals.
Decision
The Federal Court held that ACL breached APP 11.1(b) (4) – that is, it failed to take reasonable steps in the circumstances to protect personal information from unauthorised access, modification, or disclosure.
The ‘circumstances’ of the APP entity ought to be considered when assessing whether it took reasonable steps. ACL’s large size and sophistication, and the large volume of highly sensitive personal information it held, meant that a high standard of protection was required to prevent unauthorised disclosure of personal information.
The Federal Court held that ACL’s contraventions of APP 11.1(b) were ‘extensive and significant’, particularly because ACL’s most senior management were involved in managing the integration of Medlab Pathology’s IT systems into ACL, and because of an overreliance on third party service providers to detect and respond to data breaches and cyber incidents.
ACL was also found to have failed to carry out a reasonable and expeditious assessment of the data breach, and to have failed to notify the OAIC (5) of the breach ‘as soon as practicable’.
Key Takeaways
This is the first civil penalty imposed under the Privacy Act ($5.8 million), indicating Australian Courts’ willingness to penalise APP entities for data breaches.
Further, this is also the first judgment which interprets APP 11.1(b) – the judgment sets out examples of ‘reasonable steps’ APP entities ought to take in the prevention of unauthorised disclosure of personal information.
ACL’s failure to identify the technical deficiencies in the IT assets purchased from Medlab Pathology reinforces the need for thorough due diligence when acquiring IT assets that hold personal information, and the importance of implementing strong and rapid cyber incident response infrastructure.
Where to next
The OAIC has commenced proceedings against Optus over the unauthorised disclosure of over 9.5 million Australians’ personal information (6). It alleges that Optus breached APP 11.1.
APP entities must be cautious, thorough, and careful in their handling of personal information – the OAIC is ready to pursue action for breaches of APP 11.1 and the Courts are willing to impose higher penalties.
Given the recent amendments to section 13G of the Privacy Act, which now provides for a maximum penalty of $50 million per contravention for bodies corporate, future penalties are likely to increase.
*****
(1) Privacy Act 1988 (Cth): https://www.legislation.gov.au/C2004A03712/latest/text
(2) See the full judgment: https://www.judgments.fedcourt.gov.au/judgments/Judgments/fca/single/2025/2025fca1224
(3) See section 6 of the Privacy Act for the definition of an APP entity.
(4) Chapter 11 of the Australian Privacy Principles: https://www.oaic.gov.au/privacy/australian-privacy-principles/australian-privacy-principles-guidelines/chapter-11-app-11-security-of-personal-information
(5) Office of the Australian Information Commissioner.
(6) For further detail, please see https://www.oaic.gov.au/news/media-centre/australian-information-commissioner-takes-civil-penalty-action-against-optus
DUDLEY KNELLER
&
LAURA DOWD
Introduction
In 2025, Brazil’s National Data Protection Agency (Agência Nacional de Proteção de Dados, ANPD) (1) confronted an issue that is likely to recur in digital identity and authentication products, namely, how to assess the validity of consent when the collection of sensitive biometric data is encouraged through financial compensation.
In Board Vote No. 11/2025/DIR-IM/CD (2), issued in Proceeding No. 00261.006742/2024-53, the ANPD maintained a preventive measure suspending the granting of financial compensation linked to the creation of a World ID through iris collection in Brazil, rejected the operational solutions presented as compliance, and found inadmissible a change of legal basis due to the absence of genuine circumstantial changes.
The case is relevant because it goes beyond a formal review of consent screens. It makes clear that, for high-risk processing, the regulator examines the reality of the incentive and how the product’s onboarding architecture affects the data subject’s freedom and level of information.
1) The case and the purpose of the preventive measure. The supervisory proceeding was initiated to assess the lawfulness of biometric processing within the World ID protocol. The preventive measure targeted a specific risk driver, namely, the connection between iris collection and an economic benefit. The rationale is typical in data protection enforcement. It seeks to prevent a high-risk arrangement from continuing to produce effects while the supervisory record is developed, especially because sensitive biometric data offers limited practical reversibility once captured and integrated into digital ecosystems.
2) Consent, when the incentive becomes the critical element. The LGPD requires consent to be free, informed, and unambiguous, and tied to a specific purpose. In the vote, ANPD treats these requirements as substantive criteria rather than interface formalities. Where financial incentives sit at the center of product design, they can become the decisive driver of participation. This shifts consent into a risk zone because the data subject’s decision may reflect immediate economic advantage rather than an effective understanding of the processing and its consequences.
The most useful regulatory inference is straightforward. If a model depends on remuneration to achieve large-scale engagement, remuneration becomes part of the legal problem. It is not a peripheral detail. This is particularly significant for sensitive biometrics because the data are linked to identification and authentication dynamics, which increases the likelihood of downstream uses and long-term effects on the individual.
3) Operational adjustments, when compliance is measured by outcomes. The regulated entity argued that it had implemented changes to dissociate biometric collection from compensation. ANPD rejected the argument using a practical criterion. The preventive measure required the suspension of the granting of compensation, not merely a reorganization of the app flow or added distance between steps.
The vote therefore points to an outcome-based supervisory approach. If, at the end of the user journey, the data subject can still access the economic advantage as a consequence of biometric verification, then the suspension has not occurred. This is particularly relevant for complex digital programs because it signals that compliance solutions preserving the same material outcome are likely to be deemed insufficient, even if technically sophisticated or carefully designed from a user-experience perspective.
4) Changing the legal basis during supervision, limits on strategic switching. In addition to operational changes, the regulated entity sought to move away from consent and rely on another ground under article 11 of the LGPD (3), especially the basis related to fraud prevention and the data subject’s security in identification and authentication in electronic systems. ANPD treated this change as exceptional. It required justification grounded in genuine circumstantial changes, rather than the need to accommodate a business model after a preventive measure was imposed.
The governance message is clear. The legal basis cannot operate as a tactical variable to reduce regulatory friction. If the core risk persists, meaning remuneration linked to sensitive data, switching the legal basis is likely to be understood as an attempt to shift the focus of supervision rather than a real remediation of non-compliance.
5) Effectiveness, daily fine, and dosimetry. Given the possibility of resumption and non-compliance, the vote imposed a daily fine of BRL 50,000, referencing dosimetry criteria (4) and the seriousness and high potential harm associated with sensitive data scenarios. The key point is effectiveness. For preventive measures, the sanction must be sufficiently dissuasive so that non-compliance does not become a mere operational cost.
Conclusion
Board Vote No. 11/2025 closes the matter, at the preventive stage, with a set of administrative conclusions that also operate as a practical guide. ANPD denied the request for relaxation and maintained the preventive measure in full, finding that the proposed solutions did not meet the determination because the economic counterperformance remained effectively tied to iris collection. It also held that changing the legal basis was inadmissible in the absence of genuine circumstantial changes.
From a broader compliance perspective, the decision points to an outcome-based standard. Accordingly, what matters is whether the risk-driving effect has ceased in practice. If compensation remains reachable because of biometric verification, the controller is unlikely to be seen as compliant, even if the user journey is redesigned.
The vote also reinforces a substantive reading of consent in sensitive biometric processing. Since remuneration may become the decisive driver of participation, it can place the freedom of consent under practical pressure. For this reason, incentive design becomes a core legal issue in identity and authentication contexts, where long-term implications for the data subject are foreseeable.
Finally, the decision treats legal-basis changes as a governance choice that must reflect real changes in the processing context, rather than a tactical adjustment. In line with this focus on effectiveness, ANPD paired the preventive measure with meaningful deterrence, including a daily fine, reflecting the low reversibility of harms associated with sensitive biometric data.
*****
(1) Brazil, Provisional Measure No. 1,317 of September 17, 2025, restructuring the National Data Protection Agency as a special autonomous federal entity; Brazilian Chamber of Deputies, Legislation Database, accessed at: https://www2.camara.leg.br/legin/fed/medpro/2025/medidaprovisoria-1317-17-setembro-2025-797987-publicacaooriginal-176485-pe.html
(2) Brazil, National Data Protection Agency (ANPD), Board of Directors, Board Vote No. 11/2025/DIR-IM/CD, Supervisory Proceeding No. 00261.006742/2024-53, signed March 24, 2025; published in ANPD’s 2025 deliberative circuit, accessed at: https://www.gov.br/anpd/pt-br/assuntos/deliberacoes-do-conselho-diretor-1/circuitos-deliberativos-ano-2025/cd-10-2025-votos.pdf
(3) Brazil, Law No. 13,709 of August 14, 2018, the General Personal Data Protection Law (LGPD), art. 11; Presidency of the Republic, accessed at: https://www.planalto.gov.br/ccivil_03/_ato2015-2018/2018/lei/l13709.htm
(4) Brazil, National Data Protection Agency (ANPD), Board Resolution No. 4 of February 24, 2023, Regulation on Dosimetry and Application of Administrative Sanctions, accessed at: https://www.gov.br/anpd/pt-br/assuntos/noticias/anpd-publica-regulamento-de-dosimetria/Resolucaon4CDANPD24.02.2023.pdf
FLAVIA M. MURAD SCHAAL
&
DEYSE ALCANTARA DE LIMA
Introduction
With Decision No. 2025-1200, the Ivorian Data Protection Authority (ARTCI) has sent a clear warning to public and local administrations. By sanctioning the City of Tiassalé following a public complaint, the regulator reaffirmed a crucial principle: legitimate security concerns are no excuse for bypassing data privacy laws.
At issue was a camera system installed by the City and operated for security purposes within the municipal area. ARTCI identified critical failures relating to the legal distinction between video protection and video surveillance, and to the strict rules applicable to data processing carried out for public security purposes.
Video protection vs. video surveillance: an essential but often misunderstood distinction
Confusion between video protection and video surveillance remains common in administrative practice. The ARTCI took care to restate the fundamental differences between these two systems.
Video protection refers to image-capture systems deployed on public roads or in places open to the public, primarily by public authorities, for purposes such as safeguarding people and property, preventing criminal offences, or managing risk situations. Video surveillance, by contrast, generally covers systems installed in private or professional settings, including premises that are accessible to the public.
In both scenarios, where images enable the direct or indirect identification of individuals, the processing qualifies as personal data processing and therefore falls within the scope of Ivorian data protection law. That said, the legal classification of the system remains decisive, as it determines the applicable authorisation regime—particularly where the stated purpose relates to public security.
Reminder of the 2013 Data Protection Law on Public Security Processing
In its decision, the ARTCI relied on the provisions of Law No. 2013-450, which strictly regulates the processing of personal data carried out on behalf of the State for purposes of public safety, national defence, or public security. Under this framework, such processing may only be implemented if authorized by decree, adopted after the Data Protection Authority has issued a reasoned opinion.
This authorisation mechanism is designed to strike a careful balance between sovereign security imperatives and the protection of individuals’ fundamental rights and freedoms. It places the Authority at the heart of the system, ensuring—at an early stage—that the proposed processing is lawful, necessary, and proportionate.
In the case at hand, however, the Authority found that the video protection system deployed by the City of Tiassalé had been installed and operated without any authorising decree and without the prior reasoned opinion of the ARTCI. This procedural failure alone was sufficient to render the processing unlawful.
An illegitimate substitution for the state in matters of public security
Beyond the procedural shortcomings, the decision brings to light a deeper institutional issue. By deploying and operating a video protection system for public security purposes without a proper legal basis, the City of Tiassalé effectively stepped into the State’s shoes, exercising sovereign security powers that fall outside its remit, even though the project had received budgetary approval.
While local authorities have their own powers and play a key role in local governance, particularly in matters of prevention and public order, those powers must be exercised strictly within the boundaries set by law. Public security, especially when it entails large-scale and intrusive processing of personal data, remains a highly regulated area that falls primarily under the responsibility of the State.
This unauthorised assumption of State prerogatives naturally drew the attention of the ARTCI, all the more so given that the system was deployed in public spaces and indiscriminately affected the population at large.
Immediate and proportionate sanctions
In response to these shortcomings, the ARTCI moved swiftly, adopting a set of immediate measures. Three key sanctions were imposed:
- (i) a formal notice requiring the City to immediately cease all processing of personal data carried out through the video protection system;
- (ii) an order rendering all collected, stored, and processed data immediately inaccessible, to prevent any further use or exploitation of the images;
- (iii) an order for the immediate closure of the control room hosting the system, located within the town hall.
These measures underscore the Authority’s firm commitment to putting an end to the infringement and preventing any further harm to the rights of the data subjects. They also highlight the effectiveness of the ARTCI’s enforcement powers, including when exercised against public authorities.
Conclusion
This case is a timely reminder that local authorities, in the exercise of their responsibilities, are subject to heightened requirements of lawfulness, transparency, and accountability. Citizens must be clearly informed about the existence of image-capture systems, their purposes, their legal basis, and the rights available to them.
The complaint that triggered the sanction also reflects a growing public awareness and vigilance around the use of personal data. It underscores that trust between public authorities and citizens depends on strict adherence to the legal framework.
The decision concerning the City of Tiassalé stands as a strong pedagogical precedent. It reinforces the message that data protection should not be viewed as a secondary constraint or a box-ticking exercise, but as a core principle shaping public action.
States, public administrations, and local authorities alike are expected to lead by example in the application of data protection law. Respecting authorisation procedures, engaging in constructive dialogue with the Data Protection Authority, and fully integrating citizens’ rights are essential to the legitimacy of security policies.
Failing this, even well-intentioned systems risk eroding public trust and exposing institutions to enforcement action. Far from being an obstacle, personal data protection emerges as a cornerstone of a modern rule-of-law State—one that respects fundamental freedoms and remains attentive to the expectations of its citizens.
ANNICK IMBOUA-NIAVA
Introduction
The year 2025 saw a broad range of Finnish data protection decisions across the administrative courts and the Office of the Data Protection Ombudsman. Taken together, these cases offer a clearer view of how national bodies continue to interpret the GDPR’s core provisions, particularly the definitions of personal data, transparency obligations, security of processing, and the limits of lawful disclosure from public registers. While Finland’s enforcement practice generally remains consistent with EU-level guidance, several decisions reveal subtle but meaningful tensions between supervisory and judicial authorities, especially where the scope and limits of lawful processing are tested. This overview synthesizes the most significant developments arising from the courts and the Ombudsman’s office during 2025.
Administrative Courts
Supreme Administrative Court (“SAC”). The Supreme Administrative Court issued five GDPR-relevant judgments in 2025. Among the year’s most significant rulings was KHO:2025:14 (31 January 2025), which examined whether the names, business IDs (Fin. ‘Y-tunnus’), and municipalities of operation of fur-farm operators constituted personal data. The Court concluded that information relating to private entrepreneurs falls within the GDPR’s definition of personal data, whereas similar data concerning incorporated entities does not. This clarification aligns with, and further reinforces, earlier CJEU jurisprudence distinguishing natural persons acting as sole traders from incorporated legal persons.
Several other SAC judgments focused on access to documents under Finland’s publicity laws when datasets include mixed information on natural and legal persons. In several of its rulings (KHO:2025:15; KHO:2025:23; KHO:2025:29; KHO:2025:51), the Court continued to emphasise proportionality and the need to balance transparency obligations with data protection requirements.
Particularly notable is KHO:2025:15, which examined whether a newspaper’s searchable online tool for the income and tax information of high-income individuals (Fin. ‘verokone’) triggered data subjects’ GDPR rights to erasure and objection. The Court held that the activity fell within the journalistic exemption of the Finnish Data Protection Act (1050/2018), stressing that Finland’s legally mandated public access to personal tax data supports the media’s role in enabling democratic debate on taxation and income distribution. As a result, freedom of expression prevailed, and the data subject’s GDPR-based claims were dismissed.
In KHO:2025:29, the Court annulled the earlier decisions of the Data Protection Ombudsman and the Administrative Court, holding that, in digital learning environments, the basic education framework can create a statutory obligation that justifies processing children’s personal data via an electronic learning platform (in this case, Google Workspace for Education) (1).
Regional Administrative Courts. Two regional administrative court decisions addressed salient GDPR questions. In Itä-Suomen HAO 539/2025 (11 March 2025), the Court held that child-protection authorities may, pursuant to section 64 of the Client Data Act (703/2023), obtain police register information necessary for assessing a child’s placement. This overturned the police authority’s more restrictive interpretation, which had relied solely on the criminal-background check procedure. As the request was limited to information on potential violence or substance abuse, the Court found the data essential for statutory child-protection duties and ordered disclosure.
Arguably more consequential was Helsingin HAO 6850/2025 (3 November 2025), which overturned a decision by the Data Protection Ombudsman concerning the automatic creation of electronic mailboxes by Posti, Finland’s national postal and logistics service provider. While the Court agreed that Posti’s transparency practices were insufficient, it held that the processing was necessary for contract performance under Article 6(1)(b) GDPR, thereby setting aside both the associated reprimand and the €2.4 million administrative fine. The ruling raises questions of legal certainty in the application of lawful-basis assessments and underscores the need for clearer national guidance in contested areas of routine service provision.
Data Protection Ombudsman’s Decisions
The Ombudsman’s office published 26 decisions in the Finlex database (i.e. Finland’s official online legal information service, maintained by the Ministry of Justice), addressing a broad spectrum of GDPR issues (2). These included procedural requirements for exercising data subject rights, security obligations in the context of critical digital infrastructure, conditions for disclosure of public-register data, the governance of public-sector cloud arrangements, and the use of cross-border transfer mechanisms.
- Facilitating Data Subject Rights. A key line of decisions concerned the channels and identification practices applied when data subjects exercised their access rights. In a decision of 11 November 2025 (TSV/112/2022), the Ombudsman found that a parking-services provider had infringed Articles 12(2), 12(6), and 15 GDPR by requiring data subjects to submit requests exclusively through a single online form and to provide their personal identity number as a precondition for processing an access request. The authority held that controllers must refrain from collecting additional data when existing information is sufficient to identify the requester. A reprimand and a compliance order were issued. The decision reinforces that procedural obstacles (whether technical, formal, or identification-related) may not impede the effective exercise of access rights (see also TSV/37/2019; TSV/2627/2024; TSV/12685/2024).
- Security of Processing. In two significant decisions of 9 September (TSV/3606/2024) and 23 October 2025 (TSV/1671/2023), the Ombudsman and the Sanctions Board addressed serious breaches in two banks’ strong electronic identification services, where software functionalities resulted in users seeing one another’s names and personal identity numbers and, in some cases, accessing services as another person. The Ombudsman found infringements of Articles 5(1)(f), 25(1), and 32 GDPR, emphasising the heightened obligations relating to change management, testing and risk assessment in high-risk authentication systems. A reprimand was issued, and the Sanctions Board imposed administrative fines of €1,800,000 and €865,000, respectively, reflecting both the gravity of the incidents and the scale of the affected services.
- Disclosure of Register Data. In a decision of 27 August 2025 (TSV/2529/2023), the Ombudsman examined disclosures from the national vehicle register to a private debt-collection company. Traficom had released personal data despite the data subject’s explicit objection, and the authority held that public-register data remain fully subject to GDPR constraints and that the lawful bases under Article 6(1)(c) and (e) must be interpreted narrowly. A similarly restrictive approach appeared on 27 May 2025 in TSV/108/2022, where a statutory pharmacy operator was found to have breached Articles 5(1), 25(2) and 32 GDPR by permitting disclosure through third-party analytics and tracking technologies (Google and Meta) without adequate purpose assessment or safeguards. Together, the decisions confirm that statutory register functions do not confer open-ended reuse rights on third parties and align with emerging judicial distinctions between information relating to private individuals and legal persons.
- Public-Sector Cloud Services. On 3 October 2025, the Ombudsman issued three coordinated decisions on public-sector cloud use (TSV/7/2022; TSV/24/2022; TSV/164/2022), following an EDPB-led EU-wide action. The investigations covered Valtori, the Digital and Population Data Services Agency (DVV) and the Tax Administration, each of which relies on extensive cloud-based infrastructure. Across all cases, the Ombudsman identified shortcomings in controller–processor arrangements, including unclear allocation of roles, incomplete contractual safeguards, and insufficient oversight of third-country data transfers, particularly regarding access from the United States. Portions of the decisions were classified for security reasons, but each authority received a reprimand and was instructed to strengthen documentation, conduct more systematic transfer-risk assessments, and ensure that cloud service use complies with Articles 28 and 46 GDPR. Collectively, the decisions demonstrate heightened scrutiny of state IT architectures and underscore the need for public bodies to demonstrate legally robust governance in shared cloud environments.
- Binding Corporate Rules. Finland acted as the lead authority in approving Nokia’s Binding Corporate Rules for both controllers and processors on 14 April 2025 (TSV/13/2023; TSV/4995/2023). The approvals confirmed compliance with Article 47 GDPR, incorporating the enhanced safeguards arising from Schrems II and mandating detailed transfer-impact assessments and enforceable data-subject rights.
Conclusion
The 2025 Finnish data protection landscape reflects a system under active refinement. Courts and the Ombudsman continued to enforce strict procedural standards for the exercise of data subject rights, with a particular emphasis on identification practices and the accessibility of channels for submitting requests. Security of processing remained a prominent theme, with enforcement targeting both substantive security breaches and shortcomings in change-management practices within high-risk digital services.
The public-register disclosure case underscored the ongoing need for clearer statutory boundaries on the reuse of personal data originally collected for public functions. Meanwhile, the Helsingin HAO’s annulment of the Ombudsman’s Posti decision highlighted persistent challenges surrounding legal certainty and the delineation of lawful processing under Article 6 GDPR.
Taken together, the developments of 2025 depict an enforcement environment that is both demanding and increasingly granular, with supervisory authorities sharpening expectations while courts continue to calibrate the outer limits of administrative reasoning. Controllers across sectors must therefore maintain rigorous procedural practices alongside robust substantive compliance as Finland’s data protection regime continues to evolve.
*****
(1) KHO:2025:23 held that the Finnish Traffic Insurance Centre had not been shown to systematically breach GDPR principles of fairness, data minimization or privacy by design when requesting patient records for claims handling. In KHO:2025:51, under Article 15 GDPR (with reference to C-579/21), a bank customer had a right of access to the dates on which his data were queried from user logs, but not necessarily to the exact times.
(2) Besides the decisions discussed below, the cases include TSV/5/2019 (access to data; Art 12, 15); TSV/11/2019 (access channels for rights requests; Art 12, 15); TSV/21/2019 (biometrics, DPIA and safeguards; Art 4, 5, 9, 25, 35); TSV/34/2019 (online service security and password practices; Art 13, 32); TSV/27/2020 (parking-debt collection, legal basis and purpose; Art 6(1)); TSV/61/2021 (erasure and retention justification; Art 5, 6, 17); TSV/304/2021 (access to data; Art 15); TSV/301/2022 (telephone marketing, information duties and identification; Art 5(1)(c), 12, 14, 25(2)); TSV/220/2023 (recorded customer calls, transparency; Art 5(1)(a), 12–13); TSV/267/2023 (access to log data; Art 15(1)); TSV/3773/2023 (public availability of meeting attachments; Art 5, 25(2), 32); TSV/1507/2024 (access to healthcare log data; Art 15(1), 12); and TSV/4667/2024 (use of credit data for marketing campaign; Art 6(1); Credit Information Act 19).
JAN LINBERG
&
DIANA ESSER
This high-level analysis focuses on the administrative fines imposed by the Hellenic Data Protection Authority (HDPA) during 2025, reflecting a continued and increasingly consistent enforcement of data subjects’ rights and accountability obligations under the GDPR.
A central theme in the HDPA’s 2025 enforcement practice concerns systematic failures by financial institutions to comply with the right of access under Article 15 GDPR. In a landmark decision, the HDPA imposed an administrative fine of EUR 200,000 on the National Bank of Greece for infringements of Articles 12(2) and 25(1) GDPR, relating to structural deficiencies in the handling of access requests and the design of internal procedures (1). Similarly, the HDPA imposed an administrative fine of EUR 50,000 on Piraeus Bank for violations of the principles of lawfulness and transparency, following inadequate information provided to a data subject concerning the disclosure of personal data in the context of loan management arrangements (2).
On a more limited scale, the HDPA addressed data breach management and internal access control failures within credit institutions. In this context, the Authority imposed an administrative fine of EUR 3,000 on Alpha Bank, taking into account the limited scope of the breach, the absence of significant harm to data subjects, and the mitigating technical and organisational measures adopted following the incident (3).
A further group of cases examined public-sector bodies, primarily in relation to persistent refusals or delays in granting access to documents forming part of personal administrative or disciplinary files. In this context, the HDPA imposed cumulative administrative fines of EUR 15,000 on the Hellenic Fire Service, comprising EUR 10,000 for repeated violations of Article 15 GDPR and EUR 5,000 for failures related to the effective involvement and independence of the Data Protection Officer (4). In a separate case involving unlawful retention of documents in a public employee’s personnel file and failure to satisfy access and erasure requests, the HDPA imposed an administrative fine of EUR 2,000 (5).
At the lower end of the sanctioning spectrum, the HDPA continued to impose fines in CCTV-related cases and individual disputes between private parties. In one such case, the Authority imposed cumulative administrative fines of EUR 3,000 on private individuals for unlawful video surveillance and failure to grant access to recorded footage, in breach of the principles of lawfulness and accountability (6). In another case concerning refusal to grant access to employment-related personal data, the HDPA imposed an administrative fine of EUR 1,000 for violation of Articles 12 and 15 GDPR (7).
To conclude, the HDPA’s enforcement activity in 2025 demonstrates a clear prioritisation of the right of access, coupled with a graduated and proportionate approach to sanctioning. Higher fines were reserved for structural or repeated compliance failures, particularly within large organisations, while targeted and limited sanctions were imposed in individual and small-scale cases, confirming the Authority’s consistent and risk-based enforcement strategy.
*****
(1) Decision 1/2025 of the Hellenic Data Protection Authority
(2) Decision 23/2025 of the Hellenic Data Protection Authority
(3) Decision 7/2025 of the Hellenic Data Protection Authority
(4) Decision 34/2025 of the Hellenic Data Protection Authority
(5) Decision 20/2025 of the Hellenic Data Protection Authority
(6) Decision 21/2025 of the Hellenic Data Protection Authority
(7) Decision 41/2025 of the Hellenic Data Protection Authority
GEORGE BALLAS
&
NIKOLAOS PAPADOPOULOS
In this article, Pádraig Walsh reviews key 2025 data privacy developments in Hong Kong and previews potential developments for the year ahead.
Protection of Critical Infrastructure (Computer Systems) Ordinance (Cap. 653)
The Protection of Critical Infrastructure (Computer Systems) Ordinance (Cap. 653) (“PCICSO”) (1) comes into force on 1 January 2026.
The PCICSO creates a dedicated cybersecurity regime for designated Critical Infrastructure Operators (“CI Operators”). The statutory obligations are organised around (i) organisational governance; (ii) preventive and technical safeguards; and (iii) incident reporting and response. CI Operators may be designated from eight essential sectors, as well as from among infrastructure operators that host key social or economic activities.
The designation of CI Operators and detailed guidance on implementation of the PCICSO are expected in early 2026. The focus of the PCICSO is not primarily on personal data protection, which remains under the regulatory ambit of the Office of the Privacy Commissioner for Personal Data (“PCPD”) (2). However, the activities of the Security Bureau in overseeing cybersecurity and implementing the PCICSO will also lead to more regulatory activity by the PCPD.
Guidelines on the Use of Generative AI
Generative AI (“GenAI”) became a mainstream enterprise tool in 2025. A number of Hong Kong authorities have issued various guidance to align use of GenAI tools with the Data Protection Principles of the Personal Data (Privacy) Ordinance (Cap. 486) (“PDPO”).
- PCPD Checklist (31 March 2025) (3): The “Checklist on Guidelines for the Use of GenAI by Employees” helps organisations develop internal policies that ensure compliance with the PDPO. In particular, policies should specify the scope of permissible use, permissible inputs and the storage of output information; embed privacy safeguards via data security measures; set ethical guardrails; and identify consequences for breaches.
- Digital Policy Office Guideline (13 April 2025) (4): The “Hong Kong GenAI Technical and Application Guideline” frames five dimensions of governance for GenAI: (i) personal data privacy, (ii) intellectual property, (iii) crime prevention, (iv) reliability and (v) trustworthiness, all within a four-tier risk classification system (unacceptable, high, limited and low risk). This is a more technical guide with role-specific information for developers, intermediaries and enterprise users of GenAI. The guideline applies within government organisations, but is also considered a general industry benchmark for practice standards.
- Anonymisation Guide (31 July 2025) (5): The PCPD endorsed the cross-jurisdictional “Guide to Getting Started with Anonymisation”, offering a practical guide to its readers. To ensure data anonymisation in AI models, users are expected to (i) know their data; (ii) remove direct identifiers from the dataset; (iii) apply anonymisation techniques to indirect identifiers; (iv) assess re-identification risks; and (v) manage those risks by implementing corresponding mitigation measures. A short illustrative sketch of steps (ii) and (iii) follows this list.
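To make steps (ii) and (iii) concrete, here is a minimal Python sketch of one common approach: dropping direct identifiers and generalising an indirect one. It is illustrative only; the record fields, the split between direct and indirect identifiers, and the 10-year age banding are assumptions made for the example, not requirements taken from the guide.

```python
# Illustrative anonymisation sketch; field names and banding rules are
# hypothetical examples, not prescriptions from the PCPD-endorsed guide.

record = {
    "name": "Jane Chan",         # direct identifier: remove (step ii)
    "id_number": "A123456(7)",   # direct identifier: remove (step ii)
    "age": 34,                   # indirect identifier: generalise (step iii)
    "district": "Kowloon City",  # indirect identifier: review in risk assessment (step iv)
    "purchase_total": 1250.0,    # non-identifying attribute
}

DIRECT_IDENTIFIERS = {"name", "id_number"}

def anonymise(rec: dict) -> dict:
    # Step (ii): drop direct identifiers entirely.
    out = {k: v for k, v in rec.items() if k not in DIRECT_IDENTIFIERS}
    # Step (iii): generalise age into a 10-year band so that combined
    # attributes are less likely to single out one individual.
    out["age_band"] = f"{(out.pop('age') // 10) * 10}s"
    return out

print(anonymise(record))
# -> {'district': 'Kowloon City', 'purchase_total': 1250.0, 'age_band': '30s'}
```

Steps (iv) and (v) have no one-line equivalent: re-identification risk depends on which other datasets could be joined against the remaining indirect identifiers, and must be reassessed whenever the data or its context changes.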
PCPD Compliance Check on Use of AI
In May 2025, the PCPD published a new round of compliance checks conducted across 60 organisations, reviewing adherence to the PDPO and PCPD published guidance (6).
Findings:
AI adoption grew. Among organisations handling personal data through AI systems, all implemented security safeguards, 83% conducted privacy impact assessments (PIAs) before implementation, and 96% conducted pre-deployment testing for reliability, robustness and fairness.
Most organisations established AI governance structures with board-level oversight. Data breach response plans were common with PCPD guidance widely referenced and adopted within such plans. No PDPO contraventions were found in this check.
As expected, compared with 2024, more organisations now handle personal data through AI. Good practices are more prominent, including data anonymisation, pseudonymisation and privacy-enhancing technologies.
PCPD inspection and investigation reports
In August 2025, the PCPD released a set of investigation reports into major data breaches. In November 2025, the PCPD published a report on a set of inspections into data breaches following initial compliance checks (7) (8) (9) (10).
- Outcomes: In the investigations, the PCPD found contraventions of DPP4 (security of personal data) due to fundamental failures and issued Enforcement Notices. In the inspections, the PCPD considered the entities compliant with DPP4 but recommended further improvements.
- Takeaways: The DPP4 standard of taking “all practicable steps” to protect personal data is a living standard. A data breach incident does not automatically result in contravention of the principle. Credible internal policies and timely remediation can support a finding of compliance. However, the absence of basic and widely available protections will likely lead to an investigation and subsequent Enforcement Notice.
2026 in Prospect
2025 was a year of transition. As AI moved firmly into the workplace mainstream, the global challenge became clear – innovate safely. Hong Kong was no exception. Looking ahead, 2026 could be a pivotal year.
The core focus and regulatory activity will be on the PCICSO and its implementation. We expect the Commissioner of Critical Infrastructure to be appointed, CI Operators to be designated, and the first wave of compliance to commence. While 2025 delivered relatively few headline developments, the implementation phase in 2026 should bring clarity and momentum.
AI adoption will continue to expand across sectors. There is already a substantial existing body of guidance in place. We expect the PCPD to shift emphasis to enforcement.
*****
(1) The Protection of Critical Infrastructure (Computer Systems) Ordinance (Cap 653): https://www.elegislation.gov.hk/hk/2025/4!en
(2) The Personal Data (Privacy) Ordinance (Cap. 486): https://www.elegislation.gov.hk/hk/cap486
(3) The Checklist on Guidelines for the Use of GenAI by Employees: https://www.pcpd.org.hk/english/resources_centre/publications/files/guidelines_ai_employees.pdf
(4) The Hong Kong GenAI Technical and Application Guideline: https://www.digitalpolicy.gov.hk/en/our_work/data_governance/policies_standards/ethical_ai_framework/doc/HK_Generative_AI_Technical_and_Application_Guideline_en.pdf
(5) The Guide to Getting Started with Anonymisation: https://www.pcpd.org.hk/english/resources_centre/publications/files/appa_anonymisation_guide072025.pdf
(6) PCPD 2025 Compliance Check on use of AI: https://www.pcpd.org.hk/english/resources_centre/publications/files/AI_ComplianceChecks.pdf
(7) Investigation Report for Kwong’s Art Jewellery Trading Company Limited and My Jewelry Management Limited: https://www.pcpd.org.hk/english/enforcement/commissioners_findings/files/r25_19241_e.pdf
(8) Investigation Report for Adastria Asia Co., Limited: https://www.pcpd.org.hk/english/enforcement/commissioners_findings/files/r25_19906_e.pdf
(9) Inspection Report for HKICC: https://www.pcpd.org.hk/english/enforcement/commissioners_findings/files/r25_17740_e.pdf
(10) Inspection Report for HKCT: https://www.pcpd.org.hk/english/enforcement/commissioners_findings/files/r25_0255_e.pdf
PÁDRAIG WALSH
Abstract
This article examines the massive data breaches that struck Mexico in 2025, focusing on the Instituto Mexicano del Seguro Social (IMSS) and other public bodies. It explores the constitutional, statutory, and comparative dimensions of these incidents, situating them within Mexico’s evolving data protection framework and global standards.
Introduction
The year 2025 marked a turning point for data protection in Mexico. The massive breach at the Instituto Mexicano del Seguro Social (IMSS), involving the theft and sale of personal data of nearly 20 million beneficiaries, revealed systemic vulnerabilities in public institutions. This incident, alongside breaches in other government agencies, underscores the urgent need for stronger enforcement of Mexico’s Federal Law on the Protection of Personal Data Held by Private Parties (LFPDPPP) and the General Law on the Protection of Personal Data in Possession of Obligated Subjects (LGPDPPSO).
Key Facts
- Scope: Approximately 20 million IMSS beneficiaries affected. (1)
- Data exposed: Full names, CURP (unique population registry codes), dates of birth, pension modalities, and other sensitive identifiers.
- Source of breach: Reports indicate up to 70% of incidents originated internally, through employees. (2)
- Dark web sale: Data sets were confirmed to be sold in illicit markets. (3)
Legal Framework
Mexico’s dual regime—LFPDPPP (private sector) and LGPDPPSO (public sector)—creates overlapping obligations. The IMSS, as a public body, is bound by the LGPDPPSO, which requires:
- Implementation of administrative, technical, and physical safeguards.
- Immediate breach notification to affected individuals.
- Accountability mechanisms through the Secretaría Anticorrupción y Buen Gobierno, which assumed the functions of the now-defunct INAI in March 2025.
Yet enforcement remains inconsistent. Sanctions are often symbolic, and breach notification practices are underdeveloped compared to the EU’s GDPR.
Legal Analysis
The IMSS breach raises critical questions about accountability under Mexican law. As a public body, IMSS falls under the LGPDPPSO, which mandates strict safeguards for sensitive data. However, enforcement mechanisms remain weak, and sanctions are often symbolic. The breach illustrates the tension between constitutional rights to privacy (Article 16 of the Mexican Constitution) and the operational realities of large bureaucracies. Moreover, the incident highlights the need for harmonization with international standards such as the EU’s GDPR, particularly regarding breach notification and data subject remedies.
Comparative Perspective
Compared to other jurisdictions, Mexico’s response appears fragmented. While the GDPR imposes strict timelines for breach notification and heavy fines, Mexican authorities have yet to establish a consistent enforcement practice. The IMSS case demonstrates the risks of underinvestment in cybersecurity and the absence of a culture of compliance.
Conclusion
The IMSS breach of 2025 is not merely a technical failure but a constitutional challenge. It calls for a rethinking of Mexico’s data protection framework, stronger institutional accountability, and alignment with global standards. For legal practitioners, the case serves as a reminder that data protection is no longer peripheral but central to the rule of law.
*****
ENRIQUE OCHOA DE GONZÁLEZ ARGUELLES
&
STEPHANY HERNÁNDEZ SIMÓN
&
OMAR ALEJANDRI RODRÍGUEZ
Introduction
The Nigerian Data Protection Act 2023 (NDPA) establishes a comprehensive statutory regime for data protection in Nigeria.
Prior to its enactment, the Nigerian Data Protection Regulations 2019 (NDPR), issued by the National Information Technology Development Agency (NITDA), served as the country’s principal data protection framework. The NDPR significantly shaped national awareness and enforcement, prompting multiple regulatory actions and public sensitisation efforts. An executive directive later created the Nigerian Data Protection Bureau (NDPB) as an independent body to enforce the NDPR.
With the passage of the NDPA, the NDPB was transformed by operation of law into the Nigerian Data Protection Commission (NDPC), which has since recorded notable enforcement activities along with increased private litigation.
Facts of the Case
The Applicant, Mr. Araka, a registered customer of the now-defunct Jumia Foods, brought an action asserting breaches of his constitutional right to privacy and violations of the NDPA. He sought declarations that both Respondents qualify as data controllers, that the 2nd Respondent’s retention of his personal data violated his right to erasure under Section 34(2), and that the processing of his data for direct marketing contravened Sections 25 and 26 of the NDPA.
The 1st Respondent, an online delivery platform that partnered with Jumia Foods, argued that fulfilling customer orders required the collection and processing of certain personal data. It relied on the Jumia Foods Terms and Conditions and Privacy Notice, asserting that the Applicant consented to the sharing of his data with third parties, including the 2nd Respondent, for order processing only.
The 2nd Respondent relied on an On-Demand Service Agreement with the 1st Respondent and stated that its marketing messages were sent in bulk to existing or past customers, with an opt-out option provided.
After the Applicant complained, the 1st Respondent instructed the 2nd Respondent to stop sending unsolicited marketing messages, which it did; however, the messages resumed automatically when the Applicant later placed a new order.
Findings of the Court
Delivering judgment on 18 February 2025, the Court reaffirmed that the constitutional right to privacy extends to personal information and that the NDPA aims to ensure lawful, transparent, and accountable processing of personal data. Any violation of the NDPA, it held, constitutes a breach of the constitutional right to privacy and attracts compensation.
The Court accepted the 1st Respondent’s argument that the Applicant consented to the processing of his data by accepting its contractual documents, noting that the Applicant did not contest this point. Applying Section 131 of the Evidence Act, the Court found that the Applicant failed to prove that his data had been processed beyond the purpose for which it was collected.
Regarding the lawful basis for processing under Section 25, the Court held that the processing was valid under Section 25(1)(b) (performance of a contract, here for placing and delivering food orders). However, it emphasised that nothing in the contract authorised the sending of unsolicited marketing messages. It therefore found the 2nd Respondent’s marketing communications to be unlawful and in violation of the Applicant’s privacy rights.
The Court declared that the Respondents qualify respectively as a data controller and data processor under the NDPA, that the 2nd Respondent’s retention of the Applicant’s personal data breached Section 34(2), and that its processing lacked a lawful basis and valid consent.
The Court awarded ₦3,000,000 in general damages against the 2nd Respondent.
Commentary
Three observations arise from the judgment.
- First, in assessing consent under Section 26, the Court did not specify how consent was obtained, seemingly treating acceptance of contractual documents as sufficient. Yet Sections 26(3) and 26(7) require affirmative, unambiguous consent; silence, inactivity, and pre-selected options are expressly excluded.
- Second, the Court did not examine whether the “legitimate interest” basis under Section 25(1)(v) could apply to the 2nd Respondent’s marketing communications. This ground is relevant where a pre-existing relationship exists and direct marketing falls within the data subject’s reasonable expectations.
- Third, by holding that consent was limited strictly to order processing, the Court appears to conflate consent with legitimate interest, foreclosing the possibility that the 2nd Respondent might rely on the latter as an alternative lawful basis.
Conclusion
This judgment reinforces the growing judicial recognition of data protection rights in Nigeria and elevates NDPA non-compliance to a constitutional breach with significant financial consequences. It highlights the need for data controllers and processors to ensure strict purpose limitation, robust consent practices, and effective governance mechanisms for rights such as erasure and objection to direct marketing. The judgment also exposes interpretive gaps, particularly regarding valid consent and the role of legitimate interest. Organisations must therefore adopt higher compliance standards, implement explicit opt-in mechanisms, and document lawful bases for processing to mitigate regulatory and judicial risks. Proactive compliance is not optional; it is essential to avoid reputational damage, enforcement actions, and liability for substantial damages.
CHUKWUYERE IZUOGU
nigeria@lexing.network
Internal gossip about an employee’s drug test refusal cost an employer $30,000, showing how easily workplace chatter can breach privacy law.
In March 2025, the Human Rights Review Tribunal ordered KAM Transport Limited to pay $30,000 to a former employee after it found the company had breached the employee’s privacy (1). The case arose from an internal disclosure of sensitive information that led to damaging rumours about the employee, including a false allegation that he was involved in drug dealing. The Tribunal’s findings reinforce that privacy obligations extend to internal conversations and that “casual” workplace gossip can expose employers to legal and reputational risk.
Background
Cummings, a long-serving truck driver, was selected for a random drug test at work in August 2020. He refused to take the test, triggering a disciplinary process as permitted under his employment agreement. After returning a negative test a week later, he resumed his duties. However, within days of his return, a forklift driver at a client’s site confronted him with a rumour that he had been dismissed for failing a drug test and was a drug dealer. It soon became clear that the rumour had originated within KAM and had spread both internally and externally.
Poor conduct by the employer
The Tribunal accepted Cummings’ evidence that the source of the leak was KAM’s branch manager, who had disclosed the drug test refusal to a driver with no management responsibilities. That employee then passed the information on to others, including staff at a client site. KAM denied that the disclosure had happened and therefore did not try to justify it under any of the exceptions in the Privacy Act. The Tribunal found that the internal disclosure did occur.
What the Tribunal considered
The Tribunal focused on Information Privacy Principle 11 (IPP 11) in the Privacy Act 2020, which says personal information must not be disclosed unless one of the specific exceptions applies. One of those exceptions, set out in IPP 11(a)(i), allows disclosure where it directly relates to the reason the information was collected in the first place (2). However, KAM didn’t rely on that exception or any other exception. The Tribunal also referred to IPP 10, which limits how personal information can be used within a business (3).
The Tribunal made three key findings on the facts:
- The information was sensitive: It related to a refusal to take a drug test, which carries stigma and reputational risk.
- The person it was shared with didn’t need to know it: The recipient was not a manager and had no role in the disciplinary process.
- No legal exception applied: The disclosure wasn’t necessary for any operational reason and wasn’t permitted under the Privacy Act.
The Tribunal found that the disclosure caused Cummings significant emotional harm. KAM argued the comments made to Cummings were in jest, and that the situation had been blown out of proportion. The Tribunal disagreed. It accepted that Cummings was deeply affected by the disclosure. He felt humiliated, lost confidence, experienced anxiety and depression, and feared for his safety when travelling to some areas.
Cummings later resigned, stating that the incident had irreparably damaged his trust in his employer. However, the Tribunal declined to award him compensation for lost income, noting that KAM had made efforts to support him and that the privacy breach, while serious, did not directly lead to his resignation.
The Tribunal’s findings
The Tribunal ordered KAM to pay $30,000 to Cummings for the emotional harm he experienced. It also made a formal finding that KAM had breached his privacy rights. The award was based on the seriousness of the leak, the sensitivity of the information, and the long-term impact on Cummings. The Tribunal placed the case within the middle of the range of seriousness for privacy breaches and said it did not see any aggravating conduct from KAM’s leadership beyond the original disclosure.
Key takeaway
This case serves as a strong reminder that privacy obligations apply to all communications, including those within the workplace. Information about staff drug testing, disciplinary issues, or health concerns must only be shared on a strict need-to-know basis. A single disclosure outside of the scope of the Privacy Act, even if well-intentioned, can cause emotional harm and significant liability, as well as significant damage to a business’s reputation and relationships.
*****
(1) Cummings v KAM Transport Limited [2025] NZHRRT 8
(2) Privacy Act 2020, section 22 (IPP 11)
(3) Privacy Act 2020, section 22 (IPP 10)
DAVID ALIZADE
newzealand@lexing.network
Portugal’s data-protection landscape in 2024–2025 continued to reflect ongoing tensions between technology, established administrative practices, and fundamental rights. In the broader EU context, the Portuguese National Data Protection Commission (“CNPD”) is not generally seen as a high-volume or punitive supervisory authority. Still, in recent years, enforcement has become more assertive. The CNPD now intervenes selectively in high-risk activities and applies a strict stance to public-sector compliance.
Within the national GDPR framework, the CNPD acts as an independent supervisory authority with administrative and financial autonomy. It monitors compliance across both public and private sectors. Although the CNPD has not published its activity report for the current year, recent enforcement measures and judicial decisions already show the main dynamics shaping GDPR enforcement in Portugal.
Developments in 2025 signal a shift from isolated infringements toward closer scrutiny of organizational and structural compliance, especially within public administration. This trend builds on 2024 enforcement patterns.
Portuguese case law now confirms that public authorities are fully subject to GDPR sanctions and that neither administrative convenience nor public interest justifies easing data-protection obligations.
Worldcoin
Although decided in 2024, the Worldcoin case remains relevant as contextual background to enforcement priorities observed in 2025. The project involved large-scale biometric data collection and raised concerns about transparency, consent, access by minors, and the long-term implications of biometric identification. In response, the CNPD ordered the temporary suspension of biometric data collection in Portugal, acting in coordination with other EU supervisory authorities.
The significance of the case lies primarily in its procedural and regulatory implications. It illustrates the CNPD’s willingness to intervene at an early stage when processing operations present heightened risks, rather than relying solely on ex post corrective action. This precautionary approach continues to inform supervisory attention regarding sensitive, technologically complex processing activities.
Russia Gate
The main enforcement development of 2025 centered on the disclosure by the Lisbon City Council of personal data about the organizers of demonstrations and protests. Public attention grew after the personal data of organizers of a protest against the imprisonment of Alexei Navalny was sent to the Russian Embassy in Portugal.
The CNPD began investigating in 2021 and found that disclosing personal data to third parties, without a lawful basis or proper safeguards, was a long-standing administrative practice. The authority concluded that the GDPR had been infringed and imposed a fine of €1.2 million. The administrative courts upheld these findings despite the municipality’s appeals.
In August 2025, the second-instance administrative court confirmed the CNPD’s decision. The court reaffirmed that the GDPR applies fully to public authorities. Local government bodies must meet the same standards of legality, necessity, and proportionality as private entities. This case shows clear judicial affirmation of public-sector accountability under the GDPR.
The AIMA incident
Additional insight into enforcement trends in 2025 is provided by the incident involving the Portuguese Agency for Integration, Migration, and Asylum (“AIMA”). In the context of administrative communication, the Agency improperly disclosed the email addresses of hundreds of individuals, revealing shortcomings in basic confidentiality safeguards.
Although limited in scope, the incident highlights ongoing operational weaknesses in public administration. As in other recent cases, the breach did not result from deliberate noncompliance. Instead, it came from outdated procedures, weak internal controls, and poor integration of GDPR principles (such as data minimization and security) into daily operations.
Conclusions
In summary, the trend in 2024–2025 has been toward targeted, structured GDPR enforcement in Portugal, with increased emphasis on public-sector accountability and on integrating data protection principles into administrative practices.
These cases suggest that certain administrative procedures, originally designed before the GDPR era, required substantial adjustment to meet current data-protection standards. As such, some public bodies have encountered greater challenges in updating their internal processes and ensuring full compliance.
JOÃO G. GIL FIGUEIRA
In 2025, Romania continued to see active enforcement of data protection laws by the National Supervisory Authority for Personal Data Processing (in Romanian: ANSPDCP), reflecting both growing public attention to privacy and increasing regulatory vigilance. Underlying this enforcement is the continuing application of the GDPR (Regulation (EU) 2016/679) and relevant national legislation (e.g., Law 506/2004 on electronic communications privacy).
Key Statistics & Enforcement Trends
Between January and April 2025 alone, ANSPDCP reported handling 3,048 complaints, notifications, and data-breach reports, which led to 153 investigations.
From those investigations, the authority issued 52 fines (totalling approximately 1.13 million RON, ≈ €230,500), 68 warnings, 96 corrective measures, and 2 decisions to cease data processing.
Common issues triggering sanctions included unlawful data disclosure (online or public), excessive or unauthorised video surveillance, failure to respond to data-subject requests, direct marketing without consent, and inadequate data-security measures (especially following cyber-incidents) (1).
These metrics show that data-subject complaints and data-breach notifications remain frequent and that ANSPDCP continues to follow through with investigations, a clear sign that the principles of accountability, transparency, and security are being enforced meaningfully.
Notable Cases and Sanctions in 2025
- Fines for Security Failures and Data Breaches
A company named Prime Transaction SA was sanctioned after a cyber-attack in June 2025 exposed personal data (names, addresses, tax codes, bank data, identity documents, etc.). The company failed to implement adequate technical and organisational safeguards, violating GDPR’s security requirements (Art. 32). The fine imposed was ≈ €2,000 (10,162.20 lei) (2).
In October 2025, the Romanian Data Protection Authority sanctioned Cucina di Fabio S.R.L. following an investigation triggered by a complaint about unsolicited marketing emails. The authority found that the company collected email addresses and sent commercial messages without ensuring that recipients were genuine users of those addresses or that they had expressly opted in for electronic marketing communications. As a result, the operator received two fines: €1,000 for unlawful processing and €2,000 for insufficient data-protection measures (3).
Another violation of art. 32 of the GDPR was sanctioned with a €10,000 fine (4). The political party AUR was also fined a total of €25,000 (5) after two major GDPR breaches came to light. The first incident involved a security vulnerability in the aur.mobi app that exposed extensive personal data of supporters and members, revealing inadequate security and poor implementation of data-protection-by-design measures. The second breach concerned the party’s platforms semnezsivotez.ro and semnezsivotez.org, where excessive personal data was collected without a lawful basis and beyond what was necessary for the stated purposes. During the inquiry, AUR took down the two platforms involved.
These decisions underscore that IT-security failures, especially following breaches or attacks, are taken seriously, and that penalties can hit companies across sectors, including, presumably, small and medium-sized enterprises.
- Unauthorised Surveillance & Employee Tracking
One investigation (6) focused on a firm that had monitored employees’ GPS locations outside working hours (including vacations) and retained that data longer than permitted. Initially, ANSPDCP imposed a fine of ~19,898.4 lei (~€4,000) and a warning.
Another investigation found that ROUMASPORT SRL had unlawfully used its video-surveillance system to process employees’ images, including for disciplinary purposes, leading both to a fine of 5,000 and to a corrective order to bring its monitoring practices in line with the GDPR (7).
Last but not least, the Romanian DPA sanctioned a school, the Vasile Conta High School in Târgu Neamț, with two warnings after finding unlawful and excessive video surveillance, including cameras installed in areas designated as restrooms and unrestricted access to live feeds. The school had processed students’ and staff members’ images in violation of GDPR principles and had failed to implement adequate technical and organisational security measures (8).
- Privacy Violations on Websites (Cookies / Transparency / Consent)
A high-profile case involved former political figure Călin Georgescu, whose personal website drew a fine of over 50,000 lei for deploying cookies and collecting personal data without proper user consent or transparency. Between December 2024 and April 2025, visitors’ cookies and contact-form data (names, phone numbers, email addresses) were collected without adequate notice or consent, in breach of both Law 506/2004 and GDPR transparency obligations (9).
Another operator, Coral Travel & Tourism Services SRL, was fined 5,000 lei for installing non-essential cookies on its website without user consent, thus violating privacy-law requirements for clear, informed consent and proper cookie-banner implementation (10).
These cases illustrate that even “ordinary” website operators (from public individuals to tourism services) remain under scrutiny: consent must be explicit, transparency rules must be observed, and any data collection must be justified by a legal basis.
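For website operators, the practical upshot is partly mechanical: non-essential cookies may be set only after an affirmative opt-in. The following is a minimal sketch of that gating logic, assuming a Flask server; the endpoint, cookie names, and storage choices are illustrative assumptions of ours, not anything prescribed by ANSPDCP or the decisions above.

```python
# Illustrative sketch only: names and logic are assumptions, not a
# prescribed implementation. It shows the principle behind the ANSPDCP
# cookie cases: strictly necessary cookies may be set freely, while
# non-essential (e.g., analytics) cookies require an explicit opt-in.
from flask import Flask, request, make_response

app = Flask(__name__)

def has_cookie_consent(req) -> bool:
    """Consent is an affirmative, recorded choice; a missing or
    pre-selected value counts as no consent."""
    return req.cookies.get("cookie_consent") == "accepted"

@app.route("/")
def index():
    resp = make_response("<html>...page with a consent banner...</html>")
    # A session cookie needed to deliver the requested service is
    # "strictly necessary" and does not require consent.
    resp.set_cookie("session_id", "abc123", httponly=True)
    # Analytics cookies are non-essential: set them only after opt-in.
    if has_cookie_consent(request):
        resp.set_cookie("analytics_id", "xyz789")
    return resp

@app.route("/consent", methods=["POST"])
def record_consent():
    # Called only when the user actively clicks "accept" in the banner.
    resp = make_response("", 204)
    resp.set_cookie("cookie_consent", "accepted", max_age=180 * 24 * 3600)
    return resp
```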
What 2025 Tells Us About the State of Data Protection in Romania
- a) Data protection is more than a formality: The volume of complaints and investigations suggests that many data subjects are alert and willing to report perceived abuses, and ANSPDCP acts on a large share of them.
- b) Compliance must be proactive and continuous: The 2025 enforcement wave shows that compliance is not a one-time effort. Entities must constantly monitor data-processing practices: conduct data protection impact analyses, implement strong technical and organizational safeguards, maintain transparent consent mechanisms, and respect data-subject rights including minimization and timely response to requests.
- c) Public-profile actors are not immune: The Călin Georgescu case shows that even public figures or politically exposed individuals are subject to the same rules, which is important for trust in oversight.
Conclusion
The developments in Romania throughout 2025 demonstrate that data protection has become a practical, enforceable obligation rather than a theoretical legal concept. With hundreds of investigations, numerous fines, and a wide spectrum of violations, from cybersecurity breaches to improper cookie-consent mechanisms, the national data protection authority is actively scrutinizing both public-facing platforms and internal corporate operations. For companies, public institutions, and even individuals maintaining an online presence, the message is unmistakable: neglecting or disregarding data-protection rules will inevitably attract sanctions.
*****
(1) ANSPDCP press release: https://www.dataprotection.ro/?page=%20Comunicat_Presa_23_05_2025&lang=ro
(2) ANSPDCP press release: https://www.dataprotection.ro/?page=Comunicat_Presa_16.10.2025
(3) ANSPDCP press release: https://www.dataprotection.ro/?page=Comunicat_Presa_26.11.2025
(4) ANSPDCP decision: https://www.dataprotection.ro/?page=Comunicat_Presa_01.09.2025
(5) ANSPDCP decision: https://www.dataprotection.ro/?page=Comunicat_Presa_07_07_2025
(6) ANSPDCP press release: https://www.dataprotection.ro/?page=Comunicat_Presa_13_11_2024&lang=ro
(7) ANSPDCP press release: https://www.dataprotection.ro/?page=Comunicat_Presa_09.05.2025
(8) ANSPDCP press release: https://www.dataprotection.ro/?page=Comunicat_Presa_07.02.2025
(9) ANSPDCP press release: https://www.dataprotection.ro/?page=Comunicat_Presa_16.07.2025
(10) ANSPDCP press release: https://www.dataprotection.ro/index.jsp?page=Comunicat_Presa_14.10.2025
RAUL-FELIX HODOȘ
&
ALEXANDRA-DENISA SĂSĂRMAN
Brief Background Facts
In October 2023, Marina Bay Sands Pte Ltd (MBS), an integrated hotel and casino resort, suffered a data breach. The personal data of more than 500,000 individuals were illegally accessed and offered for sale on the dark web. A threat actor gained unauthorised access to six customer accounts, which were then used to access data of other members. The unauthorised access was caused by an employee’s misconfiguration error during MBS’s software migration exercise.
PDPC’s Decision and Financial Penalty Framework
- Commission’s Findings
The Personal Data Protection Commission (PDPC) found that MBS had negligently breached its Protection Obligation under the Singapore Personal Data Protection Act 2012 (PDPA) by failing to put in place reasonable security arrangements to mitigate the risk of human error during the software migration exercise.
PDPC initially determined that MBS should pay a financial penalty of SGD 450,000, taking into account the size of its annual turnover. This amount was reduced to SGD 315,000 after considering MBS’s overall prompt and voluntary remedial measures, including its voluntary notification to all affected individuals despite not being legally obligated to do so.
This is the second highest financial penalty imposed by the PDPC since the enactment of the PDPA (1) and the first under the strengthened financial penalty framework, which was introduced to enhance deterrence and underscore the importance of data protection in the digital economy.
- Objective Computation Framework
The new financial penalty framework starts with the statutory maximum financial penalty – 10% of the total annual turnover in Singapore if that turnover exceeds SGD 10 million.
The PDPC then calibrates the penalty based on aggravating and mitigating factors (such as cooperation with the investigation and voluntary breach notification), the company’s financial situation, and whether the penalty achieves effective deterrence while ensuring proportionality.
In applying the new financial penalty framework, the PDPC will consider guiding principles such as deterrence and balancing of interests.
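To make the computation concrete, here is a minimal sketch of the framework’s arithmetic. The single 30% mitigation discount (implied by the reduction from SGD 450,000 to SGD 315,000), the assumed turnover figure, and the function names are our own simplifying assumptions; the PDPC’s actual calibration is a multi-factor balancing exercise, not a formula.

```python
# Illustrative sketch only: the PDPC calibrates penalties through a
# multi-factor balancing exercise; the figures and the single-discount
# logic below are simplifying assumptions, not its published method.

def statutory_maximum(sg_annual_turnover_sgd: float) -> float:
    """Cap under the strengthened framework: 10% of annual Singapore
    turnover where that turnover exceeds SGD 10 million (otherwise a
    fixed SGD 1 million cap applies under the amended PDPA)."""
    if sg_annual_turnover_sgd > 10_000_000:
        return 0.10 * sg_annual_turnover_sgd
    return 1_000_000

def calibrated_penalty(base_sgd: float, mitigation_discount: float,
                       cap_sgd: float) -> float:
    """Apply a single mitigation discount (standing in for the PDPC's
    weighing of aggravating and mitigating factors), within the cap."""
    return min(base_sgd * (1.0 - mitigation_discount), cap_sgd)

# MBS illustration: a starting figure of SGD 450,000, reduced by 30%
# for prompt, voluntary remediation, gives the SGD 315,000 imposed.
cap = statutory_maximum(sg_annual_turnover_sgd=250_000_000)  # assumed turnover
print(calibrated_penalty(450_000, 0.30, cap))  # 315000.0
```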
With this decision, there is greater clarity on how the PDPC will impose financial penalties for PDPA breaches.
*****
(1) The highest amount, SGD 750,000, was imposed on Integrated Health Information Systems Pte Ltd (IHiS) in 2019 in Singapore Health Services Pte Ltd & Ors [2019] SGPDPC 3.
WINNIE CHANG
In February 2025, the Swedish Supreme Court (1) issued two landmark decisions that fundamentally clarify how Sweden must balance its longstanding constitutional principles of transparency and freedom of expression with the European Union’s privacy legislation.
The tension arises between the principle of public access to official documents (2), the constitutional protections for media under the Fundamental Law on Freedom of Expression (3), the secrecy provisions in the Public Access to Information and Secrecy Act (4), and the strict requirements of the General Data Protection Regulation (GDPR) (5), particularly article 10, which governs the processing of criminal-offense data.
These rulings affirm that Swedish courts must continue to release public documents, yet they must now carefully restrict how private actors may process sensitive personal data once obtained.
Trobar AB (Case Ä 3169-24)
The first case concerned Trobar AB (“Trobar”), a company operating a legal information database protected by a publication certificate (6) under the Fundamental Law on Freedom of Expression. Trobar requested large volumes of criminal judgments from a district court. The court denied the request, arguing that Trobar’s intended use, involving searchable identifiers and alert services, would violate article 10 of the GDPR, which places strict limits on the processing of criminal-offense data by private actors.
The Swedish Supreme Court overturned the refusal and ordered that the documents be released, but only subject to binding conditions (7). Trobar was prohibited from disclosing personal names, personal identity numbers, or addresses, from offering search functions keyed to personal identifiers, and from operating alert services that identify individuals based on the documents.
The Court emphasized the primacy of EU law, meaning that article 10 of the GDPR applies even where the Fundamental Law on Freedom of Expression provides constitutional protection. While the principle of public access to official documents obliges courts to release public documents, OSL 21 kap. 7 § (“Chapter 21, Section 7 of the Public Access to Information and Secrecy Act”) allows safeguards when it can be assumed that the recipient will process data in violation of the GDPR. A proportionality assessment therefore becomes necessary. Documents may be accessed, but their digital reuse must be restricted. This decision marks a significant recalibration of the relationship between Swedish constitutional law and EU privacy law.
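Operationally, conditions of this kind require database providers to strip or block personal identifiers before documents become searchable. The sketch below illustrates only the simplest piece of that task, redacting Swedish personal identity numbers by pattern; the regular expression, placeholder text, and example are our own assumptions, and names or addresses would need more sophisticated techniques (such as named-entity recognition) that a plain pattern cannot provide.

```python
# Illustrative sketch only: a minimal pre-indexing filter in the spirit
# of the Supreme Court's conditions (no personal identity numbers
# exposed, no search keyed to personal identifiers). Real compliance
# would also need to handle names and addresses, which plain regexes
# cannot reliably catch.
import re

# Swedish personal identity numbers: YYMMDD-XXXX or YYMMDD+XXXX,
# optionally with a four-digit year (YYYYMMDD-XXXX).
PERSONNUMMER = re.compile(r"\b(?:\d{2})?\d{6}[-+]\d{4}\b")

def redact_identifiers(text: str) -> str:
    """Replace personal identity numbers before a judgment is ingested
    into a searchable database."""
    return PERSONNUMMER.sub("[redacted]", text)

sample = "Defendant NN, personal identity number 850709-1234, Storgatan 1."
print(redact_identifiers(sample))
# -> "Defendant NN, personal identity number [redacted], Storgatan 1."
```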
Nyhetsbyrån Siren / Panoptes Sweden AB (Case Ä 3457-24)
The second case involved Nyhetsbyrån Siren, a major Swedish news agency seeking bulk access to criminal case documents. The Court of Appeal granted the request but imposed protective conditions. Siren appealed, arguing that its journalism, shielded by the Fundamental Law on Freedom of Expression, should exempt it from the restrictions imposed by GDPR.
The Swedish Supreme Court upheld the disclosure while modifying the conditions, again preventing the agency from exposing names, personal identity numbers, or addresses through searchable databases or commercial services. The Court reaffirmed that article 10 of GDPR governs processing of criminal-offense data regardless of journalistic purpose. Although exemptions are granted under article 85 of the GDPR, these exemptions do not authorize the construction of a comprehensive searchable database of criminal-offense information. Under OSL 21 kap. 7 §, courts must continue to release documents but must evaluate proportionality in each case, balancing transparency, journalistic freedom and personal privacy.
Final thoughts
Both these decisions reshape the legal landscape for access to criminal case information in Sweden. Public access remains intact under the principle of public access to official documents, but once data leaves the court’s possession, its processing must comply with GDPR’s framework. The conditions restricting personal identifiers and searchability are now essential tools for courts. The rulings also make clear that constitutional protections under the Fundamental Law on Freedom of Expression do not override GDPR when sensitive criminal-offense data is involved.
Ultimately, the Swedish Supreme Court has confirmed that Sweden’s tradition of transparency and the European Union’s privacy standards must coexist. The rulings preserve openness while ensuring that individuals’ privacy is not compromised through modern, large-scale digital processing. These decisions will guide courts, journalists, legal databases, and private companies navigating the evolving intersections of data protection and freedom of expression.
*****
(1) Högsta domstolen
(2) Offentlighetsprincipen
(3) Yttrandefrihetsgrundlagen
(4) Offentlighets- och sekretesslagen
(5) Dataskyddsförordningen
(6) utgivningsbevis
(7) Known as förbehåll
KATARINA BOHM HALLKVIST
For businesses operating in the United States, 2025 was a year of heightened scrutiny and evolving obligations. The U.S. remains unique: unlike the EU’s unified General Data Protection Regulation (GDPR) framework, it offers a patchwork of state and federal laws to address data security, privacy, and AI. This complexity—combined with rising litigation and regulatory actions—makes compliance a moving target for global organizations.
California Sets the Tone
California continued to lead U.S. privacy enforcement, and two cases in particular illustrate why international companies should pay attention.
The action against American Honda Motor Co. was the first public enforcement action by the California Privacy Protection Agency (CPPA) under the California Consumer Privacy Act (CCPA). Honda was fined $632,500 for failing to honor opt-out preference signals and for allegedly imposing excessive verification hurdles on consumer requests. The settlement required Honda to redesign its user experience and overhaul its ad-tech contracts to meet statutory requirements (1).
Similarly, Tractor Supply Co. agreed to pay $1.35 million after regulators found its privacy notices inadequate and opt-out mechanisms ineffective (2).
These cases illustrate that regulators are targeting not just data misuse but also user interface design, penalizing “dark patterns” and poor consent flows. For international organizations, this signals that compliance is not just about having a privacy notice; it’s about how choices are presented and honored in practice. U.S.-compliant notices, policies, and practices are critical.
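Opt-out preference signals are transmitted mechanically: under the Global Privacy Control specification, for example, a participating browser sends a "Sec-GPC: 1" header with each request. The sketch below shows what honoring such a signal without extra verification hurdles might look like; the Flask endpoint, the in-memory preference store, and the response behavior are purely illustrative assumptions of ours, and the CCPA regulations impose requirements well beyond this.

```python
# Minimal illustration of honoring an opt-out preference signal.
# Under the Global Privacy Control spec, a participating user agent
# sends the header "Sec-GPC: 1". Everything else here (the endpoint,
# the in-memory store) is an assumption for demonstration purposes.
from flask import Flask, request

app = Flask(__name__)

# Stand-in for a real preference store keyed to the consumer.
opted_out_users: set[str] = set()

@app.route("/page")
def page():
    user_id = request.cookies.get("user_id", "anonymous")
    if request.headers.get("Sec-GPC") == "1":
        # Treat the signal as a valid request to opt out of the
        # sale/sharing of personal information, with no extra hurdles.
        opted_out_users.add(user_id)
    if user_id in opted_out_users:
        return "Page served without third-party ad-tech calls."
    return "Page served with (consented) ad-tech integrations."
```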
Children’s Data and AI Litigation
The Federal Trade Commission finalized major amendments to the Children’s Online Privacy Protection Act (COPPA) in June 2025, expanding the definition of personal information to include biometrics and requiring separate parental consent for third-party disclosures, including targeted advertising (3). These changes raise the bar for ed-tech, gaming, and mixed-audience platforms, and reflect a broader trend toward stricter controls on sensitive data.
Meanwhile, artificial intelligence litigation surged. In The New York Times v. OpenAI, a federal court ordered the production of millions of anonymized ChatGPT logs, rejecting privacy objections and expanding discovery obligations around AI training data (4). This case underscores the growing tension between transparency and privacy in AI systems—a theme that resonates globally as the EU AI Act began to take effect.
Patchwork Expansion and Litigation Trends
By the end of 2025, 20 states were enforcing consumer privacy laws, with eight new statutes having taken effect during the year (5). Each law introduces unique thresholds and consent requirements, making compliance for multinational companies increasingly complex. At the same time, class actions and regulatory investigations are on the rise, targeting privacy missteps, biometric data use, and AI practices. Plaintiffs are stretching old laws, such as video privacy and wiretap statutes, to cover new technologies. One piece of helpful news for businesses operating in the U.S. is that courts are demanding greater specificity in pleadings.
For international businesses, this means higher litigation risk and the need for robust arbitration strategies. However, certain arbitration clauses and class-action waivers may not be enforceable under U.S. law, so boilerplate terms often fail to provide adequate protection (6).
Accessibility and Risk Reduction
Beyond privacy, U.S. law imposes accessibility obligations under the Americans with Disabilities Act (ADA). Websites and mobile apps must provide accessible experiences for individuals with disabilities, and noncompliance can trigger lawsuits and reputational harm. The past year has seen a surge in accessibility litigation, encouraged by statutory provisions for the recovery of attorneys’ fees. Implementing accessibility standards (such as WCAG 2.1) is relatively low cost compared to potential litigation exposure, making ADA compliance a high-impact, low-effort risk mitigation step (7).
The Trump Administration Factor
While no sweeping federal privacy, security, or AI laws emerged in 2025, executive orders and policy signals suggest a deregulatory posture—emphasizing innovation and reducing compliance burdens. This could slow momentum for a national privacy statute, leaving states to continue to drive enforcement. For example, states have amended numerous laws to address AI, and some states have their own comprehensive AI laws, again creating a patchwork of laws despite federal efforts to cut red tape and foster innovation. For the foreseeable future, the U.S. will remain a jurisdiction of fragmented rules and aggressive state-level oversight, rather than a cohesive federal regime (8).
AI Legislation and Data Protection in 2025
Artificial intelligence is increasingly being regulated in the U.S., but there is currently no federal law governing AI or its impact on personal data. Instead, states have taken varied approaches, again resulting in a patchwork of requirements. In 2025, over half of the states adopted or proposed laws on AI governance, oversight, and transparency. California, Texas, and Colorado are among those setting standards for disclosures and risk management, while compliance deadlines and specific obligations differ widely (9) (10).
This fragmented landscape means organizations must monitor state developments and adapt compliance programs accordingly. With Congress declining to set a national framework (11), state laws will continue to shape AI governance, requiring flexible, multi-jurisdictional strategies.
What This Means for International Organizations
If your organization processes the personal information of U.S. citizens, note that litigation and regulatory investigations are on the rise, so understanding specific notice and processing requirements is critical to avoiding damages and fines:
- Obtain an analysis of which U.S. laws apply to your organization so you can anticipate legal and compliance obligations and help avoid risk;
- Review privacy notices and terms of use for state-specific compliance requirements; compliance with the GDPR does not necessarily equal U.S. compliance, because states have a variety of nuances;
- Audit consent flows and remove dark patterns;
- Update vendor contracts to meet CCPA and other U.S. state and federal requirements and flow down obligations;
- Implement ADA accessibility standards through technical design actions and updated online and internal policies;
- Address U.S. requirements in policies and in online notices and other consents relating to AI use;
- Consider robust cyber insurance and tech E&O insurance as litigation and regulatory fines escalate.
*****
(1) California Privacy Protection Agency, American Honda Motor Co. Enforcement Action, CPPA (2025), https://cppa.ca.gov
(2) California Privacy Protection Agency, Tractor Supply Co. Settlement, CPPA (2025), https://cppa.ca.gov
(3) Federal Trade Commission, FTC Finalizes COPPA Rule Changes, FTC (June 2025), https://ftc.gov
(4) U.S. District Court for the Southern District of New York, The New York Times v. OpenAI, Court Order (2025), https://courtlistener.com
(5) International Association of Privacy Professionals, 2025 U.S. State Privacy Law Overview, IAPP (2025), https://iapp.org
(6) U.S. Courts, Class Action Trends and Arbitration Enforcement, Administrative Office of the U.S. Courts (2025), https://uscourts.gov
(7) U.S. Department of Justice, ADA Website Accessibility Guidance, DOJ (2025), https://ada.gov
(8) White House, Executive Orders on Technology and Privacy, Federal Register (2025), https://federalregister.gov
(9) California Privacy Protection Agency, American Honda Motor Co. Enforcement Action, CPPA (2025), https://cppa.ca.gov
(10) California Privacy Protection Agency, Tractor Supply Co. Settlement, CPPA (2025), https://cppa.ca.gov
(11) Federal Trade Commission, FTC Finalizes COPPA Rule Changes, FTC (June 2025), https://ftc.gov
JENNIFER A. BECKAGE
&
LEE MERREOT