DPOs in 2026: new expectations and emerging priorities
In 2026, Data Protection Officers (DPOs) face a more demanding regulatory environment than ever before. This follows a sharp increase in fines for GDPR non-compliance in 2024, signalling a tougher enforcement stance from regulators. This trend shows no sign of slowing down, particularly with the entry into force of the AI Act and new requirements relating to the protection of minors. As a result, the responsibilities placed on DPOs continue to grow.
But how can DPOs anticipate these new obligations and prevent compliance from becoming a barrier to innovation? How can organisations ensure that personal data management is not merely a legal requirement, but a driver of trust and differentiation? And more importantly, what strategies should they adopt to meet these expectations while optimising the user experience?
This article explores these questions and offers practical insights.
The CNIL’s priorities for a safer digital environment
The French Data Protection Authority (CNIL) published its strategic plan in January 2025, setting out its priorities through to 2028 to support technological change and strengthen data protection. Cybersecurity and mobile applications remain areas of close scrutiny, but regulating artificial intelligence and protecting minors have emerged as new key priorities.
Artificial intelligence at the heart of regulatory challenges
In response to the risks posed by large-scale data collection, algorithmic bias and deepfakes, the CNIL plans to strengthen its capacity to audit and oversee generative AI models. In particular, it intends to clarify the applicable legal framework and work closely with industry stakeholders to promote technologies that respect fundamental rights. This approach aligns with the phased implementation of the EU AI Act, which aims to regulate AI use while fostering innovation.
The rapid growth of generative AI relies heavily on vast volumes of data collected from the public, raising questions about transparency in data processing and people’s ability to retain control over how their personal information is used.
For DPOs, this means assessing the risks associated with AI systems, including ensuring that the personal data used to train these models is processed in compliance with data protection rules. They must also ensure that algorithmic processes are transparent, that source data is traceable, and that appropriate safeguards are in place to allow users to exercise their rights.
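One way to make source-data traceability concrete is to document, for each dataset used in training, where it came from, on what legal basis it is processed, and how long it may be kept. The sketch below is a hypothetical illustration of such a provenance record (the class name, fields and retention logic are assumptions, not a prescribed standard):

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Hypothetical provenance record for a dataset used to train an AI model.
@dataclass
class TrainingDataRecord:
    source: str                   # where the data was collected
    legal_basis: str              # e.g. "consent", "legitimate interest"
    contains_personal_data: bool
    collected_at: datetime
    retention_days: int           # maximum retention period, as documented

    def is_expired(self, now: Optional[datetime] = None) -> bool:
        """True when the data has exceeded its documented retention period."""
        now = now or datetime.now(timezone.utc)
        return (now - self.collected_at).days > self.retention_days

record = TrainingDataRecord(
    source="public-forum-crawl",
    legal_basis="legitimate interest",
    contains_personal_data=True,
    collected_at=datetime(2024, 1, 1, tzinfo=timezone.utc),
    retention_days=365,
)
```

Keeping such records alongside each training corpus gives the DPO a single place to check legal basis and retention during an audit.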
Strengthening the protection of minors in the digital world
The overexposure of children and teenagers to screens and digital platforms increases the risks related to privacy, cyberbullying and advertising profiling. The CNIL plans to ramp up its awareness-raising initiatives aimed at parents, teachers and digital service providers.
Minors are often less aware of data protection issues and consequently share personal information without understanding the consequences. This data is exploited by digital platforms to fuel advertising targeting strategies and recommendation algorithms that influence their online behaviour.
In response to these practices, consent and transparency requirements are expected to be strengthened.
Here, DPOs play a key intermediary role between regulatory obligations and operational realities. They help adapt user journeys so they are understandable for younger audiences, while ensuring that age and consent verification mechanisms are effective and proportionate.
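In practice, an effective and proportionate consent check often starts with a simple age gate. The minimal sketch below (a hypothetical helper, not a real library) assumes France's digital age of consent of 15 under Article 8 of the GDPR; other EU member states set this threshold between 13 and 16:

```python
# France's digital age of consent under GDPR Article 8 (assumption for this
# sketch; adjust per member state, where it ranges from 13 to 16).
DIGITAL_CONSENT_AGE = 15

def consent_requirement(age: int) -> str:
    """Return who must consent to data processing for a user of this age."""
    if age < DIGITAL_CONSENT_AGE:
        return "parental_consent_required"
    return "user_consent_sufficient"
```

A real implementation would also need a proportionate way of verifying the declared age, which is precisely the kind of mechanism the DPO is expected to assess.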
Internet users driving new expectations
Internet users are becoming increasingly aware of the issues surrounding the use of their personal data and want greater control over the information they share online.
The rise of a more informed consumer
Organisations such as UFC-Que Choisir are providing dedicated resources to support this growing awareness. The Je Ne Suis Pas Une Data platform (French for “I Am Not A Data”), launched in 2023, informs consumers about their rights and provides tools to help them manage their personal data more effectively.
Through a range of features, users can exercise their rights to rectify, delete or object to the processing of their information on platforms such as Facebook, Uber, LinkedIn or Instagram, thereby giving them greater control over their digital identity.
This desire for more control over personal data is also reflected in trends observed by businesses. According to Cisco’s 2024 Data Privacy Benchmark Study, 37% of consumers regard receiving clear information about how their data is used as a priority, while 24% do not want their data to be resold for marketing purposes.
DPOs therefore play a crucial role in adapting transparency policies by ensuring that information provided to users is clear, easy to access and can be used effectively to exercise their rights.
“Dark patterns” under increasing scrutiny
While consumers are calling for greater transparency in how their data is used, many organisations continue to rely on “dark patterns” – interface designs or practices intended to mislead users and influence their choices.
A study conducted in 2024 by the Global Privacy Enforcement Network found that 97% of the websites and applications analysed used at least one of these practices. Of these, 70% made privacy settings difficult to access, whilst 46% imposed particularly cumbersome processes for refusing advertising tracking. These techniques are designed to discourage users from exercising their rights, for example by hiding opt-out options, making unsubscribing difficult or providing deliberately unclear privacy policies.
In response, data protection authorities – including the CNIL in France – are stepping up their enforcement efforts to ensure greater transparency. The challenge is twofold: restoring consumer trust and ensuring compliance with personal data protection regulations.
This is why DPOs need to work closely with marketing, product and UX teams to embed GDPR requirements into interface design from the outset. They are responsible in particular for eliminating manipulative practices and promoting user journeys that respect user rights.
User consent: from compliance to user experience
User consent can no longer be treated as a mere legal formality. At a time when consumers expect greater transparency and control over their data, organisations must adopt a more seamless and engaging approach. This is precisely Fair&Smart’s mission: a platform that transforms consent and preference management into a driver of trust and engagement.
With its intuitive interface and advanced features, Fair&Smart enables organisations to centralise consent collection while giving users full control over their preferences. Users can view granted permissions, change their choices with a single click, and withdraw consent at any time, using a form or a personalised dashboard. The platform also ensures full traceability, supporting regulatory compliance and simplifying audits.
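The traceability requirement comes down to one design principle: consent changes are appended to a history, never overwritten, so the state at any past date can be reconstructed for an audit. The sketch below is a generic illustration of that principle (not Fair&Smart's actual data model; all names are assumptions):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional, Tuple

# Illustrative append-only consent record: every opt-in or withdrawal is
# added to the history with a timestamp, preserving a full audit trail.
@dataclass
class ConsentRecord:
    user_id: str
    purpose: str                                      # e.g. "marketing_emails"
    history: List[Tuple[datetime, bool]] = field(default_factory=list)

    def set(self, granted: bool, at: Optional[datetime] = None) -> None:
        """Append a new consent event; earlier events are never modified."""
        self.history.append((at or datetime.now(timezone.utc), granted))

    @property
    def granted(self) -> bool:
        """Current status; consent defaults to refused if never given."""
        return self.history[-1][1] if self.history else False

consent = ConsentRecord(user_id="u-123", purpose="marketing_emails")
consent.set(True)    # user opts in
consent.set(False)   # user later withdraws consent
```

Because withdrawal is just another appended event, exercising the right to withdraw is as simple as granting consent was, which is exactly what the GDPR requires.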
For DPOs, the challenge goes far beyond compliance with legal obligations. By integrating Fair&Smart, they can deliver a transparent, reassuring user experience that reduces friction and strengthens loyalty. This turns personal data management into a competitive advantage. Rather than a constraint, consent becomes a cornerstone of the customer relationship, built on trust and respect for privacy.
Managing consent transparently with Fair&Smart
Faced with increasingly demanding users, constantly evolving regulations and the need to simplify processes, the role of DPOs now extends well beyond compliance. They have become key guardians of digital trust.
Achieving this requires implementing the right tools, delivering a seamless experience and giving users full control – all critical factors in building trust.
Fair&Smart supports DPOs in this mission by providing practical solutions to centralise, trace and manage consent effectively.