New rules are necessary to regulate – among other things – algorithms targeting kids and young adults, but adoption of those rules will require real political determination. Philip Andrews SC and Aleksandra Pruska press pause
The Irish Times recently reported that Meta is lobbying the Government to use its forthcoming EU Council presidency to kill off a proposed new EU Digital Fairness Act (DFA).
Research published in February 2026 by Corporate Europe Observatory, a Brussels NGO monitoring corporate lobbying, similarly documents an EU-wide campaign by big tech to weaken the DFA before it has even been drafted.
EU officials have held at least 96 meetings with representatives on the DFA since December 2024 – 83% of those meetings were with industry or lobbying groups, compared with under 14% with NGOs supporting the legislation.
The digital sector now reportedly spends over €150 million annually on lobbying in Brussels, deploying some 890 lobbyists.
Digital fairness
One of the most consequential reforms to EU consumer protection in more than a decade, the Digital Fairness Act is intended to tackle unfair and manipulative practices in digital consumer markets: dark patterns that steer consumers toward choices they would not otherwise make; addictive design features such as infinite scroll and autoplay, particularly those targeted at young adults; opaque personalisation and unfair pricing; problematic influencer marketing; and online contracts, including predatory subscriptions.
The European Commission completed a public consultation on the adequacy of existing consumer protection law in regulating these practices in October 2025 and expects to publish a first draft of the DFA in Q4 2026.
The results of the commission’s consultation were striking: around 70% of respondents supported new binding rules on dark patterns, addictive design, personalisation, and video-game features alike.
Opposition was concentrated overwhelmingly among large companies and business associations, the very constituency now lobbying Ireland.
‘Unnecessary duplication’
The industry position (that existing laws are adequate and only enforcement needs improving) deserves a fair hearing.
Concerns about regulatory overlap with existing online regulation – such as the Digital Services Act (DSA), Digital Markets Act (DMA), and the GDPR – are not confined to big tech.
Some SMEs and consumer groups raised them in the consultation too, and these are legitimate questions for the legislative process.
But let’s not confuse regulatory volume with regulatory adequacy. The existing framework has structural weaknesses that can properly be remedied only by new legislation.
For instance, existing standards of unfairness in the EU’s consumer-protection rulebook are built around the ‘average consumer’ as a rational, informed decision-maker. But this concept fits poorly with digital environments where consumers face algorithmic control and behavioural exploitation at every step.
A European Commission ‘fitness check’ in 2024 documented this failure in detail.
Tellingly, 65% of consultation respondents supported amending the average consumer standard to better reflect actual digital behaviour, and 58% backed reversing the burden of proof in cases where evidencing a trader’s wrongdoing is disproportionately difficult.
The argument that the DSA already covers dark patterns also requires scrutiny.
Article 25 of the DSA explicitly carves out practices ‘covered by’ the Unfair Commercial Practices Directive and GDPR, creating fragmentation rather than resolving it.
A German appellate court ruling earlier this year, in which a consumer organisation could not rely on the DSA against alleged dark patterns on a ticketing website, illustrates this gap in practice. The case is now before the Federal Court of Justice on appeal.
On 6 February 2026, the European Commission issued preliminary findings that TikTok’s “addictive design” breached the DSA – in what some say is one of the most significant findings in digital markets in recent years.
If confirmed, the decision could shatter TikTok’s value to content creators and advertisers.
Does this jump the gun on the Digital Fairness Act? Possibly. But legal commentators are already suggesting that the DSA – which mentions addictive design only in its recitals – does not provide a legal basis for the commission’s decision.
Problematic practices
Many digital interfaces essentially use ‘tricks’ to sway users in a particular direction. For example, they might highlight certain options, make it harder to pick less popular choices, or keep popping up with nagging prompts until users relent.
Other tactics include confirm-shaming (using emotive language to pressure users), pre-ticked consent boxes, hidden opt-out buttons, adding items to a basket without obtaining clear consent, and making cancellation or unsubscribe processes deliberately difficult – the classic ‘roach motel’, where it is easy to get in, but hard to get out.
Addictive and manipulative design
Many digital products, especially video games, use unregulated ways to keep consumers engaged. These include reward badges, penalties for disengagement, countdown timers, gamification features, infinite scroll, autoplay, and in-game currencies. Such design choices, often targeting young adults, are intended to maximise time spent on the platform.
Unfair personalisation and pricing
Behavioural data enables online traders to tailor content and pricing, occasionally in ways that exploit vulnerabilities. This can involve targeting users based on emotional or financial stress or using unfair pricing tactics, such as vague ‘reference’ prices, drip pricing (revealing mandatory costs only at the end), and dynamic pricing, where algorithms raise prices in real time after low initial offers.
Social-media commerce
Influencer content and social-media advertising can potentially mislead consumers when commercial intent is not clearly disclosed. Not only is hidden advertising a concern, but so are the promotion of unsafe and unhealthy products or unrealistic beauty standards, particularly for younger users.
Digital contracts
Online consumers often face obstacles when managing digital contracts. Problematic practices include forced acceptance of unfair terms, automatic renewals, free trials that are then converted to paid subscriptions without clear consent, and unnecessarily complex cancellation procedures.
The courtroom test
Those who doubt the seriousness of digital consumer harm need only look to a Los Angeles courtroom where, in recent class-action litigation, Meta faces claims that its Instagram platform was deliberately designed to addict young people.
Mark Zuckerberg took the stand before a jury and, confronted with internal documents on his firm’s practices, reportedly struggled to defend claims that it targeted young users.
Whatever the jury decides, the evidence aired in Los Angeles makes a powerful empirical case that voluntary compliance and existing frameworks are not sufficient.
Ireland’s moment
Ireland’s EU presidency from 1 July to 31 December 2026 is a decisive moment.
The EU is committed to robust consumer protection that keeps pace with today’s complex, data-driven economy. And the Government itself has committed to making digital online safety a key theme.
Against this, US pushback against EU regulatory enforcement targeting big tech is increasing.
As the EU home jurisdiction for firms such as Meta, Google, Microsoft, and X, Ireland has an outsized role in EU enforcement.
Let’s not forget that the Trump administration previously threatened individualised tariffs against EU member states in response to regulatory action targeted at US tech firms.
And, late last year, the US Department of State imposed a visa travel ban on Thierry Breton, the former European commissioner and main sponsor of the DSA, along with other EU consumer-rights advocates. Irish regulators could, in other words, face personal sanctions.
Ireland’s Data Protection Commission (DPC), which has imposed fines of more than €4 billion on US tech firms since the GDPR took effect in 2018, reportedly has five active large-scale investigations into Elon Musk’s X and at least six active High Court cases involving US tech firms on its docket.
Similarly, Coimisiún na Meán, Ireland’s new media and digital services regulator, is facing multiple lawsuits involving US tech platforms.
At the same time, Ireland’s tax-revenue reliance on a very small number of US multinationals is significantly increasing year-on-year.
In 2024, the Irish Fiscal Advisory Council estimated that the top three highest-paying corporate groups (reportedly Apple, Microsoft, and US pharma group Eli Lilly) accounted for 46% of all corporation-tax revenues, roughly €13 billion, with the top ten payers accounting for almost 60% of Ireland’s annual corporate tax.
Corporation tax almost doubled between 2021 and 2024, largely driven by increased payments from the top three payers, and is now the largest single contributor to Irish public spending. He who pays the piper calls the tune, you might say.
Ireland’s hand is doubtless a difficult one to play.
But engaging seriously with the legitimate simplification concerns raised across the consultation, pressing for coherent interaction with the DSA and DMA, and ensuring that the fairness-by-design duty is calibrated proportionately could serve both Ireland’s interests and the interests of Irish and European consumers.
Philip Andrews SC is founder of Andrews Law. He previously served as legal adviser to the Competition Authority, led the EU and Competition Group at McCann FitzGerald, and is co-author of Modern Irish Competition Law. Aleksandra Pruska is an associate at Andrews Law, practising in Ireland and Poland, specialising in Irish and EU competition law and consumer protection.