

‘Constitution of the internet’ comes into force

12 Feb 2024 / data law


A Fieldfisher webinar (8 February) heard lawyer Sinéad Taaffe describe the new Digital Services Bill 2023 as “the constitution of the internet”, a measure of its transformative nature.

It ends the era of self-regulation by internet providers, Taaffe pointed out, and makes illegal online what is illegal offline.

The bill, which provides for the implementation in Ireland of the EU’s Digital Services Act (DSA), passed its final stages in the Oireachtas last week.

The DSA introduces obligations spanning content moderation, recommender systems, online advertising, and ‘dark patterns’, with a key focus on the better protection of minors, she said.

Once it comes into force on 17 February, it will update liability exemptions under the eCommerce Directive, she added, with a sliding scale of obligations, depending on the service-provider category.

Obligations

Fieldfisher’s Damien Watson said that internet service providers (ISPs) should spend time analysing their new obligations, and the extent to which their offerings were in scope.

Business services should be broken down by category, he suggested.

“If you don't, it can end up costing you a lot of money later on down the line,” he warned.

ISPs would be protected from liability, as long as they had not played an active role in the transmission of illegal content, he added.

Authorities could now order the removal of illegal content, Sinéad Taaffe explained, and would be entitled to require specific information in relation to recipients of the service.

Following receipt of an order, the provider will be obliged to comply, and to notify the relevant authorities of any effect given to the order, without undue delay.

‘Contact person’

An obligation to designate points of contact also applies, enabling both authorities and recipients of the service to communicate directly with the provider by electronic means, in a user-friendly manner.

Where the provider is not established in the EU, it will have to identify a natural or legal person to act as its legal representative.

In accordance with article 13(3) of the DSA, the legal representative can be held liable for non-compliance with the provider’s obligations, without prejudice to any legal action or liability faced by the provider itself.

Content-moderation measures must also be reported on annually.

An additional obligation on providers is to put in place mechanisms that allow users to notify them of illegal content on their service, in enough detail that the illegality can be identified without a detailed legal examination.

A further provision of the DSA requires that, where restrictions are imposed in relation to content, the provider give a clear and specific statement of the reasons for the decision. Where the provider becomes aware of suspicions that a criminal offence may have taken place, it must notify the relevant authorities.

Service recipients are entitled to select an out-of-court dispute-settlement body to resolve disputes relating to such a decision.

The platform is obliged to post information about how to access those dispute-settlement bodies.

‘Trusted flaggers’

The DSA also introduces “trusted flaggers”, whose notices must be acted upon swiftly and processed without undue delay.

‘Dark patterns’ must be policed by providers to ensure their platforms are not operated in a way that deceives or manipulates recipients, or materially distorts or impairs their ability to make free and informed decisions.

Ads must be clearly identified as paid-for content to ensure that the recipient can distinguish the basis on which information is being presented.

The DSA also requires that the main parameters of algorithm-driven ‘recommender systems’ be set out in a site’s terms and conditions, with information on any options for recipients to influence or alter those parameters.

Where the platform is accessible to minors, the provider must put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security.

Profiling

The DSA also bans targeted ads aimed at minors based on profiling using their personal data, as well as ads based on profiling using special categories of personal data.

Watson said that the newly established “crisis-response mechanism” was triggered by Ireland for the first time in November 2023, after the Dublin city-centre riots.

The European Commission contacted social-media giants such as Meta, YouTube, and X directly to warn them of their obligations under the DSA.

DSA fines are at a similar level to those under the GDPR, he added: up to a maximum of 6% of worldwide turnover, or more where there is a lack of co-operation with interim measures.

“These penalties are significant and show the importance of compliance,” he said.

John Brunning of Fieldfisher UK said that, as well as social-media platforms, all sites that allowed interaction with user-generated content would be in scope for the new laws – including dating websites, online marketplaces and forums, and gambling firms.

Proportionate measures

Businesses should ask themselves if they were providing a regulated service, he said, and take proportionate measures to mitigate and prevent access to harmful or illegal content.

The big financial penalties would encourage compliance, he added, but firms should also be cognisant of reputational damage to their brand.

Businesses looking for investment or acquisition would also face due diligence on regulatory compliance, he added.

Tiernan Kenny, Coimisiún na Meán's Director of Communications and Public Affairs, said that putting in place an online-safety framework in Ireland had meant staffing up from 40 to 90, with plans to grow to 160 by the middle of next year.

In addition to the DSA, the body has duties under the EU regulation on terrorist content online, as well as responsibilities under domestic legislation.

“A large part of the DSA is public-facing consumer rights, so we want to make sure that people are aware of them,” he said.

A contact centre will advise people of their rights and act as an information source, signalling where supervision or investigation activity should focus.

CnaM also has a legal division, and a data-and-technology division.

Inherently ‘cross-border’

The DSA was inherently ‘cross-border’, he added, with some competences reserved exclusively to the European Commission, alongside cross-border interaction and support between the national regulators.

Government mapping had estimated that some 400 service providers in Ireland were within the scope of the DSA, he said, adding that they would be made aware of their obligations.

CnaM was looking for an open and manageable channel of communication, he said.

“We're very keen to learn more about how these services operate; what they know about their users,” he said.

CnaM wanted to establish best practice in a very conscious way, he said: not setting the bar so low that it didn’t make a positive impact for people, nor so high that nobody could reach it.

“We're really hoping that we can have as much open dialogue with the full range of services or service providers as we can,” he concluded.

Gazette Desk
Gazette.ie is the daily legal news site of the Law Society of Ireland