The online-safety regulator Coimisiún na Meán (CnaM) has ordered Facebook’s parent company Meta to take measures to protect its services from being used for the dissemination of terrorist content.
The move follows the watchdog’s finding that the social-media platform was “exposed to terrorist content” under the Terrorist Content Online Regulation.
The regulation is an EU-wide mechanism aimed at combating the dissemination of terrorist content online and enabling the speedy removal of such content by service providers.
The regulator had made similar findings against TikTok, X and Meta (in respect of Instagram) last month.
A service provider that has received two or more final removal orders from competent EU authorities within the previous 12 months can be found to be exposed to terrorist content.
CnaM said that it had reached its decision after being notified of two or more final removal orders in respect of Facebook, and following engagement with the company.
As well as taking measures to protect its services from such content, Meta will have to report to CnaM within three months on the measures it has taken.
“Among the measures a hosting service provider exposed to terrorist content is required to take is the inclusion in its terms and conditions of provisions to address the misuse of its service for the dissemination to the public of terrorist content,” the regulator stated.
Providers can face financial penalties if they continue to infringe the regulation.
Terrorist content is defined in EU law as material that:
• incites the commission of a terrorist offence, for example by glorifying terrorist acts;
• solicits a person or a group of persons to commit or contribute to terrorist offences;
• solicits participation in the activities of a terrorist group;
• provides instruction on the making or use of explosives, firearms or other weapons, or noxious or hazardous substances, for the purpose of committing terrorist offences; or
• constitutes a threat to commit a terrorist offence.
Separately, CnaM has said that it will be defending its Online Safety Code against judicial-review proceedings lodged by Twitter International Unlimited Company (X).