Online-safety regulator Coimisiún na Meán (CnaM) has begun two investigations into the social-media platforms TikTok and LinkedIn.
The probes will assess whether the platforms have contravened articles 16(1), 16(2)(c), and 25 of the EU’s Digital Services Act (DSA).
Article 16 concerns the ‘notice and action’ mechanisms that providers are required to have in place to allow people to report content that they suspect to be illegal.
The investigations follow a CnaM review that raised concerns about potential ‘dark patterns’, or deceptive designs, in the platforms’ illegal-content-reporting mechanisms.
The watchdog was concerned that the mechanisms could confuse or deceive users into believing they were reporting content as illegal when they were in fact reporting it as a breach of the provider’s terms and conditions.
The investigations will examine whether the platforms’ reporting mechanisms meet these obligations.
If a provider is found in violation of the DSA, CnaM can impose a fine of up to 6% of its annual worldwide turnover.
CnaM’s Digital Services Commissioner John Evans said that other providers had made “significant changes” to their illegal-content-reporting mechanisms after engaging with the regulator.
He added that the regulator was assessing these changes for their effectiveness.