
Pic: Eamonn Farrell/RollingNews.ie

05 Mar 2021 / technology Print

Use food-label model on web speech – technologist

EU fundamental rights must be the basis for any regulation of social media, a Bar of Ireland webinar entitled Understanding Disinformation: The Legal and Policy Response has been told.

The event this week heard from Pierre François Docquir of the media-freedom organisation ARTICLE 19, who said that increasing numbers of people now take their news primarily from social media.

He said these people were not necessarily able to tell whether the news they saw had been produced by an organisation working to established ethics of journalism.

Docquir said there was a role for social media platforms to contribute to media plurality and diversity.

The EU Charter of Fundamental Rights specifically mentions pluralism, he pointed out, and a diversity of viewpoints must be sufficiently visible on social media in order to have an informed citizenry.

Any supranational regulatory frameworks must be based on EU fundamental rights, Docquir said, because otherwise they could be used to catastrophic effect in less democratic regimes.

He added that tech platforms would not want to implement a different regulatory regime for each territory in which they operated, he told Tuesday’s event (2 March).

More facts

The problem of disinformation can’t be solved by ‘more facts’, given the existing information overload, technologist Mark Little warned those attending the webinar.

“Disinformation has an emotional power that punches through all that noise,” he said.

Trustworthy news is not the only response, since human fact-checking cannot scale to meet the challenge of disinformation.

Neither will automated responses work for disinformation, which is a complex, nuanced, linguistic, cultural phenomenon that is mutating on a moment-by-moment basis to evade detection, Mark Little said.

Automated responses to disinformation will have bias built into their datasets, and that is extremely dangerous, he said.

He added that, while some big-tech platforms took disinformation seriously, some simply saw it as a threat to their business models.

Little drew an analogy with food labelling: labels on content could clearly indicate what is contained in the information we consume every day.

Because of linguistic variations, power must also be devolved to the individual languages and markets, he suggested.

Nazi undertones

The term ‘fake news’ has Nazi undertones in Germany, for instance, Mark Little argued.

Bot armies

“What kind of networks are creating ‘bot armies’? What kind of networks are manipulating paid advertising?” he asked. 

Data exposure should show how pieces of content are being recommended, and what level of engagement there is with them.

Accountability for content-recommendation systems is also required when it comes to speech, he said, offering a comparison with household goods, which must pass a series of safety checks.

“We still don’t have common standards for the safety of the content recommendation systems that are so vital to discourse in our societies,” he said, adding that those tech firms that won’t co-operate should be exposed.

Solutions will not come from national governments, he said, but from a focus on transparency in the way in which content is promoted.

Gazette Desk
Gazette.ie is the daily legal news site of the Law Society of Ireland