Ireland’s media regulator, Coimisiún na Meán, has launched formal investigations into TikTok and LinkedIn, citing potential violations of the European Union’s Digital Services Act (DSA).
This follows a similar investigation launched against Elon Musk's X last month; together, the three probes are the first DSA enforcement actions brought by the Irish regulator.
According to a Bloomberg report, the investigations into TikTok and LinkedIn concern potential flaws in the platforms' content reporting mechanisms. The probes will focus on whether the systems each platform provides for users to report suspected illegal content are adequately accessible, user-friendly, and anonymous, as the DSA requires.
Providers need to “have reporting mechanisms, that are easy to access and user-friendly, to report content considered to be illegal,” said John Evans, Digital Services Commissioner at the regulator.
The European Commission is the main European Union enforcer against very large online platforms. However, some aspects of the law — including the reporting mechanism — fall under the jurisdiction of the national regulator of the EU country where a platform is headquartered.
Companies found to be in violation of Europe’s digital rules, as determined by Ireland’s media regulator, can be fined as much as 6% of the company’s annual global sales.
This is not the first time Irish regulators have targeted these platforms. In May 2025, Ireland's Data Protection Commission fined TikTok €530 million for violating the EU's General Data Protection Regulation (GDPR), and it had earlier fined LinkedIn about €310 million for GDPR breaches tied to its advertising practices.
The current investigations into TikTok and LinkedIn come weeks after the same regulator opened an investigation into Elon Musk's social media platform X over claims that the company is failing to remove content users report as illegal.
According to Henna Virkkunen, the European Commission's Executive Vice-President for Technological Sovereignty, Security, and Democracy, the DSA not only requires platforms to enforce content moderation but also obliges them to operate effective internal complaint-handling systems, giving users the right to appeal moderation decisions.
“While automated moderation is allowed, online platforms must be transparent about its use and accuracy,” Virkkunen added.
The Coimisiún na Meán investigation seeks to determine whether X's internal complaint-handling system meets the DSA's regulatory standard. It draws on complaints from several sources, including the nonprofit HateAid, which previously took legal action against X on behalf of a researcher who was repeatedly banned from the platform.
The X probe was the first opened under the DSA by Coimisiún na Meán, and violations could result in fines of up to 6% of the company's global annual turnover. It is also not the first time X has attracted scrutiny from official bodies in Europe: earlier this year, the platform was investigated by the EU over whether it breached the bloc's content laws.
Experts believe the scrutiny could inspire broader changes in X’s operational and content moderation policies. Meanwhile, the recent investigations are a clear indicator of how far the EU is willing to go when it comes to holding social media platforms accountable and ensuring that user protection measures are robust and enforceable.