Mainstream social media platforms could face limits on their ability to take down independent journalism that violates their terms and conditions under a proposal agreed by European Union lawmakers yesterday.
In a vote Tuesday, the European parliament set its negotiating position for upcoming talks with the Council on the bloc’s draft Media Freedom Act — taking aim at what MEPs called “arbitrary decisions by big platforms”.
The text adopted by MEPs expands on the European Commission’s original proposal by setting out a requirement for larger platforms (so-called very large online platforms, or VLOPs, with more than 45 million monthly active users in the region) to give media service providers a heads-up about a planned takedown of their content, allowing them 24 hours to respond to the objections before any restriction or suspension is imposed.
The original Commission text merely urges these platforms to consider freedom and pluralism of media, act diligently and be transparent when they exercise editorial responsibility — i.e. by taking down journalism they deem incompatible with their terms and conditions — and then, after the fact, provide an explanation of their actions to media service providers “as early as possible”.
“To ensure that content moderation decisions by very large online platforms do not negatively affect media freedom, MEPs call for the creation of a mechanism to manage content takedown orders,” the parliament wrote in a press release. “According to MEPs, platforms should first process declarations to distinguish independent media from non-independent sources. Media should then be notified of the platform’s intention to delete or restrict their content alongside a 24-hour window for the media to respond. If after this period the platform still considers the media content fails to comply with its terms and conditions, they can proceed with deleting, restricting or referring the case to national regulators to take the final decision without delay. However, if the media provider considers that the platform’s decision does not have sufficient grounds and undermines media freedom, they have right to bring the case to an out-of-court dispute settlement body.”
In the upcoming trilogue talks, which also involve the Commission, the bloc’s co-legislators (the parliament and the Council) will need to negotiate a compromise on a final text, so the shape of the law is not set in stone yet. And it remains to be seen whether the parliamentarians’ push for the Act to go further in safeguarding media from arbitrary decisions by larger platforms stands or falls.
The parliament vote in favor of the amended file was a fairly substantial one: 448 votes in favor vs 102 against, with 75 abstentions.
The Commission proposed the Media Freedom Act back in September 2022. The bloc’s lawmakers argue legislation is needed to protect media pluralism and independence in the modern era, in light of a variety of growing pressures on the sector — including those linked to the digital transformation of the media industry.
Since then it’s fair to say we’ve seen a rise in highly visible arbitrary decisions in the wake of Elon Musk’s takeover of Twitter (now X). Last year, the billionaire owner of the social media platform banned a number of journalists who had written about him — as it turned out, because he was unhappy they had reported on an account that tweeted the location of his private jet. That action earned him a swift rebuke from the EU, which dubbed the arbitrary suspensions “worrying” — pointing back to the Media Freedom Act as being intended to reinforce the bloc’s protections for media and fundamental rights in such scenarios.
The public rebuke didn’t stop Musk. He has continued to target traditional media during his erratic turn in charge of X, announcing a plan to stop displaying headlines on news articles this summer, for example (most likely with his eye on trying to evade making copyright payments to news publishers for displaying snippets of their content); and throttling the load time of links on the platform to New York Times and Reuters articles, as well as to competing social networks.
Prior to Musk, legacy Twitter also had some run-ins with the media of its own, of course. Such as its controversial decision three years ago to block the sharing of links or images related to a New York Post article about emails purportedly found on a laptop belonging to Hunter Biden — a decision which led it to amend its hacked materials policy. Facebook also restricted sharing of the Hunter Biden laptop story at a time when concerns about disinformation targeting the US elections were riding high.
But Musk’s actions at the helm of Twitter/X vis-a-vis journalists and media firms have seemed far more arbitrary and/or driven by a personal dislike of traditional media. That dislike, combined with apparently limitless resources to spend on taking arbitrary actions regardless of whether they harm user trust and advertiser confidence, doesn’t bode well for access to independent journalism on X. So the bloc’s legislative move looks timely. Albeit, whether the planned law will prove effective at reining in Musk is another matter.
X under Musk is charting a reckless collision course with the EU over the Digital Services Act (DSA), the confirmed pan-EU law which designates the aforementioned VLOPs — regulating how these larger platforms (including X) should respond to reports of illegal content and other issues, as well as obligating them to assess and mitigate systemic risks like disinformation.
Musk’s response to this existing pan-EU law — which carries penalties of up to 6% of global annual turnover for breaches, and even the risk of a service being blocked in the region — has so far amounted to thumbing his nose at regulators. Examples include slashing headcount in key areas including content moderation, trust & safety and election integrity; ending policy enforcement on COVID-19 disinformation; removing certain mainstream disinformation reporting tools for users; and pulling the platform out of the bloc’s Disinformation Code (which is linked to DSA compliance).
Musk is also fond of posting and amplifying disinformation and conspiracy theories himself. He has also encouraged hateful follower pile-ons of people he takes a dislike to, including the former Twitter head of trust & safety, Yoel Roth. (Or, more recently, a California man who is suing him for defamation — accusing Musk of spreading false claims about him.)
So whether an adjunct to the existing EU content moderation law can convince Musk to bend to the bloc’s rulebook looks questionable. Although reining in Big Tech’s most erratic and deep-pocketed chief is likely to be a regulatory marathon (grit, stamina, strategy, etc.), not a sprint.
Major platforms, meanwhile, generally remain opposed to the parliament’s proposal to give media firms notice of takedowns of content that violates their T&Cs. But then tech platforms aren’t renowned for backing checks on platform power.
Following yesterday’s vote by MEPs to affirm their negotiating mandate on the Media Freedom Act, Big Tech lobby organization, the Computer & Communications Industry Association (CCIA), hit out at the “media exemption” — framing it as “controversial” and claiming the provision risks enabling rogue actors to spread disinformation. “This is a major setback in the fight against disinformation,” claimed CCIA Europe’s senior policy manager, Mathilde Adjutor, in a statement. “The media exemption will empower rogue actors, creating new loopholes to spread fake news rather than fixing anything. We can only hope this disinformation loophole will be closed during the trilogue negotiations between the EU institutions.”