TERREG: “Exceptions are never the best solution”
A conversation with Anna Mazgal, EU Policy Officer at Wikimedia Deutschland, about the EU’s controversial “Regulation on Preventing the Dissemination of Terrorist Content Online” (TERREG).
What do you think is problematic about TERREG?
ANNA MAZGAL: There is a whole series of problems attached to TERREG. For one, as a regulation it is directly binding on the member states; there is hardly any leeway in its implementation. But even more crucial is the question: what actually constitutes terrorist content? The definition is difficult: someone who is considered a terrorist by some may be a hero to others. So what is TERREG aimed at? At the national level it is, for example, about the posts and online communications of far-right groups. But it also covers terrorism outside the EU, outside our jurisdiction – especially content that can be classified as Islamist propaganda. With such a broad law, how are we supposed to reliably distinguish between political speech and propaganda?
Still, the regulation seems to have good intentions.
No one wants to see images of torture or beheadings. Often the only available tool is to remove this content, since the creators cannot be prosecuted in the EU. So far, so well-intentioned. But the regulation has a racist bias and is likely to hit precisely those groups that are already vulnerable, the minorities of a society: for example, people who communicate in Arabic or who post content about regimes whose oppression they have escaped. Even the mere collection of documentary material about terrorist crimes is targeted. The same content could come from a terrorist organization soliciting support – or be a newsworthy post informing the public about that organization. Because context is not taken into account, TERREG is a blow to freedom of expression and freedom of the press.
What fundamental debates about TERREG should have been held?
We should ask ourselves a fundamental question: do we want a debate in society as a whole that addresses the postcolonial aftermath and the fault lines that give rise to terrorism worldwide in the first place? TERREG does not. It is merely about cleansing the internet of certain content so that the problem disappears from view. Surveys by the EU’s statistical office Eurostat have shown that only 6 percent of internet users are ever confronted with terrorist content. And even that figure remains vague: what does someone perceive as “terrorist”? Is it enough to see someone with a shaved head wearing combat boots? The basis of the entire regulation is fragile.
How is the regulation to be implemented in the member states?
At the administrative level, which is again problematic, because that is where political interests prevail. In Germany, the Federal Criminal Police Office (BKA) will most likely be responsible. The procedure would be: someone at the BKA identifies a piece of allegedly terrorist content and then, under TERREG, orders the platform operator, Facebook for example, to delete it at the URL in question – and to do so within one hour. The same applies to Wikipedia; no distinction is made between commercial and non-commercial platforms.
What does this mean in practice?
For a project that is self-managed by a volunteer community, a one-hour deadline is of course not feasible. But what is far more serious in our eyes: there is no transparency and no judicial authority that rules on the legality of a deletion unless a lawsuit is filed against it. Perfectly legal content can end up being removed. What’s more, these deletion decisions apply EU-wide, across borders. Videos that are unjustifiably removed in another country are then no longer available to users in Germany either. It also means that Viktor Orbán, for example, could take action against unwelcome content about him that is hosted in Spain by having it declared a terrorist threat.
Was Wikimedia able to influence the design of TERREG?
The original proposal included a requirement that platforms take so-called proactive measures against terrorist content, which de facto means the use of filtering technology. We know that these filters have no power of discernment: an algorithm cannot tell with what intention a piece of content was posted on the internet. In the current version, upload filters are no longer mandatory but optional. That, of course, does not mean they will not be used. Another point is that TERREG is aimed not only at deleting violent content but also at terrorist propaganda and the “spread” of terrorism in general.
What does that mean in concrete terms?
That is precisely the question. One example: in the Arab world, poetry is highly valued, and terrorists use it too. A verse from the Koran is quoted on the IS flag; wherever else that verse appears on the internet, TERREG could apply. Together with NGOs such as the German and French chapters of Reporters Without Borders, we lobbied for at least meaningful, invocable exceptions to be enshrined for contexts such as journalism. But exceptions are never the best solution. It would be better to have a good law.
What is the current status regarding TERREG?
TERREG was adopted in spring 2021 and entered into force shortly afterwards, in June 2021, which means that the member states now have the legal basis to organize its implementation. It will apply from June 7, 2022; from that day on, the first deletions can be initiated.