Wed, May 14, 2025
On October 9, 2024, Turkey blocked access to Discord, a popular social messaging service, after the platform allegedly failed to comply with data requests from the Turkish government. Officials indicated that the block followed a court decision that found “sufficient evidence” that crimes of “child sexual abuse and obscenity” were being committed on the platform.
Turkey's ban on Discord underscores the growing need for decision-makers to reassess the accountability of digital platforms. Broadly, platform accountability entails requiring technology companies to take greater responsibility for the unlawful and harmful activities carried out on their websites or applications. However, this construct disproportionately focuses on larger digital platforms, leaving smaller services like Discord under-scrutinised.
The European Union’s Digital Services Act imposes stricter obligations on 'Very Large Online Platforms' with over 45 million users. Discord does not qualify as a very large online platform under the DSA, even though it reportedly has 200 million monthly active users globally.
In India, the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (the IT Rules) also emphasise size, placing additional compliance obligations on entities with five million users or more, deemed “Significant Social Media Intermediaries”.
While these laws differ in scope, they share an underlying assumption: that the potential for harm correlates with platform size. This logic is grounded in past incidents. In 2019, a shooter killed 51 people across two mosques in Christchurch, New Zealand, and livestreamed the attack on Facebook.
Facebook indicated that the footage was viewed only 200 times during its broadcast. However, individuals tried to upload the video 1.5 million times. Similarly, the spread of false rumours about child abductors on WhatsApp resulted in the deaths of 23 people across India in 2017 and 2018.
Does A Larger Platform Increase The Possibility Of Digital Harm?
The assumption that platform size directly correlates with potential harm, while logical in the context of virality, fails to account for equally damaging risks posed by less regulated online spaces. Larger platforms have taken several measures to stem the tide of harmful content.
These include limiting the size of group chats, adding disclaimers to forwarded messages, building databases of known harmful content, and de-platforming users who repeatedly violate content guidelines. These actions have made it harder for bad actors to operate openly, forcing them to migrate to alternative, smaller platforms like Discord, where enforcement is laxer and anonymity is easier to maintain.
With encrypted private chats and search features that only permit the discovery of large, verified groups, a platform like Discord offers bad actors the twin promise of operational security and privacy. It enforces content restrictions more loosely than mainstream social media platforms, and largely relies on users to report problematic content, for which there is little incentive when like-minded users flock there.
Discord became prominent in 2017 for harbouring white supremacist groups. Excoriation by the media followed, after which it tightened its moderating practices.
However, researchers have called its commitment to moderation into question, because it seemingly relies on third-party sites to help users find niche (and often problematic) communities. While Discord only permits users to find verified communities with 10,000 members or more, users can bypass this restriction using a third-party site called Disboard.org, a community noticeboard that enables users to find and engage with smaller communities on Discord. On Disboard, these communities can be found through “tags”, keywords the community creator uses to describe the topics and the kinds of users they are looking to induct. Searchable tags include “minors”, “toxic”, “NSFW (not safe for work)”, “problematic”, and so on. Many of these community channels advertise low moderation (i.e., anything goes) and solicit children aged between 13 and 17 to join.
The dynamic between Discord and Disboard reveals a troubling new trend in online harm: decentralisation. Online harm becomes more decentralised as platforms crack down on bad actors. Sometimes the dispersion manifests in migration to another platform. At other times, we see the emergence of a network of harm, exemplified by Discord and Disboard, where one site or digital tool is cross-leveraged to carry out misdeeds on the other.
How To Tackle The Hydra?
This decentralisation of harm across platforms evokes the myth of the hydra: for every head cut off, two more grow in its place. As larger platforms crack down on bad actors, those actors migrate to smaller, less-regulated spaces, making it difficult to root out harmful behaviour.
As the mythical hydra reminds us, the battle is not won by cutting off a single head — it requires a more comprehensive strategy to prevent others from growing in its place. The solution cannot be limited to blocking content or compelling a platform to take action, only to see the same content emerge elsewhere. Importantly, these networks can be leveraged for good as well as evil. For instance, law enforcement could use Disboard to locate problematic groups on Discord and facilitate their takedown.
We must also create means of inter- and intra-platform communication and cooperation, so that information on bad actors can be shared across entities and they can be blocked across the board. This approach has its challenges: alternative social media often crops up in response to political differences, and it is generally hard to get competing companies to cooperate meaningfully. Without such cooperation, however, efforts to regulate online spaces may remain a game of whack-a-mole.
(The author is Director of the Esya Centre, a think-tank focused on emerging technologies and a member of the AI Knowledge Consortium. Views are personal)