This article was written by Michael Karanicolas, a fellow at Yale Law School where he leads the Wikimedia Initiative on Intermediaries and Information as part of the Information Society Project. It is the first in a series of articles published by the initiative to capture perspectives on the global impacts of platforms' content moderation decisions. You can find the author at @M_Karanicolas on Twitter, and you can read all the articles in the blog series here.
Every minute, more than 500 hours of video are posted on YouTube, 350,000 tweets are sent, and 510,000 comments are posted on Facebook. Moderating and managing this volume of content is a monumental task, one that gives platforms enormous power over the boundaries of online discourse. This includes decisions about whether a particular post should be removed, as well as subtler interventions that determine whether it goes viral. From deciding how much room to give charlatans' ideas about COVID-19, to how much leeway the president of the United States should have to break the rules, content moderation poses hard challenges that sit at the core of debates over freedom of expression.
But while a great deal of ink has been spilled on the impact of social media on American democracy, these decisions can have an even greater impact around the world. This is particularly true in places where access to traditional media is limited, giving platforms a virtual monopoly over the shape of public discourse. A platform that fails to act against hate speech can be instrumental in unleashing a local pogrom, or even a genocide. A platform that acts too aggressively in removing alleged “terrorist propaganda” may find itself destroying evidence of war crimes.
The power of platforms over public discourse is in part the result of a conscious decision by governments to outsource online moderation functions to these private sector actors. Around the world, governments are making increasingly aggressive demands that platforms police content they find objectionable. Targeted material ranges from revealing photos of the King of Thailand to material deemed insulting to Turkey's founding president. In some cases, such requests are grounded in local legal standards, putting platforms in the difficult position of deciding how to enforce, for example, a Pakistani law that would be flagrantly unconstitutional in the United States.
In most cases, however, moderation decisions are not based on any legal rule at all, but on the platforms' own privately written community guidelines, which are notoriously imprecise and difficult to understand. The result is a serious lack of accountability in the mechanisms governing freedom of expression online. And while the perceived opacity, inconsistency, and hypocrisy of online content moderation structures may be frustrating to Americans, for users in the developing world the situation is far worse.
Almost all of the largest platforms are headquartered in the United States. This means not only that their leadership is more accessible and responsive to their American user base than to frustrated netizens in Myanmar or Uganda, but also that their global policies remain heavily influenced by American cultural norms, particularly the First Amendment.
Although the largest platforms have made efforts to globalize their operations, there remains a huge imbalance in the ability of journalists, human rights activists, and other vulnerable communities to reach the U.S.-based staff who decide what can and cannot be said. When platforms expand globally, they tend to hire staff who are connected to existing power structures, rather than those who rely on the platforms as a lifeline against repressive restrictions on speech. Pressure to crack down on “terrorist content,” for example, inevitably produces collateral damage to journalism and legitimate political discourse, particularly in the Arab world. In making that calculation, governments and former government officials are far more likely to have a seat at the table than journalists or human rights activists. Likewise, it is far easier for the Israeli government to communicate its wants and needs to Facebook than it is for, say, Palestinian journalists and NGOs.
None of this is meant to minimize the scope and scale of the challenge platforms face. It is not easy to develop and enforce content policies that account for the very different needs of a global user base. Platforms typically aim to give everyone a roughly identical experience, including similar expectations about the limits of permissible speech. There is an obvious tension between that objective and the conflicting legal, cultural, and moral norms in force across the many countries in which they operate. But the importance and weight of these decisions require platforms to strike that balance, and to develop and implement policies that adequately reflect their role at the center of political debates from Russia to South Africa. Yet even as the platforms have grown and spread around the world, the center of gravity of these debates remains firmly in Washington, D.C. and San Francisco.
This is the first in a series of articles that aims to bridge the gap between current policy debates about content moderation and the people most affected by them, particularly across the Global South. The authors are academics, civil society activists, and journalists whose work puts them at the sharp end of content decisions. In soliciting their contributions, we gave them relative freedom to prioritize the issues they considered most serious and important with regard to content moderation, and asked them to identify areas where improvement is needed, particularly around moderation processes, community engagement, and transparency. The issues they raised included a common frustration with the distant and opaque nature of the platforms' decision-making, a desire for platforms to work toward a better understanding of the local sociocultural dynamics underlying online discourse, and a sense that the platforms' approach to moderation often fails to reflect the importance of their role in facilitating the exercise of fundamental human rights. Although each voice offers a unique perspective, together they paint a common picture of how platform decision-making affects their lives, and of the need to do better, commensurate with the platforms' power to define the contours of global discourse.
Ultimately, our hope for this series is to shed light on the global impacts of platform decisions, and to offer guidance on how social media platforms could do a better job of developing and enforcing moderation structures that reflect the needs and values of their diverse global users.