This article was written by Aye Min Thant, director of Tech for Peace at Phandeeyar, an innovation center in Myanmar.
Everyone in Myanmar over the age of 10 has lived part of their life, if not most of it, under a military dictatorship characterized by an obsession with autonomy from international influence. Before the economic and political reforms of the previous decade, Myanmar was one of the most isolated nations in the world. The digital revolution that has reshaped nearly every aspect of human life was something the average Burmese person had not experienced.
Recent reforms brought an explosion of high hopes and access to technology, and Myanmar experienced a digital leap as internet access went from almost 0% in 2015 to over 40% in 2020. I am 27, and I remember living in a Rangoon where having a fridge was considered high-tech; now 10-year-olds make videos on TikTok.
Everyone was excited by the idea that Myanmar's digital revolution would drive the economic and social changes needed to move the country from pariah state to the next economic frontier. Tourists, development aid, and economic investment flooded the country. The cost of a SIM card dropped from about $1,000 in 2013 to just over $1 today. This drastic price drop coincided with a glut of relatively affordable smartphones and with carriers offering bundled data plans that made social media platforms like Facebook free, or almost free, to use. This led to the current situation, in which some 21 million of the 22 million people who use the internet have a Facebook profile. The social network became the main means of accessing the internet, and is now used for almost every online activity: selling livestock, watching pornography, reading the news, and discussing politics.
Then, following the exodus of more than 700,000 Rohingya from the war-ravaged Burmese state of Rakhine, Facebook was accused of enabling genocide.
The ongoing civil wars in the country and state violence against the Rohingya, described by the United Nations as ethnic cleansing with genocidal intent, brought to light the potential damage caused by digital connectivity. Given its dominance of the market, Facebook has faced intense scrutiny in Myanmar for the role the social network has played in normalizing, promoting and facilitating violence against minority groups.
Facebook was, and still is, the preferred tool for spreading hatred and misinformation against the Rohingya, Muslims in general, and other marginalized communities. Despite repeated warnings from civil society organizations in the country, Facebook did not respond to these dangers with the urgency or the level of resources the Rohingya crisis required, and in many cases failed to enforce even its own community standards.
Certainly, there have been improvements in recent years: the appointment of a Myanmar-focused team that has expanded the number of reviewers for Burmese-language content, added reviewers for content in minority languages, established more regular contact with civil society, and dedicated resources and tools to limiting disinformation during the upcoming Myanmar elections. The company also removed the accounts of Burmese military authorities and dozens of Facebook and Instagram pages connected to the army for engaging in "coordinated inauthentic behavior." The company defines "inauthentic behavior" as "engaging in behaviors designed to enable other violations under our Community Standards," through tactics such as the use of fake accounts and bots.
Recognizing the seriousness of this problem, everyone from the European Union to telecommunications companies to civil society organizations has invested resources in digital literacy programs, anti-hate-speech campaigns, social media monitoring, and other efforts to address it. In general, much of this programming focuses on what Myanmar and its people lack: rule of law, laws that protect freedom of expression, digital literacy, knowledge of what constitutes hate speech, and resources to fund and execute the necessary planning.
In the scramble by organizations on the ground to put out these fires, less attention has been paid to the larger systemic issues fueling them.
We must look more closely at the coordinated groups that spread conspiracy theories, false information, and hatred, in order to understand who they are, who finances them, and how their actions could be disrupted and, if necessary, penalized.
We must re-evaluate how social media platforms are designed in ways that encourage and reward bad behavior.
We must also question the extent to which we want to put the blame on social media companies, and whether it is good in general to give them the responsibility and therefore the power to determine what is or is not acceptable speech.
Finally, we must ask ourselves what alternatives we can develop, when many governments have shown that they are more than willing to monitor and judge Internet users under the pretext of health, safety and the criminalization of hate speech.
It is dangerous to give multinational, private, for-profit corporations the power to draw the line between hate speech and free expression. Just as dangerous is giving that same power to governments, especially at a time when ethno-nationalist sentiment is rising around the world and governments are increasingly willing to collect as much data as possible, openly or covertly, to use against the governed.
Furthermore, we can see from the ongoing legal proceedings against Myanmar in international courts, and from the fact that Facebook has not provided UN investigators with evidence of serious crimes committed against the Rohingya and other ethnic minorities in Myanmar, that neither corporate policies nor national laws are sufficient to guarantee safety, justice, and dignity for vulnerable populations.
The solution to all of this, as unappealing as it may sound, is a multi-faceted, multi-stakeholder, long-term effort to build strong legal and cultural institutions that distribute the power and responsibility to create and maintain safe and inclusive online spaces among governments, the people, the private sector, and civil society.
Aye Min Thant is the director of Tech for Peace at Phandeeyar, an innovation hub promoting safer and more inclusive digital spaces in Myanmar. Previously, she was a Pulitzer Prize-winning journalist who reported on business, politics, and ethno-religious conflict in Myanmar for Reuters. You can follow her on Twitter at @ma_ayeminthant.
This article was published by the Wikimedia/Yale Law School Initiative on Intermediaries and Information (WIII) as part of a series of posts capturing perspectives on the global impact of content moderation decisions made by online platforms. You can read all the articles in the series on its blog, or via its Twitter list, @YaleISP_WIII.