08 January 2025
Human Rights Myanmar
Mark Zuckerberg’s announcement of significant changes to Meta’s content moderation policies raises serious concerns for Myanmar, where Facebook has both enabled anti-coup dissent and fuelled real-world harm, including contributing to atrocities against the Rohingya. While fostering free expression is commendable, Meta also has heightened legal and moral responsibilities, under international human rights law and its own previous commitments, to prevent its platforms from enabling harm in high-risk environments like Myanmar.
Return to free expression must start with algorithms
Zuckerberg’s acknowledgement that Meta’s content moderation systems make “too many mistakes” is a positive step, as over-moderation stifles legitimate expression and censors reliable sources, including Myanmar’s independent media. However, any genuine restoration of free expression must begin with Meta’s algorithms, which prioritise emotive content, including disinformation and divisive rhetoric, over trustworthy sources. In Myanmar, where such content fuels violence, tackling this algorithmic bias is essential. Changes to moderation alone cannot resolve the problem if harmful content remains prioritised for profit.
Meta’s goal of fostering “friendly and positive” platforms again misreads contexts like Myanmar, where people facing widespread human rights violations and censorship go online to seek truth and accountability, not superficial positivity. Meta must prioritise accurate, context-aware content over sentiment.
Eliminating fact-checkers weakens truth-seeking efforts
Meta’s decision to phase out fact-checkers, first in the U.S. and then globally, in favour of community-based systems is concerning, especially in Myanmar, where military propaganda and disinformation campaigns are rampant online. Zuckerberg’s language discrediting fact-checkers as “politically biased” echoes authoritarian regimes’ attacks on civil society.
In Myanmar, fact-checkers counter false narratives often spread by a military that simultaneously attacks independent media and civil society. They are digital safeguards that promote truth and integrity in public discourse. Indeed, Meta’s own Human Rights Impact Assessment recommended that the company support fact-checkers in Myanmar. Where fact-checkers’ public reach is limited, Meta could easily amplify their influence. Many fact-checkers are also journalists, so withdrawing support threatens the long-term viability of independent media, a critical pillar of democratic societies.
Progressive Voice is a participatory, rights-based policy research and advocacy organization rooted in civil society that maintains strong networks and relationships with grassroots and community-based organizations throughout Myanmar. It acts as a bridge to the international community and international policymakers by amplifying voices from the ground and advocating for a rights-based policy narrative.