After failing to contain the hate speech and misinformation that sparked genocide in Myanmar, Facebook is now planning to take proactive steps to moderate content following a military coup in the country.
In an internal message posted late Monday and viewed by BuzzFeed News, Asia Pacific Public Policy Director Rafael Frankel told staff that the social network was watching the “volatile situation” in Myanmar “with great concern” and outlined a number of measures to be taken against people who use the platform to spread misinformation or threaten violence.
As part of these measures, Facebook has designated Myanmar a “temporary high-risk location” for two weeks, allowing the company to remove content and events in the country that contain “any calls for arms procurement.” The social network previously applied this designation after the January 6 insurrection at the U.S. Capitol in Washington, DC.
The social network, which touted its efforts to protect the integrity of Myanmar’s national elections in November, also said it would protect posts criticizing the military and its coup, and monitor reports of pages and accounts that have been hacked or taken over by the military.
“The November elections in Myanmar were a key moment in the country’s transition to democracy, although it was not without its challenges, as international human rights groups have pointed out,” Frankel wrote. “This turn of events reminds us of days we hoped would be in Myanmar’s past and reminds us of fundamental rights that should never be taken for granted.”
Facebook’s steps come after General Min Aung Hlaing, the head of the Myanmar military, took control of the country’s government on Monday and arrested its elected leader, Aung San Suu Kyi, along with other members of her National League for Democracy (NLD) party. After the elections, in which the NLD won a majority of the seats in Myanmar’s parliament, military-backed opposition groups called the results fraudulent and demanded a review.
On Tuesday, the US State Department officially described the military takeover in Myanmar as a coup, a designation that triggers financial sanctions.
“After a review of all the facts, we assessed that the actions of the Burmese military on February 1, in deposing the duly elected head of government, constituted a military coup,” a State Department official said in a briefing, using the name the US government uses to refer to the country.
In a statement to BuzzFeed News, Facebook confirmed the measures outlined in Frankel’s post and said it would remove content praising or supporting the coup.
“We are putting the safety of the people in Myanmar first and removing content that violates our rules on violence, hate speech, and harmful misinformation,” said Frankel. “This includes removing misinformation that seeks to delegitimize the outcome of the November elections.”
Facebook is taking action in a country where it was previously condemned internationally for its handling of the displacement and genocide of Rohingya Muslims that began in 2016. In 2018, United Nations investigators found that senior military officials in Myanmar had used Facebook to instill fear and spread hate speech, while the company had few content moderators in the country.
“The extent to which Facebook posts and messages have led to discrimination in the real world must be investigated independently and thoroughly,” the UN investigators concluded in their report.
In Monday’s post, Frankel said Facebook had “used a series of product interventions, used in the past in Myanmar and during the US elections, to ensure the platform is not used to spread misinformation, incite violence, or coordinate harm.”
The company is working to secure the accounts of activists and journalists “who are at risk or have been arrested” and to remove content that threatens or encourages violence against them, Frankel wrote. The company will also protect “vital information about what is happening on the ground” given the restrictions placed on news agencies in the country.
Facebook’s work is an ongoing effort. On Tuesday, a page for Myanmar’s military television network was removed following an inquiry from the Wall Street Journal. The company had banned a page for the Myawaddy television network in 2018 as part of a crackdown on hundreds of accounts tied to Myanmar’s military, but a new page had since re-emerged and amassed 33,000 likes.
Facebook has often come under fire for facilitating the growth of violent and extremist groups and for its ineffectiveness in containing misinformation. Most recently, a tech watchdog group accused the company of fueling the unrest that led to the deadly insurrection at the US Capitol.
“[Facebook] has spent the past year failing to eradicate extremist activity and electoral conspiracy theories fueled by President Trump that have radicalized a large segment of the population and led many down a dangerous path,” reads a report from the Tech Transparency Project (TTP).
The report documented specific threats made in pro-Trump and militia groups on Facebook both before and after Joe Biden’s election victory in November.