Facebook promotes content that incites violence against Myanmar’s anti-coup protesters and amplifies junta disinformation, despite promises to crack down on the abuse of its platform, according to a study.
An investigation by the rights group Global Witness found that Facebook’s recommendation algorithm continues to steer users towards content that violates its own policies. After liking a Burmese military fan page, which did not contain recent posts violating Facebook policies, the rights group found that Facebook suggested several pro-military pages containing abusive content.
Among the items on one of the pages was an image of a “wanted” poster offering a $10m bounty for the “dead or alive” capture of a young woman. The post claimed she was among protesters who burned down a factory following a military crackdown. Images of the woman’s face and a screenshot of what appeared to be her Facebook profile were posted with a caption reading: “This girl is the one who committed arson in Hlaing Tharyar. Her account has been deactivated. But she can’t run.”
Global Witness said its report showed that Facebook’s self-regulation was not working, and called for Facebook’s recommendation algorithm to be independently audited.
Other posts identified by Global Witness included a death threat, glorification of military violence and disinformation, such as false claims that Isis is present in Myanmar and that the military had seized power because of “electoral fraud”. The military has accused Aung San Suu Kyi’s party of rigging votes in last year’s election in order to justify February’s coup – a suggestion that has been discredited by observers, including the independent monitoring group the Asian Network for Free Elections.
Facebook said in February that it would remove bogus allegations of widespread fraud or foreign interference in Myanmar’s November elections from its site. It also said it had banned state and media entities controlled by the military and introduced a Myanmar-specific policy to remove praise, support and calls for violence by security forces and civilians against protesters in Myanmar. Content that supported the arrests of civilians by the military and security forces in Myanmar would also be removed under this policy.
A Facebook spokesperson said its staff were “closely monitoring” the situation in Myanmar in real time and had taken action against posts, pages or groups that broke its rules.
However, the content identified by Global Witness remained online for months, according to the rights group.
A separate Guardian analysis found numerous recent examples of posts that also appeared to violate Facebook standards:
In a June 19 post, which received more than 500 likes, an image showed a man with a bloodied face and a rope tied around his neck. The caption read: “This is how you should stop them”, referring to the protesters.
The posts often mock protesters and encourage violence against them. One post, also from June 19, referred to a recent flower strike, in which protesters carried flowers to mark Aung San Suu Kyi’s 76th birthday, saying: “Every single real man who carried flowers in public today must be killed … Trash. They must all be killed so that children do not have bad role models.” The post was liked 175 times.
Another post, dated June 1, was aimed at children. It showed a picture of students outside their school, with a sign saying: “We are students and we will go to school. You are criminals and you will go to jail.” Many children have not returned to school, despite orders from the junta. The post has been liked more than 4,300 times.
The posts often share misinformation, for example accusing pro-democracy politicians of leading the “terrorists”. One post states that “the only real news outlets in this country are ME, MRTV and MWD and other state-run news”, referring to channels controlled by the military.
Facebook has previously acknowledged that its platform has been misused in Myanmar, where it is extremely popular and influential. The site is used by almost half of the population and, for many, it is the primary means of accessing the internet.
In 2018, following the massacre of Rohingya Muslims by the military, Facebook admitted that its platform had been used to “foment division and incite offline violence”. A UN fact-finding mission drew similar conclusions the same year, saying Facebook had been “a useful tool for those seeking to spread hatred” and that the company’s response had been “slow and ineffective”.
In February, Facebook said its staff were working tirelessly to keep its platform safe. The coup had dramatically increased the likelihood “that online threats could lead to offline harm”, Facebook said at the time.
The Global Witness report also called on Facebook to further investigate other types of content it hosted, including videos of forced confessions by political prisoners, military advertisements and posts that amplified military propaganda, such as claims that the military was acting in a measured way.
In a statement, Facebook said: “We proactively detect 99% of the hate speech removed from Facebook in Myanmar, and our ban of the Tatmadaw [military] and repeated disruption of coordinated inauthentic behaviour has made it harder for people to misuse our services to spread harm. This is a highly adversarial issue and we continue to take action on content that violates our policies to help keep people safe.”