Facebook removes Burmese translation feature after Reuters report

LONDON (Reuters) - Facebook has removed a feature that allowed users to translate Burmese posts and comments after a Reuters report showed the tool was producing bizarre results.

A Reuters investigation published on August 15 documented how Facebook was failing in its efforts to combat vitriolic Burmese language posts about Myanmar’s Rohingya Muslims. Some 700,000 Rohingya have fled Myanmar over the past year amid a military crackdown and ethnic violence. In late August, United Nations investigators said Facebook had been “a useful instrument for those seeking to spread hate” against the Muslim minority group.

The Reuters article also showed that the translation feature was flawed. It cited an anti-Rohingya post that said in Burmese, “Kill all the kalars that you see in Myanmar; none of them should be left alive.” Kalar is a pejorative for the Rohingya. Facebook had translated the post into English as “I shouldn’t have a rainbow in Myanmar.”

A spokeswoman for Facebook said the Burmese translation feature was “switched off” on August 28. She said the Reuters article and feedback from users “prompted us to do this.”

“We are working on ways to improve the quality of the translations and until then, we have switched off this feature in Myanmar,” the spokeswoman wrote in an email.

Facebook has had other problems interpreting Burmese, Myanmar’s main local language. In April, the California-based social-media company posted a Burmese translation of its internal “Community Standards” enforcement guidelines.

Many of the passages were botched. A sentence that in English stated “we take our role in keeping abuse off our service seriously” was translated into Burmese as “we take our role seriously by abusing our services.” 

The Reuters investigation found more than 1,000 examples of hate speech on Facebook, including calling the Rohingya and other Muslims dogs, maggots and rapists, suggesting they be fed to pigs, and urging they be shot or exterminated. Facebook’s rules specifically prohibit attacking ethnic groups with “violent or dehumanizing speech” or comparing them to animals.

Shortly after the article was published, Facebook issued a statement saying it had been “too slow to prevent misinformation and hate” in Myanmar and that it was taking action, including investing in artificial intelligence that can police posts that violate its rules.
