Today marks the start of Facebook CEO Mark Zuckerberg’s much-anticipated trip to Washington, where he testifies before the Senate today before moving on to a House hearing tomorrow.
Away from the U.S. political capital, Zuckerberg is engaged in serious discussions about Myanmar with a group of six civil society organizations in the country that took umbrage at his claim that Facebook’s systems had prevented messages aimed at inciting violence between Buddhists and Muslims last September.
Following an open letter to Facebook on Friday that claimed the social network had relied on local sources and remains ill-equipped to handle hate speech, Zuckerberg himself stepped in to personally respond.
“Thank you for writing it and I apologize for not being sufficiently clear about the important role that your organizations play in helping us understand and respond to Myanmar-related issues, including the September incident you referred to,” Zuckerberg wrote.
“In making my remarks, my intention was to highlight how we’re building artificial intelligence to help us better identify abusive, hateful or false content even before it is flagged by our community,” he added.
Zuckerberg also claimed Facebook is working to implement new features that include the option to report inappropriate content inside Messenger, and adding more Burmese language reviewers — two suggestions that the Myanmar-based group had raised.
The group has, however, fired back again to criticize Zuckerberg’s response, which it said is “nowhere near enough to ensure that Myanmar users are provided with the same standards of care as users in the U.S. or Europe.”
In particular, the six organizations are asking Facebook and Zuckerberg to disclose data on its efforts, including the number of abuse reports it has received, how many reported posts have been removed, how quickly removals have happened, and its progress on banning accounts.
In addition, the group asked for clarity on the number of Burmese content reviewers on staff, the exact mechanisms that are in place for detecting hate speech, and an update on what action Facebook has taken following its last meeting with the group in December.
“When things go wrong in Myanmar, the consequences can be really serious — potentially disastrous,” it added.
The Cambridge Analytica story has become mainstream news in the U.S. and other parts of the world, yet less is known of Facebook’s role in spreading religious hatred in Myanmar, where the government stands accused of ethnic cleansing following its treatment of the minority Muslim Rohingya population.
A recent UN Fact-Finding Mission concluded that social media has played a “determining role” in the crisis, with Facebook the chief actor.
“We know that the ultranationalist Buddhists have their own [Facebook pages] and really [are] inciting a lot of violence and a lot of hatred against the Rohingya or other ethnic minorities. I’m afraid that Facebook has now turned into a beast, [instead of] what it was originally intended to be used [for],” the UN’s Yanghee Lee said to media.
Close to 30 million of Myanmar’s 50 million people are said to use the social network, making it a hugely effective way to reach large audiences.
“There’s this notion to many people [in Myanmar] that Facebook is the internet,” Jes Petersen, CEO of Phandeeyar — one of the companies involved in the correspondence with Zuckerberg — told TechCrunch in an interview last week.
Despite that huge popularity and high levels of abuse that Facebook itself has acknowledged, the social network does not have an office in Myanmar. In fact, its Burmese language reviewers are said to be stationed in Ireland while its policy team is located in Australia.
That doesn’t seem like the right combination, but it is also unclear whether Facebook is prepared to make changes to focus on user safety in Myanmar. The company declined to say whether it had plans to open an office on the ground when we asked last week.
Here’s Zuckerberg’s letter in full:
Dear Htaike Htaike, Jes, Victoire, Phyu Phyu and Thant,
I wanted to personally respond to your open letter. Thank you for writing it and I apologize for not being sufficiently clear about the important role that your organizations play in helping us understand and respond to Myanmar-related issues, including the September incident you referred to.
In making my remarks, my intention was to highlight how we’re building artificial intelligence to help us better identify abusive, hateful or false content even before it is flagged by our community.
These improvements in technology and tools are the kinds of solutions that your organizations have called on us to implement and we are committed to doing even more. For example, we are rolling out improvements to our reporting mechanism in Messenger to make it easier to find and simpler for people to report conversations.
In addition to improving our technology and tools, we have added dozens more Burmese language reviewers to handle reports from users across all our services. We have also increased the number of people across the company on Myanmar-related issues and now we have a special product team working to better understand the specific local challenges and build the right tools to help keep people there safe.
There are several other improvements we have made or are making, and I have directed my teams to ensure we are doing all we can to get your feedback and keep you informed.
We are grateful for your support as we map out our ongoing work in Myanmar, and we are committed to working with you to find more ways to be responsive to these important issues.
Mark
And the group’s reply:
Dear Mark,
Thank you for responding to our letter from your personal email account. It means a lot.
We also appreciate your reiteration of the steps Facebook has taken and intends to take to improve your performance in Myanmar.
This doesn’t change our core belief that your proposed improvements are nowhere near enough to ensure that Myanmar users are provided with the same standards of care as users in the US or Europe.
When things go wrong in Myanmar, the consequences can be really serious – potentially disastrous. You have yourself publicly acknowledged the risk of the platform being abused towards real harm.
Like many discussions we have had with your policy team previously, your email focuses on inputs. We care about performance, progress and positive outcomes.
In the spirit of transparency, we would greatly appreciate if you could provide us with the following indicators, starting with the month of March 2018:
- How many reports of abuse have you received?
- What % of reported abuses did your team ultimately remove due to violations of the community standards?
- How many accounts were behind flagging the reports received?
- What was the average time it took for your review team to provide a final response to users of the reports they have raised?
- What % of the reports received took more than 48 hours to receive a review?
- Do you have a target for review times? Data from our own monitoring suggests that you might have an internal standard for review – with most reported posts being reviewed shortly after the 48 hrs mark. Is this accurate?
- How many fake accounts did you identify and remove?
- How many accounts did you subject to a temporary ban? How many did you ban from the platform?
Improved performance comes with investments and we would also like to ask for more clarifications around these. Most importantly, we would like to know:
- How many Myanmar speaking reviewers did you have, in total, as of March 2018? How many do you expect to have by the end of the year? We are specifically interested in reviewers working on the Facebook service and looking for full-time equivalents figure.
- What mechanisms do you have in place for stopping repeat offenders in Myanmar? We know for a fact that fake accounts remain a key issue and that individuals who were found to violate the community standards on a number of occasions continue to have a presence on the platform.
- What steps have you taken to date to address the duplicate posts issue we raised in the briefing we provided your team in December 2017?
We’re enclosing our December briefing for your reference, as it further elaborates on the challenges we have been trying to work through with Facebook.
Best,
Jon Russell, from Social – TechCrunch