Even as inflammatory and confrontational content increased in most markets, including India, it was Facebook's global hate speech team that saw cost cuts, according to internal documents reviewed by The Indian Express.
To reduce expenses, three potential levers were offered internally at the social media company – reviewing fewer user reports, proactively reviewing less content, and reviewing fewer appeals – according to an internal policy brief dated August 6, 2019. In effect, content moderation bore the brunt.
As The Indian Express reported on Thursday, a July 2020 internal document flagged a “marked increase” in “anti-Muslim” rhetoric on the platform over the previous 18 months in India, Facebook's largest market by number of users.
“Everyone understands that cost reductions happen no matter what we do: teams will reduce their CO capacity…” said the August 6, 2019 memo, titled “Cost Control: An Exploration of Hate Speech”. CO – community operations – refers to contract labor at Facebook.
“The question is not how to reduce capacity, but how far we can reduce without eliminating our ability to review user reports and do proactive work,” the memo said.
The memo addressed specific ways to use the three levers to reduce costs, including ignoring “benign user reports” and asking users “to be more thoughtful before submitting a request for reconsideration.”
The push to review fewer user reports stemmed from the finding that, while Facebook reviewed the majority of user reports, the rate of action on reactively reported content was “at best 25%”.
The document pointed out that almost three-quarters of content review costs went to reactive capacity – that is, capacity used to review content already reported by users or third parties. Only 25% of review costs were incurred on proactive capacity.
“In H1 (the first half, January-June 2019), we worked largely in accordance with the ‘Hate 2019 capacity reduction plan’ to significantly increase the volume of actions we can take while maintaining the same capacity levels… We need to dramatically increase the rigor with which we make decisions about how to use our human review capacity across the board, and in fact, in the case of hate speech, we need to cut a significant amount of our current capacity in order to fund new initiatives,” the strategy note said.
Under this plan, Facebook aimed to reduce the dollar cost of its total hate speech review capacity by 15% by the end of June 2019. According to the document, the company was spending over $2 million per week reviewing hateful content.
In response to a request for comment, a spokesperson for Meta Platforms Inc – Facebook was renamed Meta on October 28 – told The Indian Express: “This document does not advocate any budget cuts to suppress hate speech, and we have not made any. In fact, we have increased the number of hours our teams devote each year to combating hate speech. The document shows how we were looking at ways to make our work more effective at removing more hate speech at scale.
“Over the past decade, we’ve created technology to proactively detect, prioritize and remove content that breaks our rules, rather than having to rely solely on individual reports created by users. Every company regularly reflects on how to execute its business priorities more effectively so that it can do even better – that’s all this document reflects.”
The spokesperson added: “Over the past two years, we have hired more people with linguistic, national and thematic expertise. Adding more linguistic expertise has been a key area for us. They are among more than 40,000 people who work on safety and security, including global content review teams at more than 20 sites around the world who review content in over 70 languages, including 20 Indian languages.”
The company did not, however, respond to a specific question about Facebook's total annual spending on hate speech review, and how that figure has changed since 2019.
These reports are part of documents disclosed to the U.S. Securities and Exchange Commission (SEC) and provided to Congress in redacted form by the legal counsel of former Facebook employee and whistleblower Frances Haugen.
The redacted versions received by Congress have been reviewed by a consortium of global news organizations, including The Indian Express.
Beyond the cost control hurdles faced by the content review team, there is evidence that Facebook recognized conflicts between the teams that handled civic quality metrics – misinformation, hate speech, spam and the like – and those that design the algorithms to make Facebook users’ news feeds more relevant.
In a May 3, 2019 post to an internal group called “Election Integrity Discussions,” a Facebook employee noted that “the News Feed relevance work may have inadvertently pushed some civic quality metrics 60% in the wrong direction,” indicating that changes to the feed algorithm had increased the prevalence of problematic content.