Facebook tests ‘extremist content’ warning in anti-radicalization push



Facebook Inc is starting to warn some users that they may have seen “extremist content” on the social media site, the company said Thursday.

Screenshots shared on Twitter showed a notice asking “Are you worried that someone you know will become an extremist?” and another alerting users that “you may have recently been exposed to harmful extremist content”. Both included “get help” links.


The world’s largest social media network has long faced pressure from lawmakers and civil rights groups to tackle extremism on its platforms, including U.S. domestic movements implicated in the January 6 riot at the Capitol, when groups backing former President Donald Trump tried to block the U.S. Congress from certifying Joe Biden’s victory in the November election.


Facebook said the small test, which affects only its main platform, is underway in the United States as a pilot of a broader approach to preventing radicalization on the site.

“This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have been involved in or exposed to extremist content, or who may know someone who is at risk,” a Facebook spokesperson said in an emailed statement. “We are working in partnership with NGOs and academic experts in this space and hope to have more to share in the future.”


Video: Canada announces multi-faceted approach to tackle hate speech and online crime with Bill C-36 – June 23, 2021

The company said these efforts were part of its commitment to the Christchurch Call to Action, a campaign involving major tech platforms to counter violent extremist content online, launched after a 2019 attack in New Zealand that was streamed live on Facebook.

Facebook said that in the test it identified both users who may have been exposed to rule-breaking extremist content and users who had previously been the subject of its enforcement actions.


The company, which has tightened its rules against violent and hate groups in recent years, said it proactively removes some content and accounts that violate its rules before the material is seen by users, but that other content may be viewed before enforcement action is taken against it.
