Facebook asks: Are your friends becoming extremists?

Facebook is starting to warn some users that they might have seen “extremist content” on the social media site, the company said on Thursday.

Screenshots shared on Twitter showed a notice asking, “Are you concerned that someone you know is becoming an extremist?” Another alerted users: “You may have been exposed to harmful extremist content recently.” Both included links to “get support”.

The world’s largest social media network has long been under pressure from lawmakers and civil rights groups to combat extremism on its platforms, including US domestic movements involved in the 6 January Capitol riot, when groups supporting former President Donald Trump tried to stop the US congress from certifying Joe Biden’s victory in the November election.

Facebook said the small test, which is only on its main platform, was running in the US as a pilot for a global approach to prevent radicalisation on the site.

“This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk,” said a Facebook spokesman in an e-mailed statement. “We are partnering with NGOs and academic experts in this space and hope to have more to share in the future.”

Tightened rules

Facebook said the efforts were part of its commitment to the Christchurch Call to Action, a campaign involving major tech platforms to counter violent extremist content online, launched following a 2019 attack in New Zealand that was live-streamed on Facebook.

Facebook said that in the test it was identifying both users who may have been exposed to rule-breaking extremist content and users who had previously been the subject of its enforcement action.

The company, which has tightened its rules against violent and hate groups in recent years, said it proactively removes some content and accounts that violate its rules before the material is seen by users, but that other content may be viewed before it is enforced against.  — Reported by Elizabeth Culliford in New York, (c) 2021 Reuters


