Facebook ordered to remove posts which ‘incited violence’ before Southport riots

Meta, the parent company of Facebook, has been ordered to take down three posts shared during the riots that erupted across the country in the wake of the Southport attack last year.

The Oversight Board, which examines content decisions made by the social media giant, said the posts, which were Islamophobic, incited violence and showed support for the riots, had each “created the risk of likely and imminent harm” and “should have been taken down”. Meta said it would comply with the board’s ruling, adding that it had removed thousands of posts during the unrest last summer.

Violent scenes continued for weeks after the knife attack which left three girls dead and ten others injured. The riots were fuelled by misinformation spreading rapidly on social media about the attacker’s identity, including false claims that he was an asylum seeker who had arrived in the UK on a small boat.

The three posts in question were originally kept up on Facebook after being assessed only by Meta’s automated tools – none was reviewed by a human – and the users who had reported them then appealed the decision to the Oversight Board.

In its ruling, the board also raised concerns about Meta’s response to the riots and the social media activity linked to them.

“The content was posted during a period of contagious anger and growing violence, fuelled by misinformation and disinformation on social media. Anti-Muslim and anti-immigrant sentiment spilled on to the streets,” the Oversight Board said.

It added that Meta had been too slow to react to misinformation and disinformation about the attack spreading on social media, which began in the hours after the stabbings on July 29.

“Meta activated the crisis policy protocol (CPP) in response to the riots and subsequently identified the UK as a high-risk location on August 6. These actions were too late. By this time, all three pieces of content had been posted,” the board’s decision said.

“The board is concerned about Meta being too slow to deploy crisis measures, noting this should have happened promptly to interrupt the amplification of harmful content.”

In December, when the Oversight Board announced it was examining these posts, it said Meta had acknowledged that its decision to leave one of the posts on Facebook was an error and had removed it, but that Meta maintained it had been correct to leave the second and third posts online.

In response to the board’s ruling, a Meta spokesperson said: “We regularly seek input from experts outside of Meta, including the Oversight Board, and will act to comply with the board’s decision.

“In response to these events last summer, we immediately set up a dedicated taskforce that worked in real time to identify and remove thousands of pieces of content that broke our rules – including threats of violence and links to external sites being used to co-ordinate rioting.

“We will respond to the board’s full recommendations within 60 days in accordance with the bylaws.”

As part of its response to the riots and social media activity during the unrest last summer, Meta said it had removed 24,000 posts from Facebook and Instagram for breaking rules on violence and incitement, as well as 12,000 posts for breaching hate speech guidelines.

