Social media firms urged to make ‘major changes’ to stop antisemitic abuse

New report by influential parliamentary committee, chaired by Conservative MP Damian Collins, also calls for Twitter, Facebook and other social media platforms to act over disposable anonymous accounts

Examples of online hatred, focused largely on Twitter (Credit: CST)

Social media companies such as Twitter, Facebook and Instagram must design their systems to identify, limit the spread of, and remove antisemitic material quickly after it is reported, a report by a parliamentary committee has recommended.

The report recommended that tech companies designate a senior manager as their “Safety Controller” who would be personally liable if they fail to comply with the new law, potentially leading to prison sentences “at the end of an exhaustive legal process”.

Online services should also take steps to prevent abuse by disposable anonymous accounts, and should be required to have governance processes in place so that proper requests from law enforcement are responded to quickly, the group of MPs and peers have warned.

The committee said that being able to post online without revealing your identity was “crucial to online safety for marginalised groups, for whistle-blowers, and for victims of domestic abuse and other forms of offline violence”.

“Anonymity and pseudo-anonymity themselves are not the problem and ending them would not be a proportionate response,” it added.

But it recommended that more be done to ensure traceability is available to law enforcement investigating such accounts.

The findings are made in a report published on Tuesday by the Joint Committee on the Draft Online Safety Bill, which calls for “major changes” to proposed government legislation.

“No-one should be abused for their religious faith or identity and tech companies must take steps to prevent the spread of such material and remove it from their platforms,” states the report.

It also recommends that social media firms should be required to address the risks that algorithmic recommendation tools and hashtags may amplify antisemitic abuse or religious hatred.

“Our recommendations ensure that in addition to encompassing abuse, harassment, and threats on the grounds of race against individuals, online services will also have to address hate crimes such as stirring up racial hatred that may not currently be covered,” the report continues.

“Platforms will have a duty to design their systems to identify, limit the spread of, and remove racist abuse quickly following a user report.

“The Joint Committee call for Ofcom to produce a Code of Practice on system and platform design, against which platforms will be held responsible for the way in which such material is recommended and amplified.”

Addressing racist online abuse linked to football supporters, the report says that where possible “service providers should also share information about known offenders with the football authorities so that they can consider whether offences have been committed that would require further penalties, like the imposition of stadium banning orders”.

Giving evidence to the committee, former England star Rio Ferdinand had said: “Social media platforms became the toxic and racist safe place… [for people to] continually abuse our England players knowing that they are safe to be able to stay anonymous.”

The 200-page report includes what the Joint Committee said were “moving testimonies about the impact that religious hatred and antisemitism online has on individuals, families and communities.”

Danny Stone, director of the Antisemitism Policy Trust, who gave evidence to the committee, told Jewish News: “I don’t think I am overstating the case by suggesting that this is a moment in history, and what begins with this Bill will mark a shift in approach for the generations that come after us. The Bill must be strong, and well crafted.”

Damian Collins, the Conservative chairman of the Joint Committee, said: “For too long, big tech has gotten away with being the land of the lawless.

“A lack of regulation online has left too many people vulnerable to abuse, fraud, violence and in some cases even loss of life.

“The committee has set out recommendations to bring more offences clearly within the scope of the Online Safety Bill, give Ofcom the power in law to set minimum safety standards for the services they will regulate, and to take enforcement action against companies if they don’t comply.”

Labour’s new shadow culture secretary Lucy Powell also welcomed the report’s recommendations.
