Social media giants must do more to tackle hate crime and extremism on their platforms.
That’s the view of Pontefract and Castleford MP Yvette Cooper, who says companies need to take faster action to remove illegal and abusive material.
She spoke to the Express after the House of Commons’ Home Affairs Select Committee quizzed senior executives of Twitter, Google and Facebook about their handling of online abuse.
Ms Cooper said: “Twitter, Facebook and YouTube must do more to tackle serious illegal content, online extremism or hate crime.
“They have taken on more staff and started to raise standards and take more responsibility compared to our last hearing in February, which is welcome. But we still found far too many cases where they had not taken down serious and illegal content.”
Ms Cooper, who chairs the parliamentary committee, said members found adult and racist content on both Facebook and Twitter that had not been removed, despite being reported.
And she told the Express she had to contact YouTube four times in relation to an illegal propaganda video before it was taken down.
Ms Cooper also said social media algorithms that recommend related content are “promoting more abuse.”
She said: “If you click on far right extremist videos, then you are offered and recommended more of the same. Technology is actually encouraging people to get sucked in, and supporting the kind of radicalisation that is dangerous.”
Representatives from the companies told last week’s meeting they were taking steps to tackle abusive content.
Sinead McSweeney, vice president of public policy at Twitter, said the organisation was taking action against “10 times more accounts” than it did previously.
Meanwhile, Facebook public policy director Simon Milner said the company had more than 7,500 people reviewing its content, up from 4,500 earlier this year. Dr Nicklas Berild Lundblad, representing Google, which owns YouTube, said 10,000 people would be reviewing content by the end of 2018, up from 4,500 this summer.