Senior executives from Twitter, Facebook and Google, which owns YouTube, faced tough questions from MPs over their handling of online abuse.

Appearing at the Commons' Home Affairs Select Committee on Tuesday, social media bosses were accused of profiting from violence and criticised for failing to remove material that incites hatred.

Committee chairwoman Yvette Cooper said algorithms used by the companies to suggest relevant content were helping to radicalise and groom users.

She said progress had been made in the companies' approach to online hate crime, but told the witness panel: "In the end, this is about the kinds of extremism, whether that be Islamic extremism or far-right extremism, that can lead to very violent incidents.

"It is about the kinds of hate crime that destroys lives. It is about the kind of harassment and abuse that can undermine political debate and undermine democracy and you are some of the richest companies in the world.

"And that is why we need you to accelerate and we need you to do more."

Ms Cooper said she found it hard to believe enough was being done to tackle hate crime, after it emerged anti-Semitic and abusive tweets flagged by MPs at an earlier committee hearing had not been removed.

The chairwoman added that a series of violent tweets, including threats against Theresa May and racist abuse towards shadow home secretary Diane Abbott, had been reported by her office but also remained on the site.

Addressing Twitter's Sinead McSweeney, vice president of public policy and communication for Europe, the Middle East and Africa (Emea), Ms Cooper said: "I'm kind of wondering what we have to do.

"We sat in this committee in a public hearing and raised a clearly vile anti-Semitic tweet with your organisation.

"It was discussed and it is still there, and everybody accepted, you've accepted, your predecessor accepted, that it was unacceptable. But it is still there on the platform.

"What is it that we have got to do to get you to take it down?"

Ms McSweeney said there were "balances and risks" in how content is flagged for removal, to avoid censoring items.

Conservative MP Tim Loughton said social media giants were inciting violence through inaction.

"This is not about taking away somebody's rights to criticise somebody whose politics they don't agree with," he said.

"It's about not providing a platform - whatever the ills of society you want to blame it on - for placing stuff that incites people to kill, harm, maim, incite violence against people, because of their political beliefs in this case."

He added: "You are profiting, I'm afraid, from the fact that people are using your platforms to further the ills of society and you're allowing them to do it and doing very little, proactively, to prevent them."

The hearing took place the day after Twitter suspended accounts belonging to Britain First leaders Jayda Fransen and Paul Golding.

Fransen gained notoriety when three anti-Muslim videos she posted were retweeted by US President Donald Trump.

Facebook's director of public policy, Simon Milner, told the committee that Britain First's page on its platform was under review, with both online and offline behaviour being considered by moderators.

He said: "There are clearly issues with their page on Facebook. There's been a number of pieces of content taken down.

"We are obviously reviewing it but we are very, very cautious about political speech."