It’s not just advertisers putting increasing pressure on social networks and platforms that host user-generated content. In Germany, the government is considering legislation that would fine Facebook, Twitter, and similar platforms tens of millions of dollars for failing to remove hate speech quickly enough.

In the UK, lawmakers are debating whether to prosecute Google and YouTube under anti-terrorism laws for allowing the dissemination of extremist videos.

This comes amid more than 250 advertisers pulling digital ads from YouTube, Google, and other third-party ad networks after their advertising appeared alongside hate speech or morally questionable content.

German lawmakers are looking at monitoring done by a German youth protection organization. According to Forbes, the group says Facebook has gotten worse at blocking and deleting criminal content even after it’s reported. Per the report, 46% of reported criminal content was blocked last summer; that number has since dropped to 39%. Even after complaints, only a third of the posts were deleted within 24 hours.

“Facebook and Twitter have not used the opportunity to improve their deletion practices. Too few punishable comments are deleted. And they are not erased quickly enough. The biggest problem is that the networks do not take the complaints of their own users seriously enough.” – Heiko Maas, German Justice Minister, via Forbes

Not only would the law impose hefty fines on the social media outlets, but it would also require a single person to be placed in charge of the process and make that person personally liable for fines of approximately $5 million.  EDITOR’S NOTE:  Not sure I’d want the pressure of that job!

“It is not acceptable that companies are making huge profits with their social networks and at the same time shirking their responsibilities in the fight against hate messages.” – Heiko Maas, German Justice Minister, via Forbes

Meanwhile, The Telegraph’s Christopher Hope and Laura Hughes report that Google, Facebook, and other internet companies could face prosecution if they don’t stop extremist videos from being seen by Brits.

A new law would tighten the window for removing videos. Ministers worry that even a pledge to take down offensive videos within 24 hours allows thousands of views before they’re removed.

“Social media companies have a responsibility when it comes to making sure this material is not disseminated and we have been clear repeatedly that we think that they can and must do more. We are always talking with them on how to achieve that.” – Spokesman for Prime Minister Theresa May, via The Telegraph

All of this adds up to major headaches for the social sites. More than 300 hours of video are uploaded to YouTube alone every minute of every day. Imagine how difficult it would be for any person or any algorithm to monitor all of that content. With a direct-to-consumer model where content goes live without review, it seems inevitable that some things will slip through.

Because of such staggering numbers, platforms have relied on users to point things out and complain. “We rely on YouTube community members to flag content that they find inappropriate,” the company says. “Flagged content is not automatically taken down by the flagging system.” Instead, a staff member reviews the flagged videos. The company claims to have teams working 24/7/365.

In case you’re wondering, that’s 432,000 hours of new video on YouTube every single day (300 hours × 1,440 minutes). Watching it all would take you 49.3 years (and that’s without potty breaks or sleeping). Don’t even think about the next day’s uploads!
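If you’d like to check that math yourself, here’s a quick back-of-the-envelope sketch in Python. Only the 300-hours-per-minute upload rate comes from YouTube’s published figures; the rest is plain arithmetic:

```python
# Back-of-the-envelope check of the YouTube upload math.
HOURS_UPLOADED_PER_MINUTE = 300   # YouTube's published figure
MINUTES_PER_DAY = 60 * 24         # 1,440

hours_per_day = HOURS_UPLOADED_PER_MINUTE * MINUTES_PER_DAY
print(f"New video per day: {hours_per_day:,} hours")  # 432,000 hours

# Watching it non-stop (no sleep, no potty breaks):
years_to_watch = hours_per_day / 24 / 365.25
print(f"Years to watch one day's uploads: {years_to_watch:.1f}")  # ~49.3
```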

As they said in a 1962 Spider-Man comic:

“With great power, there must also come – great responsibility!”
