Europe Asks Social Networks to Remove Terrorist Content Within an Hour


Today, the European Commission published new guidelines for social networks, and among them is a request that these websites remove reported terrorist content within one hour (via the Financial Times).

The Commission's guidelines say web companies need humans overseeing content removal to make sure automated tools don't excessively or incorrectly remove content.

In recent months, online technology companies have faced increasing pressure from the European Union to step up their efforts to combat hate speech on their sites. Digital commissioner Andrus Ansip said: "While several platforms have been removing more illegal content than ever before ... we still need to react faster against terrorist propaganda and other illegal content which is a serious threat to our citizens' security, safety and fundamental rights".

The European Union has stepped up pressure on the world's largest internet companies including Facebook, Google, and Twitter to remove illegal content such as child porn, as well as material from terrorist groups from their platforms.

"Member States and companies will be required to submit relevant information on terrorist content within three months, and other illegal content within six months", the European Commission said. "What is illegal offline is also illegal online".

While these recommendations are non-binding, they could factor into future legislation.


Mr. Ansip said some commission officials were pushing to overhaul existing rules that shield platforms from legal liability for what appears on their sites, everything from stolen cars listed on a second-hand shopping portal to terrorist content, a move he said he was dead set against. He added that the Commission has since been pleased with how the companies have improved their efforts to remove hate speech.

According to The Wall Street Journal, tech companies are wary that the guidelines may infringe on freedom of expression.

Joe McNamee, executive director of European Digital Rights, described the Commission's proposal as "voluntary censorship".

Among the tech firms, Facebook, at least, has said it agrees with the EC's recommendations. For their part, the companies have consistently tried to step up by funding research, partnering on a shared database of images and videos that promote terrorism, and putting artificial intelligence to work identifying such content automatically, including tools that detect terrorist content without human review.

EDRi said that the EC's decision is part of a trend that uses the "threat" of legislation to push internet companies into heavy policing of their platforms.
