Google, Facebook and Twitter should remove extremist content within an hour or face hefty fines, the president of the European Commission has said.
In his annual State of the Union address before the European Parliament, Jean-Claude Juncker said that an hour was the "decisive time window".
The companies had been given three months in March to show they were acting more quickly to take down radical posts.
But EU regulators said too little had been done.
If the authorities flag content that incites or advocates extremism, it would have to be removed from the website within an hour, the EU proposal states. Companies that fail to comply could face fines of up to 4% of their annual global turnover.
The proposal would need the backing of countries making up the European Union, as well as the European Parliament.
In response to the plans, Facebook said: "There is no place for terrorism on Facebook, and we share the European Commission's goal of fighting it, and we believe that it is only through a common effort across companies, civil society and institutions that results can be achieved.
"We've made significant progress finding and removing terrorist propaganda quickly and at scale, but we know we can do more."
A spokesman for YouTube added that the site "shared the European Commission's desire to react rapidly to terrorist content and keep violent extremism off our platforms".
"That is why we have invested heavily in people, technology and collaboration with other technology companies in these efforts."
Internet platforms will be required to develop new methods to police content, but it is not clear what form these would take.
"We have strong and targeted tools to win this battle," said Justice Commissioner Vera Jourova.
While companies such as Google increasingly rely on machine learning to root out such material, they also need large numbers of human moderators to spot extremist content.
The Commission will maintain its voluntary code of conduct on hate speech, agreed with Facebook, Microsoft, Twitter, and YouTube in 2016.
What is the scale of the problem and what is being done?
In 2017, Google said it would dedicate more than 10,000 staff to rooting out violent extremist content on YouTube.
YouTube said its staff had reviewed nearly two million videos for violent extremism between June and December 2017.
YouTube said that more than 98% of such material was flagged automatically, and more than 50% of the videos removed had fewer than 10 views.
Members of the industry have worked together since 2015 to create a database of "fingerprints" of previously identified content, in order to better detect extremist material. As of December 2017, it contained more than 40,000 such "hashes".
In 2017, Facebook said that 99% of all Islamic State and al-Qaeda-related content was removed before users had flagged it. The social network said that 83% of the remaining content was identified and removed within an hour.
Between August 2015 and December 2017, Twitter said it had suspended more than 1.2 million accounts in its fight to stop the spread of extremist propaganda. It said 93% of these were flagged by internal tools, with 74% suspended before their first tweet.