Google, Facebook and Twitter must remove extremist content within an hour or face hefty fines, the European Commission president has said.
In his annual State of the Union address to the European Parliament, Jean-Claude Juncker said an hour was a “critical time window”.
Internet businesses were given three months in March to show that they would act more quickly to take down radical posts.
But the EU said too little was being done.
Under the EU proposal, if the authorities of member states report content that incites or advocates extremism, it must be removed from the web in less than an hour. Internet businesses that fail to comply would face fines of up to 4% of their global turnover.
The proposal will need the support of EU member states and the European Parliament.
In response to the plans, Facebook said: “There is no place for terrorism on Facebook, and we share the European Commission’s objective of fighting it, and believe that it is only through a joint effort between businesses, civil society and institutions that results can be achieved.
“We have made significant progress in finding and removing terrorist propaganda quickly and at scale, but we know that we can do more.”
A spokesman for YouTube said the site shared the European Commission’s desire to react rapidly to terrorist content and to keep violent extremism off its platform.
“This is why we have invested heavily in people, technology, and collaboration with other technology companies on these efforts.”
Internet platforms will need to develop new methods of policing content, but it is not yet clear what form these could take.
“We need strong and targeted tools to win this fight online,” said Justice Commissioner Vera Jourova.
While companies like Google increasingly rely on machine learning to root out problem material, they still need large numbers of human moderators to spot extremist content.
The proposal builds on the Commission’s voluntary code of conduct on hate speech, agreed with Facebook, Microsoft, Twitter and YouTube in 2016.

What is the magnitude of the problem and what is being done?
In 2017, Google said it would dedicate more than 10,000 employees to rooting out violent extremist content on YouTube.
YouTube said that between June and December 2017, those employees had reviewed nearly two million videos for violent extremism.
YouTube said more than 98% of that material was flagged automatically, and over 50% of the deleted videos had fewer than 10 views.
Industry members have been working together since 2015 to create a database of “digital fingerprints” of already-identified content to improve the detection of extremist material. In December 2017, it contained more than 40,000 “hashes”.
In 2017, Facebook said that 99% of all Islamic State and al-Qaeda-linked content was removed before users had flagged it. The social network said that 83% of the remaining content was identified and removed within an hour.
Between August 2015 and December 2017, Twitter said it had suspended more than 1.2 million accounts in its fight to stop the spread of extremist propaganda. It said that 93% were flagged by internal tools, with 74% suspended before their first tweet.