May calls for crackdown on online terror content

Investors should put pressure on tech giants to make them respond more quickly to extremist content on social networks, the prime minister has said.

Theresa May told the World Economic Forum in Davos that investors should consider the social impact of the companies they hold a stake in.

Social networks must not continue to provide a platform for terrorism, extremism and child abuse, she stressed.

Such content should be "automatically deleted", Ms May added.

"Earlier this month, a group of shareholders demanded that Facebook and Twitter disclose more information about sexual harassment, false information, hate speech and other forms of abuse that take place on the companies' platforms," she said.

"Investors can make a big difference here by ensuring trust and safety are properly taken into account – and I urge them to do so."

Ms May told the BBC in Davos that, although technology companies had already worked with the government, much remained to be done.

"Tech companies can be a tremendous force for good in many ways, but we also need to ensure that we are looking at the ways in which the internet and technology can be used by those who want to do us harm," she said.


The prime minister also wants to see the technology industry's biggest firms working with start-ups to tackle the issue. Smaller platforms, such as the privacy-focused encrypted messaging app Telegram, are often used by terrorists, criminals and paedophiles.

"These companies have some of the best brains in the world," Ms May said. "They need to focus their brightest and best on meeting these fundamental social responsibilities."

Telegram has already said that it is "no friend of terrorists" and has blocked channels used by extremists.

Artificial intelligence

Last year, Facebook announced several measures to improve the detection of unlawful content on its network, including the use of artificial intelligence to spot images, videos and text related to terrorism, as well as clusters of fake accounts.

In November, the social network said that 99% of the al-Qaeda and so-called Islamic State material it now removes is first detected by the company itself rather than flagged by its users.

However, Facebook admitted that more work was needed to identify content from other terror and extremist groups.


"Tech companies may not always agree with the government on ways and means, but there is no disagreement on the objective of making online platforms hostile environments for illegal and inappropriate content," said Julian David, chief executive of technology trade association techUK.

"A lot has already been done by working in partnership with the government, and tech companies are committed to continuing that work to ensure the safety and security of their users."