Google will have more than 10,000 staff working to root out violent extremist content on YouTube in 2018, the video-sharing website's chief has said.
Writing in the Daily Telegraph, Susan Wojcicki said that some users were exploiting YouTube to "deceive, manipulate, harass or even harm".
She said the website, owned by Google, had deployed machine-learning technology that could identify extremist videos.
More than 150,000 of these videos have been removed since June, she said.
In March, the UK government suspended its adverts from YouTube, following concerns that they were appearing next to inappropriate content.
And in a speech to the United Nations General Assembly in September, the British Prime Minister, Theresa May, challenged tech companies to take down terrorist material within two hours.
The prime minister has repeatedly called for an end to the "safe spaces" she says terrorists enjoy online.
Ms Wojcicki said staff had reviewed nearly two million videos for violent extremist content since June.
This has helped train the company's machine-learning technology to identify similar videos, which staff can now use to remove nearly five times as many videos as they could previously, she said.
She said the company was taking "decisive action" on comments, using technology to help staff find and shut down hundreds of accounts and hundreds of thousands of comments.
And its teams "work closely with child safety organisations around the world to report predatory behaviour and accounts to the relevant law enforcement agencies", she added.
Meanwhile, police in the UK have warned that sex offenders are increasingly using live streaming on online platforms to exploit children.

Google committed to fighting terror videos
Earlier this year, Google announced that it would give a total of £1m ($1.3m) to fund projects that help combat extremism in the UK.
And in June, YouTube announced four new steps it was taking to combat extremist content:
Improving its use of machine learning to remove controversial videos
Working with 15 new expert groups, including the Anti-Defamation League, the No Hate Speech Movement, and the Institute for Strategic Dialogue
Making it harder to find videos that are not illegal but have been flagged by users as breaching its policies on hate speech and violent extremism
Redirecting people who search for certain keywords to a playlist of curated YouTube videos that directly confront and debunk violent extremist messages