YouTube's first quarterly enforcement report reveals the site deleted 8.3 million videos between October and December 2017 for violating its community guidelines.
The figure does not include videos removed for copyright or legal reasons.
Sexually explicit videos attracted 9.1 million reports from users of the site, while 4.7 million were reported for hate speech or offensive content.
The majority of complaints came from India, the United States and Brazil.
YouTube said its algorithms had flagged 6.7 million videos that were then sent to human moderators and deleted.
Of these, 76% had not been viewed by anyone on YouTube other than the moderators.
The company told the BBC it stores "digital fingerprint" data for deleted videos so that it can immediately detect if someone uploads the same video again.
In March, YouTube was criticized for its failure to remove four propaganda videos posted by the banned UK neo-Nazi group National Action.
Giving evidence to the UK Home Affairs Committee, the company's counter-terrorism chief, William McCants, blamed human error for the delay in removing the videos.
But Yvette Cooper MP said the evidence given was "disappointing", "weak" and "a failure to even do the basics".
The company has also been criticized for using algorithms to curate its YouTube Kids app. Inappropriate videos have repeatedly slipped through the net and appeared on YouTube Kids.
The report does not reveal how many inappropriate videos were reported on or removed from YouTube Kids.
YouTube also announced the addition of a "reporting dashboard" to users' accounts, where they can see the status of all the videos they have flagged as inappropriate.
[Chart: Top 10 countries flagging videos]