YouTube has given additional privileges to 200 “super flaggers”, who can flag up to 20 videos at a time for removal, with government sources among those able to suggest that videos be taken down.
Google will let these trusted flaggers identify videos considered “extremist” or in breach of YouTube’s rules on offensive content. It is understood that just under 10 of the flaggers are government agencies or non-governmental organisations, such as anti-hate or child-safety groups, according to a person familiar with the programme.
The Wall Street Journal adds that Google retains the final say on whether a video is removed, and the company has rejected claims that government agencies can take down YouTube content without its prior approval.
“Any suggestion that a government or any other group can use these flagging tools to remove YouTube content themselves is wrong,” a Google spokesman said.
Google’s current guidelines on offensive content prohibit videos that incite violence, as well as those depicting animal abuse, drug abuse, underage drinking, bomb-making and various other inflammatory topics.
The same source stated that more than 90 per cent of the videos identified by super flaggers are eventually removed from the site for breaching guidelines, or are flagged as inappropriate for a younger audience. That success rate is noticeably higher than that of regular users, who flag content only occasionally.
According to the person familiar with the plans, the UK counter-terrorism unit put no pressure on Google to start the programme; instead, the agency took an interest in YouTube’s guidelines and identified videos that violated them.
The Metropolitan Police has already used its “super flagger” authority to have videos reviewed and removed if they are deemed “extremist”, and more government agencies could adopt this approach to fighting terrorism in future.