Is YouTube Becoming a One-eyed Monster to Our Kids?

(photo: https://pixabay.com/en/social-media-interaction-abstract-1233873/)
By Faith Magbanua | November 13th, 2017

Recently, YouTube announced that it will restrict the availability of videos showing children's characters in violent or sexual scenes if they are reported by viewers.

The multi-million-dollar company has announced a clampdown on disturbing and inappropriate children's videos, following accusations that the site enabled "infrastructural violence" through the long-run effects of its content recommendation system.

Last week, a blog post by writer James Bridle highlighted how YouTube was still being swamped by bizarre and indecent videos aimed at children.

The site says it already stops such videos from earning advertising revenue.

YouTube said its team was "made up of parents who are committed to improving our apps and getting this right".

But critics say YouTube is not taking enough action, because it acts only after viewers report inappropriate videos.

YOUTUBE'S NEW POLICY

The new policy, announced on Thursday evening, will see age restrictions applied to content featuring "inappropriate use of family entertainment characters", such as unofficial videos depicting Peppa Pig "basically tortured" at the dentist. The company already had a policy that rendered such videos ineligible for advertising revenue, in the hope that doing so would reduce the motivation to create them in the first place.

"Earlier this year, we updated our policies to make content featuring inappropriate use of family entertainment characters ineligible for monetization," said Juniper Downs, YouTube's director of policy. "We're in the process of implementing a new policy that age restricts this content in the YouTube main app when flagged. Age-restricted content is automatically not allowed in YouTube Kids. The YouTube team is made up of parents who are committed to improving our apps and getting this right."

The problem of video-makers using popular characters such as Peppa Pig in violent or sexual videos, to frighten children, has been widely reported.

However, Bridle's blog post went deeper, into what he called the rabbit hole of children's content on YouTube.

Citing examples, he described videos aimed at children that were not necessarily violent or sexual but were sinister, "creepy" or otherwise inappropriate, and that remain active on the site.

Often, it appeared that the videos had been algorithmically generated to capitalize on popular trends.

"Stock animations, audio tracks, and lists of keywords being assembled in their thousands to produce an endless stream of videos," he said.

Many used popular family entertainment characters, such as Spider-Man and Elsa from Frozen, to attract viewers, and they had been watched millions of times.

"Someone or something or some combination of people and things is using YouTube to systematically frighten, traumatize, and abuse children, automatically and at scale," he wrote.

Although the site bars such videos from monetization, many of them are never reported by viewers and continue to carry advertisements.

YouTube has now said it will give such videos an age restriction if they are reported by viewers, so that they cannot be watched by people under 18. Age-restricted videos are also blocked from appearing in the YouTube Kids app, which is curated primarily by algorithms.

In addition, such videos cannot be viewed on the YouTube website unless viewers are logged in with an adult's account.

However, a report in the New York Times found that inappropriate videos have previously slipped through the net.

YouTube says it uses human reviewers to evaluate whether flagged videos are appropriate for a family audience.

In his blog post, Bridle said he did not know how YouTube could stamp out the problem.

"We have built a world which operates at scale, where human oversight is simply impossible, and no manner of inhuman oversight will counter most of the examples I've used in this essay," he said.
