YouTube says it will begin removing violent content that targets children instead of relying on age-restrictions alone
- YouTube will now remove violent and mature content directed toward kids
- Previously, the company had only age-restricted its content
- It will take into account titles, tags, and descriptions of videos
- After a 30-day grace period, accounts that repeatedly offend will be banned
- The firm’s decision comes on the heels of a recent settlement with the FTC
- YouTube also announced that it will no longer serve kids targeted ads
YouTube will remove any ‘mature’ or ‘violent’ content directed toward children amid mounting pressure to make its platform safer for minors.
According to The Verge, YouTube says it will weed out unsafe content by monitoring video titles, descriptions, and tags and will begin banning offenders following a grace period.
Targeted content will include any material that touches on sex, violence, death or other topics deemed inappropriate for young audiences.
While it may be odd to think that a platform of YouTube’s size hadn’t already been moderating ‘violent’ and ‘mature’ content directed towards kids, The Verge notes that up to this point YouTube has only age-restricted access.
YouTube has made another major change to content geared toward children by choosing to remove videos that contained violence or ‘mature’ themes. File photo
YouTube reportedly announced the change two days ago, quietly, through a YouTube Help community forum.
The platform said it will remove content if and when it’s found, but won’t give out any ‘strikes’ to creators until after a 30-day notice period meant to familiarize users with the policy.
Videos uploaded prior to the rule change, however, will not be given strikes, though they can still be removed.
As a part of the push, YouTube will also be age-restricting other types of content that it fears could be misconstrued as being for kids, such as adult cartoons.
An example, said the platform, would be a cartoon directed toward children that depicts inappropriate subject matter like a character ‘injecting needles.’
On the heels of an undisclosed settlement with the Federal Trade Commission (FTC) over alleged breaches of the Children’s Online Privacy Protection Act (COPPA), YouTube also recently agreed to stop serving targeted ads in kids’ content.
Critics say the use of targeted advertising on children violates laws that prevent companies from collecting data on individuals under 13 years of age without permission from their legal guardians.
Targeted ads use data aggregated from myriad sources in order to promote products based on user preferences, and coupled with a robust audience of children, are critical to YouTube’s business model.
According to a report from Bloomberg, the research firm Loup Ventures estimates that YouTube brings in between $500 million and $750 million annually from children’s content alone.
While YouTube has a separate app for children that doesn’t use targeted ads, it still hosts droves of kids’ content on its main site, where its data-driven advertising practices still apply.
The platform has begun to take kids’ safety more seriously amid pressure from regulators and concerned parents.
The move to end targeted advertising for kids’ content marks another major step for YouTube, which has begun to alter policies amid mounting pressure from regulators.
Earlier this month YouTube confirmed to Bloomberg that it adjusted its algorithm to favor ‘trusted creators’ in July.
According to YouTube creators interviewed by Bloomberg, the tweak has gutted viewership for some, with views dropping by as much as 98 percent while others have noted substantial increases in viewership.
In February, the company killed more than 400 channels amid concerns over child abuse and exploitation.
While preventing targeted ads from reaching children’s content may appease regulators and concerned parents for the time being, actually implementing the strategy may be easier said than done.
To remove the ads, YouTube would first have to determine what constitutes content geared toward kids, and then devise a reliable way to identify and strip the advertising from it.
HOW IS YOUTUBE MAKING ITS KIDS APP SAFER?
YouTube is finally rolling out changes to the privacy settings on its Kids app.
After several issues with the service were reported, it has now begun to roll out updates.
The new features will allow parents to filter content on the app so it only displays channels that have been reviewed by humans rather than algorithms.
Later this year there will be three further updates.
Collections by trusted partners and YouTube Kids staff
YouTube Kids staff will offer collections of trusted channels on a variety of subjects.
This can be done from the Profile Settings, and parents can select from collections such as Sesame Workshop and PBS KIDS.
YouTube will continue to add more partners over time.
Parent approved content
YouTube is rolling out a feature later this year that will allow parents to specifically handpick every video and channel available to their child in the Kids app.
Improved search-off control
Starting this week, turning search off will limit the YouTube Kids experience to channels that have been verified by the YouTube Kids team.