In recent months, YouTube has found itself at the center of several scandals involving inappropriate content on the site. Back in November, users began noticing strange and at times overtly disturbing content being generated for children. The same month, an investigation was launched when videos that appeared to be child exploitation appeared on the site. The disturbing content, along with the accompanying obscene comments, led many brands to immediately pull ads. Several weeks ago, vlogger Logan Paul published a video of Japan’s Aokigahara forest, in which he explicitly filmed the body of a person who had taken their own life. YouTube responded by removing Paul from their ad program and putting projects with him on hold, though his main channel is still active.
In the wake of these events, the platform announced on Wednesday that several major changes to the eligibility requirements for monetizing content will begin rolling out next month. Starting February 20th, channels will no longer qualify for monetization (meaning that they can be accompanied by Google Preferred ads that generate income for the content creator) unless they have at least 1,000 subscribers and 4,000 hours of total watch time.
YouTube said the changes are intended to give the platform more time to confirm channels are following content guidelines and discourage channels from producing inappropriate content. In a blog post, the company stated the changes “will allow us to significantly improve our ability to identify creators who contribute positively to the community and help drive more ad revenue to them (and away from bad actors).” They continued by alluding to some recent incidents, stating that the “higher standards will also help us prevent potentially inappropriate videos from monetizing which can hurt revenue for everyone.”
Many in the YouTube community have pushed back against the changes, pointing out that the size of a channel doesn’t necessarily dictate how appropriate it is. It does, however, allow the company to streamline their monitoring practices. To allay creator concerns, YouTube also promised to increase the human vetting of videos — as opposed to the content assessment being done by algorithms or AI. Choosing human moderators over artificially intelligent ones could not only help the platform identify more nuanced offenses, but also prevent rule-abiding content from being mischaracterized as inappropriate.
The changes will have consequences, however, particularly for smaller channels. Previously, channels could become a part of YouTube’s Partner Program and receive monetization when they hit 10,000 total public views. The new restrictions will make the path to monetization much longer for those who are just starting out on the platform. It will also remove the potential to generate ad revenue from smaller channels that have, up until now, been able to monetize their content. Those in the YouTube community who have raised concerns argue that the changes will stifle creativity and discourage original content.
In their blog post, YouTube stated that “protecting our creator ecosystem” and stabilizing monetization are their main priorities in 2018. While the changes are meant to facilitate those goals, YouTube has acknowledged the concerns of creators. The platform said they are planning to “schedule conversations with our creators in the months ahead so we can hear your thoughts and ideas and what more we can do to tackle that challenge.”