Like users of any other online network, YouTube creators have long dealt with their fair share of trolls. And while updated policies against harassment were a positive step, if you ever dived into the comments, it was clear this was not enough. Creators could delete comments and later block users, but they had no way of ensuring that hateful words wouldn't appear in the first place.
Now, the video network is putting control back in the hands of its users. YouTube has announced a significant new beta feature that lets creators see and reject inappropriate comments, based on the language in the posts, before they appear publicly. You can choose words or phrases you want blacklisted, and comments containing them will be held for review by you or a moderator you've delegated.
"If you choose to opt-in, comments identified by our algorithm will be held and you have the final decision whether to approve, hide, or report these comments," said the YouTube blog post. "We recognise that the algorithms will not always be accurate: The beta feature may hold some comments you deem fine for approval, or may not catch comments you’d like to hold and remove."
The more feedback you give, the smarter the algorithm will become. Other new features include the ability to pin comments to the top of your feed, in the same way you can on Twitter and Facebook, as well as the option to "heart" your favourite comments.
But it's really the moderation tool that we're most excited about. Together with recent changes from Facebook and Instagram, YouTube's actions are part of a critical larger movement to fight online harassment. And while all of these recent changes are ones that should have happened years ago, they're no less important now.
Now we're just waiting on you, Twitter.