YouTube’s New Tools Attempt to Address Online Harassment

Online harassment and abuse can take many forms. Threatening and hateful comments turn up across online communities, from newspapers to blogs to social media. Anyone posting online can be the target of these comments, which cross the line from honest disagreement to vengeful and violent attacks. This behavior is more than someone saying something you don’t like or saying something “mean” – it often includes ongoing harassment that is nasty, personal, or threatening in nature. For survivors of abuse, threatening comments can be traumatizing and frightening, and can lead some people to stop participating in online spaces.

YouTube recently created new tools to combat online abuse in comments. These tools let users who post on the site choose words or phrases to “blacklist,” and offer a beta (or test) version of a filter that will flag potentially inappropriate comments. With both tools, flagged comments are held for the user’s approval before going public. Users can also select other people to help moderate the comments.

Here’s a summary of the tools, pulled from YouTube:

  • Choose Moderators: This was launched earlier in the year and allows users to give select people they trust the ability to remove public comments.
  • Blacklist Words and Phrases: Users can have comments with select words or phrases held back from being posted until they are approved.
  • Hold Potentially Inappropriate Comments for Review: Currently available in beta, this feature offers an automated system that, according to YouTube’s algorithm, will flag and hold any potentially inappropriate comments for approval before they are published. The algorithm may, of course, pull content that the user thinks is fine, but its detection will improve based on the user’s choices.

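The blacklist tool described above boils down to a simple idea: check each incoming comment against a list of words and phrases, and hold any match for approval instead of publishing it. A minimal sketch of that idea (not YouTube’s actual code; the function and variable names here are illustrative assumptions):

```python
def moderate(comment, blacklist):
    """Return 'held' if the comment contains any blacklisted word or
    phrase (case-insensitive), otherwise 'published'."""
    text = comment.lower()
    if any(term.lower() in text for term in blacklist):
        return "held"
    return "published"

# Comments matching a blacklisted phrase go to a review queue
# instead of appearing publicly.
print(moderate("You should watch your back", ["watch your back"]))  # held
print(moderate("Great video, thanks!", ["watch your back"]))        # published
```

In YouTube’s version, a “held” comment waits in the channel owner’s review queue, where the owner or a chosen moderator can approve or delete it.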
Survivors who post online know that abusive comments can come in by the hundreds or even thousands. While many sites have offered a way to report or block comments, those steps have only been available after a comment is already public, and each comment may have to be reported one by one. This new approach helps catch abusive comments before they go live, and takes the pressure off having to watch the comment feed 24 hours a day.

These tools also offer survivors a way to be proactive in protecting their information and safety. Since online harassment often includes tactics such as doxing (where someone’s personal information is posted online with the goal of causing them harm), a YouTube user can add their personal information to the list of words and phrases that are not allowed to be posted. This can include part or all of phone numbers, addresses, email addresses, or usernames of other accounts. Being able to proactively block someone from posting your personal information in this space is a powerful tool.
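Because the blacklist matches fragments as well as whole phrases, a user can list pieces of their personal information so that any comment containing them is held before it ever appears. A conceptual sketch, assuming a simple substring check (the sample phone number, email, and address below are made-up examples, not real data):

```python
# Illustrative blacklist entries: fragments of personal information
# a user might add to keep doxing attempts from being published.
personal_info = ["555-0142", "jane.doe@example.com", "123 Main St"]

def contains_personal_info(comment, fragments):
    """True if the comment contains any listed fragment, ignoring case."""
    text = comment.lower()
    return any(fragment.lower() in text for fragment in fragments)

print(contains_personal_info("Her number is 555-0142", personal_info))  # True
print(contains_personal_info("Nice video!", personal_info))             # False
```

Listing only a distinctive part of a phone number or address (rather than the full value) is enough to catch a match while keeping the blacklist itself from spelling out the complete information.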

Everyone has the right to express themselves safely online, and survivors should be able to fully participate in online spaces. Connecting with family and friends online helps protect against the isolation that many survivors experience. These new tools can help to protect survivors’ voices online.