Early last fall, Instagram committed to making the platform a safe place for its users. Recently, they’ve taken new steps toward fulfilling that commitment by launching several additional safety tools. These include:
· Instagram Together, a new safety center that catalogues all of the safety tools available to Instagram users and lists international resources to support people’s safety. (And we’re thrilled to say that techsafety.org is listed among them!)
· Two-factor authentication is now available to all users, adding an extra layer of security that helps keep your account safe even if your password is stolen.
· Sensitive Content Screens will now blur images and videos that have been flagged by users (and verified by Instagram’s review team) as sensitive in nature. These are images and videos that don’t violate Instagram’s guidelines, but that some users may find offensive or disturbing. As we know, online harassment sometimes takes the form of people mis-flagging a victim’s photos in an effort to prevent them from engaging effectively on social media. We spoke with Instagram about how they guard against that kind of misuse of the Sensitive Content Screens, and were told that a screen goes up only when content does not violate their community guidelines but contains graphic or violent material, such as images of animal abuse or of the impact of war on local communities. Only Instagram can place a screen over a photo, and the number of times a post is flagged will not influence that decision, so if someone tries to troll a victim by mis-flagging their photos, the effort will be ineffective.
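For the technically curious: Instagram hasn’t published how its two-factor codes are generated, and at launch the feature relied on codes sent by text message. Authenticator apps, though, typically follow the TOTP standard (RFC 6238), which shows why a stolen password alone isn’t enough: each six-digit code is derived from a shared secret the attacker doesn’t have and changes every 30 seconds. A minimal sketch, using only the Python standard library:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, now=None, interval=30, digits=6):
    """Return a time-based one-time password (RFC 6238, HMAC-SHA1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # Number of 30-second intervals since the Unix epoch
    counter = int(now if now is not None else time.time()) // interval
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    # Dynamic truncation (RFC 4226): pick 4 bytes based on the last nibble
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test secret ("12345678901234567890" encoded in base32)
SECRET = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
print(totp(SECRET, now=59))  # matches the RFC test vector, truncated to 6 digits
```

The login server runs the same computation with its copy of the secret and accepts the code only if it matches, so intercepting one code is useless half a minute later.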
We’re pleased to see Instagram work to make their platform a safer place for survivors of harassment and abuse, and look forward to seeing what’s next in their efforts to fulfill their commitment to kindness!