Facebook’s Proactive Approach to Addressing Nonconsensual Distribution of Intimate Images

It’s well-known that technology has made sharing sexually intimate content easier. While many people share intimate images without any problems, there is a growing problem of nonconsensual distribution of intimate images (NCII[1]), often referred to as “revenge porn.” Perpetrators often share, or threaten to share, intimate images in an effort to control, intimidate, coerce, shame, or humiliate others. A survivor who has been threatened or victimized by someone sharing their intimate images not only deserves the opportunity to hold the perpetrator accountable, but should also have better options for removing content or keeping it from being posted in the first place.

Recently, Facebook announced a new pilot project aimed at stopping NCII before it can be uploaded to their platforms. The process gives people who wish to participate the option to submit intimate images or videos they’re concerned someone will share without their permission to a small, select group of specially trained professionals within Facebook. Once submitted, each image is “hashed,” meaning it is turned into a digital code that serves as a unique identifier, similar to a fingerprint. Facebook then deletes the image, and all that’s left is the code. That code lets Facebook recognize when someone attempts to upload the image and prevent it from being posted on Facebook, Messenger, and Instagram.
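Facebook hasn’t published its implementation details, and production image-matching systems typically use perceptual hashes (which survive resizing and re-encoding) rather than cryptographic ones. Still, the core “fingerprint” idea can be sketched in a few lines of Python, with SHA-256 standing in for whatever hash Facebook actually uses:

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Turn image bytes into a fixed-length code (a 'hash').

    SHA-256 is a stand-in here: it only matches byte-for-byte copies.
    A real NCII system would use a perceptual hash so that resized or
    re-encoded copies of the same image still match.
    """
    return hashlib.sha256(image_bytes).hexdigest()

image = b"...image data..."   # stand-in for real image bytes
code = fingerprint(image)
del image                     # the image can be discarded;
print(code)                   # only the 64-character code remains
```

Once the hash is stored, the original image is no longer needed: identical copies of the image will always produce the same code, which is what makes later matching possible.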

Facebook’s new pilot project may not be something everyone feels comfortable using, but for some it may bring much peace of mind. For those who believe it may help in their situation, we’ve outlined detailed information about how the process works:

  1. Victims work with a trusted partner. Individuals who believe they’re at risk of NCII and wish to have their images hashed should first contact one of Facebook’s trusted partners: the Cyber Civil Rights Initiative, YWCA Canada, UK Revenge Porn Hotline, and the eSafety Commissioner in Australia. These partners will help them through the process and identify other assistance that may be useful to them.
  2. Partner organizations help ensure appropriate use. The partner organization will carefully discuss the individual’s situation with them before helping them start the hashing process. This helps ensure that individuals are seeking to protect their own image and not trying to misuse the feature against another person. It’s important to note that the feature is meant for adults and not for images of people under 18. If the images are of someone under 18, they will be reported to the National Center for Missing and Exploited Children. Partner organizations will help to explain the reporting process so that individuals can make appropriate decisions for their own case.
  3. The image will be reviewed by trained staff at Facebook. If the images meet Facebook’s definition of NCII, a one-time link is sent to the individual’s email. The link takes the individual to a portal where they can directly upload the images. All submissions are then added to a secure review queue, where they are reviewed by a small team specifically trained in reviewing content related to NCII abuse.
  4. NCII will be hashed and deleted. All images that are reviewed and found to meet Facebook’s definition of NCII will be translated into a set of numerical values to create a code called a “hash.” The actual image will then be deleted. If Facebook reviews an image and determines it does not meet their definition of NCII, the individual will receive an email letting them know (so it’s critical to use an email account that no one else can access). Even then, the individual may still have other options; for example, they may be able to report the image for a violation of Facebook’s Community Standards.
  5. Hashed images will be blocked. If someone tries to upload a copy of the original image that was hashed, Facebook will block the upload and show a pop-up message notifying the person that their attempted upload violates Facebook’s policies.
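Put together, steps 4 and 5 amount to keeping a blocklist of hashes and checking every new upload against it. The sketch below is a hypothetical, simplified version in Python (the function names and the in-memory set are illustrative, and an exact-match SHA-256 check stands in where a real system would use perceptual hashing to catch near-duplicate copies):

```python
import hashlib

# Hypothetical blocklist: only hashes are stored, never the images.
blocked_hashes: set[str] = set()

def register_ncii(image_bytes: bytes) -> None:
    """Step 4: hash a reviewed NCII image, then discard the image."""
    blocked_hashes.add(hashlib.sha256(image_bytes).hexdigest())

def allow_upload(image_bytes: bytes) -> bool:
    """Step 5: block any upload whose hash matches a registered image."""
    return hashlib.sha256(image_bytes).hexdigest() not in blocked_hashes

register_ncii(b"reported image bytes")
print(allow_upload(b"reported image bytes"))  # False -> upload blocked
print(allow_upload(b"some other image"))      # True  -> upload allowed
```

The design matters for privacy: because only hashes are kept, a breach of the blocklist would expose codes, not images, and the codes cannot be reversed back into the pictures.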

This proactive approach has been requested by many victims, and may be appropriate on a case-by-case basis. People who believe they’re at risk of exposure and are considering this process as an option should carefully discuss their situation with one of Facebook’s partner organizations. This will help them make sure they’re fully informed about the process so that they can feel empowered to decide if this is something that’s appropriate for their unique circumstances.  

For more information about how survivors can increase their privacy and safety on Facebook, check out our Facebook Privacy & Safety Guide for Survivors of Abuse.


 

[1] NCII refers to private, sexual content that a perpetrator shares publicly or sends to other individuals without the consent of the victim. How we discuss an issue is essential to resolving it. The term “revenge porn” is misleading, because it suggests that a person shared the intimate images as a reaction to a victim’s behavior.

Cambridge Analytica and Why Privacy Matters to Survivors

Recent news that the personal information of tens of millions of people was used by Cambridge Analytica “to create algorithms aimed at ‘breaking’ American democracy,” as the New Yorker phrases it, has led to a call to #DeleteFacebook. For those unfamiliar with the story, our friends at AccessNow wrote a great summary.

This kind of invasion of privacy is not new, nor is it limited to this case. The old expression “there’s no free lunch” applies to any service we don’t pay for, whether it’s social media, a discount card at the grocery store, or a raffle to win a new car. The true cost is allowing those companies to access our personal information for their own profit.

Safety is the primary concern. For survivors who face threats of harm, who live daily in fear of their abusers, the security of personal information can be a life-and-death issue. For survivors fleeing an abuser, information about location, work, kids’ schools, and social connections can lead an abuser to their doorstep. For survivors living with abuse, information about friends, thoughts, feelings, opinions, and interests can be misused by an abuser to control, isolate, or humiliate.

For survivors, privacy is not an abstract issue or a theoretical right to be debated on C-SPAN. Privacy is essential to safety, to dignity, to independence. Yet we live in a time when personal information equals profit.

The Cambridge Analytica story surfaces the underlying reality that our personal information is not under our control. It feels like we are seldom asked for consent to share our personal data. When we are, it is in legalese, in tiny letters that we might have to scroll through to be able to check that box, and get on with using whatever website we’re trying to use. Even if we do take the time to read through those privacy terms, we know that data is routinely stolen, or accidentally published on the Internet, or used against us to affect access to loans, insurance, employment, and services.

We are social animals. We crave connection. Research shows that we suffer without it. Isolation is a classic tactic of abuse. But the price we too often pay for connection online is our privacy.

At times like these, we may think about deleting Facebook, going offline, or throwing away our phones. We may think that survivors should give up their tech at the door of our shelters, or that they have to go off the grid in order to be safe.

Digital exile is not the answer. Technology and the Internet form a public space where everyone, including survivors, should have the right to share their voices, make connections, and access information without fear of their personal information being collected and used without their consent. April Glaser writes in Slate that “[d]eleting Facebook is a privilege,” pointing to the huge number of people who rely on it to connect with friends, to learn about events, to promote a business, or, in parts of the world with limited Internet access, just to be online at all.

Survivors, just like every other consumer, should be given the opportunity to give truly informed consent. That consent must be based on clear, simple, meaningful, understandable privacy policies and practices – not just a check box that no one pays attention to.

A guide to the process of changing your Facebook settings to control apps’ access to your data is available from the Electronic Frontier Foundation. Also check out our own guides to Online Privacy and Facebook Privacy and Safety.

New & Updated Resources on Facebook Privacy & Safety

We recently had the exciting opportunity to collaborate with Facebook on their international roundtables on Women’s Online Safety and participated in three of these events: in Washington, DC; Hyderabad, India; and New York City. The roundtables featured leading voices from many of the nation’s gender-based violence (GBV) organizations as well as government representatives from various countries.

The roundtables were devised to create space for GBV organizations to contribute to the broader conversation on how Facebook in particular can engage the voices of women and create a safer environment for women to use the platform without fear of harassment and threats.  The goals of the roundtables were:

  1. To share existing Facebook tools women can use to help with privacy and safety.
  2. To share innovations Facebook is currently working on to improve the user experience. 
  3. To hear concerns from the field on what users are experiencing. 
  4. To create a network for GBV organizations to foster continuous conversations and provide a support structure for women users. 

The roundtables included conversations around Facebook’s Real Name Policy. Facebook has strongly backed this long-standing policy, which requires users to be authentically identified by their real names. The policy makes it harder for abusers and perpetrators to hide behind fake accounts and increases the likelihood that those misusing the platform to harass, threaten, or stalk a person can be held accountable. The policy has received some push-back, however, and Facebook addressed the various steps they have taken to allow some flexibility for individuals who go by a name in everyday life that differs from their legal name.

All of the meetings discussed counter speech, which is used to combat negative comments posted on an account. By using counter speech, users can ask their audiences to post positive comments and help manage some of the negative, threatening, and harassing comments they are receiving.

During the roundtables, Facebook and Safety Net introduced the new Guide to Staying Safe on Facebook. This guide is a condensed version of the Privacy & Safety on Facebook: A Guide for Survivors of Abuse, providing short and concise tips on privacy and safety settings. Both resources can be found in our Privacy & Safety on Facebook page of the blog.

The roundtables were an incredible success. We appreciate the opportunity Facebook provided for global GBV organizations to convene and share their concerns. We will continue to foster collaborations between technology companies, government organizations, and non-profits to help eradicate violence against women in all forms, including in online spaces. To learn more about the roundtables and all of the great topics discussed, visit #HerVoice. Also, check out our video series on Facebook Privacy, Security and Safety!

Facebook Removes Search By Name Option

 

Last week, Facebook announced that they were removing the “Who Can Look Up My Timeline By Name” option for their users. Since then we have been contacted by many concerned advocates about what removing this feature means for survivors, many of whom use Facebook to stay connected with friends and family but whose privacy from their abusers and stalkers is equally important.

When Facebook first told us they were planning to make this change, we expressed that this feature is one method some survivors use to control their privacy. Opting out of being searchable by name was one way survivors could keep an abuser or stalker from finding their timeline/account.

However, Facebook explained, and we agree (because we’ve known this for a while too), that this feature gave a false sense of privacy: even with it activated, people could still be found in other ways. Some of those ways include:

  • Mutual friends. Unless you choose not to allow mutual friends to see your activity, you can be found through them. Even when survivors have chosen not to let friends of friends see their activity, we have heard of many cases where mutual friends simply shared the information with an abuser or other people.
  • Username/user ID. If someone knows your exact username or user ID, they can find you that way.
  • Graph Search. Graph Search is a new search option that Facebook has been slowly rolling out, and it makes anyone searchable, even people who have chosen not to be found by name. Rather than searching on personal details, Graph Search surfaces users based on things they like, things their friends like, and other demographic information about them that is public. So, for example, if you like a particular restaurant and live in Albuquerque, NM, someone can search for “People who like [restaurant] in [city]” and find all the people who have liked it.

Although we are disappointed that the option to be searched by name has been removed, the safest course for survivors and advocates is to educate themselves about how they can be found on Facebook regardless of privacy settings. Users should know what kinds of information will always be public, understand how widely information can be shared online, and determine what they will share based on their own privacy risks. The reality is that social media has always moved, and always will move, toward a model of sharing and openness; even if something is private now, it may not always be.

In light of that, it is important to know that the following activities/information will always be public on Facebook:

  • Your name, profile picture, your cover photo, your username and user ID, and any networks you belong to.
  • Any public pictures or posts you like or comment on. For example, if you like or comment on a picture or post that the original author set to public, your like or comment will be public.

There are a few things that survivors can do to maximize their privacy.

  • Check out the “view as” option to see what someone sees when they look at your page, whether as a friend, a friend of a friend, or the public.
  • Review your timeline by going back through previous posts and changing who can see them. You can even delete old posts.
  • Going forward, limit what you share by choosing only friends. You can go further and create lists that limit exactly who sees the specific information you are sharing.
  • Take a look at Safety Net’s handout on Facebook Privacy for more privacy tips. 

As Facebook continues to change their privacy settings and introduce new features, it is critical that survivors and advocates understand those changes and how they affect the personal information shared on Facebook. Facebook allows users to delete old posts or pictures, so it might be time to do your own Facebook audit and clean up your timeline.

 

Act. Speak. Make a Difference.

 

Last week was a busy week for the Safety Net team. At the beginning of the week, Erica Olsen, Stephen Montagna, and I were in Little Rock, AR, co-hosting The Use of Technology in Intimate Partner Stalking Conference with the Stalking Resource Center. The three of us provided trainings on phones and their misuse and on how internet privacy and safety can be compromised, including a 3-hour presentation focused specifically on social media, for about 50 victim service advocates, law enforcement officers, and prosecutors. Visit the links below for tips and handouts on these topics.

On Thursday & Friday of last week, Cindy Southworth and I attended Facebook’s Safety Advisory Board meeting. Each year, Facebook brings together their Safety Advisory Board to talk about upcoming products and how they can ensure that their services are safe for teens and users who might be stalked or harassed on Facebook. 

Kaofeng Lee and Cindy Southworth at Facebook HQ in Menlo Park, CA.

I’m always struck at how passionate Facebook staff is about connecting people. Facebook, at its core, is about connections. That’s why we urge victim advocates and service providers to NOT tell survivors to just get off Facebook. We know how important that community can be for many survivors. Doing this work, we are focused (and rightly so) on how social media is misused to stalk and harass. We are constantly having conversations about how survivors can find resources, justice, and peace of mind and spirit when they are subject to abuse and control by abusers and stalkers. 

Yet, let’s take a moment and focus on Facebook and social media as a medium of connection and communication. How can we use social media to shift how we, as a society, talk about issues of domestic violence, sexual violence, and other crimes of abuse, harassment, and stalking? How can we use these spaces to change the way we talk about these issues, the way we think about these issues, and ultimately change the way people understand and perceive gender roles, relationships, and violence? Let us know what you think in the comments below.

In less than a week, it’ll be October, Domestic Violence Awareness Month. What are you doing to raise awareness about domestic violence? Here at the National Network to End Domestic Violence, this year’s 31n31 campaign will be focused on actions: 31 actions you can take to make a difference. We’re also starting a book club on Goodreads to discuss issues of domestic violence in modern literature. Follow this blog and our Facebook page for other activities you can take part in.

Take action with us. Join our book club and talk to us (and each other) about this issue. Let us know what you think we can do to use social media to change the way we talk and understand domestic violence.

PS…To receive updates of new blog posts, click on techsafety.org RSS (in the left navigation toolbar on this page) and sign up.