Tech Abuse: Information from the Field

Survey Findings from the Conference on Crimes Against Women

We are happy to announce the summary of our short survey, Tech Abuse: Information from the Field: Survey Findings from the Conference on Crimes Against Women. The survey allowed the Safety Net team to gather information from the field to better guide the work that we do.

For more information on this survey and its findings, check out Tech Abuse: Information from the Field: Survey Findings from the Conference on Crimes Against Women.


This survey was conducted by the National Network to End Domestic Violence and funded under the Technology, Abuse, and Safety Project awarded by the Office for Victims of Crime, Office of Justice Programs, U.S. Department of Justice (2016-TA-AX-K069).

Tech Summit 2018 Recap

HIGHLIGHTS FROM TECH SUMMIT 2018

The month of July is always jam-packed for the Safety Net team. This past July, we hosted the 6th Annual Technology Summit in San Francisco, California. This year saw more participants, more sessions, and more ways to connect than ever before. We welcomed over 300 staff, victim service providers, law enforcement, trainers, and technology partners to engage, connect, and learn more about the intersections of technology misuse and intimate partner violence.

Here are some of the highlights from this year’s spectacular summit!

1.   “Technology isn’t the problem, abuse is!” - Our very own Erica Olsen, Director of the Safety Net project, opened the week with foundational principles. She centered the training by reminding participants that we need to hold perpetrators accountable while also allowing survivors to choose the best options for themselves during the tech safety planning process.

2.   “Technology is often misused, but technology can also empower survivors” - Malika Saada Saar, Google - Our 2018 tech summit speakers and presenters left us feeling empowered and energized to continue this work. We had representatives from Uber, Facebook, Google, law enforcement, and many other phenomenal presenters who shared not only their knowledge and expertise but also their own stories and the ways they work to end gender-based violence. We couldn’t have done this without them.

3.   “Technology can be used to reach those at the margins and provide them with a safe space” - This year we had sessions that really spoke to the intersections in which many survivors live. We offered sessions on technology and accessibility, working with immigrant survivors, and the impact of technology on LGBTQ survivors. These sessions were an added bonus to our agenda and provided new and innovative approaches for advocates to do this work. Likewise, we held our 3rd annual Women in Technology reception, where technologists and advocates came together to discuss emerging tech.

4.    “Lots of work, but lots of fun” - #TechSummit18 wasn’t all work; we had fun with our participants, too. From live polling and tech-themed coloring pages to our daily prize drawings and, of course, karaoke and trivia, this year we engaged with participants in ways we haven’t in the past. We were able to enjoy each other’s company and build connections that will foster new friendships and networking relationships.

We thank all of the participants, speakers, sponsors, and you for making Tech Summit 2018 a huge success. Until July 2019.

Facebook’s Proactive Approach to Addressing Nonconsensual Distribution of Intimate Images

It’s well-known that technology has made sharing sexually intimate content easier. While many people share intimate images without any problems, there’s a growing issue with non-consensual distribution of intimate images (NCII[1]), or what is often referred to as “revenge porn.” Perpetrators often share - or threaten to share - intimate images in an effort to control, intimidate, coerce, shame, or humiliate others. A survivor threatened by or already victimized by someone who’s shared their intimate images not only deserves the opportunity to hold their perpetrator accountable, but also should have better options for removing content or keeping it from being posted in the first place.

Recently, Facebook announced a new pilot project aimed at stopping NCII before it can be uploaded to their platforms. The process gives people who wish to participate the option to submit intimate images or videos they’re concerned someone will share without their permission to a small, select group of specially trained professionals within Facebook. Once submitted, the images are given what’s called a “hash value.” “Hashing” basically means that the image is turned into a digital code that acts as a unique identifier, similar to a fingerprint. Once the image has been hashed, Facebook deletes it, and all that’s left is the code. That code is then used by Facebook to detect when someone attempts to upload the image and to prevent it from being posted on Facebook, Messenger, and Instagram.
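To make the “fingerprint” idea concrete, here is a minimal sketch of hash-and-match using an ordinary cryptographic hash (SHA-256). This is our own illustration, not Facebook’s actual implementation: their photo-matching technology is built to also catch altered copies, and the file names and function names below are hypothetical.

```typescript
import { createHash } from "crypto";
import { readFileSync } from "fs";

// Turn an image file into a fixed-length "fingerprint." The hash cannot
// be reversed back into the image, and identical files always produce
// the identical hash.
function hashImage(path: string): string {
  return createHash("sha256").update(readFileSync(path)).digest("hex");
}

// The platform keeps only the hashes; the submitted image is deleted.
const blockedHashes = new Set<string>([hashImage("submitted-image.jpg")]);

// At upload time, hash the incoming file and check it against the list.
function uploadAllowed(path: string): boolean {
  return !blockedHashes.has(hashImage(path));
}

console.log(uploadAllowed("attempted-upload.jpg")); // false for an exact copy
```

Note that a plain cryptographic hash like the one above only matches byte-for-byte copies; production matching systems rely on perceptual hashing so that resized or re-compressed copies of an image still match.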

Facebook’s new pilot project may not be something everyone feels comfortable using, but for some it may bring much peace of mind. For those who believe it may help in their situation, we’ve outlined detailed information about how the process works:

  1. Victims work with a trusted partner. Individuals who believe they’re at risk of NCII and wish to have their images hashed should first contact one of Facebook’s trusted partners: the Cyber Civil Rights Initiative, YWCA Canada, UK Revenge Porn Hotline, and the eSafety Commissioner in Australia. These partners will help them through the process and identify other assistance that may be useful to them.
  2. Partner organizations help ensure appropriate use. The partner organization will carefully discuss the individual’s situation with them before helping them start the hashing process. This helps ensure that individuals are seeking to protect their own image and not trying to misuse the feature against another person. It’s important to note that the feature is meant for adults and not for images of people under 18. If the images are of someone under 18, they will be reported to the National Center for Missing and Exploited Children. Partner organizations will help to explain the reporting process so that individuals can make appropriate decisions for their own case.
  3. The image will be reviewed by trained staff at Facebook. If the images meet Facebook’s definitions of NCII, a one-time link is sent to the individual’s email. The link takes the individual to a portal where they can directly upload the images. All submissions are then added to a secure review queue, where they are reviewed by a small team specifically trained in reviewing content related to NCII abuse.
  4. NCII will be hashed and deleted. All images that are reviewed and found to meet Facebook’s definition of NCII will be translated into a set of numerical values to create a code called a “hash.” The actual image will then be deleted. If Facebook determines that an image does not meet their definition of NCII, the individual will receive an email letting them know (so it’s critical to use an email account that cannot be accessed by someone else). Even then, the individual may still have other options; for example, they may be able to report an image for a violation of Facebook’s Community Standards.
  5. Hashed images will be blocked. If someone tries to upload a copy of the original image that was hashed, Facebook will block the upload and show a pop-up message notifying the person that their attempted upload violates Facebook’s policies.

This proactive approach has been requested by many victims, and may be appropriate on a case-by-case basis. People who believe they’re at risk of exposure and are considering this process as an option should carefully discuss their situation with one of Facebook’s partner organizations. This will help them make sure they’re fully informed about the process so that they can feel empowered to decide if this is something that’s appropriate for their unique circumstances.  

For more information about how survivors can increase their privacy and safety on Facebook, check out our Facebook Privacy & Safety Guide for Survivors of Abuse.



[1] NCII refers to private, sexual content that a perpetrator shares publicly or sends to other individuals without the consent of the victim. How we discuss an issue is essential to resolving it. The term “revenge porn” is misleading, because it suggests that a person shared the intimate images as a reaction to a victim’s behavior.

New Internet of Things (IoT) Resources

The Internet of Things (IoT) refers to internet-connected devices that are able to connect with other devices and to be controlled remotely through a device or app. IoT devices have become commonplace in many homes and can serve as important tools for increased efficiency and for users to connect with friends and family. Unfortunately, IoT devices can also be misused to stalk, harass, and surveil. For more information about the misuse of IoT devices, check out Thermostats, Locks and Lights: Digital Tools of Domestic Abuse, a recent New York Times article in which NNEDV was interviewed regarding the misuse of “smart home” devices in domestic violence cases.

While misuse of IoT devices appears to be rising, it can be hard to identify all of the risks and safety options associated with common IoT devices. To assist in better understanding IoT in domestic violence cases, NNEDV has created a new set of resources.

Technology is constantly changing, so stay connected to TechSafety.org for new content!

After HopeLine, What Are Survivors’ Options for Free Phones?

As word spreads that Verizon’s HopeLine program, which provided free cell phones to survivors, is ending, many local programs are wondering what options are available.

Probably the best option right now, at least for survivors who are low-income, will be the Lifeline program. Lifeline is managed by the Federal Communications Commission (FCC) and run by individual phone providers. The program offers reduced-fee or free phones with data and minutes for eligible low-income individuals. Program materials state that, “To participate in the program, subscribers must either have an income that is at or below 135% of the federal Poverty Guidelines or participate in certain assistance programs.”

As for other programs that collect, refurbish, and give out free phones to survivors, be cautious when considering partnering with them. Older phones, often donated directly to shelters or through donation drives, frequently have old batteries. This means that a phone kept hidden in case a survivor needs to call 911 might not work when it’s needed. Ask how they wipe the previous owner’s data from the devices, whether they install a new battery, and whether the phone can only be used for 911 calls.

In addition, we know that access to a phone can make a difference for survivors beyond the ability to contact emergency services. A smartphone with data, minutes, and messaging can help survivors locate housing, services, and employment, keep track of medical appointments and court dates, and reduce isolation.

The HopeLine program differed from other programs by giving survivors a new phone. The Illinois Coalition Against Domestic Violence summarized the success of the program in announcing its end: “Over the course of HopeLine’s phone donation program, millions of phones were provided to survivors of domestic violence and tens of millions of dollars were committed to support the important work of domestic violence prevention and awareness.” Survivors currently using HopeLine phones will be able to continue using them through December 31, 2018.

Cambridge Analytica and Why Privacy Matters to Survivors

Recent news that the personal information of tens of millions of people was used by Cambridge Analytica “to create algorithms aimed at ‘breaking’ American democracy,” as the New Yorker phrases it, has led to a call to #DeleteFacebook. For those unfamiliar with the story, our friends at AccessNow wrote a great summary.

This kind of invasion of privacy is not new, nor is it limited to this case. The old expression, “No free lunch,” applies to any service that we don’t pay for, whether it is social media or a discount card at the grocery store or entering a raffle to win a new car. The true cost is allowing those companies to access our personal information for their own profit.

Safety is the primary concern. For survivors who face threats of harm, who live daily in fear of their abusers, the security of personal information can be a life-and-death issue. For survivors fleeing an abuser, information about location, work, kids’ schools, and social connections can lead an abuser to their doorstep. For survivors living with abuse, information about friends, thoughts, feelings, opinions, and interests can be misused by an abuser to control, isolate, or humiliate.

For survivors, privacy is not an abstract issue, or a theoretical right to be debated on CSPAN. Privacy is essential to safety, to dignity, to independence. Yet, we live in a time when personal information = profit.

The Cambridge Analytica story surfaces the underlying reality that our personal information is not under our control. It feels like we are seldom asked for consent to share our personal data. When we are, it is in legalese, in tiny letters that we might have to scroll through to be able to check that box, and get on with using whatever website we’re trying to use. Even if we do take the time to read through those privacy terms, we know that data is routinely stolen, or accidentally published on the Internet, or used against us to affect access to loans, insurance, employment, and services.

We are social animals. We crave connection. Research shows that we suffer without it. Isolation is a classic tactic of abuse. But the price we too often pay for connection online is our privacy.

At times like these, we may think about deleting Facebook, going offline, or throwing away our phones. We may think that survivors should give up their tech at the door of our shelters, or that they have to go off the grid in order to be safe.

Digital exile is not the answer. Technology, and the Internet, is a public space where everyone, including survivors, should have the right to share their voices, to make connections, and to access information without fear of their personal information being collected and used without their consent. April Glaser writes in Slate that “[d]eleting Facebook is a privilege,” pointing to the huge number of people who rely on it to connect with friends, to learn about events, to promote a business, or, in parts of the world with limited Internet access, just to be online at all.

Survivors, just like every other consumer, should be given the opportunity to give truly informed consent. That consent must be based on clear, simple, meaningful, understandable privacy policies and practices – not just a check box that no one pays attention to.

A guide to the process of changing your Facebook settings to control apps’ access to your data is available from the Electronic Frontier Foundation. Also check out our own guides to Online Privacy and Facebook Privacy and Safety.

Biometric Information Privacy

In early March, NNEDV was notified by the World Privacy Forum about an effort that is underway in Illinois to strip essential biometric privacy protections. NNEDV and the Illinois Coalition have since been coordinating efforts to oppose this move. The privacy protections being threatened are a model for other states and should be replicated, not undermined. You can learn more about biometric data and about what's happening in Illinois below. 

PROTECT SURVIVOR PRIVACY

In 2008, Illinois passed a landmark bill titled the Illinois Biometric Information Privacy Act. BIPA, as it is known, was specifically aimed at protecting critical biometric data such as a “retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry.” Several states have followed Illinois’ leadership and passed similar laws; however, Illinois still stands alone in not only protecting against the misuse of biometric information but also providing a private right of action, which allows individuals to personally sue entities such as corporations for failure to inform and obtain consent before collecting, capturing, purchasing, or receiving biometric data.

BIPA is an important law for anyone who is interested in privacy and security, but it is especially important for vulnerable individuals like survivors of domestic violence. Many survivors of domestic violence are forced to flee abusive relationships and to change their identities in order to protect themselves and their children. While a name or Social Security number can be changed, there is no way for most individuals to change biometric information. It is unique, and once compromised it cannot be fully protected. For survivors of domestic violence, who regularly experience identity theft and who may need to protect their information in order to stay safe, BIPA is an essential tool that helps them make smart decisions about what information to share. BIPA requires entities to inform people about their collection practices, to obtain written consent before collecting, and to have processes to destroy collected biometric information within a reasonable time. These safeguards are essential and can actually save lives.

BIPA is currently under attack. A new bill has been introduced in an attempt to peel back the protections afforded under BIPA, in large part to shield companies from lawsuits for failure to comply with BIPA’s important privacy protections. The use of biometric data is increasingly common, and we know that it will continue to be an important part of many aspects of commerce, including how companies oversee human resources. While there may be a way to balance the changing needs and practices of companies with essential privacy rights, the current bill to modify BIPA is far from reaching an appropriate balance. While the bill would still protect against the sale of biometric information, it would otherwise completely gut BIPA, exempting nearly every business in Illinois.

TAKE ACTION

The National Network to End Domestic Violence (NNEDV) believes that BIPA is an essential law to protect biometric data and is especially important for survivors of domestic violence. If you agree, please contact Senators Bill Cunningham, Chris Nybo, and Napoleon Harris, III, and file a witness slip in opposition to the bill. A link to the witness slip can be found here. When filling out the slip, make sure to check “Record of Appearance Only” under the section entitled “IV. Testimony.”


NNEDV Resource Highlight: Technology Safety and Your Website

Safer Internet Day

February 6 is recognized as Safer Internet Day. We believe that survivors have the right to be safe at home, at work, on the streets, and online. Most domestic violence and sexual assault service providers have some type of online presence, whether it’s a website, social media page, or something else. 

The following steps can be taken to increase the safety and privacy of people who search for resources and reach out to agencies online:  

1. Add a Safety Alert to Your Website: This can remind survivors that their online activity could be monitored or viewed by someone else without the survivor’s knowledge.

2. Create a Quick Escape Button: This allows survivors to be redirected quickly to an innocuous webpage (a minimal sketch appears after this list). Be mindful: this only prevents immediate over-the-shoulder monitoring (not spyware), and it does not erase browser history.

3. Include Information about Internet Safety: Be upfront about the safety risks of communicating with survivors online via email, your website, or other platforms, and be transparent about what information might be retained.

4. Use a Web Form Instead of Email Addresses: Unlike a direct email address, a web form does not leave a record of the message in the sender’s sent folder.

5. Get Consent Before Posting Pictures & Videos: Be sure to get consent before you post any pictures or videos online. This includes permission from staff, board members, and speakers – don’t assume that because someone works for your agency or was invited to speak, they are willing to have their image posted online.

6. Include Accurate Information: Make sure any information that pertains specifically to your area – county, state, or region – such as laws, processes, or services, is clearly identified. Survivors who visit your site may be from a different area and should know whether the information provided applies to them.

7. Make Your Website Accessible: Make sure your website is accessible to all visitors – including people living with disabilities and those who are Deaf. You can increase accessibility by ensuring that the font you use is large enough and has strong contrast, and that the images on your website have alternative text descriptions (alt text).
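For web teams implementing the quick escape button from tip 2, here is a minimal browser-side sketch in TypeScript. The element id, destination URL, and keyboard shortcut are placeholder choices of ours; as noted above, this hides the page from someone watching over a survivor’s shoulder but does not clear history or defeat spyware.

```typescript
// Placeholder destination: any innocuous, commonly visited site.
const ESCAPE_URL = "https://www.weather.com";

function quickEscape(): void {
  // location.replace() swaps out the current history entry, so the
  // "Back" button will not return directly to this page. Earlier
  // history entries and stored browsing data are NOT removed.
  window.location.replace(ESCAPE_URL);
}

// Wire the handler to a button such as <button id="quick-escape">ESCAPE</button>.
document.getElementById("quick-escape")?.addEventListener("click", quickEscape);

// Also allow a fast keyboard exit via the Escape key.
document.addEventListener("keydown", (event: KeyboardEvent) => {
  if (event.key === "Escape") quickEscape();
});
```

Placing the button prominently on every page, and testing it on mobile devices, helps ensure survivors can find it when it matters.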

Find more tips and technology safety resources in our Agency’s Use of Technology Best Practices & Policies Toolkit.
If you have additional questions about technology safety, please visit TechSafety.org or reach out to our Safety Net team: SafetyNet@NNEDV.org.

Privacy Risks and Strategies with Online Dating & Gaming

Both online dating and online gaming are fast-growing industries that are increasingly becoming a regular part of life. Online dating has rapidly gained popularity as a common way to connect with potential dates or find a partner. And, contrary to popular perception, online gaming is not just a pastime for teenage boys. Many people have concerns about the safety of online dating, often due to widely publicized stories of assault and abuse. Unfortunately, online harassment is also an all-too-common experience while playing games online, and it can cross into real life.

Everyone should be able to be online safely, free from harassment and abuse, and that includes dating and gaming. For survivors of domestic violence, sexual assault, and stalking, privacy and safety concerns may be even greater when trying to engage in online spaces. Fortunately, it is possible to increase privacy and safety when dating and gaming online.

Two new resources from Safety Net discuss both risks and strategies for survivors who want to be active in online dating or gaming communities.

Harassment, threats, and abuse that happen “only” online should be taken seriously. Such experiences can be traumatizing, and may include financial crime or identity theft. Victims report efforts to ruin their reputations and drive them from the online community. If enough identifying information is known, the abuse can also quickly become an offline threat.

If you are concerned about online harassment or abuse, see our Survivor Toolkit for more information about Online Privacy & Safety Tips, guides to Facebook and Twitter, and for resources to assist in documenting abuse.

Online harassment and abuse may fall under a number of crimes, depending on what is happening. To learn more about the laws in your state on online harassment, visit WomensLaw.org.

Data Privacy Day: The Gold Standard for Protecting Survivor Privacy


When thinking about domestic violence victims, data privacy isn’t the first thing that comes to mind for most people. But here at Safety Net, it’s always a top priority for us, and we spend a lot of time helping local domestic violence programs and other victim service providers understand the impact that their use of technology can have on the privacy of the survivors they work with.

Understanding what real data privacy looks like can be complicated. As we move ever more rapidly into a technology-driven world, local domestic violence programs are under increasing pressure to join in and adopt new technologies. There are many benefits to this – it means that survivors have new ways to find help that are often easier (and in some ways safer) than making a phone call or showing up at the front door, and it means the administrative work programs have to do can become more streamlined, giving them more time to spend helping those they are there to assist. But as with everything related to domestic violence, there are major risks involved in the use of technology that must be considered and minimized before moving forward.

Let’s start with why data privacy is so important. When survivors seek help, they take huge personal risks. If their abusive partner finds out they’ve asked for help, the abuse often escalates. They also face the possibility of harmful social and economic repercussions, like housing discrimination, job loss, and exclusion from their family or community. The information victims share with the domestic violence programs is often incredibly sensitive, and if others gain access to it, it can be used to cause further harm to them. This is why the Violence Against Women Act (VAWA) requires such stringent confidentiality practices – well beyond what the more widely known HIPAA practices require. (Learn more about this in our HIPAA/VAWA/VOCA FVPSA Privacy Comparison resource.)

Domestic violence programs often ask us to help them learn and understand best practices related to data privacy and online services. A practice we are constantly encouraging programs to look at is the use of zero-knowledge encryption services. When we suggest that as the best option for confidentiality, many want to know, “But what does that even mean?!” Well, zero-knowledge encryption is the best way to ensure that the information being sent between the survivor and the program, or the information being stored in the cloud by the program, is protected against all third-party access (a third party is anyone other than the survivor and the program helping them).

When a domestic violence program uses cloud-based services, they are essentially storing the information they are collecting at an outside location. And it is standard practice for most cloud-based companies to have access to the data that is being stored. This means that if they choose, they can go in and read all of the information the domestic violence program has stored about the victims they are working with. But when a software company uses zero-knowledge encryption, even THEY can’t see the data.

Here’s a helpful analogy for understanding how zero-knowledge encryption works: Imagine a physical storage company where you can rent a vault to store your organization's paper files. When you go there to rent a vault, they let you know that you will be the only one who has a key to your vault, and that there is no way to get into the vault without that key. The vault can't be broken into, and the storage company does not have an extra copy of the key. No one but you, or someone you give the key to, can get into the vault.

This is what zero-knowledge encryption does for survivors' data. It ensures that only the domestic violence program has the key to unlock and access the data it has entered about survivors. This is why we consider it the gold standard of data protection, and the one that most clearly aligns with VAWA confidentiality obligations. Software companies are third parties, and they get approached by other third parties - like law enforcement and abusers' attorneys - to share the data stored on their servers. If the software company can't see the data, it can't hand that data over to others who might use it to harm the survivor, and the privacy and safety of the survivor is much more secure.
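As a conceptual sketch of what “only the program holds the key” means in code, here is client-side encryption with AES-256-GCM using Node’s built-in crypto module. This illustrates the principle only and is not any vendor’s product: real zero-knowledge services also handle key backup, key rotation, and secure sharing, which this sketch omits.

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "crypto";

// The key is generated and kept by the domestic violence program.
// It is never sent to the cloud provider.
const programKey = randomBytes(32); // 256-bit key

interface SealedRecord {
  iv: string;   // per-record random nonce
  tag: string;  // integrity tag; tampering is detected on decrypt
  data: string; // the ciphertext actually stored in the cloud
}

// Encrypt locally BEFORE anything leaves the program's computer.
function seal(plaintext: string): SealedRecord {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", programKey, iv);
  const data = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return {
    iv: iv.toString("base64"),
    tag: cipher.getAuthTag().toString("base64"),
    data: data.toString("base64"),
  };
}

// Only someone holding programKey can read a record back.
function unseal(record: SealedRecord): string {
  const decipher = createDecipheriv(
    "aes-256-gcm",
    programKey,
    Buffer.from(record.iv, "base64"),
  );
  decipher.setAuthTag(Buffer.from(record.tag, "base64"));
  return Buffer.concat([
    decipher.update(Buffer.from(record.data, "base64")),
    decipher.final(),
  ]).toString("utf8");
}

const stored = seal("case notes the cloud provider will never be able to read");
console.log(unseal(stored)); // readable only with the program's key
```

In this model the cloud provider stores only SealedRecord objects; without the program’s key, a request to the provider yields ciphertext it cannot decrypt.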

If you have questions about this, feel free to reach out to us. To learn more about privacy and confidentiality, check out our Technology & Confidentiality Toolkit.