So, You Wanna Build an App? App Security

This post is part of the “So You Wanna Build an App” series. The other posts include: “What to Consider Before Developing an App,” “Know Your Audience,” and “Safety First.” This series is based on lessons we learned when developing the NNEDV Tech Safety App, and in reviewing dozens of apps created for victims of domestic violence, sexual assault, and stalking. Our reviews can be found in the App Safety Center.

In the “Safety First” post, we talked about how to minimize risks for users when you build the app. Another concern app developers must be aware of is security—both the security of the app itself and the security of the data the app collects from users.

Minimize User Data & Secure What You Store

User data can include anything from asking users to create an account with a username and password to asking users to upload and store evidence of abuse. The first step to data security is to collect only the information needed to provide the service. Don’t ask for data you don’t need. For example, some apps require users to create an account when there is no obvious need for one. Other apps require access to information on the device, such as the user’s contact list and calendar, even when that information has no relevance to the app’s functionality.

Also remember that some types of data are more sensitive than others. Sensitive data includes personally identifying information like name, birthdate, location, health/mental health information, and documentation of abuse. The exposure of sensitive data can have dangerous consequences for the survivor if it’s discovered by the abuser. For this reason, securing sensitive data from unintentional disclosure is crucial.

Develop your app so that it doesn’t require users to share personal information, or so that users can choose whether to opt into or out of sharing it. For example, some safety apps allow users to contact someone through the app. Let the user manually type in the contact information rather than requiring the app to connect to their contact list. Also remember: if your app is designed to inform two or three contacts when the survivor needs help, it does not need access to the entire address book. This is also helpful because some users may want to input a safety contact, such as their domestic violence advocate or private attorney, who isn’t in their contact list.
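For illustration, here is a minimal sketch (in Kotlin, assuming an Android build) of how an app might keep a few manually entered safety contacts without ever requesting the contacts permission. The SafetyContact and SafetyContactStore names are hypothetical and not taken from the Tech Safety App.

```kotlin
// A minimal sketch: the app keeps only the two or three safety contacts the
// user types in by hand, so it never needs the READ_CONTACTS permission or
// access to the full address book. Names here are illustrative.
data class SafetyContact(val name: String, val phoneNumber: String)

class SafetyContactStore(private val maxContacts: Int = 3) {
    private val contacts = mutableListOf<SafetyContact>()

    /** Adds a manually entered contact; returns false once the limit is reached. */
    fun add(contact: SafetyContact): Boolean {
        if (contacts.size >= maxContacts) return false
        contacts.add(contact)
        return true
    }

    /** Returns a read-only copy, e.g. for a "notify my contacts" screen. */
    fun all(): List<SafetyContact> = contacts.toList()
}
```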

App Security

For apps that collect little or no data from their users, the security issues are more about the app itself. Some apps are built to function fully on the device, where all the content is accessible via the downloaded app. Other apps require users to retrieve information online. Depending on how the online content is hosted, if someone were covertly watching the internet traffic, they might be able to see the names of the websites and other content being accessed. Think about where your online content is hosted and how that information is retrieved. As an example, in order to protect survivors, all of the videos in our Tech Safety App are hosted on a secure server, and the files are named in a way that obscures what they are in case someone is covertly watching the internet traffic.
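As a rough Kotlin sketch of that approach (the host name and file identifiers below are placeholders, not our actual infrastructure), the app can keep the human-readable titles internal and request files whose names reveal nothing:

```kotlin
import java.net.URL

// A minimal sketch: content is served only over HTTPS, and file names are
// opaque identifiers rather than descriptive titles, so the URL itself gives
// away nothing about what the user is watching.
object ContentCatalog {
    // Human-readable titles live only inside the app.
    private val videos = mapOf(
        "Safety Planning Basics" to "a7f3c9e1.mp4",
        "Documenting Abuse" to "b24d80f6.mp4"
    )

    /** Builds the HTTPS URL for a title, or returns null if the title is unknown. */
    fun urlFor(title: String): URL? =
        videos[title]?.let { URL("https://content.example.org/media/$it") }
}
```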

Have a Security Framework and Policy

Anytime you ask users to share personal information with you, you need to know (and let them know) how you’ll keep that data secure. The security framework should encompass every level of engagement – from the moment users share their information (account creation, uploading/downloading content) to when you store that information (on secure and encrypted servers) to how (and how often) you destroy content. Your security policy should be clear and posted where users can easily review it. It should also be very clear about when and how you might share their information with third parties such as law enforcement or courts.
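As one small, concrete illustration of the “how often you destroy content” piece, here is a hedged Kotlin sketch of a retention job that permanently deletes stored uploads older than a stated window. The class name, directory layout, and 90-day period are assumptions for illustration; a real framework would also need to cover backups and anything shared with third parties.

```kotlin
import java.io.File
import java.time.Duration
import java.time.Instant

// A minimal sketch of a scheduled retention job: anything older than the
// retention window promised in the security policy is permanently removed.
class RetentionJob(
    private val uploadDir: File,
    private val retention: Duration = Duration.ofDays(90)
) {
    /** Deletes stored files older than the retention window; returns how many were removed. */
    fun purgeExpired(now: Instant = Instant.now()): Int {
        val cutoff = now.minus(retention).toEpochMilli()
        var deleted = 0
        uploadDir.listFiles()?.forEach { file ->
            if (file.isFile && file.lastModified() < cutoff && file.delete()) deleted++
        }
        return deleted
    }
}
```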

Educate Users on Security

If your app encourages people to use third-party cloud storage like Dropbox to store personal information gathered via your app, provide tips and education on good security practices. Where appropriate, teach users to use strong passwords and multi-factor authentication. The better they understand the risks and how to minimize them, the better they can navigate those risks and develop stronger safety strategies.

Thanks for reading this blog series! If you’re still curious for more, you can find great information on our website:

·       Technology Safety and Privacy: A Toolkit for Survivors

·       Agency’s Use of Technology: Best Practices & Policies

·       App Safety Center

Speaking of apps – check out NNEDV’s Tech Safety App! DC-based company 3Advance developed the CMS infrastructure and created the multi-platform mobile apps to bring to life the NNEDV Tech Safety App. If you’re an app developer or a victim service provider working with an app developer, be sure to check out our Considerations for App Developers resource!

So, You Wanna Build an App? Safety First

This post is part of the “So You Wanna Build an App” series. The other posts include: “What to Consider Before Developing an App,” “Know Your Audience,” and “App Security.” This series is based on lessons we learned when developing the NNEDV Tech Safety App, and in reviewing dozens of apps created for victims of domestic violence, sexual assault, and stalking. Our reviews can be found in the App Safety Center.

Minimizing safety risks for victims of abuse who use your app is a daunting but crucial process. Remember that survivors may be in crisis, in danger, or have someone monitoring their device when they’re using your app. This post discusses how you can address and minimize some of these safety risks.

Your App Could Be a Safety Risk

Victims of abuse are most at risk when they attempt to leave their abusive partner or try to limit the abuser’s control. Simply having a safety app on their device could indicate that the victim is seeking information or help, and the abuser could escalate his/her control and abuse. While you can’t remove that risk entirely, it’s important to consider ways to address and minimize it.

Inform the User

The first step is to inform the user of possible dangers and risks they might face if they download your app. Some survivors may be aware that their devices are being monitored and know to be careful about what they download, but others may never have thought about that risk before, and may not have considered that the abuser could see the app and discover that they are seeking help.

This reminder should take place before they download the app. It should be noted in the app store description and in other places that describe the app. For example, the Tech Safety App provides notices about potential monitoring by abusive partners and suggests that users only access the app from a safer device. These notices are available on the app’s informational website, in the app description in both the Apple App Store and Google Play, and as part of the onboarding process after someone downloads the app. These reminders both inform potential users of the risks of downloading the app and encourage them to wait until they are on a safer device.

Other Safety Strategies That May or May Not Work

·       Quick Escape – Most websites for survivors of abuse have a “Quick Escape” or “Exit” button so that visitors can leave the site quickly if they’re worried that someone is monitoring their internet use. This can be a challenge for apps, however, since an exit button takes up valuable screen space, and it’s often unnecessary because apps can be closed quickly anyway. Since building an “Exit” button throughout an app isn’t practical, the best way to address possible monitoring is to warn users before they download the app.

·       Disguised Apps – Some apps are designed to look like something else, such as a news app or a calculator, but are actually apps to help domestic violence or sexual assault survivors. While a disguised icon might avoid raising the suspicions of an abusive partner, this strategy comes with significant challenges. The Apple App Store doesn’t allow these types of apps, or requires the app description to explain what the app actually is, which may defeat the purpose of the disguise. Users also won’t be able to find the app unless they know exactly what it’s called and what the icon looks like. If the icon changes as part of an update and the survivor doesn’t notice, the app may become hard to find or get deleted by accident. Survivors may also forget the fake name if they download the app and don’t use it regularly, making it difficult to find in a time of crisis. Moreover, if someone happens to open the app on the phone, they’ll know that it isn’t whatever it’s pretending to be.

In some cases, app developers build the disguised app and hide the domestic violence/sexual assault content within it. While this might reduce the risk of someone opening the app and immediately seeing that content, it can also make the content harder for users to reach quickly in a crisis.

·       Passwords – Some apps use a password to protect the app (or parts of it) so that only someone with the password can access it. This strategy works to a certain extent, particularly if there’s private or sensitive information the survivor wants to keep protected in case someone goes through the device. Just keep in mind that a password-protected app might raise the suspicions of an abusive person who is used to having full control over the device. This strategy might be best for someone whose abuser generally doesn’t have access to the device, but who wants additional privacy protection for the information she/he is accessing or storing. Offering this as an optional setting rather than a default can be helpful for survivors, because it lets them individualize the app based on their unique circumstances (a rough sketch of an opt-in lock follows this list).
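Below is a rough Kotlin sketch of what an opt-in lock might look like. It is off by default, the PIN is stored only as a salted hash, and all the names are illustrative; a production app would use the platform keystore and a slow password-hashing function rather than a single SHA-256 pass.

```kotlin
import java.security.MessageDigest
import java.security.SecureRandom

// A minimal sketch of an opt-in app lock: disabled by default, enabled only if
// the survivor chooses it, and storing the PIN as a salted hash rather than in
// plain text.
class OptionalAppLock {
    private var salt: ByteArray? = null
    private var pinHash: ByteArray? = null

    val isEnabled: Boolean get() = pinHash != null

    /** Turns the lock on by storing a salted hash of the chosen PIN. */
    fun enable(pin: String) {
        val newSalt = ByteArray(16).also { SecureRandom().nextBytes(it) }
        salt = newSalt
        pinHash = hash(pin, newSalt)
    }

    /** Turns the lock off again, e.g. if it would raise suspicion on a shared device. */
    fun disable() {
        salt = null
        pinHash = null
    }

    /** Returns true if the lock is disabled or the entered PIN matches. */
    fun unlock(pin: String): Boolean {
        val storedSalt = salt ?: return true
        val storedHash = pinHash ?: return true
        return MessageDigest.isEqual(storedHash, hash(pin, storedSalt))
    }

    private fun hash(pin: String, salt: ByteArray): ByteArray =
        MessageDigest.getInstance("SHA-256").run {
            update(salt)
            digest(pin.toByteArray())
        }
}
```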

Be Aware of Unintentional Access to App Content

There are many ways that app content can be accessed without the survivor’s knowledge, simply because of how the device is connected to other technology. For example, some devices are set up to automatically connect to smart TVs, speakers, or cars via Bluetooth. If your app contains multimedia, build the app so that files don’t automatically start playing when the device connects to a speaker or other technology. Also consider naming multimedia files in a way that doesn’t reveal anything if someone happens to see the file name on a media player.
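As a hedged Kotlin sketch of the file-naming idea (the class name and paths are illustrative), downloaded media can be written to disk under a random identifier while the human-readable title stays inside the app; playback should then start only on an explicit tap, never automatically when a speaker connects.

```kotlin
import java.io.File
import java.util.UUID

// A minimal sketch: media is stored under opaque, randomly generated file
// names, so a file name surfacing on a car display or connected speaker
// reveals nothing about its content.
class MediaStore(private val mediaDir: File) {
    private val titlesById = mutableMapOf<String, String>()

    /** Saves media bytes under an opaque file name and remembers the title internally. */
    fun save(title: String, bytes: ByteArray, extension: String = "mp4"): File {
        val id = UUID.randomUUID().toString().replace("-", "")
        val file = File(mediaDir, "$id.$extension")
        file.writeBytes(bytes)
        titlesById[file.name] = title
        return file
    }

    /** Looks up the friendly title, for display inside the app only. */
    fun titleFor(fileName: String): String? = titlesById[fileName]
}
```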

Safety and Privacy When Collecting Sensitive Information

Some safety apps encourage users to store personal information either in the app itself or in the cloud via the app. This might include contact information, a journal logging the abuse, and photo, video, or audio evidence of abuse. It’s critical that users of these apps are notified of the safety risks of storing information this way. If the information is stored on the device, users should be warned that anyone with access to the device might be able to see the content.
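One way to deliver that warning, sketched below in Kotlin for Android (the wording and callback are placeholders), is a one-time dialog shown before the first journal entry or piece of evidence is saved on the device:

```kotlin
import android.app.AlertDialog
import android.content.Context

// A minimal sketch: before the first save to local storage, remind the user
// that anyone with access to the device may be able to see the content.
fun showLocalStorageWarning(context: Context, onAcknowledged: () -> Unit) {
    AlertDialog.Builder(context)
        .setTitle("Before you save this on your device")
        .setMessage(
            "Anything stored in this app lives on this device. Anyone who can " +
                "unlock or go through the device may be able to see it. Only " +
                "continue if this device feels safe to you."
        )
        .setPositiveButton("I understand") { _, _ -> onAcknowledged() }
        .setNegativeButton("Not now", null)
        .show()
}
```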

Additionally, if your app collects and stores any private information connected to its users, you should have a privacy and security policy that clearly explains what information the app is collecting, why it is being collected, and who has access to it. If your app is using a third-party service to store the information, or if it shares the information with another company, it’s vital to let users know how to find that third party’s privacy and security policies.

In cases where personal information is being stored on the user’s own cloud-based service, such as Dropbox, they should be notified of the related privacy and security risks. Many users don’t know how easily cloud-based services can be accessed. If the abusive person knows the victim’s password or has access to a device the account syncs with, all of the information stored could be easily accessed, manipulated, or deleted. If your app encourages users to use their personal cloud storage service, provide them with information about how they can increase their privacy and security when using these services.

Speaking of apps – check out NNEDV’s Tech Safety App! DC-based company 3Advance developed the CMS infrastructure and created the multi-platform mobile apps to bring to life the NNEDV Tech Safety App. If you’re an app developer or a victim service provider working with an app developer, be sure to check out our Considerations for App Developers resource!

So, You Wanna Build an App? Know Your Audience

This post is part of the “So You Wanna Build an App” series. The other posts include: “What to Consider Before Developing an App,” “Safety First,” and “App Security.” This series is based on lessons we learned when developing the NNEDV Tech Safety App, and in reviewing dozens of apps created for victims of domestic violence, sexual assault, and stalking. Our reviews can be found in the App Safety Center.

If you’re building an app for survivors of abuse, your mantra should always be: first, do no harm. Survivors of abuse may be using your app in the middle of a crisis, or while looking for help to escape a violent situation. Although you can’t predict how someone will use your app, you can minimize harm by building an app that takes the unique needs of your audience into consideration. Below are some tips for doing just that.

Don’t Create a False Sense of Security

Because survivors may rely on your app to help them find safety or get time-critical information, the app needs to work as intended. Unfortunately, many apps created for survivors are so complex that they don’t work the way their designers intended. We tested dozens of apps whose sole promise was to locate a victim when they’re in danger, but many of them didn’t reliably show the exact location. Sometimes it was off by a few houses and sometimes by a few miles. If your app promises personal security and safety as a key function, you have to make sure it works accurately every time, and in every environment (rural/suburban/urban).
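One modest safeguard, sketched below in Kotlin for Android (the 50-meter threshold and the callbacks are assumptions, not a guarantee of accuracy), is to check the reported accuracy of a location fix before presenting it as the victim’s position:

```kotlin
import android.location.Location

// A minimal sketch: a fix with a poor accuracy estimate (off by "a few houses"
// or "a few miles") is never presented as an exact position without the user's
// confirmation.
const val MAX_ACCEPTABLE_ERROR_METERS = 50f

fun reportLocation(
    fix: Location,
    sendAlert: (Location) -> Unit,
    askUserToConfirm: (Location) -> Unit
) {
    // Location.accuracy is the estimated horizontal error radius in meters.
    if (fix.hasAccuracy() && fix.accuracy <= MAX_ACCEPTABLE_ERROR_METERS) {
        sendAlert(fix)
    } else {
        askUserToConfirm(fix)
    }
}
```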

Don’t Overpromise

Market your app carefully, and be sure not to imply that it does more than it actually can. We’ve seen many safety apps created for victims of abuse that are marketed with claims that are blatantly false and that (unethically) try to appeal to the victim’s need for safety. Some of these marketing ploys include: “#1 Prevention of Sexual Assault!”, “You’ll never be in danger again,” and “It’s like having a police officer in your pocket.” Even though the developers may have had good intentions, these claims are not only unrealistic, they can be dangerous if someone accepts them as true.

Such claims may keep users from thinking through other safety measures they could take. If users believe that your app is the only safety strategy they need, you’ve likely created a false sense of security that can result in unintended danger to the victim. Moreover, if someone has the app and does get assaulted, it can contribute to victim blaming – accusations that the victim had a safety app that could have prevented the assault, if only they’d used it properly. Simply put: don’t tell victims the app will keep them safe. No app can stop an abusive partner from trying to harm their victim – the only ones who can stop that are abusers themselves.

Be Accurate About Your Information

Because the app is created for someone who might be in danger, make sure the information in your app is accurate. Resources should link to correct phone numbers or websites and be appropriate for your intended audience. For example, if your app is for victims of domestic violence, list local domestic violence programs in the resources section rather than general health services. Remember that even if your app is meant for a specific location (such as your city) or population (such as teens), anyone can download it, so resources should either be applicable to all users (for example, include the National Hotline in addition to local hotline numbers) or clearly state who can use them. Double and triple check the information you’ve listed (websites, phone numbers, and other contact information) to make sure it’s correct, and make this verification part of your ongoing maintenance plan.
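One way to bake that verification into your maintenance plan, sketched below in Kotlin (the field names and 90-day review window are assumptions for illustration), is to record when each resource was last checked and flag anything that is overdue:

```kotlin
import java.time.LocalDate
import java.time.temporal.ChronoUnit

// A minimal sketch: every resource records when its phone number and website
// were last verified, and stale entries are flagged for review.
data class Resource(
    val name: String,
    val phoneNumber: String?,
    val url: String?,
    val servesEveryone: Boolean,   // false for location- or population-specific services
    val lastVerified: LocalDate
)

/** Flags any resource that has not been re-checked within the review window. */
fun needsReverification(
    resource: Resource,
    today: LocalDate = LocalDate.now(),
    reviewDays: Long = 90
): Boolean = ChronoUnit.DAYS.between(resource.lastVerified, today) > reviewDays
```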

Be Accurate in Your Language

If you don’t have expertise in the dynamics of domestic violence, sexual assault, or stalking, work with experts in those fields to develop your content and to ensure your language is correct and appropriate. Domestic violence, sexual assault, and stalking are all very nuanced issues. Users could be in a traumatized state of mind when they open your app, and your content needs to be sensitive to that. Victims may not yet have the words or definitions to explain what they’re experiencing, and the way you describe it can have a major impact on their understanding. Work with domestic violence, sexual assault, and stalking experts to help you write content that is appropriate.

Speaking of apps – check out NNEDV’s Tech Safety App! DC-based company 3Advance developed the CMS infrastructure and created the multi-platform mobile apps to bring to life the NNEDV Tech Safety App. If you’re an app developer or a victim service provider working with an app developer, be sure to check out our Considerations for App Developers resource!