If you have ever assumed that information shared on a mental health app was confidential, you are in good company: many people assume that sensitive medical information is always protected. It is not, and it is important to understand why.
Many of us are familiar with or active users of some type of digital health application. Whether it is nutrition, fitness, sleep tracking, or mindfulness, the arena for apps that can help us track aspects of our health has never been bigger. Similarly, platforms that help us reach out to health care providers and receive virtual care have become more available, and often necessary, during the pandemic. Online therapy in particular has grown over the years, and became a critical resource for many people during quarantines and remote living.
Making health resources and care more accessible to people is vital, and the appeal of reaching those resources right from your phone is obvious.
However, among the many weighty implications of Roe v. Wade having been overturned are a number of digital privacy concerns. Significant focus recently has been on period-tracking or fertility apps, as well as location information, and reasonably so. On July 8, the House Oversight Committee submitted letters to data brokers and health companies “requesting information and documents regarding the collection and sale of personal reproductive health data.”
What has been less discussed is the large gap in legal protections for all types of medical information that is shared through digital platforms, all of which should be subject to regulations and better oversight.
The U.S. Department of Health and Human Services (HHS) recently released updated guidance on cellphones, health information, and HIPAA, confirming that the HIPAA Privacy Rule does not apply to most health apps because they are not “covered entities” under the law. The Health Insurance Portability and Accountability Act (HIPAA) is a federal law that creates a privacy rule for our “medical records” and “other individually identifiable health information” during the flow of certain health care transactions. Most apps that users select on their own are not covered — only platforms that are specifically used by or developed for traditional health care providers (e.g., a clinic’s digital patient portal where it sends you messages or test results).
Mental health apps are a revealing example. Although some present themselves as covered by the HIPAA Privacy Rule, like other digital health apps they generally are not bound by the privacy laws that apply to traditional health care providers. This is especially concerning because people often seek out mental health platforms specifically to discuss difficult or traumatic experiences with sensitive implications. HIPAA and state laws on this issue would need to be amended to specifically include digital app-based platforms as covered entities. For example, California currently has a bill pending that would bring mental health apps within the scope of its state medical information confidentiality law.
It is important to note that even HIPAA has exceptions for law enforcement, so bringing these apps within the scope of HIPAA would still not prevent government requests for this data. It would be more useful in regulating information that gets shared with data brokers and companies like Facebook and Google.
One example of information that does get shared is what is collected during the “intake questionnaire” that must be filled out on prominent services such as Talkspace and BetterHelp in order to be matched with a provider. The questions cover extremely sensitive information: gender identity, age, sexual orientation, mental health history (including details such as whether or when you have thought about suicide, or whether you have experienced panic attacks or phobias), sleep habits, medications, current symptoms, and so on. Jezebel found that BetterHelp shared all of these intake answers with an analytics company, along with the user’s approximate location and device.
Another type of shared information is “metadata” (i.e., data about the data) concerning your usage of the app, and Consumer Reports discovered this can include the mere fact that you are a user of a mental health app. Jezebel found that other information shared by BetterHelp can include how long you are on the app, how long your sessions with your therapist last, how long you spend sending messages, what times you log in, what times you send messages or speak to your therapist, your approximate location, how often you open the app, and so on. Data brokers, Facebook, and Google were found to be among the recipients of other information shared by Talkspace and BetterHelp. Apps regularly justify sharing information about users on the grounds that the data is “anonymized,” but anonymized data can easily be connected to you when combined with other information.
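To see why “anonymized” metadata offers weak protection, consider a minimal linkage sketch. All data here is hypothetical, and the fields (approximate location, device model, habitual login hour) are illustrative assumptions, not actual fields shared by any named app: if a record stripped of names is joined against an auxiliary profile database that a broker or ad network already holds, a unique match re-identifies the person.

```python
# Hypothetical illustration: re-identifying an "anonymized" record by
# linking it with auxiliary data. No real app data is used here.

# An "anonymized" analytics record: no name, just behavioral metadata.
anonymized_record = {
    "zip3": "112",          # approximate location (first 3 ZIP digits)
    "device": "iPhone 13",  # device model
    "login_hour": 23,       # habitual late-night login time
}

# Auxiliary profiles a data broker might already hold, keyed by identity.
auxiliary_profiles = [
    {"name": "Alice", "zip3": "112", "device": "iPhone 13", "login_hour": 23},
    {"name": "Bob",   "zip3": "900", "device": "Pixel 6",   "login_hour": 9},
    {"name": "Carol", "zip3": "112", "device": "Pixel 6",   "login_hour": 23},
]

def reidentify(record, profiles):
    """Return names whose profile matches every field of the record.

    If exactly one name comes back, the "anonymous" record has been
    re-identified -- no name was ever needed in the record itself.
    """
    return [
        p["name"]
        for p in profiles
        if all(p[field] == record[field] for field in record)
    ]

matches = reidentify(anonymized_record, auxiliary_profiles)
print(matches)  # a single match means the user is re-identified
```

Only three innocuous-looking fields are enough to single out one person in this toy dataset; real brokers hold far more fields across far larger populations, which is why stripping names alone is not meaningful anonymization.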
Along with the collection and sharing of this data, retention of the data by health apps is incredibly opaque. Several of these apps have no clear policy on how long they retain your data, and there is no rule requiring one. HIPAA does not create any records retention requirements — retention is governed by state laws, which are unlikely to treat health apps as practitioners subject to them. For example, New York State requires licensed mental health practitioners to maintain records for at least six years, but the app itself is not a practitioner and is not licensed. Requesting deletion of your account or data also may not remove everything, and there is no way of knowing what remains. It is unclear how long the sensitive information these apps collect and retain about you could remain available to law enforcement at some future point.
Accordingly, here are a few things to keep in mind when navigating health apps that may share your data:
- Get care if you need it. Being aware of privacy gaps does not change the fact that your well-being and safety is critical. It can be difficult to find accessible, low-cost options for care, particularly mental health care.
- When seeking care, inform yourself about the app you are using and think through which information is necessary for you to provide. For example, you could skip or answer “prefer not to say” (where available) for questions that could be especially sensitive for you (like gender identity or sexual orientation).
- Be vigilant about claims that health apps make. They can say they are compliant with certain laws or practices but that may hide the fact that not all of the information you provide to them is protected by those laws.
- See if there is an option to opt out of cookies or analytics sharing on the app. This is not a perfect safeguard, but it is certainly worth doing if you can.
- Use good, general digital privacy practices when using health services, including not connecting health apps to your other accounts (like Facebook), limiting location access as much as possible, and using secure messaging platforms to discuss or transfer sensitive health information in general.
The access to care that these types of apps have created is more than critical, and everyone should seek the care they need, including via these platforms if they are the best option for you (as they are for many people). The important takeaway is to be as informed as possible when using them and to take the steps available to you to maximize your privacy.
Editor’s note: A previous version of this article implied that Talkspace shares a greater variety of metadata with data brokers than it does. The article has been updated to clarify that only some data is shared.