Mental health matters – and with global crises such as the Covid pandemic shaping our lives in the 2020s, everyone has been talking about it. A common piece of advice for people who are struggling with their mental health is to get help: Find a therapist or speak to a coach or counselor. However, these options are not available for everyone. Whether for financial or logistical reasons or simply because of time constraints, regularly seeing a mental health professional in person is not always possible.

This is where a plethora of apps come in, promising support: BetterHelp provides easy access to online therapy, Headspace and Calm offer guided meditations, and various journaling apps supply daily prompts to help you untangle your most private thoughts and feelings. You can even get interactive versions of these apps and speak to an AI chatbot about what is keeping you up at night.

What happens to your data?

The question that remains, however, is: Where does all that extremely sensitive private information go? In the case of some apps, unfortunately…to Facebook. The Federal Trade Commission, which is in charge of enforcing federal privacy laws in the US, issued a complaint after discovering that BetterHelp had shared health data (information it had expressly promised to keep confidential) with Meta, Snapchat, and Pinterest for advertising purposes. In March 2023, BetterHelp and the FTC ended up reaching a $7.8 million settlement in the case. BetterHelp is far from being the only culprit, though. The Mozilla Foundation regularly reviews mental health apps with regard to privacy and has found “more than a few [to be] downright creepy”.

Health data protection in US apps

In the US, the processing of health-related personal data is – in some cases – regulated by the Health Insurance Portability and Accountability Act (HIPAA). However, not everything we would consider data concerning health under the definition of Art. 9 GDPR is also governed by HIPAA: If the app is offered by a business that is not a covered entity under HIPAA (covered entities include healthcare providers and health insurers) and the users enter their health-related information themselves, that information may not qualify as protected health information under HIPAA. This means that US users of most lifestyle-oriented health apps offering meditation, journaling, and mindfulness tools do not receive the extra protection that HIPAA would provide.

How is the situation in GDPR-governed Europe?

In accordance with Art. 9 GDPR, data concerning health is considered especially sensitive and may only be processed in a very limited number of cases. Typically, the processing will take place on the basis of the data subject’s explicit consent.

In Germany, health apps can now be officially prescribed by healthcare providers to support traditional treatment or promote general health (so-called DiGAs, Digitale Gesundheitsanwendungen, i.e. digital health applications). Before they can be added to the official directory of qualifying apps, tools have to meet certain criteria laid out in federal regulation.

However, despite the high level of protection provided by the GDPR and additional administrative regulations, data breaches and security concerns are still an issue: As for DiGAs, the German hacker collective Zerforschung found that two of these officially listed apps came with massive privacy flaws and inadvertently made sensitive health data available to other users – despite meeting the federal criteria. German “prescription apps” aside, US tools like Headspace are of course also available in European app stores. Most of these tools run on US-based servers even when offering services to EU users, which in itself raises privacy concerns.

Specifically for women’s health apps, a 2022 study found that almost 90% of the tools investigated shared data with third parties, yet only 70% had a privacy policy and only a little over half requested the legally required consent from their users. Furthermore, as BetterHelp’s practices in the US have demonstrated, even if a business advertises its apparent commitment to privacy and claims not to pass on your data to third parties, this does not necessarily mean that users can trust those claims to be true. This was also confirmed by the findings summarized in a 2019 European Journal of Law and Technology article: While many health app providers stress the importance of privacy to advertise their products, the reality reflected in their practices and policies more often than not falls far short of the claims made in their marketing campaigns.

To use or not to use…?

From a privacy standpoint, entering your health-related personal data into an app comes with many risks and cannot be recommended. However, not every app is the same. Those that follow data protection regulations and do not process sensitive data in the first place are typically a little less dubious. In other words: Some research into the app provider will go a long way. And listening to a guided meditation on your phone every now and then naturally does not put your data at risk in the way that logging detailed information about your mental state does.

Finally, with regard to “therapy apps”, the existence of these tools fills a need that should not even exist in the first place: If appropriate (and free) mental healthcare were always readily and easily available to everyone who requires it, it is fairly safe to assume that nobody would prefer talking to a chatbot about their day.