Women's Health Apps Continue to Pose Privacy Concerns

30 May 2024

The global women's health app industry, which serves millions of users worldwide, is expected to surpass $18 billion by 2031. These apps collect sensitive data, including menstrual patterns, sexual activity, and details about pregnancy, along with personal information such as phone numbers and email addresses, and in recent years they have drawn repeated criticism over privacy violations.

Researchers described several ongoing issues during a presentation at the Conference on Human Factors in Computing Systems in Honolulu in May.

The researchers examined 20 popular women's health apps available on the Google Play Store in the US and UK, focusing on their data management practices and privacy policies. The investigation uncovered covert collection of user data, inconsistencies between privacy policies and the apps' privacy-related features, ineffective data deletion procedures, and more.

The research team also discovered that apps frequently linked users' data to their web searches or browsing activity, compromising their anonymity. Some apps required users to disclose whether they had experienced a miscarriage or abortion in order to activate a data deletion feature, which the study's authors describe as an example of 'dark patterns', the manipulation of users into revealing personal data.

Lisa Mekioussa Malki, co-author of the study and a computer science researcher at University College London, spoke with Science News about the findings' implications for privacy and safety. The interview has been edited for length and clarity.

SN: In your study, you note that the data collected by women’s health and fertility apps have physical safety implications as well as privacy implications.

Malki: Data privacy is often thought of in terms of protecting data from an organizational standpoint, but I believe we need to go further and consider what data leaks mean for users. Beyond the critical concern of potential criminalization [of abortion in post-Roe America], there are many other potential consequences of reproductive health information being released.

For example, if someone's pregnancy status is disclosed without their permission, it could lead to workplace discrimination. Past research has also focused on stalking and intimate partner violence. And data sharing can cause serious harm in societies where abortion and women's reproductive health are stigmatized.

SN: Apps often include the disclaimer "We don’t sell your data," yet the data may still be accessible to advertisers and others. That seems likely to make it hard for users to understand what they are consenting to when they use the app.

Malki: In addition to the data users supply directly, these apps collect various other kinds of user data. Some of it, like health information, is subject to legal restrictions on sharing and monetization. But other data collected from users' devices, such as IP addresses and usage details, can be shared with analytics companies under the privacy policy.

What’s concerning is the lack of clarity about which behavioral data are being shared. Sensitive information can be inferred from such data. It’s unreasonable to expect users to gain a thorough understanding of this just from reading a privacy policy.

SN: What guidance can you provide to women and others who use these health apps?

Malki: We found that when users come across alarming news about data breaches, their immediate response is to delete the app. However, that does not secure their data, because developers usually keep backup copies on their servers.

So our advice is to look for a data deletion feature within the app or to contact the developers directly. People living in Europe can ask developers to delete their data under the right to be forgotten.

SN: What should developers do to make apps more ethical?

Malki: A lot of the time, particularly when the app development team is small and perhaps limited in resources, data privacy is treated as a compliance issue rather than a humanistic and user experience issue. So I think a shift in understanding is needed, toward who the users are, what risks they could be facing and what needs they have, with that understanding built into the design process from the beginning.

We’ve developed this groundwork for understanding and identifying the characteristics of privacy policies. So what researchers and developers, and even auditors and compliance people, can do in the future is use that framework to automate the analysis of a much larger set of privacy policies. Our codebook provides a framework for doing that.
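As an illustration only, here is a minimal sketch of how a codebook-driven screening of privacy policy text might be automated in Python. The categories, keywords, and function names below are hypothetical placeholders for the kind of framework Malki describes, not the codebook or tooling from the study.

# Minimal sketch: codebook-driven screening of privacy policy text.
# Categories and keywords are illustrative placeholders, not the study's codebook.

CODEBOOK = {
    "third_party_sharing": ["third party", "third-party", "advertis", "analytics"],
    "data_deletion": ["delete your data", "deletion", "erase", "right to be forgotten"],
    "health_data": ["menstrual", "pregnancy", "fertility", "sexual activity"],
    "device_data": ["ip address", "device identifier", "usage data"],
}

def code_policy(text):
    """Return, for each codebook category, the keywords found in the policy text."""
    lowered = text.lower()
    hits = {}
    for category, keywords in CODEBOOK.items():
        found = [kw for kw in keywords if kw in lowered]
        if found:
            hits[category] = found
    return hits

if __name__ == "__main__":
    sample = (
        "We do not sell your data, but we may share usage data and your IP address "
        "with analytics providers. You can request deletion of your account data."
    )
    for category, keywords in code_policy(sample).items():
        print(f"{category}: {', '.join(keywords)}")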

