Period-tracking apps and data privacy: Why we should be worried

Access to data regarding conception and intention to abort can especially be used by States where abortion is illegal to vilify and penalise women and gendered minorities. 

Why is there increased scrutiny over the data collection practices of period-tracking apps?

IN June, the United States Supreme Court overturned its previous judgments in Roe versus Wade (1973) and Planned Parenthood versus Casey (1992), which had protected abortion as a constitutional right for decades. The decision, delivered in Dobbs versus Jackson Women's Health Organization, marked a victory for social conservatives in America and paved the way for U.S. states like Mississippi to restrict access to abortion. In light of this, it becomes important to analyse the impact such a legal development would have on women's lives. Although the decision comes from the U.S. Supreme Court, insights derived from its fallout may help India take a closer look at our laws and whether they suffer from similar shortcomings.

Dobbs versus Jackson has brought into the limelight concerns regarding data privacy in period-tracking apps, which may have access to sensitive data such as records of conception and intention to abort. In this regard, questions related to the kind of data collected, the uses it is put to, and the possible consequences of sharing that data become all the more deserving of attention.

Recently, the U.S. Congress, taking cognisance of the matter, investigated "the collection and sale of sensitive personal data related to access to abortion and other reproductive health services." The letters sent out by Congress to data brokers during the investigation acknowledge the promises made by these apps to protect the confidentiality of their users' data. However, they note that even the possibility of compromise could have severe consequences for people who reproduce, and hence warrants a closer look at their data collection and sharing practices.

What are the types of data collected by such apps?

The type and nature of data typically collected by developers vary based on their use. De-identified and aggregated data include analytics data that track user activity on the app, such as the length of sessions and the performance of the app, among other things. They further include data about users' devices, such as the device model, storage details, and operating system. Data collected for targeted advertising, and for characterising user activity across services and applications, may involve the collection of user activity on other sites and apps through third-party trackers.
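
To make these categories concrete, the sketch below shows, in Python, what such records can look like. Every field name, value, and identifier is invented for illustration; none is taken from any particular app.

```python
# Illustrative sketch of the two broad categories described above.
# All field names, values, and identifiers are invented.

# De-identified analytics data: app usage and device details.
analytics_event = {
    "session_length_seconds": 312,     # how long the session lasted
    "screen_viewed": "cycle_overview", # which feature was used
    "app_version": "5.2.1",
    "device_model": "Pixel 7",
    "operating_system": "Android 14",
    "free_storage_mb": 23480,
}

# Cross-app tracking data: a shared advertising identifier lets a
# third-party tracker tie this event to the same person's activity
# in other apps and on other sites.
tracking_event = {
    "advertising_id": "38400000-8cf0-11bd-b23e-10b96e40000d",
    "source_app": "example.periodtracker",
    "event": "screen_view",
    "other_apps_seen": ["example.shopping", "example.news"],
}

print(analytics_event["device_model"], tracking_event["advertising_id"])
```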

The concern more specific to period-tracking apps arises from the collection of sensitive personal data. Period-tracking apps, including Clue, Flo, and MIA, amongst others, transparently collect data such as the user's name, email address, gender, birth year and birthday, place of residence, location, and Internet Protocol address. Such apps also have access to information about the user's weight, body temperature, menstrual cycle, and details of pregnancy.

In research conducted by British charity Privacy International, period-tracking apps were found to have collected data about sexual practices, masturbation habits, quality of stool, birth control practices, skin type, mood, cravings, hair quality, and other such intimate details.

What is done with this data?

In 2021, the U.S. Federal Trade Commission ('FTC') confirmed that Flo, an app with 100 million users, had shared consumer health information with third-party data analytics services despite declaring in its privacy policy that such information would be kept private. The FTC's investigation revealed that as far back as 2016, the app incorporated software development kits ('SDKs') from various third-party marketing and analytics organisations, including the social networking service Facebook; the mobile analytics, monetisation, and advertising company Flurry; the mobile app development platform Fabric; the mobile marketing analytics and attribution platform AppsFlyer; and the American multinational technology company Google. The function of the SDKs was to record user interactions with different features of the app and thereby collect sensitive health information. In 2019, the Wall Street Journal revealed how Flo's Period & Ovulation Tracker informed Facebook when a user was on her period or intended to get pregnant.
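
For readers unfamiliar with how an SDK "records user interactions", the mechanism is ordinarily a short logging call wired into an app feature, which transmits the event to the third party's servers. The following Python sketch is purely hypothetical: the endpoint, function, and event name are invented, and it is not Flo's, Facebook's, or any vendor's actual code.

```python
import json
import urllib.request

# Hypothetical sketch of third-party SDK event logging. The endpoint
# and event names are invented; this is not any vendor's actual code.
THIRD_PARTY_ENDPOINT = "https://analytics.example.com/events"

def log_event(user_id: str, event_name: str, properties: dict) -> None:
    """Send one in-app interaction to the third party's servers."""
    payload = json.dumps(
        {"user": user_id, "event": event_name, "properties": properties}
    ).encode("utf-8")
    request = urllib.request.Request(
        THIRD_PARTY_ENDPOINT,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)  # the health datum leaves the device here

# Once such a call is wired into a feature, the event name itself can
# reveal sensitive health information to whoever operates the endpoint:
log_event("user-123", "pregnancy_week_selected", {"week": 6})
```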

In 2020, the FTC formally lodged a complaint against Flo for its misconduct and blatant violation of data privacy laws. A year later, the matter was settled after the FTC ordered Flo to obtain an independent audit of its privacy policies and to secure the express consent of app users before sharing their health information.

Another case study involves the Norwegian Consumer Council, which investigated the data collection and sharing practices of ten apps, including Clue. It found that data-sharing by these apps is part of the standard practice of the digital marketing and ad-tech industry, where the vast amounts of data collected are assigned unique identifiers through which personalised profiles are compiled about every individual. The Council described such practices as "massive commercial surveillance" violating consumer rights, and filed formal complaints with the Norwegian Data Protection Agency against the apps. It also asserted the need for companies to develop alternative technology that does not rely upon such intrusive data collection and sharing practices.
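
The "unique identifiers" at the heart of the Council's finding are typically device-level advertising IDs. When records arriving from unrelated apps carry the same ID, a broker can merge them into a single profile, as the toy Python sketch below illustrates; all records in it are invented.

```python
from collections import defaultdict

# Toy sketch of profile compilation: records from unrelated apps share
# a device-level advertising ID, so they collapse into one per-person
# profile. All records below are invented.
records = [
    {"ad_id": "aaaa-1111", "app": "period_tracker", "signal": "cycle_day_14"},
    {"ad_id": "aaaa-1111", "app": "shopping", "signal": "viewed_prenatal_vitamins"},
    {"ad_id": "aaaa-1111", "app": "maps", "signal": "searched_clinic"},
    {"ad_id": "bbbb-2222", "app": "fitness", "signal": "ran_5km"},
]

profiles = defaultdict(list)
for record in records:
    profiles[record["ad_id"]].append((record["app"], record["signal"]))

# One identifier now links menstrual data, purchases, and location searches:
print(profiles["aaaa-1111"])
```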

What is the cause for concern?

Although the purpose of processing such data is primarily marketing, customisation of services, and improvement in the app's functions, the collection raises concerns about potential misuse. A 2019 study found that the biggest problem with the data-sharing practices of health and medicine-related apps was the lack of transparency. It highlighted how third parties may aggregate sensitive health-related data from many sources and utilise it to populate proprietary algorithms that promise "insights" into consumers. Thus, the study warned, the sharing of user data "has real-world consequences in the form of highly targeted advertising or algorithmic decisions about insurance premiums, employability, financial services, or suitability for housing."

Although this may not be the case universally, data collected by fintech companies has affected users' credit ratings. Similarly, access to data regarding conception and intention to abort can be used, especially by States where abortion is illegal, to vilify and penalise women and gendered minorities.

The report by the Norwegian Consumer Council further identified concerns with the profiling of personalised data, particularly regarding the absence of consumer consent and the inability to opt out of the data collection without losing the utility provided by the app. The massive amounts of data collected put users at the disadvantage of being coerced and manipulated into making decisions they would not make otherwise. This can manifest as price discrimination, when companies have access to purchasers' willingness to pay a premium, or as the blocking of important information from certain groups of people based on discriminatory parameters. Additionally, users are constantly exposed to risks of identity theft, fraud attempts, and data breaches, amongst other security issues.

The problem is compounded by the absence, in several countries, of strict privacy laws governing how apps share sensitive personal data with other parties. For instance, as pointed out by the American non-profit consumer organisation Consumer Reports, in the U.S., information collected by apps providing health services is not covered by the Health Insurance Portability and Accountability Act (HIPAA), which restricts how healthcare providers such as doctors and hospitals may share a patient's health information. This gives app developers leeway to do as they please with the information they amass. The Indian equivalent of HIPAA, the Draft Health Data Management Policy, has been criticised for posing surveillance risks by requiring the linking of digitised health records with a digital health ID, allowing the same to be further linked with other personal information.

What is the way forward?

At any given point, users evaluate their concerns about data sharing on the internet using a risk-versus-convenience metric. Although the data-sharing practices of these apps can aggravate problems for women and people who reproduce, the apps can also help diagnose medical conditions such as polycystic ovary syndrome. Some apps help produce reports that can be shared with medical professionals, making opting out harder and less attractive, particularly for users who lack access to health insurance or medical aid.

The primary concern with the data collection and sharing practices of these companies is their lack of transparency. As the 2019 study referred to earlier underlined, the investment of time and effort in securing user data remains minimal, despite the existence of tools such as Agrigento, which identifies privacy leaks in Android apps by performing black-box differential analysis on their network traffic.
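
Differential analysis, the technique behind Agrigento, is simple to state: run the app with two different values of a sensitive input, capture the outgoing network traffic each time, and treat whatever differs between the runs as a candidate leak, even when the value is encoded or hashed before transmission. The toy Python sketch below fakes the capture step to show the idea; it does not use Agrigento's actual interface.

```python
# Toy sketch of black-box differential analysis. capture_traffic()
# fakes what a real setup does with an intercepting proxy; it is not
# Agrigento's API.

def capture_traffic(cycle_start: str) -> set[str]:
    """Pretend to run the app with this input and return outgoing payloads."""
    return {
        "POST /events app_version=5.2.1",          # identical in both runs
        f"POST /track cycle_start={cycle_start}",  # co-varies with the input
    }

def find_candidate_leaks(value_a: str, value_b: str) -> set[str]:
    traffic_a = capture_traffic(value_a)
    traffic_b = capture_traffic(value_b)
    # Payloads present in one run but not the other change with the
    # sensitive input, so they may encode it, even if obfuscated or hashed.
    return traffic_a ^ traffic_b

print(find_candidate_leaks("2024-03-01", "2024-03-15"))
# The two /track lines differ, flagging that endpoint as a candidate leak.
```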

As per the Centre for Reproductive Rights, a global human rights organisation, roughly 91 million women of reproductive age live in nations where abortion is illegal. Another 358 million live in countries where abortion is prohibited unless the woman's life is at risk. In such nations, weak privacy laws or shoddy data collection and dissemination practices of apps about women's reproductive health can lead to the surveillance and policing of intimate and deeply sensitive information about a woman's body. The impacts of such practices also have larger implications for anyone who uses these apps, highlighting the need for everyone to take cognisance of the problem.

Ultimately, the possibility of such data being sold to untrustworthy third parties, or being subpoenaed, should be sufficient reason for users to be cautious about the information they share with these apps, and to lobby for stronger safeguards for their digital rights.

Note: This article was briefly removed to address concerns expressed by Clue's Principal PR Manager, Clare Sayas, in an email to The Leaflet's editorial team. According to Sayas, "Clue does not participate in the practices mentioned". Sayas also provided links to two statements from Clue's co-CEOs regarding Clue's stance on data privacy (see them here and here).
As Sayas did not question the facts in the article, which have been duly corroborated by the authors and verified by the editorial team, we have decided to re-publish the story with Clue's clarification, so that readers will be able to judge the article in proper perspective.
– Editor