Access to data regarding conception and intention to abort can especially be used by States where abortion is illegal to vilify and penalise women and gendered minorities.
Why is there increased scrutiny over the data collection practices of period-tracking apps?
In June, the United States Supreme Court overturned its previous judgments in Roe versus Wade (1973) and Planned Parenthood versus Casey (1992), which had protected abortion as a constitutional right for decades. The decision, delivered in Dobbs versus Jackson Women’s Health Organization, marked a victory for social conservatives in America and paved the way for U.S. states like Mississippi to restrict access to abortion. In light of this, it becomes important to analyse the impact such a legal development would have on women’s lives. Although the decision comes from the U.S. Supreme Court, insights derived from its fallout may help India take a closer look at its own laws and whether they suffer from similar shortcomings.
Dobbs versus Jackson has brought into the limelight concerns regarding data privacy in period-tracking apps, which may have access to sensitive data such as details of conception and intention to abort. In this regard, questions related to the kind of data collected, the uses it is put to, and the possible consequences of sharing that data become all the more deserving of attention.
Recently, the U.S. Congress, taking cognisance of the matter, investigated “the collection and sale of sensitive personal data related to access to abortion and other reproductive health services.” The letters sent out by Congress to data brokers during the investigation acknowledge the promises made by these apps to protect the confidentiality of their users’ data. However, they note that even the possibility of compromise could have severe consequences for people who reproduce, and hence warrants a closer look at their data collection and sharing practices.
Also read: U.S. Supreme Court’s judgment in Dobbs versus Jackson Women’s Health Organization: The reasoning and the takeaways
What are the types of data collected by such apps?
The type and nature of data collected by developers typically vary based on their use. De-identified and aggregated data include analytics data that track user activity on the app, such as the length of sessions and the performance of the app, among other things. They also include device information such as the device model, storage details, and operating system. Data collected for targeted advertising, and for characterising user activity across services and applications, may involve tracking user activity on other sites and apps through third-party trackers.
The concern more specific to period-tracking apps arises from the collection of sensitive personal data. Period-tracking apps including Clue, Flo, and MIA, amongst others, transparently collect data such as the user’s name, email address, gender, birth year and birthday, place of residence, location, and Internet Protocol address. Such apps also have access to information about the user’s weight, body temperature, menstrual cycle, and details of pregnancy.
In research conducted by British charity Privacy International, period-tracking apps were found to have collected data about sexual practices, masturbation habits, quality of stool, birth control practices, skin type, mood, cravings, hair quality, and other such intimate details.
What is done with this data?
In 2020, the U.S. Federal Trade Commission (FTC) lodged a formal complaint against Flo, alleging that the app had shared users’ sensitive health data with third-party analytics and marketing services despite promising to keep it confidential. A year later, the matter was settled, with the FTC requiring Flo to obtain an independent review of its privacy practices and to secure the express consent of app users before sharing their health information.
The massive amounts of data collected puts users at the disadvantage of being coerced and manipulated into making decisions they may not make otherwise. This can manifest in the form of impacts such as price discrimination when companies have access to purchasers’ willingness to pay a premium or blocking important information for certain groups of people based on discriminatory parameters. Additionally, users are constantly forced to expose themselves to risks of identity theft, fraud attempts, and data breaches, amongst other security issues.
Another case study involved the Norwegian Consumer Council investigating the data collection and sharing practices of ten apps, including Clue. It found that data-sharing by these apps is part of the practices of the digital marketing and ad-tech industry, where the vast amounts of data collected are assigned unique identifiers through which personalised profiles are compiled about every individual. The Council described such practices as “massive commercial surveillance” violating consumer rights, and filed formal complaints with the Norwegian Data Protection Agency against the apps. It has also asserted the need for companies to develop alternative technology that does not rely upon such intrusive data collection and sharing practices.
What is the cause for concern?
Although the purpose of processing such data is primarily marketing, customisation of services, and improvement in the app’s functions, the collection raises concerns about potential misuse. A 2019 study found that the biggest problem with the data-sharing practices of health and medicine-related apps was the lack of transparency. It highlighted how third parties may aggregate sensitive health-related data from many sources and utilise it to populate proprietary algorithms that promise “insights” into consumers. Thus, the study warned, the sharing of user data “has real-world consequences in the form of highly targeted advertising or algorithmic decisions about insurance premiums, employability, financial services, or suitability for housing.”
As per the Centre for Reproductive Rights, a global human rights organisation, roughly 91 million women of reproductive age live in nations where abortion is illegal. Another 358 million live in countries where abortion is prohibited unless the woman’s life is at risk. In such nations, weak privacy laws or shoddy data collection and dissemination practices of apps about women’s reproductive health can lead to the surveillance and policing of intimate and deeply sensitive information about a woman’s body.
Although this may not be the case universally, data collected by fintech companies has affected users’ credit ratings. Access to data regarding conception and intention to abort can be used, especially by States where abortion is illegal, to vilify and penalise women and gendered minorities.
The report by the Norwegian Consumer Council further identified concerns with the profiling of personalised data, particularly regarding the absence of consumer consent and the inability to opt out of data collection without losing the utility provided by the app.
The problem is compounded due to the absence in several countries of strict privacy laws that govern how apps share sensitive personal data with other parties. For instance, as pointed out by American non-profit consumer organisation Consumer Reports, in the U.S., information collected by apps providing health services is not covered by the Health Insurance Portability and Accountability Act (HIPAA), which limits where healthcare providers such as doctors and hospitals can share your health information. This gives app developers leeway to do as they please with the information they amass. The Indian equivalent of HIPAA – the Draft Health Data Management Policy – has been criticised for posing surveillance risks by requiring the linking of digitised health records with a digital health ID, allowing the same to be further linked with other personal information.
Also read: A Digital Health Mission without a regulatory framework
What is the way forward?
At any given point in time, users evaluate their concerns about data sharing anywhere on the Internet by weighing risk against convenience. Although the data-sharing practices of these apps can aggravate problems for women and people who reproduce, the apps can also help diagnose medical conditions such as polycystic ovary syndrome. Some apps help produce reports that can be shared with medical professionals, making opting out harder and less desirable, particularly for users who lack access to health insurance or medical aid.
The primary concern with the data collection and sharing practices of these companies is their lack of transparency. Despite the existence of tools such as Agrigento, which identifies privacy leaks in Android apps by performing black-box differential analysis on network traffic, companies invest minimal time and effort in securing user data, the 2019 study referred to earlier underlined.
The impacts of such practices also have larger implications for anyone who uses these apps, highlighting the need for everyone to take cognisance of the problem.
Ultimately, the possibility of the sale of such data to untrustworthy third parties or the data being subpoenaed should be sufficient for users to be cautious regarding the information they share with these apps and to lobby for increased safeguarding of their digital rights.