Dear African women, your health data is more valuable than you think
- Gigi
- Aug 18
- 3 min read

Many of us install a period or pregnancy app, tap through permissions, and move on. If it tracks cycles and gives helpful tips, job done.
The catch is that this is not just calendar info. It is intimate health data. How it is collected, shared, and protected can affect your life in ways that are easy to miss in the moment, and that risk is often higher when free apps rely on data to make money.
A quick note on Flo
Flo helps millions understand their cycles, and after criticism and an FTC settlement it has taken steps such as adding an anonymous mode.
A recent US jury decision focused on Meta’s collection of Flo users’ sensitive data from 2016 to 2019. That ruling concerns Meta’s conduct and SDK collection practices during that period, not Flo’s current policies. Flo has since changed course in important ways. The lesson is bigger than any one app.
This is not only about one app
Here are other high-profile cases that show the breadth of the problem:
Premom: The FTC barred Premom from sharing health data for ads after finding it shared reproductive and location data with third parties, including two firms in China.
Glow: California’s Attorney General reached a settlement over security and privacy failures that could have exposed fertility and sexual health details. This case did not hinge on a single breach, but on basic security lapses.
Ovia Health: Reporting showed some employers paid for a version that shared aggregated, de-identified pregnancy and fertility metrics with HR. Even when aggregated, many workers did not realize this was happening or what it could mean for them.
Maya and MIA Fem: Privacy International found that both apps sent sensitive details to Facebook through tracking tools, including information about sexual activity and contraception. Some practices changed after the findings, which shows how audits and public pressure can work.
Stardust: The app surged in downloads after promising end-to-end encryption, but later walked back some claims and faced scrutiny about what it could hand over. Promises of encryption need careful reading.
Babylon Health: A user briefly saw other patients’ video consultations due to a software error in a telehealth app. Even short exposures of clinical data can be serious.
These examples span fertility tracking, pregnancy support, period tracking, and telehealth. They show recurring issues around third-party trackers, SDKs, weak security, and unclear consent. The theme is simple. If an app is free or ad-supported, your data often pays the bill.
Why African women should care
Real-world consequences: Reproductive health can carry stigma and even legal or safety risks. Leaks or aggressive profiling can affect relationships, jobs, or personal safety. These harms show up clearly in breach stories, and they do not stop at national borders.
Cost pressure drives free app use: Free tools remove a paywall but can introduce hidden costs in the form of data sharing. This matters in markets where paid options are out of reach.
Weak recourse: Many African countries are strengthening privacy laws, but enforcement and user awareness can lag. When something goes wrong, it can be harder to get redress compared with the US or EU. Independent reviews have found inconsistent protections across popular apps.
What “good” looks like in a health app
Clear, plain privacy policy. Look for specific statements about whether data is shared with advertisers or data brokers, not vague words like “partners.”
Minimal third-party tracking. Apps should limit SDKs that send events to ad platforms.
Options for local-only storage or an anonymous mode. If cloud sync exists, you should be able to turn it off or use it without tying records to your identity.
Straightforward data deletion and export. You should be able to delete everything and take a copy for yourself.
Practical steps you can take today
Review app permissions: Disable location, contacts, and ad tracking unless clearly needed. Check your phone’s ad settings as well.
Avoid social logins for health apps: Use email and a strong password. Social logins can add another data pipe.
Use privacy-friendly modes: If the app offers an anonymous mode or local storage, switch it on. If not, consider an alternative that does.
Read before you tap “agree”: If a policy mentions sharing with advertisers or “partners,” assume targeting is possible. Consumer Reports’ review framework is a useful reference when comparing apps.
Ask your community to care: Share guides (like this one!) in women’s groups, campuses, and workplaces. Awareness changes behavior, and providers respond to what users expect.
The bottom line
These apps are not the enemy here, and they continue to make important improvements. It's a learning curve for all of us building in femtech.
The wider issue is a pattern across many health and wellness apps that collect sensitive information and then share it in ways that users do not fully expect.
African women deserve tools that respect consent, context, and control. Privacy is not a luxury feature. It is part of safety.