Founder Diary #2: What Asele Is Not Allowed to Become
- Gigi


Hello, it's Gigi here. I've been giving a lot of thought to how we build trust with our users around their health data, especially given the state of things in 2026. I figured it's best to consider, very early on, how Asele could be twisted into a harmful tool, and here's my reflection.
Building a women’s health product means working in a space where care, data, and behaviour overlap. That combination carries influence, whether it is acknowledged or not. Products that track cycles, moods, or symptoms do not just provide information. They shape how people understand their bodies and how others may attempt to interpret or control them.
That reality makes it necessary to think beyond features and growth. It raises a more difficult question: not only what a product is meant to do, but how it could be misused if incentives shift or guardrails are removed.
Tools framed as supportive can quietly turn into systems of surveillance. In workplace or family settings, health data can be repurposed to monitor productivity, compliance, or behaviour under the language of care. What begins as optional self-tracking can become an expectation, or worse, a requirement.
Health data is also economically valuable. Cycle information, symptom patterns, and emotional trends can be sold, combined, and mined for inferences without users ever seeing the full picture. Once that data leaves a user’s control, the consequences are no longer abstract. They affect insurance decisions, employment risk, reproductive privacy, and personal safety.
There is also a subtler risk. Wellness products can drift into moral instruction. When recommendations are framed too rigidly, normal biological variation starts to look like failure. Rest becomes something to justify. Irregularity becomes something to correct. Over time, biology is used to reinforce bias rather than reduce it.
Another common failure point is overconfidence. When guidance is presented with too much certainty, people delay seeking professional care or mistake pattern recognition for diagnosis. The line between education and authority becomes blurred.
These risks are not theoretical. They are well-documented patterns in digital health and consumer wellness products. Asele is being built with those patterns in mind, not as edge cases, but as design constraints.
That means refusing certain paths, even when they are profitable or easy to justify. The product is not designed to serve employers, governments, or partners as a monitoring tool. Sensitive health data is not treated as a commercial asset. Recommendations are framed as context, not prescriptions. Productivity is not scored against hormonal cycles. Opting out, resting, and doing nothing are treated as valid choices.
The goal is not to build a system that optimises women’s bodies. It is to support understanding without control, and awareness without pressure.
Thinking about misuse early is uncomfortable, but avoiding that discomfort leads to products that cause harm quietly and at scale. Building responsibly in women’s health requires making boundaries visible, enforceable, and non-negotiable.
Asele is being shaped around those boundaries, and they will continue to guide decisions as the product grows.
I hope you'll stick with the team and me as we navigate these questions and build a tool for good.
See you in my next diary entry. :)