The 23andMe fallout shows why privacy can’t be an afterthought
When a company built on personal data collapses, it’s not just a financial event; it’s an inflection point for the entire digital ecosystem. The case of 23andMe, once a pioneer in direct-to-consumer DNA testing, has sparked headlines for its recent Chapter 11 filing. But beneath the surface lies a more profound question: What happens to user data when the company that collected it no longer exists as we know it?
This isn't just about genetics. It’s about any company, in any sector, that collects, stores, and monetizes sensitive information. Whether it’s a fintech firm handling transaction histories, an HR platform managing employee records, or a retail brand tracking behavioral data, the principle is the same: once you hold someone’s data, you're responsible for it, not just while operations run smoothly, but especially when they don't.
Data Isn’t Just an Asset; It’s a Responsibility
In M&A and bankruptcy proceedings, data is often treated like any other asset that can be valued, transferred, or sold. But sensitive data—customer records, behavior profiles, and consent histories—isn’t a commodity. It reflects people’s lives, identities, and decisions.
That distinction matters. Companies must stop viewing data privacy as a compliance checkbox and treat it like business-critical infrastructure. That means embedding privacy controls into daily operations, creating clear audit trails, and maintaining visibility across the entire data lifecycle - from collection to deletion.
When transitions happen, such as acquisitions, shutdowns, or restructuring, data shouldn’t fall into legal or operational limbo. There must be guardrails: clear internal policies, well-documented consent frameworks, and clarity around data location, access controls, and governance practices across environments.
The public is paying attention, and so are regulators. California’s Attorney General issued a consumer warning within hours of the 23andMe announcement, reminding users of their right to delete their genetic data and revoke consent. The reputational cost of mishandling data in high-stakes moments can eclipse any short-term financial gain. Privacy today is not just a legal topic; it's a leadership one. And leadership shows its strength the moment users start asking hard questions: What data do you have on me? Can I delete it? Who else has access to it? That’s when Data Subject Requests (DSRs) come flooding in - and when operational cracks become visible.
The DSR Surge Is Real and Predictable
Public incidents trigger a spike in DSRs. We see this pattern repeatedly: when uncertainty hits, people act quickly to reclaim control over their data. After a high-profile event, DSR volumes can grow by hundreds of percent within days, catching many companies unprepared.
Many organizations still rely on semi-manual processes to fulfill DSRs—exporting data from dozens of systems, reviewing each record, consulting legal, and redacting files. That may be manageable at low volume, but when requests multiply overnight, the system collapses.
DSR fulfillment should be as resilient and scalable as your security posture. A company that can't confidently respond to data deletion requests at scale isn't genuinely ready.
DSR readiness requires real operational maturity, including orchestration across systems, traceable actions, and the ability to scale response at speed. These aren’t just nice-to-haves; they’re prerequisites for trust in a crisis.
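The orchestration described above can be sketched in a few lines. This is a minimal, hypothetical illustration (the connector names, `DSROrchestrator` class, and in-memory "systems" are invented for the example): a single deletion request fans out to every registered system, and every action lands in an append-only audit trail so the response is traceable.

```python
import uuid
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class DeletionResult:
    system: str
    deleted_records: int
    completed_at: str

class DSROrchestrator:
    """Fans one deletion request out to every registered system
    and keeps a traceable log of what was deleted, and where."""

    def __init__(self):
        self._connectors = {}  # system name -> deletion callable
        self.audit_log = []    # append-only trail of results

    def register(self, system, delete_fn):
        self._connectors[system] = delete_fn

    def delete_subject(self, subject_id):
        request_id = str(uuid.uuid4())
        for system, delete_fn in self._connectors.items():
            count = delete_fn(subject_id)  # each connector erases its copy
            self.audit_log.append(DeletionResult(
                system=system,
                deleted_records=count,
                completed_at=datetime.now(timezone.utc).isoformat(),
            ))
        return request_id

# Toy in-memory stores standing in for a CRM and an analytics pipeline.
crm = {"user-42": {"email": "a@example.com"}}
analytics = {"user-42": [{"event": "login"}]}

orch = DSROrchestrator()
orch.register("crm", lambda uid: 1 if crm.pop(uid, None) is not None else 0)
orch.register("analytics", lambda uid: len(analytics.pop(uid, [])))
orch.delete_subject("user-42")
```

The point isn't the ten lines of code; it's that deletion becomes a single, repeatable, logged operation rather than a scramble across teams, which is what makes it survivable when request volume spikes overnight.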
Planning for the “What If” - Not Just the “Right Now”
Companies invest heavily in disaster recovery plans for systems, but what about for trust? Most security leaders are well-versed in incident response playbooks, yet few organizations have equivalent plans for user privacy scenarios.
What happens to personal data during a breach, shutdown, or acquisition? Who is responsible for deletion rights, consent logs, or data retention policies when a legal entity changes hands?
These questions shouldn’t be asked for the first time during a crisis. They should be mapped, tested, and documented in the organization’s core governance structure. That means legal, security, and privacy teams need to operate together, not in silos, with shared visibility and responsibility.
When data is siloed, outdated, or duplicated across shadow systems, it becomes impossible to uphold user rights or respond confidently to regulators. Privacy engineering becomes critical, not as a buzzword, but as a concrete practice of building privacy-by-design into systems, codebases, and workflows.
For example, organizations need to think beyond simple access control. It’s not just about who can see the data; it’s about whether you can prove why that person had access, for what purpose, and whether that aligns with user consent.
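One way to make that concrete is purpose-aware access control. The sketch below is hypothetical (the `access_allowed` function, purpose names, and in-memory consent store are invented for illustration): every access must declare a purpose, the purpose is checked against the user's recorded consent, and the decision is logged, so "why did this person have access?" is provable after the fact.

```python
from datetime import datetime, timezone

# user_id -> purposes the user has actually consented to
consents = {"user-42": {"billing", "support"}}

# Append-only record of every access decision: who, what, why, when.
access_log = []

def access_allowed(actor, user_id, purpose):
    """Grant access only when the declared purpose matches recorded
    consent, and log the decision either way."""
    allowed = purpose in consents.get(user_id, set())
    access_log.append({
        "actor": actor,
        "user_id": user_id,
        "purpose": purpose,
        "allowed": allowed,
        "at": datetime.now(timezone.utc).isoformat(),
    })
    return allowed

# A support agent may read the record for support, but not for marketing.
assert access_allowed("agent-7", "user-42", "support")
assert not access_allowed("agent-7", "user-42", "marketing")
```

The design choice worth noting: the denial is logged just as carefully as the grant. Regulators and auditors ask about both, and a log that only records successes can't answer either question.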
Companies navigating high-growth or high-stress phases are especially vulnerable. We've seen cases where DSRs and consent records were scattered across tools, with no central audit trail. When scrutiny comes, and it will, that lack of preparedness becomes an existential risk.
Beyond Compliance: Toward a Culture of Accountability
While regulatory frameworks are evolving, from GDPR and CCPA to the EU AI Act and NIST guidance, the real challenge is cultural. Laws can create consequences, but they can't create care. That has to come from within.
If we want to prevent future cases like 23andMe, we need to normalize a different mindset: that data stewardship is ongoing, that consent is dynamic, and that privacy doesn't stop when the marketing campaign ends. It lives on in every API call, every database backup, every analytics dashboard.
This isn’t just a job for privacy professionals or legal departments. CISOs, CTOs, and CDOs all play a role in building infrastructure that supports user trust, not just security controls. It means asking harder questions about third-party risk, the lifecycle of data, how automation is used, and what we owe our users—even if we shut down tomorrow.
Because let’s face it: users won’t wait for legal teams to sort things out. In a privacy-aware market, loyalty is fragile. If users don’t get transparency and control, they’ll walk - and take their trust with them.
Trust Is Built on Foresight
What 23andMe underscores isn’t limited to one company or one industry. It reminds us that a business's durability is measured not just in capital but also in how responsibly it handles its data. Companies that rely on personal data don’t just manage risk; they manage relationships, expectations, and, ultimately, trust.
True resilience isn’t about staying afloat during a crisis - it’s about preparing for one before it hits. Privacy workflows and governance readiness aren’t emergency measures. They’re how businesses future-proof themselves in a landscape where scrutiny, regulation, and public awareness only intensify.
The companies that will lead the next decade will not be the ones that collect the most data but the ones that know how to protect it, respect it, and navigate change without losing the credibility and confidence they’ve spent years building.