The 1:10:100 rule, coined in 1992 by George Labovitz and Yu Sang Chang, describes how much bad data costs. Preventing the creation of bad data at its source costs $1. Remediating bad data costs $10. Doing nothing costs $100.
Of course, those costs are much higher now than they were in 1992, but the overall dynamic is unchanged. It is fundamentally much cheaper to fix a problem early than it is to wait to fix it later, and both approaches are much cheaper than doing nothing.
This rule is often cited in the context of shifting left. Catching a bug early is cheap; waiting until you’ve developed dependencies that need to be untangled before you can fix that bug is more costly; letting the bug ruin your product or service is very expensive.
This truism has inspired DevOps and shift-left approaches to software development, challenging the traditional understanding of how products and services get to market.
What if we applied this concept to privacy management too?
What Is Shifting Left?
Consider the software development lifecycle. The exact phases vary depending on who you ask, but the general process and flow goes something like this:

Plan → Design → Develop → Test → Deploy → Maintain

Various tasks are associated with each of these phases, like defining scope, ensuring security and reliability, testing individual components, and so on.
Shifting left means taking certain steps from the right side of this process flow (usually testing), and... shifting them left—that is, considering them earlier in the process.
Rather than test software after you’ve built it and baked in any bugs, it’s more effective to shift testing left and catch bugs early, before dependencies complicate their remediation.
The same can be said for things like designing for security and—you guessed it—privacy.
Why Shift Privacy Management Left
It’s easy to see why shifting testing left yields positive results. But it may be less clear to developers and other stakeholders without privacy experience why they should shift privacy management left too.
Reduce Tech Debt
Development timelines are tight. Sometimes, it feels like corners have to be cut. But you know that cutting corners doesn’t really “save” time; it just incurs tech debt that you’ll need to pay back later.
If you ignore privacy during the early design phases, you’ll have to pay that debt back just like any other kind of tech debt. Yes, you’ll get your product or service to market faster, but once it’s there, it’ll be non-compliant. For that reason, privacy-related tech debt can be worse than other kinds of tech debt.
A design decision that leads to a sub-optimal user experience or poor reliability needs to be addressed eventually, but not immediately. You can think of that kind of debt as a long-term, low-interest loan, like a mortgage. Privacy tech debt is more like a payday loan: the interest is so high that it’s not worth taking out in the first place.
You may not have the time and resources to design and develop something in a completely optimal fashion. Some tech debt may be necessary. But you still need to prioritize what kind of tech debt you’re willing to incur. Often, privacy tech debt isn’t worth it.
Reduce Penalties and Brand Damage
Many data privacy regulations offer what’s known as a “cure period.” A cure period is a grace period: the data privacy authority notifies a company of a violation and gives it a set window to “cure” that violation. Sometimes cure periods are permanent features of a law, and sometimes they sunset on a certain date.
Sounds pretty great, right?
Unfortunately, they’re usually only around 30 days long.
We mention them because if you receive a notice of violation under a data privacy law due to the way your systems process consumer data, there’s little chance you’ll be able to re-architect those systems within such a short window. You’ll likely be on the hook for a painful fine, on top of the often more significant cost of a stain on your organization’s reputation.
Rather than risk it and wait for a notice of violation before addressing privacy flaws, it's far easier to ensure your system is compliant from the start.
Reduce Time Spent on Privacy Management
We talked about tech debt earlier. With data privacy compliance, you can also think in terms of privacy debt: the non-technical work you or your colleagues will have to put into managing data privacy because of sub-optimal decisions made early in the development process.
If you design your system to collect minimal data, there will be less data to manage. That means fewer rights requests from fewer data subjects, less data to search through for the requests you do receive, less data to secure and protect, fewer data transfers to watch, potentially fewer laws to comply with, and so on.
Data minimization is a huge part of shifting privacy left and a major way to reduce your privacy debt, but there are dozens of additional ways that making privacy decisions early reduces your workload down the road. Giving your users control over their privacy reduces complaints and frustration. Less data processing simplifies policy management and makes it easier to maintain data inventories and records of processing activities (RoPAs). If you or a vendor receive a notice of violation, or if a regulator announces an investigatory sweep, you’ll have far less to review for compliance. The list goes on.
How to Shift Privacy Left
Familiarize Yourself with Privacy by Design
The seven privacy-by-design principles are essential for shifting left on privacy. If there’s one thing you take away from this blog, it should be these principles:
- Be proactive, not reactive. Prevent privacy violations rather than remediating them.
- Make privacy the default setting. Let users opt into—not out of—a less-private experience if they choose.
- Embed privacy into design. Privacy should be inherent to the system, not an add-on.
- Privacy by design should yield full functionality and be positive-sum, not zero-sum. Avoid false dichotomies like privacy versus usability.
- Provide end-to-end security. Ensure user data is as safe as you can make it, even if your obligations are limited to certain parts of the user lifecycle.
- Be visible and transparent. Users should know what’s going on with their data. Even if you’re respecting their data privacy rights, let them know how you’re doing so.
- Respect the user’s privacy. Be user-centric.
When developing products and services, consider where, when, and how you can apply these principles.
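The second principle, privacy as the default, is one you can apply directly in code. Here’s a minimal sketch (the setting names are hypothetical) of account settings that start at the most private option, so users opt in to sharing rather than out of it:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AccountPrivacySettings:
    """Every data-sharing feature starts off; enabling one requires an explicit opt-in."""
    analytics_tracking: bool = False    # behavioral analytics off by default
    marketing_emails: bool = False      # no marketing contact unless requested
    share_with_partners: bool = False   # no third-party sharing by default
    opt_in_log: list = field(default_factory=list)  # record of what the user enabled, and when

    def opt_in(self, setting: str) -> None:
        # Record the user's explicit choice so consent can be demonstrated later.
        setattr(self, setting, True)
        self.opt_in_log.append((setting, datetime.now(timezone.utc).isoformat()))

settings = AccountPrivacySettings()   # new accounts are as private as possible
settings.opt_in("marketing_emails")   # a less-private experience is an explicit choice
```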
Double Down on Assessments
Assessments aren’t just a bureaucratic box to check. They’re an important tool for discovering privacy risk, documenting decision-making, and—crucially—acting on findings.
(And yes, they are also a box you’re legally required to check from time to time.)
Start assessments as early as possible. This could look like conducting regular threshold assessments to determine whether you’re subject to or will soon be subject to a given privacy law. This way, you’ll know to watch out for features or practices that might be impacted by the law’s requirements during the development process.
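As an illustration, a threshold assessment can be as lightweight as a documented check against a law’s applicability criteria, re-run as your numbers change. The sketch below uses the commonly cited CCPA/CPRA thresholds; it’s illustrative only, and the exact criteria should come from your counsel:

```python
def ccpa_likely_applies(annual_revenue_usd: float,
                        ca_consumers_processed: int,
                        pct_revenue_from_selling_or_sharing: float) -> bool:
    """Rough CCPA/CPRA applicability check (illustrative, not legal advice)."""
    return (
        annual_revenue_usd > 25_000_000                 # gross revenue threshold
        or ca_consumers_processed >= 100_000            # consumers/households threshold
        or pct_revenue_from_selling_or_sharing >= 50.0  # revenue from selling/sharing
    )

# Re-running the check each quarter lets you see a threshold coming.
print(ccpa_likely_applies(8_000_000, 95_000, 10.0))   # False today
print(ccpa_likely_applies(8_000_000, 120_000, 10.0))  # True after a growth spurt
```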
Anytime you handle personal information, it’s a good idea to run a privacy impact assessment, or PIA. If you’re subject to the GDPR, data protection impact assessments (DPIAs) are required whenever processing is likely to result in a high risk to the rights and freedoms of individuals.
Assessment requirements in data privacy laws are intentionally broad. Some laws call out specific use cases that absolutely require assessments, like profiling users, but they generally advocate for assessments whenever they might be relevant. To play it safe, assess early and often.
Adopt PETs
Investing in privacy can feel like it requires giving up valuable data that could improve development outcomes. Privacy-enhancing technologies (PETs) give you the best of both worlds: access to data that can be used in development without violating privacy rights. This category of technologies enables privacy-preserving development, testing, and production practices.
If, for instance, you’re working on a model that needs data for training or testing, such as an AI model, an important PET to investigate is synthetic data: artificially generated data that is fake but statistically representative, so you don’t need to touch real people’s data at all.
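To make the idea concrete, here’s a deliberately naive sketch in Python (the schema is made up): it samples synthetic records from the marginal distributions of a real table, so the fake data looks statistically similar but no row belongs to a real person. A production setup would use a purpose-built generator that also preserves correlations between columns and checks re-identification risk.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=42)

# Stand-in for a table of real user records (illustrative schema).
real = pd.DataFrame({
    "age": rng.normal(38, 9, 1000).round().clip(18, 90),
    "plan": rng.choice(["free", "pro", "team"], 1000, p=[0.6, 0.3, 0.1]),
})

def synthesize(df: pd.DataFrame, n: int) -> pd.DataFrame:
    """Naive synthetic data: sample each column from its marginal distribution."""
    plan_freq = df["plan"].value_counts(normalize=True)
    return pd.DataFrame({
        "age": rng.normal(df["age"].mean(), df["age"].std(), n).round().clip(18, 90),
        "plan": rng.choice(plan_freq.index.to_numpy(), n, p=plan_freq.to_numpy()),
    })

fake = synthesize(real, 500)  # statistically similar, but no real person's record
```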
Similarly, you could investigate homomorphic encryption, a PET that enables you to run computations on encrypted data without decrypting it first.
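Paillier, for instance, is a partially homomorphic scheme that supports addition over ciphertexts; fully homomorphic schemes support richer computation, but the idea is the same. Here’s a minimal sketch assuming the open-source python-paillier (`phe`) package is installed:

```python
from phe import paillier  # pip install phe

public_key, private_key = paillier.generate_paillier_keypair()

# Data owners encrypt their values before handing them to an untrusted party.
salaries = [52_000, 61_500, 48_250]
encrypted = [public_key.encrypt(s) for s in salaries]

# The untrusted party computes on ciphertexts only; it never sees the salaries.
encrypted_total = sum(encrypted[1:], encrypted[0])

# Only the private-key holder can decrypt the result of the computation.
print(private_key.decrypt(encrypted_total))  # 161750
```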
Breaking down every type of PET and assessing which use cases each is best suited for is outside the scope of this article, but if you haven’t investigated them before and are interested in shifting privacy left, PETs are a must-have.
Data Minimization, Data Minimization, Data Minimization
The best way to preserve users’ privacy is to design your systems such that they don’t require user data. Every design decision you make should consider how it can reduce reliance on user data.
You could develop a model on wholly synthetic data rather than real-world data, as described above. You might ask users creating an account for just their birth year rather than their full date of birth. You might stick to email-based identity verification rather than requiring an upload of a photo ID. Or you might perform calculations on user data but deliver only the outputs of those calculations to downstream systems for further processing, rather than pooling lots of potentially identifiable information in one place.
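As a small sketch of that last example (the field names are made up), an upstream service can aggregate before anything leaves it, so downstream systems only ever see non-identifying summaries:

```python
from collections import defaultdict
from statistics import mean

# Raw events contain identifiers; they stay inside the collecting service.
raw_events = [
    {"user_id": "u-1041", "region": "EU", "session_minutes": 34},
    {"user_id": "u-2210", "region": "EU", "session_minutes": 21},
    {"user_id": "u-3377", "region": "US", "session_minutes": 45},
]

def aggregate_for_downstream(events: list[dict]) -> dict[str, float]:
    """Ship only per-region averages downstream, never user-level records."""
    by_region = defaultdict(list)
    for event in events:
        by_region[event["region"]].append(event["session_minutes"])
    return {region: round(mean(minutes), 1) for region, minutes in by_region.items()}

print(aggregate_for_downstream(raw_events))  # {'EU': 27.5, 'US': 45}
```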
Try to adopt a data-minimization-by-default mindset and make your design decisions from this perspective.
Collaborate Cross-Functionally Early and Often
The user data lifecycle doesn’t begin and end in development.
If you’re a privacy engineer, you probably already bridge the gap between the privacy and engineering teams. If you belong to one team or the other, schedule recurring meetings to discuss the role of privacy in development in general, and ensure that both teams understand when and why to meet for specific initiatives.
You may benefit from seeking the input of other teams as well. Marketing and sales, customer support, IT and security—depending on the nature of your organization, these teams may share important insight into what data they need to do their jobs well.
Just be aware that in organizations without a culture of data privacy, there tends to be a bias towards data maximization, rather than minimization. You may have to push back on certain asks and requirements if your colleagues still think data is the new oil.
Enable Privacy Management
You won’t be able to design your way out of any and all privacy obligations. No matter what, some user data is going to enter your organization’s systems. Invest in privacy management solutions that allow you to treat that data compliantly and efficiently.
After all, you can’t spend the time it takes to implement privacy by design if you’re playing catch-up with assessments, data mapping, consent management, subject rights requests, vendor management, and all of the other duties required by privacy law.