Nineteen U.S. states have comprehensive privacy laws on the books as of 2026. If your organization handles consumer data at any scale, you're dealing with a patchwork of requirements that share common DNA but differ in critical details. The challenge isn't whether to comply—that decision was made for you when you started collecting personal information from residents of these states. The challenge is building a program that handles all of them without burning out your team or your budget.

I've watched organizations tackle this problem from both sides: the ones that tried to treat each state law as a separate project, and the ones that built a unified framework from the start. The difference in outcomes is stark. This article maps the current landscape of state privacy laws, identifies the patterns that matter, and walks through how to build a compliance program that scales.

The States With Comprehensive Privacy Laws

As of 2026, nineteen states have enacted comprehensive consumer privacy legislation. These aren't sector-specific laws like HIPAA or financial privacy regulations—they're broad frameworks that apply to businesses handling personal data of state residents, regardless of industry.

The states are: California, Virginia, Colorado, Connecticut, Utah, Iowa, Indiana, Tennessee, Montana, Oregon, Texas, Delaware, New Jersey, New Hampshire, Nebraska, Kentucky, Maryland, Minnesota, and Rhode Island.

California remains the outlier. The CCPA, as amended by the CPRA, is substantially more detailed and more expansive than any other state law. The other eighteen states largely follow a model that originated with Virginia's Consumer Data Protection Act. They're not identical—far from it—but they share structural similarities that make unified compliance feasible.

The pattern I see most often: organizations treat California as its own project and build a second framework for the Virginia-model states. That's not unreasonable. California's requirements around sensitive personal information, risk assessments, and the creation of the California Privacy Protection Agency as an enforcement body set it apart. But treating the other eighteen states as interchangeable is where organizations get into trouble.

Applicability Thresholds Vary More Than You'd Think

Most Virginia-model states use data volume thresholds to determine which businesses must comply. A common pattern is processing the data of 100,000 consumers, or processing the data of 25,000 consumers while deriving more than 50% of revenue from data sales. A few states, such as Utah, layer a $25 million annual revenue requirement on top. But several states deviate from this formula in ways that expand or contract coverage.

Connecticut, for example, lowered the consumer threshold to 75,000. Oregon requires compliance for businesses controlling or processing data of 100,000 consumers or more, regardless of revenue share from data sales. Texas dispenses with volume thresholds entirely: its law reaches any business operating in the state that processes or sells personal data and is not a small business as defined by the U.S. Small Business Administration. Like the other Virginia-model states, Texas defines "consumer" to exclude individuals acting in a business-to-business context, a distinction that matters if you primarily serve commercial customers.

California remains the most aggressive. The CCPA applies to businesses with annual gross revenues exceeding $25 million, or those that buy, sell, or share personal information of 100,000 or more California residents or households, or derive 50% or more of their annual revenue from selling or sharing consumers' personal information. No "and" requirement—any one of those three conditions triggers coverage.
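California's any-one-of-three test is easy to get wrong in intake tooling, so it's worth encoding explicitly. A minimal sketch of the logic described above; the function and parameter names are mine, not statutory terms:

```python
def ccpa_applies(annual_revenue_usd: float,
                 ca_consumers_or_households: int,
                 pct_revenue_from_selling_sharing: float) -> bool:
    """Illustrative CCPA applicability check.

    Any ONE of the three conditions triggers coverage -- there is no
    "and" requirement joining them.
    """
    return (
        annual_revenue_usd > 25_000_000
        or ca_consumers_or_households >= 100_000
        or pct_revenue_from_selling_sharing >= 50.0
    )
```

A $10 million company with 120,000 California consumers is covered even though it misses the revenue test, which is exactly the case the OR-logic exists to capture.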

Core Consumer Rights: The Common Framework

Every state privacy law grants consumers a set of rights over their personal data. The specifics vary, but the core structure is consistent across all nineteen states. Understanding this common framework is the foundation for any multi-state compliance program.

The right to know what personal data a business has collected about them. Consumers can request confirmation of whether you're processing their personal data and, in most cases, access to that data. The format requirements differ—some states require machine-readable formats, others are silent—but the fundamental obligation is universal.

The right to delete personal data. With exceptions for legal obligations, fraud prevention, security purposes, and completion of transactions, consumers can request deletion of their personal data. The scope of what must be deleted varies. Some states require deletion from backup systems only when those backups are used for purposes other than disaster recovery. Others are less specific.

The right to correct inaccurate personal data. This right exists in most, but not all, state laws. It's often overlooked because organizations assume their data is accurate. In my experience, that assumption falls apart the moment you start fielding correction requests. Consumers notice errors in their profiles, especially when those errors affect outcomes like credit decisions or employment screening.

The right to opt out of certain types of data processing. This is where state laws diverge significantly, and I'll address it separately below. But every state gives consumers some form of opt-out right, at minimum for the sale of personal data.

The right to non-discrimination. Businesses cannot deny goods or services, charge different prices, or provide a different level of quality based solely on a consumer's exercise of their privacy rights. The wrinkle: businesses can offer financial incentives for data collection if those incentives are reasonably related to the value of the data. Designing compliant incentive programs requires careful analysis, because "reasonably related to the value" is a legal standard that lacks clear safe harbors.

Sensitive Data Gets Special Treatment

Most state privacy laws define a category of "sensitive personal information" or "sensitive data" that requires heightened protection. The definitions overlap but aren't identical. Common categories include: racial or ethnic origin, religious beliefs, health information, sexual orientation, citizenship or immigration status, genetic or biometric data, precise geolocation, and personal data collected from known children.

California goes further. Under the CPRA, sensitive personal information includes government-issued identifiers like Social Security numbers and driver's license numbers. It also includes account login credentials and the contents of mail, email, and text messages where the business isn't the intended recipient.

The compliance obligation for sensitive data typically requires obtaining consumer consent before processing, or at minimum providing a clear opt-out mechanism. The consent standard varies. Some states require affirmative opt-in consent. Others allow opt-out with prominent notice. This is one area where you cannot assume that if you comply with the strictest state, you've covered the others. The mechanism matters, not just the outcome.

Opt-Out Rights: Where the Differences Matter Most

The right to opt out is where state privacy laws diverge in ways that directly affect system design and operational workflows. Every state requires some form of opt-out, but the scope of that right varies considerably.

At the baseline, all nineteen states give consumers the right to opt out of the "sale" of their personal data. But "sale" is defined differently across statutes. California defines it broadly to include making personal data available to third parties for monetary or other valuable consideration. Under this definition, many common advertising and analytics arrangements qualify as sales, even if no money changes hands directly for the data itself.

Other states use narrower definitions. Some require actual monetary exchange. Others require that the transaction be the primary purpose of the transfer, excluding transfers that are incidental to providing a service the consumer requested. If you're processing opt-out requests based on California's definition, you may be honoring requests that aren't actually required under other state laws. That's not necessarily a problem—it may be simpler than maintaining separate logic—but you should make that choice deliberately.

Several states have added an opt-out right for "targeted advertising." This typically means advertising based on personal data obtained from the consumer's activities across different businesses, websites, or online services. It's narrower than opting out of all advertising, but broader than opting out of data sales. Connecticut, Colorado, Virginia, Montana, and Oregon all include targeted advertising opt-outs. The implementation challenge is that your advertising partners need to respect these opt-outs, which means your vendor contracts and technical integrations need to support segregated audiences.

California introduced an opt-out right for "sharing" under the CPRA, which covers cross-context behavioral advertising even when no sale occurs. The practical effect is similar to the targeted advertising opt-out in other states, but the legal category is distinct.

More recently, some states have added opt-out rights for profiling in furtherance of solely automated decisions that produce legal or similarly significant effects. This is borrowed from GDPR's logic. It applies to decisions like credit approvals, employment decisions, or insurance underwriting when those decisions are made without human involvement. Not every state has adopted this right, and among those that have, the definitions of "solely automated" and "legal or similarly significant" vary enough that blanket assumptions are risky.

Universal Opt-Out Mechanisms

Several states now require businesses to honor universal opt-out mechanisms. These are technical signals—typically browser settings or browser extensions—that communicate a consumer's opt-out preference automatically, without requiring the consumer to visit each website individually.

The Global Privacy Control (GPC) is the most widely adopted signal. It's a browser setting that sends an HTTP header indicating the user's opt-out preference for data sales and sharing. California, Colorado, Connecticut, and several other states explicitly require businesses to recognize GPC as a valid opt-out signal. If your website or application doesn't detect and honor GPC, you're non-compliant in those states.

Implementing GPC support isn't technically complex, but it requires coordination between your web development, legal, and privacy teams. You need to detect the signal, apply the appropriate opt-out status to the user's profile, and ensure that your downstream data processors respect that status. The failure mode I see most often is that the signal is detected but not propagated to third-party tags or advertising pixels, which means the opt-out isn't actually effective.
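Server-side detection is essentially a one-line header check; the harder part, as noted, is propagating the result. A minimal sketch, assuming the `Sec-GPC: 1` request header defined by the GPC specification; the profile flag names are illustrative:

```python
def honors_gpc(headers: dict) -> bool:
    """Return True when the request carries a Global Privacy Control signal.

    Per the GPC spec, the signal is the HTTP request header `Sec-GPC: 1`.
    Lookup is case-insensitive because header casing varies in practice.
    """
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"


def apply_opt_out(user_profile: dict, headers: dict) -> dict:
    """Sketch: set opt-out flags on the profile when GPC is present.

    The flag names are invented for this example. The critical step this
    sketch does NOT show is pushing the same flags to third-party tags
    and downstream processors -- the common failure mode described above.
    """
    if honors_gpc(headers):
        user_profile["opt_out_sale"] = True
        user_profile["opt_out_sharing"] = True
    return user_profile
```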

Data Protection Assessments: Not Optional in Many States

Data Protection Assessments (DPAs), sometimes called Data Protection Impact Assessments, are required in California and several Virginia-model states. They're risk assessments for specific types of high-risk processing activities. This is not a general security risk assessment—it's a documented evaluation of the privacy risks and benefits of a particular data processing activity.

California requires risk assessments for processing that presents a significant risk to consumers' privacy or security. The CPRA and its implementing regulations list specific triggers: processing sensitive personal information, selling or sharing personal information (which covers most targeted advertising), profiling with foreseeable risks of unfair or deceptive treatment, and automated decision-making that produces legal or similarly significant effects.

Virginia, Colorado, Connecticut, and several others require DPAs for data processing activities that present a heightened risk of harm to consumers, including targeted advertising, sale of personal data, profiling where there's a foreseeable risk of certain harms, and processing of sensitive data. The harm standard varies by state. Some define it explicitly; others rely on the business's judgment. That ambiguity is not a reason to skip the assessment—it's a reason to document your reasoning carefully.

DPAs are discoverable. State attorneys general can request them during an investigation. That means they need to be defensible. I've reviewed DPAs that are clearly checkbox exercises, written to justify a predetermined outcome. Those don't help you in an enforcement action—they hurt you, because they demonstrate that you understood the risks and proceeded anyway without adequate mitigation.

A credible DPA identifies the processing activity, the categories of personal data involved, the purpose and benefits of the processing, the privacy risks to consumers, the safeguards in place to mitigate those risks, and a determination of whether the benefits outweigh the risks. It's a short document—typically three to five pages—but it should reflect actual analysis, not template language.
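Those elements translate naturally into a structured record, which makes it harder to quietly skip a section. A minimal sketch; the field names are illustrative, not drawn from any statute:

```python
from dataclasses import dataclass


@dataclass
class DataProtectionAssessment:
    """Illustrative structure mirroring the elements a credible DPA covers."""
    processing_activity: str      # what is being done with the data
    data_categories: list         # categories of personal data involved
    purpose_and_benefits: str     # why the processing happens, who benefits
    privacy_risks: list           # identified risks to consumers
    safeguards: list              # mitigations in place for those risks
    benefits_outweigh_risks: bool # the documented determination

    def is_complete(self) -> bool:
        """Every substantive section must be filled in, not templated away."""
        return bool(self.processing_activity and self.data_categories
                    and self.purpose_and_benefits
                    and self.privacy_risks and self.safeguards)
```

The structure doesn't make the analysis defensible on its own, but an `is_complete` gate at least catches the checkbox-exercise DPA with empty risk and safeguard sections.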

Vendor and Service Provider Obligations

Every state privacy law distinguishes between businesses that control personal data and service providers or processors that handle data on behalf of the controller. The terminology varies—California uses "service provider" and "contractor," while most other states use "processor"—but the concept is consistent. If you're handling personal data on behalf of another business pursuant to a contract, your obligations are different from those of the business that controls the data.

Service providers and processors are generally prohibited from using personal data for any purpose other than the specific business purpose described in the contract. They can't sell the data (with narrow exceptions for California service providers). They can't retain, use, or disclose the data outside the scope of the contract. And they're required to enter into written agreements with the controlling business that include specific contractual provisions mandated by the state law.

The contract requirements are detailed. California's CPRA requires that service provider agreements include provisions governing data security, subprocessor usage, and the business's right to audit the service provider. The agreement must prohibit the service provider from combining personal data it receives from the business with personal data it receives from other sources, except as permitted by the CPRA. It must certify that the service provider understands the restrictions and will comply with them.

Virginia-model states have similar but not identical contract requirements. Most require that processor agreements specify the subject matter and duration of processing, the nature and purpose of processing, the types of data involved, and the obligations and rights of the controller. The processor must assist the controller in meeting its obligations under the state law, including responding to consumer rights requests and meeting security requirements.

If you're a service provider or processor under any state law, your standard contract template needs to address these requirements. If you're a controller, you need to ensure your vendor agreements include the required provisions. This is not an area where you can rely on general indemnification clauses or vague commitments to comply with applicable law. The statutes require specific contractual language, and state attorneys general know what to look for.

Subprocessor Management

Most state privacy laws require that service providers and processors obtain authorization before engaging subprocessors. The authorization can be general—meaning the controller agrees in advance that the processor can use subprocessors, typically with notice and an opt-out right—or specific, meaning the controller approves each subprocessor individually.

In practice, most organizations use general authorization with a list of subprocessors disclosed in the contract or maintained on a website. The processor is obligated to notify the controller before engaging a new subprocessor and give the controller an opportunity to object. If the controller objects and the processor can't accommodate the objection, the controller typically has a right to terminate the contract.
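That notify, object, engage sequence can be encoded so a new subprocessor is never engaged while an objection window is still open or an objection is unresolved. A sketch assuming a 30-day window, which is a common contractual term rather than a statutory figure:

```python
from datetime import date, timedelta


def may_engage(notified_on: date, today: date,
               objections: list, window_days: int = 30) -> bool:
    """Sketch of a general-authorization subprocessor gate.

    Engagement is allowed only after the objection window has closed
    AND no controller objection remains unresolved. Window length and
    objection tracking are illustrative assumptions.
    """
    window_closed = today >= notified_on + timedelta(days=window_days)
    return window_closed and not objections
```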

This creates operational challenges when you're both a processor for some customers and a controller for others. You need systems to track which role you're in for each dataset and ensure that your subprocessor usage complies with the applicable restrictions. The failure mode is inconsistency—using a subprocessor for data where you're a processor without first obtaining authorization, or failing to notify controllers of subprocessor changes. Both create contract breaches and potential liability.

Building a Compliance Program That Scales

The only sustainable approach to multi-state privacy compliance is to build a unified program that meets the highest standard required by any applicable state law, with documented exceptions for the specific requirements that genuinely differ. Trying to maintain separate compliance tracks for each state doesn't scale beyond three or four states, and we're well past that threshold.

Start with California. The CPRA is the most comprehensive state privacy law, and if you're compliant with the CPRA, you've addressed most of what the other eighteen states require. The gaps—and there are some—can be handled as delta compliance rather than separate programs.

Map your data inventory to the definitions used across all applicable state laws. This is tedious work, but it only needs to be done once and then maintained. You need to know what personal data you collect, from which sources, for which purposes, with which third parties you share it, and how long you retain it. Every state law requires some version of this understanding. Document it in a way that lets you answer questions from any state's perspective.

Build consumer request workflows that handle the broadest set of rights. That means access, deletion, correction, and opt-out at minimum. Your workflow needs to verify the identity of the requestor—every state allows businesses to request information reasonably necessary to verify identity, but the standards for what's "reasonable" vary. A risk-based approach works: higher-risk requests (like deletion of sensitive data) justify more stringent verification.

The time limits for responding to requests are mostly consistent: 45 days, with one 45-day extension if needed. A few states have shorter windows for specific request types. California requires responses to opt-out requests within 15 business days. Plan your workflows around the shortest applicable deadline, not the longest.
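Deadline arithmetic is a frequent source of missed responses, especially the calendar-day versus business-day distinction. A sketch of the two calculations described above, with holiday handling deliberately omitted:

```python
from datetime import date, timedelta


def access_deletion_due(received: date, extended: bool = False) -> date:
    """45 calendar days from receipt; 90 if the one 45-day extension is used."""
    return received + timedelta(days=90 if extended else 45)


def business_days_later(received: date, n: int = 15) -> date:
    """Add n business days, skipping weekends.

    Mirrors California's 15-business-day window for opt-out requests.
    Holidays are ignored in this sketch; a production version would
    subtract a holiday calendar too.
    """
    d = received
    remaining = n
    while remaining > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday..Friday
            remaining -= 1
    return d
```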

Implement technical mechanisms for opt-outs that work across all opt-out categories: sales, sharing, targeted advertising, and profiling for automated decisions. This typically means maintaining opt-out flags in your user database and ensuring those flags are checked before data is used for any of those purposes. The challenge is not the flag itself—it's ensuring that every system and every vendor integration respects the flag. That requires testing, because third-party pixels and tags often bypass your application logic entirely.
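One way to make the flag check systematic is a single gate that maps each opt-out category to the processing purposes it blocks, called before any data use. The category and purpose names below are invented for illustration:

```python
# Illustrative mapping from opt-out categories to the processing
# purposes they block. Real category and purpose taxonomies would come
# from your own data inventory, not from any statute verbatim.
OPT_OUT_BLOCKS = {
    "sale": {"data_sale"},
    "sharing": {"cross_context_ads"},
    "targeted_advertising": {"cross_context_ads", "retargeting"},
    "automated_profiling": {"automated_decisions"},
}


def purpose_allowed(user_opt_outs: set, purpose: str) -> bool:
    """Check a proposed purpose against every opt-out flag on the user.

    Every system and vendor integration should route through a gate
    like this; third-party pixels that bypass it are the usual leak.
    """
    for category in user_opt_outs:
        if purpose in OPT_OUT_BLOCKS.get(category, set()):
            return False
    return True
```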

Honor universal opt-out signals like GPC. This is table stakes in multiple states, and it's becoming a consumer expectation even where it's not legally required. The implementation is straightforward: detect the signal, apply the opt-out, and propagate it to your data partners.

Training and Documentation

Your compliance program is only as strong as your team's understanding of it. Privacy compliance requires judgment calls—determining whether a particular use of data qualifies as a "sale," assessing whether a processing activity requires a DPA, deciding whether a consumer request is verifiable. Those judgments need to be consistent and defensible.

Train the people who handle consumer requests. They need to understand the rights that apply, the verification standards, the time limits, and the exceptions that allow you to deny a request. They also need to know when to escalate. A deletion request that would require deleting data subject to a legal hold is not something your frontline support team should decide on their own.

Document your compliance procedures. This isn't about creating a binder that no one reads. It's about ensuring that if someone on your team leaves, their replacement can step in without rebuilding institutional knowledge. It's also about demonstrating to regulators that you have systematic processes, not ad hoc responses.

Review your documentation annually, or whenever a new state law takes effect. State privacy laws are still evolving. Enforcement guidance is emerging. Your documented procedures need to reflect the current state of the law, not the law as it existed when you first built your program.

Enforcement and Penalties: What Happens When You Get It Wrong

Enforcement of state privacy laws is ramping up. The California Privacy Protection Agency has brought several enforcement actions under the CPRA. State attorneys general in Virginia, Colorado, and other states have signaled that privacy enforcement is a priority. The pattern I see is that early enforcement targets fall into two categories: high-profile companies with consumer-facing brands, and companies with egregious practices that draw media attention.

Most state privacy laws include a cure period for first violations. If a business receives a notice of violation from the state attorney general, it has 30 or 60 days to cure the violation before penalties attach. That's a meaningful grace period, but it's not unlimited. Some states have eliminated cure periods for violations that occur after a certain date, and California has no general cure right under the CPRA—only for specific categories of violations.

Penalties vary. California allows civil penalties up to $2,500 per violation, or $7,500 per intentional violation. Virginia and several other states use a cap of $7,500 per violation, with no distinction for intent. The question of what constitutes a "violation" is open to interpretation. Is it per consumer affected? Per data element mishandled? Per day of non-compliance? The statutes are often ambiguous, which means the potential exposure in a large-scale violation can be substantial.

Private rights of action are limited in most state privacy laws. California's CCPA includes a private right of action for data breaches involving specific categories of personal information, but not for other violations. Most other states reserve enforcement to the attorney general. That's a significant difference from Illinois's Biometric Information Privacy Act, which allows private lawsuits and has generated a flood of class actions. For now, the risk is primarily regulatory, not litigation.

But that doesn't mean you should discount reputational risk. A notice of violation from a state attorney general, even if it's cured before penalties attach, is a public event. It signals to customers, partners, and investors that your privacy practices fell short. The cost of remediating that reputational damage often exceeds the potential financial penalties.

What This Means for CISOs and Privacy Leaders

Privacy compliance is not a legal-only problem. It requires coordination across IT, security, product, marketing, and legal. The CISO's role is to ensure that privacy requirements translate into technical controls and operational processes that are actually implemented and maintained.

That means you need visibility into data flows. You can't comply with state privacy laws if you don't know where personal data lives, how it moves through your systems, and which vendors have access to it. Data mapping is foundational. It's also never finished, because your systems and vendors change constantly.

You need a mechanism for reviewing new data uses before they're deployed. A new marketing campaign that involves data sharing with a third party may trigger opt-out requirements or require a DPA. If your privacy and legal teams only learn about it after it's live, you're already out of compliance. Build privacy into your change management process, not as a gate that slows everything down, but as a checkpoint that identifies issues while they're still easy to fix.

You need contracts that reflect your role in the data relationship. If you're a service provider or processor, your agreements need to include the provisions required by applicable state laws. If you're a controller, your vendor agreements need to impose those same requirements on your vendors. Contracts are not self-executing—you need processes to ensure that the contractual commitments are actually met—but they're the foundation for accountability.

Most importantly, you need to treat privacy compliance as a program, not a project. State privacy laws are not a one-time implementation. They require ongoing attention: responding to consumer requests, updating data inventories, reviewing vendor relationships, conducting DPAs for new processing activities, and monitoring for changes in the law. Organizations that approach privacy compliance as something they can "finish" are the ones that end up scrambling when an enforcement action arrives.

The complexity of multi-state privacy compliance is real, but it's not unmanageable. The organizations that succeed are the ones that build systematic processes, invest in the right tools and training, and treat privacy as a shared responsibility across the business. The ones that struggle are the ones that try to handle privacy as a purely legal exercise, divorced from the operational realities of how data is actually used.

If you haven't yet built a multi-state privacy compliance program, start now. If you have one, review it against the framework outlined here and identify the gaps. The regulatory landscape will continue to evolve, but the fundamentals of data inventory, consumer rights, vendor management, and risk assessment are stable. Build on those fundamentals, and you'll be in a position to adapt as new states enact privacy laws and existing laws are amended.

For more detail on California's specific requirements, see my breakdown of CCPA vs CPRA: What Changed and What You Need to Do. And if you're looking for practical steps to protect your own personal data in this evolving landscape, read How to Protect Your Privacy Online: A CISO's Guide.
