Why State Law Matters More Than HIPAA Alone
For three decades, HIPAA has served as the de facto floor for health data privacy in the United States. But the landscape has changed dramatically. The proliferation of consumer health apps, wearables, genomic testing services, AI diagnostic tools, and direct-to-consumer telehealth platforms has created a vast ecosystem of health-sensitive data that HIPAA was never designed to reach. HIPAA binds only covered entities and their business associates — it says nothing about a fitness app that tracks your menstrual cycle, a mental wellness platform that stores your therapy transcripts, or an AI model trained on de-identified imaging studies.
States have stepped into that gap with urgency. As of April 2026, twenty states have enacted comprehensive consumer privacy laws, all of which classify health data as sensitive and impose heightened protections. Three states — Washington, Nevada, and New York — have gone further by passing stand-alone consumer health data privacy laws that apply regardless of whether an entity is a HIPAA covered entity. More are moving through legislatures in 2026 and beyond.
For healthcare AI developers and data infrastructure operators — the core constituency of Radiant AI Health Data — the compliance burden is compounded. A de-identified dataset sourced from patients in Washington, California, and Maryland carries three different legal frameworks for how it may be collected, processed, shared, and sold. Getting this wrong isn't just a compliance problem; it is an existential business risk.
The Federal Baseline: HIPAA and Its Limits
The Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule, implemented in 2003 and last substantively updated in 2013, establishes national standards for the protection of individually identifiable health information. It applies to covered entities (health plans, healthcare clearinghouses, and most healthcare providers) and their business associates.
HIPAA's de-identification standard — codified at 45 CFR §164.514(b) — permits two methods: the Safe Harbor method (removal of 18 specific identifiers) and the Expert Determination method (statistical verification that re-identification risk is very small). Once properly de-identified, data falls outside HIPAA's protections entirely — and may be freely sold, shared, or used for AI training purposes under federal law.
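To make the Safe Harbor method concrete, the sketch below shows the shape of an identifier-stripping pass over a patient record. It is a minimal illustration under stated assumptions, not a compliant implementation: the field names (`dob`, `zip`, `mrn`, and so on) are hypothetical, only a subset of the 18 identifier categories is shown, and the low-population ZIP3 list is a stub that would need to come from current Census data.

```python
# Minimal sketch of a HIPAA Safe Harbor redaction pass (45 CFR
# §164.514(b)(2)). Field names like "dob" and "zip" are hypothetical;
# a real pipeline must map its own schema onto all 18 categories.

# Direct identifiers to drop outright (a subset of the 18 categories).
DIRECT_IDENTIFIERS = {
    "name", "street_address", "email", "phone", "fax", "ssn", "mrn",
    "health_plan_id", "account_number", "license_number", "vin",
    "device_id", "url", "ip_address", "biometric_id", "photo",
}

# ZIP3 areas with fewer than 20,000 residents must be zeroed out
# (illustrative stub; real values come from current Census data).
LOW_POPULATION_ZIP3 = {"036", "059", "102", "203", "205", "369", "556"}

def safe_harbor_redact(record: dict) -> dict:
    """Drop direct identifiers and generalize quasi-identifiers."""
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

    # Dates more specific than year must go; retain birth year only
    # (assumes an ISO "YYYY-MM-DD" string).
    if "dob" in clean:
        clean["birth_year"] = int(str(clean.pop("dob"))[:4])

    # Ages over 89 must be aggregated into a single "90+" category.
    if isinstance(clean.get("age"), int) and clean["age"] > 89:
        clean["age"] = "90+"

    # Geographic units smaller than a state: keep only the first three
    # ZIP digits, zeroed for low-population areas.
    if "zip" in clean:
        zip3 = str(clean.pop("zip"))[:3]
        clean["zip3"] = "000" if zip3 in LOW_POPULATION_ZIP3 else zip3

    return clean
```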
The critical limitation for 2026 and beyond: HIPAA de-identification does not equal compliance with state law. Washington, Maryland, and several other states impose consent requirements on the collection of consumer health data, not just its disclosure — a fundamentally different legal posture than HIPAA's framework.
Tier 1: Stand-Alone Consumer Health Privacy Laws
These states have enacted laws whose primary purpose is consumer health data privacy, extending far beyond HIPAA's covered entity scope. They impose the most rigorous consent requirements and, in Washington and under New York's pending bill, carry a private right of action.
| State / Law | Effective | Status | Consent Model | De-ID Standard | Private Right |
|---|---|---|---|---|---|
| Washington My Health My Data Act (MHMDA / HB 1155) | Mar 31, 2024 (small businesses: Jun 30, 2024) | Active (strongest) | Opt-in required for collection, sharing, AND sale, with a separate authorization for each. No general consent bundles. | Follows HIPAA 45 CFR §164.514 by reference; de-identified data excluded from coverage. | Yes — consumers may sue directly; enforcement is not limited to the AG. |
| Nevada SB 370 (2023) | Mar 31, 2024 | Active | Prior affirmative consent required to collect, share, or sell consumer health data. Prohibits geofencing near healthcare facilities. | Not explicitly defined; HIPAA de-id likely persuasive but not dispositive. | No — the AG and the Commissioner of Consumer Affairs enforce. |
| New York Health Information Privacy Act (NY HIPA / S 929) | Awaiting governor (passed legislature Jan 22, 2025) | Pending signature | Modeled on MHMDA; opt-in for collection, sharing, and sale. Strong data minimization requirements. | HIPAA Safe Harbor recognized; Expert Determination requires third-party certification. | Yes — a private right of action is proposed in the enrolled bill. |
Washington's MHMDA is widely considered the most aggressive consumer health privacy law in the country. It applies to any entity that collects, shares, or sells the consumer health data of Washington residents — regardless of whether that entity is a HIPAA covered entity. "Consumer health data" is defined broadly to include any personal information that identifies a consumer's physical or mental health status. Notably, geofencing technology around healthcare facilities — a practice used by some data brokers and advertisers — is explicitly prohibited.
Tier 2: Comprehensive Privacy Laws with Broad Health Definitions
Ten states have enacted comprehensive privacy laws that define consumer health data broadly — encompassing not just diagnosis, but also health status, conditions, and related information. These laws require opt-in consent for processing sensitive data and impose data protection assessment obligations.
| State / Law | Effective | Status | Health Data Scope | Consent / De-ID Notes |
|---|---|---|---|---|
| California CPRA / CMIA | Jan 1, 2023 | Active (strict) | Mental and physical health diagnosis or condition; sex life; sexual orientation. The CMIA separately covers medical information handled by providers and health plans. | CPRA grants consumers a right to limit the use and disclosure of sensitive data; the CMIA requires authorization before medical information is disclosed. De-id follows the HIPAA standard or statistical certification. CPPA rulemaking continues. |
| Colorado CPA (SB 21-190) | Jul 1, 2023 | Active | Broad — health status, condition, or treatment; sex life; biometric data (heightened rules from Jul 2025). Neural data added for the wearable context. | Opt-in required; data protection assessments mandatory for reproductive health data profiling. The cure period expired in 2025 — immediate enforcement authority now applies. |
| Connecticut CTDPA (SB 6) | Jul 1, 2023 (amendments Jul 2026) | Active | Broad health definition; neural data explicitly included. Minor protections tightened in the 2026 amendments. | Opt-in for sensitive data. Consumer health data cannot be sold without affirmative consent. SB 4 (2026) adds California-style data deletion portal provisions. |
| Maryland Online Data Privacy Act | Oct 1, 2025 | Active (strictest sale ban) | Broad health data including status and condition; precise geolocation. | Data minimization limited to what is necessary and proportionate — stricter than most states. Strictest in the country: sale of sensitive personal data, including health data, is banned outright, even with consent. Mandatory cure period expires Jan 31, 2026. |
| Oregon OCPA (SB 619) | Jul 1, 2024 | Active | Health status and condition; mental health; sex life; precise geolocation. | Opt-in for sensitive data. The 30-day cure period sunset on Jan 1, 2026 — the AG now has immediate enforcement authority. |
| Delaware DPDPA (HB 154) | Jan 1, 2025 | Active | Broad health definition; pregnancy explicitly named as a sensitive category — one of the few states to do so. | Opt-in for sensitive data. Enhanced teen protections: opt-in required for targeted advertising and sale for consumers under 18. |
| New Hampshire NHPDA (SB 255) | Jan 1, 2025 | Active | Health status and condition; broad coverage of non-HIPAA entities. | Opt-in consent for sensitive data; AG-only enforcement; a 60-day cure period applied initially. |
| New Jersey NJDPA (P.L. 2023, c. 266) | Jan 15, 2025 | Active | Broad health data including status; consumer health data cannot be sold without consent. Geofencing near healthcare facilities prohibited. | Proposed AG rules (2025) would require refreshed consent after 24 months of inactivity, immediate deletion on consent withdrawal, and dark-pattern prohibitions. Final rules expected in 2026. |
| Rhode Island RIDPA (HB 7787) | Jan 1, 2026 | Active | Broad health definition; notably low applicability thresholds (35,000 consumers, or 10,000 if more than 20% of revenue comes from data sales). | Opt-in for sensitive data. AG enforcement; cure period discretionary. Mirrors the Virginia framework with broader health coverage. |
| Utah UCPA (SB 227) | Dec 31, 2023 | Active | Broad health condition and status definition. | Opt-out model (not opt-in) for most processing; controllers must honor opt-out requests within 45 days. More business-friendly than other Tier 2 states. AG enforcement only. |
Tier 3: Comprehensive Privacy Laws with Narrow Health Definitions
Nine states define consumer health data narrowly — primarily as clinical diagnosis. An entity that collects heart rate data, menstrual cycle data, or mental wellness scores from a consumer app may not be subject to health-specific protections in these states, even if the data is sensitive in practice.
| State / Law | Effective | Status | Health Definition | Notable Provisions |
|---|---|---|---|---|
| Virginia VCDPA (HB 2307) + VCPA Amendment | Jan 1, 2023 (VCPA reproductive health: Jul 1, 2025) | Active | Narrow (diagnosis-focused) under the VCDPA; the VCPA amendment separately protects reproductive and sexual health data with an explicit consent requirement. | Virginia's dual-track approach means the VCDPA governs most health data narrowly, while the VCPA covers a specific sensitive subset more aggressively. |
| Texas TDPSA (HB 4) + TX RAIGA (HB 149) | Jul 1, 2024 (RAIGA: Jan 1, 2026) | Active | Narrow — limited to health diagnosis. Small businesses must obtain consent to sell sensitive data (including narrow health data). | RAIGA (2026) prohibits harmful AI uses and applies existing privacy requirements to AI-processed data. Biometric data consent clarified for AI training use cases. |
| Indiana INCDPA (SB 5) | Jan 1, 2026 | Active | Narrow (diagnosis-focused); mirrors the Virginia template. | AG enforcement only. Cure period discretionary. A relatively business-friendly framework. |
| Kentucky KCDPA (HB 15) | Jan 1, 2026 | Active | Narrow (diagnosis-focused). First state to specifically target automatic content recognition (ACR) data from smart TVs as sensitive data. | Opt-in required for ACR data — a novel provision not seen in other state laws. AG enforcement only. |
| Minnesota MCDPA (HF 4757) | Jul 31, 2025 | Active | Narrow (diagnosis-focused). Small businesses must obtain consent to sell sensitive data. | No cure period at all — an aggressive enforcement posture from day one, unique among comprehensive state laws. |
| Nebraska NDPA (LB 1074) | Jan 1, 2025 | Active | Narrow (diagnosis-focused). Small businesses must obtain consent to sell sensitive data. | No cure period. AG enforcement. The narrow health definition limits applicability in non-clinical health data contexts. |
| Montana MCDPA (SB 384) | Oct 1, 2024 | Active | Narrow (diagnosis-focused). Genetic information is separately covered by the Genetic Information Privacy Act, which explicitly adds neural data. | The Genetic Information Privacy Act is a separate, sector-specific overlay with its own consent requirements — important for genomic AI applications. |
| Tennessee TIPA (SB 73) | Jul 1, 2025 | Active | Narrow (diagnosis-focused). A unique $25M revenue threshold, in addition to the processing threshold, limits applicability to larger entities. | NIST Privacy Framework compliance creates an affirmative-defense safe harbor. Treble damages for knowing violations — the highest multiplier among state comprehensive laws. |
| Iowa ICDPA (SF 262) | Jan 1, 2025 | Active | Narrow (diagnosis-focused). The most limited in scope of all current state comprehensive laws. | Opt-out model (not opt-in) for most processing. 90-day cure period — the longest in any state. No private right of action. |
Tier 4: Sector-Specific and Biometric Laws with Health Relevance
Several states without comprehensive privacy frameworks have enacted targeted laws that are material to health data governance — particularly for organizations using biometric data, genetic information, or patient records.
| State / Law | Scope | Consent / Enforcement | Healthcare AI Relevance |
|---|---|---|---|
| Illinois BIPA (740 ILCS 14) | Biometric identifiers (fingerprints, retina scans, voiceprints, facial geometry). Covers private entities collecting biometric data from Illinois residents. | Written informed consent required before collection; no sale without consent. Private right of action with statutory damages of $1,000–$5,000 per violation; a 2024 amendment limits recovery to one violation per individual per collection method. | Highest litigation risk in the country for any AI application that processes facial, voiceprint, or retinal imagery. Active class actions against healthcare employers have exceeded $100M in settlements. |
| Texas Biometric Privacy Law (Bus. & Com. §503) | Biometric identifiers. Covers commercial entities. Separate from the TDPSA. | Opt-in consent required; AG-only enforcement (no private right); up to $25,000 per intentional violation. | The AG has investigated multiple healthcare and tech companies over AI-enabled facial recognition deployed without consent. |
| Washington My Health My Data + Biometric Privacy (RCW 19.375) | Both laws apply simultaneously to biometric health data. | MHMDA's private right of action and AG enforcement of RCW 19.375 under the Consumer Protection Act may apply concurrently — creating dual exposure for radiology AI and pathology imaging applications. | Radiology and pathology AI vendors processing patient-derived imagery for Washington residents face overlapping consent and data-use obligations under both statutes. |
| Arkansas Comprehensive Privacy Law (SB 947) | Effective Jul 1, 2026. Minors' data protections tightened in 2026 amendments. | AG enforcement. A broad health definition is anticipated in the final regulations. | Healthcare organizations with Arkansas patient populations should begin gap assessments before the July 2026 effective date. |
| Florida FDBR (SB 262) | Effective Jul 1, 2024. Applies only to controllers with more than $1B in global revenue — a significantly narrower scope than other states. | AG enforcement only; no private right of action. The narrow applicability means most healthcare AI startups fall outside its scope. | Limited applicability for most healthcare organizations, but large health systems and national payers may be covered. |
States to Watch in 2026–2027
The legislative calendar for 2026 is active. Several significant state bills are moving through committees or awaiting final votes that would materially expand health data protections.
| State / Bill | Status (as of Apr 2026) | Key Health Provisions |
|---|---|---|
| Maine LD 1822 — Maine Online Data Privacy Act (MODPA) | Passed Senate 18-16 (Mar 5, 2026); returning to the House | Data minimization required — collection limited to what is necessary and proportionate. Prohibits processing sensitive data (including health conditions) without explicit consent. If enacted, would be among the strongest laws nationally. Effective date: September 2027. |
| Alabama HB — Comprehensive Privacy Bill | Passed House unanimously (104-0); awaiting Senate vote | Comprehensive framework; health and sensitive data provisions expected. Strong bipartisan support signals likely passage in the 2026 session. |
| Massachusetts Proposed Comprehensive Privacy Law | Active deliberation | Anticipated to model Maryland's law — strict data minimization and a potential ban on the sale of sensitive health data even with consent. If passed, would be the most restrictive East Coast law. |
| Hawaii Geolocation Privacy Bill | Active 2026 session | Proposes an outright ban on the sale of geolocation data — material for any health app or healthcare AI system that uses location as a predictive or operational signal. |
De-Identification Standards Across State Frameworks
De-identification is both a compliance tool and a business strategy. For healthcare AI companies that monetize de-identified datasets, the specific de-identification standard recognized by each state jurisdiction determines whether a dataset is legally outside the scope of privacy requirements — or still regulated as consumer health data.
The emerging consensus pattern across state laws is as follows:
- HIPAA Safe Harbor (45 CFR §164.514(b)(2)): Recognized as the baseline de-identification standard in Washington, California, New York (pending), and most states that explicitly reference de-identification. Removal of all 18 HIPAA-specified identifiers is required, with no actual knowledge that the remaining information could identify an individual.
- Expert Determination (45 CFR §164.514(b)(1)): Accepted in most jurisdictions, but New York's pending NY HIPA bill would require third-party certification of the statistical analysis — adding cost and process overhead compared to the current HIPAA model.
- State-specific standards: Maryland has signaled that its AG may apply a stricter interpretation of de-identification than HIPAA's baseline, particularly for datasets that combine health data with precise geolocation. No formal regulatory guidance has been issued yet as of Q1 2026.
- Re-identification risk: California's CPRA requires that de-identified data include technical safeguards preventing re-identification and that businesses contractually prohibit re-identification by downstream recipients. This echoes the re-identification prohibition in HIPAA limited data set use agreements but extends it to the CPRA context.
The key open question for 2026 is whether state AGs will begin to challenge datasets labeled as "de-identified" that were compiled using geolocation proximity to healthcare facilities, clinical visit patterns, or behavioral health indicators — all of which are highly specific to individual health context even after formal identifier removal.
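One hedged way to operationalize that question is a k-anonymity-style screen over quasi-identifiers before release: count how many records share each combination of quasi-identifying attributes, and flag datasets where small cells are common. The sketch below assumes hypothetical column names and an illustrative threshold of k = 11; neither is a legal standard.

```python
# Screen a nominally de-identified dataset for residual re-id risk
# using a k-anonymity check over quasi-identifier combinations.
# Column names are hypothetical; k is a policy choice.
from collections import Counter

QUASI_IDENTIFIERS = ("zip3", "birth_year", "visit_facility_type",
                     "diagnosis_group")

def small_cell_fraction(rows: list[dict], k: int = 11) -> float:
    """Fraction of records whose quasi-identifier combination is shared
    by fewer than k records; each such record is a re-id candidate."""
    cells = Counter(tuple(r.get(q) for q in QUASI_IDENTIFIERS) for r in rows)
    at_risk = sum(n for n in cells.values() if n < k)
    return at_risk / len(rows) if rows else 0.0

# Example policy hook (hypothetical workflow): route risky datasets to
# the Expert Determination path instead of releasing them.
# if small_cell_fraction(dataset) > 0.01:
#     route_to_expert_determination(dataset)
```

A dataset that fails such a screen is a candidate for further generalization, or for the Expert Determination route, rather than release.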
Notable Enforcement Actions and Emerging Trends
Enforcement is no longer a theoretical risk. Multiple state attorneys general have opened investigations and reached settlements in 2024–2026 involving consumer health data handled by non-HIPAA entities. The following actions are illustrative of enforcement priorities.
The Washington AG's office issued formal guidance clarifying that consumer health apps, fertility trackers, and mental wellness platforms are covered "regulated entities" under the MHMDA regardless of their HIPAA status. Investigations of several app developers were reported by industry press following the guidance. The MHMDA's private right of action has also generated consumer class action litigation in Washington state courts.
Illinois remains the most active enforcement jurisdiction for biometric data in the country. Healthcare sector BIPA litigation has expanded from employer timeclock cases to AI-enabled diagnostic imaging platforms that collect facial geometry or retinal scans as part of patient intake or identity verification workflows. Class action settlements exceeding $50M have been reached in the healthcare-adjacent technology sector.
The Texas AG reached a landmark $1.4 billion settlement in 2024 with a major technology company over alleged biometric data collection without consent — the largest biometric privacy settlement in U.S. history at the time. Healthcare AI companies using facial recognition or biometric matching for patient identity verification face direct exposure under the Texas biometric statute, separate from TDPSA.
Connecticut's AG announced that health-related data collected by consumer apps — including pregnancy tracking, mental health, and chronic condition management applications — would be prioritized for enforcement review under the CTDPA's sensitive data provisions. The AG specifically identified the lack of clear opt-in consent flows as the primary compliance gap being observed in the market.
Cross-Cutting Themes for Healthcare AI Organizations
Several patterns emerge from the state-by-state landscape that are directly material to healthcare AI developers, data platform operators, and governance teams:
🔐 Consent Architecture is Critical
The shift from opt-out to opt-in consent for health data is not uniform across states, but the direction of travel is clear. Building consent architecture that supports granular, revocable, state-aware consent at collection time is the only scalable path forward.
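As an illustration, a minimal consent record might look like the sketch below, assuming an MHMDA-style model in which collection, sharing, and sale each require a separate, revocable authorization. The schema and field names are our own assumptions, not drawn from any statute.

```python
# Sketch of a state-aware consent record. Assumes an MHMDA-style model:
# collection, sharing, and sale each need their own authorization.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ConsentRecord:
    consumer_id: str
    jurisdiction: str                   # e.g., "WA", "NV", "CA"
    purpose: str                        # "collection" | "sharing" | "sale"
    granted_at: datetime
    revoked_at: datetime | None = None  # revocation honored going forward

    def is_active(self) -> bool:
        return self.revoked_at is None

def has_consent(records: list[ConsentRecord], consumer: str,
                jurisdiction: str, purpose: str) -> bool:
    """Purposes never inherit: consent to collect is not consent to sell."""
    return any(r.consumer_id == consumer and r.jurisdiction == jurisdiction
               and r.purpose == purpose and r.is_active()
               for r in records)
```

The key design choice is that each purpose is checked independently, which mirrors the separate-authorization requirements of the Tier 1 laws.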
📍 Geolocation + Health = High Risk
Multiple states now treat the combination of precise geolocation data with health context as a category warranting the highest protection level. This directly affects AI models using location as a predictive variable for health outcomes.
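A simple mitigation is to screen location points against known care-site coordinates before they ever reach a feature store. The sketch below uses a haversine distance check against a 2,000-foot radius, the distance MHMDA uses to define a geofence; the facility list and function names are illustrative assumptions.

```python
# Pre-processing screen that flags location points near healthcare
# facilities before they enter a feature pipeline. Facility coordinates
# are placeholders a real system would load from a registry.
from math import radians, sin, cos, asin, sqrt

GEOFENCE_RADIUS_M = 609.6  # ~2,000 feet, per MHMDA's geofence definition

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6_371_000 * asin(sqrt(a))

def near_healthcare_facility(lat, lon, facilities):
    """facilities: iterable of (lat, lon) pairs for known care sites."""
    return any(haversine_m(lat, lon, flat, flon) <= GEOFENCE_RADIUS_M
               for flat, flon in facilities)

# One defensible posture: exclude flagged points from model features
# entirely, since proximity itself is the regulated signal under
# MHMDA-style laws, rather than merely anonymizing them.
```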
🧬 De-ID Is Not a Universal Shield
HIPAA de-identification removes federal regulation but does not guarantee compliance with all state laws. Washington and Maryland impose obligations at the point of collection — before de-identification occurs.
⚖️ Private Rights of Action Are Expanding
Washington and Illinois give consumers direct standing to sue. New York's pending law would add a third major state with private rights. Class action exposure is a material financial risk for any company handling health data at scale.
🤖 AI Governance Laws Overlay Privacy
Texas's RAIGA (2026) and other pending AI-specific laws are beginning to layer on top of privacy frameworks, applying consent requirements specifically to AI training data. This is an emerging compliance frontier.
🏥 Non-Covered Entities Face Growing Exposure
The HIPAA gap is the central target of state legislation. Apps, wearables, AI diagnostic tools, and data brokers that previously operated outside HIPAA's reach are now the primary enforcement focus across multiple state AGs.
Implications for Data Governance and AI Infrastructure
For organizations building healthcare AI infrastructure — including data aggregation platforms, de-identification pipelines, and AI training dataset marketplaces — the state privacy landscape creates both compliance complexity and competitive opportunity.
Organizations that operationalize state-specific compliance at the data layer — mapping each data element to its source jurisdiction, consent record, and applicable law — will be able to move faster, monetize data more broadly, and provide their institutional partners with the legal defensibility they require. Those that do not will face mounting regulatory exposure as enforcement activity accelerates.
The practical implication is that consent management, provenance tracking, and de-identification documentation can no longer be afterthoughts in AI data pipeline design. They must be foundational infrastructure elements — logged, auditable, and jurisdiction-aware from the moment data enters a system.
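One way to make that concrete is to attach a provenance tag to every data element at ingestion and emit an append-only audit line for each processing action. The schema below is an illustrative assumption, not RAIHD's actual data model.

```python
# Sketch of jurisdiction-aware provenance metadata attached to each
# data element at ingestion. Field names are illustrative assumptions.
from dataclasses import dataclass, asdict
import json

@dataclass(frozen=True)
class ProvenanceTag:
    element_id: str           # stable ID for the data element
    source_jurisdiction: str  # e.g., "WA"; drives which law applies
    consent_record_id: str    # links to the consent authorizing collection
    deid_method: str          # "safe_harbor" | "expert_determination" | "none"
    deid_evidence_uri: str    # pointer to certification or documentation

def audit_log_entry(tag: ProvenanceTag, action: str) -> str:
    """Serialize one append-only audit line per processing action."""
    return json.dumps({"action": action, **asdict(tag)})

# Every transformation emits a line into an append-only log, so a
# regulator inquiry can be answered per element, per jurisdiction:
# log.write(audit_log_entry(tag, "included_in_training_set_v3"))
```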
This tracker will be updated on a quarterly basis or when material legislative or enforcement developments occur. For questions about how RAIHD's data governance infrastructure addresses state-specific compliance requirements, reach out directly or explore our Health AI Data Model overview.