AI in Indian Healthcare: Building HIPAA + DPDP-Compliant Patient Assistants
Indian healthcare AI engagements are uniquely tricky. You’re often serving patients in India under DPDP, processing claims that flow to US partners under HIPAA, and integrating with ABDM (Ayushman Bharat Digital Mission) standards on the health-records side. Three regimes, one app. Here is the architecture pattern we ship.
The three compliance regimes in plain terms
- HIPAA — applies if your data flows to US payers, providers, or business associates. Forces specific access controls, audit logging, and breach-notification timelines.
- DPDP Act — India’s data protection law (Digital Personal Data Protection Act, 2023). Consent-first, purpose-limited, deletion-on-request. Aligns broadly with GDPR.
- ABDM — India’s health-data interoperability standard. FHIR-based, consent-managed via gateway, identity tied to ABHA (Ayushman Bharat Health Account).
Architecture: separate the data planes
Don’t try to make one data store satisfy all three regimes. We split into:
- Patient interaction store (DPDP) — chat history, voice transcripts, preferences. Stored in India region with explicit consent and deletion API.
- Clinical record store (ABDM-aligned FHIR) — observations, encounters, conditions. Accessed via consent gateway.
- Cross-border claims store (HIPAA-aligned) — only when needed for US partner integrations, with BAA in place.
Logically separated stores let each regime apply cleanly without one regime’s rules contaminating the others.
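The split can be made explicit in code. A minimal sketch, assuming hypothetical plane names, regions, and record kinds (none of these identifiers come from a specific platform); the point is that every record kind routes to exactly one store, each carrying only its own regime's obligations:

```python
from dataclasses import dataclass
from enum import Enum

class DataPlane(Enum):
    INTERACTION = "interaction"  # DPDP: chat history, transcripts, preferences
    CLINICAL = "clinical"        # ABDM-aligned FHIR store
    CLAIMS = "claims"            # HIPAA-aligned, US partner flows only

@dataclass(frozen=True)
class PlanePolicy:
    region: str                     # where the store physically lives
    deletable_on_request: bool      # DPDP deletion-on-request
    requires_consent_gateway: bool  # ABDM gateway-mediated access
    requires_baa: bool              # HIPAA business-associate agreement

# Illustrative policies: India region for DPDP/ABDM data, US region only for claims.
POLICIES = {
    DataPlane.INTERACTION: PlanePolicy("ap-south-1", True, False, False),
    DataPlane.CLINICAL:    PlanePolicy("ap-south-1", False, True, False),
    DataPlane.CLAIMS:      PlanePolicy("us-east-1", False, False, True),
}

def route(record_kind: str) -> DataPlane:
    """Map a record kind to exactly one data plane, so no store ever
    accumulates obligations from more than one regime."""
    mapping = {
        "chat_message": DataPlane.INTERACTION,
        "voice_transcript": DataPlane.INTERACTION,
        "fhir_observation": DataPlane.CLINICAL,
        "claim_submission": DataPlane.CLAIMS,
    }
    return mapping[record_kind]
```

An unknown record kind raises rather than defaulting to a plane, which is the behaviour you want: a new data type must be classified before it is stored anywhere.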
Where AI fits without violating anything
The safe and useful AI surfaces:
- Appointment scheduling and triage. Conversational AI that books slots and classifies urgency. No clinical decision-making.
- Document intake. Vision-LLM extraction from prescriptions, lab reports, insurance cards. Output flows to FHIR.
- Patient education. Retrieval over hospital-approved knowledge base, scoped to "informational, not advice."
- Clinician copilot. Summarises patient history, drafts notes for clinician review. Always human-signed.
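For the document-intake surface, the extraction output should land in the clinical store as standard FHIR. A minimal sketch of the last step, assuming the vision-LLM has already produced structured fields (the patient reference and helper name are illustrative; `718-7` is the LOINC code for blood hemoglobin):

```python
def lab_result_to_fhir_observation(patient_ref: str, loinc_code: str,
                                   display: str, value: float, unit: str) -> dict:
    """Wrap one extracted lab value as a minimal FHIR R4 Observation."""
    return {
        "resourceType": "Observation",
        # "preliminary" until a human reviews the extraction; review promotes to "final"
        "status": "preliminary",
        "code": {
            "coding": [{
                "system": "http://loinc.org",
                "code": loinc_code,
                "display": display,
            }]
        },
        "subject": {"reference": patient_ref},
        "valueQuantity": {"value": value, "unit": unit},
    }

obs = lab_result_to_fhir_observation(
    "Patient/example-abha", "718-7",
    "Hemoglobin [Mass/volume] in Blood", 13.2, "g/dL",
)
```

Marking the resource `preliminary` keeps the human-review requirement visible in the data itself, not just in the workflow.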
Where AI should not go yet
Direct clinical advice without a clinician in the loop, autonomous medication recommendations, diagnostic claims. Regulatory framework isn’t there. Liability isn’t there. The model isn’t reliable enough. We decline these scopes.
Consent flows that actually work
DPDP requires explicit, granular consent. ABDM requires consent-gateway-mediated access. Layer them: patient consents in your app (DPDP), and that consent triggers a gateway request for any clinical data fetched from the broader ABDM network. Persist consent records with timestamps and the exact scope granted. Every access checks the consent record before reading.
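The consent-record check can be sketched as a small gate that every read passes through. Field and function names here are illustrative, not an ABDM API:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    patient_id: str
    scopes: frozenset          # exact scopes granted, e.g. {"read:lab_reports"}
    granted_at: datetime       # persisted with the grant, per DPDP
    expires_at: Optional[datetime] = None
    revoked: bool = False

def is_access_allowed(record: ConsentRecord, scope: str,
                      now: Optional[datetime] = None) -> bool:
    """Gate every data access on the persisted consent record:
    check revocation, then expiry, then the exact scope requested."""
    now = now or datetime.now(timezone.utc)
    if record.revoked:
        return False
    if record.expires_at is not None and now >= record.expires_at:
        return False
    return scope in record.scopes
```

Scope membership is an exact match by design: a grant for lab reports does not imply a grant for prescriptions, which is what DPDP's purpose limitation requires.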
Multilingual matters more in healthcare than anywhere
Patient comfort with English drops off sharply in healthcare contexts even among educated users. Hindi, Tamil, Bengali, Telugu, Marathi support is not optional. Indic LLMs (Sarvam, Krutrim) plus Bhashini ASR handle this well now. Don’t ship English-only.
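In practice this means routing each turn by detected language rather than bolting translation onto an English pipeline. A minimal sketch; the model identifiers are hypothetical placeholders for whichever Indic LLM and ASR endpoints you deploy:

```python
# Hypothetical model IDs — substitute your deployed Indic LLM endpoints.
MODEL_BY_LANG = {
    "hi": "indic-chat-hi",  # Hindi
    "ta": "indic-chat-ta",  # Tamil
    "bn": "indic-chat-bn",  # Bengali
    "te": "indic-chat-te",  # Telugu
    "mr": "indic-chat-mr",  # Marathi
    "en": "chat-en",
}

def pick_model(detected_lang: str) -> str:
    """Route a conversation turn to a language-matched model,
    falling back to English only for unsupported languages."""
    return MODEL_BY_LANG.get(detected_lang, MODEL_BY_LANG["en"])
```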
How we approach this at Velura Labs
Our Custom LLM Applications and Agentic Systems services have shipped patient-facing healthcare assistants under all three regimes. Pair with Document Processing for the document-intake side. Read our guardrails playbook for the broader compliance pattern. Talk to us before designing the architecture — we’ll save you a few months of compliance back-and-forth.