LLMSafetyHub

Does HIPAA Cover AI Tools? What Healthcare Startups Need to Know

AI is moving fast in healthcare. Here’s a plain‑English look at where HIPAA applies, where it doesn’t, and the questions to ask before you plug AI into workflows that touch patient data.

HIPAA in one minute

HIPAA protects PHI (protected health information): health data tied to an identifiable person. It binds covered entities (providers, health plans, clearinghouses) and their business associates, meaning any vendor that creates, receives, or stores PHI on a covered entity’s behalf. A business associate must sign a Business Associate Agreement (BAA) before it handles PHI. Crucially, HIPAA doesn’t regulate AI as a technology; it regulates who touches PHI and under what contract.

Where AI fits (and common mistakes)

  1. Pasting PHI into public AI tools (e.g., free chatbots) → risky, often non‑compliant.
  2. Using non‑HIPAA vendors for summarization/transcription → no BAA, no go.
  3. Assuming “private mode” = HIPAA → not necessarily; read the contract and data policy.
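One mitigation for mistake 1 is to scrub obvious identifiers before a prompt ever leaves your environment. The sketch below is illustrative only (the patterns are assumptions, not a standard): regex redaction is a last line of defense, not HIPAA de‑identification, since names and free‑text clinical details slip straight through.

```python
import re

# Illustrative patterns only: these catch obvious identifiers but are NOT
# sufficient for HIPAA de-identification (names and narrative details pass
# through untouched, as the example below shows).
PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "MRN": re.compile(r"\bMRN[:\s]*\d{6,10}\b", re.IGNORECASE),
    "DATE": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
}

def redact(text: str) -> str:
    """Replace likely identifiers with typed placeholders before the
    prompt leaves your environment."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

# Note the patient name survives -- regex is a backstop, not a control:
print(redact("Pt John Doe, MRN: 00412345, seen 03/14/2024, cb 555-867-5309"))
# → Pt John Doe, [MRN], seen [DATE], cb [PHONE]
```

Even with a redaction layer, the vendor questions below still apply: a filter reduces exposure, it doesn’t make a non‑BAA vendor compliant.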

Quick scenarios

Scenario A: A clinician pastes patient notes into a general‑purpose LLM. The vendor stores prompts to improve its models. That’s a disclosure of PHI to a vendor with no BAA, and a clear compliance risk.

Scenario B: A telehealth startup sends transcripts to a speech‑to‑text API without a BAA. Even if the data is encrypted in transit, the vendor is handling PHI as a business associate, and without a BAA that disclosure isn’t permitted under HIPAA.

Scenario C: You deploy an AI scribe from a vendor that signs a BAA and limits data use to your account. Safer, but you still need access controls and audit logs on your side.
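Scenario C’s “access controls and audit logs” can be made concrete: record who asked the AI scribe to do what, for which patient, when. A minimal sketch follows; the field names are illustrative, not a standard, and the hash chain simply makes after‑the‑fact tampering evident.

```python
import hashlib
import json
from datetime import datetime, timezone

def audit_record(user_id: str, patient_id: str, action: str, prev_hash: str) -> dict:
    """Build one append-only audit entry for an AI-scribe request.

    Hypothetical schema: store internal IDs, never raw PHI, in the log.
    Each entry hashes its own contents plus the previous entry's hash,
    so editing or deleting a past record breaks the chain.
    """
    entry = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "user": user_id,          # who made the request
        "patient": patient_id,    # internal ID, never raw PHI
        "action": action,         # e.g. "ai_scribe.summarize"
        "prev": prev_hash,        # link to the previous entry
    }
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    return entry

# Usage: each record's "prev" field carries the prior record's hash.
r1 = audit_record("dr_smith", "pt_001", "ai_scribe.summarize", "genesis")
r2 = audit_record("dr_smith", "pt_001", "ai_scribe.export", r1["hash"])
```

In production you’d write these entries to append‑only storage and verify the chain periodically; the point here is only that “audit logs” means structured, tamper‑evident records, not application print statements.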

Insurance: what’s usually covered (and not)

Standard cyber policies typically cover breach response costs (notification, forensics, legal defense). Whether they reach AI‑specific incidents, such as a vendor’s model leaking PHI or liability flowing from an AI‑generated clinical summary, varies by policy: some insurers exclude them, others offer endorsements. Read the policy language rather than assuming coverage.

Five questions to ask before using AI with PHI

  1. Will the vendor sign a BAA? What’s their data retention policy?
  2. Is PHI used to train models beyond our account? Can we opt out?
  3. Where is data stored and processed (region, sub‑processors)?
  4. Do we have access controls, audit logs, and prompt redaction?
  5. Does our current cyber policy address AI incidents? If not, what endorsements exist? See our 5 questions to ask your insurer.
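The five questions above boil down to a go/no‑go gate you can encode in your vendor‑review process. A toy sketch (the key names are assumptions for illustration, not a standard):

```python
# Hypothetical go/no-go gate encoding the five vendor questions above.
# A single failing answer means the vendor isn't cleared for PHI.
REQUIRED = {
    "signs_baa": True,                      # Q1: BAA in place
    "trains_on_phi_beyond_account": False,  # Q2: no cross-account training
    "data_region_known": True,              # Q3: storage/sub-processors documented
    "access_controls_and_audit_logs": True, # Q4: technical safeguards exist
    "insurance_addresses_ai_incidents": True,  # Q5: coverage confirmed
}

def vendor_cleared(answers: dict) -> bool:
    """Return True only if every required answer matches."""
    return all(answers.get(key) == want for key, want in REQUIRED.items())
```

An unanswered question counts as a failure here, which is the conservative default you want for PHI.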

Download: HIPAA + AI Risk Checklist (free)

No email required — direct download available.

Before you implement AI in healthcare...

Run the Free 10-Minute AI Preflight Check to spot HIPAA gaps, vendor liability issues, and compliance blind spots before they become costly problems.

Get the Free Preflight Check, or get the Complete Playbook.

2-page PDF with fillable checkboxes • No email required