AI for medical records — Omaha healthcare
Ambient documentation, prior-auth letter drafting, denials response, coding support — drafted by AI, signed by the clinician or coder. Built on a Business Associate Agreement, with the audit trail HIPAA expects.
Text Rosey · Schedule a call →
The workflow, end to end
What goes in, what the AI does, what comes out, what your team gets back.
- Input: Patient encounter (audio or transcript) + EHR context + prior auths and denials queue
- Work: Draft clinical note, prior-auth letter, denials response, coding suggestion; flag PHI handling exceptions
- Output: Clinician-reviewable note draft, prior-auth packet, denials response in queue, coding suggestions for review
- Saved: 5–15 minutes per encounter on documentation; 10–20 minutes per prior-auth packet
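In code terms, the handoff above is two small shapes: what the AI consumes and what it hands back for human review. A minimal Python sketch; the field names are our illustration, not any EHR's schema.

```python
# Illustrative data shapes only; field names are assumptions, not an EHR schema.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class EncounterInput:
    encounter_id: str
    audio_or_transcript: str            # ambient capture or typed transcript
    ehr_context: dict                   # problem list, meds, recent notes
    prior_auth_queue: list = field(default_factory=list)
    denials_queue: list = field(default_factory=list)

@dataclass
class DraftOutput:
    note_draft: str                     # clinician-reviewable, unsigned
    prior_auth_packet: Optional[dict]   # assembled for specialist review
    denials_response: Optional[str]     # queued for coder/specialist sign-off
    coding_suggestions: list = field(default_factory=list)
    phi_exceptions: list = field(default_factory=list)  # flagged PHI-handling exceptions
```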
What this looks like in production
Medical records work — clinical documentation, prior auths, denials, coding — has the highest documentation burden in healthcare and the strictest data-handling requirements. AI fits well because the work is high-volume, structured, and reviewable; it has to be implemented carefully because PHI is involved on every touch.
At an Omaha health system — Nebraska Medicine has been public about running 22+ in-house AI tools, an AI-driven contact center handling 70% of calls, and a 47% reduction in first-year nurse turnover via an AI-powered platform — the workflow that scales is AI-drafts-and-clinician-signs. Ambient documentation captures the encounter (audio or transcript), AI drafts the clinical note in your EHR's expected format, the clinician reviews and signs. Prior auths follow the same pattern: AI assembles the packet from chart and policy, the prior-auth specialist reviews and submits.
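A sketch of that drafts-and-signs loop, assuming a hypothetical `draft_note()` call standing in for whatever BAA-covered drafting service you use; the audit-entry fields are illustrative, not a specific EHR's audit schema.

```python
# Sketch of AI-drafts, clinician-signs. draft_note() is a placeholder for a
# BAA-covered model call; nothing here is a specific vendor's API.
from datetime import datetime, timezone

def draft_note(transcript: str, ehr_context: dict) -> str:
    """Placeholder for the BAA-covered drafting call."""
    raise NotImplementedError

def document_encounter(encounter_id: str, transcript: str, ehr_context: dict,
                       clinician_id: str, audit_log: list) -> dict:
    draft = draft_note(transcript, ehr_context)
    audit_log.append({
        "encounter_id": encounter_id,
        "event": "ai_draft_created",
        "at": datetime.now(timezone.utc).isoformat(),
    })
    # The draft is never auto-signed: it waits in the clinician's review queue.
    return {"encounter_id": encounter_id, "draft": draft,
            "status": "pending_clinician_review", "assigned_to": clinician_id}

def sign_note(note: dict, clinician_id: str, edits: str, audit_log: list) -> dict:
    # The clinician's edits and signature are the record of human review.
    note.update({"final_text": edits or note["draft"],
                 "signed_by": clinician_id, "status": "signed"})
    audit_log.append({"encounter_id": note["encounter_id"],
                      "event": "clinician_signed",
                      "by": clinician_id,
                      "at": datetime.now(timezone.utc).isoformat()})
    return note
```

The prior-auth and denials flows follow the same shape: assemble, draft, log, and hold for the specialist or coder to review and submit.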
The governance discipline is non-negotiable. HIPAA-covered entities cannot put PHI into a tool without a Business Associate Agreement; enterprise tiers of major AI vendors offer BAAs (consumer tiers do not). HHS OCR's January 2025 NPRM would treat AI software touching ePHI as a technology asset that must be in your inventory and risk analysis. Section 1557's affirmative duty to identify and mitigate bias risk in patient-care decision-support tools became effective May 1, 2025 — relevant for any AI that influences clinical decisions, not just AI that makes them.
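That posture can be made checkable before any PHI leaves the building. A minimal pre-flight gate, assuming hypothetical config keys (baa_signed, no_training_on_inputs, in_asset_inventory) that we name here for illustration; they are not regulatory terms of art.

```python
# Pre-flight governance gate run before any PHI is sent to the vendor.
# Key names are illustrative assumptions, not regulatory language.
VENDOR_CONFIG = {
    "baa_signed": True,             # HIPAA: no BAA, no PHI
    "no_training_on_inputs": True,  # contractual data-handling term
    "soc2_report_on_file": True,
    "in_asset_inventory": True,     # OCR NPRM direction: inventoried technology asset
    "risk_analysis_date": "2026-01-15",  # covered in the current risk analysis
}

def phi_preflight(config: dict) -> None:
    required = ("baa_signed", "no_training_on_inputs",
                "soc2_report_on_file", "in_asset_inventory")
    missing = [k for k in required if not config.get(k)]
    if missing:
        raise RuntimeError(f"PHI blocked: governance checks failed: {missing}")

phi_preflight(VENDOR_CONFIG)  # raises before any PHI touches the tool
```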
How we run it
- Identify an AI vendor that offers a BAA, contractual no-training data handling, and SOC 2 reports. Configure it for your environment.
- Map the AI's role per workflow — drafter for documentation, packet-assembler for prior auths, suggester for coding. Never decider for clinical care.
- Build the ambient documentation flow — capture, draft, clinician review and sign. Audit trail per encounter.
- Build the prior-auth flow — pull from chart and policy, draft the packet, specialist reviews and submits. Track turnaround time.
- Build the denials-response flow — analyze denial reason, pull supporting evidence, draft appeal. Coder or denials specialist signs.
- Bias-mitigation discipline — for any AI that influences clinical decisions, document the bias-mitigation review (a minimal record sketch follows this list). Section 1557's affirmative duty (effective May 1, 2025) requires it.
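A minimal sketch of what documenting that review could look like as a record; the fields are our illustration of the discipline, not language from the Section 1557 rule.

```python
# Illustrative bias-mitigation review record, one per patient-care
# decision-support tool. Field names are assumptions, not rule text.
from dataclasses import dataclass, field

@dataclass
class BiasMitigationReview:
    tool_name: str
    influences_clinical_decisions: bool
    review_date: str
    risks_identified: list = field(default_factory=list)  # e.g., inputs that proxy for protected traits
    mitigations: list = field(default_factory=list)        # e.g., clinician override, disparity audits
    reviewer: str = ""
    next_review_due: str = ""

review = BiasMitigationReview(
    tool_name="triage-suggestion-model",
    influences_clinical_decisions=True,
    review_date="2025-05-01",
    risks_identified=["uses prior utilization as a severity proxy"],
    mitigations=["clinician override required", "quarterly disparity audit"],
    reviewer="compliance-officer",
    next_review_due="2025-11-01",
)
```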
Common questions
- Do we need a BAA with the AI vendor?
- Yes — full stop, before any PHI touches the tool. Enterprise tiers of major AI vendors (OpenAI, Anthropic, Microsoft) offer BAAs; consumer tiers do not. Consumer-tier use with PHI is a HIPAA violation, regardless of how careful the staff is.
- What about HHS OCR's HIPAA Security Rule NPRM?
- Published January 6, 2025; not yet finalized as of mid-2026. Proposed rule would treat AI software touching ePHI as an inventoriable technology asset — meaning your tool inventory has to include it, and your risk analysis has to cover it. Treat it as expected forward direction even before finalization.
- Does Section 1557 apply to all AI use, or just clinical decision-support?
- Section 1557 applies to AI used in patient-care decision-support tools — and the affirmative duty to identify and mitigate bias risk became effective May 1, 2025. Pure documentation tools (ambient transcription) are lower-risk; clinical decision-support AI (triage, treatment recommendations, eligibility determinations) is squarely in scope.
- Can AI handle the prior-auth submission directly?
- AI assembles and drafts; specialists submit. Direct submission by AI is technically possible but introduces accountability complications — payer responses to AI-generated submissions are a developing legal area. Specialist-in-the-loop is the durable pattern.
- What about Nebraska-specific healthcare AI rules?
- No Nebraska DHHS AI-specific guidance has been published as of May 1, 2026. Nebraska providers default to federal HHS OCR / Section 1557 / HIPAA / FDA frameworks. NITC Standard 8-609 applies if you're a state contractor.
Sources
- Proposed rule would treat AI software touching ePHI as a technology asset that must be in the regulated entity's inventory and risk analysis — HIPAA Security Rule To Strengthen the Cybersecurity of Electronic Protected Health Information (NPRM), U.S. HHS Office for Civil Rights (OCR), 2025
- Section 1557 prohibits discrimination through the use of patient care decision support tools, including AI/clinical algorithms — Section 1557 Final Rule — Nondiscrimination Through Patient Care Decision Support Tools, HHS Office for Civil Rights, 2024
- Affirmative duty to identify and mitigate discrimination risks in patient-care decision support tools became effective May 1, 2025 — Section 1557 Final Rule — Nondiscrimination Through Patient Care Decision Support Tools, HHS Office for Civil Rights, 2024
- AI high performers are nearly 3x as likely as others to say their organizations have fundamentally redesigned individual workflows — The state of AI in 2025: Agents, innovation, and transformation, McKinsey & Company (QuantumBlack, AI by McKinsey), 2025
Text Rosey to begin.
Rosey is our executive-assistant bot. Text the number below — she'll ask two questions, offer three calendar slots, and put a 30-minute call on Jim's calendar.
Text Rosey · Schedule a call →