Blue Sage Data Systems
A use case we run for Omaha healthcare providers

AI for medical records — Omaha healthcare

Ambient documentation, prior-auth letter drafting, denials response, coding support — drafted by AI, signed by the clinician or coder. Built on a Business Associate Agreement, with the audit trail HIPAA expects.

Lincoln companies asking the same? See the Lincoln view →

Text Rosey · Schedule a call →

The workflow, end to end

What goes in, what the AI does, what comes out, what your team gets back.

Input
Patient encounter (audio or transcript) + EHR context + prior auths and denials queue
Work
Draft clinical note, prior-auth letter, denials response, coding suggestion; flag PHI handling exceptions
Output
Clinician-reviewable note draft, prior-auth packet, denials response in queue, coding suggestions for review
Saved
5–15 minutes per encounter on documentation; 10–20 minutes per prior-auth packet

What this looks like in production

Medical records work — clinical documentation, prior auths, denials, coding — has the highest documentation burden in healthcare and the strictest data-handling requirements. AI fits well because the work is high-volume, structured, and reviewable; it has to be implemented carefully because PHI is involved on every touch.

Nebraska Medicine, an Omaha health system, has been public about running 22+ in-house AI tools, an AI-driven contact center handling 70% of calls, and a 47% reduction in first-year nurse turnover via an AI-powered platform. The workflow that scales is AI-drafts-and-clinician-signs: ambient documentation captures the encounter (audio or transcript), AI drafts the clinical note in your EHR's expected format, and the clinician reviews and signs. Prior auths follow the same pattern: AI assembles the packet from chart and policy, and the prior-auth specialist reviews and submits.

The governance discipline is non-negotiable. HIPAA-covered entities cannot put PHI into a tool without a Business Associate Agreement; enterprise tiers of major AI vendors offer BAAs (consumer tiers do not). HHS OCR's January 2025 NPRM would treat AI software touching ePHI as a technology asset that must be in your inventory and risk analysis. Section 1557's affirmative duty to identify and mitigate bias risk in patient-care decision-support tools became effective May 1, 2025 — relevant for any AI that influences clinical decisions, not just AI that makes them.

How we run it

  1. Identify an AI vendor that offers a BAA, a no-training data commitment (your data is not used to train models), and SOC 2 reports. Configure for your environment.
  2. Map the AI's role per workflow — drafter for documentation, packet-assembler for prior auths, suggester for coding. Never decider for clinical care.
  3. Build the ambient documentation flow — capture, draft, clinician review and sign. Audit trail per encounter.
  4. Build the prior-auth flow — pull from chart and policy, draft the packet, specialist reviews and submits. Track turnaround time.
  5. Build the denials-response flow — analyze denial reason, pull supporting evidence, draft appeal. Coder or denials specialist signs.
  6. Bias-mitigation discipline — for any AI that influences clinical decisions, document the bias-mitigation review. Section 1557's affirmative duty (effective May 1, 2025) requires it.
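
The drafts-and-signs pattern behind steps 3–5 can be sketched in a few lines. This is a minimal illustration of the human-in-the-loop shape, with every name hypothetical; it is not our production code:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch: AI drafts, a named human signs, and every
# state change lands in a per-encounter audit trail.

@dataclass
class DraftRecord:
    encounter_id: str
    doc_type: str              # "clinical_note", "prior_auth", "denial_appeal"
    body: str
    status: str = "draft"      # draft -> signed (humans only)
    audit_trail: list = field(default_factory=list)

    def _log(self, actor: str, action: str) -> None:
        self.audit_trail.append({
            "ts": datetime.now(timezone.utc).isoformat(),
            "actor": actor,
            "action": action,
        })

    def ai_draft(self, model_name: str, text: str) -> None:
        # The model writes the draft and is recorded as the drafting actor.
        self.body = text
        self._log(actor=model_name, action="drafted")

    def sign(self, clinician_id: str) -> None:
        # Only a named human can move a draft to "signed"; the AI never signs.
        if self.status != "draft":
            raise ValueError("already signed")
        self.status = "signed"
        self._log(actor=clinician_id, action="reviewed_and_signed")

rec = DraftRecord(encounter_id="enc-001", doc_type="clinical_note", body="")
rec.ai_draft("scribe-model", "Patient seen for ...")
rec.sign("dr_smith")
```

The point of the sketch is the invariant, not the plumbing: the drafting actor and the signing actor are always distinct entries in the audit trail, which is what a HIPAA risk analysis will ask you to show per encounter.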

Common questions

Do we need a BAA with the AI vendor?
Yes — full stop, before any PHI touches the tool. Enterprise tiers of major AI vendors (OpenAI, Anthropic, Microsoft) offer BAAs; consumer tiers do not. Consumer-tier use with PHI is a HIPAA violation, regardless of how careful the staff is.
What about HHS OCR's HIPAA Security Rule NPRM?
Published January 6, 2025; not yet finalized as of mid-2026. The proposed rule would treat AI software touching ePHI as an inventoriable technology asset: your tool inventory has to include it, and your risk analysis has to cover it. Treat it as the expected forward direction even before finalization.
Does Section 1557 apply to all AI use, or just clinical decision-support?
Section 1557 applies to AI used in patient-care decision-support tools — and the affirmative duty to identify and mitigate bias risk became effective May 1, 2025. Pure documentation tools (ambient transcription) are lower-risk; clinical decision-support AI (triage, treatment recommendations, eligibility determinations) is squarely in scope.
Can AI handle the prior-auth submission directly?
AI assembles and drafts; specialists submit. Direct submission by AI is technically possible but introduces accountability complications — payer responses to AI-generated submissions are a developing legal area. Specialist-in-the-loop is the durable pattern.
What about Nebraska-specific healthcare AI rules?
No NE DHHS AI-specific guidance has been published as of May 1, 2026. Nebraska providers default to the federal HHS OCR / Section 1557 / HIPAA / FDA frameworks. NITC Standard 8-609 applies if you're a state contractor.

Related

→ Start here

Text Rosey to begin.

Rosey is our executive-assistant bot. Text the number below — she'll ask two questions, offer three calendar slots, and put a 30-minute call on Jim's calendar.

Text Rosey · Schedule a call →

or call 415 481 2629