Blue Sage Data Systems
A real concern Omaha leaders raise

Our board doesn't get AI

Common pattern. McKinsey 2025 found only 17% of organizations report board-level AI governance. Your board isn't unusual in not yet understanding AI; it becomes a problem only if leadership leaves it that way.

Lincoln companies asking the same? See the Lincoln view →

Text Rosey · Schedule a call →

Common questions from Omaha leaders

How common is this pattern?
Very. McKinsey 2025 found only 17% of organizations report direct board responsibility for AI governance — vs. 28% reporting CEO-level. The gap is structural: AI moved fast, board agendas didn't. Mid-market boards in particular often lack a board member with AI background, which makes the agenda-setting harder.
Should we just educate the board?
Not as a tutorial. Educating a board on AI as a topic ('here's what an LLM is, here's how transformers work') is the wrong framing — it makes AI a technical matter rather than a governance matter. The right framing: board-level AI governance is a strategy decision about how this organization will work in three years, not a technical decision about which tools the IT department buys.
How do we bring the board into the conversation?
Three moves. (1) Reframe the board agenda — add an AI-strategy item to the regular cadence (quarterly minimum), with materials that focus on outcomes and risk rather than technology. (2) Bring in a board-level AI strategy document that walks through governance, risk, opportunity, and trade-offs. (3) Have leadership commit to the strategic direction and ask the board to oversee, not approve every detail. The board's job is governance and accountability; leadership's job is execution.
What if our board is risk-averse and that's slowing AI adoption?
Risk-aversion at the board level is appropriate. The fix isn't pushing the board to be less risk-averse; it's giving them a risk-management framework that lets them say yes to specific moves with appropriate guardrails. SHRM 2026 found only 49% of organizations have AI use policies, and only 25% of those feel their policy is 'future-proof.' A board that sees a credible governance scaffold tends to be more comfortable with the rollout that follows.
Do we need a board AI committee?
Not necessarily a separate committee, but board-level AI oversight should sit somewhere in the governance structure — usually the audit committee, or the technology/risk committee where one exists. The structural decision matters less than the cadence: at minimum, AI strategy review quarterly, with formal action items captured in the minutes.
What does board-level AI governance look like in practice?
(1) An AI strategy adopted by the board, refreshed annually. (2) An AI use policy adopted by the board, reviewed quarterly. (3) Standing reports to the board on AI use, incidents, governance metrics. (4) Named board accountability — usually a single board member who carries the AI portfolio. (5) AI risk discussed alongside other major risks, not as a separate technical agenda. McKinsey 2025's high performers do this systematically; their differentiator is workflow redesign, but the governance scaffold is what makes the redesign possible.

Related

→ Start here

Text Rosey to begin.

Rosey is our executive-assistant bot. Text the number below — she'll ask two questions, offer three calendar slots, and put a 30-minute call on Jim's calendar.

Text Rosey · Schedule a call →

or call 415 481 2629