AI Governance for Local Government

AI adoption in local government is accelerating. The governance question hasn’t kept up.

The Governance Gap

Every municipality faces the same tension: the pressure to adopt AI is real, the risks of adopting it without structure are real, and the anxiety about getting it wrong, or falling behind, is real.

That tension exists at every level. Leaders who haven’t started feel the pressure to begin. Leaders well into implementation feel the pressure of others moving faster. Leaders who’ve built sophisticated AI capabilities still face governance questions they haven’t resolved. The common thread isn’t where you are on the adoption curve. It’s whether you have a framework for the decisions the technology keeps requiring.

The AI conversation in local government has largely been shaped by technology vendors. The questions they lead with (which tools, which platforms, which models) are important. But they’re second-order questions.

The first-order question is the same one that governs any organizational change: what are we actually protecting, and do our systems enforce that?

Government leaders face constraints that make governance especially critical. Staff using cloud AI tools may be creating FOIA exposure without realizing it. Vendor-controlled platforms can change models, pricing, and capabilities without notice. Institutional knowledge built on a vendor’s system is institutional knowledge you don’t fully control. And in many organizations, employees are already using AI tools informally, without clear policies, without visibility, and without alignment to organizational priorities.

These aren’t technology problems. They’re leadership governance problems that require both a framework for deciding what AI should do and technical implementation that enforces those decisions.

Where The Shift That Sticks Meets AI

The Shift That Sticks wasn’t designed for AI specifically. It was designed for navigating change when the ground keeps moving. That turns out to be exactly what AI governance requires.

Face the Truth asks: what is actually happening with AI in your organization right now? Not the vendor pitch, not the fear, but the reality. Which staff are using AI tools? What data is leaving your network? Where are the governance gaps?

Define What Matters asks: what are you protecting regardless of which tools emerge? Data sovereignty, staff trust, institutional knowledge, public accountability: which of these is non-negotiable, and do your current practices enforce that?

Make It Real builds the governance systems that ensure AI operates within the boundaries your leadership has defined. Not theoretical policies. Working systems with clear rules, appropriate access controls, and accountability structures.

Reinforce the Change sustains those governance structures as the technology evolves. The tools will keep changing. The question of what you’re protecting doesn’t.

What Simon Sinek calls the Circle of Safety (the boundary within which people can focus on work rather than protecting themselves from internal threats) becomes harder to maintain when AI introduces uncertainty about roles, data, and institutional knowledge. AI mirrors the framework you wrap around it. An organization with clear values and strong governance will deploy AI in alignment with those values. An organization without that clarity will find that AI amplifies whatever patterns already exist, including the ones leaders would rather change. Governance is what holds that boundary intact as the technology shifts.

This is what we’re building. Not AI tools. A governance framework, grounded in proven methodology, that helps local government leaders adopt AI in alignment with what their organizations actually value.

What Does Your Governance Gap Look Like?

Most governance gaps aren’t visible from the top. They show up in the questions leaders haven’t had a reason to ask yet. These four are a starting point.

Do you know which staff members are currently using AI tools? In most organizations, employees are already experimenting with AI on their own. Research calls them “secret cyborgs,” people using AI to boost productivity but not telling leadership, often because there’s no policy framework that makes reporting feel safe.

Do you know what data is passing through those tools? Every cloud AI interaction sends organizational data to external servers. In government, that can mean constituent information, internal deliberations, or draft documents flowing through systems your organization doesn’t control and may not be able to account for under public records requests.

Is there a policy framework that makes AI experimentation safe to talk about? When staff can’t discuss how they’re using AI, the organization loses the ability to learn from what’s working, prevent what’s risky, and build shared standards. The silence itself becomes the governance gap.

Could you answer a FOIA request about your organization’s AI usage today? If the answer is uncertain, that’s the gap between where your organization is and where your governance needs to be. Not a failure of leadership. A condition that changed faster than the structures around it.

If any of those questions surfaced something you’re not sure about, you’re not behind. You’re at the point where the right conversation matters more than the right tool.

Join the Conversation

Governance questions don’t resolve in isolation. They benefit from thinking out loud with peers who face the same constraints: the same FOIA exposure, the same political dynamics, the same pressure to adopt without a plan.

We’re hosting virtual sessions for city managers, county administrators, and government leaders who are working through what AI governance means for their organizations. Small groups. Confidential. Facilitated using the same methodology we apply to every leadership challenge. Whether you’re early in adoption, deep into it, or somewhere in between, the conversation is designed for where you actually are.

No pitch. No product demo. A conversation about what matters when the technology keeps changing.

AI Governance Conversation

Tell us where you are and what you’re thinking about. We’ll be in touch as sessions are scheduled.

Your information stays with us. This isn’t a sales funnel. It’s a peer conversation.
