What’s the Purpose of Strategic Planning in the Age of AI?

“Gemini, please write me a three year strategy for this downtown Toronto Community Health team.”


Fifteen seconds pass, then a comprehensive strategy flows across the screen, complete with an environmental scan, ambitious yet inspiring priorities, and a perfectly plotted set of goals, objectives and actions.

First – to get this out of the way – the strategy the AI produces for this CHC isn’t wrong. It’s actually brilliant and compelling. The vision is one we might write -- Empowered, healthy, and thriving communities where everyone belongs. The priorities focus on integrated care, health equity and harm reduction redesign. It appears to synthesize and understand the context for this unique organization. It’s comprehensive.

So why not just print this strategy out and call it a day?

We’ve been doing high engagement strategic planning in health, higher education and all their intersections for 20 years, and this is definitely a moment you might call fundamentally disruptive. So how do we unpack the purpose of strategic planning in a way that helps us figure out how to integrate AI while keeping the humanness of our organizations at the centre?

We don’t have hard and fast answers – just some observations that we hope will continue to spark thoughtful, ongoing conversations.

 

1. The perfect strategy doesn’t do anything to move people.

First – and foremost – people don’t do things just because they’re told to. Whether we call this “resistance to change” or the poetry and joy of being human, people’s motivations aren’t entirely rational (if they are at all). Management gurus have lamented for the past 30 years that large scale change efforts almost never fulfill their goals. The answer isn’t a more perfect, more comprehensive plan – or a better, more compelling campaign to get people to “buy in.” People – especially the smart, committed and purposeful people in health and education -- need a chance to contribute to and shape the future, and to locate their own – sometimes idiosyncratic and unique – approaches to it.

2. Quantitative data and compelling arguments are not enough to drive change in complex systems

Similar to the pesky unpredictability of humans in organizations, data and evidence don’t alone drive change. We often encounter clients who believe that if they can just build a compelling case through data, they will be able to achieve systemic or policy change. But data and evidence are only a small part of influencing system change -- political will, leadership focus, organizational energy and capacity, and sheer human preference all tend to be bigger factors in decision making. 

We only have to look at the Community Health Centre example we started with for a recent example. There is ample evidence that lives are saved with harm reduction efforts like supervised consumption – but in Ontario, the political forces turned in another direction, and a hard-fought system was dismantled. 

AI can generate excellent analyses, and can point you in useful directions – but you need humans to make meaning of the nuances of the environment you are in, and to suss out the actual openings for change.

3. AI tools are based on imperfect data sets – and yet the comprehensiveness of their analyses creates a false sense of authority or certainty

Much of the work we do in strategy right now is about health equity, or inclusion, or challenging existing assumptions. And we also know that the way we’ve historically gathered data doesn’t fully reflect our populations – and certainly doesn’t reflect future populations.

AI tools are only as inclusive as their training data, their design, and the humans who create them. There ARE many people working to call attention to and address AI bias – but today, we need to work doubly hard to ensure the questions we are asking AI to help us with are asked with a critical lens.

4. Strategy needs to be adaptable, learning-focused and emergent

The Community Health Centre strategy we asked Gemini to make at the beginning of this piece has 27 different actions, across three years. The actions are super specific – e.g., “Expand same-day appointment capacity across the main site and prioritize enrolment for individuals who are homeless, newcomers, or medically unattached.”  So what happens if you don’t have the capacity to expand same day appointments?  That goal automatically becomes a “failure.”

Strategic direction is about a future that hasn’t happened yet – in a constantly changing world that has never existed before. It’s tempting to want the certainty of a plan that outlines exactly what you need to do over the next three years. But even the best forecasts can’t anticipate what will and won’t actually be doable. Strategies need to set high level direction where we see clear intention (i.e., “improve access to rapid care”) and then set people up to explore different ways to reach this – and to learn from what works and what doesn’t.

One Light, Many Boats

We call our approach to change leadership  “one light, many boats” – i.e., you need to give people the opportunity to think together to define the future, and then to find their own pathway toward that shared direction. We’ve learned that in complex systems -- where the way to the vision is nebulous and constantly evolving -- people move when they have a shared sense of purpose and the support for their own way forward.  There are a lot of different ways to move toward goals like integrated primary care – we need to create space for the diversity of innovation and the power of individual ownership.

So where should we use AI in strategic planning?

In this moment in early 2026, we use generative AI in strategy in many different ways. Easily accessible tools like Gemini, ChatGPT, Copilot, Claude and others do an excellent job of synthesizing existing knowledge, or creating a first draft of an environmental scan – while also allowing you to safeguard the privacy of your data.

There are numerous tools that help with automating tasks like note-taking, basic theming, and laying out initial frameworks or analyses. These are also a great soother for the tyranny of the blank page, enabling you to generate first drafts or offering multiple options for a title, tagline or way to organize ideas. They can create videos and images that translate our ideas into concrete shape. We can “think with” our AIs by asking them to consider additional perspectives, generate operational or trend forecasts, or identify data that points us to areas that need further exploration. And they can certainly help us identify measures and targets that help us learn and adapt.

There is definitely a place for AI in the world of strategic planning. But just like in the days when our clients asked us to produce strategies that didn’t end up being “the binder on the shelf no one looks at,” we have to remember that the product of strategic planning isn’t really the point. It’s important, but it means nothing if it hasn’t been created with the minds, hearts and aspirations of the people who will have to make it real.
