“AI can be an unusual voice that gives you fresh ideas, makes you think differently, and provides the kind of fuel that sparks innovation. But ultimately, humans provide the context, the judgment, and the ability to bring strategy to life.”
– Christian Stadler
About Christian Stadler
Christian Stadler is a professor of strategic management at Warwick Business School. He is the author of Open Strategy, which was named a Best Business Book by the Financial Times and Strategy + Business and has been translated into 11 languages. His work has been featured in Harvard Business Review, the New York Times, the Wall Street Journal, CNN, BBC, and Al Jazeera, among others.
What you will learn
- How AI is changing strategic decision-making
- The role of AI as a co-strategist, not a replacement
- Why AI's disruption of boardroom discussions can enhance them
- How open strategy leads to better execution
- Leveraging collective intelligence for stronger strategies
- The rising importance of political awareness in leadership
- Engaging employees to drive innovation and strategy
Transcript
Ross Dawson: Christian, it’s a delight to have you on the show.
Christian Stadler: Thanks for having me, Ross. It is a delight for me as well.
Ross: So, you have been delving deep into a lot of your background in open strategy. You’ve also been looking at the role of AI in strategy and strategic decision-making. At a high level, how do you see the role of AI in strategy making today?
Christian: I’m an optimist. I think generally, by nature, and also when it comes to how AI can actually be useful for strategists, more and more people are coming to see AI as a partner in many different areas of what we do. I think that’s true for strategy as well. We have some form of co-decision-making, co-intelligence, or an additional voice that we can use in the strategy-making process. For that, it’s really cool.
Ross: These are human-first processes, I suppose. The more complex the decisions are, the more multifaceted they become, and the more the human element needs to be at the forefront. Strategy seems to fall into that category. What are the places where AI might provide support, complementary perspectives, or analysis that are particularly valuable?
Christian: Strategy, obviously, consists of different “boxes” or activities. Some involve coming up with new ideas—something new you want to do in your strategy. Other parts involve fine-tuning and formulating the strategy. Then there’s the execution and implementation side. Probably in each of these aspects, it makes sense to use AI in slightly different ways.
When it comes to ideation, I can ask a tool for ideas, such as setting up a new product line. I played around with this early on when ChatGPT started gaining traction. Even then, it was phenomenally good if you guided the conversation as a strategist. If you just ask ChatGPT, you get generic suggestions, and sometimes they don’t make sense. For example, I once asked for a suggestion for a streaming service. One idea was to create some form of entertainment platform and partner with universities.
Being a professor, I know that universities don't work like that. Professors aren't told to participate by some central directive. You need to find ways to motivate individual professors. As I pushed back on the platform idea, better suggestions came up. As long as you drive the conversation and are smart about it, AI can provide good ideas.
When it comes to fine-tuning and formulating, the tool can be quick. I’ve been experimenting with a company in Austria for over a year. They make sneakers—Gieswein. We tried seeing what happens in board meetings when we bring ChatGPT into the mix. For instance, when we needed a press release, the tool quickly drafted something. In this case, we didn’t need an agency, which saved time and resources.
However, when it comes to execution, that’s more of a human game. You need to convince people to buy into ideas and feel comfortable with new directions. AI has limitations here, but other tools can help involve more people. Greater involvement aids implementation.
Ross: There’s a lot there I’d like to dig into. We might do a bit of hopping around.
Christian: It’s a bit long-winded, isn’t it? I just keep talking on and on. My bad.
Ross: It’s all good. One interesting point is that part of Amazon’s internal processes involves starting with a press release for a potential product. Then they work backward to figure out how to achieve it. That’s something ChatGPT can facilitate in board meetings. You can draft a press release and discuss if this is something you want to pursue.
Digging into the boardroom specifically, how do you see AI being valuable when working with a group of directors? For instance, red-teaming—having the AI critique decisions—seems promising. What are other potential applications?
Christian: One surprising and insightful aspect of using ChatGPT in boardrooms was its disruptive nature. Imagine a group that has worked together for a long time. The process is smooth because they know how each other thinks. Then ChatGPT comes in and disrupts the flow. For instance, I might ask someone to read a page of suggestions from ChatGPT mid-meeting. It forces the group to stop and think.
Initially, I thought they would hate it, but they actually liked it. The disruption brought up ideas that wouldn’t have otherwise come up in the discussion. It changed the process in a positive way, rather than simply adding information.
Ross: So you were distilling conversations, summarizing, and presenting them to the board?
Christian: Essentially, I was a disturbance. For example, when discussing market entry into the U.S., I’d interrupt and say, “Here’s what ChatGPT suggests.” Having to read and discuss AI-generated content mid-meeting isn’t smooth, but that lack of smoothness was beneficial.
Ross: That reinforces the idea that the facilitator plays a critical role. You acted as an AI-enabled facilitator. Your choice of interventions determined the success of the process.
Christian: Absolutely. We tried different approaches: preparing content beforehand, engaging during the meeting, and doing post-meeting analysis. When the tool worked independently, the output was too superficial. It needed human direction. I’m not an industry expert, but with a feel for strategy, you can create significant benefits.
Ross: One of the specific applications of AI is strategic decision-making, where you already know what decision needs to be made. The decision-making process typically involves defining the decision, generating options, assessing those options, and ultimately making a choice. AI can assist with ideation and evaluating options.
How do you see AI’s role evolving in formal strategic decision-making processes, both today and in the future?
Christian: You mentioned options, and I’ve always been a big fan of scenario planning—drawing pictures of what the future could look like in various versions. Some companies use two scenarios; others prefer four, making it more complex. Whatever strategy you pursue, it needs to be tested against these different futures.
AI tools like ChatGPT are excellent at generating plausible future scenarios. Of course, human direction is necessary, but AI can support the process. Writing compelling, coherent stories about the future is a skill not everyone possesses, and AI can facilitate this.
For now, I see AI primarily as a facilitator. In the medium term, it helps strategists think through different possibilities. Whether AI will ever be capable of independent strategic thinking, I can’t say—I’m no magician. But for now, its power lies in augmenting human intelligence rather than replacing it.
Ross: In some of your work, you’ve referenced ethical concerns. Humans have the ability to grasp broader context, values, and the human experience in ways AI cannot. Personally, I believe AI will remain a strong supporting tool, but I doubt it will take over higher-order strategic decision-making.
Christian: I agree. As you know, I’ve long advocated for involving more people in strategy-making. Opening up the process brings in fresh, unconventional voices, leading to better strategies. AI can serve as one of those voices—offering unexpected insights that force us to think differently.
However, human context is essential. AI-generated suggestions must be assessed within the company’s reality. AI becomes even more valuable when integrated with internal company data, allowing for more tailored insights. That said, I’m cautious about assuming all relevant data is neatly captured. In big tech, this might be the case, but for many medium-sized businesses, it’s not.
For example, I spoke with a CEO who runs a company that makes high-end ski gloves. His strategic decisions—what products to produce and in what quantities—aren’t based on hard data. Instead, he relies on conversations with retailers and industry experts. This highlights a limitation of AI: in many cases, businesses don’t have the vast datasets AI needs to be truly effective.
Ross: Let’s dig into open strategy. Could you provide a simple framing of what open strategy is? And how does it connect to AI?
Christian: The easiest way to understand open strategy is to contrast it with traditional strategy-making. Historically, strategy was developed behind closed doors by a small group—perhaps with the help of a consulting firm.
Open strategy, on the other hand, involves bringing in more voices. This approach not only generates fresher, better ideas but also makes execution smoother. The majority of failed strategies don’t fail because they were bad ideas—they fail due to poor execution. Various surveys suggest that up to 90% of strategic failures stem from execution issues.
When people are involved in strategy-making, they develop buy-in. They also begin to see how strategic goals connect to their work. In our book Open Strategy, we surveyed executives who had implemented open strategy. About 69% said it led to better ideas, and 70% noted that execution was significantly improved.
Ross: We can think of open strategy in different layers. One layer involves opening strategy within the organization, allowing all employees to participate. Another layer involves engaging external stakeholders—partners, suppliers, customers, or even the public.
What are your thoughts on these different levels of openness?
Christian: Absolutely. There are different degrees of openness. You don’t necessarily have to involve all employees—you might just expand participation beyond the usual small group. This is the most common approach and brings significant benefits, even in hierarchical organizations.
For example, I worked with a company in the Middle East, where hierarchy is deeply ingrained. Initially, there was hesitation about involving middle management in strategy-making. Eventually, they agreed, and the results were fantastic. It helped align the organization behind the new strategy.
In this case, we first collected middle management’s input separately because they might have hesitated to speak openly in front of top executives. Later, we shared their insights with leadership. This process built enough trust that in a subsequent round, both groups could participate together.
As for external engagement, it depends on the phase of strategy-making. During the ideation phase, involving external voices can be valuable. You don’t need to share company secrets—just frame the challenge broadly and let external contributors provide fresh ideas.
Even the U.S. military has done this. The Pentagon has held open exercises where the public contributes strategic insights, but they don’t necessarily disclose how those insights are used.
For execution, however, you want broader internal involvement. Everyone in the company needs to understand what’s happening. IBM ran one of the largest open strategy initiatives, involving 160,000 participants. Managing a discussion at that scale requires AI-powered tools to structure and synthesize input.
Ross: Open strategy can be seen as a form of collective intelligence. Whether it’s eight board members, 100 managers, or an entire organization, the challenge is structuring participation effectively.
What’s the state of the art in integrating diverse perspectives into a coherent strategy? How can we improve?
Christian: I have to admit, I’m still a bit old-school when it comes to strategy. I prefer in-person workshops because they allow for deeper discussions. That said, large-scale engagements require digital tools.
A structured approach is key. One method is to start with a broad survey to identify major trends. This helps leadership gauge the organization’s sentiment. Understanding what people think is happening is just as important as knowing what’s actually happening. If leadership’s actions contradict employees’ perceptions, it can create resistance.
Next, bring people into structured workshops. Different teams can develop and pitch ideas. A “Dragon’s Den” format works well—teams compete to refine the best ideas. Facilitators play a crucial role in guiding discussions and ensuring productive outcomes.
Ultimately, open strategy isn’t about turning companies into democracies where everyone votes on decisions. Instead, leadership uses the insights generated through participation to make informed choices. The key is communicating back to employees—explaining what decisions were made and why. People don’t expect to be the final decision-makers, but they value having a voice.
Ross: In a world of accelerating change—particularly with AI—leaders need to refine new capabilities.
What skills do senior executives, board members, and strategy-makers need to be effective in today’s landscape?
Christian: First, they need to engage with AI. It’s as simple as replacing Google with an AI tool when searching for information. Play around with it, get familiar. These models are user-friendly, and you don’t need programming skills to experiment.
Second, leaders must navigate the increasing entanglement between business and politics. In past decades, it was easy to overlook politics, but that illusion is gone. Leaders must understand how to operate in politically charged environments.
It’s not about whether a company is conservative or liberal—successful brands exist at both extremes. Nike is known for progressive values, while Chick-fil-A is conservative, yet both maintain broad customer bases. The problem arises when companies appear inconsistent or opportunistic. For example, many big tech firms once positioned themselves as liberal but later courted the Trump administration. That shift alienated both sides.
Finally, attracting and retaining talent is critical. Many CEOs cite talent shortages as their top concern. Engaging employees, making work meaningful, and fostering motivation will be key leadership skills.
Ross: That ties into the aging workforce issue. Countries with declining populations are often the most receptive to AI and robotics. This shifts the workforce balance, making talent attraction even more crucial.
Christian: Absolutely. Immigration could help, but politically, it’s difficult. Interestingly, even Elon Musk—despite his conservative shift—supports H-1B visas because he recognizes the need for skilled immigrants.
Ross: As we wrap up, what excites you most about your upcoming research, particularly regarding AI and strategy?
Christian: I’m exploring the intersection of corporate strategy and individual decision-making. I’m also using AI to analyze emotions in strategic discussions. For example, I’m working with IBM data to study how emotions impact idea adoption in large-scale strategy sessions. Using AI for research is something I really enjoy.
Ross: That sounds fascinating. Thank you for sharing your insights!
Christian: Thank you! It was a pleasure.