Just Run Your Business on Spells and Magic. If you rely on AI summaries and Top 10 lists, you already are.
- William Lum

- 6 days ago
- 8 min read

We like to believe we’re more knowledgeable and wise than superstitious leaders of ancient times. After all, we don’t consult oracles. We don’t cast spells to make decisions. Most of us would call that crazy or ignorant.
But pause on that for a second—why is it ignorant?
Let's take it apart. Why are those sources of "truth" flawed? Because those methods don’t reliably describe the world. They don’t help us predict outcomes. They don’t stand up to scrutiny. In other words, they don’t require understanding—they replace it.
And that’s the uncomfortable part:
We may be drifting back toward the same pattern.
The modern oracle: AI summaries, Top 10 lists, and outsourced thinking
We don’t use tarot cards anymore. We use AI summaries. “TL;DR” has become part of our everyday language. Think about that. We’ve normalized the idea that something can be too long to take the time to really understand. We just want the bottom line. What we don’t realize is what gets lost when we jump straight to it.
And it’s not just AI summaries. Society has gotten comfortable outsourcing thinking itself, both in our personal and professional lives. Leaders bring in consultants—not just for capacity or perspective—but to tell them what to do. To interpret the situation. To define the strategy. At that point, they’re not advisors. They’re oracles.
To be clear—AI summaries and Top 10 lists aren’t the problem. They can be useful starting points. Sources of inspiration. But they’re not a strategy. They’re not a checklist. They’re someone else’s conclusion—stripped of the reasoning that made it valid. And they’re definitely not a substitute for understanding your own context. The work is in figuring out what applies, what doesn’t, and why. That part should never be outsourced, because it’s the muscle you use to solve future challenges. When it is outsourced, you become the unnecessary piece in the machine. The same pattern shows up more often than we realize: customers and leaders who heard about the latest shiny idea or trend and want the magic bullet without really understanding how it works or what its limitations are.
I was in a meeting with a group of marketing leaders who had recently joined the team. We were discussing priorities, and one of them said, “We need co-dynamic lead scoring.”
I hadn’t heard that term in over a decade. I was there when we first started implementing this solution with customers. At the time, it sounded advanced—almost like a breakthrough. It was built on logic, but not on real data. Over the years, I realized its limitations and replaced it with something far more grounded and effective. But here it was again—brought back like a magical incantation that would solve the perceived problem. Not because it was right. Because it was familiar, a fixture on marketing “must have” lists for years.
Social media trained us to consume in fragments—headlines, clips, hot takes. Attention spans followed. Now AI summaries are accelerating it. And here’s where it gets dangerous:
A recent analysis found that Google’s AI-generated search summaries are wrong about 1 time in 10—and not in obvious ways, but in subtle, convincing ways that are hard to catch. It’s not entirely the system’s fault: it was incentivized to fill in missing information (to guess) so it could give a confident answer, despite the guessing required. At scale, that means millions of people are getting confident—but flawed—answers as their first impression.
Because this is exactly how oracles worked:
Confident answers. Limited scrutiny. Blind trust.
The problem isn’t access to knowledge—it’s the loss of effort
We live in a time where nearly all human knowledge is accessible within seconds. And yet, we’re becoming less knowledgeable. That sounds contradictory until you realize:
Knowledge isn’t access—it’s effort.
Real understanding requires time. It requires wrestling with ideas. Sitting with ambiguity. Questioning what you just read. You aren't learning anything when you are just mindlessly following steps.
I am a child of an age long before AI summaries, and even before web search engines. Knowledge was gained through sheer passion, an emptiness that craved to be filled. You treasured every bit of information you could find on the topic you cared about, and each piece took effort to find and understand. Growing up, I loved fighter jets and computers. I would borrow and re-borrow the few books we had on those topics. I would clip articles and keep them in scrapbooks. Information held value; it deserved scrutiny and admiration. Learning something was a badge of honour, and it made you special. Because information was so hard to attain, you didn't just absorb what you found. You took time to think about it, questioned its meaning, and worked out how it fit with the rest of your understanding of the topic. You gained deeper understanding. That kind of thirst seems missing from far too many people today.
We’re optimizing for speed instead (easy to measure). But faster doesn’t mean better (harder to measure). And in leadership, it often means worse.
What I learned from great leaders
I’ve been fortunate to work with some exceptional leaders. I tried to take the best parts of each of them and blend that into my own style. There’s bias in that, of course. But one trait showed up consistently:
Curiosity.
Not the kind that skims. The kind that digs. That asks “why” one more time. That reads past the summary. The best leaders I’ve seen don’t outsource their understanding. They might outsource for scale but keep skills and experiences in-house. This is how you keep from being beholden to a vendor.
While working in marketing, I found numerous leaders and peers happy to outsource to agencies. At the time, we outsourced a large portion of our digital advertising to agencies. Not just execution—but thinking. Strategy, optimization, even interpretation of results.
It was just how things were done. Then a new leader arrived who wanted not just to build skills in-house, but to build a world-leading group of experts. He kept asking questions of each of his teams:
Why are we spending so much here? What’s actually driving performance? If this stopped working tomorrow, would we even know why?
He didn’t just review reports—he spent time with the team actually doing the work. He met with people multiple layers down. He asked them to walk him through how things worked, where they were guessing, where they were confident. He was building his own understanding from the ground up.
And then he made a call that went against the grain: if this is a major area of spend, it should be a core competency. So we started bringing that capability in-house. It wasn’t fast. It wasn’t easy. We were learning as we went. But over time, two things happened. Our costs dropped significantly. And our results improved. Not because we found a better vendor.
Because we finally understood what we were doing.
What stood out even more was how he operated day to day. He kept notes—not just on performance, but on people. What they were working on, how they thought, where they were strong, where they were interested in expanding their skills and experience.
He treated understanding the system—and the people in it—as part of the job.
The Cliff Notes problem in leadership
You’ve likely seen this trap before. In school, there were two types of students: the ones who read the book, and the ones who read the Cliff Notes.
One understood. The other got by.
That same divide exists in leadership today. There are leaders who make time for long-form understanding. And there are leaders who operate on summaries. Summaries are useful.
But they are not a substitute for thinking. I’ve seen what happens when that gap shows up in practice.
Early in my time on a team, we were dealing with a data flow issue in our funnel. It was messy, and multiple people were trying to diagnose it. One of my peers quickly stepped in with a confident answer. She looked at a field, interpreted it based on the label, and concluded she had found the root cause.
It sounded right.
It was wrong.
But the confidence behind it moved things forward anyway—and created a cascade of follow-on issues. What was missing wasn’t intelligence. It was depth. She hadn't taken the time to understand how the system actually worked beneath the surface—how the data flowed, what the dependencies were, where the edge cases lived. So I started digging. Reading through documentation. Tracing the flow. It took more time. But it led to the real issue.
That experience stuck with me. Because it wasn’t an isolated moment. It’s a pattern you see when people rely on surface-level understanding. They can talk about the work. They can sound confident. But when it comes time to reason through complexity, things break down.
And that’s the risk.
What worries me isn’t that we have tools like AI. It’s how we’re choosing to use them.
There are many among us who default to shortcuts. Just-in-time knowledge. Surface-level understanding.
Leaders aren’t there to repeat information. They’re responsible for making decisions under uncertainty. And that requires:
- Seeing assumptions, not just conclusions
- Understanding trade-offs, not just outcomes
- Recognizing risks hidden in the fine print
Summaries strip most of that away.
Which creates a dangerous illusion:
Confidence without comprehension.
And it raises a real question:
What happens when our ability to act at scale outpaces our ability to think and understand impact? This is the disaster waiting to become the next career-ending news story.
A better approach (that still respects your time)
This isn’t about rejecting AI summaries, "Must Have" lists, case studies and consultants.
It’s about using them properly.
- Use AI to triage what matters
- Use case studies and lists as inspiration
- Use consultants for initial expertise, perspective, and guidance
- Go deep on anything tied to decisions
- Form your own view
This is part of why I say work in a field you love. That way, staying informed by reading articles and case studies is enjoyable and automatic, and so is thinking about what you read and watch. I share interesting articles with my team and followers (currently, my forays into using AI in my daily work #aiAdventures) to generate conversation... perhaps a perspective I hadn't thought of, or a novel use case. As a seasoned leader, I try to make sure my team has plenty of opportunities to gain experience, but I roll up my sleeves when something comes along that I've not done before and that we need to develop expertise in, which lets me add to my own skills and experience as well.
Even a simple habit helps:
After reading something important, write down:
- What new idea/concept/process/etc you learned
- What you believe
- What you’re unsure about
- What would change your mind
I'm old school and like to read and write on my PC; I find this harder to do in a mobile workflow. The act of writing helps sort out your thoughts and exposes areas you might not understand. That’s where understanding compounds.
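For those who prefer a little structure, the habit above can be sketched as a tiny script that stamps out a blank reflection template to fill in by hand. This is just an illustrative sketch; the filename and wording of the prompts are my own, not a prescribed tool:

```python
from datetime import date
from pathlib import Path

# Illustrative template mirroring the four reflection prompts above.
TEMPLATE = """\
## Reflection: {today} ({source})

- What new idea/concept/process you learned:
- What you believe:
- What you're unsure about:
- What would change your mind:
"""


def add_reflection(source: str, notes_file: str = "reading-notes.md") -> str:
    """Append a blank reflection stub for `source` to a running notes file."""
    entry = TEMPLATE.format(today=date.today().isoformat(), source=source)
    with Path(notes_file).open("a", encoding="utf-8") as f:
        f.write(entry + "\n")
    return entry


# Example: add_reflection("Article on AI search summaries")
```

The point isn't the automation; it's that the same four questions get asked every time, so the thinking step never gets skipped.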
Final thought
Time is limited. So the goal isn’t to do more in the time we have. It’s to understand more of what actually matters.
Because when you build real understanding, something changes. You don’t just have answers—you have judgment. You see around corners. You catch what others miss.
While everyone else is operating on summaries, you’re operating on insight.
And that gap compounds. In a world addicted to shortcuts, learning deeply is no longer just a skill.
It’s a superpower.


