Some of my best ideas don’t show up when I am actively trying to write them.
They show up later – when I am in my car rushing to pick up my son because I had to get just “one more thing finished” before I left, knowing full well I can’t actually stop and put the thought down. It’s probably the same reason people say they do their best thinking in the shower.
That’s actually how this post came together: sitting at stoplights (thank goodness for DFW stoplights!) between bursts of racing to pick up my son from school, right after I had nearly completed a website triage – when the last thing on my mind should have been AI.
You know, those little fragments of things you came across or are still processing that get stuck in your brain. Then at the strangest of times they reappear in fully formed thoughts that disappear as soon as you are in a place to actually get them down in writing. Or maybe that’s just me…although I doubt that is the case.
AI can run the data. Humans still have to think.
One of the most dangerous things I’m seeing right now is people treating AI as definitive truth.
I was working through a website triage, running data and comparisons between an old site and a newly launched one. On the surface, the numbers looked rough…dire, almost. Traffic appeared to be down. Certain metrics had dropped. Side-by-side comparisons painted a pretty grim picture.
And to be fair, AI surfaced all of that nearly instantly. What would have taken me hours of sifting through data, I was able to hand to my trusty sidekick, which ran the numbers and comparisons in mere minutes.
But when I slowed down and actually started to dig into some of the data myself, something didn’t seem right. The numbers and what the client was telling me just weren’t lining up.
When the numbers tell an incomplete story
As I looked closer, a different picture started to emerge.
Some keywords were performing better.
Certain rankings were strong.
Specific pages were gaining traction in ways the old site never had.
It started to become clear that the new site wasn’t failing as originally thought. At least not in the manner they were thinking.
The issue became deciding which metrics – and which keywords – actually mattered.
Data doesn't tell you what matters.
It only tells you what moved.
AI can surface performance.
Humans decide whether it's success.

The moment that changed everything
When I questioned one of the conclusions my trusty sidekick spit out and explained why I was questioning it, the response shifted.
“Oh — that’s right. That changes things.”
And that was the moment it all clicked for me.
AI didn't get it wrong.
It just didn't have all the variables.
My trusty sidekick can only give me answers based on the information we humans input. And let's face it, if you are anything like me, I tend to ramble and am all over the map when I chat with it.
Most people treat AI like a search engine. They ask one question. Accept one answer. Move on.
But when you interact with AI – push back, clarify, refine – you are actively influencing the reasoning path. Which means the output isn't just AI's intelligence. It's a collaboration with the human driving it.
The real risk isn’t AI. It’s unquestioned certainty.
Lately, I’ve begun to notice more and more people treating AI output as gospel. One answer. One conclusion. No follow-up questions.
That’s where things get dangerous.
Because performance only matters if you understand what you are measuring for.
Data without context isn’t clarity – it’s noise.
AI can process data faster than any human ever could.
What it can't do is recognize when the data doesn't tell the whole story.
AI reflects how we think – The uncomfortable truth about inputs
The more I work with AI, the more I realize it doesn’t just give answers. It mirrors how we ask questions.
Our assumptions.
Our framing.
Our certainty.
Our blind spots.
Which means the output isn't objective truth. It's a reflection of the interaction.
AI only works with the inputs it is given.
Humans don't input data cleanly or linearly.
And real businesses don't operate in clean systems anyway.
If we don’t interrogate our own thinking, AI will happily build on it. Right or wrong. Good or bad.

My son thinks I’m killing 13 people every time I use AI
He’s deeply against AI. Like, give-you-the-look-when-he-sees-you-using-it against it.
And the more I work with it, the more I understand why.
The other day I was using Shopify’s AI to write some code. He saw me and said, dead serious: “You’re killing like 13 people every time you use that.” And he truly believes this with all his heart. It isn’t the first time he has brought up how many resources AI consumes. And in a way I admire him for being so young and thinking so deeply about the world he lives in.
But I looked at him and said, “If you want to spend hours (or days) writing code that takes AI minutes to do, be my guest. You’re welcome to do it for me instead.” Of course, no comment there – just a chuckle, because he knew mom was on to something.
But here’s the thing – I actually understand his resistance more than he probably realizes.
What he really fears is a world where humans stop thinking.
And working with AI every day has made it clearer to me how often thinking is still required – and how obvious it becomes when it is missing.
The danger isn't the tool.
It's the temptation to stop thinking once the tool is involved.
The real risks people see:
Outsourcing responsibility → But AI doesn’t remove responsibility – it magnifies the absence of it.
Letting systems decide → But bad strategy with AI just happens faster. It is still bad strategy.
Using AI to avoid thinking → But unexamined inputs produce confident-looking nonsense.
So to me the real risk isn’t AI.
It’s humans who stop questioning.
AI doesn’t replace judgment.
It reveals who was relying on it in the first place.
And yes – AI helped me write this post
Did AI help me write this post?
Yes. It absolutely did.
Not because I didn’t have the ideas – but because I think out loud, in fragments, and in circles. My trusty sidekick ChatGPT helps me organize those thoughts and connect them more clearly.
But the ideas are still mine.
The judgment is still mine.
The responsibility is still mine.
AI didn't decide what mattered here.
It helped me explain why it mattered.
What this actually means for your business
That’s the part I keep coming back to.
I often hear people say, “Just have ChatGPT write your SEO copy.”
And yes – you absolutely can.
But the difference between decent copy and copy that actually performs comes down to whether you understand what variables to give it in the first place.
SEO copy isn’t just about keywords. It’s about intent, hierarchy, context, and clarity. AI can execute – but it needs direction.
AI is an accelerator, not a strategist.
The strategy lives in the prompt – not the output.
SEO and AEO aren't just about words.
They are about strategy, and strategy lives in the exceptions.
Strategy doesn't live in the data.
It lives in the questions we ask after seeing it.
Metrics don't move because content exists.
They move because there is intention behind it.
Website triage isn't a checklist.
It's a discovery process that is different for every person.
AI can accelerate execution. Humans decide direction. AI can surface patterns. Humans decide whether they matter. AI didn't replace my role. It clarified it.
And that difference still matters.
Wisdom still lives in context, experience, restraint, and knowing when to question a confident answer.
When people say AI replaces SEO or copywriting, what they really mean is it replaces execution. The thinking still has to come from somewhere.
AI can surface the patterns. I help you decide which ones actually matter for your business. If you’re ready to stop guessing and start knowing what to prioritize, let’s talk about a Strategic Digital Audit.
