Your Demand Engine Isn't Ready for AI


Hello Reader

When I was the director of marketing for an events management company, we ran demand programs the way you pack for a trip at 5am, throwing things in and hoping you didn't forget something important.

Every Monday we'd look at the pipeline number, panic, and launch something.

By Friday we'd already moved on to the next thing without checking if the last one worked. If we missed our number, budget was on the line (and so was my job).

And that was years ago.

Now the pressure is even higher, because the CEO wants us to wave our AI wand and do it faster, with less budget and fewer people.

But AI on top of an unstable system just compounds the instability.

So before we talk about where to add AI, we need to talk about what powers your programs, and that's workflow design.

Where We Are Today

Before we get to workflow design, it’s helpful to understand the new rules of demand generation.

As marketers we’re operating under three key shifts:

#1 Your Buyers Have Left the Funnel

Your buyers aren't sitting in one channel waiting for your email sequence to arrive. They’re bouncing between AI search tools, peer communities, private Slack groups, podcasts, LinkedIn threads, and conversations you’ll never see.

Obviously this is the short version. If you want more detail on the buyer journey and how it's changing, check out this edition from a few weeks ago.

#2 More Content Won't Save You

AI made content production cheap and fast.

Every competitor on your list can now publish more posts, send more emails, launch more ads, and spin up more landing pages in a week than your team used to produce in a quarter.

#3 One Touchpoint Isn't Enough

Today, budgets are tighter, committees are larger, and risk tolerance is lower.

A single white paper download no longer earns you a demo. Buyers want steady proof across multiple touchpoints before they'll even take a meeting with your sales team.

We Were Productive and Still Losing

Let's go back to my prior events job.

Every quarter at that company started the same way: A whiteboard full of ideas, a rush to execute, and no conversation about how any of it connected to what we ran last quarter.

We were always building something new.

Here's a campaign strategy workflow I wish I'd taken the time to build back then.

In this workflow I'm responsible for aligning strategy and setting the KPIs before I collaborate with AI to build out the campaign themes and channel mix. My favorite part of this workflow is asking Claude to act as my target persona so I can test my assumptions.
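The persona test can be as simple as a templated role-play prompt. Here's a minimal sketch; the persona wording and helper name are illustrative, not part of the original workflow:

```python
def persona_test_prompt(persona: str, assumptions: list[str]) -> str:
    """Ask the model to role-play the target persona and react to each
    campaign assumption, so weak themes surface before launch."""
    bullet_list = "\n".join(f"- {a}" for a in assumptions)
    return (
        f"Act as {persona}. For each assumption below, say whether it "
        "matches how you actually evaluate vendors, and why:\n"
        + bullet_list
    )
```

Feeding the output of this prompt back into your campaign themes is where the assumption-testing happens.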

Here's another one for content creation:

When I hand off to my team, they're responsible for starting with our brand guidelines to set the creative direction, then they collaborate with AI for drafting and scaling, but they always close out the loop with final review and approval.

Adding AI to the Recipe

Once you've plotted out your workflows and cleaned up the process, it's time to decide what you'll delegate to AI.

One of the first workflows I tackled was campaign analysis. If you don't know how your programs are converting, you have no way to know if you're helping the sales team hit their goals.

What you need is a performance loop that runs on a shorter cycle.

Here's what mine looks like.

After a campaign runs for two weeks, I export the performance data from Meta and Google and pull screenshots of every creative variant. I load it all into Claude (the data and the visuals together, because Claude can connect creative decisions to performance patterns).

Then I run it through a sequence:

First, I ask for a creative analysis. What's working, what's underperforming, and what patterns it sees across the variants.

Because it can see the actual ads, it can connect performance to creative decisions. In one recent campaign, it identified that the headline with a customer proof point was outperforming the office imagery variants by a wide margin.

Second, I have it dig into demographics. Which age cohorts are converting, where the spend is being wasted, and whether there's an anomaly it finds surprising. Last month it flagged that interactive content was outperforming the static ads.

If you want to get really fancy you can even ask Claude to build you an interactive dashboard.

The whole review takes about 30 minutes. Before this workflow, it took half a day and I still missed things.
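One way to sketch the loading step of this loop in code: bundle the exported performance data and the creative screenshots into a single message payload, so the model sees the numbers and the visuals together. The prompt text and file handling here are illustrative, not the author's actual setup:

```python
import base64

# Illustrative prompt for the first pass of the review sequence:
# a creative analysis across all ad variants.
CREATIVE_ANALYSIS_PROMPT = (
    "Here is two weeks of campaign performance data plus screenshots of "
    "every creative variant. What's working, what's underperforming, and "
    "what patterns do you see across the variants?"
)

def build_analysis_message(performance_csv: str, screenshots: list[bytes]) -> dict:
    """Bundle exported performance data and creative screenshots into one
    user message, so the model can connect creative decisions to
    performance patterns in a single pass."""
    content = [
        {"type": "text", "text": CREATIVE_ANALYSIS_PROMPT},
        {"type": "text", "text": performance_csv},
    ]
    for png in screenshots:
        content.append({
            "type": "image",
            "source": {
                "type": "base64",
                "media_type": "image/png",
                "data": base64.b64encode(png).decode("ascii"),
            },
        })
    return {"role": "user", "content": content}
```

The returned message can then be passed to an LLM API client as one entry in its `messages` list; the second pass (demographics) would reuse the same payload with a different prompt.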

Optimize All Your Workflows

You can apply this same approach to any workflow in your demand engine (or any other program!).

The pattern is the same:

Standardize the inputs, decide what you and your team handle, assign the rest to AI, and run it on a cadence.

If we're talking about topic validation, that means scraping your sales call transcripts and Gong notes, feeding them into your LLM (minus the PII), and clustering the objections and recurring questions.

Every Monday, marketing reviews the top three objection clusters and assigns one to a campaign theme for the month.
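A toy version of that clustering step, assuming the transcripts have already been scrubbed of PII. The keyword buckets are invented for illustration; a real pipeline would hand this to an LLM or an embedding model:

```python
from collections import Counter

# Illustrative keyword buckets; replace with LLM or embedding clustering
# over PII-scrubbed transcript snippets.
OBJECTION_BUCKETS = {
    "pricing": ["price", "cost", "budget", "expensive"],
    "integration": ["integrate", "api", "crm", "stack"],
    "security": ["security", "compliance", "soc2", "data"],
}

def top_objection_clusters(snippets: list[str], n: int = 3) -> list[str]:
    """Count which bucket each transcript snippet falls into and
    return the n most frequent clusters for Monday's review."""
    counts = Counter()
    for snippet in snippets:
        lowered = snippet.lower()
        for bucket, keywords in OBJECTION_BUCKETS.items():
            if any(k in lowered for k in keywords):
                counts[bucket] += 1
    return [bucket for bucket, _ in counts.most_common(n)]
```

The top cluster each week becomes the candidate campaign theme.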

For content production, that means maintaining a single reference table that maps your topic clusters to your ICP segments and active campaign themes.

When it's time to produce, you feed that table into your LLM with your brand guidelines and let it draft the full content kit (article, social post, email etc.) matched to the right audience and message. Your two-week production cycle collapses into an afternoon.
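The reference table itself can be as lightweight as a dictionary. This sketch shows one way to turn a row of it, plus brand guidelines, into a drafting prompt; every name and value here is a stand-in, not the author's actual table:

```python
# Hypothetical reference table: topic cluster -> ICP segment and active theme.
REFERENCE_TABLE = {
    "pricing": {"icp_segment": "mid-market finance leads", "theme": "ROI in 90 days"},
    "integration": {"icp_segment": "ops managers", "theme": "works with your stack"},
}

BRAND_GUIDELINES = "Plainspoken, practical, no hype."  # stand-in for the real doc

def build_content_kit_prompt(topic: str) -> str:
    """Combine one row of the reference table with brand guidelines into a
    single drafting prompt for the full content kit."""
    row = REFERENCE_TABLE[topic]
    return (
        f"Brand guidelines: {BRAND_GUIDELINES}\n"
        f"Audience: {row['icp_segment']}\n"
        f"Campaign theme: {row['theme']}\n"
        f"Topic: {topic}\n"
        "Draft a content kit: one article, one social post, and one email, "
        "all matched to this audience and message."
    )
```

Because the table is the single source of truth, every asset in the kit inherits the same audience and message.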

Here are a few more workflow ideas from marketers I follow on Reddit:

EntreprenuerRideAlong combines ICP lookalikes, LinkedIn profiles and ChatGPT to identify high-converting leads.

Either_Bunch is layering AI agents on top of playbooks.

If online directories matter for your business you can try AntiqueDark’s approach.

Before You Add Anything New

When I think back to that events job, effort was the one thing we never lacked. Every campaign started from zero because we had no foundation and no clear workflow.

If I could go back, I would take the time to document our workflows, score them honestly and then decide where to add AI.

Does this feel familiar?

Use the scorecard below to see where your workflows are breaking. It's a great place to start and will show you exactly where to focus.

Score Your Workflows (Before You Add AI)

Pick one workflow (campaign launches, content production, lead scoring, whatever runs most often) and score it across five dimensions.

A perfect score is 25.

In my experience working with marketing teams, most land somewhere between 8 and 14, so don’t feel discouraged if you end up in that category.

1. Repeatability

Does this workflow run the same way every time, or does it depend on who's running it?

1 = Fully ad hoc — different every time
3 = Mostly structured — some steps are consistent
5 = Fully standardized — anyone on the team could run it

2. Context Clarity

Are the inputs defined before work begins, or does every cycle start with "what are we doing again?"

1 = Briefs vary wildly or don't exist
3 = Basic template that sometimes gets used
5 = Structured inputs required before work starts

3. Judgment Boundaries

Is it clear where humans decide and where AI assists, or is everyone just prompting and hoping?

1 = Blurred — no one knows who owns what
3 = Informal agreement — "AI does the first draft, I guess"
5 = Explicit ownership with defined review gates

4. Optimization Cadence

Is there a scheduled rhythm for reviewing and improving this workflow, or do you only look at it when something breaks?

1 = Reactive — we fix it when it fails
3 = Monthly review — someone looks at the numbers
5 = Predefined optimization loop — regular review with documented changes

5. AI Leverage

Is AI embedded inside the structured workflow, or is it something people use on the side when they remember to?

1 = Occasional prompt use by individuals
3 = Team is experimenting with tools
5 = AI is a defined step in the workflow with clear inputs and outputs

Score Yourself

21–25: Your workflow is stable. You're optimizing, not guessing.

15–20: Moderate variability. The workflow works but it's fragile — one person leaves and it breaks.

10–14: High inconsistency risk. You're producing output but you can't predict results.

Under 10: Pipeline volatility is structural. This is the workflow to redesign first.

If you scored under 15, you now know exactly where to focus.

Look at whichever dimension scored lowest — that's your bottleneck. Fix that one before you add any new AI tools, launch any new campaigns, or hire any new people.
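The scorecard above translates directly into a small helper. This is a sketch; the band descriptions are paraphrased from the scoring guide, and the dimension keys are invented names:

```python
DIMENSIONS = [
    "repeatability",
    "context_clarity",
    "judgment_boundaries",
    "optimization_cadence",
    "ai_leverage",
]

def score_workflow(scores: dict[str, int]) -> tuple[int, str]:
    """Sum the five 1-5 dimension scores and map the total to a band."""
    total = sum(scores[d] for d in DIMENSIONS)
    if total >= 21:
        band = "Stable: you're optimizing, not guessing."
    elif total >= 15:
        band = "Moderate variability: it works, but it's fragile."
    elif total >= 10:
        band = "High inconsistency risk: output without predictable results."
    else:
        band = "Structural volatility: redesign this workflow first."
    return total, band

def bottleneck(scores: dict[str, int]) -> str:
    """The lowest-scoring dimension is the one to fix first."""
    return min(DIMENSIONS, key=lambda d: scores[d])
```

Running `bottleneck` on your scores points at the dimension to fix before adding any new tools.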

Want to Level Up Your AI Game?

If your team is ready for a hands-on AI strategy session, my custom-designed workshops are built to uncover the workflows that can save you hours every week.

Prefer to start small? My YouTube channel is packed with quick, practical “how-to” videos that show you exactly how I use AI tools for marketing, content, and automation.

Planning an event or conference? I deliver high-energy AI sessions that engage audiences and leave them with actionable strategies they’ll talk about long after the event. Book me for your event here.

Did someone forward you this email? You can subscribe here.

2120 Contra Costa Blvd #1059, Pleasant Hill, CA 94523
Unsubscribe · Preferences

AI at Work

AI at Work is a weekly newsletter on how marketing teams redesign workflows, roles, and systems with AI. Real examples, practical frameworks, and repeatable processes operators can use immediately. Join thousands of successful marketing leaders by subscribing below!
