The question isn't whether to use AI in your creative workflow. It's knowing exactly where it helps and where it doesn't. Get that wrong and you end up with faster mediocrity instead of faster quality.
The most common mistake I see with AI in creative workflows is using it as a content generator. You prompt it for ad copy, it gives you something that sounds plausible, you clean it up and send it to production. The problem is that the output is only as good as the input — and if you haven't done the research to know what angle to pursue, the AI is just generating plausible-sounding guesses faster than you could write them yourself.
The better use is as a research accelerator. AI can process a product's review landscape in minutes — categorising thousands of comments by persona, by buying reason, by objection type — work that would take a strategist hours to do manually. It can surface patterns in competitor ad copy that would take days to spot through manual review. It can take your documented research process and critique it against known direct response principles, flagging where you're missing angles or making assumptions without evidence.
What it can't do is decide what matters. It can surface that 40% of reviews mention durability concerns — but whether that's the dominant concern for your target buyer, or a secondary one that's being drowned out by vocal edge cases, requires judgment about who your buyer actually is. That judgment is yours. AI gets you to the decision point faster. It doesn't make the decision for you.
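To make the aggregation step concrete: a minimal sketch of tallying objection mentions across reviews might look like the following. This is an illustration, not the actual tooling described above; the category names, keywords, and sample reviews are all invented, and a real pass would use an LLM or trained classifier rather than keyword matching.

```python
from collections import Counter

# Hypothetical objection categories and trigger keywords -- invented for
# illustration. A real workflow would classify with an LLM, not keywords.
OBJECTION_KEYWORDS = {
    "durability": ["broke", "flimsy", "stopped working", "fell apart"],
    "price": ["expensive", "overpriced", "not worth"],
    "setup": ["confusing", "hard to install", "instructions"],
}

def tag_objections(review: str) -> set[str]:
    """Return every objection category whose keywords appear in the review."""
    text = review.lower()
    return {cat for cat, words in OBJECTION_KEYWORDS.items()
            if any(w in text for w in words)}

def objection_shares(reviews: list[str]) -> dict[str, float]:
    """Share of reviews mentioning each objection category."""
    counts = Counter(cat for r in reviews for cat in tag_objections(r))
    return {cat: counts[cat] / len(reviews) for cat in OBJECTION_KEYWORDS}

# Invented sample data.
reviews = [
    "Camera broke after two weeks, feels flimsy.",
    "Great value, but the instructions were confusing.",
    "Works fine. A bit expensive though.",
    "Solid product, easy setup.",
]
shares = objection_shares(reviews)
```

The output is the kind of number AI surfaces quickly (e.g. a durability share across thousands of reviews); whether that share reflects your actual buyer or a vocal minority is the judgment call that stays human.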
The workflow below reflects how I actually use these tools: front-loaded on research and angle ideation, where AI has a genuine edge, and human-led from strategy through production, where it doesn't. The goal is better briefs in less time — not faster creative that skips the thinking.
This is the workflow I use when approaching a new product or a new account from scratch — no historical data, no existing winners. The AI-led stages compress the research phase dramatically. The human-led stages are where the strategic and creative quality actually gets made. Each stage feeds the next; shortcutting any of them produces weaker output at the next stage.
This is a condensed version of the research stage as I ran it for a home security lead gen account, where we had no historical data and needed to build a creative strategy from scratch. The prompts are paraphrased; the output types and the human decisions they required are accurate.
Being clear about where AI doesn't help is as important as knowing where it does. These are the four things that remain entirely human in this workflow — not because the tools aren't capable enough yet, but because they require judgment that's contextual, strategic, and market-specific in ways that general-purpose AI can't replicate.