Building an AI Content Engine: A Field Report
San Francisco · June 25, 2025
The content marketer quits on a Friday. She gives two weeks' notice, which in practice means one week of actual work and one week of "knowledge transfer" that consists of her showing me a Google Drive folder with 340 documents in it and saying, "It's all in there."
She was our only content person. She wrote three blog posts a week, managed our email newsletter (4,200 subscribers), created landing page copy, wrote case studies, and maintained our SEO content library of 147 published articles. She did all of this by herself, which is both impressive and, in retrospect, a single point of failure that we should have addressed a year ago.
Now she's gone, and I have two options: hire a replacement (timeline: 6-8 weeks for recruiting, plus 4 weeks of ramp-up) or build something different. I choose option two. I'm going to build an AI content engine.
Before building anything, I need to understand what we have. I spend Monday going through the Google Drive folder — all 340 documents — and categorizing everything. Here's the inventory:
147 published blog posts. Of those, 23 generate 80% of our organic traffic (classic Pareto distribution). The top three posts alone account for 34% of all organic signups. They're all "how-to" content targeting specific long-tail keywords: "how to calculate SaaS churn rate," "B2B onboarding email sequence template," "product-led growth metrics dashboard."
42 email newsletters. Average open rate: 31%. Average click rate: 4.7%. The best-performing emails aren't the ones with product updates — they're the ones with original analysis or data. "We analyzed 10,000 SaaS signups" got a 52% open rate. "Product update: new dashboard features" got 18%.
28 landing pages. Conversion rates range from 1.2% to 11.4%. The high converters all have one thing in common: they focus on a single use case and include specific numbers. The low converters try to do too much.
The content marketer also left behind a content calendar with 34 planned topics. I look at them and immediately cut 20. They're topics like "The Future of SaaS" and "Why Customer Success Matters" — broad, generic, the kind of content that ranks for nothing and converts no one.
The AI content engine isn't a single tool. It's a workflow — a series of steps that combine AI generation with human editing to produce content at roughly two to three times the speed of a single writer.
Here's the workflow:
Step 1: Topic selection. I use an AI tool to analyze our top-performing content and identify patterns in topic, structure, and keyword targeting. It generates a prioritized list of 50 topic ideas based on search volume, keyword difficulty, and relevance to our ICP. I manually review and select 12 topics per month — three per week.
Step 2: Outline generation. For each topic, I provide the AI with: the target keyword, three competing articles (the top Google results), our brand voice guidelines (which I wrote in a two-page document), and any proprietary data we have on the subject. The AI generates a detailed outline with section headers, key points, and suggested data to include.
Step 3: First draft. The AI writes the first draft based on the outline. This is the part that most people think is the whole process. It's not. The first draft is raw material — factually accurate (mostly), structurally sound (usually), but tonally flat and generically written. It reads like what it is: a machine's best guess at what a human would write.
Step 4: Human editing. This is where I spend most of my time. I rewrite the introduction (AI introductions are universally terrible — they always start with "In today's fast-paced business environment" or some variation). I add specificity — real numbers from our data, anecdotes from customer conversations, opinions that an AI won't generate because it has no opinions. I cut the filler. AI writes long; I edit short.
Step 5: SEO optimization. I use an AI tool to check keyword density, meta descriptions, internal linking, and readability score. This takes about ten minutes per post and is genuinely useful — it catches things I'd miss.
Step 6: Publication and distribution. Post goes live. Gets added to the newsletter queue. Gets shared on the CEO's LinkedIn (where I ghostwrite posts). Gets added to the relevant landing pages as supporting content.
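The six steps above can be sketched as a pipeline. This is a minimal illustration under my own assumptions, not the author's actual tooling: the `llm` function is a stub standing in for whichever model API is used, and every name here is hypothetical.

```python
# Sketch of the content engine's generation half (steps 2-3).
# llm() is a placeholder for a real model call; the rest is plain
# prompt assembly from the four inputs the text lists.

def llm(prompt: str) -> str:
    """Placeholder for a real LLM API call."""
    return f"[model output for a prompt of {len(prompt)} chars]"

def build_outline_prompt(keyword, competitors, voice_guidelines, data_points):
    """Step 2: assemble the outline prompt from the target keyword, the
    top competing articles, brand voice guidelines, and proprietary data."""
    competitor_block = "\n".join(f"- {c}" for c in competitors) or "- (none)"
    data_block = "\n".join(f"- {d}" for d in data_points) or "- (none)"
    return (
        f"Target keyword: {keyword}\n\n"
        f"Top-ranking competing articles:\n{competitor_block}\n\n"
        f"Brand voice guidelines:\n{voice_guidelines}\n\n"
        f"Proprietary data to work in:\n{data_block}\n\n"
        "Produce a detailed outline with section headers, key points, "
        "and suggested data to include."
    )

def draft_post(keyword, competitors, voice_guidelines, data_points):
    """Steps 2-3: outline, then first draft. Editing (step 4) stays human."""
    outline = llm(build_outline_prompt(
        keyword, competitors, voice_guidelines, data_points))
    return llm(f"Write a blog post following this outline:\n{outline}")

prompt = build_outline_prompt(
    "how to calculate SaaS churn rate",
    ["competitor-post-1", "competitor-post-2", "competitor-post-3"],
    "Plain, direct, first-person; no filler.",
    ["We analyzed 10,000 SaaS signups"],
)
```

The point of keeping prompt assembly in plain functions is that each input (keyword, competitors, voice, data) can be swapped per post without touching the rest of the pipeline.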
In the first full week of operating the engine, I publish four blog posts. That's one more per week than the content marketer was producing. Time spent per post: approximately 2.5 hours (20 minutes on topic selection and outline, 10 minutes for AI draft, 90 minutes of editing, 30 minutes of SEO and publication). The content marketer spent approximately 6 hours per post.
Quality is the question. The first four posts are competent — well-structured, properly targeted, factually correct. But they lack personality. They lack the content marketer's ability to open with a surprising anecdote or close with a line that sticks. I can feel the difference, even if the metrics can't measure it yet.
I address this in week two by building what I call a "voice library" — a collection of 30 paragraphs from our best-performing content that exemplify the tone I want. I feed these to the AI as examples before every draft. The output improves noticeably. Not to the level of a talented human writer, but to a level that, after editing, passes what I call the "would I be embarrassed if a customer read this?" test.
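Mechanically, a voice library amounts to few-shot prompting: exemplar paragraphs get prepended to the draft prompt as style examples. A sketch, with the caveat that the selection heuristic here (random sampling of five exemplars) is my assumption, not the author's method:

```python
import random

def build_voice_prompt(voice_library: list[str], draft_prompt: str,
                       k: int = 5) -> str:
    """Prepend k exemplar paragraphs from the voice library as few-shot
    style examples ahead of the actual drafting instructions."""
    examples = random.sample(voice_library, k=min(k, len(voice_library)))
    example_block = "\n\n".join(
        f"Example {i + 1}:\n{p}" for i, p in enumerate(examples)
    )
    return (
        "Match the tone and style of these paragraphs from our "
        "best-performing content:\n\n"
        f"{example_block}\n\n---\n\n{draft_prompt}"
    )

# A 30-paragraph library, as in the text (placeholder paragraphs here).
library = [f"exemplar paragraph {i}" for i in range(30)]
prompt = build_voice_prompt(library, "Write a post on SaaS churn.", k=5)
```

Sampling a different subset per draft keeps the examples varied across posts; a fixed "top five" would also work and is easier to audit.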
After six weeks, here's the data:
Content produced: 24 blog posts, 6 newsletter editions, 4 landing page updates. Under the previous system, we would have produced approximately 18 blog posts, 6 newsletters, and 1-2 landing page updates in the same period.
Organic traffic: up 14% month-over-month. This is partly seasonal and partly attributable to the higher volume of indexed content. It's too early to isolate the AI content's specific contribution.
Organic signups: up 8%. Again, early days. But the trend is directionally positive.
Content cost: approximately $1,200/month in AI tool subscriptions plus my time (roughly 25% of my working hours). Previously, the content marketer's fully loaded cost was $8,700/month. Comparing direct spend alone, that's an 86% cost reduction.
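The 86% figure compares tool spend to the writer's fully loaded cost and assigns my own time a value of zero. The same numbers also give a break-even point: how much my time would have to be worth before the savings disappear. Pure arithmetic on the figures above:

```python
old_cost = 8_700.0   # content marketer, fully loaded, per month
tool_cost = 1_200.0  # AI tool subscriptions, per month

direct_reduction = 1 - tool_cost / old_cost  # the 86% claim
monthly_savings = old_cost - tool_cost       # $7,500/month

# The savings vanish if the 25% of my time now spent on content is
# worth more than the savings; that implies a full-time-equivalent
# cost of monthly_savings / 0.25.
breakeven_fte_cost = monthly_savings / 0.25  # $30,000/month

print(f"direct reduction: {direct_reduction:.0%}")            # 86%
print(f"break-even FTE cost: ${breakeven_fte_cost:,.0f}/mo")  # $30,000/mo
```

So the headline number survives unless 25% of my time is worth more than $7,500 a month, which is the unstated assumption behind it.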
Newsletter performance: open rates are flat at 31%. Click rates are slightly down — 4.2% versus 4.7%. The difference isn't statistically significant yet, but I'm watching it.
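Whether the 4.2% versus 4.7% click-rate gap is real can be checked with a two-proportion z-test. A sketch, assuming each send reaches roughly the full 4,200-subscriber list; the per-send sample size is my assumption, since the text doesn't give one:

```python
from math import erf, sqrt

def two_proportion_z(p1: float, n1: int, p2: float, n2: int):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value via the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Old click rate 4.7%, new 4.2%, ~4,200 recipients per send (assumed).
z, p = two_proportion_z(0.047, 4200, 0.042, 4200)
```

With these assumptions z comes out near 1.1 and the p-value near 0.27, well short of the conventional 1.96 / 0.05 threshold, which matches the "not statistically significant yet" reading; more sends would be needed to detect a half-point difference at these rates.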
An 86% cost reduction in content production. The number sounds good. It also makes me uncomfortable, for reasons I can't fully articulate.
I've been presenting this as a success story, and in many ways it is. But there are things I'm not saying that I should say.
The AI content is good enough. It's not great. The difference between "good enough" and "great" content is the difference between a blog post that ranks on page one and a blog post that someone shares with their team because it changed how they think about a problem. AI can do the first. I haven't figured out how to make it do the second.
I'm spending 25% of my time on content, which means I'm spending 25% less time on other growth work. The opportunity cost isn't zero. I'm not doing as much channel experimentation, not running as many A/B tests, not having as many customer conversations. The content engine freed the company from needing a content marketer, but it didn't free me from the work of content.
The ethical question. The content marketer who left — she left because she saw this coming. She told me in her exit interview: "I've been watching you experiment with AI tools. I know where this is going." She's now freelancing, and some of her clients are our competitors. She's writing the kind of content that AI can't replicate — deeply reported, interview-based, genuinely original. She's doing work that matters more and getting paid less for it. I don't know what to do with that.
The AI content engine is running. Three posts a week, one newsletter, ongoing landing page optimization. The metrics are positive. The cost is low. The CEO is happy.
I'm hiring a content editor — not a writer, an editor. Someone who can take AI-generated drafts and make them sing. Who can add the voice and the personality and the surprise that machines can't produce. The job posting says "experience with AI content workflows preferred," which is a sentence that wouldn't have made sense two years ago.
We got 240 applications in the first week. Twelve of the applicants used AI to write their cover letters. I could tell because they all started with "I am excited to apply for the position of Content Editor at your innovative company."
I put those twelve in the reject pile. If you're applying to be a human voice in an AI world, you probably shouldn't let the AI speak for you.
