📖 In This Issue

  • Featured Snippets (News & Resources)

  • Cover Story: Stop Treating AI Like a Content Factory (Do This Instead)

  • Operator of Interest: Annie Cushing

  • Learn This: Vector Embeddings

📰 Featured Snippets (News & Resources)

David Cramer tells us how he serves sentry.io’s web pages to AI agents in Markdown format directly from the server. I think this is largely unnecessary, and maybe a bad idea, but if you want to do it, this is a good start.

Google is expanding Gemini capabilities to Docs, Sheets, Slides, and Drive. I’ve already been using it in Sheets, and I have to say, it’s really helped with things like query analysis and content strategy.

BuzzFeed’s recent bankruptcy should be a warning for any publisher that is over-invested in AI content development.

A new report from Define Media Group says Google’s AI Overviews are responsible for a 42% organic traffic decline; however, “breaking news” content is getting a 103% boost from all Google properties.

AI as a Content Triage Tool, Not a Content Factory

What if the fastest way to “optimize content” is to stop creating anything?

What if the real speed gain isn’t “draft faster,” but “publish less”, because you finally know what should not exist?

Most in-house teams don’t have a content production problem. They have a content inventory problem. Too many pages doing the same job, too many half-answers competing for the same intent, and too many “we’ll clean it up later” URLs quietly turning into permanent infrastructure.

AI makes it cheap to publish. It does not make it cheap to maintain trust, prevent cannibalization, or keep indexable pages coherent at scale. And that matters more now because Google has been explicit that site-wide signals factor into its ranking systems, and that it is trying to surface content that is genuinely helpful rather than made primarily to rank.

If AI is a multiplier, it will multiply your quality or your debt. The only question is which system you’re feeding.

The core idea: treat AI like a triage nurse

A triage nurse doesn’t “create more patients.” They sort. They prioritize. They decide what gets attention now, what can wait, and what doesn’t belong in the ER at all.

That’s the mindset shift: use AI to decide what not to write, what to consolidate, and what to retire. Only then do you write something new, and only if there’s still a gap worth filling. This is content ops, not content vibes.

Part 1: Use AI to map what you already have (before you touch a blank doc)

AI is genuinely good at pattern work. Give it a list of URLs, titles, headings, and internal anchor text, and it can cluster pages by topic and intent faster than a human can scan a spreadsheet. It can also flag near-duplicates, summarize each page’s “promise” in plain language, and surface mismatches where the template screams “product page” but the query intent expects “how-to,” or where three different pages all claim to be the definitive answer.
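To make the pattern work concrete, here is a minimal sketch of the clustering step. It uses plain token overlap (Jaccard similarity) as a cheap stand-in for a real embedding model, and every URL and title in it is a made-up example, not real site data.

```python
import re

# Toy clustering of pages by title similarity. A real pipeline would use
# embeddings from a model; Jaccard overlap is a stand-in that only
# illustrates the grouping step. All URLs/titles are hypothetical.

def tokens(text):
    """Lowercase word tokens, punctuation stripped."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))

def jaccard(a, b):
    """Overlap between two token sets, 0.0 to 1.0."""
    return len(a & b) / len(a | b)

def cluster(pages, threshold=0.5):
    """Greedy single-pass clustering: each page joins the first
    existing cluster whose seed title is similar enough."""
    clusters = []  # list of (seed_tokens, [urls])
    for url, title in pages:
        t = tokens(title)
        for seed, members in clusters:
            if jaccard(seed, t) >= threshold:
                members.append(url)
                break
        else:
            clusters.append((t, [url]))
    return [members for _, members in clusters]

pages = [
    ("/blog/crawl-budget-guide", "A guide to crawl budget"),
    ("/blog/crawl-budget-basics", "Crawl budget: a guide"),
    ("/blog/title-tags", "How to write title tags"),
]
print(cluster(pages))
# → [['/blog/crawl-budget-guide', '/blog/crawl-budget-basics'], ['/blog/title-tags']]
```

The output is exactly the kind of near-duplicate flag described above: two crawl-budget pages land in one cluster for a human to review, while the title-tag page stays separate.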

That speed matters because content debt usually hides in plain sight. Humans normalize it. AI doesn’t.

But AI is also bad at context you can’t scrape. It won’t know which pages are politically protected, which ones are tied to sales enablement, or which ones exist because legal said so. It won’t understand why a page exists unless you tell it. And it absolutely should not be making irreversible delete calls without performance data, internal link dependencies, and a sense of what the business is willing to break.

Both things are true at once: AI can surface patterns humans miss, and it can confidently recommend “cleanup” that breaks internal journeys, fragments authority, or removes the one page that quietly ranks for long-tail variations you didn’t think to check.

So the safe use case is “map and recommend,” not “judge and execute.”

Part 2: A triage framework teams can run weekly (and defend later)

The most useful output AI can produce isn’t another outline. It’s a decision memo you can defend months later. Not because it’s beautifully written, but because it shows the tradeoffs.

Bucket 1: Do not write

Sometimes the right move is to not create a new page. The obvious trigger is overlap: you already have multiple pages answering the same job-to-be-done, and a new “fresh guide” would just split signals again. Another trigger is SERP reality: if the results are dominated by forums, UGC, or official documentation, you might be signing up for a fight your site isn’t structurally positioned to win.

This is where AI earns its keep by drafting the “why not” memo. It can summarize overlap risk, point out cannibalization patterns, and spell out opportunity cost in plain language. Your job is to decide whether this is a strategic bet or a distraction.

This also keeps you honest. “We should write it because competitors have it” is not a strategy. It’s anxiety with a CMS.

Bucket 2: Consolidate

Consolidation is the unglamorous lever that makes everything else easier: crawling, internal linking, topical ownership, and user trust.

The signals are usually there. Multiple pages compete for the same intent. Rankings fluctuate without a clear winner. Internal links point to different versions of the “same” answer because different teams linked the page they happened to find first.

AI can propose a consolidation plan: which URL becomes the canonical home, what sections to merge, what to redirect, and which internal links should be updated to stop voting for three candidates at once. It can even draft the combined structure so a human editor can make it coherent.
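The mechanical part of that plan is just data: a redirect map plus an internal-link rewrite. A minimal sketch, with entirely hypothetical URLs, might look like this:

```python
# Sketch of a consolidation plan expressed as data: losing URLs 301 to
# the canonical home, and internal links that pointed at losers get
# rewritten so they stop voting for multiple candidates.
# All URLs below are hypothetical examples.

CANONICAL = "/guides/crawl-budget"
LOSING_URLS = ["/blog/crawl-budget-guide", "/blog/crawl-budget-basics"]

# 301 redirect map to ship alongside the merged page.
redirects = {old: CANONICAL for old in LOSING_URLS}

def rewrite_internal_links(links, redirect_map):
    """Point internal links at the canonical home instead of retired URLs."""
    return [redirect_map.get(href, href) for href in links]

links_on_some_page = ["/blog/crawl-budget-basics", "/pricing"]
print(rewrite_internal_links(links_on_some_page, redirects))
# → ['/guides/crawl-budget', '/pricing']
```

Shipping the map and the link updates together is the point: a merge without them leaves the old URLs still collecting internal votes.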

The human work is the hard part: making the merged page clearer, not just longer. Fluency is not usefulness. A longer page that mixes three intents is still three pages; it’s just harder to debug.

Bucket 3: Retire

Retirement is not a moral judgment. It’s maintenance.

Sometimes a page is obsolete because the product changed. Sometimes it attracts the wrong traffic: high visits, low value, poor match. Sometimes it creates trust risk because it contains stale or unverifiable claims.

AI can help by detecting staleness markers: old dates, outdated UI references, deprecated terminology, and “this used to be true” language. It can draft replacement copy if the topic still matters, or a transitional explanation if you’re redirecting users to a newer source of truth.

But humans decide the method: retire vs update vs redirect vs noindex, based on business impact and technical reality. If you’re removing something from Google, you should be deliberate about how you do it. Google’s own documentation distinguishes between quick temporary removals via Search Console tools and the underlying site changes required for durable removal, and it’s explicit about using the right mechanisms depending on whether you need a fast hide or an actual removal from indexing.

Part 3: What changes when AI becomes your content gatekeeper

The upside

When you treat AI as a gatekeeper instead of a generator, the benefits compound.

Cannibalization drops because you stop publishing redundant pages and start consolidating intent into a clear “home.” Crawl efficiency improves because Googlebot isn’t spending budget on dead-end near-duplicates. Obsolete content gets pruned faster, which lowers the odds that a user lands on a page that makes you look careless.

This also changes internal alignment. It’s easier to get buy-in when the roadmap includes “here’s what we’re removing and why,” not just “here’s what we’re adding.” And in a world where Google evaluates helpfulness using signals that can apply site-wide, treating low-value content as harmless clutter is a risky assumption.

The risks teams underestimate

The biggest risk is triage theater: AI produces neat clusters and pretty tables, and nothing ships. The site remains the same, but now the deck is prettier.

The second risk is false confidence. AI might declare a page redundant because it looks semantically similar, while search behavior says otherwise. Two pages can share 80% of the same words and still serve different intents. Or one might be the only reason you rank for a weird set of long-tail queries.

Then there’s over-pruning. Some “support content” never converts directly, but it functions as trust scaffolding. It answers objections. It reduces support tickets. It helps sales. It gives the site a consistent voice. Delete it blindly and you might not see the cost immediately, but you will feel it.

Even the SEO industry voices that promote pruning tend to warn that benefits are not guaranteed, and that the downside is real if you delete useful pages or links. Treat that caution as a feature, not a footnote.

Finally, tool bias is real. A model optimizes for semantic similarity because that’s what it can “see.” It does not naturally optimize for revenue impact, internal dependencies, or the fragile reality of how your organization actually uses content.

Part 4: A practical workflow that stays lightweight

Start monthly, not daily. Once a month, pull all indexable URLs in the content area you care about and attach the metrics that prevent bad decisions: performance, conversions (or proxy value), internal links, and last updated date. Then have AI cluster the URLs and label each cluster in plain language, as if it’s explaining to someone outside SEO what each set of pages is trying to do.
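That monthly pull can stay very simple. Here is a hedged sketch of turning metrics into triage flags; the field names, thresholds, and sample page are all hypothetical, and the flags are suggestions for human review, never automatic actions.

```python
from datetime import date

# Hedged sketch: flag triage candidates from a monthly URL/metric pull.
# Thresholds (365 days, zero clicks) are illustrative, not prescriptive.

def triage_flag(page, today=date(2025, 6, 1)):
    """Suggest a bucket for human review; never auto-executes anything."""
    age_days = (today - page["last_updated"]).days
    if page["clicks"] == 0 and age_days > 365:
        return "retire-candidate"
    if age_days > 365:
        return "update-candidate"
    return "keep"

page = {"url": "/blog/old-feature", "clicks": 0,
        "last_updated": date(2023, 1, 15)}
print(triage_flag(page))
# → retire-candidate
```

Plain rules like these are deliberately dumb: they are easy to sanity-check, and they give the AI clustering step something defensible to anchor to.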

Weekly, pick a handful of clusters and force a decision. For each cluster, choose to keep as-is, consolidate, update, retire, or create new only if a real gap remains after consolidation. The point is not perfect coverage. The point is a steady reduction in self-inflicted complexity.

Ship with guardrails every time. If you consolidate, ship the redirect plan and the internal link updates, not just the new copy. If you retire, leave a decision note that explains why it was safe, because you will be asked later. If you create something new, require one sentence: what existing page does this replace or reduce? If that sentence is hard to write, you’re probably creating overlap on purpose.

And when removals are involved, be clear about your mechanism. “Remove from Google” is not one action; it’s a set of options with different outcomes and timelines, and Google’s documentation is worth revisiting before you do anything irreversible.

If AI is your content factory, you’ll publish faster and accumulate debt faster.

If AI is your triage tool, you’ll publish less, but your site will get easier to crawl, easier to understand, and harder to distrust. That’s the compounding advantage.

👤 Operator of Interest: Annie Cushing

🧠 Learn This: Vector Embeddings

Vector Embeddings: Numerical representations of data that capture semantic meaning.
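A toy illustration: once text becomes a vector of numbers, "semantically similar" becomes a geometric measurement (cosine similarity). Real embeddings come from a trained model and have hundreds of dimensions; these three-number vectors are made up purely to show the mechanic.

```python
import math

# Cosine similarity: 1.0 means same direction (similar meaning),
# near 0 means unrelated. Vectors below are invented for illustration.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

embeddings = {
    "crawl budget guide":  [0.9, 0.1, 0.0],
    "crawl budget basics": [0.8, 0.2, 0.1],
    "title tag tips":      [0.1, 0.9, 0.3],
}

sim_close = cosine(embeddings["crawl budget guide"],
                   embeddings["crawl budget basics"])
sim_far = cosine(embeddings["crawl budget guide"],
                 embeddings["title tag tips"])
print(round(sim_close, 2), round(sim_far, 2))
# → 0.98 0.21
```

This is the math underneath the clustering and "near-duplicate" detection discussed in the cover story.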

One more thing: AI is only as good as its operator, and if you are reading this newsletter, you are better than most!

Till next time,

Joe Hall

PS: Let me know what you think of this issue, or anything else here: [email protected]

Keep Reading