HubSpot's 140 Million Lost Visits Aren't a Content Problem. They're an Authority Problem.
HubSpot's CMO confirmed the company lost 140 million visits to AI search in a single year. The industry's response is to restructure content. That's half the fix—and the wrong half to stop at.
Yesterday, the BBC ran a story that should have rattled every CMO who read it. HubSpot lost 140 million website visits in a single year. Kipp Bodnar, HubSpot's CMO, explained the cause plainly: AI overviews reduce click-through rates by 60 to 70 percent. Users who once landed on HubSpot's blog posts now get the answer from an AI system without ever clicking through. The traffic stopped showing up.
The industry's diagnosis: content structure. Chunk your pages differently. Write in natural language. Match the longer query format — 40 to 60 words instead of 4 to 6. Give AI systems something easy to extract.
That diagnosis is correct. And it stops in the wrong place.
What content restructuring actually fixes
Content optimization addresses one layer of the problem: technical extractability. If AI crawlers can't access your site, they can't cite it. If your content is a wall of prose with no structural anchors, the model skips it. HubSpot's move to smaller, answer-shaped chunks is the right tactical response.
Technical extractability is table stakes. It's not a competitive advantage. Fixing it removes you from the invisible category. It doesn't place you in the cited category.
A Berkeley/UC Santa Cruz study analyzing 1,700 citations across Brave, Google AIO, and Perplexity found that pages with strong metadata, semantic HTML, and structured data achieve a 78% cross-engine citation rate — four times the rate of unoptimized pages. (Source: arXiv, September 2025)
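To make the "structured data" layer concrete, here is a minimal, hypothetical JSON-LD sketch of the kind of Article markup the study associates with higher citation rates. The headline, organization, and URL are placeholders, not any real company's markup:

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "What Is Marketing Automation?",
  "author": {
    "@type": "Organization",
    "name": "Example SaaS Co."
  },
  "datePublished": "2026-01-15",
  "mainEntityOfPage": "https://example.com/blog/marketing-automation"
}
```

Markup like this gives an AI crawler an unambiguous statement of what the page is, who published it, and when — exactly the extraction anchors the study's "optimized" cohort shared.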
That study also found something the content-optimization narrative skips: citations that appear in multiple AI systems simultaneously correlate with 71% higher quality scores. The pages getting cited everywhere share one trait: they already sit on high-trust domains. The optimization is what gets them extracted. The domain trust is what gets them selected.
The layer the BBC story missed
ConvertMate's analysis of 80 million citations across 10,000+ domains found that brand search volume now predicts AI citation frequency more reliably than backlink count. (Source: ConvertMate Research, January 2026)
That looks like good news until you ask what drives brand search volume. The answer is external coverage: mentions in publications your buyers trust, which get crawled by AI engines and referenced as authoritative sources.
Otterly.AI's analysis of one million AI citations across ChatGPT, Perplexity, and Google AI Overviews found that community platforms and trusted third-party publications outrank brand-owned domains in AI citation share. 73% of brand websites have technical barriers blocking AI crawlers entirely. But even the brands that fix the technical layer are still competing against Reddit, Wikipedia, and the publication stack AI engines treat as their primary reference tier. (Source: Otterly.AI, February 2026)
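As a concrete illustration of that technical layer, the crawler barriers Otterly.AI describes often come down to a few lines of robots.txt. A minimal sketch that explicitly admits the major AI crawlers — the user-agent tokens below are the publicly documented ones, though whether to allow them is a business decision, not a default:

```txt
# Allow the major AI answer-engine crawlers (publicly documented tokens)
User-agent: GPTBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: ClaudeBot
Allow: /

# Google-Extended governs use in Gemini, separately from Googlebot
User-agent: Google-Extended
Allow: /
```

A blanket `Disallow: /` aimed at scrapers will silently block these agents too, which is how a brand ends up in the 73% that AI engines cannot read at all.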
HubSpot shows up in AI answers about marketing automation because Forbes, TechCrunch, G2, and dozens of credible publications have written about HubSpot for years. When a prospect asks Perplexity "what's the best CRM for a mid-market SaaS company," HubSpot appears not because HubSpot chunked its content. It appears because its editorial presence in trusted publications creates the corroboration signal AI engines read as authority.
A mid-market brand with zero earned media presence can restructure its content perfectly and still not appear in AI answers. The model has no external corroboration to pull from. This is why AI search traffic converts at significantly higher rates than organic Google traffic for brands that have built the underlying editorial foundation — the clicks that do arrive are from buyers who have already been pre-qualified by the AI's recommendation. Jaxon Parrott has written about why most AI citation strategies stall at the content layer and never reach the editorial authority layer that actually moves citation share.
Two layers, not one
| Layer | What it does | What it doesn't do |
|---|---|---|
| Content optimization (chunks, schema, robots.txt) | Makes your site technically extractable | Establish domain trust with AI engines |
| Editorial authority (earned media in trusted publications) | Builds the corroboration signal AI systems read as credibility | Fix technical crawlability |
Both layers are required. Most companies treating the BBC story as a content brief are stopping at Layer 1.
HubSpot's CMO gets it: "I don't know how you are a competitive business in the future without having a strong competency in this." He's describing the whole stack. Not just the technical half.
The brands that show up consistently across AI engines share a common structure: years of earned placements in publications those engines already trust. This is why thought leadership content that doesn't earn placements in cited publications rarely moves the needle on AI visibility, regardless of how well the on-site content is structured. Christian Lehman's tracking setup for AI referral attribution shows what happens to conversion quality once the editorial foundation is in place: the source channel changes, and the buyer arrives further along in their decision.
The founder's question
If your brand disappeared from Google tomorrow, how would AI systems know you existed?
Not your website. Not your chunked content. What third-party editorial coverage would give AI engines reason to cite you when a buyer asks who leads your category?
If the answer is thin — a few press releases, a LinkedIn company page, some G2 reviews — the content optimization project is building on sand. AI engine training data and live retrieval both weight trusted publication presence heavily. You can optimize your site for extraction, but the model is still asking: "Do I trust this domain enough to put it in front of the user?"
That answer comes from what's written about you elsewhere.
FAQ
Why did HubSpot lose 140 million website visits to AI search?
HubSpot's CMO confirmed that AI overviews reduce click-through rates by 60 to 70 percent for searches where AI provides a direct summary. Users got the answer without visiting HubSpot's site. The loss was structural: AI systems intercept demand before the click happens.
Does restructuring content for AI search actually work?
Yes, technically. Content that's chunked, schema-tagged, and accessible to AI crawlers is significantly more likely to be extracted and cited. A Berkeley/arXiv study found pages with strong structural optimization achieve a 78% cross-engine citation rate. But optimization only matters if AI engines already trust your domain through external editorial corroboration.
What does Machine Relations have to do with AI traffic loss?
Machine Relations is the discipline of building the earned media presence that makes AI engines cite your brand as an authority — not just crawl your site. Content optimization makes your site technically readable. Earned media in trusted publications is what makes it worth recommending.
Related Reading
- Earned Media for Fintech Companies — Editorial Authority in AI-Driven Discovery
- Machine Relations for Fintech Companies: How to Get Cited by ChatGPT, Perplexity, and Financial AI Engines
The conclusion the BBC couldn't write
The companies restructuring their content are doing necessary work. They're not doing sufficient work.
The brands that show up in AI answers for their category — consistently, across engines — have years of earned media placements in publications AI systems trust. That presence creates the corroboration signal that turns "technically extractable" into "actually cited."
This is what Machine Relations names as the shift. Earned media placements in respected publications were always the most durable trust signal for human buyers. AI engines read the same publications humans trust. The reader changed. The mechanism didn't.
HubSpot's 140 million lost visits are real. So is the fix. It starts with content structure. It doesn't end there.
Run the audit and see where your brand stands: AuthorityTech AI Visibility Audit