HUBSPOT ELITE PARTNER

Our thinking about HubSpot, AI, & removing siloed data

All blogs

Find our latest here

In our blog, you'll find content on HubSpot and CRM strategy, digital marketing and branding, marketing and sales automation, HubSpot blog design, our work as a HubSpot design agency, and more.

HubSpot

What Does “None Met” Mean in HubSpot Workflows?

Read more
What Does “None Met” Mean in HubSpot Workflows? Most HubSpot users ignore “none met” until the day a workflow breaks. Then suddenly it becomes very important. I recently came across this exact question on Reddit. Someone asked, “Can someone explain ‘None met’ to me like I’m a toddler?” I jumped into the comments to help clarify. But it got me thinking. If this many people are unsure about what “none met” does, it is probably a commonly misunderstood condition
Artificial Intelligence

How to Structure Content so AI Systems Cite You

Read more
You've made your website AI-readable. Your robots.txt allows AI crawlers. Your pages are available as clean markdown. But AI systems still aren't citing you. The infrastructure gets your content in front of AI agents. The structure of your content determines whether they actually use it. This article covers how to write and format content that AI systems can confidently extract, cite, and include in their answers.
Artificial Intelligence

Making your website AI-readable: The complete guide

Read more
AI systems are becoming primary discovery channels. ChatGPT, Perplexity, Claude, Google AI Overviews, and Gemini now answer hundreds of millions of queries every day. When they answer a question that your content could address, they make a retrieval decision: which sources should inform this answer, and which should be cited? If your website isn't AI-readable, you're invisible to a growing share of your audience. And the gap between AI-readable and AI-invisible is widening fast. This guide covers everything you need to do to make your website visible to AI systems - from the format
Artificial Intelligence

Content Signals: Telling AI how it can use your content

Read more
robots.txt gives you a binary choice: allow an AI crawler to access your content, or block it entirely. But what if you want something in between? What if you want AI systems to cite your content in search results but not use it for model training? What if you want AI agents to read your public marketing pages but restrict access to your premium content? robots.txt can't express these distinctions. Content Signals can. What Content Signals are Content Signals is a response header standard that lets you declare how AI systems may use
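To make the idea concrete, a Content Signals declaration might look something like the line below. The field name and signal keys (`search`, `ai-input`, `ai-train`) follow the published Content Signals proposal, but the standard is still evolving, so treat this as an illustrative sketch rather than final syntax:

```
Content-Signal: search=yes, ai-input=yes, ai-train=no
```

Read as: search engines may index and cite this content, AI systems may use it as grounding input for answers, but it may not be used for model training.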
Artificial Intelligence

What is llms.txt and why every website needs one

Read more
If you've heard about making your website AI-readable, you've probably encountered llms.txt. It's one of those things that sounds technical but is actually straightforward - and it's something every website should have in place right now. This article explains what llms.txt is, what it looks like, how to create one, and where it fits in the broader AI readability stack. The one-line explanation llms.txt is a markdown file at your website's root that gives AI systems a curated table of contents for your site. It tells them who you are, what your site contains, and where to
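As a sketch of the convention, an llms.txt file is ordinary markdown: an H1 site title, a blockquote summary, then H2 sections of annotated links. The URLs and titles below are hypothetical placeholders:

```markdown
# Example Agency

> A HubSpot partner agency blog covering CRM strategy, automation, and AI readability.

## Blog

- [What is llms.txt](https://example.com/blog/llms-txt.md): why every website needs one
- [Making your website AI-readable](https://example.com/blog/ai-readable.md): the complete guide

## Company

- [About us](https://example.com/about.md): who we are and what we do
```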
Artificial Intelligence

What is Markdown and why LLMs prefer it over HTML

Read more
If you work in marketing, content, or business, you may have heard that AI systems prefer markdown. But what does that actually mean? What is markdown, why do large language models prefer it, and what does it look like in practice? This article explains markdown from the ground up - no developer background required - and makes the case for why it's the most important format shift happening in web content today. What markdown actually is Markdown is a lightweight text formatting language created by John Gruber in 2004. It uses simple characters to indicate structure:
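Those "simple characters" look like this in practice (the content below is arbitrary sample text):

```markdown
# Page title
## Section heading

This is **bold** and this is *italic*.

- A bullet list item
- Another item

[A link](https://example.com)
```

The `#` characters mark heading levels, asterisks mark emphasis, hyphens mark list items: structure a model can parse without rendering anything.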
Artificial Intelligence

The robots.txt Audit: Are You Accidentally Blocking AI?

Read more
You've published great content. Maybe you've even set up llms.txt and per-page .md files. But none of that matters if your robots.txt is blocking the AI agents that want to consume it. Many websites are invisible to AI search - not because their content is poor, but because their robots.txt rules inadvertently shut the door. This article walks you through auditing and fixing it. The problem you don't know you have robots.txt has been around since 1994. It tells web crawlers what
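As a sketch of where such an audit might land, here is a robots.txt that explicitly welcomes the major AI crawlers. GPTBot (OpenAI), ClaudeBot (Anthropic), and PerplexityBot are real user-agent names; the blanket `Allow: /` rules are illustrative, and you'd scope them to your own site's needs:

```
# OpenAI's crawler
User-agent: GPTBot
Allow: /

# Anthropic's crawler
User-agent: ClaudeBot
Allow: /

# Perplexity's crawler
User-agent: PerplexityBot
Allow: /

# Default for all other crawlers
User-agent: *
Allow: /
```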
Artificial Intelligence

Per-page .md Files: The Gold Standard for AI Readability

Read more
If you've set up an llms.txt file, you've given AI systems a table of contents for your site. That's a good start. But the table of contents is not the book. Per-page .md files are the book. They deliver the full content of every page on your site in clean, structured, token-efficient markdown - and they're the single most impactful thing you can do to make your website AI-readable. The map vs the territory llms.txt is a static file that tells AI agents what your site contains and
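A common convention (URLs hypothetical) is to serve a markdown twin of every page by appending `.md` to its path:

```
https://example.com/pricing       <- HTML page rendered for humans
https://example.com/pricing.md    <- the same content as clean markdown for AI agents
```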
Artificial Intelligence

How to Know Which of Your Pages LLMs Have Indexed

Read more
You've done the work. You've set up llms.txt. You've created per-page .md files. You've audited your robots.txt. Your content is clean, structured, and AI-readable. But how do you know if any of it is working? How do you know which pages ChatGPT, Claude, or Perplexity have actually consumed? How do you know whether AI systems are reading your content at all? This is the biggest unsolved problem in
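One partial answer is to check your server access logs for known AI crawler user agents. A minimal sketch, assuming a combined-log-format access log (the sample lines and paths below are fabricated for demonstration; real logs follow your server's configured format):

```shell
# Fabricated sample access-log lines in combined log format (paths are hypothetical)
cat > access.log <<'EOF'
1.2.3.4 - - [01/Jan/2025:10:00:00 +0000] "GET /blog/llms-txt HTTP/1.1" 200 1234 "-" "GPTBot"
5.6.7.8 - - [01/Jan/2025:10:05:00 +0000] "GET /pricing HTTP/1.1" 200 4321 "-" "Mozilla/5.0"
9.9.9.9 - - [01/Jan/2025:10:10:00 +0000] "GET /blog/llms-txt HTTP/1.1" 200 1234 "-" "ClaudeBot"
EOF

# Keep only requests from known AI crawlers, extract the request path (field 7),
# then count how often each page was fetched
grep -E "GPTBot|ClaudeBot|PerplexityBot" access.log \
  | awk '{print $7}' | sort | uniq -c | sort -rn
```

This tells you which pages the crawlers fetched, not what the models retained, but it's the closest observable signal most sites have today.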
SHOWCASE

See Our Work in Action

Watch our favourite HubSpot projects and client success stories.

5k subscribers

Sign up to our Newsletter

By submitting your information, you agree to receive occasional updates, industry news, events, and services.

Ready to get started?

Get in touch,
book a discovery call

Let's chat. We'd love to unpack how we can accelerate your growth.