Publication date and freshness: how AI measures content aging
AI models factor content freshness into source selection. Discover how to correctly implement publication and modification dates with structured data and meta tags.
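As a preview of where this guide ends up, here is a minimal sketch of both signals on one page. It assumes a typical article `<head>`; the dates and the author name are placeholders, and the two formats should always carry the same values.

```html
<head>
  <!-- Open Graph article meta tags: machine-readable dates for crawlers
       that parse the page head directly (dates below are placeholders) -->
  <meta property="article:published_time" content="2024-03-15T09:00:00+01:00">
  <meta property="article:modified_time" content="2025-01-10T14:30:00+01:00">

  <!-- Schema.org JSON-LD: the most widely supported way to declare dates -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Publication date and freshness: how AI measures content aging",
    "datePublished": "2024-03-15T09:00:00+01:00",
    "dateModified": "2025-01-10T14:30:00+01:00",
    "author": { "@type": "Person", "name": "Jane Doe" }
  }
  </script>
</head>
```

Use full ISO 8601 timestamps including the timezone offset, and only bump the modified date when the content meaningfully changes; dates that refresh on every deploy dilute the freshness signal.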
I started my career manually combing through server log files, trying to understand how Googlebot crawls websites. That fascination with the technical side of discoverability? Never faded.
At Kobalt, I translate complex protocols and standards into practical implementations: websites that are discoverable not just in Google, but also in ChatGPT, Perplexity and Gemini. My approach is always the same: measure first. Then optimize.
I believe the shift toward AI-driven search experiences is the biggest change in online discoverability since the introduction of mobile search. That is a big claim. But I stand by it.
Outside of work, I play complex strategy board games (Terraforming Mars is a favorite), I brew my own beer with the same precision I use to validate structured data, and I spin vinyl while reviewing code.
My best work happens somewhere between ten at night and two in the morning. Post-rock in the background. Homebrewed pale ale within reach. Not for everyone. Works for me.
More guides in this series:

- Duplicate content confuses search engines and AI models. The canonical tag indicates which version of a page is the original. Learn how to implement canonical URLs correctly.
- HTTPS and HSTS are more than security measures. They are trust signals that AI models factor in when evaluating your website as a reliable source.
- Open Graph and Twitter Cards determine how your content appears when shared on social media and in AI summaries. Learn how to implement these meta tags correctly for maximum visibility.
- A correct heading structure is essential for both accessibility and AI readability. Learn how to properly use H1 to H6 tags for maximum impact.
- AI agents need standardized authentication to communicate securely with your website. Learn how OAuth discovery and Protected Resource metadata make this possible.
- Your robots.txt determines whether AI bots can read your content. Learn how to correctly configure GPTBot, ClaudeBot and other AI crawlers for maximum visibility.
- The Model Context Protocol (MCP) is the new standard that allows AI agents to use tools and data from your website. Learn how MCP works and how to prepare your website.
- Schema.org structured data is the universal language your website uses to communicate with search engines and AI models. Learn to implement JSON-LD for maximum visibility.
- The llms.txt file tells AI models what your website has to offer. Learn how to create, structure and validate it for maximum AI visibility.