{"id":10550,"date":"2026-04-28T21:53:18","date_gmt":"2026-04-28T21:53:18","guid":{"rendered":"https:\/\/franklyn-k.com\/blog\/?p=10550"},"modified":"2026-04-28T21:59:47","modified_gmt":"2026-04-28T21:59:47","slug":"ai-invisible-cms-photography-websites-chatgpt","status":"publish","type":"post","link":"https:\/\/franklyn-k.com\/blog\/ai-invisible-cms-photography-websites-chatgpt\/","title":{"rendered":"AI-Invisible CMS: Why Your Photo Site Misses ChatGPT in 2026"},"content":{"rendered":"\n<p>In late 2025 I started seeing referrals in my logs from a user-agent I had never paid attention to: <code>GPTBot\/1.2<\/code>. Then <code>ClaudeBot<\/code>, then <code>PerplexityBot<\/code>. By February 2026, these three bots together accounted for roughly 4% of my crawl traffic. Small number, but the trajectory was loud. So I ran a test on seven photography CMSes to see what an AI crawler actually receives when it requests a wedding photographer&#8217;s home page.<\/p>\n\n\n\n<p>The result is a quiet disaster for most of the industry. Most photography platforms send AI crawlers a near-empty document. The portfolios are technically online. They are, in any practical sense for AI retrieval, invisible.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Why I bothered running the test<\/h2>\n\n\n\n<p>I work as a wedding photographer in Paris and I have spent the last two years rebuilding my own site away from Showit and toward static HTML. I had assumptions about why JavaScript-heavy CMSes were a problem for SEO, but the conversation has shifted. Googlebot now renders JavaScript reasonably well. The new bots do not, or at least not in the same way, or not at the same priority.<\/p>\n\n\n\n<p>If a couple asks ChatGPT for &#8220;best wedding photographers in Provence&#8221; and the AI cites three names, those three names will become the only names that exist for that query. The platforms that produce empty HTML for AI crawlers will sit out of that conversation entirely. 
That is the point of the test.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">The setup<\/h2>\n\n\n\n<p>I picked seven platforms that cover the bulk of the wedding photography market in France and the US: Showit, Squarespace, Wix, Pixieset, Format, ProPhoto on WordPress, and a hand-coded static HTML reference site (mine).<\/p>\n\n\n\n<p>For each platform I selected a representative live portfolio with a strong Google ranking. I requested the home page and one portfolio gallery page using three user-agent strings, the ones each crawler announces when it shows up in a server log:<\/p>\n\n\n\n<pre class=\"wp-block-code\"><code>GPTBot\/1.2 (+https:\/\/openai.com\/gptbot)\nClaudeBot\/1.0 (+claudebot@anthropic.com)\nPerplexityBot\/1.0 (+https:\/\/www.perplexity.ai\/perplexitybot)<\/code><\/pre>\n\n\n\n<p>I logged the raw HTML returned, then ran each response through three counts: the number of <code>&lt;h1&gt;<\/code>, <code>&lt;h2&gt;<\/code>, and <code>&lt;h3&gt;<\/code> tags, the number of <code>&lt;article&gt;<\/code> and <code>&lt;section&gt;<\/code> tags, and the total word count of visible text content excluding script and style blocks.<\/p>\n\n\n\n<p>I did not render the JavaScript. That is the entire point. AI crawlers in 2026 mostly do not.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">What the bots actually see<\/h2>\n\n\n\n<p>The Showit portfolios returned an average of 47 words of visible text and zero <code>&lt;h1&gt;<\/code> tags on the home page. The site loads through a JavaScript bootstrap that paints content client-side, and the initial HTML is essentially scaffolding. To a non-rendering crawler, a Showit page looks like an empty room with the lights off.<\/p>\n\n\n\n<p>Squarespace was better. Server-side rendering puts real text in the response: on average 380 words on a portfolio home, with two <code>&lt;h1&gt;<\/code> and four to six <code>&lt;h2&gt;<\/code> tags. Not perfect, but a real document.<\/p>\n\n\n\n<p>Wix sat in the middle. Some templates render server-side; others depend on Velo (formerly Wix Code) scripts that delay content. 
The two Wix sites I tested gave 120 words on average, and the heading hierarchy was inconsistent.<\/p>\n\n\n\n<p>Pixieset and Format are gallery-first platforms. The wrapper page contains your bio and navigation; the galleries are heavy on JavaScript and lazy-loaded image grids. The bio text was visible to the bots. The galleries themselves were a wall of JavaScript with no <code>&lt;figcaption&gt;<\/code> and no semantic image markup.<\/p>\n\n\n\n<p>ProPhoto on WordPress varied wildly with theme choice, but the WordPress backbone always produced a real <code>&lt;article&gt;<\/code> tag and real heading structure: an average of 540 words on the portfolio page, with a full semantic skeleton.<\/p>\n\n\n\n<p>The static HTML reference returned the full document text on the first byte, around 1100 words on the home page, with proper landmark roles, structured <code>&lt;article&gt;<\/code> blocks per project, and inline alt text and figcaptions. AI bots received the same content a human would.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">The percentage that matters<\/h2>\n\n\n\n<p>I averaged the visible-text-to-source-bytes ratio across the seven platforms. Showit returned visible text equal to roughly 1.2% of the bytes shipped. Static HTML returned about 38%. Everything else fell somewhere in between, with Squarespace and ProPhoto\/WordPress at the higher end and Wix and Pixieset at the lower.<\/p>\n\n\n\n<p>Put differently: a Showit photographer is paying full bandwidth costs to ship a document that contains almost no information for any non-rendering reader. The aesthetics arrive eventually, after a JavaScript runtime executes. The AI crawler has already left.<\/p>\n\n\n\n<p>I have spent time mapping exactly which Showit elements collapse to empty HTML at the source level. The short version is that most of the page is a positioned <code>&lt;div&gt;<\/code> containing nothing until JavaScript runs. 
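<\/p>\n\n\n\n<p>The audit itself is easy to reproduce. The following is a sketch of the counting step from the setup, not the exact tooling I used: stdlib-only Python, with two inline sample pages standing in for real responses fetched under the user-agent strings listed above.<\/p>

```python
# Sketch of the audit from the setup: count <h1>/<h2>/<h3> and
# <article>/<section> tags, collect visible text outside <script>
# and <style>, and compute the visible-text-to-source-bytes ratio.
# The two sample pages below are illustrative stand-ins, not real
# platform responses.
from html.parser import HTMLParser


class VisibleText(HTMLParser):
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.skip_depth = 0  # > 0 while inside <script>/<style>
        self.words = []
        self.counts = {"h1": 0, "h2": 0, "h3": 0, "article": 0, "section": 0}

    def handle_starttag(self, tag, attrs):
        if tag in self.counts:
            self.counts[tag] += 1
        if tag in self.SKIP:
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if self.skip_depth == 0:
            self.words.extend(data.split())


def audit(html):
    parser = VisibleText()
    parser.feed(html)
    visible = " ".join(parser.words)
    return {
        "words": len(parser.words),
        "counts": parser.counts,
        "ratio": len(visible.encode()) / max(len(html.encode()), 1),
    }


# A scaffold-only page versus a page with real server-rendered content.
scaffold = "<html><body><div id='app'></div><script>boot()</script></body></html>"
rendered = (
    "<html><body><article><h1>Wedding Photographer in Paris</h1><p>"
    + "word " * 200
    + "</p></article></body></html>"
)
print(audit(scaffold))  # zero visible words, zero headings
print(audit(rendered))  # real word count, one <h1>, one <article>
```

<p>Run against a live page, the same <code>audit()<\/code> call takes the raw HTML fetched under an AI crawler&#8217;s user-agent, with no JavaScript executed in between.<\/p>\n\n\n\n<p>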
The deep dive on that specific gap will land in the <a href=\"https:\/\/franklyn-k.com\/lab\/\">Lab section<\/a> shortly.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Why this is not the same as a Google ranking problem<\/h2>\n\n\n\n<p>I want to draw a clean line here, because the conversation tends to slide. Google&#8217;s crawler can render JavaScript. It is slower and more expensive and there are documented edge cases, but in principle a Showit site can rank in Google search.<\/p>\n\n\n\n<p>AI crawlers in 2026 are not Googlebot. They are pulling content for retrieval-augmented generation, for citation, for training, and for live web answers. The cost of rendering JavaScript at scale across the open web is high enough that none of the major AI crawlers do it the way Google does. They want a document. If your CMS does not produce a document on first byte, you are not in the index that feeds the answer.<\/p>\n\n\n\n<p>The downstream effect is concrete. When a couple types a question into ChatGPT or Perplexity, the model retrieves a small number of source pages and cites them. Those citations drive a new kind of referral traffic that I am already seeing in my own logs. If your home page is invisible to the retrieval step, you are not in the citation set, and you do not get the click.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">What to do about it<\/h2>\n\n\n\n<p>The honest answer is that this is a CMS choice, not a plugin. You cannot patch your way out of a JavaScript-rendered page. The options are real and limited.<\/p>\n\n\n\n<p>Move the public-facing site to a system that produces server-rendered HTML on first byte. WordPress with a non-bloated theme works. A static site generator works. Hand-coded HTML works. Squarespace works for most of what photographers need, with caveats around custom layouts. 
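<\/p>\n\n\n\n<p>Whichever option you choose, the acceptance check mirrors the test above: request the page with an AI crawler&#8217;s user-agent and confirm the phrases a client would search for are already in the raw bytes. A minimal sketch, stdlib only; the fetch helper and the sample pages are illustrative, not my production tooling:<\/p>

```python
# Check whether the unrendered HTML of a page already contains given
# phrases. fetch_as_gptbot() is an illustrative helper; the two sample
# responses below stand in for live pages.
import urllib.request

GPTBOT_UA = "GPTBot/1.2 (+https://openai.com/gptbot)"


def first_byte_contains(raw_html, phrases):
    """Map each phrase to whether it appears in the raw, unrendered HTML."""
    lowered = raw_html.lower()
    return {p: p.lower() in lowered for p in phrases}


def fetch_as_gptbot(url):
    """Fetch a page while announcing the GPTBot user-agent; no JS runs."""
    req = urllib.request.Request(url, headers={"User-Agent": GPTBOT_UA})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")


# Offline demo: a JavaScript scaffold fails, server-rendered HTML passes.
scaffold = "<html><body><div id='root'></div><script src='app.js'></script></body></html>"
rendered = "<html><body><h1>Wedding Photographer in Provence</h1></body></html>"
print(first_byte_contains(scaffold, ["wedding photographer", "Provence"]))
print(first_byte_contains(rendered, ["wedding photographer", "Provence"]))
```

<p>If every phrase comes back <code>False<\/code> on the raw response, the page is invisible to the retrieval step no matter how it looks in a browser.<\/p>\n\n\n\n<p>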
The point is that the bytes that arrive at the bot must contain the actual content.<\/p>\n\n\n\n<p>If you cannot move, at minimum produce a parallel content layer that AI crawlers can read. A blog on a sub-path that runs WordPress, a <code>\/portfolio.html<\/code> static page that mirrors your gallery, an <code>llms.txt<\/code> file that points to your real content. These are mitigations, not fixes, but they put something in the index.<\/p>\n\n\n\n<p>I built <a href=\"https:\/\/vision.franklyn-k.com\">PhotoSEO Vision<\/a> for the diagnostic part of this work, and I am opening it up to other photographers in the next quarter. The tool runs the user-agent test I described above, scores each page across the three AI crawlers, and points to the elements that collapse. If you want to know what your site looks like to GPTBot, that is the fastest way.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">What I expect to see in 2026<\/h2>\n\n\n\n<p>Two predictions. First, the platforms with server-rendered HTML will quietly absorb a meaningful share of the AI-citation traffic, and the photographers on those platforms will see the gain in their Search Console and analytics without understanding why. Second, the gap between what a human visitor sees and what a bot sees will become a measurable variable in how a portfolio performs commercially, and CMS choices made in 2018 will start to show their cost.<\/p>\n\n\n\n<p>The next two articles in this series go deeper into the mechanics. The next one looks at filenames as a ranking factor and what changed when I renamed 100 wedding photos: <a href=\"https:\/\/franklyn-k.com\/blog\/photography-file-naming-seo-2026\/\">Photography File Naming in 2026<\/a>. 
The third walks through the per-bot rendering differences I observed in this test: <a href=\"https:\/\/franklyn-k.com\/blog\/ai-crawlers-photography-portfolio-test\/\">How GPTBot, ClaudeBot, PerplexityBot See Your Portfolio<\/a>.<\/p>\n\n\n\n<p>If you want to see the press coverage and broader context for this research, the <a href=\"https:\/\/franklyn-k.com\/press\/\">press page<\/a> has the references.<\/p>\n\n","protected":false},"excerpt":{"rendered":"<p> [&hellip;]<\/p>\n","protected":false},"author":0,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[73],"tags":[],"class_list":["post-10550","post","type-post","status-publish","format-standard","hentry","category-research"],"acf":[],"_links":{"self":[{"href":"https:\/\/franklyn-k.com\/blog\/wp-json\/wp\/v2\/posts\/10550","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/franklyn-k.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/franklyn-k.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/franklyn-k.com\/blog\/wp-json\/wp\/v2\/comments?post=10550"}],"version-history":[{"count":2,"href":"https:\/\/franklyn-k.com\/blog\/wp-json\/wp\/v2\/posts\/10550\/revisions"}],"predecessor-version":[{"id":10557,"href":"https:\/\/franklyn-k.com\/blog\/wp-json\/wp\/v2\/posts\/10550\/revisions\/10557"}],"wp:attachment":[{"href":"https:\/\/franklyn-k.com\/blog\/wp-json\/wp\/v2\/media?parent=10550"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/franklyn-k.com\/blog\/wp-json\/wp\/v2\/categories?post=10550"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/franklyn-k.com\/blog\/wp-json\/wp\/v2\/tags?post=10550"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}