The JS rendering point is critical. Even though bots like GPTBot technically have headless capabilities, they often fall back to text-only extraction for non-priority pages to save compute. We see a lot of "invisible" content because of this, especially in e-commerce.
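If you want a rough idea of what a text-only crawler actually gets from a page, a quick sketch like this works (assuming requests and beautifulsoup4; the URL is a placeholder):

    # Compare the text a plain-HTML crawler can see against how much of the
    # page ships inside <script> tags (hydration payloads, JSON blobs, etc.).
    import requests
    from bs4 import BeautifulSoup

    def text_only_view(url: str) -> dict:
        html = requests.get(url, timeout=10, headers={"User-Agent": "geo-check/0.1"}).text
        soup = BeautifulSoup(html, "html.parser")
        script_chars = sum(len(s.get_text()) for s in soup.find_all("script"))
        for tag in soup(["script", "style", "noscript"]):
            tag.decompose()
        visible = soup.get_text(" ", strip=True)
        return {"visible_chars": len(visible), "script_chars": script_chars}

    print(text_only_view("https://example.com"))

A tiny visible_chars next to a huge script_chars usually means most of the content only exists after client-side rendering.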
One other signal to check: internal linking structure. AI crawlers seem to respect semantic clusters more than traditional PageRank flow. If your "about" page isn't semantically linked to your "product" page in a way the LLM understands as a relationship, it often hallucinates the connection.
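A crude first check, before worrying about anchor text or semantics, is whether the two pages even link to each other in the raw HTML (sketch assuming requests and beautifulsoup4; URLs are placeholders):

    # Pull the internal link set for a page and check both directions.
    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urljoin

    def links_from(url: str) -> set:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        return {urljoin(url, a["href"]).split("#")[0] for a in soup.find_all("a", href=True)}

    about = "https://example.com/about"
    product = "https://example.com/product"
    print(product in links_from(about), about in links_from(product))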
Interesting concept. The gap between traditional SEO and AI visibility is real — I've noticed that structured, opinionated content with clear problem/solution framing tends to get cited by AI models much more than keyword-optimized content.
One signal I think matters a lot for GEO that traditional SEO ignores: specificity of claims. AI models seem to prefer content that makes concrete, verifiable statements over vague authority pages. Would be cool to see Potatometer check for that kind of content quality signal.
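If anyone wants to play with that idea, a toy heuristic is the density of concrete tokens (numbers, prices, percentages) per sentence. Purely a sketch, not a validated metric:

    # Fraction of sentences containing at least one concrete, checkable token.
    import re

    CONCRETE = re.compile(r"\$?\d+(?:\.\d+)?%?")

    def specificity(text: str) -> float:
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        hits = sum(1 for s in sentences if CONCRETE.search(s))
        return hits / max(len(sentences), 1)

    print(specificity("We are the leading provider of solutions."))        # 0.0
    print(specificity("Crawl volume dropped 34% after the 2024 update."))  # 1.0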
Hey HN, creator here. Happy to answer questions on how the scoring works. Potatometer checks both traditional SEO signals and GEO factors (structured data, citation-friendliness, entity clarity, topical authority) and then gives you specific, actionable fixes rather than just a score. Also building out AI citation scoring and a content roadmap for AI search visibility if anyone is interested in that direction.
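To give a concrete sense of one of the checks: the structured-data signal is basically "does the page ship parseable JSON-LD, and which schema.org types does it declare." A simplified sketch of that one check, not the actual implementation (assumes requests and beautifulsoup4; the URL is a placeholder):

    # List the schema.org @type values declared in a page's JSON-LD blocks.
    import json
    import requests
    from bs4 import BeautifulSoup

    def jsonld_types(url: str) -> list:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        types = []
        for block in soup.find_all("script", type="application/ld+json"):
            try:
                data = json.loads(block.string or "")
            except json.JSONDecodeError:
                continue
            items = data if isinstance(data, list) else [data]
            types += [item.get("@type", "?") for item in items if isinstance(item, dict)]
        return types

    print(jsonld_types("https://example.com"))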
For a GEO site, WebFetch can't work? I got: "The page is JavaScript-rendered so WebFetch can't see the results."
This is actually a broader web fetching limitation, not specific to Potatometer. Most AI crawlers like GPTBot face the same challenge with JS-rendered sites, which is itself a GEO signal worth knowing. I am exploring headless rendering to get around it. What site were you testing?
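For reference, the direction I'm exploring looks roughly like this (a sketch using Playwright, not necessarily what ships; needs pip install playwright and playwright install chromium; placeholder URL):

    # Render the page in headless Chromium and grab the post-hydration text.
    from playwright.sync_api import sync_playwright

    def rendered_text(url: str) -> str:
        with sync_playwright() as p:
            browser = p.chromium.launch(headless=True)
            page = browser.new_page()
            page.goto(url, wait_until="networkidle")
            text = page.inner_text("body")
            browser.close()
        return text

    print(len(rendered_text("https://example.com")))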