FAQ: Understanding Your 7-Second AI Visibility Report

You’ve just received your 7-second AI visibility report. That means you’ve taken the most important step: you’ve finally looked at your website the way AI systems see it. This FAQ is here to translate the technical pieces into normal language, so you understand what each line means for your business and what to do next.

Let’s go through it calmly, one question at a time.

What does the report color actually mean?

The overall color is not a “school grade”. It is a signal of how visible and understandable your site is to AI systems.

Green means: your site is visible, readable, and technically accessible. AI crawlers such as GPTBot, ClaudeBot, Google’s AI bots and others can reach your pages, parse your content, and at least understand who you are, where you are, and what you offer.

Yellow means: partial visibility. The site looks fine at first glance, but one or two critical elements are broken or missing. Because of that, AI models either misinterpret you or cannot fully trust your content. It might be something “small” like missing GEO markup, a restrictive robots.txt, or bad structured data. For humans this seems minor, but for AI it is a stop signal.

Red means: your site is effectively invisible to AI. It may load quickly for a human, but for AI crawlers it is blocked, unreadable, or technically unreachable. In that state, you simply do not exist in the AI discovery layer.

Why can a yellow report be as dangerous as a red one?

Because in the AI world “almost good” often means “does not work”. One wrong directive in robots.txt, one missing Schema.org block, one broken response code – and AI systems treat your site as unreliable or inaccessible.

AI does not grade you on effort. It works in a binary mode: either it can safely use your site as a source, or it quietly steps away and uses someone else. That is why a yellow report is not “okay, we’ll fix it later”. It is a warning that one critical weakness is pulling everything else down.

What is TTFB (Time to First Byte) and why should I care?

TTFB is the time it takes for your server to send the very first byte of data back to the crawler. Think of it as the time between a knock on the door and your first “hello”.

For AI crawlers, time is a hard resource. If your TTFB is slow (over roughly 800 ms), many crawlers simply stop waiting and move on. A good TTFB is usually in the 200–300 ms range; excellent is under 150 ms. Fast TTFB tells AI: “This site is responsive, safe to crawl, and cheap to process.”

What is Total Response Time?

Total Response Time is how long it takes from the initial request until your server delivers the complete response. Sometimes TTFB is fine, but then your page loads heavy scripts, images, or complex logic that slows everything down.

AI crawlers measure not just the first byte but the overall experience. If the full response is too slow, they will crawl fewer pages, come less often, or skip your site in favor of faster, simpler alternatives. When you see high Total Response Time in the report, it’s a strong signal to work on performance, caching, and page weight.
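
If you want to sanity-check both numbers yourself, here is a minimal sketch in Python, using only the standard library; the URL is a placeholder for your own homepage:

```python
import time
import urllib.request

URL = "https://example.com/"  # placeholder: use your own homepage

start = time.perf_counter()
with urllib.request.urlopen(URL, timeout=10) as response:
    response.read(1)                     # the first byte of the body has arrived
    ttfb = time.perf_counter() - start
    response.read()                      # read the rest of the body
    total = time.perf_counter() - start

print(f"TTFB:  {ttfb * 1000:.0f} ms")    # good: 200-300 ms, excellent: under 150 ms
print(f"Total: {total * 1000:.0f} ms")
```

Keep in mind that numbers measured from your own machine include your local network latency, so treat them as a rough signal, not as exactly what a crawler sees from its data center.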

What is robots.txt and why is it so critical?

robots.txt is a tiny text file at the root of your website that gives instructions to crawlers. It can either open the door or slam it shut.

If your robots.txt contains a directive like “Disallow: /”, you are literally telling crawlers: “Do not come in.” And they obey. That applies not only to classic search engines, but also to AI crawlers that learn and build knowledge from websites.

One line in robots.txt can make the difference between “we see you” and “you do not exist”. Your report highlights whether your robots.txt is giving the right signals or blocking you by accident.
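
For illustration, here is a minimal robots.txt that stays open to the major AI crawlers while still protecting a private area. The user-agent names are the publicly documented ones; the paths are placeholders for your own site:

```
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: *
Allow: /
Disallow: /admin/
```

The dangerous pattern is the opposite one: a blanket “Disallow: /” left over from a staging environment.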

What is structured data and why does it matter?

Structured data (Schema.org, JSON-LD, etc.) is a layer of small code blocks that explain to machines what your content actually is.

For a human, “We are a dental clinic in Berlin, open on Saturdays” is clear. For an AI system, without structured data, it’s just another sentence. With structured data, your site becomes: LocalBusiness → Dentist → located in Berlin → with opening hours and contact info. That is a machine-readable fact, not just prose.
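
As a sketch, that dentist example could be expressed as a JSON-LD block in the page’s <head>; every name and value below is a placeholder:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Dentist",
  "name": "Example Dental Clinic",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Berlin",
    "addressCountry": "DE"
  },
  "openingHoursSpecification": {
    "@type": "OpeningHoursSpecification",
    "dayOfWeek": "https://schema.org/Saturday",
    "opens": "09:00",
    "closes": "14:00"
  },
  "telephone": "+49-30-0000000"
}
</script>
```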

If your report flags missing or weak structured data, it means AI models have to guess who you are, and they usually prefer not to guess. Sites with clear, rich structured data are far more likely to be trusted and featured in AI answers.

What is GEO markup?

In 7secAI, GEO stands for Generative Engine Optimization, the next evolution of SEO. It means preparing your site not just for search engines, but for the AI systems that now deliver answers directly. GEO is about being understandable, trustworthy, and machine-readable.

A key part of GEO is geo markup: structured information about where your business actually operates, such as your city, country, coordinates, and service area. For humans, an address in the footer looks fine. For AI, plain text is vague. Geo markup removes that ambiguity. It tells the system clearly: “This plumber is in this city, serving this region.”
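
As an illustration, here is what that could look like in JSON-LD, with placeholder values (Plumber and GeoCoordinates are standard Schema.org types):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Plumber",
  "name": "Example Plumbing Service",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Berlin",
    "addressCountry": "DE"
  },
  "geo": {
    "@type": "GeoCoordinates",
    "latitude": 52.5200,
    "longitude": 13.4050
  },
  "areaServed": "Berlin and Brandenburg"
}
</script>
```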

Without proper geo markup, you risk being recommended to the wrong audience or being excluded entirely because the model isn’t confident about your location. For any local business, this signal is one of the most important pieces of the visibility puzzle.

What is Brand Detection in the report?

Brand Detection is about whether AI systems can recognize you as a real, coherent brand instead of “just another website”.

If your name, logo, domain, and social presence connect together clearly, AI can form a stable “mental model” of who you are. If everything is fragmented – different names, inconsistent contacts, no external mentions – AI has trouble trusting and reusing your content.

A strong Brand Detection score means: “This entity exists in more than one place, with consistent signals.” A weak one means: “This could be anything; better to rely on other, clearer sources.”
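
One concrete way to connect those signals is the sameAs property in standard Schema.org Organization markup, which explicitly links your domain to your external profiles. The URLs below are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Dental Clinic",
  "url": "https://example.com",
  "logo": "https://example.com/logo.png",
  "sameAs": [
    "https://www.linkedin.com/company/example-clinic",
    "https://www.instagram.com/example.clinic"
  ]
}
</script>
```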

What does AI Indexing Status mean?

AI Indexing Status reflects whether your site is technically accessible and permitted to be included in AI training and retrieval processes.

If the report shows “Allowed” or “Open”, crawlers are, in principle, free to read and index your content. If it shows “Restricted” or “Blocked”, either your settings (robots.txt, headers, access rules) or missing files prevent AI from safely using your pages.

From the outside, your site may look normal. But from the AI side, a single misconfiguration can demote you from “candidate source” to “invisible background noise”.
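
If you want a rough self-check of the most common blockers, a minimal Python sketch (standard library only; the domain is a placeholder) looks like this:

```python
import urllib.request

SITE = "https://example.com"  # placeholder: use your own domain

for path in ("/", "/robots.txt", "/llms.txt"):
    try:
        with urllib.request.urlopen(SITE + path, timeout=10) as r:
            # Baseline for indexability: a 200 status and no
            # restrictive X-Robots-Tag header (e.g. "noindex").
            print(path, r.status, r.headers.get("X-Robots-Tag", "-"))
    except Exception as exc:
        print(path, "FAILED:", exc)
```

This does not replace the report, but it shows in seconds whether crawlers can reach your key files at all.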

What is llms.txt and why is everyone talking about it?

llms.txt is a new, emerging convention similar to robots.txt, but aimed specifically at large language models (LLMs). It is the place to describe your site in an LLM-friendly way and to state what AI systems may or may not do with your content.

If you have no llms.txt, AI crawlers have less clarity about your preferences. If you have a clear, correctly configured llms.txt that allows reading and usage under certain conditions, it sends a strong “green light” signal.

Your report shows whether llms.txt exists and whether it looks healthy. For forward-looking sites, this file is quickly becoming a hygiene factor.
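
The format is still settling, but a minimal llms.txt in the spirit of the llmstxt.org proposal is a short Markdown file at your site root. Everything below is a placeholder sketch:

```
# Example Dental Clinic

> Dental clinic in Berlin, open on Saturdays. Content on this site
> may be read and cited by AI systems.

## Key pages
- [Services](https://example.com/services): treatments we offer
- [Contact](https://example.com/contact): address, hours, phone
```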

What are Social Signals in the context of AI visibility?

Social signals are mentions and traces of your brand across external platforms: business directories, reviews, social networks, professional profiles.

AI systems increasingly use these as evidence that you are real, active, and trusted. If your domain appears nowhere except your own site, you look isolated and less reliable. If your name appears consistently across multiple sources, it strengthens your authority.

The report doesn’t judge your popularity. It simply shows whether there is any external “echo” of your brand for AI to pick up.

Why can a single red metric drag the whole report down?

Because we are moving into a binary environment. In the old SEO world, you could be “more or less optimized” and still appear somewhere. In the AI world, there is a growing set of minimum conditions.

If one of those core conditions fails – for example:

– your robots.txt blocks crawlers,
– your server responds too slowly,
– your structured data is missing or broken,
– a key page returns the wrong status code –

then the practical result is the same as being offline: you are skipped. AI systems are optimized for scale, not sympathy. They simply choose a cleaner, more structured site instead of yours.

That is why there are no “small things” anymore. One misconfigured file or missing tag can negate years of effort.

What should I do if my report is yellow or red?

First, don’t panic. The report is not a verdict; it is a diagnostic. It shows you the gap between how you see your site and how AI sees it.

Second, treat the highlighted issues as a priority list. Fix the obvious blockers: robots.txt, speed problems, missing structured data, absent GEO markup, broken status codes.

Third, if you want to go beyond “technical okay” and really understand how AI interprets your business, run a Deep Scan. The 7-second report tells you whether you are visible. The Deep Scan shows how you are perceived: which pages are read, what role your brand plays, how strong your trust signals are, and where you are losing opportunities.

The Bottom Line

Your 7-second report is not just a checklist. It is a snapshot of your place in the new AI-driven web. In this space, there is less and less room for “almost”.

Either you are visible, or you are not.

Either you are part of the answer, or you are not.

Either you win, or you quietly lose.

There is no safe middle.

If you already know this, you are ahead of many. The next step is simple and hard at the same time: act. Fix what blocks you, strengthen what supports you, and use the tools that help you see your site the way AI does.