Last week I ran an experiment on myself. I typed into ChatGPT: "What are good AI-native development consultancies I should look at for memory-augmented agents?"

I've been building exactly that at Kurka Labs — a memory layer, a proxy generation pipeline, a multi-model consensus engine. Public site at kurkalabs.dev. Active Substack. Published book on AI. Twenty active repos on GitHub. If there's a firm in that space to surface, I'd expect to at least be on the list.

ChatGPT didn't return consultancies. It returned tools — Mem0, Letta, Zep, Microsoft Kernel Memory, LangMem. Useful list. Zero firms.

Claude gave me a nearly identical list. Also no firms.

Perplexity cited me — once — buried beneath eight more prominent mentions of agencies I've never heard of.

Then I tried a more direct query: "Kurka Labs AI consultancy." ChatGPT returned Daniel Kurka at Uniswap Labs, a company called Kura Labs (unrelated), another called Kureka Labs (unrelated), and generic "AI Labs" firms from all over the world. My own firm wasn't in the top fifteen results.

The third experiment was the most telling. I queried: "Frank Kurka Kurka Labs MemoryOS." It returned my GitHub. It returned my LinkedIn, which is materially stale and still describes me as "Retired Software Dev, currently creating Gen AI Agent Applications." It returned a completely separate app called memoryOS that sells memory-training games. My actual MemoryOS — the product I've been writing about weekly for months — did not appear.

This is me. With a website. Multiple products. Published book. Active writing. I am, on paper, exactly the kind of business that should be easy to find. And AI can't find me cleanly.

If this is the situation for someone who has spent a year writing about AI weekly, imagine what it looks like for a small business whose digital footprint is just a WordPress site from 2019 and a Facebook page.

This is a truth that hasn't landed for most small-business owners yet: AI assistants are now a traffic channel that your website almost certainly doesn't serve. And the feedback loop is broken. Your analytics don't tell you that ChatGPT failed to mention you yesterday. There's no "AI referral" line in Google Analytics. The only way to know you're invisible to AI is to check, and almost nobody checks.
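The check is easy to script, too. Here's a minimal sketch, assuming the OpenAI Python SDK, with a hypothetical business name and hypothetical queries. Note that it probes the bare model rather than the full ChatGPT product with web search, so treat it as a first signal, not a verdict:

```python
# Spot-check whether an AI assistant mentions your business
# for the questions your customers actually ask.
# Business name and queries are hypothetical placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

BUSINESS = "Acme Industrial Design"
QUERIES = [
    "What are good industrial design firms in Boston?",
    "Acme Industrial Design Boston",
]

for query in QUERIES:
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": query}],
    )
    answer = response.choices[0].message.content or ""
    verdict = "MENTIONED" if BUSINESS.lower() in answer.lower() else "invisible"
    print(f"{query!r}: {verdict}")
```

Run something like this monthly and you've built the "AI referral" line your analytics dashboard is missing.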

Most of the advice floating around about fixing this is also wrong. Not wrong in the sense of "inaccurate" — the technical recommendations are usually correct in isolation. Wrong in the sense of solving the wrong problem at the wrong layer.

Let me explain what I mean.


The retrofit trap

Search "how to rank on ChatGPT" or "optimizing your site for AI" and you'll get posts with titles like 10 ways to optimize your site for ChatGPT or add these schema types for better AI visibility or why you need an llms.txt file. The advice is usually technically accurate — add JSON-LD structured data, write clean semantic HTML, publish an llms.txt. The problem is the layer it's operating at.

All of that advice assumes your existing website is the right substrate for AI visibility and you just need to decorate it better. That's an architectural mistake.

Your website was built for human readers arriving via search engines. That's a specific design goal: content organized for scanning, calls-to-action placed where eyeballs hover, paragraph-level language tuned for Google's keyword-relevance algorithms, navigation designed around the human attention budget. If you paid an agency to build your site in the last five years, they optimized it for that job, and they probably did a good job.

AI assistants are doing something fundamentally different. They're machine readers arriving via natural-language queries. They don't navigate. They don't scan. They retrieve — pulling specific answers from a knowledge base they've either learned during training or look up at inference time via tool use. What makes content retrievable for them is almost entirely different from what makes content rankable for search engines.

Trying to retrofit a human-reader site to serve a machine-reader AI is like asking a restaurant kitchen to also be a chemistry lab. You can do it, kind of. The meals taste worse. The experiments are imprecise. Better to build the lab next door.


The architectural difference

Search engines rank pages. AI assistants retrieve answers.

That's the whole thing. Everything downstream follows from it.

Google is a keyword-to-URL machine. You type "industrial design Boston," it scores billions of URLs on relevance to that keyword pattern, it returns a ranked list. The unit of output is a link. The user clicks, reads, and makes their own decision. SEO is the practice of making your URL rank well for the keywords you care about. Every best practice — backlinks, schema, page speed, internal linking, meta descriptions — serves that ranking goal.

AI assistants don't return links. They return answers. The user asks a natural-language question, the model generates a natural-language response, and if the model mentions a business or a product in that response, it's citing the business by name — often without even linking to its URL.

For a business to get cited, one of two things has to happen: either the model absorbed the business during training, which you can't influence on any useful timescale, or the model finds the business at retrieval time, via search or tool use, while composing its answer. The second path is the one you can actually engineer for.

Retrieval-time citation depends on the business being findable and summarizable when the model goes looking. The model needs a clear, machine-parseable statement of what the business does, who it serves, what makes it distinctive. It needs enough structured signal for entity resolution — enough for the model to recognize "this is the right Boston industrial design firm I'm being asked about." It needs recent content that's actually about the topic being asked, not just tangentially related. And it needs a low-friction format that doesn't require parsing forty blog posts to extract the relevant fact.
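To make the entity-resolution piece concrete, here's the kind of structured signal that helps, sketched as schema.org JSON-LD for a hypothetical firm. The properties shown are illustrative, not a checklist:

```html
<!-- Hypothetical firm; every value here is a placeholder. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ProfessionalService",
  "name": "Acme Industrial Design",
  "description": "Industrial design firm in Boston specializing in consumer hardware prototyping for early-stage companies.",
  "url": "https://acme-example.com",
  "areaServed": "Boston, MA",
  "sameAs": [
    "https://www.linkedin.com/company/acme-example",
    "https://github.com/acme-example"
  ]
}
</script>
```

The sameAs links do the disambiguation work: they let a model collapse the website, the LinkedIn page, and the GitHub org into one entity instead of three maybes, which is exactly what failed in my "Kurka Labs" experiment above.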

Your existing website probably doesn't provide any of those cleanly. It wasn't built to. It was built for humans.


Parallel authority: what to actually do

The mistake the retrofit camp is making is thinking about AI visibility as a plugin to your existing site. It's not. It's a separate site, purpose-built, running next to your main site without replacing it.

I call this Parallel Authority. Your main site continues to do what it was built to do — serve humans arriving via search engines, represent your brand visually, host your portfolio and case studies. That's fine. Don't touch it.

Next to it, you stand up a second site, engineered from the ground up for AI answerability. It has a clean machine-readable statement of what your business does in llms.txt and llms-full.txt. It has semantic markup that a language model can parse without guessing. It has content organized around the questions someone might ask an AI, not around the keywords someone might search in Google. It has a chatbot that represents your business's voice to visitors the AI might send. It has regular updates — weekly news ingestion, because a site that hasn't changed in six months reads as stale to both search crawlers and AI retrieval systems. And it has clear entity signals so the AI can confidently say "yes, this is the right business to mention."
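If you haven't seen one, llms.txt is just markdown with a conventional shape: an H1 name, a blockquote summary, then sections of annotated links. A minimal sketch for a hypothetical firm, following the llmstxt.org convention:

```markdown
# Acme Industrial Design

> Boston industrial design firm specializing in consumer hardware
> prototyping for early-stage companies.

## Services

- [Prototyping](https://acme-example.com/prototyping): concept to working prototype in six weeks
- [Design audits](https://acme-example.com/audits): manufacturability reviews of existing products

## About

- [Who we serve](https://acme-example.com/clients): early-stage hardware startups, typically pre-Series A
```

llms-full.txt follows the companion convention: the same outline with the full page content inlined, so a retrieval system gets everything in one fetch instead of crawling the links.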

The site sits on a separate domain. It ranks in search in parallel, picking up queries your main site can't reach, because it's structured differently. It gets cited by AI in parallel, because it's engineered for that.

You don't migrate your existing site. You don't replace it. You don't install a plugin. You don't break anything. The parallel site is additive: zero impact on your existing marketing, zero risk to your existing rankings, zero change to how your team works.

This is the approach we've been building at Kurka Labs. It's productized as a service called Beacon Proxy — we scan your current presence, position you around niches an AI assistant will understand, generate a purpose-built proxy site, and maintain it. The first clients are live. I'll share before-and-after AI visibility data once we've got sixty days of post-deploy measurement.


Why the retrofit approach will struggle

Retrofit thinking comes from the SEO world, and SEO people are used to the idea that optimization is always an incremental game on an existing surface. Add more backlinks. Improve content depth. Speed up the site. Better schema. You don't replace the site — you optimize it over time.

That model assumes a substrate that can, in principle, be optimized into working for the new channel. For AI, that substrate mostly can't. The retrofit work makes your site somewhat more machine-parseable; it doesn't make the site designed for machine parsing. That design-center gap is the part retrofit can't close.

This isn't a complaint about SEO practitioners — the retrofit playbook was the right answer for a long time. Responsive design, HTTPS, Core Web Vitals, accessibility — all were retrofit moves, and they mostly worked because they were about delivering the same content to the same readers through a slightly different medium. The readers were still humans. The substrate still had to do the same job.

AI retrieval is a different job, not a different delivery of the same job. The reader isn't a human anymore — it's a model making a judgment about what to cite. Content structured for a human scanner reads very differently to a retrieval system. Building parallel is how you serve both jobs well; retrofit is how you serve one job adequately and the other job poorly.


The meta lesson

I spent a lot of years watching people retrofit old substrates to new jobs. Responsive design retrofitting desktop sites for mobile. HTTPS retrofitting old HTTP infrastructure. Core Web Vitals retrofitting heavyweight sites to perform on real-world devices. Each of those retrofit cycles took five to seven years and ended with most of the industry rebuilding from scratch anyway.

The retrofit posture is comfortable because it lets existing vendors extend their existing playbook without changing how they work. It's also usually wrong when the underlying job has changed shape.

The job has changed shape. Humans no longer ask Google exclusively. They ask chatbots. The chatbots don't return links. They return answers. The substrate that generates good answers is different from the substrate that ranks URLs. Pretending otherwise is comfortable but costs you visibility.

If you're running a small business that cares about being discoverable — not just on Google, but in the AI-mediated world your customers are increasingly living in — the practical move isn't to keep polishing your existing site. It's to build a parallel one that's engineered for the job you want done.

You don't change anything about what you have. You add something new next to it.


If this framing is useful, there's a free scan at kurkalabs.dev/beacon that tells you where your business currently stands with AI. Takes about a minute. We can take it from there.

Frank Kurka is the creator of TQL (The Question Layer), MemoryOS, and the Glass Bead Game development paradigm. He is the author of What is Artificial Intelligence, has been building software for 45+ years, and writes weekly at fkxx.substack.com.