I checked my server logs last Tuesday. Traffic was up. Way up. But engagement was flat. Same number of humans reading posts. The extra visitors weren’t reading anything at all.

They weren’t visitors. They were agents.
You Are Now the Minority
In 2024, automated traffic surpassed human traffic on the internet for the first time in a decade. Bots now account for 51% of all web traffic. Cloudflare processes 50 billion AI crawler requests per day. GPTBot traffic alone grew 305% in one year.
The web you built for humans? Humans aren’t the primary audience anymore.
Retail sites see 59% bot traffic. Travel sites: 48%. These aren’t all scrapers or spam bots. Increasingly, they’re shopping agents, research agents, booking agents. Doing things humans used to do, on websites humans used to visit.
Cloudflare published a stat that stopped me cold. For every single visitor Anthropic refers back to a website, its crawlers have already visited 38,065 pages. OpenAI’s ratio is 1,091 to 1. Perplexity: 194 to 1. The agents read your site thousands of times for every human they send your way.
The web hasn’t died. But it’s molting.
The Protocol War
If 2024 was the year we noticed agent traffic, 2025 was the year everyone started building the plumbing.

Anthropic released MCP (Model Context Protocol) in November 2024. People call it “USB-C for AI,” a universal adapter that lets any AI system talk to any tool or service. It now has 97 million monthly SDK downloads and over 10,000 active servers. In December 2025, Anthropic donated it to the Linux Foundation’s new Agentic AI Foundation, co-founded with Block and OpenAI. Platinum members include AWS, Google, Microsoft, Bloomberg, and Cloudflare.
Google launched A2A (Agent-to-Agent Protocol) in April 2025. It lets agents from different vendors discover each other using “Agent Cards,” basically JSON resumes. Over 150 organizations signed on, including Microsoft, Amazon, SAP, Salesforce, and PayPal. Adobe and S&P Global already use it in production.
Then the commerce-specific protocols showed up. Shopify and Google co-developed UCP (Universal Commerce Protocol), endorsed by Etsy, Wayfair, Target, and Walmart. OpenAI and Stripe built ACP (Agentic Commerce Protocol), which powers “Buy it in ChatGPT,” launched February 2026.
There’s more. Jeremy Howard proposed llms.txt, a file that tells LLMs where your best resources are (the inverse of robots.txt, which tells crawlers where NOT to go). Over 600 sites adopted it, including Anthropic, Stripe, and Cloudflare. Vercel went further, proposing LLM instructions embedded directly in HTML via <script type="text/llms.txt">. Their 401 error pages already serve agent-specific instructions.
This is the HTTP moment for agents. The protocols being written right now will shape how the agentic web works for the next decade.
When Your User Has No Eyes
We’ve spent thirty years making websites look good. Careful typography. Hero images. Hover effects. Cookie banners with the “Accept All” button slightly bigger than the “Manage Preferences” button. All designed for humans who see, click, and feel.
Your next billion users won’t see any of it.
An AI shopping agent doesn’t care about your hero image. It doesn’t notice your brand colors. It doesn’t feel the emotional pull of your “Limited Time Only” banner. It parses your structured data, checks your Schema markup, reads your JSON-LD, and makes a decision based on price, specs, availability, and reviews.
CSS is irrelevant when your user has no eyes.
Bain found that 80% of consumers already rely on zero-click results for at least 40% of their searches, reducing organic traffic by 15-25%. Google referrals to news sites dropped 9-15% in 2025. That funnel where you attract visitors with content, dazzle them with design, and convert them with psychology? Agents skip the entire thing. They go straight to the data layer.
HubSpot put it bluntly: “The fastest-growing decision-maker in your funnel cannot see your ad, feel your brand, or be persuaded by your story.”
The advertising model of the internet is about to face its first existential threat since ad blockers. Except ad blockers were opt-in. Agent browsing is default. When Perplexity’s Comet browser started bypassing Amazon’s advertising, Amazon sued. A federal judge blocked Comet from Amazon on March 10, 2026. Perplexity argued the real motivation was protecting ad revenue, not cybersecurity.
That lawsuit is a preview. The entire attention economy was built on the assumption that humans look at screens. Agents don’t look at anything.
The Money Is Already Moving
This isn’t theoretical. The money has already started flowing through agent channels.
During Cyber Week 2025, one in five orders globally was associated with AI tools or agents. That’s 20% of all orders, roughly $67 billion. On Cyber Monday alone, AI traffic to US retail sites increased 670%. AI-influenced shoppers converted 38% more frequently than traditional visitors.
McKinsey estimates agentic commerce could redirect $3-5 trillion in global retail spend by 2030, with nearly $1 trillion from the US alone. Payment executives told CNBC this could be “more transformative than the rise of e-commerce platforms such as Amazon.”
The platforms are racing to own the checkout. Shopify launched Agentic Storefronts, letting merchants appear on ChatGPT, Perplexity, Microsoft Copilot, and Google AI Mode without needing a traditional website at all. Amazon built “Buy for Me,” an AI agent that purchases from third-party brand sites so customers never leave Amazon. OpenAI launched “Buy it in ChatGPT” in February with Stripe’s Agentic Commerce Protocol behind it.
Visa launched its Trusted Agent Protocol in October 2025, an open framework to distinguish legitimate AI agents from malicious bots. Mastercard is building its own trust framework. Both are running real transactions. Not pilot stage. Deployment.
47% of US shoppers already use AI tools for at least one part of their shopping journey. That number is only going in one direction.
What to Do About It
The agentic web is coming whether your site is ready or not. The transition will be messy, dual-interface, and gradual. Here’s what the practical path looks like.
Structured data first. Schema markup, JSON-LD, clean OpenGraph tags. This is the content layer agents actually read. If your product pages don’t have machine-readable pricing, availability, and specs, you’re invisible to agent shoppers.
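Concretely, this means a schema.org Product block in JSON-LD on every product page. A minimal sketch (the product name, SKU, and numbers here are illustrative, not from any real catalog):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Trailhead 40L Backpack",
  "sku": "TH-40-GRN",
  "offers": {
    "@type": "Offer",
    "price": "129.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "212"
  }
}
</script>
```

An agent never renders this page. It just parses the block and gets price, stock, and ratings in one pass.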
Add llms.txt. It takes ten minutes. Create a /llms.txt file that tells LLMs where your most useful resources live. Over 600 sites have done this already. It’s the new robots.txt, but instead of “go away” it says “here’s the good stuff.”
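The format, per Howard’s proposal, is plain Markdown: an H1 title, an optional blockquote summary, then H2 sections of annotated links. A sketch for a hypothetical store (all URLs illustrative):

```markdown
# Example Store

> Outdoor gear retailer. Product data, pricing, and return policy linked below.

## Products

- [Product catalog](https://example.com/products.json): machine-readable pricing and stock
- [Returns policy](https://example.com/returns.md): plain-text summary of the return window

## Docs

- [API reference](https://example.com/api.md): REST endpoints for orders and inventory
```

Serve it at the site root, next to robots.txt.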
Build an MCP server. If you have an API, wrap it in MCP. Anthropic, OpenAI, Google, and Microsoft clients all support the protocol. This is how agents will interact with your service natively, without scraping your UI.
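Under the hood, MCP is JSON-RPC 2.0: agents call methods like tools/list and tools/call, and servers answer with structured results. The official SDKs generate these messages for you; the hand-rolled sketch below just shows the wire shapes, with a hypothetical check_price tool and a toy inventory standing in for a real database:

```python
import json

# MCP tool definitions: name, description, and a JSON Schema for inputs.
# "check_price" is a hypothetical tool, not part of the protocol itself.
TOOLS = [
    {
        "name": "check_price",
        "description": "Return current price and stock for a SKU.",
        "inputSchema": {
            "type": "object",
            "properties": {"sku": {"type": "string"}},
            "required": ["sku"],
        },
    }
]

# Toy inventory standing in for a real database.
INVENTORY = {"TH-40-GRN": {"price_usd": 129.00, "in_stock": True}}


def handle(request: dict) -> dict:
    """Dispatch one JSON-RPC 2.0 request to a result, MCP-style."""
    if request["method"] == "tools/list":
        result = {"tools": TOOLS}
    elif request["method"] == "tools/call":
        sku = request["params"]["arguments"]["sku"]
        item = INVENTORY.get(sku, {"error": "unknown sku"})
        # Tool results come back as a list of content blocks.
        result = {"content": [{"type": "text", "text": json.dumps(item)}]}
    else:
        result = {}
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}
```

In practice you’d use the official Python or TypeScript SDK rather than hand-building messages, but the point stands: the agent-facing surface of your service is a handful of typed tools, not a UI.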
Rethink your metrics. Traffic is no longer a proxy for interest. An agent visiting your site 38,000 times doesn’t mean you have 38,000 interested customers. You need to distinguish agent traffic from human traffic and measure what agents actually do: transactions, API calls, data retrieved.
Plan for agent authentication. Visa and Mastercard are already building trust frameworks. If your business involves transactions, you’ll need a way to verify that the agent placing an order is authorized to act on behalf of a real customer.
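The details of Visa’s and Mastercard’s frameworks aren’t public in full, but the general shape such schemes take is familiar: the agent platform signs each request with a shared secret, and the merchant verifies the signature before acting. A minimal stdlib sketch of that shape (not the actual Trusted Agent Protocol; names are illustrative):

```python
import hashlib
import hmac

# Illustrative request signing, NOT Visa's Trusted Agent Protocol.
# Real schemes add timestamps, nonces, and key rotation on top of this.

def sign_request(secret: bytes, agent_id: str, payload: str) -> str:
    """Agent side: sign the agent identity plus the request body."""
    message = f"{agent_id}:{payload}".encode()
    return hmac.new(secret, message, hashlib.sha256).hexdigest()


def verify_request(secret: bytes, agent_id: str, payload: str, signature: str) -> bool:
    """Merchant side: recompute the signature and compare in constant time."""
    expected = sign_request(secret, agent_id, payload)
    return hmac.compare_digest(expected, signature)
```

Any tampering with the payload, or a signature from an unknown agent, fails verification before the order is ever placed.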
The visual web isn’t going away tomorrow. Humans still browse. But the share of your traffic that sees your CSS is shrinking every quarter, and the share that reads your structured data is growing. Design for both.
Your Homework
Go to your website’s analytics right now. Look at your traffic. Filter for known bot user agents. The number will be higher than you expect.
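A first-pass filter can be a substring match against published crawler tokens. GPTBot and ClaudeBot are documented by OpenAI and Anthropic; the list below is illustrative, not exhaustive:

```python
# Tokens from published AI crawler user-agent strings (illustrative list).
AI_CRAWLER_TOKENS = (
    "GPTBot", "OAI-SearchBot", "ChatGPT-User",
    "ClaudeBot", "anthropic-ai",
    "PerplexityBot", "Google-Extended", "CCBot", "Bytespider",
)


def is_ai_crawler(user_agent: str) -> bool:
    """True if the user-agent string contains a known AI crawler token."""
    ua = user_agent.lower()
    return any(token.lower() in ua for token in AI_CRAWLER_TOKENS)


def agent_share(user_agents: list[str]) -> float:
    """Fraction of requests in a log sample that came from known AI crawlers."""
    if not user_agents:
        return 0.0
    hits = sum(is_ai_crawler(ua) for ua in user_agents)
    return hits / len(user_agents)
```

Run agent_share over a day of access-log user agents and compare it to what your analytics dashboard calls "visitors." The gap is your answer.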
Then add an llms.txt file to your site root. Ten minutes. Tell the agents where the good stuff is.
The web is being rebuilt. You can watch, or you can leave the light on for your new visitors.
They won’t see it. But they’ll know it’s there.
