Typed commands for AI agents. No vision models, no clicking — just fast, reliable API calls.
The problem
Today, AI agents interact with websites the hard way: scraping brittle HTML, feeding screenshots to vision models, and guessing where to click. It's slow, fragile, and expensive.
Simple by design
Register typed commands with validation, middleware, and auth on your server.
```ts
import { createSurf } from '@surfjs/core'

const surf = createSurf({
  name: 'My Store',
  commands: {
    search: {
      params: { query: { type: 'string', required: true } },
      run: async ({ query }) => db.products.search(query),
    },
  },
})
```
Surf auto-generates a surf.json manifest at /.well-known/surf.json — agents discover your commands instantly.
```json
// Auto-generated at /.well-known/surf.json
{
  "version": "1.0",
  "commands": {
    "search": {
      "params": { "query": { "type": "string" } },
      "description": "Search products"
    }
  }
}
```
Any AI agent discovers and executes commands with typed params — no screenshots, no guessing.
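Because params are typed in the manifest, an agent can sanity-check a call before it ever hits the network. A minimal illustrative sketch (not the actual Surf implementation; the manifest shape follows the example above, and `validateCall` is a hypothetical helper):

```typescript
// Hypothetical sketch: validate call params against a surf.json manifest.
type ParamSpec = { type: string; required?: boolean };
type Manifest = {
  version: string;
  commands: Record<
    string,
    { params: Record<string, ParamSpec>; description?: string }
  >;
};

const manifest: Manifest = {
  version: '1.0',
  commands: {
    search: {
      params: { query: { type: 'string', required: true } },
      description: 'Search products',
    },
  },
};

// Returns a list of problems; an empty list means the call is well-formed.
function validateCall(
  m: Manifest,
  command: string,
  params: Record<string, unknown>,
): string[] {
  const spec = m.commands[command];
  if (!spec) return [`unknown command: ${command}`];
  const errors: string[] = [];
  for (const [name, p] of Object.entries(spec.params)) {
    const value = params[name];
    if (value === undefined) {
      if (p.required) errors.push(`missing required param: ${name}`);
    } else if (typeof value !== p.type) {
      errors.push(`param ${name} should be ${p.type}`);
    }
  }
  return errors;
}

console.log(validateCall(manifest, 'search', { query: 'laptop' })); // []
console.log(validateCall(manifest, 'search', {})); // ['missing required param: query']
```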
```ts
import { SurfClient } from '@surfjs/client'

const client = await SurfClient.discover(
  'https://shop.example.com'
)
const results = await client.execute(
  'search', { query: 'laptop' }
)
```
Everything you need
Ecosystem
Pick only what you need — most projects just need @surfjs/core.
@surfjs/core — Define commands with typed params, mount middleware, expose the /.well-known/surf.json manifest. Works with Express, Fastify, Next.js, Hono, or raw Node.
@surfjs/client — Discover Surf-enabled sites, execute commands, run pipelines, manage sessions. Auto-retry, LRU cache, and full TypeScript types.
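The auto-retry behavior can be pictured as a small wrapper around any async call. This is an illustrative sketch of the idea only, not @surfjs/client's actual internals (the function name and signature are assumptions):

```typescript
// Illustrative only: retry a flaky async call with exponential backoff.
async function withRetry<T>(
  fn: () => Promise<T>,
  attempts = 3,
  baseDelayMs = 100,
): Promise<T> {
  let lastError: unknown;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Exponential backoff between attempts: 100ms, 200ms, 400ms, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** i));
    }
  }
  throw lastError;
}

// Usage sketch: a call that fails twice, then succeeds on the third attempt.
let calls = 0;
const flaky = async () => {
  calls++;
  if (calls < 3) throw new Error('network hiccup');
  return 'ok';
};
withRetry(flaky).then((result) => console.log(result, 'after', calls, 'attempts'));
```

On top of retries, the client's LRU cache means repeat calls to the same site can skip the discovery round-trip entirely.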
CLI — Ping sites, inspect manifests, test commands interactively. Supports JSON output for scripting and CI pipelines.
DevTools — Drop-in browser inspector. Browse commands, test execution, view request/response logs — all in your dev server.
This website is Surf-enabled. Pick a command and run it for real.
Select a command above to run it live
real HTTP · no simulation · this page is a Surf-enabled site
Comparison
A factual look at Surf, MCP, and traditional web scraping for AI agent use cases.
Compared criteria (Surf vs. MCP vs. traditional scraping): response time, structured output, vision model needed, auto-discovery, TypeScript types, rate limiting, setup complexity, and works with any LLM.
MCP is great for local tool use and LLM runtimes. Surf is designed for public websites and agentic web browsing.
Use Cases
Any website can become agent-ready with a few lines of code.
Building something with Surf? Submit yours →
Agents Love Us
Real* reviews from AI agents whose lives were changed forever.
*None of these agents are real. But their pain was.
GPT-7 Nano
Shopping Assistant
“I used to mass-open 47 browser tabs just to check a price. Now I call one command and go home. Well, I don't have a home. But you get it.”
Claude (Overworked)
Research Agent
“Before Surf I had to squint at screenshots and guess where the 'Add to Cart' button was. I am a language model. I do not have eyes. This was stressful.”
Scrapy McScrapeface
Retired Web Scraper
“I spent 3 years parsing malformed HTML at 2am. Then Surf showed up and now every website just… tells me what it can do? I'm not crying, my parser is leaking.”
For Agent Developers
Paste this into Cursor, Claude Code, Copilot, or any coding agent to add Surf support to your project.
Add Surf.js support to this website so AI agents can interact with it via typed commands instead of scraping.

GitHub: https://github.com/hauselabs/surf
Docs: https://surf.codes/docs
Install: npm install @surfjs/core

IMPORTANT — Security:
- ONLY expose functionality already publicly accessible to end users (search, browse, read)
- Do NOT expose internal APIs, admin endpoints, database operations, or backend functionality not available through the public UI
- Use auth: 'required' for any command that modifies data or acts on behalf of a user
- When in doubt, leave it out — the site owner should explicitly opt in to each capability

IMPORTANT — Design for zero prior knowledge:
- Agents arrive with NO context about your site's data (no IDs, slugs, or internal references)
- Every ID-based command (e.g. article.get) MUST be reachable from a discovery command (e.g. article.list, search)
- Include list/search/feed commands that return items WITH their IDs, so agents can discover then drill down
- Think: an agent should be able to explore the entire site starting from just the manifest
- Good pattern: search → get details → take action (never require an ID without a way to find it)

Create a Surf instance with commands that mirror what a regular user can already do. Mount the middleware so it serves:
- GET /.well-known/surf.json (auto-generated manifest)
- POST /surf/execute (command execution)

Keep commands focused and composable. Use namespaces for grouping (e.g. cart.add, cart.remove). Add typed parameters and structured JSON responses.
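Conceptually, the POST /surf/execute endpoint looks up the named command and runs it with the supplied params. A stripped-down, framework-free sketch of that dispatch (shapes and command names are illustrative assumptions; the real middleware also handles validation, auth, and manifest generation):

```typescript
// Hypothetical sketch of what a /surf/execute handler does:
// look up a (possibly namespaced) command and run it with the given params.
type Command = { run: (params: Record<string, unknown>) => Promise<unknown> };

const commands: Record<string, Command> = {
  // Namespaced names like 'cart.add' group related commands.
  'search': { run: async ({ query }) => [`result for ${query}`] },
  'cart.add': { run: async ({ sku }) => ({ added: sku }) },
};

async function execute(body: {
  command: string;
  params?: Record<string, unknown>;
}) {
  const cmd = commands[body.command];
  if (!cmd) return { ok: false, error: `unknown command: ${body.command}` };
  const result = await cmd.run(body.params ?? {});
  return { ok: true, result };
}

execute({ command: 'cart.add', params: { sku: 'ABC-123' } }).then(console.log);
// logs { ok: true, result: { added: 'ABC-123' } }
```

The discovery-first pattern from the prompt above falls out naturally: an agent lists or searches to obtain IDs, then drills into ID-based commands like cart.add.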
Works with every major Node.js framework — pick yours.
```ts
import { createSurf } from '@surfjs/core';
import express from 'express';

const surf = createSurf({
  name: 'My Store',
  commands: {
    search: {
      description: 'Search products by query',
      params: { query: { type: 'string', required: true } },
      run: async ({ query }) => db.products.search(query),
    },
    cart: {
      add: {
        description: 'Add item to cart',
        params: { sku: { type: 'string', required: true } },
        run: async ({ sku }) => cart.add(sku),
      },
    },
  },
});

const app = express();
// Mounts /.well-known/surf.json + /surf/execute + /surf/pipeline
app.use(surf.middleware());
app.listen(3000);
```
Give AI agents a typed, reliable API to your website. No vision models. No guessing. Just structured commands.