## 1. Get your API keys

You'll need credentials from two services:

- **Browserbase** — Go to the Dashboard Settings and copy your API key and project ID.
- **Anthropic** — Get your API key from the Anthropic Console.

Create a `.env.local` file with:

```bash
BROWSERBASE_PROJECT_ID=your_project_id
BROWSERBASE_API_KEY=your_api_key
ANTHROPIC_API_KEY=your_anthropic_key
```
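A missing variable otherwise surfaces later as a confusing SDK error, so you may want to fail fast at startup. A minimal sketch (the `assertEnv` helper is illustrative, not part of any SDK):

```typescript
// Sketch: fail fast if any required variable from .env.local is missing.
const required = [
  "BROWSERBASE_PROJECT_ID",
  "BROWSERBASE_API_KEY",
  "ANTHROPIC_API_KEY",
] as const;

export function assertEnv(env: Record<string, string | undefined> = process.env) {
  const missing = required.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(", ")}`);
  }
}
```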
## 2. Install dependencies

```bash
npm i @browserbasehq/stagehand @browserbasehq/sdk ai @ai-sdk/anthropic zod
```

| Package | Purpose |
| --- | --- |
| `@browserbasehq/stagehand` | AI-powered browser automation |
| `@browserbasehq/sdk` | Browserbase API client (sessions, live views) |
| `ai` | Vercel AI SDK for structured generation |
| `@ai-sdk/anthropic` | Anthropic model provider |
| `zod` | Schema validation for extracted data |
## 3. Create a Stagehand session

Initialize a Stagehand instance connected to Browserbase. Each session gets its own cloud browser with a live debug view.
```typescript
import { Stagehand } from "@browserbasehq/stagehand";
import Browserbase from "@browserbasehq/sdk";

const browserbase = new Browserbase();

async function createStagehandSession(source: string) {
  const stagehand = new Stagehand({
    env: "BROWSERBASE",
    model: "anthropic/claude-sonnet-4-5-20250929",
    logger: console.log,
    disablePino: true,
  });

  await stagehand.init();

  const sessionId = stagehand.browserbaseSessionID!;
  const { debuggerFullscreenUrl } = await browserbase.sessions.debug(sessionId);

  return { stagehand, sessionId, liveViewUrl: debuggerFullscreenUrl, source };
}
```
## 4. Define research functions

Each research function takes a Stagehand instance, navigates to a source, and uses `stagehand.extract()` to pull structured data from the page using AI. Here's an example that searches DuckDuckGo and visits the top results:
```typescript
import { z } from "zod";
import type { Stagehand } from "@browserbasehq/stagehand";

async function researchGoogle(
  stagehand: Stagehand,
  query: string,
  onFinding: (finding: Finding) => void
) {
  const page = stagehand.context.activePage()!;

  await page.goto(`https://duckduckgo.com/?q=${encodeURIComponent(query)}`);
  await page.waitForTimeout(2000);

  const searchResults = await stagehand.extract(
    "Extract the top 5 organic search result links with their titles and URLs. Skip any ads.",
    z.object({
      results: z.array(z.object({
        title: z.string(),
        url: z.string(),
      })).max(5),
    })
  );

  for (const result of searchResults.results.slice(0, 3)) {
    if (!result.url || result.url.includes("duckduckgo.com")) continue;

    await page.goto(result.url, { waitUntil: "domcontentloaded", timeoutMs: 15000 });

    const content = await stagehand.extract(
      `Extract the key information about "${query}" from this article.`,
      z.object({
        summary: z.string(),
        keyFacts: z.array(z.string()),
      })
    );

    if (content.summary) {
      onFinding({
        title: result.title,
        source: new URL(result.url).hostname.replace("www.", ""),
        url: result.url,
        summary: content.summary,
        relevance: "high",
      });
    }
  }
}
```
You can create similar functions for Wikipedia, YouTube, Hacker News, and Google News — each using `stagehand.extract()` with different schemas. See the full template for all five research functions.
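The `Finding` type passed to the `onFinding` callback isn't defined above; a minimal definition, inferred from the fields the example sets, might look like:

```typescript
// Hypothetical Finding shape, inferred from the fields used in researchGoogle.
export interface Finding {
  title: string;
  source: string; // hostname with "www." stripped, e.g. "example.com"
  url: string;
  summary: string;
  relevance: "high" | "medium" | "low";
}
```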
## 5. Create the API route with SSE streaming

Create `app/api/research/route.ts` to handle research requests. This route creates parallel Stagehand sessions and streams findings back via Server-Sent Events.
```typescript
import { generateObject } from "ai";
import { anthropic } from "@ai-sdk/anthropic";
import { z } from "zod"; // needed for ResearchSummarySchema below

export const maxDuration = 300;

const ResearchSummarySchema = z.object({
  overview: z.string().describe("2-3 sentence direct answer to the query"),
  keyFacts: z.array(z.string()).describe("3-6 specific facts with dates, numbers, or names"),
  recentDevelopments: z.string().nullable().describe("Latest news if applicable"),
  sourcesSummary: z.string().describe("Brief note on the types of sources consulted"),
});

export async function POST(req: Request) {
  const { query } = await req.json();

  const encoder = new TextEncoder();
  const stream = new TransformStream();
  const writer = stream.writable.getWriter();

  const sendEvent = async (event: string, data: unknown) => {
    await writer.write(
      encoder.encode(`event: ${event}\ndata: ${JSON.stringify(data)}\n\n`)
    );
  };

  (async () => {
    const sessions = [];
    const allFindings: Finding[] = [];

    const researchFunctions = [
      { source: "News", fn: researchGoogleNews },
      { source: "Hacker News", fn: researchHackerNews },
      { source: "YouTube", fn: researchYouTube },
      { source: "Wikipedia", fn: researchWikipedia },
      { source: "Search", fn: researchGoogle },
    ];

    try {
      await sendEvent("status", { message: "Starting browser sessions...", phase: "init" });

      // Create all Stagehand sessions in parallel
      const sessionPromises = researchFunctions.map(({ source }) =>
        createStagehandSession(source)
      );
      const createdSessions = await Promise.all(sessionPromises);
      sessions.push(...createdSessions);

      // Send live view URLs to frontend
      await sendEvent("liveViews", {
        sessions: sessions.map(s => ({
          source: s.source,
          liveViewUrl: s.liveViewUrl,
          sessionId: s.sessionId,
        })),
      });

      // Run all research in parallel
      await Promise.allSettled(
        researchFunctions.map(({ source, fn }, index) =>
          fn(
            sessions[index].stagehand,
            query,
            (finding) => {
              allFindings.push(finding);
              sendEvent("findings", { findings: allFindings });
            }
          )
        )
      );

      // Synthesize findings with AI
      if (allFindings.length > 0) {
        const findingsText = allFindings
          .map((f) => `Source: ${f.source}\n${f.summary}`)
          .join("\n\n---\n\n");

        const { object: summary } = await generateObject({
          model: anthropic("claude-sonnet-4-5-20250929"),
          schema: ResearchSummarySchema,
          prompt: `Based on these research findings about "${query}", create a structured summary.\n\n${findingsText}`,
        });

        await sendEvent("complete", { findings: allFindings, summary });
      } else {
        // Still signal completion so the client doesn't wait forever
        await sendEvent("complete", { findings: [], summary: null });
      }
    } catch (error) {
      // Surface failures to the client instead of silently ending the stream
      await sendEvent("error", { message: String(error) });
    } finally {
      for (const session of sessions) {
        try { await session.stagehand.close(); } catch {}
      }
      await writer.close();
    }
  })();

  return new Response(stream.readable, {
    headers: {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
      Connection: "keep-alive",
    },
  });
}
```
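Because the route requires a POST body, the browser's `EventSource` (which only issues GET requests) can't consume this stream; the frontend typically reads it with `fetch` instead. A minimal sketch, where the event names match the route above and the `parseSSE` helper is illustrative rather than from any library:

```typescript
// Sketch: a minimal SSE parser for complete "event:/data:" blocks.
export function parseSSE(chunk: string): Array<{ event: string; data: unknown }> {
  const events: Array<{ event: string; data: unknown }> = [];
  for (const block of chunk.split("\n\n")) {
    const eventMatch = block.match(/^event: (.+)$/m);
    const dataMatch = block.match(/^data: (.+)$/m);
    if (eventMatch && dataMatch) {
      events.push({ event: eventMatch[1], data: JSON.parse(dataMatch[1]) });
    }
  }
  return events;
}

// Sketch: POST the query and dispatch each event as it streams in.
export async function streamResearch(
  query: string,
  onEvent: (event: string, data: unknown) => void
) {
  const res = await fetch("/api/research", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query }),
  });
  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    // Process only complete events; keep any trailing partial block buffered
    const lastSep = buffer.lastIndexOf("\n\n");
    if (lastSep === -1) continue;
    for (const { event, data } of parseSSE(buffer.slice(0, lastSep + 2))) {
      onEvent(event, data);
    }
    buffer = buffer.slice(lastSep + 2);
  }
}
```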
## 6. Handle concurrency limits

Free Browserbase plans have a concurrency limit of 1. The template automatically detects this and falls back to running sessions sequentially:
```typescript
async function getProjectConcurrency(): Promise<number> {
  const project = await browserbase.projects.retrieve(
    process.env.BROWSERBASE_PROJECT_ID!
  );
  return project.concurrency ?? 1;
}

// In your POST handler:
const concurrency = await getProjectConcurrency();

if (concurrency === 1) {
  // Run browsers one at a time, closing each before starting the next
  for (const { source, fn } of researchFunctions) {
    const session = await createStagehandSession(source);
    await fn(session.stagehand, query, onFinding);
    await session.stagehand.close();
  }
} else {
  // Run all browsers in parallel
  const sessions = await Promise.all(
    researchFunctions.map(({ source }) => createStagehandSession(source))
  );
  await Promise.allSettled(
    researchFunctions.map(({ fn }, i) =>
      fn(sessions[i].stagehand, query, onFinding)
    )
  );
}
```
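If your plan allows more than one concurrent browser but fewer than the five sources, the two branches above can be generalized with a small concurrency-limited runner. A sketch under that assumption (the `runLimited` helper is illustrative, not part of any SDK):

```typescript
// Sketch: run async tasks with at most `limit` in flight at once,
// mirroring Promise.allSettled's result shape.
export async function runLimited<T>(
  tasks: Array<() => Promise<T>>,
  limit: number
): Promise<PromiseSettledResult<T>[]> {
  const results: PromiseSettledResult<T>[] = new Array(tasks.length);
  let next = 0;

  async function worker() {
    // Each worker pulls the next unclaimed task until none remain
    while (next < tasks.length) {
      const i = next++;
      try {
        results[i] = { status: "fulfilled", value: await tasks[i]() };
      } catch (reason) {
        results[i] = { status: "rejected", reason };
      }
    }
  }

  const workers = Math.max(1, Math.min(limit, tasks.length));
  await Promise.all(Array.from({ length: workers }, worker));
  return results;
}
```

For example, `await runLimited(researchFunctions.map(({ source, fn }) => async () => { const session = await createStagehandSession(source); try { await fn(session.stagehand, query, onFinding); } finally { await session.stagehand.close(); } }), concurrency)` would cover both branches with one code path.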
Congratulations! You've built an AI research agent that runs parallel browser sessions with Stagehand and Browserbase on Vercel.

For the complete implementation, including the frontend UI with live browser views, check out the full template:

Full Template on GitHub

Browse the complete source code with frontend components, SSE streaming, and live browser views.

Deploy to Vercel

One-click deploy with automatic Browserbase setup via the Vercel Marketplace.