Deploying Puppeteer on Vercel
Learn how to set up your project, deploy to Vercel, and scale with Browserbase
Introduction
This guide will walk you through how to build a fully functional backend that can convert any website into HTML, take screenshots, and fill out forms using Puppeteer, Vercel, and Browserbase.
To run these tasks at scale, we'll use headless browsers (browsers without a user interface), which are commonly used for web automation, testing, and data collection.
Prerequisites
Vercel: A developer infrastructure platform for building, deploying, and scaling applications. Vercel also owns and maintains Next.js, one of the most popular frontend frameworks, which lets you build applications out of the box without additional configuration.
Browserbase: A headless browser infrastructure platform that provides ready-to-use browsers out of the box. This includes observability, proxies, stealth mode, and additional debugging tools for your automation scripts. Browsers are essential for interacting with the web, and Browserbase simplifies this process by managing multiple browser sessions and providing debugging capabilities from the start.
Puppeteer: A Node library which provides a high-level API to control Chrome or Chromium over the DevTools Protocol. Puppeteer runs headless by default, but can be configured to run a full version of Chrome or Chromium.
Step 1: Setting up your project
First, you’ll need to create a Browserbase account. You can sign up for a free account here.
You’ll also need a Vercel account; you can sign up for a free one here.
Now, let’s create a Next.js app through the CLI. We’ll name this project `vercel-automation`.
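If you’re following along in the terminal, the commands look roughly like this (accepting the default prompts works fine):

```bash
npx create-next-app@latest vercel-automation
cd vercel-automation
```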
Install packages
To begin using headless browsers, we’ll use Browserbase’s Node.js SDK and the Stagehand AI SDK. In addition, we’ll use Zod for data validation, ensuring we return clean, structured outputs, and Prettier to format our code.
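Assuming the published package names (`@browserbasehq/sdk` for the Browserbase Node.js SDK and `@browserbasehq/stagehand` for Stagehand), the install step might look like this; we also add `puppeteer-core` so we can drive the remote browsers:

```bash
npm install @browserbasehq/sdk @browserbasehq/stagehand puppeteer-core zod
npm install --save-dev prettier
```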
Managing API Keys
Be sure to add environment variables for this project.
You’ll need a Browserbase API key and Project ID. You can get these from the Browserbase Settings.
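For local development, a `.env` (or `.env.local`) file along these lines works; the variable names follow the Browserbase SDK’s defaults and the values below are placeholders:

```
BROWSERBASE_API_KEY=your-browserbase-api-key
BROWSERBASE_PROJECT_ID=your-browserbase-project-id
```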
Step 2: Using Next.js Route Handlers
Next.js Route Handlers allow us to create custom API endpoints that process HTTP requests and return responses directly within our application.
- Next.js provides the helper classes `NextRequest` and `NextResponse` to simplify working with the native Request/Response APIs.
- Route Handlers are exclusively available within the `app` directory.
In this project, we’ll create three route handlers, each handling a different web automation task.
The route handlers will use Browserbase’s headless browser infrastructure for HTML content collection, screenshot capture, and form submission.
Let’s ensure we have the following directory structure:
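With the three handlers described below (`html`, `screenshot`, and `form`), the relevant part of the tree looks like this:

```
app/
└── api/
    ├── html/
    │   └── route.ts
    ├── screenshot/
    │   └── route.ts
    └── form/
        └── route.ts
```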
HTML
Let’s create our first Route Handler for retrieving HTML.
Create an `html` directory in our `app/api` folder. To create an endpoint, we’ll add a `route.ts` file that will handle the API requests. We’ll use the `GET` method to retrieve HTML content from a specified URL.
Here’s the code for our first route handler:
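This sketch assumes the Browserbase Node.js SDK (`@browserbasehq/sdk`) and `puppeteer-core`; exact option names may differ slightly between SDK versions, and the `?url=` query parameter is our own convention here.

```typescript
// app/api/html/route.ts
import { NextRequest, NextResponse } from "next/server";
import Browserbase from "@browserbasehq/sdk";
import puppeteer from "puppeteer-core";

export async function GET(request: NextRequest) {
  // Target URL comes from the query string, e.g. /api/html?url=https://example.com
  const url = request.nextUrl.searchParams.get("url") ?? "https://example.com";

  // Start a headless browser session on Browserbase.
  const bb = new Browserbase({ apiKey: process.env.BROWSERBASE_API_KEY! });
  const session = await bb.sessions.create({
    projectId: process.env.BROWSERBASE_PROJECT_ID!,
  });

  // Connect Puppeteer to the remote session over the DevTools Protocol.
  const browser = await puppeteer.connect({
    browserWSEndpoint: session.connectUrl,
  });

  try {
    const page = (await browser.pages())[0] ?? (await browser.newPage());
    await page.goto(url, { waitUntil: "domcontentloaded" });
    const html = await page.content();
    return NextResponse.json({ html });
  } finally {
    // Closing the connection ends the Browserbase session cleanly.
    await browser.close();
  }
}
```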
Screenshots
For our second Route Handler, we will create a new browser session with a specified viewport, navigate to the URL, and take a screenshot of the page. Create `screenshot/route.ts` and use the following code to enable screenshot capabilities.
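A sketch in the same style as the HTML handler; here the viewport is set with Puppeteer’s `page.setViewport`, though your implementation may configure it on the Browserbase session instead.

```typescript
// app/api/screenshot/route.ts
import { NextRequest, NextResponse } from "next/server";
import Browserbase from "@browserbasehq/sdk";
import puppeteer from "puppeteer-core";

export async function GET(request: NextRequest) {
  const url = request.nextUrl.searchParams.get("url") ?? "https://example.com";

  const bb = new Browserbase({ apiKey: process.env.BROWSERBASE_API_KEY! });
  const session = await bb.sessions.create({
    projectId: process.env.BROWSERBASE_PROJECT_ID!,
  });
  const browser = await puppeteer.connect({
    browserWSEndpoint: session.connectUrl,
  });

  try {
    const page = (await browser.pages())[0] ?? (await browser.newPage());
    // Fix the viewport so screenshots come back at a predictable size.
    await page.setViewport({ width: 1280, height: 720 });
    await page.goto(url, { waitUntil: "networkidle2" });
    const screenshot = await page.screenshot({ fullPage: true });

    // Return the PNG bytes directly.
    return new NextResponse(Buffer.from(screenshot), {
      headers: { "Content-Type": "image/png" },
    });
  } finally {
    await browser.close();
  }
}
```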
Form Inputs
Oftentimes, using Puppeteer directly can be a bit cumbersome. For our Form API route handler, we’ve implemented web automation using Stagehand, an AI SDK.
Stagehand simplifies complex browser interactions by allowing us to use plain English for web automation. Stagehand consists of three main functions: `Act`, `Extract`, and `Observe`.
In this example, we’ve initialized Stagehand with our Browserbase credentials and an LLM model to efficiently fill out a sample form, rather than needing to write a more complex Puppeteer script.
Below is the same web automation task, comparing the Puppeteer implementation and Stagehand implementation.
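As a rough sketch of the Stagehand side, here is a `form/route.ts` handler; the form URL and instruction text are placeholders, and the constructor options and `act` signature may differ slightly between Stagehand versions. The comment at the top notes what the equivalent raw Puppeteer calls would look like.

```typescript
// app/api/form/route.ts
// With plain Puppeteer, each field needs its own selector and call, e.g.
//   await page.type("#name", "Jane Doe");
//   await page.type("#email", "jane@example.com");
//   await page.click("button[type='submit']");
// With Stagehand, we describe the task in plain English and let the LLM drive the browser.
import { NextResponse } from "next/server";
import { Stagehand } from "@browserbasehq/stagehand";

export async function GET() {
  const stagehand = new Stagehand({
    env: "BROWSERBASE",
    apiKey: process.env.BROWSERBASE_API_KEY!,
    projectId: process.env.BROWSERBASE_PROJECT_ID!,
    modelName: "gpt-4o", // any model supported by your LLM provider
  });
  await stagehand.init();

  try {
    const page = stagehand.page;
    await page.goto("https://example.com/contact"); // placeholder form URL
    await page.act(
      "fill out the contact form with the name Jane Doe and the email jane@example.com, then submit it"
    );
    return NextResponse.json({ status: "submitted" });
  } finally {
    await stagehand.close();
  }
}
```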
If you’re using Stagehand, you’ll need to set up an LLM provider. Be sure to include the environment variable for your LLM provider in your `.env` file.
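For example, if your provider is OpenAI, that’s one extra line in `.env` (placeholder value shown):

```
OPENAI_API_KEY=your-openai-api-key
```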
Testing the API Endpoints
Now that we’ve implemented our three route handlers in the Next.js app, let’s test the API endpoints.
- Start the development server (commands shown below).
- Access the API at the base URL: `http://localhost:3000/api`
- Test each endpoint by navigating to:
  - `http://localhost:3000/api/html`
  - `http://localhost:3000/api/screenshot`
  - `http://localhost:3000/api/form`

Each endpoint should return a `200` status code when working correctly.
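For reference, the commands could look like this; the `?url=` query parameter matches the sketches above and is optional if your handlers hard-code a target URL:

```bash
# Start the dev server
npm run dev

# Hit the endpoints (curl works just as well as the browser)
curl "http://localhost:3000/api/html?url=https://example.com"
curl -o page.png "http://localhost:3000/api/screenshot?url=https://example.com"
curl "http://localhost:3000/api/form"
```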
Step 3: Deploying to Vercel
Finally, after testing our API endpoints locally, we can deploy to Vercel:
- Sign in to Vercel
- Click “Add New…” → “Project”
- Connect and select your repository
- Add any environment variables
- Click “Deploy”
- Once complete, you’ll get a deployment URL
Deploying with fluid compute
Fluid compute is a new infrastructure model from Vercel that balances the benefits of dedicated servers and serverless computing. These mini-servers start up only when needed, grow instantly as traffic increases, and use what’s already running before adding more compute. You only pay for what you actually use.
Fluid compute also handles advanced tasks, cuts costs, runs close to your data, requires no setup, and works with standard Node.js and Python. You can read more about it in the announcement and documentation.
Why Fluid Compute for Browser Automations
For your route handlers connecting to Browserbase’s headless browser infrastructure, fluid compute offers some benefits:
- Performance optimization - Route handlers that orchestrate complex browser automations remain responsive under load, with warm mini-servers eliminating cold start delays when initiating browser sessions
- Optimized concurrency - Multiple function invocations share a single instance, allowing concurrent processing while some requests wait for Browserbase responses, eliminating idle resource waste
- Extended, efficient runtimes - Complex automation workflows that would timeout in standard serverless functions complete successfully, while you only pay when your route handlers are processing requests
Although Vercel doesn’t handle the browser sessions directly (Browserbase does), Fluid compute makes browser automation projects significantly more reliable, cost-effective, and performant at scale for AI applications.
How to Enable Fluid Compute
- Go to your project settings in Vercel
- Select Functions from the left navigation menu
- Toggle the Fluid compute button to enable it
- Click Save
- Redeploy your project
You will see a higher New Function Duration and New Function Max Duration as a result.
Fluid compute shows how much storage and compute you’ve saved by optimizing resource usage across requests.
As you grow in traffic, multiple requests begin to add up. You can monitor these savings in the Observability tab, which displays metrics on function performance, resource utilization, and cost efficiency.
This data helps you quantify the benefits of Fluid compute as your application scales, potentially reducing your compute costs by up to 85% compared to traditional serverless approaches.
Conclusion
Congratulations! Now you have a fully functional web application that can convert any website into HTML, take screenshots, and fill out forms using Puppeteer and Browserbase.
This project demonstrates how to leverage Vercel’s serverless functions, Next.js route handlers, Fluid Compute, Stagehand, and Browserbase headless browsers to create a practical web application.
Feel free to check out the completed code on GitHub.