Overview
When websites change their DOM structure, your web scrapers break. Instead of manually debugging selectors, you can use Claude Code with Browserbase to automatically detect and fix these issues in CI/CD.
This GitHub Action creates a Browserbase session, lets Claude Code inspect the live DOM, and opens a PR with fixed selectors.
The Problem
Your scraper suddenly stops working because the website changed its CSS classes:
```typescript
// broken-automation/scraper.ts - These selectors stopped working!
const stories = document.querySelectorAll('.story-row'); // ❌ Should be .athing
const title = row.querySelector('.story-title .story-link'); // ❌ Should be .titleline > a
const score = row.querySelector('.story-score'); // ❌ Should be .score
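With the correct selectors identified, the repaired scraper might look something like the sketch below. Only the selector strings (`.athing`, `.titleline > a`, `.score`) come from the fix above; the `Elem`/`Doc` types and the `parseScore` helper are illustrative additions so the extraction logic can run outside a browser (real DOM code would use `NodeList` and `Array.from`).

```typescript
// Sketch of the repaired scraper using the corrected selectors above.
// Elem/Doc are minimal stand-ins for the DOM types so this runs in Node.
type Elem = {
  querySelector(sel: string): Elem | null;
  textContent: string | null;
  nextElementSibling: Elem | null;
};
type Doc = { querySelectorAll(sel: string): Elem[] };

// "312 points" -> 312; missing or unparsable text -> 0
function parseScore(text: string | null): number {
  const m = /(\d+)\s+points?/.exec(text ?? '');
  return m ? Number(m[1]) : 0;
}

function extractStories(doc: Doc): { title: string; score: number }[] {
  // On Hacker News, each .athing row holds the title; the score lives
  // in the sibling row that follows it.
  return doc.querySelectorAll('.athing').map((row) => ({
    title: row.querySelector('.titleline > a')?.textContent ?? '',
    score: parseScore(
      row.nextElementSibling?.querySelector('.score')?.textContent ?? null,
    ),
  }));
}
```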
The Solution
Add a GitHub Action that uses Claude Code with Browserbase to fix them automatically:
```yaml
name: Auto-Fix Broken Selectors

on:
  workflow_dispatch:
  schedule:
    - cron: '0 0 * * 0' # Weekly check

jobs:
  fix-selectors:
    runs-on: ubuntu-latest
    permissions:
      contents: write
      pull-requests: write
    steps:
      - uses: actions/checkout@v4

      - name: Setup Node.js
        uses: actions/setup-node@v4
        with:
          node-version: '20'

      - name: Install CLIs
        run: |
          npm install -g @browserbasehq/browse-cli
          npm install -g @anthropic-ai/claude-code

      - name: Install Browserbase Skills
        run: |
          claude plugin marketplace add browserbase/skills
          claude plugin install browse --scope local

      - name: Create Browserbase Session
        id: session
        env:
          BROWSERBASE_API_KEY: ${{ secrets.BROWSERBASE_API_KEY }}
          BROWSERBASE_PROJECT_ID: ${{ secrets.BROWSERBASE_PROJECT_ID }}
        run: |
          RESPONSE=$(curl -s -X POST https://api.browserbase.com/v1/sessions \
            -H "Content-Type: application/json" \
            -H "x-bb-api-key: $BROWSERBASE_API_KEY" \
            -d "{\"projectId\": \"$BROWSERBASE_PROJECT_ID\", \"keepAlive\": true}")
          echo "session_id=$(echo $RESPONSE | jq -r '.id')" >> $GITHUB_OUTPUT

      - name: Fix Selectors with Claude Code
        env:
          ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
          BROWSERBASE_API_KEY: ${{ secrets.BROWSERBASE_API_KEY }}
          SESSION_ID: ${{ steps.session.outputs.session_id }}
        run: |
          WS_URL="wss://connect.browserbase.com?sessionId=${SESSION_ID}&apiKey=${BROWSERBASE_API_KEY}"
          echo "Fix broken-automation/scraper.ts using the browse CLI.
          1. browse open https://news.ycombinator.com --ws \"${WS_URL}\"
          2. browse snapshot -c
          3. Find the correct selectors and update the file
          4. browse stop" | claude --print --dangerously-skip-permissions

      - name: Create Pull Request
        uses: peter-evans/create-pull-request@v5
        with:
          commit-message: "fix: update CSS selectors via Claude Code"
          branch: auto-fix-selectors-${{ github.run_id }}
          title: "[Automated] Fixed broken CSS selectors"
          body: "Claude Code connected to Browserbase and fixed the selectors."
```
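If you prefer creating sessions from a script rather than curl, the "Create Browserbase Session" step above can be sketched in TypeScript. This is a minimal sketch assuming Node 18+ (global `fetch`); the endpoint, `x-bb-api-key` header, and `keepAlive` payload mirror the workflow's curl call, and `buildConnectUrl` reproduces the `WS_URL` the next step assembles.

```typescript
// Sketch of the session-creation step in TypeScript (assumes Node 18+ fetch).
// Mirrors the curl call in the workflow above.
async function createSession(apiKey: string, projectId: string): Promise<string> {
  const res = await fetch('https://api.browserbase.com/v1/sessions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'x-bb-api-key': apiKey,
    },
    body: JSON.stringify({ projectId, keepAlive: true }),
  });
  if (!res.ok) throw new Error(`Session creation failed: ${res.status}`);
  const { id } = (await res.json()) as { id: string };
  return id;
}

// Reproduces the WS_URL the "Fix Selectors" step builds from the session id.
function buildConnectUrl(sessionId: string, apiKey: string): string {
  return `wss://connect.browserbase.com?sessionId=${encodeURIComponent(
    sessionId,
  )}&apiKey=${encodeURIComponent(apiKey)}`;
}
```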
Trigger Options
The example above uses manual dispatch and a schedule, but you can trigger self-healing from various sources:
Observability Alerts

Trigger the workflow when your monitoring detects scraper failures:

```yaml
on:
  repository_dispatch:
    types: [scraper-failure]
```

Then call the GitHub API from your observability platform (Datadog, PagerDuty, Grafana, etc.):

```bash
curl -X POST \
  -H "Authorization: token $GITHUB_TOKEN" \
  -H "Accept: application/vnd.github.v3+json" \
  https://api.github.com/repos/OWNER/REPO/dispatches \
  -d '{"event_type": "scraper-failure", "client_payload": {"file": "scrapers/news.ts", "error": "Selector .story-row not found"}}'
```

Access the payload in your workflow:

```yaml
- name: Fix Selectors with Claude Code
  run: |
    echo "Fix ${{ github.event.client_payload.file }}.
    Error was: ${{ github.event.client_payload.error }}" | claude --print
```

Webhook

Trigger via HTTP webhook for integration with any external system:

```yaml
on:
  repository_dispatch:
    types: [fix-selectors]
```

Call from your backend, Slack bot, or any service:

```bash
curl -X POST \
  -H "Authorization: token $GITHUB_TOKEN" \
  https://api.github.com/repos/OWNER/REPO/dispatches \
  -d '{"event_type": "fix-selectors"}'
```

On Failure

Trigger when another workflow fails:

```yaml
on:
  workflow_run:
    workflows: ["Scraper Tests"]
    types: [completed]

jobs:
  fix-on-failure:
    if: ${{ github.event.workflow_run.conclusion == 'failure' }}
    runs-on: ubuntu-latest
    steps:
      # ... self-healing steps
```

Schedule

Run periodic checks to catch issues proactively:

```yaml
on:
  schedule:
    - cron: '0 */6 * * *' # Every 6 hours
    - cron: '0 9 * * 1'   # Monday 9am
```
The most powerful setup combines observability alerts with self-healing. When your monitoring detects increased error rates or missing data, it automatically triggers Claude Code to investigate and fix the issue.
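As a minimal sketch of that closed loop, a check in your monitoring code could fire the `repository_dispatch` event whenever a scrape comes back empty. The `shouldTriggerFix` heuristic and `reportFailure` helper are hypothetical names, and `OWNER/REPO` is a placeholder, but the dispatch payload matches the scraper-failure example above.

```typescript
// Hypothetical health check: an empty scrape usually means broken selectors,
// so fire the repository_dispatch event that starts the self-healing workflow.
function shouldTriggerFix(itemCount: number): boolean {
  return itemCount === 0;
}

async function reportFailure(
  token: string,
  file: string,
  error: string,
): Promise<void> {
  // Same payload shape as the scraper-failure curl example above.
  await fetch('https://api.github.com/repos/OWNER/REPO/dispatches', {
    method: 'POST',
    headers: {
      Authorization: `token ${token}`,
      Accept: 'application/vnd.github.v3+json',
    },
    body: JSON.stringify({
      event_type: 'scraper-failure',
      client_payload: { file, error },
    }),
  });
}
```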
Required Secrets
Add these secrets to your GitHub repository:
| Secret | Description |
|---|---|
| ANTHROPIC_API_KEY | Anthropic API key used by Claude Code |
| BROWSERBASE_API_KEY | Your Browserbase API key |
| BROWSERBASE_PROJECT_ID | Your Browserbase project ID |
How It Works
1. **Trigger**: The action runs on manual dispatch or a weekly schedule.
2. **Setup**: Installs the browse CLI, Claude Code, and the Browserbase skills.
3. **Browser Session**: Creates a cloud browser session via the Browserbase API.
4. **AI Analysis**: Claude Code connects to the browser, inspects the live DOM with `browse snapshot`, and identifies the correct selectors.
5. **Fix & PR**: Updates the code and opens a pull request with the changes.
The Result
Claude Code automatically:
- Opens a real browser via Browserbase
- Inspects the current DOM structure
- Identifies the correct selectors
- Updates your code and opens a PR
This pattern works for any web automation that breaks due to DOM changes—login flows, data extraction, form submissions, and more.