Phase 0–6 baseline. Google Places + Firecrawl + Claude. Auth via GWS CLI / gcloud ADC. Cloudflare Workers cron + Pages dashboard.
# 1 — Authenticate
gcloud auth login
gcloud auth application-default login
# 2 — Set project
gcloud config set project YOUR_GCP_PROJECT_ID
# 3 — Enable required APIs
gcloud services enable \
  places.googleapis.com \
  mybusinessbusinessinformation.googleapis.com \
  businessprofileperformance.googleapis.com
# 4 — Create service account
gcloud iam service-accounts create map-leads-sa \
  --display-name="Map Leads Service Account"
# 5 — Grant Business Profile access
gcloud projects add-iam-policy-binding YOUR_GCP_PROJECT_ID \
  --member="serviceAccount:map-leads-sa@YOUR_GCP_PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/businessprofileperformance.viewer"
# 6 — Generate key (for CF Workers / server environments)
gcloud iam service-accounts keys create ./credentials/map-leads-sa.json \
  --iam-account=map-leads-sa@YOUR_GCP_PROJECT_ID.iam.gserviceaccount.com
# 7 — Verify
gcloud auth application-default print-access-token
PHASE 0
Bootstrap
Organized Codebase, pnpm monorepo, wrangler.toml, kata CLI. Run GWS CLI setup script.
- Apply organized-codebase-applicator at Windsurf/map-leads
- Run scripts/setup-gws.sh to provision the GCP service account + enable APIs
- Set GOOGLE_APPLICATION_CREDENTIALS=./credentials/map-leads-sa.json in .env
- Initialize kata CLI phase enforcement
Tools: Organized Codebase · gcloud CLI · kata CLI
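The bootstrap above implies a wrangler.toml per worker wiring up the cron trigger, KV, queue, and Durable Object bindings used in later phases. A minimal sketch (binding names, IDs, and the compatibility date are illustrative, not from the source):

```toml
# Sketch of a per-worker wrangler.toml for the discovery worker.
name = "map-leads-discovery"
main = "src/index.ts"
compatibility_date = "2025-01-01"

[triggers]
crons = ["0 2 * * *"]          # nightly discovery run (Phase 6)

[[kv_namespaces]]
binding = "LEADS_KV"
id = "<kv-namespace-id>"       # fill in from `wrangler kv namespace create`

[[queues.producers]]
binding = "SCRAPE_QUEUE"
queue = "scrape-queue"

[[durable_objects.bindings]]
name = "LEAD_DO"
class_name = "Lead"            # one Durable Object per lead (Phase 1)
```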
PHASE 1
Business Discovery
Google Places API (New) — Text Search. Authed via gcloud ADC. Results pushed to CF KV + scrape-queue.
- Places API key via gcloud services api-keys create --api-target=service=places.googleapis.com
- Filter: rating ≥ 2.5, review_count ≥ 10
- Create Durable Object per lead: state → discovered
- Push to SCRAPE_QUEUE
Tools: Google Places API · gcloud ADC · CF Queue
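The Phase 1 filter-and-enqueue step can be sketched as a pure predicate plus a queue push. Field names follow the Places API (New) Text Search response shape (`rating`, `userRatingCount`); the queue interface and message shape are assumptions:

```typescript
// Place shape follows the Places API (New) Text Search response.
interface Place {
  id: string;
  displayName?: { text: string };
  rating?: number;
  userRatingCount?: number;
}

// Phase 1 filter: keep leads with rating >= 2.5 and at least 10 reviews.
function passesFilter(p: Place): boolean {
  return (p.rating ?? 0) >= 2.5 && (p.userRatingCount ?? 0) >= 10;
}

// Illustrative worker step: filter results, then enqueue each lead as
// state "discovered". The queue parameter stands in for the SCRAPE_QUEUE binding.
async function enqueueDiscovered(
  places: Place[],
  queue: { send(msg: unknown): Promise<void> },
): Promise<number> {
  const kept = places.filter(passesFilter);
  for (const p of kept) {
    await queue.send({ placeId: p.id, state: "discovered" });
  }
  return kept.length; // how many leads were enqueued
}
```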
PHASE 2
Firecrawl Scraper
Reviews + website crawl. Extracts contact forms, staff names, services. KV cache 7d TTL.
- Firecrawl scrape Google Maps reviews page (100–200 reviews)
- Firecrawl crawl business site: contact form URL, staff, services
- Build ReviewCorpus JSONL (low-star first) + SiteContext
- State → scraped. Push to analyze-queue
Tools: Firecrawl MCP · JSONL
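The ReviewCorpus build in Phase 2 (JSONL, low-star first) can be sketched as a pure function; the Review field names are assumptions:

```typescript
// Assumed review shape coming out of the Firecrawl scrape.
interface Review {
  rating: number; // 1-5 stars
  text: string;
}

// Build the ReviewCorpus: sort low-star reviews first (they carry the
// pain-point signal), then emit one JSON object per line (JSONL).
function buildReviewCorpus(reviews: Review[]): string {
  return [...reviews]
    .sort((a, b) => a.rating - b.rating) // low-star first
    .map((r) => JSON.stringify(r))
    .join("\n");
}
```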
PHASE 3
Pain Point Extractor
Claude Sonnet extracts top 3 pain points with evidence quotes, severity, and solvability.
- Batch reviews into ~6k-token chunks
- Return { pain_points: [{ topic, frequency_score, evidence_quote, severity, solvable_by }] }
- Zod schema validation; store pain categories to KV
- State → analyzed. Push to score-queue
Tools: Claude Sonnet 4.6 · Zod
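The ~6k-token batching step can be sketched as greedy packing. The 4-characters-per-token estimate is an assumption, not something the plan specifies:

```typescript
const MAX_TOKENS = 6000;     // per-chunk budget from the plan
const CHARS_PER_TOKEN = 4;   // rough heuristic, not an exact tokenizer

// Greedily pack review texts into chunks of roughly MAX_TOKENS each.
// A text that would overflow the current chunk starts a new one.
function chunkReviews(texts: string[]): string[][] {
  const chunks: string[][] = [];
  let current: string[] = [];
  let budget = 0;
  for (const t of texts) {
    const cost = Math.ceil(t.length / CHARS_PER_TOKEN);
    if (current.length > 0 && budget + cost > MAX_TOKENS) {
      chunks.push(current);
      current = [];
      budget = 0;
    }
    current.push(t);
    budget += cost;
  }
  if (current.length > 0) chunks.push(current);
  return chunks;
}
```

An exact tokenizer could replace the character heuristic without changing the packing logic.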
PHASE 4
Lead Scorer
0.0–1.0 scoring with dynamic weights from CF KV. Hot → outreach-queue. All → flywheel baseline.
- Weights from KV: map-leads:weights:current
- Hot > 0.7 → outreach, all channels. Warm 0.4–0.7 → email only. Cold → log
- Push baseline to outcome-queue
Tools: Dynamic weights · CF KV
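The Phase 4 scorer can be sketched as a normalized weighted sum over 0–1 signals, with the weights object loaded from the KV key map-leads:weights:current. The signal names here are illustrative assumptions:

```typescript
type Weights = Record<string, number>; // loaded from map-leads:weights:current
type Signals = Record<string, number>; // each signal pre-normalized to [0, 1]

// Normalized weighted average: stays in [0, 1] regardless of weight scale.
function scoreLead(signals: Signals, weights: Weights): number {
  let total = 0;
  let weightSum = 0;
  for (const [name, w] of Object.entries(weights)) {
    total += w * (signals[name] ?? 0); // missing signal counts as 0
    weightSum += w;
  }
  return weightSum > 0 ? total / weightSum : 0;
}

// Phase 4 routing thresholds: hot > 0.7, warm 0.4-0.7, else cold.
function classify(score: number): "hot" | "warm" | "cold" {
  if (score > 0.7) return "hot";
  if (score >= 0.4) return "warm";
  return "cold";
}
```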
PHASE 5
Email Generator + Send
Claude Sonnet cold email via Resend. A/B subjects. Tracking pixel. GBP via gcloud service account.
- Claude generates email from KV template
- Resend API — A/B 2 subjects, open/click webhook → outcome-queue
- GBP Messages: auth via GOOGLE_APPLICATION_CREDENTIALS (gcloud SA key); POST https://mybusiness.googleapis.com/v4/...messages
Tools: Claude · Resend · GBP + gcloud SA
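The A/B subject split needs to be deterministic per lead so retries and webhook correlation stay consistent; one way to sketch it is a small string hash over the lead ID (the hash choice is illustrative, not from the plan):

```typescript
// Deterministically assign one of two subject lines per lead.
// Same leadId always gets the same variant, so a resend or retry
// never flips a lead between A and B mid-experiment.
function pickSubject(leadId: string, subjects: [string, string]): string {
  let h = 0;
  for (const ch of leadId) {
    h = (h * 31 + ch.charCodeAt(0)) >>> 0; // simple 32-bit rolling hash
  }
  return subjects[h % 2];
}
```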
PHASE 6
Dashboard + Deploy
CF Pages dashboard. Workers cron (2am nightly). Deploy via wrangler CLI.
- wrangler pages deploy apps/dashboard/dist --project-name map-leads --commit-dirty=true
- wrangler deploy --name map-leads-discovery --commit-dirty=true (repeat per worker)
- Custom domain: mapleads.organizedai.vip
- CRON: 0 2 * * * discovery, 0 3 * * * flywheel
Tools: wrangler CLI · CF Pages
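If one worker serves both cron schedules, its scheduled() handler can route on the matched cron expression, which Cloudflare Workers passes as event.cron. A sketch (the job names and handler shape beyond the plan's two schedules are assumptions):

```typescript
// Map the matched cron expression to a pipeline job.
function routeCron(cron: string): "discovery" | "flywheel" | "unknown" {
  switch (cron) {
    case "0 2 * * *": return "discovery"; // 2am nightly discovery run
    case "0 3 * * *": return "flywheel";  // 3am flywheel baseline update
    default: return "unknown";
  }
}

// Would be the worker's default export; Workers invokes scheduled()
// with the cron string that fired on event.cron.
const worker = {
  async scheduled(event: { cron: string }): Promise<void> {
    const job = routeCron(event.cron);
    // ...dispatch to the matching pipeline stage here...
    console.log(`cron ${event.cron} -> ${job}`);
  },
};
```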