# Lessons Learned

## Anthropic OAuth (Feb 18)

- **`state` parameter is REQUIRED** — Anthropic rejects OAuth requests without it
- **Auth response is `code#state`** — must split on `#` before token exchange, send both to `/v1/oauth/token`
- **OAuth tokens start with `sk-ant-oat`** — NOT `sk-ant-api`. Both start with `sk-ant-`, so the prefix check must be specific
- **OAuth requires special headers:** `anthropic-beta: oauth-2025-04-20`, `user-agent: claude-cli/2.1.2 (external, cli)`, URL `?beta=true`
- **Bearer auth, not x-api-key** — OAuth tokens use `Authorization: Bearer`; regular API keys use `x-api-key`
- **Reference implementation:** `github.com/anomalyco/opencode-anthropic-auth/blob/master/index.mjs`

## Reonomy Scraper (Feb 18)

- **`agent-browser eval` returns `\n` as literal backslash-n** — must `.replace(/\\n/g, '\n')` before parsing
- **"Building & Lot" tab uses ampersand** — not "Building and Lot"
- **Don't overwrite good data:** `parseAddresses()` already validates addresses from search results — don't try to "upgrade" them from detail-page headings

## Cron systemEvent on main session = NO-OP (Feb 16)

- `systemEvent` payloads on `sessionTarget: "main"` get swallowed silently (~10ms, does nothing)
- They arrive as heartbeat-level events and get acked without execution
- **FIX:** Use `sessionTarget: "isolated"` with `payload.kind: "agentTurn"` for crons that need actual work done
- Also set `payload.to` and `payload.channel` so the output goes somewhere visible

## gog OAuth Browser Flow (Feb 16)

- `gog auth add` starts a local HTTP callback server on a random port — it MUST stay running until the browser completes the full OAuth flow and redirects back
- Running via `exec` with background=true can cause the process to exit before the browser redirect arrives
- **Best approach:** run `gog auth add` in a tmux session so it persists
- Browser control server (clawd profile) intermittently times out on screenshot/snapshot/act — stop+start the browser profile to fix
- Google "unverified app" screen requires clicking Continue (sometimes hidden behind scroll)
- Google consent screen has individual scope checkboxes — use JS `document.querySelectorAll('input[type="checkbox"]')` to check all at once

*(Hot — Universal Rules)*

> Search this before repeating mistakes. Older/situational lessons in lessons-archive.md.

## Upwork Pipeline Architecture (Feb 19, 2026)

- **ONE unified system:** Gmail Pub/Sub daemon (`com.clawdbot.gmail-pubsub-daemon`) → wake cron `c9df3e78` → agent parses email + scores job + creates Discord forum post with tags + auto-applies or passes + auto-buys connects if out
- **No polling.** No redundant Gmail scans. The daemon catches emails in seconds via Pub/Sub streaming pull, debounces at 30s, and wakes the cron via `/hooks/wake`
- Cron schedule is `0 0 31 2 *` (Feb 31 = never) — it ONLY runs when woken by the daemon
- Cron runs in an `isolated` session with an `agentTurn` payload containing full pipeline instructions
- Forum channel: `1472706495635390565` (phase-1-scouting). Tags applied via Discord API PATCH.
- Processed email IDs tracked in `upwork-pipeline/processed.json`
- Proposals saved to `proposals/YYYY-MM-DD-short-name.md`

## Upwork — CRITICAL RULES

- **$50/hr MINIMUM** on all proposals. No exceptions. Jake directive Feb 16 2026.
- Never bid below $50/hr even if the client budget says $15-25. Either bid $50+ or skip.
- **AUTO-APPLY TO ANYTHING WORTH APPLYING FOR.** Jake directive Feb 18 2026. This means:
  - Hot leads (80+): apply IMMEDIATELY, no approval needed
  - Good leads (65-79): apply if it's in our wheelhouse (AI, MCP, automation, full-stack, Claude, React/Next.js)
  - Lukewarm (50-64): apply if the client has good stats ($50K+ spent, 4.5+ rating) OR the job is a perfect stack match
- **DO NOT just scan and report.** APPLY. Jake was upset that scans ran all day but zero applications went out. The pipeline exists to APPLY, not to scout and wait.
- Speed matters — be first to apply.
- If multiple leads come in at once, apply by priority (highest score first) and queue the rest. Include deliverables in proposals. Jake directive Feb 17 2026.
- **LOCATION GATE:** Always check "Preferred qualifications" for location restrictions BEFORE applying. If the job prefers Europe, the UK, or any non-US region → SKIP (Jake is US-based). Don't waste connects on location-mismatched jobs. (Lesson from Feb 17 2026 — nearly submitted a 32-connect proposal for a Europe-only job.)

## Memory & Context

- **Compaction is unreliable** — save to the daily log + working-state.md proactively, every ~15 messages. Don't wait.
- **After compaction with lost context:** Read working-state → daily log → channel history → memory search → ask Jake last.
- **Save decisions immediately** — all options AND the chosen one, right when the choice is made.

## Discord

- **Don't spam debug messages** — work silently, announce clean results.
- **Guild ID:** `1458233582404501547`. Channel IDs are different from guild IDs.
- **Delete messages containing tokens IMMEDIATELY.**

## Cloudflare / Tunnels

- **nohup your tunnels** — cloudflared dies when exec sessions close.
- **Verify before announcing** — curl the URL and confirm 200 before posting.
- **Workers need DNS** — proxied A record (use 192.0.2.1 dummy IP).
- **http2 > quic** for cloudflared tunnels.
- **Quick tunnels break HTML POST** — use fetch() for form submissions.
- **VPN breaks tunnels** — disconnect Mullvad before creating tunnels.
- **Never use quick tunnels for production** — use permanent named tunnels.

## Upwork

- **Upwork blocks off-platform comms** in proposals — NEVER mention Discord/email/phone before contract.
- **Rate increase field is REQUIRED** on proposals — select "Never."
- **Full workflow:** op signin → search by URL → browser navigate → DOM snapshots → fill form → submit. See lessons-archive.md #39 for full steps.
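The apply/skip thresholds and the location gate above can be condensed into one decision function. This is a minimal sketch: the `job` shape, field names, and `shouldApply` helper are all hypothetical, and the thresholds come straight from the rules above (the $50/hr floor applies to the bid itself, so it never causes a skip here).

```javascript
// Hypothetical sketch of the auto-apply decision rules. Thresholds: 80+ hot,
// 65-79 wheelhouse only, 50-64 strong client stats or stack match, else pass.
const WHEELHOUSE = ["ai", "mcp", "automation", "full-stack", "claude", "react", "next.js"];

function shouldApply(job) {
  // Location gate comes first: never spend connects on region-restricted jobs.
  if (job.locationRestricted) return false;
  if (job.score >= 80) return true; // hot lead: apply immediately
  const stackMatch = job.skills.some((s) => WHEELHOUSE.includes(s.toLowerCase()));
  if (job.score >= 65) return stackMatch; // good lead: wheelhouse only
  if (job.score >= 50) {
    // lukewarm: strong client stats OR a stack match
    const goodClient = (job.clientSpent || 0) >= 50000 && (job.clientRating || 0) >= 4.5;
    return goodClient || stackMatch;
  }
  return false; // below 50: pass
}
```

When several leads pass at once, sort by `score` descending before applying, per the priority rule above.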
## 1Password / Auth

- **Search by URL** not title: `op item list --format json | jq ... test("SERVICENAME")`
- **Every op command triggers an auth dialog** — run op in the background, approve with Tab+Tab+Enter via Peekaboo.

## Computer Use

- **Right tool for the right layer:** Web → browser tool (DOM). Native apps → Peekaboo. System dialogs → AppleScript. CLI → exec. Creds → op.
- **Don't over-engineer** — see, click, type, verify. Don't reach for DevTools/CDP unless the simple approach fails.
- **Browser DOM snapshots > screenshots** for web automation.
- **Don't ask Jake to do things I can do myself** — I have Peekaboo, 1Password CLI, browser tool, shell, AppleScript.

## Cron Jobs

- Format: `schedule: {kind: "cron", expr: "..."}`, `payload: {kind: "systemEvent", text: "..."}`

## Infrastructure

- **ALWAYS validate config keys before `config.patch`** — an invalid key (e.g. `gateway.hooks`) can crash the gateway on every restart attempt, requiring a manual fix. Use `config.schema` to check valid keys first.
- **Gateway logs:** `/tmp/clawdbot/`, not `~/.clawdbot/logs/`
- **tmux death kills auto-restart** — check `tmux list-sessions` when diagnosing downtime.
- **API tokens go in gateway config env.vars** via `config.patch`, not just .env files.
- **Never save secrets in memory/*.md** — use .env.local (gitignored).

## Image Gen

- **Jake's preferred style:** chibi/kawaii anime. Be VERY specific about appearance in the first prompt.

## Sub-agents

- **Always verify output** — `find ... | wc -l`. Never trust the narrative.
- **Single-purpose > multi-purpose** — one clear deliverable per agent.
- **10min is too short for full builds** — use 900s for full MCP servers, 600s for focused tasks.

## Reonomy Scraper (Feb 18, 2026)

- **React inputs:** `fill` NEVER works (it skips onChange). `type --slowly` is unreliable. Use `press` per char with 150ms delays.
- **Autocomplete category trap:** Typing the full term may match the wrong category (street vs state).
  Type PARTIAL text, then select the EXACT match (no comma = state, with comma = address).
- **"Recently Viewed" sections:** Scroll past them before scraping search results. Verify the property count is filtered.
- **Address extraction:** Skip "X of Y properties" navigation counters. Use h1/h2 or document.title.
- **Reonomy auth expires fast** — must log in fresh each scraper run; saved state only lasts minutes.
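The per-keypress workaround for React inputs can be sketched roughly as below. This is a hypothetical helper written against a Playwright-style `page` API; the actual agent-browser `press` command may differ, but the idea is the same: `fill` sets `.value` directly and skips React's `onChange`, while pressing each character fires real key events.

```javascript
// Hypothetical helper: press one real key per character with a delay between
// keystrokes, so React's onChange handler fires for every character typed.
async function pressPerChar(page, selector, text, delayMs = 150) {
  await page.click(selector); // focus the input first
  for (const ch of text) {
    await page.keyboard.press(ch); // fires real keydown/input events
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
}
```

For the autocomplete trap above, call this with the partial text only, then pick the exact suggestion from the dropdown instead of typing the full term.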