# 2026-02-03 Memory Log
## Reonomy Scraper — Discord Integration
### What Jake Asked
- Add the working Reonomy scraper to the Discord server so people can request property lists and get CSV files back.
- Added user `1468417808323838033` (Henry Eisenstein / henryeisenstein.) to the Discord guild `1465465447687393373` allowlist via config.apply.
### Henry's First Request
- **Criteria:** NJ only, Industrial only, 50,000+ SF, not sold within 10 years
- Spent significant time debugging scraper v13→v14 issues
### Key Technical Findings
#### Reonomy Search IDs Are Session-Specific
- Search IDs (UUIDs in URL like `#!/search/{id}`) do NOT persist across browser sessions
- Opening a search ID URL in a new session redirects to home page
- **Must build search filters within the same browser session that does the scraping**
#### agent-browser Snapshot Modes
- `snapshot -i` (interactive) — shows only buttons, textboxes, tabs, links (clickable elements)
- `snapshot` (full) — shows ALL elements including headings, paragraphs, text
- **Property addresses on Reonomy search results are `heading [level=6]` elements** — ONLY visible in full snapshot, NOT in `-i` mode
- Must use full snapshot to find/parse property cards
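A minimal sketch of pulling addresses out of a full snapshot. The exact snapshot line format here is an assumption (a `- heading "…" [level=6] [ref=…]` line per card); adjust the regex to whatever the full `snapshot` actually prints in your session:

```javascript
// Sketch: extract property-card addresses from a full agent-browser snapshot.
// The line format below is an assumption — tweak the regex to match real output.
function extractAddresses(snapshotText) {
  const addresses = [];
  for (const line of snapshotText.split("\n")) {
    // e.g.: - heading "330 Roycefield Rd" [level=6] [ref=e87]
    const m = line.match(/heading "([^"]+)" \[level=6\]/);
    if (m) addresses.push(m[1]);
  }
  return addresses;
}

const sample = [
  '- heading "Search Results" [level=1]',
  '- heading "330 Roycefield Rd" [level=6] [ref=e87]',
  '- heading "400 Arlington Blvd" [level=6] [ref=e91]',
].join("\n");

console.log(extractAddresses(sample)); // [ '330 Roycefield Rd', '400 Arlington Blvd' ]
```

The level filter matters: page titles are also headings, so matching on `[level=6]` alone separates property cards from section headers.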
#### Reonomy Search Box Autocomplete
- `fill` command triggers "Recent Searches" dropdown, NOT location autocomplete
- `type` command (keystroke-by-keystroke) DOES trigger location autocomplete but is slow
- **Working approach:** Click search box → opens recent searches dropdown → then `type "New J"` → triggers state autocomplete → click "New Jersey" menuitem
- The search box ref changes after clicking "Advanced Search" — must take a fresh snapshot afterward

#### Reonomy Size Filter (Building Area SF)
- Size dropdown has TWO sections: "Total Units" (first min/max) and "Building Area (SF)" (second min/max)
- Easy to accidentally fill Total Units instead of Building Area
- **Working approach:** Click Size → click the Building Area min field (second `textbox "min"`) → `type "50000"` → press Enter → filter applies as "50000+ SF" tag
- JS `fill` / React state manipulation doesn't work reliably — `type` + Enter does
#### Reonomy More Filters
- Sales tab: "Not Within" / "Within" toggles are `div` elements with `jss` classes — must be clicked via JS; note `querySelectorAll` returns a `NodeList`, which has no `.find()`, so spread it first: `[...document.querySelectorAll('div')].find(...)`
- Owner tab: "Includes Phone Number" / "Includes Email Address" are also `div` elements — same JS click approach
- After applying filters: **580 properties** match NJ + Industrial + 50k+ SF + Not sold 10yr + Phone + Email
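The click-by-text step can be factored into a small helper. This is a sketch, not the script's actual code; it takes any array-like of elements so the matching logic can be exercised with mocks outside a browser:

```javascript
// Sketch: click a filter toggle that is a plain <div> (no ARIA role), by its
// visible text. In the browser, pass document.querySelectorAll('div');
// NodeList has no .find(), so spread into an array first.
function clickByText(elements, label) {
  const el = [...elements].find(
    (d) => d.textContent && d.textContent.trim() === label
  );
  if (el) el.click();
  return Boolean(el); // false => label not found, nothing clicked
}

// Browser usage (via the agent's JS-eval step):
//   clickByText(document.querySelectorAll('div'), 'Not Within');
//   clickByText(document.querySelectorAll('div'), 'Includes Phone Number');

// Mock demo so the sketch runs outside a browser:
let clicked = null;
const mockDivs = [
  { textContent: " Within ", click: () => (clicked = "Within") },
  { textContent: "Not Within", click: () => (clicked = "Not Within") },
];
clickByText(mockDivs, "Not Within");
console.log(clicked); // "Not Within"
```

Trimming before comparing avoids missing a match on whitespace padding, and the exact-equality check keeps "Within" from matching inside "Not Within".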
### Scraper v14 Architecture
- **Location:** `/Users/jakeshore/.clawdbot/workspace/reonomy-scraper-v14.js`
- Self-configuring: builds search filters within the same browser session
- Uses env vars: `REONOMY_STATE`, `REONOMY_TYPES`, `REONOMY_MIN_SF`, `REONOMY_SALE_FILTER`, `REONOMY_OWNER_PHONE`, `REONOMY_OWNER_EMAIL`, `MAX_PROPERTIES`
- Sequential property processing: takes a fresh snapshot each pass, clicks the first unprocessed property, extracts its data, then navigates back
- Tracks processed addresses in a Set to avoid duplicates
- Runner script: `reonomy-run-v14.sh`
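The snapshot/click/extract steps above reduce to a loop like the following. This is a sketch of the pattern, not v14's actual code — `getAddresses` and `processProperty` are hypothetical stand-ins for the agent-browser calls; the point is the `Set`-based dedup and the fresh snapshot on every pass:

```javascript
// Sketch of the v14 main loop: re-snapshot, pick the first address we have
// not seen, process it, repeat until maxProperties is hit or the page has
// nothing new. getAddresses/processProperty are placeholder stubs.
async function scrapeAll(getAddresses, processProperty, maxProperties) {
  const processed = new Set();
  while (processed.size < maxProperties) {
    const addresses = await getAddresses(); // fresh snapshot each pass
    const next = addresses.find((a) => !processed.has(a));
    if (!next) break; // every visible card already handled
    await processProperty(next); // click card, extract, navigate back
    processed.add(next); // Set membership is the dedup check
  }
  return [...processed];
}
```

In the real script `maxProperties` would come from `Number(process.env.MAX_PROPERTIES)`; using an address string as the dedup key assumes addresses are unique within a search, which holds for these results.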
### CSV Converter
- **Location:** `/Users/jakeshore/.clawdbot/workspace/reonomy-to-csv.js`
- Flattens JSON leads → one row per owner (property info repeated)
- Dynamic columns based on max phones/emails found
- Works with both v13 and v14 JSON output
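A sketch of the flattening step. The lead shape shown here (`{ address, owners: [{ name, phones, emails }] }`) is an assumption inferred from the results described below, not the converter's actual schema:

```javascript
// Sketch: flatten JSON leads to CSV, one row per owner, property fields
// repeated. Column count is dynamic: driven by the widest phone/email lists
// across all leads. Lead shape is assumed, not taken from the real converter.
function leadsToCsv(leads) {
  const owners = leads.flatMap((lead) =>
    lead.owners.map((o) => ({ address: lead.address, ...o }))
  );
  const maxPhones = Math.max(0, ...owners.map((o) => o.phones.length));
  const maxEmails = Math.max(0, ...owners.map((o) => o.emails.length));
  const header = [
    "address",
    "owner",
    ...Array.from({ length: maxPhones }, (_, i) => `phone_${i + 1}`),
    ...Array.from({ length: maxEmails }, (_, i) => `email_${i + 1}`),
  ];
  const rows = owners.map((o) => [
    o.address,
    o.name,
    ...Array.from({ length: maxPhones }, (_, i) => o.phones[i] ?? ""),
    ...Array.from({ length: maxEmails }, (_, i) => o.emails[i] ?? ""),
  ]);
  // Quote every field and escape embedded quotes (RFC 4180 style).
  return [header, ...rows]
    .map((r) => r.map((v) => `"${String(v).replace(/"/g, '""')}"`).join(","))
    .join("\n");
}
```

Padding short phone/email lists with empty strings keeps every row the same width as the header, which is what lets spreadsheet imports line up.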
### Scraper Results So Far
- v13 test run: 2 leads (Chuck Whittall Orlando, Alessandro brothers NJ)
- v14 first working run: 1 lead (330 Roycefield Rd, Hillsborough NJ — 1 owner, 5 phones, 5 emails)
- Previous run (v13 with old search ID): 1 lead (400 Arlington Blvd, Swedesboro NJ — 3 owners, 8 phones, 11 emails)
### Jake's Advice
- "Create a saved search from Henry's request, then use the saved search to pull from that list"
- Saved Searches persist across sessions (unlike recent search IDs)
- Was in the process of saving the search when the session context shifted
### What's Left (Reonomy)
- Save the NJ Industrial 50k+ search as a named Saved Search in Reonomy
- Use the Saved Search for reliable cross-session scraping
- Complete Henry's full 20-property scrape request
- Deliver CSV to Discord
## CREdispo Web App (Started 2026-02-11)
### What Henry Requested
- Full web app for CRE deal disposition automation
- Agent speaks with seller → lead created → auto-matches to buyers → sends emails/SMS
- Buyer portal with NDA signing, deal viewing
- $297/month subscription
- Jake approved: "he's fully approved for anything he needs for this project"
### Build Status
- Sub-agent spawned on Opus to scaffold full MVP
- **Location:** `/Users/jakeshore/.clawdbot/workspace/credispo/`
- **Stack:** Next.js 14 + Supabase + Stripe + Tailwind + shadcn/ui
- **Label:** `credispo-build`
### Henry's Approval Level
- Jake explicitly approved Henry (`1468417808323838033`) for FULL tool access on CREdispo project
- "he's fully approved for anything he needs for this project"
- This is project-scoped, not blanket approval for everything
## Config Change
- Added `1468417808323838033` to guild `1465465447687393373` users allowlist
- Config applied and gateway restarted successfully