Reonomy Lead Scraper

A browser automation tool that scrapes property and owner leads from Reonomy and exports them to Google Sheets.

Features

  • 🔑 Automated login to Reonomy
  • 🔍 Search for properties by location
  • 📊 Extract lead data:
    • Owner Name
    • Property Address
    • City, State, ZIP
    • Property Type
    • Square Footage
    • Owner Location
    • Property Count
    • Property/Owner URLs
  • 📈 Export to Google Sheets via gog CLI
  • 🔐 Secure credential handling (environment variables or 1Password)
  • 🖥️ Headless or visible browser mode

Prerequisites

Required Tools

  1. Node.js (v14 or higher)

    # Check if installed
    node --version
    
  2. gog CLI - Google Workspace command-line tool

    # Install via Homebrew
    brew install gog
    
    # Or from GitHub
    # https://github.com/stripe/gog
    
    # Authenticate
    gog auth login
    
  3. Puppeteer (installed via npm with this script)

Optional Tools

  • 1Password CLI (op) - For secure credential storage
    brew install --cask 1password-cli
    

Installation

  1. Clone or navigate to the workspace directory:

    cd /Users/jakeshore/.clawdbot/workspace
    
  2. Install Node.js dependencies:

    npm install
    
  3. Make the script executable (should already be done):

    chmod +x scrape-reonomy.sh
    

Setup

Option 1: Environment Variables

Set your Reonomy credentials as environment variables (placeholders shown; substitute your own values):

export REONOMY_EMAIL="your_email@example.com"
export REONOMY_PASSWORD="your_password_here"

Or add them to your shell profile (e.g., ~/.zshrc or ~/.bash_profile):

echo 'export REONOMY_EMAIL="your_email@example.com"' >> ~/.zshrc
echo 'export REONOMY_PASSWORD="your_password_here"' >> ~/.zshrc
source ~/.zshrc

Option 2: 1Password

  1. Create a 1Password item named "Reonomy"

  2. Add fields:

    • email: Your Reonomy email
    • password: Your Reonomy password
  3. Use the --1password flag when running the scraper:

    ./scrape-reonomy.sh --1password
    

Option 3: Interactive Prompt

If you don't set credentials, the script will prompt you for them:

./scrape-reonomy.sh

Usage

Basic Usage

Run the scraper with default settings (searches "New York, NY"):

./scrape-reonomy.sh

Search a Different Location

./scrape-reonomy.sh --location "Los Angeles, CA"

Use Existing Google Sheet

./scrape-reonomy.sh --sheet "1ABC123XYZ..."

Run in Headless Mode (No Browser Window)

./scrape-reonomy.sh --headless

Combined Options

# Search Chicago, use headless mode, save to existing sheet
./scrape-reonomy.sh \
  --location "Chicago, IL" \
  --headless \
  --sheet "1ABC123XYZ..."

Using 1Password

./scrape-reonomy.sh --1password --headless

Direct Node.js Usage

You can also run the scraper directly with Node.js:

REONOMY_EMAIL="..." \
REONOMY_PASSWORD="..." \
REONOMY_LOCATION="Miami, FL" \
HEADLESS=true \
node reonomy-scraper.js
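
The environment variables above map onto the scraper's configuration roughly as follows. This is a hypothetical sketch (the real reonomy-scraper.js may structure its config differently); the variable names match the Environment Variables table, but the loadConfig helper and its field names are illustrative:

```javascript
// Sketch: read scraper settings from the environment, with the same
// defaults documented in this README. Throws if credentials are missing.
function loadConfig(env = process.env) {
  const config = {
    email: env.REONOMY_EMAIL,
    password: env.REONOMY_PASSWORD,
    location: env.REONOMY_LOCATION || 'New York, NY',
    sheetId: env.REONOMY_SHEET_ID || null,          // null => create a new sheet
    sheetTitle: env.REONOMY_SHEET_TITLE || 'Reonomy Leads',
    headless: env.HEADLESS === 'true',
  };
  if (!config.email || !config.password) {
    throw new Error('REONOMY_EMAIL and REONOMY_PASSWORD must be set');
  }
  return config;
}
```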

Output

Google Sheet

The scraper creates or appends to a Google Sheet with the following columns:

Column            Description
----------------  ---------------------------------------------
Scrape Date       Date the lead was scraped
Owner Name        Property owner's name
Property Address  Street address of the property
City              Property city
State             Property state
ZIP               Property ZIP code
Property Type     Type of property (e.g., "General Industrial")
Square Footage    Property size
Owner Location    Owner's location
Property Count    Number of properties owned
Property URL      Direct link to property page
Owner URL         Direct link to owner profile
Email             Owner email (if available)
Phone             Owner phone (if available)
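
One lead becomes one row in that column order. A hypothetical sketch (the lead field names here are illustrative, not necessarily what reonomy-scraper.js uses internally):

```javascript
// Sketch: map a scraped lead object to a 14-column row matching the
// sheet layout above. Missing fields become empty cells.
function leadToRow(lead) {
  return [
    new Date().toISOString().slice(0, 10), // Scrape Date (YYYY-MM-DD)
    lead.ownerName || '',
    lead.address || '',
    lead.city || '',
    lead.state || '',
    lead.zip || '',
    lead.propertyType || '',
    lead.squareFootage || '',
    lead.ownerLocation || '',
    lead.propertyCount || '',
    lead.propertyUrl || '',
    lead.ownerUrl || '',
    lead.email || '',
    lead.phone || '',
  ];
}
```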

Log File

Detailed logs are saved to:

/Users/jakeshore/.clawdbot/workspace/reonomy-scraper.log

Command-Line Options

Option              Description
------------------  --------------------------------------------------
-h, --help          Show help message
-l, --location LOC  Search location (default: "New York, NY")
-s, --sheet ID      Google Sheet ID (creates new sheet if not provided)
-H, --headless      Run in headless mode (no browser window)
--no-headless       Run with visible browser
--1password         Fetch credentials from 1Password

Environment Variables

Variable             Required  Description
-------------------  --------  ----------------------------------------------
REONOMY_EMAIL        Yes       Your Reonomy email address
REONOMY_PASSWORD     Yes       Your Reonomy password
REONOMY_LOCATION     No        Search location (default: "New York, NY")
REONOMY_SHEET_ID     No        Google Sheet ID (creates new sheet if not set)
REONOMY_SHEET_TITLE  No        Title for new sheet (default: "Reonomy Leads")
HEADLESS             No        Run in headless mode ("true" or "false")

Troubleshooting

"Login failed" Error

  • Verify your credentials are correct
  • Check if Reonomy has changed their login process
  • Try running without headless mode to see what's happening:
    ./scrape-reonomy.sh --no-headless
    

"gog command failed" Error

  • Ensure gog is installed and authenticated:
    gog auth login
    
  • Check your Google account has Google Sheets access

"No leads extracted" Warning

  • The page structure may have changed
  • The search location might not have results
  • Check the screenshot saved to /tmp/reonomy-no-leads.png or /tmp/reonomy-error.png
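
Those diagnostic screenshots can come from a small guard around the extraction step. A hypothetical sketch of that pattern, using Puppeteer's page.screenshot (the captureOnFailure helper is illustrative, not necessarily how the real script is organized):

```javascript
// Sketch: when no leads were extracted, save the rendered page to
// /tmp so you can inspect what Reonomy actually displayed.
async function captureOnFailure(page, leads) {
  if (leads.length === 0) {
    await page.screenshot({ path: '/tmp/reonomy-no-leads.png', fullPage: true });
    return false; // signal to the caller that extraction failed
  }
  return true;
}
```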

Puppeteer Issues

If you encounter browser-related errors, try:

npm install puppeteer --force

Security Notes

Credential Security

⚠️ Important: Never commit your credentials to version control!

Best Practices:

  1. Use environment variables (set in your shell profile)
  2. Use 1Password for production environments
  3. Add .env files to .gitignore
  4. Never hardcode credentials in scripts
Add entries like these to your .gitignore:

# Credentials
.env
.reonomy-credentials.*

# Logs
*.log
reonomy-scraper.log

# Screenshots
*.png
/tmp/reonomy-*.png

# Node
node_modules/
package-lock.json

Advanced Usage

Scheduled Scraping

You can set up a cron job to scrape automatically:

# Edit crontab
crontab -e

# Add line to scrape every morning at 9 AM
0 9 * * * /Users/jakeshore/.clawdbot/workspace/scrape-reonomy.sh --headless --1password >> /tmp/reonomy-cron.log 2>&1

Custom Search Parameters

The scraper currently searches by location. To customize:

  1. Edit reonomy-scraper.js
  2. Modify the extractLeadsFromPage function
  3. Add filters for:
    • Property type
    • Price range
    • Building size
    • Owner type
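
A filter along those lines could be applied to the extracted leads before export. A hypothetical sketch (field names, option names, and thresholds are illustrative; check them against the real lead objects in reonomy-scraper.js):

```javascript
// Sketch: keep only leads matching an optional property type and a
// minimum square footage. Square footage strings like "12,000" are
// parsed with their thousands separators stripped.
function filterLeads(leads, { propertyType, minSqFt } = {}) {
  return leads.filter((lead) => {
    if (propertyType && lead.propertyType !== propertyType) return false;
    if (minSqFt) {
      const sqft = parseInt(String(lead.squareFootage).replace(/,/g, ''), 10);
      if (!Number.isFinite(sqft) || sqft < minSqFt) return false;
    }
    return true;
  });
}
```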

Integrating with Other Tools

The Google Sheet can be connected to:

  • Google Data Studio for dashboards
  • Zapier for automations
  • Custom scripts for further processing

Development

File Structure

workspace/
├── reonomy-scraper.js      # Main scraper script
├── scrape-reonomy.sh       # Shell wrapper
├── package.json            # Node.js dependencies
├── README.md               # This file
├── reonomy-scraper.log     # Run logs
└── node_modules/           # Dependencies

Testing

Test the scraper in visible mode first:

./scrape-reonomy.sh --no-headless --location "Brooklyn, NY"

Extending the Scraper

To add new data fields:

  1. Update the headers array in initializeSheet()
  2. Update the extractLeadsFromPage() function
  3. Add new parsing functions as needed
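
As a concrete illustration of those three steps, here is a hypothetical sketch of adding a "Last Sale Date" column (the field name, regex, and page text format are assumptions to verify against the live Reonomy page):

```javascript
// Step 1 sketch: the extended headers array for initializeSheet().
// The new column goes at the end so existing rows stay aligned.
const headers = [
  'Scrape Date', 'Owner Name', 'Property Address', 'City', 'State', 'ZIP',
  'Property Type', 'Square Footage', 'Owner Location', 'Property Count',
  'Property URL', 'Owner URL', 'Email', 'Phone',
  'Last Sale Date', // new field
];

// Steps 2-3 sketch: a parser for the new field, to be called from
// extractLeadsFromPage(). Returns '' when the text doesn't match.
function parseLastSaleDate(text) {
  const m = /Last sale[:\s]+([\d\/.-]+)/i.exec(text || '');
  return m ? m[1] : '';
}
```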

Support

Getting Help

  • Check the log file: reonomy-scraper.log
  • Run with visible browser to see issues: --no-headless
  • Check screenshots in /tmp/ directory

Common Issues

Issue                Solution
-------------------  ----------------------------------------------
Login fails          Verify credentials, try manual login
No leads found       Try a different location, check search results
Google Sheets error  Run gog auth login to re-authenticate
Browser timeout      Increase timeout in the script
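
For the browser-timeout case, Puppeteer exposes per-page defaults that the script can raise. A hypothetical sketch (the 60-second value and the raiseTimeouts helper are illustrative; the real script may set these elsewhere):

```javascript
// Sketch: raise both of Puppeteer's page-level timeout defaults.
function raiseTimeouts(page, ms = 60000) {
  page.setDefaultNavigationTimeout(ms); // applies to page.goto / waitForNavigation
  page.setDefaultTimeout(ms);           // applies to waitForSelector and friends
}
```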

License

This tool is for educational and personal use. Respect Reonomy's Terms of Service when scraping.

Changelog

v1.0.0 (Current)

  • Initial release
  • Automated login
  • Location-based search
  • Google Sheets export
  • 1Password integration
  • Headless mode support