
GovGPT - Senior Python Backend Developer

Job URL: https://www.upwork.com/jobs/~022023876916073088374
Applied: 2026-02-17
Rate: $65/hr
Connects: 31
Client: Schaumburg, IL; $97K spent; 4.65★; 45 jobs; 100% hire rate; payment verified
Score: 79/100
Contract-to-hire opportunity


Cover Letter

Jameel — I read every word of your posting. This isn't a template response.

Your Intel agent architecture is essentially what I build daily. I'm going to walk you through exactly what I'd do, because I think you'll see the overlap immediately.

RELEVANT EXPERIENCE

I built and maintain a production multi-agent system called Buba that runs 24/7 on a Mac Mini. It uses a ReACT-style architecture where the orchestrator decides which tools to call based on the query — web search, API calls, browser automation, database queries, document processing — and chains them together with stateful context across sessions. It manages its own memory system, routes between Claude and other models based on task complexity, and handles OAuth2 integrations with Google Workspace, Salesforce-style CRMs (GoHighLevel), and Microsoft services. This isn't a demo — it processes hundreds of requests daily with sub-2s response times.

Separately, I built 30+ enterprise MCP server integrations connecting AI agents to production APIs — CRMs, databases, file systems, communication platforms — all with proper auth flows, error handling, and retry logic. Each connector handles OAuth2 token refresh, rate limiting, and structured data normalization.

For RAG specifically: I built a semantic search system for a commercial real estate platform (CREdispo) that ingests property data from multiple sources, chunks and embeds documents, and runs hybrid search (vector + keyword) against a PostgreSQL + pgvector setup. The pipeline handles deduplication across data sources — exactly the pattern you need for government contracts appearing on SAM.gov, FPDS, and USAspending simultaneously.
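To make the deduplication pattern concrete, here is a minimal sketch of merging the same contract notice ingested from multiple sources. The field names (`notice_id`, `source`) are illustrative, not the actual CREdispo or SAM.gov schema:

```python
import hashlib

def dedupe_records(records):
    """Merge records that share a normalized notice ID.

    Instead of dropping duplicates outright, each merged record keeps
    the list of sources it appeared in, so provenance survives
    deduplication. Field names here are illustrative only.
    """
    seen = {}
    for rec in records:
        # Normalize before hashing so "fa8750-24-r-0001 " and
        # "FA8750-24-R-0001" collapse to the same key.
        key = hashlib.sha256(rec["notice_id"].strip().upper().encode()).hexdigest()
        if key in seen:
            seen[key]["sources"].append(rec["source"])
        else:
            seen[key] = {**rec, "sources": [rec["source"]]}
    return list(seen.values())

merged = dedupe_records([
    {"notice_id": "fa8750-24-r-0001", "source": "SAM.gov"},
    {"notice_id": "FA8750-24-R-0001 ", "source": "FPDS"},
])
```

In production this key would also fold in award date or agency code, since notice IDs alone can collide across fiscal years.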

TECHNICAL APPROACH — Multi-Agent Routing

For Intel's "should I query an API or search the vector database?" decision, here's how I'd architect it:

  1. Query Classification Layer — A lightweight classifier (Claude Haiku or a fine-tuned small model) analyzes the user's question and tags it: factual-lookup (API), document-search (RAG), hybrid (both), or analytical (compute over cached data). This runs in <200ms.

  2. Tool Registry — Each data source (SAM.gov, FPDS, USAspending, GovTribe, vector DB) registers its capabilities as structured tool definitions in LangChain. The agent sees: "SAM.gov: active solicitations, contract awards, entity data. FPDS: historical award data, pricing. Vector DB: past proposals, capability statements, solicitation PDFs."

  3. LangGraph State Machine — The agent maintains a state graph where each node is a decision point. After the classifier, it enters a planning node that decomposes the query into sub-tasks. For "What counter-drone contracts has the Air Force awarded in the last 12 months?" — it plans: (a) query SAM.gov for NAICS codes related to counter-drone, (b) query FPDS filtered by Air Force + date range, (c) cross-reference with USAspending for dollar amounts, (d) search vector DB for any related proposals in the client's document library.

  4. Tool Cascade with Fallback — If SAM.gov API returns incomplete data, the agent falls back to GovTribe, then web search. Each step feeds context to the next. LangGraph's conditional edges handle this naturally.

  5. Citation Tracking — Every data point gets a provenance tag: {source: "FPDS", query_id: "...", timestamp: "...", confidence: 0.95}. These flow through the entire chain and surface as footnotes in the final response.
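The pipeline above can be sketched end to end in a few dozen lines. This is a stand-in, not the real implementation: `classify` replaces the Haiku classifier with keyword heuristics, the tool functions are stubs, and the cascade/fallback logic is reduced to "skip any tool that returns nothing":

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Evidence:
    """Provenance tag carried with every data point (step 5)."""
    value: str
    source: str
    query_id: str
    timestamp: str
    confidence: float

def classify(query: str) -> str:
    """Stand-in for the LLM classifier (step 1): keyword heuristics only."""
    q = query.lower()
    if "awarded" in q or "contract" in q:
        return "hybrid"
    if "proposal" in q:
        return "document-search"
    return "factual-lookup"

# Step 2/4: per-category cascade order, including fallbacks.
TOOL_CASCADE = {
    "factual-lookup": ["sam_gov", "govtribe", "web_search"],
    "document-search": ["vector_db"],
    "hybrid": ["sam_gov", "fpds", "usaspending", "vector_db"],
}

def run_query(query: str, tools: dict) -> list[Evidence]:
    """Walk the cascade; a tool that returns nothing falls through
    to the next one, and every hit is wrapped in an Evidence tag."""
    results = []
    for name in TOOL_CASCADE[classify(query)]:
        hit = tools[name](query)
        if hit:
            results.append(Evidence(
                value=hit,
                source=name,
                query_id=f"{name}-001",
                timestamp=datetime.now(timezone.utc).isoformat(),
                confidence=0.95,
            ))
    return results
```

In the real system each node lives in a LangGraph state graph with conditional edges doing what the `if hit` check does here; the Evidence objects are what surface as footnotes in the final response.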

LANGCHAIN / LANGGRAPH EXPERIENCE

I've built production agent workflows using LangChain's tool-calling interface with Claude and GPT-4. My current system uses a custom tool registry pattern where tools self-describe their capabilities, and the agent selects tools based on query analysis — similar to what you need for routing between SAM.gov, FPDS, and your vector DB. I've implemented multi-step reasoning chains where each step's output feeds the next step's context, with proper error handling and retry logic at each stage.
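The self-describing registry pattern is simple enough to show directly. This is a hypothetical sketch of the idea, not the actual codebase; in practice the descriptions feed LangChain tool definitions rather than a plain string manifest:

```python
REGISTRY: dict = {}

def tool(description: str):
    """Decorator: each tool declares its own capabilities at
    definition time, so the agent can select by description."""
    def wrap(fn):
        fn.description = description
        REGISTRY[fn.__name__] = fn
        return fn
    return wrap

@tool("SAM.gov: active solicitations, contract awards, entity data")
def sam_gov(query: str) -> str:
    return f"sam.gov results for {query!r}"  # placeholder body

@tool("Vector DB: past proposals, capability statements, solicitation PDFs")
def vector_db(query: str) -> str:
    return f"semantic hits for {query!r}"  # placeholder body

def tool_manifest() -> str:
    """Render the registry the way the agent sees it in its prompt."""
    return "\n".join(f"{name}: {fn.description}" for name, fn in REGISTRY.items())
```

The payoff is that adding a data source (FPDS, GovTribe) is one decorated function; the routing prompt updates itself.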

OAUTH2 / ENTERPRISE INTEGRATIONS

I've implemented OAuth2 flows for Microsoft Graph API (mail, calendar, drive), Google Workspace (Gmail, Calendar, Drive, Contacts, Sheets, Docs), GoHighLevel CRM (contacts, pipelines, conversations), and custom REST APIs. Each integration includes token refresh, scope management, and proper error handling. For your SharePoint + Word + Excel requirements, I've worked with the Microsoft Graph API extensively — creating documents programmatically, reading/writing to SharePoint, and generating Excel files from structured data.
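The token-refresh part of those integrations follows one pattern regardless of provider. A minimal sketch, where `request_refresh` stands in for the POST to the provider's token endpoint (Microsoft identity platform, Google OAuth, etc.):

```python
import time

class TokenManager:
    """Cache an OAuth2 access token and refresh it ahead of expiry.

    `request_refresh` is an injected callable standing in for the
    real token-endpoint request; it returns the usual payload with
    "access_token" and "expires_in" (seconds).
    """
    SKEW = 60  # refresh this many seconds before actual expiry

    def __init__(self, request_refresh):
        self.request_refresh = request_refresh
        self.access_token = None
        self.expires_at = 0.0

    def get_token(self) -> str:
        # Refresh on first use, or when inside the skew window,
        # so a request never goes out with a token about to expire.
        if self.access_token is None or time.time() >= self.expires_at - self.SKEW:
            payload = self.request_refresh()
            self.access_token = payload["access_token"]
            self.expires_at = time.time() + payload["expires_in"]
        return self.access_token
```

Per-provider quirks (Microsoft's rotating refresh tokens, Google's scope-bound consent) layer on top of this core, but the skew-window refresh is what keeps long-running agents from failing mid-request.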

AVAILABILITY

I can start this week. I'm available 30+ hours/week from the US East Coast, one hour ahead of your Schaumburg team, so our working hours overlap almost completely. I use Claude as my primary development accelerator: it's integrated into my entire workflow, not just for code completion but for architecture decisions, debugging, and documentation. I ship fast.

Portfolio with case studies: https://portfolio.mcpengage.com


Screening Questions

(Will fill based on what appears in the proposal form)