2026-02-21T11-12-04_auto_memory/debug.log, memory/MEMORY.md
commit 1048ef01a5 (parent 04dc6a9342)
@@ -1883,3 +1883,8 @@
{"timestamp":"2026-02-21T11:06:13.036Z","level":"warn","category":"git","message":"Periodic sync failed: Push failed: To https://github.com/Signet-AI/signetai.git\n ! [rejected] HEAD -> main (non-fast-forward)\nerror: failed to push some refs to 'https://github.com/Signet-AI/signetai.git'\nhint: Updates were rejected because the tip of your current branch is behind\nhint: its remote counterpart. If you want to integrate the remote changes,\nhint: use 'git pull' before pushing again.\nhint: See the 'Note about fast-forwards' in 'git push --help' for details.\n"}
{"timestamp":"2026-02-21T11:10:27.496Z","level":"info","category":"watcher","message":"File changed","data":{"path":"/home/nicholai/.agents/memory/debug.log"}}
{"timestamp":"2026-02-21T11:10:30.184Z","level":"info","category":"watcher","message":"File changed","data":{"path":"/home/nicholai/.agents/memory/debug.log"}}
{"timestamp":"2026-02-21T11:10:35.204Z","level":"info","category":"git","message":"Auto-committed","data":{"message":"2026-02-21T11-10-35_auto_memory/debug.log, memory/debug.log","filesChanged":2}}
{"timestamp":"2026-02-21T11:11:13.015Z","level":"warn","category":"git","message":"Push failed: To https://github.com/Signet-AI/signetai.git\n ! [rejected] HEAD -> main (non-fast-forward)\nerror: failed to push some refs to 'https://github.com/Signet-AI/signetai.git'\nhint: Updates were rejected because the tip of your current branch is behind\nhint: its remote counterpart. If you want to integrate the remote changes,\nhint: use 'git pull' before pushing again.\nhint: See the 'Note about fast-forwards' in 'git push --help' for details.\n"}
{"timestamp":"2026-02-21T11:11:13.015Z","level":"warn","category":"git","message":"Periodic sync failed: Push failed: To https://github.com/Signet-AI/signetai.git\n ! [rejected] HEAD -> main (non-fast-forward)\nerror: failed to push some refs to 'https://github.com/Signet-AI/signetai.git'\nhint: Updates were rejected because the tip of your current branch is behind\nhint: its remote counterpart. If you want to integrate the remote changes,\nhint: use 'git pull' before pushing again.\nhint: See the 'Note about fast-forwards' in 'git push --help' for details.\n"}
{"timestamp":"2026-02-21T11:11:59.488Z","level":"info","category":"watcher","message":"File changed","data":{"path":"/home/nicholai/.agents/memory/debug.log"}}
{"timestamp":"2026-02-21T11:11:59.488Z","level":"info","category":"watcher","message":"File changed","data":{"path":"/home/nicholai/.agents/memory/MEMORY.md"}}
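The repeated warnings above show the daemon's periodic sync losing a race with the remote: its push is rejected as non-fast-forward because `origin/main` moved ahead of the local branch. A minimal sketch of one common recovery for an auto-sync loop, assuming a rebase-then-retry policy is acceptable for auto-commits (this is not the daemon's actual sync code, and `pushWithRebaseRetry` is a hypothetical name):

```typescript
import { execSync } from "node:child_process";

// Hypothetical recovery for a rejected non-fast-forward push: if the first
// push fails, integrate the remote tip with a rebase (keeping local
// auto-commits on top) and retry once.
function pushWithRebaseRetry(repoDir: string, remote = "origin", branch = "main"): void {
  const git = (args: string): string =>
    execSync(`git ${args}`, { cwd: repoDir, stdio: "pipe" }).toString();
  try {
    git(`push ${remote} ${branch}`);
  } catch {
    git(`pull --rebase ${remote} ${branch}`); // replay local commits on the new tip
    git(`push ${remote} ${branch}`);          // now a plain fast-forward push
  }
}
```

A rebase keeps the auto-commit history linear; a merge-based pull would also work but produces merge commits on every race.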
memory/MEMORY.md (103 changed lines)
@@ -1,71 +1,70 @@
<!-- generated 2026-02-20 04:14 -->
<!-- generated 2026-02-21 04:11 -->

Current Context

Implementing Phase B: Shadow Extraction for the Signet memory pipeline. The focus is building the extraction and decision intelligence layer in shadow mode to validate quality/reliability without destructive consequences.
Actively implementing the Signet memory pipeline (Phase D) - focusing on explicit mutation APIs and end-to-end testing with concurrent sessions. The daemon's memory system is transitioning from shadow writes to controlled writes with proper safety gates.

Active Projects

Phase B: Shadow Extraction Implementation
- Location: `packages/daemon/src/`
- Status: Planning complete, awaiting implementation
- Dependencies: Phase A infrastructure (schema migrations, DB accessor, feature flags) landed in `c1e43b6`
- Next Steps:
  1. Create `src/extract/` module with `extractFactsAndEntities(input): ExtractionResult`
  2. Build `src/decision/` module for shadow candidate retrieval
  3. Implement `src/worker.ts` with job queue processing (reads from `memory_jobs`, writes to `memory_history`)
  4. Add contract validation and warning persistence
- Key Constraints:
  - Extract logic out of the daemon.ts recall path and the hooks.ts extraction path
  - Keep semantics non-mutating
  - All logs go to the `memory_history` table only
  - New files needed (daemon.ts is 4511 LOC)

Signet Memory Pipeline - Phase D
- Location: `packages/daemon/src/transactions.ts`, `packages/daemon/src/daemon.ts`, `docs/wip/memory-pipeline-plan.md`
- Status: Just completed D1 (mutation APIs), ready to start D2 (optimistic concurrency + policy guards)
- Next steps: Implement D2/D3 from spec sections 14.5-14.8, add full integration tests
- Current state: Handlers and transaction closures for `PATCH /api/memory/:id`, `DELETE /api/memory/:id`, `POST /api/memory/forget`, `POST /api/memory/modify` are in place. Need to add optimistic concurrency and policy validation.
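The handler/closure split and the planned D2 optimistic-concurrency check can be sketched as follows. `txModifyMemory`, `reason`, and `if_version` come from the notes; the row shape and the in-memory Map standing in for the SQLite memories table are illustrative assumptions, not the real API:

```typescript
// Sketch of a pure transaction closure with an optimistic-concurrency gate.
// No async work and no provider (LLM/embedding) calls belong in here.
type MemoryRow = { id: string; content: string; version: number; reason?: string };
type TxResult = { ok: true; version: number } | { ok: false; error: string };

const memories = new Map<string, MemoryRow>(); // stand-in for the SQLite table

function txModifyMemory(id: string, content: string, reason: string, ifVersion: number): TxResult {
  const row = memories.get(id);
  if (!row) return { ok: false, error: "not found" };
  if (row.version !== ifVersion) {
    // Another writer won the race; the caller must re-read and retry.
    return { ok: false, error: `version conflict: have ${row.version}, got ${ifVersion}` };
  }
  memories.set(id, { id, content, reason, version: row.version + 1 });
  return { ok: true, version: row.version + 1 };
}

memories.set("m1", { id: "m1", content: "old", version: 1 });
console.log(txModifyMemory("m1", "new", "user correction", 1)); // → { ok: true, version: 2 }
console.log(txModifyMemory("m1", "stale", "late writer", 1));   // version conflict
```

The thin Hono handler would only validate the request body and translate this closure's result into a 200 or 409 response.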

UI Development
- Location: any frontend component work
- Status: In progress
- Critical rule: Never delegate UI work to subagents. Must be done directly by Opus, passing the same image references provided by the user. (See: [fact])

Qwen3-14B Model Setup
- Location: Ollama / local model
- Status: Just configured with a custom Modelfile for high-reasoning work
- Next steps: Integration with the daemon or other services

Recent Work

Phase A: Infrastructure Hardening (completed in `c1e43b6`)
- Schema migrations 001+002 for job queue/history tables
- Singleton DB accessor with WAL mode and read pool
- Transaction boundaries keeping provider calls outside write locks
- Content-hash dedup implementation
- Feature flags with kill switches (`PipelineV2Config`)
- Entity graph schema with proper entity types
- 102 tests passing

Feb 21, 2026:
- Completed Phase C of the memory pipeline - dual-mode worker (shadow vs controlled-write), `minFactConfidenceForWrite` config, extended decision outputs with fact context
- Implemented D1 mutation APIs with thin Hono handlers calling pure DB transaction closures
- Validated that `txIngestEnvelope` writes all v2 memory columns (`normalized_content`, `is_deleted`, `extraction_status`, `embedding_model`, `extraction_model`)
- Set up Ollama model `teichai-qwen3-14b` for local inference

Team Structure Established
- Parallel agent delegation used successfully
- Schema-agent, db-accessor-agent, config-test-agent working in sequence
- Integration pass handling remaining wiring
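The Phase C dual-mode worker and its confidence gate might look roughly like this. `minFactConfidenceForWrite` and the shadow vs controlled-write split are from the notes; the `Fact`/`Outcome` shapes are illustrative assumptions:

```typescript
// Sketch of the dual-mode worker's write decision: shadow mode only logs,
// controlled-write mode writes when confidence clears the safety gate.
type WorkerMode = "shadow" | "controlled-write";
type Fact = { text: string; confidence: number };
type Outcome = { fact: Fact; action: "logged" | "written" | "skipped" };

function processFacts(facts: Fact[], mode: WorkerMode, minFactConfidenceForWrite: number): Outcome[] {
  return facts.map((fact) => {
    if (mode === "shadow") return { fact, action: "logged" };    // history table only
    if (fact.confidence >= minFactConfidenceForWrite) {
      return { fact, action: "written" };                        // real memory write
    }
    return { fact, action: "skipped" };                          // below the gate
  });
}

const facts: Fact[] = [
  { text: "prefers dark mode", confidence: 0.92 },
  { text: "maybe lives in Berlin", confidence: 0.41 },
];
console.log(processFacts(facts, "controlled-write", 0.8)); // written, then skipped
```

Keeping the gate in one pure function makes it trivial to assert that no write path can bypass it.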
Key decisions:
- Keep mutation logic in pure DB transaction closures (`txModifyMemory`, `txForgetMemory`, etc.) - handlers are thin validators
- Transaction boundaries must be respected - no provider calls in transaction closures
- Use normalized/hash-derived content for consistent embeddings
- Maintain audit trail via `reason` and `if_version` fields

Problems solved:
- Fixed v2 column population in `txIngestEnvelope` - both the daemon remember path and the pipeline derived-memory path now populate metadata correctly
- Established safety-gates pattern with the `applyPhaseCWrites` pure DB closure
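The "normalized/hash-derived content" decision reduces to a small dedup gate: normalize first, hash the normalized form, and skip ingest when the hash has been seen. The SHA-256 choice and the normalization rules here (trim, collapse whitespace, lowercase) are assumptions, not the daemon's actual rules:

```typescript
import { createHash } from "node:crypto";

// Assumed normalization: trim, collapse runs of whitespace, lowercase.
function normalizeContent(raw: string): string {
  return raw.trim().replace(/\s+/g, " ").toLowerCase();
}

// Hash the normalized form so cosmetic variants map to one identity.
function contentHash(raw: string): string {
  return createHash("sha256").update(normalizeContent(raw)).digest("hex");
}

const seen = new Set<string>(); // stand-in for a hash column with a unique index
function ingestOnce(raw: string): boolean {
  const hash = contentHash(raw);
  if (seen.has(hash)) return false; // duplicate, skip the write
  seen.add(hash);
  return true;
}
```

Embedding the normalized form rather than the raw text is what keeps embeddings consistent across sources that differ only in formatting.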

Technical Notes

Codebase:
- Go/TypeScript daemon with SQLite database
- Transaction system: `transactions.ts` contains closures like `txIngestEnvelope`, `txApplyDecision`, `txModifyMemory`
- Hono for HTTP routing, with thin handlers delegating to transaction closures

Database Schema:
- `memory_jobs` table: stores job queue entries
- `memory_history` table: stores shadow extraction logs/proposals
- Feature flags: `PIPELINE_FLAGS` in `memory-config.ts`
- Memory table has v2 columns: `normalized_content`, `is_deleted`, `extraction_status`, `embedding_model`, `extraction_model`
- `IngestEnvelope` struct has optional fields for different metadata sources
- Concurrency via `if_version` field and optimistic locking

Key Interfaces:
- `ExtractionResult`: structured facts + entities output
- `DecisionEngine`: retrieve top-K candidates, confidence scoring
- `SignetLifecycle` connectors pattern maintained
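The retrieval half of `DecisionEngine` ("retrieve top-K candidates") reduces to a nearest-neighbor query. A brute-force cosine-similarity sketch, standing in for the daemon's real vector-search integration (the `Candidate` shape is an illustrative assumption):

```typescript
type Candidate = { id: string; embedding: number[] };

// Cosine similarity of two equal-length vectors; the `|| 1` guard avoids
// dividing by zero for all-zero vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] ** 2;
    nb += b[i] ** 2;
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

// Score every candidate against the query embedding and keep the K best.
function topK(query: number[], candidates: Candidate[], k: number): { id: string; score: number }[] {
  return candidates
    .map((c) => ({ id: c.id, score: cosine(query, c.embedding) }))
    .sort((x, y) => y.score - x.score)
    .slice(0, k);
}
```

At shadow-mode scale a linear scan is fine; an index (e.g. sqlite-vec or HNSW) only matters once the memory table grows.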
Models:
- Qwen3-14B-Claude-4.5-Opus-High-Reasoning-Distill (Q8_0 quant, 15.7 GB) via Ollama
- Custom Modelfile with system prompt and tool definitions

Tooling:
- Database access via singleton accessor
- Transaction wrappers for write operations
- Content hashing for deduplication
- Vector search integration for candidate retrieval
Code Standards:
- Self-explanatory code with "why" comments
- Max 700 LOC per file, max 3 indentation levels
- Test in browser (don't be lazy)

Testing:
- Integration tests for mutation APIs, audit trail, and recovery semantics still needed
- End-to-end pipeline validation completed for Phase C

Rules & Warnings

- Never delegate UI work to subagents (handle personally as Opus)
- Prefer Sonnet/Haiku delegation when operating as Opus
- Keep journal entries in the memory database regularly
- Signet agent profile lives at `~/.agents/`
- Phase B work must go in new files (daemon.ts is too large at 4511 LOC)
- Test in browser, don't be lazy
- Non-mutating semantics for the shadow extraction phase
- UI Work Rule: Never delegate UI/frontend work to subagents. Must be handled directly by the current model (Opus), passing identical image references as provided by the user. This is critical for maintaining context and fidelity.
- Architecture: Keep transaction closures pure (no async, no provider calls). Handlers validate and call closures.
- Phase Order: Follow spec order - D1 (mutation APIs) → D2 (concurrency/policy) → D3 (integration tests).
- Safety Gates: `minFactConfidenceForWrite` must be respected in all write paths.
- Content Normalization: Use normalized/hash-derived content for consistent embeddings across different sources.
- Audit Trail: Always include `reason` and `if_version` fields in mutation operations.
@@ -698,3 +698,5 @@ We are synthesizing a WORKING MEMORY document for Nicholai. Focus is on CURRENT
2026-02-21T04:10:27.495157 [regenerate] starting regeneration
2026-02-21T04:10:30.183947 [regenerate] found 455 transcripts, 50 memories
2026-02-21T04:10:30.184082 [regenerate] trying model: glm-4.7-flash
2026-02-21T04:11:59.487575 [regenerate] success with glm-4.7-flash (3941 chars)
2026-02-21T04:11:59.487743 [regenerate] wrote 3978 chars to MEMORY.md
@@ -14,6 +14,27 @@ description: >

See `assets/design-brief.png` for a full-page reference screenshot.

## What Signet Is

Signet is the layer that takes an LLM from a stateless autocomplete
algorithm to a real individual with opinions, persistence, and skills.
It is a portable, user-owned standard for agent identity — your
configuration, memory, personality, and skills travel with you across
platforms. No single company or harness owns your agent. Your agent
is yours.

Memory isn't just recall. It's coherence. An agent running across
multiple sessions on different platforms is still one agent. Experiences
branch and merge like version control — same history, different heads,
converging back into a single identity. Corrigibility is built in, not
bolted on. The trust layer keeps track of mistakes and works to ensure
they never happen again.

The design system reflects this philosophy: technical, industrial,
honest. Nothing soft or friendly. Nothing that hides the machinery.
Signet's UI should feel like looking at a live system — a mind that
persists, not a product that sells.

## Aesthetic Direction

Technical. Industrial. Near-monochrome. The visual language draws from