Files
gastown/.beads/issues.jsonl
2026-01-09 21:56:53 -08:00

{"id":"gt-00cu","title":"mol-theme-generate: Custom name pool generation via cognition","description":"Currently polecat name themes are hardcoded in namepool.go. Simple theme selection is fine as static config, but custom name generation should be a molecule.\n\nExamples requiring cognition:\n- \"Generate 50 polecat names in the style of [book/movie/game]\"\n- \"Create names that match the project's domain (medical, financial, etc.)\"\n- \"Extend the mad-max theme with 20 more obscure characters\"\n\n## Molecule: theme-generate\nGenerate custom polecat names for a rig.\n\n## Step: analyze-request\nUnderstand the naming request:\n- Theme style or source material\n- Number of names needed\n- Any constraints (length, format)\n\n## Step: generate-candidates\nGenerate candidate names matching the style.\nAim for 2x the requested count.\n\n## Step: validate-names\nFilter candidates:\n- No duplicates with existing themes\n- Appropriate length (3-15 chars)\n- No problematic terms\n\n## Step: create-theme\nWrite the custom theme to rig config or beads.\nRegister in name pool.\nNeeds: validate-names\n\n## Notes\n- Output is a new theme entry in BuiltinThemes equivalent\n- Could be stored as a pinned bead for persistence","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-20T03:25:59.727107-08:00","updated_at":"2025-12-27T21:29:56.839468-08:00","dependencies":[{"issue_id":"gt-00cu","depends_on_id":"gt-3zw","type":"related","created_at":"2025-12-20T03:26:45.400818-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.839468-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-00ur","title":"Merge: beads-2nh","description":"branch: fix/spawn-beads-path\ntarget: main\nsource_issue: beads-2nh\nrig: gastown","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T01:12:22.587758-08:00","updated_at":"2025-12-27T21:27:22.921152-08:00","deleted_at":"2025-12-27T21:27:22.921152-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-01u","title":"Design: Collapse mail into beads","description":"## Proposal\n\nReplace the separate mail system (JSONL inboxes) with beads issues using a naming convention.\n\n## Rationale\n\nIf all state should be in beads (work items, swarm state, dependencies), why have a separate system for messages? Mail is just:\n- Handoffs (agent → self)\n- Commands (Mayor → Refinery)\n- Escalations (Witness → Mayor)\n\nThese can all be beads issues with a special prefix.\n\n## Design\n\n### Convention\n- Messages use `@-` prefix: `@-witness-1734012345`\n- Assignee = recipient\n- Status: open = unread, closed = read/acknowledged\n- Priority 0 = urgent\n\n### Commands (thin wrappers)\n```bash\ngt mail send witness -s \"Subject\" -m \"Body\"\n → bd create --prefix=@ --title=\"Subject\" --assignee=witness --description=\"Body\"\n\ngt mail inbox\n → bd list --prefix=@ --assignee=$(gt whoami) --status=open\n\ngt mail read @-abc\n → bd show @-abc \u0026\u0026 bd close @-abc\n```\n\n### Notification\nDaemon watches for new `@-` issues and pokes relevant sessions.\nOr: agents poll on heartbeat (simpler).\n\n## What We Remove\n- `mail/inbox.jsonl` files\n- Mail JSONL read/write code\n- Separate delivery mechanism\n\n## What We Keep\n- `gt mail` CLI (as wrapper)\n- Handoff semantics\n- Notification (via daemon or polling)\n\n## Benefits\n- One system, one sync, one query interface\n- All communication in git history\n- Simpler architecture\n\n## Risks\n- Beads prefix filtering must be efficient\n- Namespace collision with user prefixes\n- Performance for high-frequency messages (probably fine for handoffs)\n\n## Decision Point\nDo we need first-class mail support in beads (`bd mail` commands) or is convention sufficient?","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T02:10:18.32879-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-02or","title":"Digest: mol-deacon-patrol","description":"Patrol 10: Routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:37:21.062369-08:00","updated_at":"2025-12-27T21:26:04.601701-08:00","deleted_at":"2025-12-27T21:26:04.601701-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-03lbg","title":"Digest: mol-deacon-patrol","description":"Patrol 3: all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T20:00:06.469283-08:00","updated_at":"2025-12-27T21:26:00.669501-08:00","deleted_at":"2025-12-27T21:26:00.669501-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-03rb","title":"Merge: gt-xw7b","description":"branch: polecat/morsov\ntarget: main\nsource_issue: gt-xw7b\nrig: gastown","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-21T16:43:54.909633-08:00","updated_at":"2025-12-27T21:27:23.014757-08:00","deleted_at":"2025-12-27T21:27:23.014757-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-04cfe","title":"Digest: mol-deacon-patrol","description":"Patrol 9: healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-27T20:21:44.500513-08:00","updated_at":"2025-12-27T21:26:00.157627-08:00","deleted_at":"2025-12-27T21:26:00.157627-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-051em","title":"Digest: mol-deacon-patrol","description":"Patrol 5: All green","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:30:03.465327-08:00","updated_at":"2025-12-27T21:26:02.601879-08:00","deleted_at":"2025-12-27T21:26:02.601879-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-05cx","title":"Merge: gt-h6eq.1","description":"branch: polecat/nux\ntarget: main\nsource_issue: gt-h6eq.1\nrig: gastown","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T11:44:11.942999-08:00","updated_at":"2025-12-27T21:27:22.844814-08:00","deleted_at":"2025-12-27T21:27:22.844814-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-082","title":"Worker cleanup: Beads sync on shutdown","description":"Add beads sync verification to worker cleanup checklist and Witness verification.\n\n## Update to Decommission Checklist (gt-sd6)\n\nAdd to pre-done verification:\n- bd sync --status must show 'Up to date'\n- git status .beads/ must show no changes\n\n## Beads Edge Cases\n\nUncommitted beads changes:\n bd sync\n git add .beads/\n git commit -m 'beads: final sync'\n\nBeads sync conflict (rare):\n git fetch origin main\n git checkout main -- .beads/\n bd sync --force\n git add .beads/\n git commit -m 'beads: resolve sync conflict'\n\n## Update to Witness Verification (gt-f8v)\n\nWhen capturing worker state:\n town capture \u003cpolecat\u003e \"bd sync --status \u0026\u0026 git status .beads/\"\n\nCheck for:\n- bd sync --status shows 'Up to date'\n- git status .beads/ shows no changes\n\nIf beads not synced, nudge:\n WITNESS CHECK: Beads not synced. Run 'bd sync' then commit .beads/. Signal done when complete.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T19:47:21.757756-08:00","updated_at":"2025-12-27T21:29:54.598466-08:00","dependencies":[{"issue_id":"gt-082","depends_on_id":"gt-l3c","type":"blocks","created_at":"2025-12-15T19:47:35.977804-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.598466-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-09i4","title":"Unify tmux session lifecycle: prefer exec-over-shell pattern","description":"## Context\n\nWhen `gt crew at` is run from INSIDE the target session, it uses `execClaude()` which\nreplaces the shell with Claude directly. This is actually preferable behavior:\n- Cleaner exit (no intermediate shell prompt)\n- More reliable for programmatic worker recycling\n- Better UX for humans\n\nBut currently this only happens accidentally when you run `gt crew at` while already\nin the session at a shell prompt.\n\n## Current behavior\n\n1. Session created: `tmux new-session` starts with shell\n2. Claude started via `send-keys` → shell spawns claude as child\n3. Exit claude → return to shell → exit shell → session ends\n\nWith exec path (only if already in session):\n1. `execClaude()` replaces shell with claude\n2. Exit claude → session ends (no intermediate shell)\n\n## Desired behavior\n\nConsider making exec-the-shell the DEFAULT for all crew/persistent sessions:\n- Spawn session with shell\n- Wait for ready\n- Use `respawn-pane -k` with claude command (kills shell, starts claude directly)\n\nThis gives the cleaner lifecycle without requiring the user to be inside the session.\n\n## Related\n\nPart of tmux control plane unification - ability to manage workers from any session.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T14:24:34.371414-08:00","updated_at":"2025-12-27T21:29:55.492782-08:00","deleted_at":"2025-12-27T21:29:55.492782-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-0asj","title":"Merge: gt-5af.5","description":"branch: polecat/Scabrous\ntarget: main\nsource_issue: gt-5af.5\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T17:50:25.227909-08:00","updated_at":"2025-12-27T21:27:22.700495-08:00","deleted_at":"2025-12-27T21:27:22.700495-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-0atys","title":"Digest: mol-deacon-patrol","description":"Patrol 10: Nominal - halfway point","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:50:19.218794-08:00","updated_at":"2025-12-27T21:26:04.148881-08:00","deleted_at":"2025-12-27T21:26:04.148881-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-0bx57","title":"Digest: mol-deacon-patrol","description":"Patrol 11: All healthy, dave now running","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T17:45:49.357389-08:00","updated_at":"2025-12-27T21:26:01.142534-08:00","deleted_at":"2025-12-27T21:26:01.142534-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-0c7s","title":"Add 'gt account default' command","description":"Set the default account in accounts.yaml. Used when no GT_ACCOUNT env or --account flag specified.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T03:24:20.55537-08:00","updated_at":"2025-12-27T21:29:56.17575-08:00","dependencies":[{"issue_id":"gt-0c7s","depends_on_id":"gt-58tu","type":"blocks","created_at":"2025-12-23T03:24:34.811443-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.17575-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-0cfyq","title":"Merge: dementus-mjw46vz4","description":"branch: polecat/dementus-mjw46vz4\ntarget: main\nsource_issue: dementus-mjw46vz4\nrig: gastown\nagent_bead: gt-gastown-polecat-dementus","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T16:10:20.360033-08:00","updated_at":"2026-01-01T16:11:17.378239-08:00","closed_at":"2026-01-01T16:11:17.378239-08:00","created_by":"gastown/polecats/dementus"}
{"id":"gt-0d2qf","title":"Digest: mol-deacon-patrol","description":"Patrol complete: all agents healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-31T18:40:19.900797-08:00","updated_at":"2025-12-31T18:40:19.900797-08:00","closed_at":"2025-12-31T18:40:19.900761-08:00"}
{"id":"gt-0dra","title":"Digest: mol-deacon-patrol","description":"Patrol 15: Routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:38:15.178129-08:00","updated_at":"2025-12-27T21:26:04.557376-08:00","deleted_at":"2025-12-27T21:26:04.557376-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-0ei3","title":"Add molecules.jsonl as separate catalog file for template molecules","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-19T20:16:10.763471-08:00","updated_at":"2025-12-27T21:29:53.881608-08:00","deleted_at":"2025-12-27T21:29:53.881608-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-0gdc","title":"Digest: mol-deacon-patrol","description":"Patrol 6","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:08:04.775529-08:00","updated_at":"2025-12-27T21:26:04.466116-08:00","deleted_at":"2025-12-27T21:26:04.466116-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-0gny8","title":"Digest: mol-deacon-patrol","description":"Patrol 3: all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:31:23.229211-08:00","updated_at":"2025-12-27T21:26:00.423849-08:00","deleted_at":"2025-12-27T21:26:00.423849-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-0l20","title":"decide-actions","description":"Apply nudge matrix and queue actions.\n\nProgressive nudge levels:\n- Level 1: Gentle reminder\n- Level 2: Stronger nudge\n- Level 3: Final warning\n- Level 4: Escalate to Mayor\n\nNeeds: inspect-workers","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T01:41:54.506634-08:00","updated_at":"2025-12-25T15:52:57.846135-08:00","dependencies":[{"issue_id":"gt-0l20","depends_on_id":"gt-o29j","type":"blocks","created_at":"2025-12-23T01:41:54.591628-08:00","created_by":"stevey"}],"deleted_at":"2025-12-25T15:52:57.846135-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-0lf5j","title":"Digest: mol-deacon-patrol","description":"Patrol 4: Quick cycle, all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:05:04.269001-08:00","updated_at":"2025-12-27T21:26:03.025611-08:00","deleted_at":"2025-12-27T21:26:03.025611-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-0mchz","title":"Digest: mol-deacon-patrol","description":"Patrol 8: All clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:02:09.286607-08:00","updated_at":"2025-12-27T21:26:04.018101-08:00","deleted_at":"2025-12-27T21:26:04.018101-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-0npbt","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 20: All healthy, handoff threshold reached","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T05:22:27.945031-08:00","updated_at":"2026-01-01T05:22:27.945031-08:00","closed_at":"2026-01-01T05:22:27.945-08:00"}
{"id":"gt-0odbt","title":"Replace WaitForClaudeReady with gt peek for steady-state agent observation","description":"## Problem\n\nWaitForClaudeReady uses regex to detect Claude's prompt, which is a ZFC violation.\n\n## Architectural Fix\n\n**Bootstrap (ZFC violation acceptable):**\nDuring cold town startup, no AI is available. Regex to get Deacon online is acceptable.\n\n**Steady State (proper ZFC):**\nOnce any agent is running, AI should observe AI:\n- Deacon starting polecats → Deacon uses gt peek\n- Deacon restarting → Mayor watches via gt peek\n- Mayor restarting → Deacon watches via gt peek\n\n## Implementation\n\n1. Keep WaitForClaudeReady for daemon bootstrap only\n2. Update gt deacon trigger-pending to use gt peek\n3. Document bootstrap vs steady-state distinction\n","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-25T12:34:32.712726-08:00","updated_at":"2025-12-27T21:29:55.23209-08:00","deleted_at":"2025-12-27T21:29:55.23209-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-0pc","title":"Document Overseer role (human operator)","description":"Document the Overseer role in Gas Town architecture.\n\n## The Overseer\n\nThe **Overseer** is the human operator of Gas Town. Not an agent - a person.\n\n## Responsibilities\n\n| Area | Overseer Does | Mayor/Agents Do |\n|------|---------------|-----------------|\n| Strategy | Define project goals | Execute toward goals |\n| Priorities | Set priority order | Work in priority order |\n| Escalations | Final decision on stuck work | Escalate to Overseer |\n| Resources | Provision machines | Use allocated resources |\n| Quality | Review \u0026 approve swarm output | Produce output |\n| Operations | Run gt commands, monitor dashboards | Do the work |\n\n## Key Interactions\n\n### Overseer → Mayor\n- Start/stop Mayor sessions\n- Direct Mayor via conversation\n- Review Mayor recommendations\n- Approve cross-rig decisions\n\n### Mayor → Overseer (Escalations)\n- Stuck workers after retries\n- Resource decisions (add machines, polecats)\n- Ambiguous requirements\n- Architecture decisions\n\n## Operating Cadence\n\nTypical Overseer workflow:\n1. Morning: Check status, review overnight work\n2. During day: Monitor, respond to escalations, adjust priorities\n3. End of day: Review progress, plan next batch\n\n## Commands for Overseers\n\n```bash\ngt status # Quick health check\ngt doctor # Detailed diagnostics \ngt doctor --fix # Auto-repair issues\ngt inbox # Messages from agents\ngt stop --all # Emergency halt\n```\n\n## Documentation Updates\n\nAdd to docs/architecture.md:\n- Overseer section under Agent Roles\n- Clarify Mayor reports to Overseer\n- Add Overseer to workflow diagrams","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T23:18:03.177633-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-0pl","title":"Polecat CLAUDE.md: configure auto-approve for bd and gt commands","description":"Polecats get stuck waiting for bash command approval when running\nbd and gt commands. Need to configure Claude Code to auto-approve these.\n\nOptions:\n1. Add allowedTools to polecat CLAUDE.md\n2. Configure .claude/settings.json in polecat directory\n3. Use --dangerously-skip-permissions flag (not recommended)\n\nShould auto-approve:\n- bd (beads commands)\n- gt (gastown commands)\n- go build/test\n- git status/add/commit/push\n\nShould still require approval:\n- rm -rf\n- Arbitrary commands outside project\n\nRelated to polecat prompting (gt-e1y, gt-sd6).","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T14:10:27.611612-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-0qis6","title":"Digest: mol-deacon-patrol","description":"Patrol 16: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T10:57:45.653341-08:00","updated_at":"2026-01-01T10:57:45.653341-08:00","closed_at":"2026-01-01T10:57:45.653305-08:00"}
{"id":"gt-0qki","title":"Refinery-Witness communication protocol","description":"Define mail protocol between Refinery and Witness:\n\nFROM Witness → Refinery:\n- 'Polecat ready': polecat X completed work, ready for merge\n- 'Rework complete': polecat Y finished requested rework\n\nFROM Refinery → Witness:\n- 'Merge success': polecat X merged, can be cleaned up\n- 'Merge failed': polecat X needs rework on \u003creason\u003e\n- 'Rework request': please have a polecat rebase X on current main\n\nImplement as structured mail with parseable format.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-19T18:09:27.451344-08:00","updated_at":"2025-12-27T21:29:56.941251-08:00","dependencies":[{"issue_id":"gt-0qki","depends_on_id":"gt-ktal","type":"blocks","created_at":"2025-12-19T18:09:39.58445-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.941251-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-0s99","title":"submit-merge","description":"Submit to merge queue. Create branch if needed.\nVerify CI passes.\n\ngt done # Signal work ready for merge queue\n\nIf there are CI failures, fix them before proceeding.\n\nDepends: rebase-main","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T21:48:26.322452-08:00","updated_at":"2025-12-25T14:12:42.195231-08:00","dependencies":[{"issue_id":"gt-0s99","depends_on_id":"gt-bf95","type":"blocks","created_at":"2025-12-21T21:48:26.329601-08:00","created_by":"stevey"}],"deleted_at":"2025-12-25T14:12:42.195231-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-0sf2","title":"Add gt rig rename command","description":"Allow renaming a rig after creation. Similar to gt crew rename, this should:\n\n- Update the directory name\n- Update mayor/rigs.json entry\n- Update the rig's config.json\n- Handle running agents gracefully (require shutdown first or --force)\n\nUse case: User creates a rig, later wants to change its name for better organization.","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-23T04:46:41.988492-08:00","updated_at":"2025-12-27T21:29:56.134098-08:00","deleted_at":"2025-12-27T21:29:56.134098-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-0skyg","title":"Nudge newly spawned polecats","description":"Nudge newly spawned polecats that are ready for input.\n\nWhen polecats are spawned, their Claude session takes 10-20 seconds to initialize.\nThe spawn command returns immediately without waiting. This step finds spawned\npolecats that are now ready and sends them a trigger to start working.\n\n```bash\n# For each rig with polecats\nfor rig in gastown beads; do\n gt polecats $rig\n # For each working polecat, check if Claude is ready\n # Use tmux capture-pane to look for \"\u003e \" prompt\ndone\n```\n\nFor each ready polecat that hasn't been triggered yet:\n1. Send \"Begin.\" to trigger UserPromptSubmit hook\n2. The hook injects mail, polecat sees its assignment\n3. Mark polecat as triggered in state\n\nUse WaitForClaudeReady from tmux package (polls for \"\u003e \" prompt).\nTimeout: 60 seconds per polecat. If not ready, try again next cycle.\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T02:11:33.777565-08:00","updated_at":"2025-12-27T21:29:55.249011-08:00","dependencies":[{"issue_id":"gt-0skyg","depends_on_id":"gt-uru8z","type":"blocks","created_at":"2025-12-25T02:11:33.977731-08:00","created_by":"stevey"}],"deleted_at":"2025-12-27T21:29:55.249011-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-0wo9k","title":"Digest: mol-deacon-patrol","description":"Patrol 19","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T09:12:43.900306-08:00","updated_at":"2026-01-01T09:12:43.900306-08:00","closed_at":"2026-01-01T09:12:43.90027-08:00"}
{"id":"gt-0yn0","title":"test pin fix 2","notes":"Released: displaced by new sling","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T12:15:48.827437-08:00","updated_at":"2025-12-27T21:29:56.001943-08:00","deleted_at":"2025-12-27T21:29:56.001943-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-0yqqw","title":"Messaging infrastructure: lists, queues, and channels","description":"## Problem\n\nCurrent messaging is point-to-point only:\n- Mail: A → B (durable, polling)\n- Nudge: A → B (ephemeral, real-time)\n\nNo support for broadcast, work queues, or pub/sub patterns.\n\n## Design\n\n### Addressing Syntax\n\n| Prefix | System | Resolution | Storage | Claim? |\n|--------|--------|------------|---------|--------|\n| `agent/` | Mail | Direct | Recipient inbox | N/A |\n| `@group` | Mail | Dynamic (filesystem) | Fan-out (N copies) | No |\n| `list:name` | Mail | Static config | Fan-out (N copies) | No |\n| `queue:name` | Mail | Static config | Shared (1 copy) | Yes |\n| `announce:name` | Mail | Static config | Shared (1 copy) | No |\n| `#channel` | Nudge | Dynamic (tmux) | None (ephemeral) | No |\n\n### Dynamic Aliases\n\n- `@rig/gastown` → scan ~/gt/gastown/ for agent dirs (witness, refinery, crew/*, polecats/*)\n- `@town` → scan ~/gt/ for all agent dirs\n- `@witnesses` → for each rig, include \u003crig\u003e/witness\n- `@crew/gastown` → scan ~/gt/gastown/crew/*\n- `#rig/gastown` → scan tmux for gastown/* sessions\n- `#town` → scan tmux for all Gas Town sessions\n\n**Mail** resolves against filesystem (agents that exist)\n**Channels** resolve against tmux (agents that are running)\n\n### Static Config\n\nLocation: `~/gt/config/` (JSON format, machine-edited)\n\n```json\n// lists.json\n{\n \"oncall\": [\"mayor/\", \"gastown/witness\", \"beads/witness\"],\n \"cleanup/gastown\": [\"gastown/witness\", \"deacon/\"]\n}\n```\n\n### Work Queue Semantics\n\n- `gt mail send queue:cleanup/gastown -s \"Task\" -m \"...\"`\n- `bd update \u003cmsg-id\u003e --claim` → atomic claim (sets assignee + in_progress)\n- If already claimed, returns error \"already claimed by X\"\n\n### Use Cases\n\n1. **Polecat spawn**: `gt mail send @rig/gastown` → Witness + Deacon both notified\n2. **Polecat cleanup**: `gt mail send queue:cleanup/gastown` → first to claim handles it\n3. **Town broadcast**: `gt mail send @town -s \"Handoff now\"` → everyone gets copy\n4. **Real-time alert**: `gt channel publish #witnesses \"Swarm incoming\"`\n\n## Implementation Tasks\n\n1. Create ~/gt/config/ directory structure\n2. Implement dynamic alias resolution (filesystem scan)\n3. Implement static list lookup (config/lists.json)\n4. Add fan-out at send time for @/list: addresses\n5. Implement queue: with shared storage\n6. Add `bd update --claim` for work queue semantics\n7. Implement channel resolution (tmux scan)\n8. Add `gt channel publish` command","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-25T13:39:38.624096-08:00","updated_at":"2025-12-27T21:29:55.215073-08:00","dependencies":[{"issue_id":"gt-0yqqw","depends_on_id":"gt-s89rg","type":"blocks","created_at":"2025-12-25T14:57:38.134092-08:00","created_by":"daemon"},{"issue_id":"gt-0yqqw","depends_on_id":"gt-flje1","type":"blocks","created_at":"2025-12-25T14:57:38.221076-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.215073-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-110m","title":"Digest: mol-deacon-patrol","description":"Patrol 1: All healthy, no lifecycle requests, no orphans","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:05:14.439933-08:00","updated_at":"2025-12-27T21:26:04.507497-08:00","deleted_at":"2025-12-27T21:26:04.507497-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-14hd","title":"Work on ga-xxp: Define mol-polecat-work standard molecule...","description":"Work on ga-xxp: Define mol-polecat-work standard molecule. See bd show ga-xxp for full details.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-19T21:49:14.070072-08:00","updated_at":"2025-12-27T21:29:56.923963-08:00","deleted_at":"2025-12-27T21:29:56.923963-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-14w3x","title":"Digest: mol-deacon-patrol","description":"Patrol 18: Green","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:36:04.72988-08:00","updated_at":"2025-12-27T21:26:02.494566-08:00","deleted_at":"2025-12-27T21:26:02.494566-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-15p32","title":"Digest: mol-deacon-patrol","description":"Patrol 15: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T13:45:12.609172-08:00","updated_at":"2025-12-27T21:26:03.181654-08:00","deleted_at":"2025-12-27T21:26:03.181654-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-161rp","title":"Digest: mol-deacon-patrol","description":"P5: stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:56:02.332455-08:00","updated_at":"2025-12-27T21:26:02.417379-08:00","deleted_at":"2025-12-27T21:26:02.417379-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-16rv","title":"implement","description":"Implement the solution for gt-test. Follow codebase conventions.\nFile discovered work as new issues with bd create.\n\nMake regular commits with clear messages.\nKeep changes focused on the assigned issue.\n\nDepends: load-context","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T22:04:43.420903-08:00","updated_at":"2025-12-25T15:52:43.900154-08:00","dependencies":[{"issue_id":"gt-16rv","depends_on_id":"gt-jvr3","type":"parent-child","created_at":"2025-12-21T22:04:43.422515-08:00","created_by":"stevey"},{"issue_id":"gt-16rv","depends_on_id":"gt-g844","type":"blocks","created_at":"2025-12-21T22:04:43.423201-08:00","created_by":"stevey"}],"deleted_at":"2025-12-25T15:52:43.900154-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-17r","title":"Doctor check: Zombie session cleanup","description":"Detect and clean up zombie tmux sessions via gt doctor.\n\n## Problem\n\nZombie sessions occur when:\n- Agent crashes without cleanup\n- gt kill fails mid-operation\n- System restart leaves orphan sessions\n- Session naming collision\n\n## Checks\n\n### ZombieSessionCheck\n- List all tmux sessions matching gt-* pattern\n- Cross-reference with known polecats\n- Flag sessions with no corresponding polecat state\n- Flag sessions for removed polecats\n- Check session age vs polecat creation time\n\n### Detection Criteria\n- Session exists but polecat directory doesn't\n- Session name doesn't match any registered polecat\n- Polecat state=idle but session running\n- Multiple sessions for same polecat\n\n## Output\n\n```\n[WARN] Zombie tmux sessions detected:\n - gt-wyvern-OldPolecat (polecat removed)\n - gt-beads-Unknown (no matching polecat)\n - gt-wyvern-Toast (duplicate session)\n\n Run 'gt doctor --fix' to clean up\n```\n\n## Auto-Fix (--fix flag)\n\n- Kill orphan tmux sessions\n- Update polecat state to match reality\n- Log all cleanup actions\n\n## Safety\n\n- Never kill sessions where polecat state=working\n- Prompt before killing if --fix used without --force\n- Create audit log of killed sessions","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T23:18:01.446702-08:00","updated_at":"2025-12-27T21:29:54.512744-08:00","dependencies":[{"issue_id":"gt-17r","depends_on_id":"gt-f9x.4","type":"blocks","created_at":"2025-12-15T23:19:05.66301-08:00","created_by":"daemon"},{"issue_id":"gt-17r","depends_on_id":"gt-7ik","type":"blocks","created_at":"2025-12-17T15:44:41.945064-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.512744-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-17zr","title":"gt refinery start: doesn't actually start a session","description":"## Problem\n\n`gt refinery start gastown` reports success but doesn't start a tmux session.\n\n## Evidence\n\n```\n$ gt refinery start gastown\nStarting refinery for gastown...\n✓ Refinery started for gastown\n\n$ tmux list-sessions | grep refinery\n(nothing)\n\n$ gt refinery status gastown\nState: ○ stopped\n```\n\n## Expected\n\nShould start a tmux session (e.g., gt-gastown-refinery) with Claude processing the merge queue.\n\n## Related\n\n- gt-kcee: Witness commands also need implementation\n- The refinery 'start' may just be updating state.json without spawning a session","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-18T21:58:57.188389-08:00","updated_at":"2025-12-27T21:29:54.058504-08:00","deleted_at":"2025-12-27T21:29:54.058504-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-1847v","title":"Document Boot vs Deacon lifecycle and fix design confusion","description":"## Problem\n\nThe Boot/Deacon relationship is confusing and possibly over-engineered:\n\n### Current Design\n```\nDaemon (Go process, 3-min heartbeat)\n │\n └─► Boot (ephemeral triage dog)\n │\n └─► Decides: start/wake/nudge/interrupt Deacon?\n │\n └─► Deacon (persistent patrol agent)\n │\n └─► health-scan: restart witness/refinery\n```\n\n### Questions\n\n1. **Why two agents?** Boot exists to avoid waking Deacon unnecessarily in idle towns. But is this complexity worth it?\n\n2. **Session ownership**: Boot should run in `gt-deacon-boot`, Deacon in `gt-deacon`. But current behavior is confused.\n\n3. **Handoff**: When Boot starts Deacon, what happens to Boot? Does it exit? Hand off?\n\n4. **Fallback**: Daemon has `checkDeaconHeartbeat()` as belt-and-suspenders. When does this fire vs Boot?\n\n## Options\n\n### Option A: Keep Boot/Deacon separation (fix implementation)\n- Boot is ephemeral, spawns fresh each heartbeat\n- Boot runs in `gt-deacon-boot`, exits after triage\n- Deacon runs in `gt-deacon`, persistent patrol\n- Clear session boundaries, clear lifecycle\n\n### Option B: Merge Boot into Deacon\n- Single `gt-deacon` session\n- Deacon handles its own 'should I be awake?' logic\n- Simpler, fewer moving parts\n- Trade-off: Deacon consumes context even when idle\n\n### Option C: Replace with simpler watchdog\n- Daemon directly monitors witness/refinery\n- No Boot, no Deacon AI agents for health checks\n- Just Go code: if session dead, restart it\n- AI agents only for complex decisions (escalations)\n\n## Recommendation\n\nOption A with clear documentation. The separation has merit for cost control in idle towns. But the implementation needs fixing.","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-02T18:43:34.079036-08:00","updated_at":"2026-01-02T18:54:39.15708-08:00","closed_at":"2026-01-02T18:54:39.15708-08:00","close_reason":"Documentation complete: created docs/watchdog-chain.md with full lifecycle explanation, design decision (keep separation), and cross-references","created_by":"mayor"}
{"id":"gt-189d","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:27","description":"Patrol 14","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:27:18.422413-08:00","updated_at":"2025-12-27T21:26:05.238569-08:00","deleted_at":"2025-12-27T21:26:05.238569-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-18bck","title":"Digest: mol-deacon-patrol","description":"Patrol 7: All agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T01:46:25.162002-08:00","updated_at":"2025-12-27T21:26:03.804969-08:00","deleted_at":"2025-12-27T21:26:03.804969-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-19gw","title":"Digest: mol-deacon-patrol","description":"Patrol: mayor handoff (Batch 1 complete, notifications working), 5 polecats now active (Batch 2)","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-22T23:48:49.79781-08:00","updated_at":"2025-12-27T21:26:05.433385-08:00","deleted_at":"2025-12-27T21:26:05.433385-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-1ar5","title":"Merge: gt-qsvq","description":"branch: polecat/capable\ntarget: main\nsource_issue: gt-qsvq\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-20T07:55:31.049583-08:00","updated_at":"2025-12-27T21:27:22.675732-08:00","deleted_at":"2025-12-27T21:27:22.675732-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-1cuq","title":"Merge: gt-svi.1","description":"type: merge-request\nbranch: polecat/Max\ntarget: main\nsource_issue: gt-svi.1\nrig: gastown","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-18T20:15:31.738938-08:00","updated_at":"2025-12-27T21:29:45.553344-08:00","deleted_at":"2025-12-27T21:29:45.553344-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-1dcji","title":"Digest: mol-deacon-patrol","description":"Patrol 3: routine, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:36:31.299636-08:00","updated_at":"2025-12-27T21:26:00.921308-08:00","deleted_at":"2025-12-27T21:26:00.921308-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-1dm5","title":"Test Patrol Parent","description":"[RESURRECTED] This issue was deleted but recreated as a tombstone to preserve hierarchical structure.\n\nOriginal description:\nTest parent for Christmas Ornament pattern","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-24T21:17:01.744663-08:00","updated_at":"2025-12-27T21:29:57.816902-08:00","deleted_at":"2025-12-27T21:29:57.816902-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-1dvlx","title":"Digest: mol-deacon-patrol","description":"P6: stable, beads polecats exited","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:56:44.405885-08:00","updated_at":"2025-12-27T21:26:02.409194-08:00","deleted_at":"2025-12-27T21:26:02.409194-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-1earc","title":"Digest: mol-deacon-patrol","description":"Patrol 3: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:35:00.204779-08:00","updated_at":"2025-12-27T21:26:02.179781-08:00","deleted_at":"2025-12-27T21:26:02.179781-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-1edmp","title":"Merge: toast-mjw7062w","description":"branch: polecat/toast-mjw7062w\ntarget: main\nsource_issue: toast-mjw7062w\nrig: gastown\nagent_bead: gt-gastown-polecat-toast","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T18:15:08.233495-08:00","updated_at":"2026-01-01T20:32:00.836595-08:00","closed_at":"2026-01-01T20:32:00.8366-08:00","created_by":"gastown/polecats/toast"}
{"id":"gt-1ero","title":"Test message","description":"Test body","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T21:53:03.66658-08:00","updated_at":"2025-12-27T21:29:56.678376-08:00","deleted_at":"2025-12-27T21:29:56.678376-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"message"}
{"id":"gt-1f01","title":"Digest: mol-deacon-patrol","description":"Patrol 8: OK","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T20:58:26.967654-08:00","updated_at":"2025-12-27T21:26:04.960518-08:00","deleted_at":"2025-12-27T21:26:04.960518-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-1f51t","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 6: routine, healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:23:52.631309-08:00","updated_at":"2025-12-27T21:26:01.924968-08:00","deleted_at":"2025-12-27T21:26:01.924968-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-1fl","title":"gt crew restart command","description":"Add a 'gt crew restart' command that kills the tmux session and restarts fresh. Useful when a crew member gets confused or needs a clean slate.\n\nShould:\n- Kill existing tmux session if running\n- Start fresh session with claude\n- Run gt prime to reinitialize context\n\nAlias: gt crew rs","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-17T19:47:32.131386-08:00","updated_at":"2025-12-27T21:29:57.261152-08:00","deleted_at":"2025-12-27T21:29:57.261152-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-1fqwd","title":"Digest: mol-deacon-patrol","description":"Patrol 2 complete: all healthy, no issues","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:58:51.246552-08:00","updated_at":"2025-12-27T21:26:00.677765-08:00","deleted_at":"2025-12-27T21:26:00.677765-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-1gbf","title":"Digest: mol-deacon-patrol","description":"Patrol #5: Routine - 6 sessions healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:22:06.354723-08:00","updated_at":"2025-12-27T21:26:04.793249-08:00","deleted_at":"2025-12-27T21:26:04.793249-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-1gy","title":"gt mail read: Support numeric indices for message ID","description":"Allow 'gt mail read 1' to read the first message in inbox.\n\nCurrent behavior requires full message ID like 'msg-abc123'.\nShould support:\n- Numeric index: 'gt mail read 1' reads first/newest message\n- Partial ID match: 'gt mail read abc' matches 'msg-abc123'\n\nThis is a UX improvement - agents frequently type 'gt mail read 1'.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T21:49:54.60582-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-1gy","depends_on_id":"gt-hw6","type":"blocks","created_at":"2025-12-17T22:22:47.658947-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-1i8r","title":"Digest: mol-deacon-patrol","description":"Patrol 1: All healthy - Mayor OK, 2 witnesses, 2 refineries, 0 polecats, 8 sessions, no callbacks","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:44:43.300311-08:00","updated_at":"2025-12-27T21:26:04.227139-08:00","deleted_at":"2025-12-27T21:26:04.227139-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-1iwk.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. \"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-1iwk\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T00:20:50.854609-08:00","updated_at":"2025-12-27T21:29:55.604313-08:00","deleted_at":"2025-12-27T21:29:55.604313-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-1j6","title":"Document harness concept in docs/harness.md","description":"Create comprehensive harness documentation:\n- What is a harness (installation directory)\n- Recommended structure and naming\n- .beads/redirect for default project\n- config/ contents (rigs.json, town.json)\n- Mayor home vs rig-level mayor/\n- Example configurations\n- Relationship to rigs","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-17T17:15:37.559374-08:00","updated_at":"2025-12-27T21:29:57.280114-08:00","dependencies":[{"issue_id":"gt-1j6","depends_on_id":"gt-cr9","type":"blocks","created_at":"2025-12-17T17:15:51.974059-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.280114-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-1j811","title":"Digest: mol-deacon-patrol","description":"Patrol 11: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T08:13:20.411051-08:00","updated_at":"2026-01-01T08:13:20.411051-08:00","closed_at":"2026-01-01T08:13:20.411017-08:00","dependencies":[{"issue_id":"gt-1j811","depends_on_id":"gt-eph-43mn","type":"parent-child","created_at":"2026-01-01T08:13:20.412269-08:00","created_by":"deacon"}]}
{"id":"gt-1jirq","title":"Session ended: gt-mayor","status":"closed","priority":2,"issue_type":"event","created_at":"2026-01-02T20:53:43.997588-08:00","updated_at":"2026-01-07T10:33:31.838411+13:00","closed_at":"2026-01-04T16:41:00.351083-08:00","close_reason":"Archived session telemetry","created_by":"mayor"}
{"id":"gt-1klr","title":"mol-deacon-patrol","description":"Deacon patrol molecule template. Label: template","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-22T02:08:49.258035-08:00","updated_at":"2025-12-27T21:26:05.347641-08:00","deleted_at":"2025-12-27T21:26:05.347641-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-1kuox","title":"Merge: slit-mjxof5ku","description":"branch: polecat/slit-mjxof5ku\ntarget: main\nsource_issue: slit-mjxof5ku\nrig: gastown\nagent_bead: gt-gastown-polecat-slit","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-02T18:28:12.867214-08:00","updated_at":"2026-01-02T18:29:52.007194-08:00","closed_at":"2026-01-02T18:29:52.007194-08:00","close_reason":"Merged to main at 89785378","created_by":"gastown/polecats/slit"}
{"id":"gt-1l7h","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:47","description":"Patrol 16: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:47:28.476069-08:00","updated_at":"2025-12-27T21:26:05.062875-08:00","deleted_at":"2025-12-27T21:26:05.062875-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-1lvog","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 15: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-28T11:22:37.102371-08:00","updated_at":"2025-12-28T11:22:37.102371-08:00","closed_at":"2025-12-28T11:22:37.102337-08:00"}
{"id":"gt-1meck","title":"Digest: mol-deacon-patrol","description":"Patrol 11","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T09:10:55.708526-08:00","updated_at":"2026-01-01T09:10:55.708526-08:00","closed_at":"2026-01-01T09:10:55.708486-08:00","dependencies":[{"issue_id":"gt-1meck","depends_on_id":"gt-eph-lwvx","type":"parent-child","created_at":"2026-01-01T09:10:55.70977-08:00","created_by":"deacon"}]}
{"id":"gt-1py3y","title":"Unify gt sling and gt spawn with --args support","description":"## Summary\n\nComplete the unification: `gt sling` becomes THE way to assign work. Remove top-level `gt spawn`.\n\n## Current State (Partially Done)\n\n- `gt sling` auto-spawns polecats when target is a rig\n- `gt sling --args` stores args in bead (no-tmux mode, gt-vc3l4)\n- `gt spawn` still exists with flags not yet in sling\n\n## Remaining Work\n\n### 1. Move spawn-only flags to sling\n\n| Flag | Current | Action |\n|------|---------|--------|\n| `--naked` | spawn only | Move to sling |\n| `--create` | spawn only | Move to sling |\n| `--molecule` | spawn only | Move to sling |\n| `--force` | spawn only | Move to sling |\n| `--account` | spawn only | Move to sling |\n| `--no-start` | spawn only | Evaluate if needed |\n\n### 2. Remove top-level gt spawn\n\n- Keep `gt spawn pending` as subcommand if still needed\n- Remove `gt spawn` from root commands\n- Update all docs and CLAUDE.md references\n\n### 3. Unified UX Design\n\nTarget resolution (smart defaults):\n```bash\ngt sling gt-abc # Self (current agent)\ngt sling gt-abc crew # Crew worker in current rig \ngt sling gt-abc gastown # Auto-spawn polecat in rig\ngt sling gt-abc gastown/Toast # Specific polecat\ngt sling gt-abc mayor # Mayor\n```\n\nKey flags:\n```bash\ngt sling gt-abc --args \"patch release\" # Natural language context\ngt sling gt-abc --naked # No-tmux mode (manual start)\ngt sling gt-abc --create # Create polecat if missing\ngt sling gt-abc --molecule mol-review # Use specific workflow\n```\n\n### 4. Update Witness/Refinery/Deacon\n\nThese agents currently use `gt spawn`. Update to use `gt sling`:\n- Witness: spawning polecats for work\n- Refinery: spawning merge workers \n- Deacon: any spawn logic\n\n## Acceptance Criteria\n\n- [ ] All spawn flags available in sling\n- [ ] `gt spawn` removed (or hidden/deprecated)\n- [ ] Witness uses `gt sling \u003crig\u003e` for polecat work\n- [ ] Docs updated\n- [ ] Both human and agent UX is clean","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-26T13:35:55.902844-08:00","updated_at":"2025-12-27T21:29:45.90011-08:00","deleted_at":"2025-12-27T21:29:45.90011-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-1qti","title":"Mayor restart loop: auto-prime and mail check on attach","description":"When mayor session cycles (after /exit, gt mayor attach reconnects), the new session lacks context.\n\n## Current Behavior\n- Mayor exits, shell script loop restarts claude\n- New session starts cold - no gt prime, no mail check\n- Mayor is disoriented, doesn't know prior context\n- Strong 'antimemetic properties' - we discuss fixing it, then forget\n\n## Expected Behavior\nAfter restart, mayor should automatically:\n1. Run gt prime (load role context)\n2. Check gt mail inbox (find handoff messages)\n3. Look for 🤝 HANDOFF messages from predecessor\n\n## Possible Fixes\n1. Startup hook in Claude Code settings that runs gt prime\n2. CLAUDE.md instructions that say 'FIRST THING: run gt prime'\n3. Shell script wrapper that injects context before attach\n4. Modify gt mayor attach to inject prime command\n\n## Related\n- gt-vci: Mayor handoff mail template\n- gt-sye: Mayor startup protocol prompting","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-19T14:44:43.188588-08:00","updated_at":"2025-12-27T21:29:54.025195-08:00","deleted_at":"2025-12-27T21:29:54.025195-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-1s89v","title":"gt crew next: Return self when only window in cycle group","description":"## Problem\n`gt crew next --session 'gt-mayor'` returns exit code 1 when the Mayor is the only window in that cycle group.\n\n## Expected Behavior\nWhen an agent is the only member of its cycle group (or not in any configured cycle group), `gt crew next` should return the current session itself rather than failing.\n\n## Context\n- C-b n in Mayor's session triggers this command\n- Currently only rig crews have configured cycle groups\n- Mayor is not part of any crew cycle group, so it should just return itself\n\n## Fix\nIf no other windows exist in the cycle group, return the current session name instead of error.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-27T16:46:28.657641-08:00","updated_at":"2025-12-27T21:29:45.740079-08:00","created_by":"gastown/crew/max","deleted_at":"2025-12-27T21:29:45.740079-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-1su","title":"Spawn inject: Enter not submitted after multiline paste","description":"When gt spawn injects a multiline context (85+ lines), Claude Code shows \n'[Pasted text #1 +85 lines]' but the prompt is not submitted.\n\nThe tmux SendKeys function does include 'Enter' but it appears to not work\nfor long pasted text:\n\n```go\nfunc (t *Tmux) SendKeys(session, keys string) error {\n _, err := t.run(\"send-keys\", \"-t\", session, keys, \"Enter\")\n return err\n}\n```\n\nPossible fixes:\n1. Use tmux paste-buffer instead of send-keys for long text\n2. Send Enter separately after the text\n3. Use bracketed paste mode correctly\n\nReproduction:\n```bash\ngt spawn gastown/Nux --issue gt-u1j.13 --create\n# Session shows pasted text but waits at prompt\n```","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-17T14:09:20.774203-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"bug"}
{"id":"gt-1wc1l","title":"Digest: mol-deacon-patrol","description":"Patrol 18: clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:17:51.280971-08:00","updated_at":"2025-12-27T21:26:00.954722-08:00","deleted_at":"2025-12-27T21:26:00.954722-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-1wmw","title":"E2E Test: Verify mol-polecat-work spawn","description":"Simple test task for Phase 5 E2E validation","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-24T22:53:38.996878-08:00","updated_at":"2025-12-27T21:29:52.564408-08:00","deleted_at":"2025-12-27T21:29:52.564408-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-1wum6","title":"Digest: mol-deacon-patrol","description":"Patrol 15: Quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:18:51.752483-08:00","updated_at":"2025-12-27T21:26:02.683328-08:00","deleted_at":"2025-12-27T21:26:02.683328-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-1xsah","title":"Add gt role command group with env-based role detection","description":"# gt role Command Group\n\nMake role detection less fragile by using env var injection instead of cwd detection.\n\n## Env Vars\n- `GT_ROLE` - authoritative role name (mayor, witness, refinery, polecat, crew)\n- `GT_ROLE_HOME` - canonical home directory for the role\n\n## Detection Precedence\n1. `$GT_ROLE` env var (authoritative)\n2. Fall back to cwd-based detection (legacy/manual)\n3. Warn if they disagree\n\n## Subcommands\n- `gt role` / `gt role show` - show current role\n- `gt role home [ROLE]` - show home dir for role\n- `gt role detect` - force cwd-based detection (debugging)\n- `gt role list` - list all known roles\n- `gt role env` - print export statements\n\n## Integration Points\n- gt prime - show role, warn on mismatch\n- gt mol status - use gt role instead of own detection\n- gt handoff - preserve GT_ROLE, reset to home\n- All spawners - inject GT_ROLE and GT_ROLE_HOME\n\n## Background\nMayor wandered to refinery dir, handoff preserved it, role detection broke.\nEnv var injection makes role stable across cd operations.","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-25T00:50:53.378051-08:00","updated_at":"2025-12-27T21:29:52.539527-08:00","deleted_at":"2025-12-27T21:29:52.539527-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-1xsbn","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 4: routine, healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:22:42.747113-08:00","updated_at":"2025-12-27T21:26:01.942322-08:00","deleted_at":"2025-12-27T21:26:01.942322-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-202k","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:44","description":"Patrol 6: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:44:33.079069-08:00","updated_at":"2025-12-27T21:26:05.137605-08:00","deleted_at":"2025-12-27T21:26:05.137605-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-20fhw","title":"test-agent-verify","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-28T01:29:55.22014-08:00","updated_at":"2025-12-28T01:30:01.423217-08:00","created_by":"mayor","deleted_at":"2025-12-28T01:30:01.423217-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"agent"}
{"id":"gt-20i1i","title":"Merge: dementus-mjxofwf6","description":"branch: polecat/dementus-mjxofwf6\ntarget: main\nsource_issue: dementus-mjxofwf6\nrig: gastown\nagent_bead: gt-gastown-polecat-dementus","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-02T18:29:04.132018-08:00","updated_at":"2026-01-02T18:30:59.429479-08:00","closed_at":"2026-01-02T18:30:59.429479-08:00","close_reason":"Merged to main at 8f6e2d21","created_by":"gastown/polecats/dementus"}
{"id":"gt-21lh","title":"Polecat template: remove redundant 'Finding Work' section","description":"The 'Finding Work' section (bd ready, bd list, bd show) is general-purpose guidance that may not apply to polecats who are spawned with specific work. Either remove for polecats or clarify this is for discovering additional work during execution.","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-23T16:56:55.230827-08:00","updated_at":"2025-12-27T21:29:57.456286-08:00","dependencies":[{"issue_id":"gt-21lh","depends_on_id":"gt-t9u7","type":"parent-child","created_at":"2025-12-23T16:57:16.772763-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.456286-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-22xxi","title":"Digest: mol-deacon-patrol","description":"Patrol 7: All clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T15:48:54.44952-08:00","updated_at":"2025-12-27T21:26:03.108112-08:00","deleted_at":"2025-12-27T21:26:03.108112-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-23ic","title":"Merge: gt-h6eq.3","description":"branch: polecat/dag\ntarget: main\nsource_issue: gt-h6eq.3\nrig: gastown","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T11:51:16.586184-08:00","updated_at":"2025-12-27T21:27:22.819778-08:00","deleted_at":"2025-12-27T21:27:22.819778-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-24yjs","title":"Digest: mol-deacon-patrol","description":"Patrol 17: Quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:19:27.295939-08:00","updated_at":"2025-12-27T21:26:02.666976-08:00","deleted_at":"2025-12-27T21:26:02.666976-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-25bf","title":"Mail coordination should use town-level database","description":"gt mail send/inbox should use ~/gt/.beads (town root) not rig-local beads. Cross-rig mail coordination (Witness \u003c-\u003e Mayor, polecat \u003c-\u003e Witness) needs to be in a shared location.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-20T07:52:18.374322-08:00","updated_at":"2025-12-27T21:29:53.772781-08:00","deleted_at":"2025-12-27T21:29:53.772781-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-26y8u","title":"Digest: mol-deacon-patrol","description":"Patrol 15: healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-27T20:23:52.904806-08:00","updated_at":"2025-12-27T21:26:00.059966-08:00","deleted_at":"2025-12-27T21:26:00.059966-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-272b","title":"Work on gt-role-template: Refine witness/CLAUDE.md role t...","description":"Work on gt-role-template: Refine witness/CLAUDE.md role template. Run 'bd show gt-role-template' to see the full issue.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T07:44:10.203384-08:00","updated_at":"2025-12-27T21:29:56.814387-08:00","deleted_at":"2025-12-27T21:29:56.814387-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-27zla","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 11: all healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T07:21:11.308384-08:00","updated_at":"2026-01-01T07:21:11.308384-08:00","closed_at":"2026-01-01T07:21:11.308346-08:00"}
{"id":"gt-29k50","title":"Digest: mol-deacon-patrol","description":"Patrol 17: All agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T05:01:26.919989-08:00","updated_at":"2025-12-27T21:26:03.718654-08:00","deleted_at":"2025-12-27T21:26:03.718654-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-2bisz","title":"Digest: mol-deacon-patrol","description":"Patrol 4: All healthy, no messages","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:34:00.699814-08:00","updated_at":"2025-12-27T21:26:02.916168-08:00","deleted_at":"2025-12-27T21:26:02.916168-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-2bz","title":"Swarm learning: Refinery merge queue automation","description":"Manually merging 15 polecat branches was painful and error-prone. Refinery should automate: detect completed work, run tests, merge to main, handle conflicts. This is core Refinery value prop.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T01:21:51.137974-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-2c1","title":"Swarm learning: Spawn should auto-notify polecats","description":"town spawn assigns issues but doesn't notify polecats. Required separate 'town session send' to inject prompts. This should be one atomic operation - spawn assigns AND pokes the polecat to start working.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-16T01:21:47.223608-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-2cd7","title":"Self-test","description":"Testing gt mail works","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T17:55:31.928025-08:00","updated_at":"2025-12-27T21:29:56.719613-08:00","deleted_at":"2025-12-27T21:29:56.719613-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"message"}
{"id":"gt-2dpg","title":"Digest: mol-deacon-patrol","description":"Patrol #8: Stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:32:34.26298-08:00","updated_at":"2025-12-27T21:26:04.334571-08:00","deleted_at":"2025-12-27T21:26:04.334571-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-2ebi","title":"Merge: gt-4ev4","description":"branch: polecat/furiosa\ntarget: main\nsource_issue: gt-4ev4\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T12:36:57.533878-08:00","updated_at":"2025-12-27T21:27:22.559862-08:00","deleted_at":"2025-12-27T21:27:22.559862-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-2exsg","title":"Session ended: gt-refinery","status":"closed","priority":2,"issue_type":"event","created_at":"2026-01-03T11:46:46.06495-08:00","updated_at":"2026-01-07T10:33:31.847851+13:00","closed_at":"2026-01-04T16:41:00.392752-08:00","close_reason":"Archived session telemetry","created_by":"gastown/refinery"}
{"id":"gt-2f3h","title":"Digest: mol-deacon-patrol","description":"Patrol #20: Stable, handoff threshold reached","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:35:50.776172-08:00","updated_at":"2025-12-27T21:26:04.235471-08:00","deleted_at":"2025-12-27T21:26:04.235471-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-2f3z9","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 12: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:27:13.064608-08:00","updated_at":"2025-12-27T21:26:01.871711-08:00","deleted_at":"2025-12-27T21:26:01.871711-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-2ffjo","title":"Digest: mol-deacon-patrol","description":"Patrol 19: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:47:07.877911-08:00","updated_at":"2025-12-27T21:26:00.871228-08:00","deleted_at":"2025-12-27T21:26:00.871228-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-2g15j","title":"Digest: mol-deacon-patrol","description":"Patrol 12: Quick cycle, all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:08:11.988711-08:00","updated_at":"2025-12-27T21:26:02.9593-08:00","deleted_at":"2025-12-27T21:26:02.9593-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-2jl","title":"Add bulk polecat remove command (gt polecat remove --all)","description":"When decommissioning a rig, need to remove multiple polecats one at a time. A --all or --rig flag would allow: gt polecat remove --rig gastown --force","status":"tombstone","priority":3,"issue_type":"feature","created_at":"2025-12-18T11:33:35.206637-08:00","updated_at":"2025-12-27T21:29:57.614219-08:00","deleted_at":"2025-12-27T21:29:57.614219-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-2k4f","title":"mol-polecat-lease","description":"Semaphore tracking a single polecat's lifecycle.\nVars: {{polecat}}, {{issue}}\n\nUsed by Witness to track polecat lifecycle during patrol. The Witness bonds\nthis proto for each active polecat, creating a lease that tracks the polecat\nfrom spawn through work to cleanup.\n\n## Step: boot\nSpawned. Verify it starts working.\n\nCheck if the polecat is alive and working:\n```bash\ngt peek {{polecat}}\n```\n\nIf idle for too long, nudge:\n```bash\ngt nudge {{polecat}} \"Please start working on your assigned issue.\"\n```\n\nTimeout: 60s before escalation to Mayor.\n\n## Step: working\nActively working. Monitor for stuck.\n\nThe polecat is processing its assigned issue ({{issue}}).\nMonitor via peek. Watch for:\n- Progress on commits\n- Status updates in beads\n- SHUTDOWN mail when done\n\nWait for SHUTDOWN signal from the polecat.\nNeeds: boot\n\n## Step: done\nExit received. Ready for cleanup.\n\nThe polecat has completed its work and sent SHUTDOWN.\nPerform cleanup:\n```bash\ngt session kill {{polecat}}\ngt worktree prune {{polecat}}\n```\n\nUpdate beads state and close the lease.\nNeeds: working","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-22T23:41:25.342615-08:00","updated_at":"2025-12-25T11:45:02.114796-08:00","deleted_at":"2025-12-25T11:45:02.114796-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"epic"}
{"id":"gt-2k4f.1","title":"boot","description":"Spawned. Verify the polecat starts working.\n\nCheck if the polecat is alive and working:\n```bash\ngt peek {{polecat}}\n```\n\nIf idle for too long (\u003e60s), nudge:\n```bash\ngt nudge {{polecat}} \"Please start working on your assigned issue.\"\n```\n\nTimeout: 60s before escalation to Mayor.\nVariables: {{polecat}}, {{issue}}","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T23:43:41.517464-08:00","updated_at":"2025-12-25T11:45:02.114796-08:00","dependencies":[{"issue_id":"gt-2k4f.1","depends_on_id":"gt-2k4f","type":"parent-child","created_at":"2025-12-22T23:43:41.517901-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T11:45:02.114796-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-2k4f.2","title":"working","description":"Actively working. Monitor for stuck.\n\nThe polecat is processing its assigned issue ({{issue}}).\nMonitor via peek. Watch for:\n- Progress on commits\n- Status updates in beads\n- SHUTDOWN mail when done\n\nWait for SHUTDOWN signal from the polecat.\nVariables: {{polecat}}, {{issue}}","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T23:43:42.77616-08:00","updated_at":"2025-12-25T11:45:02.114796-08:00","dependencies":[{"issue_id":"gt-2k4f.2","depends_on_id":"gt-2k4f","type":"parent-child","created_at":"2025-12-22T23:43:42.778213-08:00","created_by":"daemon"},{"issue_id":"gt-2k4f.2","depends_on_id":"gt-2k4f.1","type":"blocks","created_at":"2025-12-22T23:43:55.78046-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T11:45:02.114796-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-2k4f.3","title":"done","description":"Exit received. Ready for cleanup.\n\nThe polecat has completed its work and sent SHUTDOWN.\nPerform cleanup:\n```bash\ngt session kill {{polecat}}\ngt worktree prune {{polecat}}\n```\n\nUpdate beads state and close the lease.\nVariables: {{polecat}}, {{issue}}","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T23:43:44.676322-08:00","updated_at":"2025-12-25T11:45:02.114796-08:00","dependencies":[{"issue_id":"gt-2k4f.3","depends_on_id":"gt-2k4f","type":"parent-child","created_at":"2025-12-22T23:43:44.676783-08:00","created_by":"daemon"},{"issue_id":"gt-2k4f.3","depends_on_id":"gt-2k4f.2","type":"blocks","created_at":"2025-12-22T23:43:55.898358-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T11:45:02.114796-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-2mut4","title":"Digest: mol-deacon-patrol","description":"P14","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:26:05.355961-08:00","updated_at":"2025-12-27T21:26:01.626427-08:00","deleted_at":"2025-12-27T21:26:01.626427-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-2n3f","title":"Merge conflicts: Buzzard/mq-status, Dementus/harness-docs","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T12:08:07.486349-08:00","updated_at":"2025-12-27T21:29:54.042016-08:00","deleted_at":"2025-12-27T21:29:54.042016-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-2n6z","title":"Inconsistent error wrapping patterns","description":"Error handling is inconsistent across the codebase:\n\n1. Some functions use fmt.Errorf with %w for wrapping, others don't\n2. return nil, nil patterns (40+ occurrences) sometimes represent 'not found' and sometimes 'error but continue' - should use sentinel errors\n3. Some places return nil for errors when they should propagate them\n\nExamples of nil,nil that might need review:\n- internal/git/git.go:272, 308\n- internal/crew/manager.go:227\n- internal/witness/manager.go:517, 523, 530, 775\n- internal/beads/beads.go:225, 492","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-21T21:35:25.451396-08:00","updated_at":"2025-12-27T21:29:57.891394-08:00","deleted_at":"2025-12-27T21:29:57.891394-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-2o3vq","title":"Merge: rictus-mjxofiub","description":"branch: polecat/rictus-mjxofiub\ntarget: main\nsource_issue: rictus-mjxofiub\nrig: gastown\nagent_bead: gt-gastown-polecat-rictus","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-02T18:54:31.175504-08:00","updated_at":"2026-01-02T18:57:08.979556-08:00","closed_at":"2026-01-02T18:57:08.979556-08:00","close_reason":"Merged to main at 6400f94f","created_by":"gastown/polecats/rictus"}
{"id":"gt-2p2","title":"Test message","description":"Testing GGT mail integration","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-17T14:04:50.045948-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"message"}
{"id":"gt-2r25i","title":"Merge: rictus-mjxc967h","description":"branch: polecat/rictus-mjxc967h\ntarget: main\nsource_issue: rictus-mjxc967h\nrig: gastown\nagent_bead: gt-gastown-polecat-rictus","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-02T12:47:50.532044-08:00","updated_at":"2026-01-02T12:57:30.39268-08:00","closed_at":"2026-01-02T12:57:30.39268-08:00","created_by":"gastown/polecats/rictus"}
{"id":"gt-2r6dt","title":"Test: Polecat lifecycle validation (auto-close)","description":"Simple test task: Read this issue, print 'Lifecycle test passed', then close this issue with bd close. No code changes needed.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-27T18:28:10.052196-08:00","updated_at":"2025-12-27T21:29:45.723582-08:00","created_by":"mayor","deleted_at":"2025-12-27T21:29:45.723582-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-2tp","title":"init.go: Replace custom contains() with strings.Contains","status":"tombstone","priority":3,"issue_type":"bug","created_at":"2025-12-16T13:55:11.326407-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"bug"}
{"id":"gt-2ux","title":"gt uninstall: Clean removal of Gas Town harness","description":"Add 'gt uninstall' command to cleanly remove a Gas Town installation.\n\nShould:\n- Remove harness directory structure\n- Optionally preserve rigs/data with --keep-data flag\n- Warn about running sessions\n- Clean up any global config references","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T21:47:16.175246-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-2ux","depends_on_id":"gt-hw6","type":"blocks","created_at":"2025-12-17T22:22:47.419553-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-2x351","title":"Digest: mol-deacon-patrol","description":"Patrol complete: 12 agents healthy, no lifecycle requests, 1 potential orphan (bd-llfl). Mayor has 6 pending messages.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:07:26.400051-08:00","updated_at":"2025-12-27T21:26:02.335405-08:00","deleted_at":"2025-12-27T21:26:02.335405-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-2x8ch","title":"Digest: mol-deacon-patrol","description":"Patrol 14: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T13:45:00.922407-08:00","updated_at":"2025-12-27T21:26:03.18985-08:00","deleted_at":"2025-12-27T21:26:03.18985-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-2xiv","title":"gt mail inbox doesn't find crew worker mail - identity mismatch","description":"## Problem\n\nCrew workers don't see their handoff messages when running `gt mail inbox` from their working directory.\n\n## Root Cause\n\ngt derives identity from cwd path, but crew workers have a different path structure:\n- **cwd**: `/Users/stevey/gt/beads/crew/dave`\n- **gt derives**: `beads/crew/dave` (wrong)\n- **should be**: `beads/dave` (crew workers use `rig/name` format)\n\nMessages sent to `beads/dave` don't show up because gt is looking for `beads/crew/dave`.\n\n## Workaround\n\nUse explicit identity flag:\n```bash\ngt mail inbox --identity \"beads/dave\"\n```\n\n## Fix\n\nIdentity detection in gt should handle crew/ subdirectory:\n- Path `\u003crig\u003e/crew/\u003cname\u003e` should derive identity as `\u003crig\u003e/\u003cname\u003e`\n- This matches how polecats work: `\u003crig\u003e/polecats/\u003cname\u003e` → `\u003crig\u003e/\u003cname\u003e`\n\n## Affected\n\nAll crew workers in all rigs.","status":"tombstone","priority":0,"issue_type":"bug","created_at":"2025-12-22T00:34:02.290132-08:00","updated_at":"2025-12-27T21:29:45.477862-08:00","deleted_at":"2025-12-27T21:29:45.477862-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-2xsh","title":"Silent error handling with _ = err patterns","description":"Multiple locations intentionally ignore errors with comments like 'Ignore errors'. While some are valid (best-effort operations), others should be audited:\n\n- witness/manager.go:542 - Ignores cmd.Run() error\n- refinery/manager.go:411, 438, 509 - Ignores loadState and git pull errors\n- swarm/manager.go:47 - Ignores getGitHead error\n- polecat/manager.go:69, 284 - Ignores pool.Load and DeleteBranch errors\n- swarm/integration.go:101, 137 - Ignores git pull and push errors\n\nEach should be evaluated: log it, handle it, or confirm ignoring is intentional.","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-21T21:35:11.239439-08:00","updated_at":"2025-12-27T21:29:57.564165-08:00","deleted_at":"2025-12-27T21:29:57.564165-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-2yx57","title":"Digest: mol-deacon-patrol","description":"Patrol 6: Core healthy. Noted Mayor escalation (rig mismatch) - not Deacon work.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:30:49.807463-08:00","updated_at":"2025-12-27T21:26:02.59374-08:00","deleted_at":"2025-12-27T21:26:02.59374-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-2z6s","title":"bd sync: Handle mol-* prefix for protos gracefully","description":"When bd sync imports protos (mol-*), it warns about prefix mismatch. The allowed_prefixes config doesn't seem to apply to imports.\n\nCurrent behavior: bd sync fails import step with prefix mismatch error\nExpected: mol-* should be recognized as valid proto prefix, or allowed_prefixes should be respected\n\nWorkaround: The export/push still succeeds, only import fails.\n\nRelated: bd-47qx","status":"tombstone","priority":3,"issue_type":"bug","created_at":"2025-12-24T14:07:34.525817-08:00","updated_at":"2025-12-27T21:29:57.389955-08:00","deleted_at":"2025-12-27T21:29:57.389955-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-2zfi3","title":"Digest: mol-deacon-patrol","description":"P17: stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:14:39.841949-08:00","updated_at":"2025-12-27T21:26:02.228947-08:00","deleted_at":"2025-12-27T21:26:02.228947-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-30i7v","title":"Merge: nux-mjxn8p5t","description":"branch: polecat/nux-mjxn8p5t\ntarget: main\nsource_issue: nux-mjxn8p5t\nrig: gastown\nagent_bead: gt-gastown-polecat-nux","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-02T18:55:50.699122-08:00","updated_at":"2026-01-02T18:57:59.524744-08:00","closed_at":"2026-01-02T18:57:59.524744-08:00","created_by":"gastown/polecats/nux"}
{"id":"gt-3133","title":"Account management for multi-account Claude Code usage","description":"Enable Gas Town to manage multiple Claude Code accounts (e.g., personal vs work) with easy switching. Core mechanism: CLAUDE_CONFIG_DIR env var per account. See docs/design/account-management.md for full design.","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-23T03:23:35.825288-08:00","updated_at":"2025-12-27T21:29:56.20898-08:00","dependencies":[{"issue_id":"gt-3133","depends_on_id":"gt-58tu","type":"blocks","created_at":"2025-12-23T03:24:37.170307-08:00","created_by":"daemon"},{"issue_id":"gt-3133","depends_on_id":"gt-hs6y","type":"blocks","created_at":"2025-12-23T03:24:37.263123-08:00","created_by":"daemon"},{"issue_id":"gt-3133","depends_on_id":"gt-nq1a","type":"blocks","created_at":"2025-12-23T03:24:37.353483-08:00","created_by":"daemon"},{"issue_id":"gt-3133","depends_on_id":"gt-0c7s","type":"blocks","created_at":"2025-12-23T03:24:37.443494-08:00","created_by":"daemon"},{"issue_id":"gt-3133","depends_on_id":"gt-74a7","type":"blocks","created_at":"2025-12-23T03:24:37.534229-08:00","created_by":"daemon"},{"issue_id":"gt-3133","depends_on_id":"gt-plcg","type":"blocks","created_at":"2025-12-23T03:24:37.625162-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.20898-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-31eg","title":"Epic: Async Gates for Agent Coordination","description":"## Summary\n\nAgents need an async primitive for waiting on external events (CI completion,\nAPI responses, human approval). Currently they either poll wastefully or cant\nresume after handoff.\n\n## Design: Deacon-Managed Gates\n\n### Core Concepts\n\n**Gate** = wisp issue that blocks until external condition is met\n- Type: `gate`\n- Phase: wisp (never synced, ephemeral)\n- Assignee: `deacon/` (Deacon monitors it)\n- Fields: `await_type`, `await_id`, `timeout`, `waiters[]`\n\n**Await Types:**\n- `gh:run:\u003cid\u003e` - GitHub Actions run completion\n- `gh:pr:\u003cid\u003e` - PR merged/closed\n- `timer:\u003cduration\u003e` - Simple delay (e.g., \"5m\", \"1h\")\n- `human:\u003cprompt\u003e` - Human approval required\n- `mail:\u003cpattern\u003e` - Wait for mail matching pattern\n\n### Commands\n\n```bash\nbd gate create --await \u003ctype\u003e:\u003cid\u003e --timeout \u003cduration\u003e --notify \u003caddr\u003e\nbd gate show \u003cid\u003e\nbd gate list\nbd gate close \u003cid\u003e --reason \"completed\"\nbd gate wait \u003cid\u003e --notify \u003caddr\u003e\n```\n\n## Children (suggested breakdown)\n\n1. Add `gate` issue type to beads\n2. Add gate fields: await_type, await_id, timeout, waiters\n3. Implement `bd gate create/show/list/close/wait` commands\n4. Add gate checking to Deacon patrol loop\n5. Implement await type handlers (gh:run, gh:pr, timer, human, mail)\n6. Add gate timeout tracking and notification\n7. Integration test: agent waits for CI via gate\n\n## Open Questions\n\n- Should gates live in wisp storage or main storage with wisp flag?\n- Do we need a gate catalog (like molecule catalog)?\n- Should `waits-for` dep type work with gates?","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-24T19:19:17.774543-08:00","updated_at":"2025-12-27T21:29:52.572647-08:00","deleted_at":"2025-12-27T21:29:52.572647-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-33kxm","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 19: all healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T07:25:17.556636-08:00","updated_at":"2026-01-01T07:25:17.556636-08:00","closed_at":"2026-01-01T07:25:17.556601-08:00"}
{"id":"gt-346","title":"Update harness beads redirect for GGT","description":"Change ~/ai/.beads/redirect from mayor/rigs/gastown/.beads to gastown/mayor/.beads for the GGT directory structure","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T16:42:41.650571-08:00","updated_at":"2025-12-27T21:29:54.261924-08:00","dependencies":[{"issue_id":"gt-346","depends_on_id":"gt-l1o","type":"blocks","created_at":"2025-12-17T16:42:54.495061-08:00","created_by":"daemon"},{"issue_id":"gt-346","depends_on_id":"gt-cr9","type":"blocks","created_at":"2025-12-17T17:15:59.04264-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.261924-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-35s","title":"Architecture: beads config and direct landing docs","description":"Added to architecture.md:\n- Beads multi-agent configuration table (daemon, worktree, sync-branch)\n- ASCII directory layout for non-mermaid rendering\n- Direct landing workflow (bypass merge queue)\n- Design decisions 9 and 10 for direct landing and daemon awareness\n- CLI commands for gt land","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-16T00:29:52.395906-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-361qj","title":"Standardize wisp creation syntax across role templates","description":"## Task\n\nRole templates use inconsistent syntax for creating wisps.\n\n## Current State\n\n**Deacon template:**\n```bash\nbd mol wisp create mol-deacon-patrol\nbd update \u003cwisp-id\u003e --status=pinned --assignee=deacon\n```\n\n**Witness template (BROKEN - see gt-xxx):**\n```bash\nbd mol wisp mol-witness-patrol --assignee={{ .RigName }}/witness\n```\n\n## Recommended Pattern\n\nAll patrol roles should use the same two-step pattern:\n```bash\nbd mol wisp create mol-\u003crole\u003e-patrol\nbd update \u003cwisp-id\u003e --status=pinned --assignee=\u003cidentity\u003e\n```\n\nOr we should add `--assignee` flag to `bd mol wisp` (file in beads repo if preferred).\n\n## Templates to Update\n\n- witness.md.tmpl (broken, needs immediate fix)\n- Verify consistency in refinery.md.tmpl if it also creates wisps\n\n## Related\n\nThis is a consistency task; the witness template bug is tracked separately as P0.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-27T17:12:37.726636-08:00","updated_at":"2025-12-27T21:29:54.687993-08:00","created_by":"gastown/crew/joe","deleted_at":"2025-12-27T21:29:54.687993-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-39cb3","title":"Digest: mol-deacon-patrol","description":"Patrol 1: All agents healthy, no lifecycle requests, cleaned 3 abandoned wisps","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:33:20.837523-08:00","updated_at":"2025-12-27T21:26:02.196074-08:00","deleted_at":"2025-12-27T21:26:02.196074-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-39ej","title":"Digest: mol-deacon-patrol","description":"Patrol #1: No issues found. All 6 agents healthy. No orphans.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:14:35.83023-08:00","updated_at":"2025-12-27T21:26:04.828115-08:00","deleted_at":"2025-12-27T21:26:04.828115-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-39y42","title":"Merge: rictus-mjw3nj1a","description":"branch: polecat/rictus-mjw3nj1a\ntarget: main\nsource_issue: rictus-mjw3nj1a\nrig: gastown\nagent_bead: gt-gastown-polecat-rictus","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T18:45:56.444436-08:00","updated_at":"2026-01-01T18:54:10.976652-08:00","closed_at":"2026-01-01T18:54:10.976652-08:00","created_by":"gastown/polecats/rictus"}
{"id":"gt-3a4","title":"Add gt decommission command for clean swarm/worker shutdown","description":"Single command to cleanly shut down a swarm and its workers: cancel swarm, stop sessions, optionally remove polecats. E.g. gt decommission gt-hw6 --cleanup","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-18T11:33:37.217682-08:00","updated_at":"2025-12-27T21:29:57.214217-08:00","deleted_at":"2025-12-27T21:29:57.214217-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-3abj","title":"go install may fail for new users - private repo","description":"README shows:\n go install github.com/steveyegge/gastown/cmd/gt@latest\n\nThis fails on fresh install if repo is private:\n fatal: could not read Username for 'https://github.com'\n\nREADME should either:\n1. Note the repo is private (if applicable)\n2. Add SSH config instructions\n3. Or provide 'build from source' alternative\n\nThis will be the first command new users try.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-24T12:51:42.10651-08:00","updated_at":"2025-12-27T21:29:52.638592-08:00","dependencies":[{"issue_id":"gt-3abj","depends_on_id":"gt-jo9n","type":"blocks","created_at":"2025-12-24T12:52:05.187114-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.638592-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-3aeuq","title":"Session ended: gt-gastown-nux","status":"closed","priority":2,"issue_type":"event","created_at":"2026-01-03T11:44:42.709442-08:00","updated_at":"2026-01-04T16:41:00.375429-08:00","closed_at":"2026-01-04T16:41:00.375429-08:00","close_reason":"Archived session telemetry","created_by":"gastown/polecats/nux"}
{"id":"gt-3bm5x","title":"Review PR #44: docs: Fix Quick Start to use gt mayor attach","description":"Review and approve/request changes for PR #44. Check docs accuracy. If good, approve with gh pr review --approve.","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-03T11:40:26.239793-08:00","updated_at":"2026-01-03T11:43:38.932648-08:00","closed_at":"2026-01-03T11:43:38.932648-08:00","close_reason":"Approved PR #44 - change is correct","created_by":"mayor"}
{"id":"gt-3cns.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. \"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-3cns\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T22:30:43.381581-08:00","updated_at":"2025-12-27T21:29:55.670828-08:00","deleted_at":"2025-12-27T21:29:55.670828-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3dgv","title":"Digest: mol-deacon-patrol","description":"Patrol #17","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:25:59.591357-08:00","updated_at":"2025-12-27T21:26:04.692327-08:00","deleted_at":"2025-12-27T21:26:04.692327-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3ep6","title":"Test Patrol Parent","description":"[RESURRECTED] This issue was deleted but recreated as a tombstone to preserve hierarchical structure.\n\nOriginal description:\n[RESURRECTED] This issue was deleted but recreated as a tombstone to preserve hierarchical structure.\n\nOriginal description:\nTest parent for Christmas Ornament pattern","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T21:24:17.54048-08:00","updated_at":"2025-12-27T21:29:55.429326-08:00","deleted_at":"2025-12-27T21:29:55.429326-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3fu6z","title":"Digest: mol-deacon-patrol","description":"Patrol 18: Quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:19:51.795226-08:00","updated_at":"2025-12-27T21:26:02.658797-08:00","deleted_at":"2025-12-27T21:26:02.658797-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3htc","title":"Digest: mol-deacon-patrol","description":"Patrol #7","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:23:00.657254-08:00","updated_at":"2025-12-27T21:26:04.776673-08:00","deleted_at":"2025-12-27T21:26:04.776673-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3jlpf","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 7: routine, healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:24:26.295967-08:00","updated_at":"2025-12-27T21:26:01.915897-08:00","deleted_at":"2025-12-27T21:26:01.915897-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3limt","title":"Digest: mol-deacon-patrol","description":"Patrol 5: all clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T11:04:35.041079-08:00","updated_at":"2025-12-27T21:26:03.429705-08:00","deleted_at":"2025-12-27T21:26:03.429705-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3lygr","title":"Digest: mol-deacon-patrol","description":"Patrol 11: Nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:50:46.338114-08:00","updated_at":"2025-12-27T21:26:04.140748-08:00","deleted_at":"2025-12-27T21:26:04.140748-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3mr4k","title":"Test warrant for gt-test-session","description":"target_session: gt-test-session\nreason: UNRESPONSIVE\nrequester: human\nfiled_at: 2026-01-07T20:30:00Z\nstatus: pending","status":"tombstone","priority":2,"issue_type":"chore","created_at":"2026-01-07T20:13:47.047408-08:00","updated_at":"2026-01-07T20:15:04.863975-08:00","created_by":"gastown/polecats/valkyrie","deleted_at":"2026-01-07T20:15:04.863975-08:00","deleted_by":"gastown/polecats/valkyrie","delete_reason":"delete","original_type":"warrant"}
{"id":"gt-3ndj","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:44","description":"Patrol 7: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:44:48.99583-08:00","updated_at":"2025-12-27T21:26:05.129076-08:00","deleted_at":"2025-12-27T21:26:05.129076-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3ns5","title":"Polecat template: clarify work status commands (bd list vs gt mol status)","description":"Template shows 'bd list --status=in_progress' for checking work, but polecats should probably use 'gt mol status' instead (or in addition). Clarify the right approach for polecats to check their current work assignment.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T16:56:46.909036-08:00","updated_at":"2025-12-27T21:29:55.93508-08:00","dependencies":[{"issue_id":"gt-3ns5","depends_on_id":"gt-t9u7","type":"parent-child","created_at":"2025-12-23T16:57:16.371341-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.93508-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3pp","title":"Support numeric shortcuts in mail read (e.g., 'mail read 1')","description":"When inbox shows numbered messages like:\n* 1. gm-19b29031... 2025-12-16 mayor Subject...\n* 2. gm-19b26d51... 2025-12-16 Subject...\n\nUsers should be able to run 'gt mail read 1' instead of needing the full message ID 'gt mail read gm-19b29031f6a172206'.\n\nImplementation:\n- Track inbox message order in display\n- Map numeric indices to actual message IDs\n- Accept both numeric shortcuts and full IDs in 'mail read' command","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-16T13:15:07.857939-08:00","updated_at":"2025-12-27T21:29:57.313593-08:00","deleted_at":"2025-12-27T21:29:57.313593-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-3qw5s","title":"ZFC: Auto-close convoys when tracked issues close via bd close","description":"When bd close closes an issue, check if this issue was tracked by any convoy. If all tracked issues in a convoy are now closed, auto-close the convoy.\n\n## Background\nPR #236 attempted to fix stale convoys by calling gt convoy check in the refinery merge handler. This is a ZFC violation - it couples the refinery to convoy semantics and uses the wrong trigger point.\n\n## Proper ZFC approach\nThe closure event should propagate at the source:\n1. bd close closes an issue\n2. Check if this issue is tracked by any open convoy\n3. For each convoy tracking this issue, check if all tracked issues are now closed\n4. If so, auto-close the convoy\n\n## Implementation notes\n- Add convoy-check logic to bd close command\n- The deacon patrol step check-convoy-completion remains as backup\n\n## Related\n- PR #236 (closed - ZFC violation)\n- Deacon patrol step: check-convoy-completion","status":"open","priority":2,"issue_type":"feature","created_at":"2026-01-07T02:53:47Z","updated_at":"2026-01-07T02:53:47Z","created_by":"gastown/crew/joe"}
{"id":"gt-3r6r5","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 10: all healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T07:20:37.699798-08:00","updated_at":"2026-01-01T07:20:37.699798-08:00","closed_at":"2026-01-01T07:20:37.699764-08:00"}
{"id":"gt-3suf","title":"Test Patrol Parent","description":"Test parent for Christmas Ornament pattern","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T22:51:29.540692-08:00","updated_at":"2025-12-27T21:29:55.662443-08:00","deleted_at":"2025-12-27T21:29:55.662443-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3tbsl","title":"Digest: mol-deacon-patrol","description":"Patrol 8: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T20:32:39.209162-08:00","updated_at":"2026-01-01T20:32:39.209162-08:00","closed_at":"2026-01-01T20:32:39.209129-08:00"}
{"id":"gt-3tcdb","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 6: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:40:03.230009-08:00","updated_at":"2025-12-27T21:26:01.42769-08:00","deleted_at":"2025-12-27T21:26:01.42769-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3tssq.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. \"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-3tssq\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:57:47.407091-08:00","updated_at":"2025-12-27T21:29:55.360003-08:00","deleted_at":"2025-12-27T21:29:55.360003-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3twz","title":"Digest: mol-deacon-patrol","description":"Patrol 2: All systems nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:46:05.137994-08:00","updated_at":"2025-12-27T21:26:04.218842-08:00","deleted_at":"2025-12-27T21:26:04.218842-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3tz","title":"CLI: polecat commands (add, list, wake, sleep, decommission)","description":"GGT is missing most polecat management commands that PGT has.\n\nMissing Commands:\n- gt polecat add \u003crig\u003e \u003cname\u003e - Add polecat to rig (creates clone)\n- gt polecat list [\u003crig\u003e] - List polecats with state\n- gt polecat info \u003cpolecat\u003e - Show detailed info\n- gt polecat wake \u003cpolecat\u003e - Mark available\n- gt polecat sleep \u003cpolecat\u003e - Mark unavailable \n- gt polecat decommission \u003cpolecat\u003e - Remove polecat safely\n\nPGT Reference: gastown-py/src/gastown/cli/polecat_cmd.py\n\nNotes:\n- spawn exists but doesn't cover management\n- wake/sleep are in polecat manager but not CLI\n- decommission should check for uncommitted work","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T14:46:31.326692-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-3u18o","title":"Session ended: gt-gastown-imperator","status":"closed","priority":2,"issue_type":"event","created_at":"2026-01-03T11:48:27.401975-08:00","updated_at":"2026-01-07T10:33:31.890893+13:00","closed_at":"2026-01-04T16:41:37.841247-08:00","close_reason":"Archived","created_by":"gastown/polecats/imperator"}
{"id":"gt-3ubur","title":"Digest: mol-deacon-patrol","description":"Patrol complete: inbox clear, no gates, all agents healthy, no orphans","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:29:06.80036-08:00","updated_at":"2025-12-27T21:26:00.440344-08:00","deleted_at":"2025-12-27T21:26:00.440344-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3uegm","title":"Digest: mol-deacon-patrol","description":"Patrol 9: All green","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:32:24.714703-08:00","updated_at":"2025-12-27T21:26:02.569-08:00","deleted_at":"2025-12-27T21:26:02.569-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3uw5d","title":"Digest: mol-deacon-patrol","description":"Patrol 4: routine, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T13:42:30.550966-08:00","updated_at":"2025-12-27T21:26:03.254966-08:00","deleted_at":"2025-12-27T21:26:03.254966-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3x0z","title":"Epic: Wisp Molecule Integration","description":"## Vision\n\nIntegrate Beads wisp molecules into Gas Town. All orchestration work runs as wisp molecules - patrols, polecat workflows, batch operations. Only digests reach main beads.\n\n## The Steam Engine Metaphor\n\n```\nEngine does work → generates steam\nSteam wisps rise → execution trace\nSteam condenses → digest (distillate) \nSteam dissipates → cleaned up (burned)\n```\n\n## Architecture\n\n```\nProto Molecules (templates)\n ↓ bd mol bond\nWisps (.beads-wisps/ or inline)\n ↓ bd mol squash + AI summary\nMain Beads (digests only)\n```\n\n## Vocabulary\n\n| Term | Meaning |\n|------|---------|\n| bond | Attach proto to work (creates wisps) |\n| wisp | Temporary execution step (steam rising) |\n| squash | Condense wisps into digest |\n| burn | Destroy wisps without record |\n| digest | Permanent condensed record (distillate) |\n\n## Key Design Decisions\n\n1. **Wisp location**: Per-rig or inline in main beads with wisp flag\n2. **Summary generation**: Agent that did work generates summary (inversion of control)\n3. **Squash timing**: Final step of molecule workflow, before signaling done\n4. **Crash recovery**: Wisps persist, Witness detects stalls, new agent resumes\n5. **Patrols**: Each cycle is fresh wisp molecule, squashed on completion\n\n## Digest Contents\n\n- Molecule type and instance ID\n- Assignee, start/end times\n- Source issue reference\n- AI-generated summary\n- Outcomes (issues closed, commits, branches)\n\n## Integration Points\n\n- gt rig init: Configure wisp storage\n- gt spawn --molecule: Bond creates wisps\n- gt prime: Show wisp molecule context\n- Polecat CLAUDE.md: Summary + squash protocol\n- gt doctor: Wisp health checks\n- Deacon/Witness/Refinery: Patrol molecules\n\n## Phases\n\nPhase 1: Wisp setup infrastructure\nPhase 2: Spawn integration\nPhase 3: Completion flow (summary + squash)\nPhase 4: Patrol integration\nPhase 5: Documentation and polish","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-21T14:33:03.15592-08:00","updated_at":"2025-12-27T21:29:53.526711-08:00","deleted_at":"2025-12-27T21:29:53.526711-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-3x0z.1","title":"Phase 1.1: gt rig init creates .beads-ephemeral/","description":"Add ephemeral beads repo creation to rig initialization.\n\n## Implementation\n\nIn `gt rig init` (or equivalent setup):\n1. Create `\u003crig\u003e/.beads-ephemeral/` directory\n2. Initialize as git repo\n3. Create minimal beads config (no sync-branch needed)\n4. Add to .gitignore if not already\n\n## Config\n\n```yaml\n# .beads-ephemeral/config.yaml\nephemeral: true\n# No sync-branch - ephemeral is local only\n```\n\n## Verification\n\n```bash\ngt rig init gastown\nls gastown/.beads-ephemeral/ # Should exist\n```","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T14:33:23.699253-08:00","updated_at":"2025-12-27T21:29:53.518289-08:00","dependencies":[{"issue_id":"gt-3x0z.1","depends_on_id":"gt-3x0z","type":"parent-child","created_at":"2025-12-21T14:33:23.701082-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.518289-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3x0z.11","title":"Phase 5.1: Document ephemeral architecture","description":"Update docs with ephemeral molecule architecture.\n\n## Files to Update\n\n### docs/architecture.md\n- Add Ephemeral Molecules section\n- Diagram: Proto → Ephemeral → Digest flow\n- Explain inversion of control for summaries\n\n### docs/molecules.md (new or update)\n- Proto molecule catalog\n- Ephemeral vs main beads\n- Molecule lifecycle\n- Summary generation guidelines\n\n### Agent CLAUDE.md files\n- Polecat: molecule workflow protocol\n- Deacon: patrol cycle pattern\n- Witness: patrol cycle pattern\n- Refinery: patrol cycle pattern\n\n## Key Concepts to Document\n\n1. Everything is a molecule\n2. Orchestration molecules are ephemeral\n3. Only digests reach main beads\n4. Agents generate their own summaries\n5. Crash recovery via ephemeral persistence","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T14:34:30.074147-08:00","updated_at":"2025-12-27T21:29:56.603525-08:00","dependencies":[{"issue_id":"gt-3x0z.11","depends_on_id":"gt-3x0z","type":"parent-child","created_at":"2025-12-21T14:34:30.075854-08:00","created_by":"daemon"},{"issue_id":"gt-3x0z.11","depends_on_id":"gt-3x0z.10","type":"blocks","created_at":"2025-12-21T14:34:40.812635-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.603525-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3x0z.2","title":"Phase 1.2: Configure bd for ephemeral molecule bonding","description":"Ensure bd mol bond --ephemeral works with Gas Town setup.\n\n## Questions for Dave\n\n1. Does bd automatically find .beads-ephemeral/ or need explicit path?\n2. How does bd mol bond --ephemeral know which repo to use?\n3. Is there a redirect mechanism for ephemeral like main beads?\n\n## Integration\n\nFrom polecat/crew working directory:\n```bash\nbd mol bond mol-polecat-work --ephemeral --assignee $(gt whoami)\n```\n\nShould create molecule in rig's .beads-ephemeral/, not main beads.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T14:33:23.777969-08:00","updated_at":"2025-12-27T21:29:53.509975-08:00","dependencies":[{"issue_id":"gt-3x0z.2","depends_on_id":"gt-3x0z","type":"parent-child","created_at":"2025-12-21T14:33:23.77835-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.509975-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3x0z.3","title":"Phase 1.3: gt doctor checks for ephemeral health","description":"Add doctor checks for ephemeral beads repo.\n\n## Checks\n\n1. **ephemeral-exists**: .beads-ephemeral/ directory exists for each rig\n2. **ephemeral-git**: It's a valid git repo\n3. **ephemeral-orphans**: Molecules started but never squashed (\u003e24h old)\n4. **ephemeral-size**: Warn if ephemeral repo is \u003e100MB (should be cleaned)\n5. **ephemeral-stale**: Molecules with no activity in last hour\n\n## Auto-fix\n\n--fix can:\n- Create missing ephemeral repo\n- Clean up old completed molecules (already squashed)\n- NOT auto-squash incomplete molecules (needs AI summary)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T14:33:23.864314-08:00","updated_at":"2025-12-27T21:29:56.611804-08:00","dependencies":[{"issue_id":"gt-3x0z.3","depends_on_id":"gt-3x0z","type":"parent-child","created_at":"2025-12-21T14:33:23.864678-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.611804-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3x0z.4","title":"Phase 2.1: gt spawn --molecule bonds in ephemeral","description":"Make gt spawn molecule-aware with ephemeral bonding.\n\n## New Flag\n\n```bash\ngt spawn --issue gt-xxx --molecule mol-polecat-work\n```\n\n## Behavior\n\n1. Create polecat with fresh worktree (existing)\n2. Bond molecule in ephemeral: `bd mol bond mol-polecat-work --ephemeral`\n3. Link molecule root to source issue\n4. Include molecule context in work assignment mail\n5. Start session\n\n## Work Assignment Mail\n\n```\nSubject: Work Assignment: Fix lifecycle bug [MOLECULE]\n\nYou are working on gt-rixa as part of molecule mol-polecat-work.\n\nMolecule instance: eph-abc123\nCurrent step: read-assignment (1/8)\n\nFollow the molecule workflow. When complete, generate summary and squash.\n```","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T14:33:45.082722-08:00","updated_at":"2025-12-27T21:29:53.501741-08:00","dependencies":[{"issue_id":"gt-3x0z.4","depends_on_id":"gt-3x0z","type":"parent-child","created_at":"2025-12-21T14:33:45.084965-08:00","created_by":"daemon"},{"issue_id":"gt-3x0z.4","depends_on_id":"gt-3x0z.1","type":"blocks","created_at":"2025-12-21T14:34:40.385365-08:00","created_by":"daemon"},{"issue_id":"gt-3x0z.4","depends_on_id":"gt-3x0z.2","type":"blocks","created_at":"2025-12-21T14:34:40.457259-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.501741-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3x0z.5","title":"Phase 2.2: gt prime shows ephemeral molecule context","description":"Update gt prime to detect and display ephemeral molecule state.\n\n## Detection\n\n1. Check for active ephemeral molecule assigned to current identity\n2. Parse molecule progress (current step, total steps)\n3. Show in prime output\n\n## Output\n\n```\n🔧 Polecat furiosa, checking in.\n\n📦 Molecule: mol-polecat-work (eph-abc123)\n Step 3/8: do-work\n Source: gt-rixa\n Started: 10 minutes ago\n\nRun 'bd mol status' for full molecule state.\n```","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T14:33:45.164431-08:00","updated_at":"2025-12-27T21:29:53.493364-08:00","dependencies":[{"issue_id":"gt-3x0z.5","depends_on_id":"gt-3x0z","type":"parent-child","created_at":"2025-12-21T14:33:45.16487-08:00","created_by":"daemon"},{"issue_id":"gt-3x0z.5","depends_on_id":"gt-3x0z.4","type":"blocks","created_at":"2025-12-21T15:22:46.151626-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.493364-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3x0z.6","title":"Phase 2.3: Polecat CLAUDE.md molecule workflow protocol","description":"Update polecat prompting for molecule-based work.\n\n## CLAUDE.md Updates\n\nAdd section on molecule workflow:\n\n```markdown\n## Molecule Workflow\n\nWhen assigned a molecule (check gt prime output):\n\n1. Follow molecule steps in order\n2. Mark steps complete: bd mol step complete \u003cstep-id\u003e\n3. Before signaling done:\n a. Generate summary of work performed\n b. Run: bd squash \u003cmolecule-root\u003e --summary \"\u003cyour summary\u003e\"\n4. Then signal done as normal\n\n### Summary Guidelines\n\nYour summary should include:\n- What was the task?\n- What did you do?\n- What was the outcome?\n- Any issues or follow-ups?\n\nKeep it to 2-4 sentences. This becomes the permanent record.\n```","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T14:33:45.24122-08:00","updated_at":"2025-12-27T21:29:53.484975-08:00","dependencies":[{"issue_id":"gt-3x0z.6","depends_on_id":"gt-3x0z","type":"parent-child","created_at":"2025-12-21T14:33:45.241575-08:00","created_by":"daemon"},{"issue_id":"gt-3x0z.6","depends_on_id":"gt-3x0z.4","type":"blocks","created_at":"2025-12-21T15:22:47.355729-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.484975-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3x0z.7","title":"Phase 3.1: Summary generation protocol","description":"Define how agents generate summaries for squash.\n\n## The Problem\n\nBeads doesn't make AI calls (inversion of control). Gas Town agents must:\n1. Generate their own summary before calling squash\n2. Pass summary to bd squash command\n\n## Summary Template\n\n```\nTask: \u003csource issue title\u003e\nAction: \u003cwhat was done - fix/implement/refactor/etc\u003e\nOutcome: \u003cresult - tests pass, committed, needs follow-up, etc\u003e\nDetails: \u003c1-2 sentences of specifics if needed\u003e\n```\n\n## Example\n\n```\nTask: Fix lifecycle parser matching bug (gt-rixa)\nAction: Reordered conditional checks in parseLifecycleRequest\nOutcome: Tests passing, committed to polecat/furiosa\nDetails: The 'cycle' keyword was matching 'lifecycle:' prefix. Now checks restart/shutdown first, uses word boundary for cycle.\n```\n\n## Command\n\n```bash\nbd squash eph-abc123 --summary \"Task: Fix lifecycle parser...\"\n```","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T14:34:12.86627-08:00","updated_at":"2025-12-27T21:29:53.476562-08:00","dependencies":[{"issue_id":"gt-3x0z.7","depends_on_id":"gt-3x0z","type":"parent-child","created_at":"2025-12-21T14:34:12.868178-08:00","created_by":"daemon"},{"issue_id":"gt-3x0z.7","depends_on_id":"gt-3x0z.6","type":"blocks","created_at":"2025-12-21T14:34:40.530235-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.476562-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3x0z.8","title":"Phase 3.2: mol-polecat-work squash step","description":"Add summary+squash as final steps in mol-polecat-work.\n\n## Current Steps\n\n1. read-assignment\n2. understand-task\n3. implement-solution\n4. verify-tests\n5. commit-work\n6. signal-done\n\n## Updated Steps\n\n1. read-assignment\n2. understand-task\n3. implement-solution\n4. verify-tests\n5. commit-work\n6. **generate-summary** ← NEW\n7. **squash-molecule** ← NEW\n8. signal-done\n\n## Step Definitions\n\n### generate-summary\nAgent writes a concise summary following the template.\nSave to a local variable or temp file.\n\n### squash-molecule\nRun: bd squash $MOLECULE_ROOT --summary \"$SUMMARY\"\nThis creates digest in main beads, cleans ephemeral.\n\n## Update Location\n\nThis requires updating the mol-polecat-work definition in beads molecule catalog.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T14:34:12.946518-08:00","updated_at":"2025-12-27T21:29:53.468296-08:00","dependencies":[{"issue_id":"gt-3x0z.8","depends_on_id":"gt-3x0z","type":"parent-child","created_at":"2025-12-21T14:34:12.946899-08:00","created_by":"daemon"},{"issue_id":"gt-3x0z.8","depends_on_id":"gt-3x0z.7","type":"blocks","created_at":"2025-12-21T14:34:40.601367-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.468296-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3x0z.9","title":"Phase 4.1: mol-deacon-patrol uses ephemeral","description":"Deacon patrol cycles run as ephemeral molecules.\n\n## Current Deacon Loop\n\n```\nwhile running:\n check_heartbeat()\n check_mail()\n sleep(interval)\n```\n\n## Molecule-Based Loop\n\n```\nwhile running:\n mol = bd mol bond mol-deacon-patrol-cycle --ephemeral\n execute_cycle(mol):\n check_heartbeat()\n check_mail()\n log_status()\n summary = generate_cycle_summary()\n bd squash mol --summary summary\n sleep(interval)\n```\n\n## Benefits\n\n- Each cycle is tracked\n- Digests show daemon health over time\n- Can query: 'show me patrol cycles from last hour'\n- Crash mid-cycle → ephemeral shows where we were","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T14:34:13.026048-08:00","updated_at":"2025-12-27T21:25:59.961175-08:00","dependencies":[{"issue_id":"gt-3x0z.9","depends_on_id":"gt-3x0z","type":"parent-child","created_at":"2025-12-21T14:34:13.026388-08:00","created_by":"daemon"},{"issue_id":"gt-3x0z.9","depends_on_id":"gt-3x0z.8","type":"blocks","created_at":"2025-12-21T14:34:40.672531-08:00","created_by":"daemon"},{"issue_id":"gt-3x0z.9","depends_on_id":"gt-rana.3","type":"blocks","created_at":"2025-12-21T15:20:27.460976-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:25:59.961175-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3x1","title":"Update Refinery to use Beads merge queue","description":"Replace branch discovery with Beads queue in the Refinery module:\n\nCurrent (internal/refinery/manager.go):\n- Scans for polecat/* branches\n- Creates MR objects on-the-fly\n\nNew:\n- Pull from Beads: bd ready --type=merge-request\n- Process each MR\n- Close with merge commit: bd close \u003cid\u003e --reason=\"Merged at \u003csha\u003e\"\n- Handle failures: bd update \u003cid\u003e --status=blocked --reason=\"...\"\n\nThe Engineer (agent) becomes Beads-native.\nThe Refinery (module) provides the infrastructure.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T23:02:37.96436-08:00","updated_at":"2025-12-27T21:29:54.411214-08:00","dependencies":[{"issue_id":"gt-3x1","depends_on_id":"gt-h5n","type":"blocks","created_at":"2025-12-16T23:02:55.812433-08:00","created_by":"daemon"},{"issue_id":"gt-3x1","depends_on_id":"gt-svi","type":"blocks","created_at":"2025-12-16T23:03:12.814463-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.411214-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3x1.1","title":"Engineer main loop: poll for ready merge-requests","description":"Implement the Engineer's main processing loop.\n\nLoop structure:\n1. Query: bd ready --type=merge-request\n2. If empty: sleep(poll_interval), continue\n3. Select highest priority, oldest MR\n4. Claim: bd update \u003cid\u003e --status=in_progress\n5. Process (delegate to other subtasks)\n6. Repeat\n\nConfiguration:\n- poll_interval: from rig config (default 30s)\n- max_concurrent: from rig config (default 1)\n\nThe loop should be interruptible and handle graceful shutdown.\n\nReference: docs/merge-queue-design.md#engineer-processing-loop","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T13:50:57.022367-08:00","updated_at":"2025-12-27T21:29:54.369849-08:00","dependencies":[{"issue_id":"gt-3x1.1","depends_on_id":"gt-3x1","type":"parent-child","created_at":"2025-12-17T13:50:57.024225-08:00","created_by":"daemon"},{"issue_id":"gt-3x1.1","depends_on_id":"gt-svi.1","type":"blocks","created_at":"2025-12-17T13:53:09.832586-08:00","created_by":"daemon"},{"issue_id":"gt-3x1.1","depends_on_id":"gt-svi.2","type":"blocks","created_at":"2025-12-17T13:53:09.9547-08:00","created_by":"daemon"},{"issue_id":"gt-3x1.1","depends_on_id":"gt-h5n.8","type":"blocks","created_at":"2025-12-17T13:53:16.770078-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.369849-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3x1.2","title":"Fetch and conflict check: git operations for MR","description":"Implement git operations for MR processing.\n\nSteps:\n1. git fetch origin \u003cmr.branch\u003e\n2. git checkout \u003cmr.target\u003e (main or integration/xxx)\n3. git merge --no-commit --no-ff \u003cmr.branch\u003e (test merge)\n4. Check for conflicts\n5. If conflicts: abort and return Failure(conflict, files)\n6. If clean: abort (actual merge in next step)\n\nHelper functions:\n- FetchBranch(branch string) error\n- CheckConflicts(source, target string) ([]string, error)\n\nReference: docs/merge-queue-design.md#process-merge-steps","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T13:50:58.99193-08:00","updated_at":"2025-12-27T21:29:54.361591-08:00","dependencies":[{"issue_id":"gt-3x1.2","depends_on_id":"gt-3x1","type":"parent-child","created_at":"2025-12-17T13:50:58.993973-08:00","created_by":"daemon"},{"issue_id":"gt-3x1.2","depends_on_id":"gt-3x1.1","type":"blocks","created_at":"2025-12-17T13:53:10.066159-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.361591-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3x1.3","title":"Merge execution: merge, test, push","description":"Implement the actual merge execution.\n\nSteps:\n1. git checkout \u003cmr.target\u003e\n2. git merge \u003cmr.branch\u003e --no-ff -m 'Merge \u003cbranch\u003e: \u003ctitle\u003e'\n3. If config.run_tests:\n - Run test_command (from config)\n - If failed: git reset --hard HEAD~1, return Failure(tests_failed)\n4. git push origin \u003cmr.target\u003e\n5. Return Success(merge_commit=HEAD)\n\nConfiguration:\n- run_tests: bool (default true)\n- test_command: string (default 'go test ./...')\n\nHandle push failures with retry logic.\n\nReference: docs/merge-queue-design.md#process-merge-steps","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T13:51:00.742994-08:00","updated_at":"2025-12-27T21:29:54.353346-08:00","dependencies":[{"issue_id":"gt-3x1.3","depends_on_id":"gt-3x1","type":"parent-child","created_at":"2025-12-17T13:51:00.744975-08:00","created_by":"daemon"},{"issue_id":"gt-3x1.3","depends_on_id":"gt-3x1.2","type":"blocks","created_at":"2025-12-17T13:53:10.163097-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.353346-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3x1.4","title":"Failure handling: assign back to worker, add labels","description":"Handle merge failures appropriately.\n\nFailure types and actions:\n| Failure | Action |\n|-------------|---------------------------------------------|\n| conflict | Add needs-rebase label, assign to worker |\n| tests_fail | Add needs-fix label, assign to worker |\n| build_fail | Add needs-fix label, assign to worker |\n| flaky_test | Retry once, then treat as tests_fail |\n| push_fail | Retry with backoff, escalate if persistent |\n\nActions:\n1. bd update \u003cid\u003e --status=open --assignee=\u003cworker\u003e\n2. bd update \u003cid\u003e --labels=\u003cfailure-label\u003e\n3. Send mail to worker explaining failure\n4. Log failure details\n\nReference: docs/merge-queue-design.md#handling-failures","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T13:51:17.238066-08:00","updated_at":"2025-12-27T21:29:54.34521-08:00","dependencies":[{"issue_id":"gt-3x1.4","depends_on_id":"gt-3x1","type":"parent-child","created_at":"2025-12-17T13:51:17.240001-08:00","created_by":"daemon"},{"issue_id":"gt-3x1.4","depends_on_id":"gt-3x1.1","type":"blocks","created_at":"2025-12-17T13:53:10.281038-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.34521-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3x1.5","title":"Success handling: close MR, close source issue, cleanup","description":"Handle successful merge completion.\n\nSteps:\n1. Update MR with merge_commit SHA:\n bd update \u003cid\u003e --body='...\\nmerge_commit: \u003csha\u003e'\n2. Close MR with reason:\n bd close \u003cid\u003e --reason='merged'\n3. Close source issue (the work item):\n bd close \u003csource_issue\u003e --reason='Merged in \u003cmr_id\u003e'\n4. Delete source branch (if configured):\n git push origin --delete \u003cmr.branch\u003e\n5. Log success\n\nConfiguration:\n- delete_merged_branches: bool (default true)\n\nReference: docs/merge-queue-design.md#process-merge-steps","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T13:51:19.054425-08:00","updated_at":"2025-12-27T21:29:54.336865-08:00","dependencies":[{"issue_id":"gt-3x1.5","depends_on_id":"gt-3x1","type":"parent-child","created_at":"2025-12-17T13:51:19.056461-08:00","created_by":"daemon"},{"issue_id":"gt-3x1.5","depends_on_id":"gt-3x1.3","type":"blocks","created_at":"2025-12-17T13:53:10.398758-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.336865-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3z6x","title":"Merge: gt-rana.3","description":"branch: polecat/dementus\ntarget: main\nsource_issue: gt-rana.3\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T15:39:59.205846-08:00","updated_at":"2025-12-27T21:27:22.6261-08:00","deleted_at":"2025-12-27T21:27:22.6261-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-3zg","title":"Verify architecture.md shows correct harness and rig structure","description":"Review architecture.md diagrams:\n- Verify town-level structure shows harness correctly\n- Confirm rig-level mayor/rig/ is shown (it appears to be there at line 197)\n- Check mermaid diagrams match ASCII diagrams\n- Update if any inconsistencies found\n- Cross-reference with new harness.md docs","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-17T17:15:45.10268-08:00","updated_at":"2025-12-27T21:29:57.271632-08:00","dependencies":[{"issue_id":"gt-3zg","depends_on_id":"gt-cr9","type":"blocks","created_at":"2025-12-17T17:15:52.090317-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.271632-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-3zgr7","title":"Digest: mol-deacon-patrol","description":"Patrol 6: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T20:29:23.380577-08:00","updated_at":"2026-01-01T20:29:23.380577-08:00","closed_at":"2026-01-01T20:29:23.380541-08:00"}
{"id":"gt-3zw","title":"Policy beads: config in data plane","description":"Use sentinel/policy beads for configuration instead of external config. Examples: daemon notifications on/off, heartbeat intervals. Config lives in the bead graph, can be toggled by closing/opening policy beads.","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-18T18:18:32.857389-08:00","updated_at":"2025-12-27T21:29:57.118519-08:00","deleted_at":"2025-12-27T21:29:57.118519-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-40k6","title":"Merge: gt-qz2l","description":"branch: polecat/dementus\ntarget: main\nsource_issue: gt-qz2l\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T19:39:31.550068-08:00","updated_at":"2025-12-27T21:27:22.393483-08:00","deleted_at":"2025-12-27T21:27:22.393483-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-415l","title":"request-shutdown","description":"Send shutdown request to Witness.\nWait for termination.\n\nThe polecat is now ready to be cleaned up.\nDo not exit directly - wait for Witness to kill the session.\n\nDepends: generate-summary","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T21:56:18.535199-08:00","updated_at":"2025-12-25T15:52:43.900154-08:00","deleted_at":"2025-12-25T15:52:43.900154-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-42whv","title":"Merge: cheedo-mjxpd9go","description":"branch: polecat/cheedo-mjxpd9go\ntarget: main\nsource_issue: cheedo-mjxpd9go\nrig: gastown\nagent_bead: gt-gastown-polecat-cheedo","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-02T18:50:33.469414-08:00","updated_at":"2026-01-02T18:53:36.842822-08:00","closed_at":"2026-01-02T18:53:36.842822-08:00","close_reason":"Merged to main at dd870bb3","created_by":"gastown/polecats/cheedo"}
{"id":"gt-42xs6","title":"Digest: mol-deacon-patrol","description":"Patrol 12","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T09:11:10.298748-08:00","updated_at":"2026-01-01T09:11:10.298748-08:00","closed_at":"2026-01-01T09:11:10.298716-08:00"}
{"id":"gt-439en","title":"Digest: mol-deacon-patrol","description":"Patrol 6: all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:33:47.341888-08:00","updated_at":"2025-12-27T21:26:00.398533-08:00","deleted_at":"2025-12-27T21:26:00.398533-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-43jno","title":"Digest: mol-deacon-patrol","description":"Patrol 7: Routine","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T22:47:26.811125-08:00","updated_at":"2026-01-01T22:47:26.811125-08:00","closed_at":"2026-01-01T22:47:26.81109-08:00"}
{"id":"gt-43pm","title":"Digest: mol-deacon-patrol","description":"Patrol 8: Routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:36:59.981391-08:00","updated_at":"2025-12-27T21:26:04.618196-08:00","deleted_at":"2025-12-27T21:26:04.618196-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-43qg","title":"Test: release command verification","notes":"Released: testing release command","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-19T16:15:30.845537-08:00","updated_at":"2025-12-27T21:29:57.916411-08:00","deleted_at":"2025-12-27T21:29:57.916411-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-43tw","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 1: inbox clear, 8 in-progress issues, sessions healthy, gc not implemented","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-23T13:14:27.653577-08:00","updated_at":"2025-12-27T21:26:05.364275-08:00","deleted_at":"2025-12-27T21:26:05.364275-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4461m","title":"Digest: mol-deacon-patrol","description":"P13","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:25:50.896379-08:00","updated_at":"2025-12-27T21:26:01.637451-08:00","deleted_at":"2025-12-27T21:26:01.637451-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-44wh","title":"Polecats must not create GitHub PRs","description":"Polecats should never use 'gh pr create' or create GitHub pull requests.\n\n## Correct Workflow\n1. Polecat works on polecat/\u003cname\u003e branch\n2. Commits and pushes to origin\n3. Creates beads MR issue (type: merge-request)\n4. Refinery processes the MR and merges to main\n\n## Wrong Workflow\n- Using gh pr create\n- Creating GitHub pull requests directly\n\n## Why\n- Refinery is our merge queue processor\n- GitHub PRs bypass our workflow\n- Beads MRs are the coordination mechanism","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-21T16:40:33.204449-08:00","updated_at":"2025-12-27T21:29:53.417472-08:00","deleted_at":"2025-12-27T21:29:53.417472-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-45eap","title":"Digest: mol-deacon-patrol","description":"Patrol 16: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:35:28.976495-08:00","updated_at":"2025-12-27T21:26:03.878236-08:00","deleted_at":"2025-12-27T21:26:03.878236-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-45m6v","title":"Digest: mol-deacon-patrol","description":"Patrol 12: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:34:10.525118-08:00","updated_at":"2025-12-27T21:26:00.759937-08:00","deleted_at":"2025-12-27T21:26:00.759937-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-469ab","title":"Digest: mol-deacon-patrol","description":"Patrol 17","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T20:05:59.2436-08:00","updated_at":"2025-12-27T21:26:00.644225-08:00","deleted_at":"2025-12-27T21:26:00.644225-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-46wdl","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 15: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:28:50.299576-08:00","updated_at":"2025-12-27T21:26:01.846206-08:00","deleted_at":"2025-12-27T21:26:01.846206-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-47q5v","title":"Digest: mol-deacon-patrol","description":"Patrol 6: routine, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T13:43:08.050036-08:00","updated_at":"2025-12-27T21:26:03.238825-08:00","deleted_at":"2025-12-27T21:26:03.238825-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-47tq","title":"gt spawn should use bd mol run for molecule attachment","description":"Simplify Gas Town to use bd mol run for work tracking.\n\n## Key Insight\nTwo distinct mechanisms, not duplicative:\n\n| Mechanism | Purpose | Query |\n|-----------|---------|-------|\n| **Pinned molecule** (bd mol run) | What am I working on? | `bd list --pinned --assignee=me` |\n| **Handoff mail** | Context notes for restart | `gt mail read` (self-addressed) |\n\nThe handoff is just **mail to yourself** - optional context notes.\nThe molecule is **the actual work** - required state.\n\n## Current State (Overengineered)\nGas Town has custom attachment system:\n- Permanent \"Foo Handoff\" pinned beads per identity\n- AttachMolecule(pinnedBeadID, moleculeID) \n- Attachment fields parsed from description\n- Separate from beads pinning\n\n## New Model (Simplified)\nUse bd mol run directly:\n\n```bash\n# Spawn polecat with molecule\nbd mol run mol-polecat-work --var issue=gt-xyz\n# This: spawns, assigns to caller, pins root, sets in_progress\n```\n\nQuery current work:\n```bash\nbd list --pinned --assignee=gastown/furiosa --status=in_progress\n```\n\nHandoff context (when needed):\n```bash\ngt mail send gastown/furiosa -s \"Context notes\" -m \"Was on step 4...\"\n```\n\n## Changes Required\n\n### Remove from Gas Town\n- AttachMolecule() / DetachMolecule()\n- AttachmentFields struct and parsing\n- GetAttachment() / SetAttachmentFields()\n- Permanent pinned handoff beads per identity\n- Daemon attachment detection (checkDeaconAttachment)\n\n### Update gt spawn\n```go\n// Old: custom molecule instantiation + attachment\n// New: just call bd mol run\ncmd := exec.Command(\"bd\", \"mol\", \"run\", protoID, \"--var\", \"issue=\"+issueID)\n```\n\n### Update gt prime / agent context\n```go\n// Old: find handoff bead, parse attachment\n// New: query for pinned molecule\ncmd := exec.Command(\"bd\", \"list\", \"--pinned\", \"--assignee=\"+identity, \"--status=in_progress\", \"--json\")\n```\n\n### Update documentation\n- Remove handoff bead attachment docs\n- Clarify: handoff = mail, molecule = work\n- Update CLAUDE.md templates\n\n## Benefits\n1. One system for work tracking (beads)\n2. Simpler Gas Town code\n3. bd mol squash works naturally\n4. Handoff is just mail (already works)\n\n## Related\n- gt-3x0z: Wisp Molecule Integration\n- gt-rana: Patrol System\n- gt-lek6: gt rig reset --stale\n- gt-ay1r: gt molecule current","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-21T21:34:21.808261-08:00","updated_at":"2025-12-27T21:29:53.367026-08:00","deleted_at":"2025-12-27T21:29:53.367026-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-480b","title":"Improve test coverage in low-coverage packages","description":"Several packages have low test coverage:\n- internal/cmd: 6.8%\n- internal/mail: 3.6%\n- internal/daemon: 12.1%\n- internal/doctor: 14.5%\n- internal/refinery: 20.6%\n- internal/session: 27.8%\n- internal/git: 28.8%\n\nPriority should be given to mail, cmd, and daemon packages which handle critical functionality.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T21:34:47.807929-08:00","updated_at":"2025-12-27T21:29:56.478961-08:00","deleted_at":"2025-12-27T21:29:56.478961-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-48bs","title":"gt rig reset: clear stale mail on reset/land","description":"## Problem\n\nWhen resetting or landing a rig/town, stale mail messages can confuse agents on startup. Old handoff messages, daemon notifications, and inter-agent mail should be cleaned up as part of reset.\n\n## Current State\n\n- `gt rig reset` exists but doesn't clear mail\n- Stale messages accumulate (e.g., daemon SHUTDOWN messages)\n- Agents may read outdated context on startup\n\n## Proposed Behavior\n\n`gt rig reset` and `gt town reset` should:\n1. Close all open messages (`--type=message`) in the relevant beads\n2. Optionally preserve pinned handoff beads (clear content, keep bead)\n3. Log what was cleaned up\n\n```bash\ngt rig reset gastown # Clears gastown mail\ngt rig reset gastown --mail # Only clear mail, keep other state\ngt town reset # Clears all town-level mail\ngt town reset --all # Clears mail in all rigs too\n```\n\n## Implementation\n\n1. Query `bd list --type=message --status=open`\n2. Close each with reason 'Cleared during reset'\n3. For pinned handoffs: `bd update \u003cid\u003e --description=''` instead of close","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-19T11:42:17.769674-08:00","updated_at":"2025-12-27T21:29:57.010035-08:00","deleted_at":"2025-12-27T21:29:57.010035-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-49xvh","title":"Digest: mol-deacon-patrol","description":"Patrol 16: healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-27T20:24:14.032338-08:00","updated_at":"2025-12-27T21:26:00.043648-08:00","deleted_at":"2025-12-27T21:26:00.043648-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4bi8o","title":"Merge: toast-mjxpchjl","description":"branch: polecat/toast-mjxpchjl\ntarget: main\nsource_issue: toast-mjxpchjl\nrig: gastown\nagent_bead: gt-gastown-polecat-toast","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-02T18:57:11.304676-08:00","updated_at":"2026-01-04T21:30:49.368075+13:00","closed_at":"2026-01-03T11:52:17.967561-08:00","close_reason":"Merged","created_by":"gastown/polecats/toast"}
{"id":"gt-4bqfm","title":"Digest: mol-deacon-patrol","description":"Patrol 16: Nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:52:36.272187-08:00","updated_at":"2025-12-27T21:26:04.099906-08:00","deleted_at":"2025-12-27T21:26:04.099906-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4cfp","title":"plugin-run","description":"Execute plugins from ~/gt/plugins/. Check gates, run eligible plugins.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T13:03:21.516836-08:00","updated_at":"2025-12-25T15:52:43.900154-08:00","deleted_at":"2025-12-25T15:52:43.900154-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-4cit0","title":"Digest: mol-deacon-patrol","description":"Patrol 2: routine, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T13:41:29.957459-08:00","updated_at":"2025-12-27T21:26:03.263086-08:00","deleted_at":"2025-12-27T21:26:03.263086-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4d12b","title":"Digest: mol-deacon-patrol","description":"Patrol 1: All agents healthy, 3 polecats idle, no orphans, 6 Mayor messages pending","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:43:02.596441-08:00","updated_at":"2025-12-27T21:26:02.035976-08:00","deleted_at":"2025-12-27T21:26:02.035976-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4eim","title":"gt nudge should accept flexible session identifiers","description":"Currently `gt nudge` requires the exact tmux session name (e.g., `gt-gastown-furiosa`).\n\nIt should also accept:\n- `gastown/furiosa` (rig/polecat format)\n- `furiosa` (polecat name, infer rig from cwd or require if ambiguous)\n\nThe session list command shows `gastown/furiosa` format, but nudge rejects it:\n```\ngt session list → shows 'gastown/furiosa'\ngt nudge gastown/furiosa 'msg' → 'session not found'\ngt nudge gt-gastown-furiosa 'msg' → works\n```\n\nShould normalize all these formats to the tmux session name internally.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-21T15:36:45.013475-08:00","updated_at":"2025-12-27T21:29:56.595323-08:00","deleted_at":"2025-12-27T21:29:56.595323-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-4ev4","title":"Implement gt sling command","description":"The unified work dispatch command.\n\n```bash\ngt sling \u003cthing\u003e \u003ctarget\u003e [options]\n```\n\nImplements spawn + assign + pin in one operation. See sling-design.md.\n\nAcceptance:\n- [ ] Parse thing (proto name, issue ID, epic ID)\n- [ ] Parse target (agent address) \n- [ ] Spawn molecule if proto\n- [ ] Assign to target agent\n- [ ] Pin to agent's hook (pinned bead)\n- [ ] Support --wisp flag for ephemeral\n- [ ] Support --molecule flag for issue+workflow\n- [ ] Error if hook already occupied (unless --force)","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T03:17:27.273013-08:00","updated_at":"2025-12-27T21:29:53.269071-08:00","deleted_at":"2025-12-27T21:29:53.269071-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4fdi","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:48","description":"Patrol 20: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:48:32.167102-08:00","updated_at":"2025-12-27T21:26:05.029912-08:00","deleted_at":"2025-12-27T21:26:05.029912-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4h1h7","title":"Digest: mol-deacon-patrol","description":"Cycle 19","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-31T23:56:49.813173-08:00","updated_at":"2025-12-31T23:56:49.813173-08:00","closed_at":"2025-12-31T23:56:49.813137-08:00","dependencies":[{"issue_id":"gt-4h1h7","depends_on_id":"gt-eph-be69","type":"parent-child","created_at":"2025-12-31T23:56:49.814344-08:00","created_by":"deacon"}]}
{"id":"gt-4jdrd","title":"Digest: mol-deacon-patrol","description":"Patrol 7: healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-31T20:29:06.932355-08:00","updated_at":"2025-12-31T20:29:06.932355-08:00","closed_at":"2025-12-31T20:29:06.932322-08:00"}
{"id":"gt-4km6p","title":"Digest: mol-deacon-patrol","description":"Patrol 10: Mayor OK, 11 sessions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:36:37.301068-08:00","updated_at":"2025-12-27T21:26:02.130788-08:00","deleted_at":"2025-12-27T21:26:02.130788-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4my","title":"Doctor check: Worker health and stuck detection","description":"Detect and report stuck workers via gt doctor.\n\n## Checks\n\n### WorkerHealthCheck\n- List all active workers (polecats with state=working)\n- Check last activity timestamp for each\n- Flag as potentially stuck if no progress for configurable threshold (default: 30 min)\n- Check if Witness is running for the rig\n- Verify Witness last heartbeat time\n\n### Stuck Detection Criteria\n- Polecat state=working but session not running\n- Polecat state=working but output unchanged for threshold\n- Witness not responding to health checks\n- Multiple polecats in same rig all stuck\n\n## Output\n\n```\n[WARN] Workers in rig 'wyvern' may be stuck:\n - Toast: working for 45m, no recent output\n - Capable: working for 52m, session not found\n - Witness: last heartbeat 20m ago\n \n Suggestions:\n - gt witness status wyvern\n - gt capture wyvern/Toast 50\n - gt stop --rig wyvern (kill all)\n```\n\n## Auto-Fix\n\nCannot auto-fix stuck workers (risk of data loss), but can:\n- Restart Witness daemon if crashed\n- Send warning mail to Mayor","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T23:17:59.265062-08:00","updated_at":"2025-12-27T21:29:54.521246-08:00","dependencies":[{"issue_id":"gt-4my","depends_on_id":"gt-f9x.4","type":"blocks","created_at":"2025-12-15T23:19:05.565606-08:00","created_by":"daemon"},{"issue_id":"gt-4my","depends_on_id":"gt-7ik","type":"blocks","created_at":"2025-12-17T15:44:42.068149-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.521246-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4nn","title":"Molecules: Composable Workflow Beads","description":"## Summary\n\nMolecules are crystallized workflow patterns stored as Beads issues.\nWhen instantiated, the molecule creates child beads forming a DAG.\n\n## Key Insight: Molecules ARE Beads\n\nPer HOP Decision 001: Beads IS the ledger. Molecules don't get a separate YAML format - they're issues with `type: molecule` containing prose-based step definitions.\n\nAgents don't need rigid schemas. They parse natural language natively. A molecule is just instructions with enough structure for tooling.\n\n## Example: Engineer in a Box\n\n```markdown\nid: mol-xyz\ntype: molecule\ntitle: Engineer in a Box\n\nThis workflow takes a task from design to merge.\n\n## Step: design\nThink carefully about architecture. Consider existing patterns, \ntrade-offs, testability.\n\n## Step: implement\nWrite clean code. Follow codebase conventions.\nNeeds: design\n\n## Step: review \nReview for bugs, edge cases, style issues.\nNeeds: implement\n\n## Step: test\nWrite and run tests. Cover happy path and edge cases.\nNeeds: implement\n\n## Step: submit\nSubmit for merge via refinery.\nNeeds: review, test\n```\n\n## Instantiation\n\n```bash\n# Attach molecule when spawning\ngt spawn --issue gt-abc --molecule mol-xyz\n\n# Creates child beads atomically:\ngt-abc.design ← ready first\ngt-abc.implement ← blocked by design \ngt-abc.review ← blocked by implement\ngt-abc.test ← blocked by implement\ngt-abc.submit ← blocked by review, test\n```\n\nEach step issue gets an `instantiated-from` edge to the molecule (with step metadata).\n\n## Why This Matters\n\n1. **Unified data plane**: Everything in Beads, no parallel YAML channel\n2. **AI-native**: Prose instructions, not rigid schemas\n3. **Error isolation**: Each step is a checkpoint - failure doesn't lose progress\n4. **Scales with AI**: As agents get smarter, they handle more complex molecules\n\n## Implementation Primitives\n\n- `ParseMoleculeSteps()`: Extract steps from prose (convention-based)\n- `InstantiateMolecule()`: Atomic transaction creating all steps + edges \n- `instantiated-from` edge type: Track provenance\n- Parameterization: `{{variable}}` substitution from context map","status":"tombstone","priority":0,"issue_type":"epic","created_at":"2025-12-18T18:06:24.573068-08:00","updated_at":"2025-12-27T21:29:45.595678-08:00","deleted_at":"2025-12-27T21:29:45.595678-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-4nn.1","title":"Molecule schema: YAML format for workflow definitions","description":"Define the YAML schema for molecule definitions:\n\n```yaml\nmolecule: \u003cname\u003e\nversion: 1\ndescription: \"Human description\"\nsteps:\n - id: \u003cstep-id\u003e\n title: \"Step title\"\n prompt: \"Instructions for agent\"\n depends: [\u003cother-step-ids\u003e] # optional\n tier: haiku|sonnet|opus # optional, default from config\n timeout: 30m # optional\n```\n\nStore molecules in:\n- `\u003crig\u003e/molecules/\u003cname\u003e.yaml` for rig-specific\n- `\u003ctown\u003e/molecules/\u003cname\u003e.yaml` for town-wide\n\nBuilt-in molecules to ship:\n- engineer-in-box: design→code→review→test→submit\n- quick-fix: implement→test→submit\n- research: investigate→document","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-18T18:06:49.441267-08:00","updated_at":"2025-12-27T21:29:45.587385-08:00","dependencies":[{"issue_id":"gt-4nn.1","depends_on_id":"gt-4nn","type":"parent-child","created_at":"2025-12-18T18:06:49.442723-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.587385-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4nn.2","title":"Molecule instantiation: create child beads from template","description":"When instantiating a molecule on a work bead:\n\n## Transaction Flow\n\n1. Parse molecule's `## Step:` sections from description\n2. Begin SQLite transaction\n3. For each step, create child issue:\n - ID: `{parent-id}.{step-ref}` or generated\n - Title: step title (from header or first line)\n - Description: step prose instructions\n - Type: task\n - Priority: inherit from parent\n4. Add `instantiated-from` edge from each step to molecule:\n ```sql\n INSERT INTO dependencies (issue_id, depends_on_id, type, metadata)\n VALUES (step_id, mol_id, 'instantiated-from', '{\"step\": \"implement\"}');\n ```\n5. Wire inter-step dependencies from `Needs:` lines\n6. Commit transaction (atomic - all or nothing)\n\n## Parsing Conventions\n\n```markdown\n## Step: \u003cref\u003e\n\u003cprose instructions\u003e\nNeeds: \u003cstep\u003e, \u003cstep\u003e # optional\nTier: haiku|sonnet|opus # optional hint\n```\n\n## Parameterization\n\nSteps can have `{{variable}}` placeholders:\n```markdown\n## Step: implement\nImplement {{feature_name}} in {{target_file}}.\n```\n\nContext map provided at instantiation time.\n\n## API\n\n```go\nfunc (s *Store) InstantiateMolecule(mol *Issue, parent *Issue, ctx map[string]string) ([]*Issue, error)\nfunc ParseMoleculeSteps(description string) ([]MoleculeStep, error)\n```\n\nImplementation lives in `internal/beads/molecule.go`.","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-18T18:06:52.071066-08:00","updated_at":"2025-12-27T21:29:45.578981-08:00","dependencies":[{"issue_id":"gt-4nn.2","depends_on_id":"gt-4nn","type":"parent-child","created_at":"2025-12-18T18:06:52.072554-08:00","created_by":"daemon"},{"issue_id":"gt-4nn.2","depends_on_id":"gt-4nn.1","type":"blocks","created_at":"2025-12-18T18:07:02.949242-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.578981-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4nn.3","title":"Molecule CLI: bd molecule commands","description":"Add molecule commands to bd:\n\n## Commands\n\n```bash\nbd molecule list # List molecules (type: molecule)\nbd molecule show \u003cid\u003e # Show molecule with parsed steps\nbd molecule parse \u003cid\u003e # Validate and show parsed structure \nbd molecule instantiate \u003cmol-id\u003e --parent=\u003cissue-id\u003e # Create steps\nbd molecule instances \u003cmol-id\u003e # Show all instantiations\n```\n\n## gt spawn integration\n\n```bash\ngt spawn --issue \u003cid\u003e --molecule \u003cmol-id\u003e\n```\n\nThis should:\n1. Call `bd molecule instantiate` (creates child beads atomically)\n2. Spawn polecat on first ready step\n3. Polecat grinds through via `bd ready`\n\n## Output Examples\n\n```\n$ bd molecule show mol-abc\n\nmol-abc: Engineer in a Box\nType: molecule\n\nSteps (5):\n design → (ready first)\n implement → Needs: design\n review → Needs: implement\n test → Needs: implement \n submit → Needs: review, test\n \nInstances: 3\n```\n\n```\n$ bd molecule instances mol-abc\n\nParent Status Created\ngt-xyz done 2025-12-15\ngt-abc active 2025-12-17 (3/5 complete)\ngt-def pending 2025-12-18\n```","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-18T18:06:53.919884-08:00","updated_at":"2025-12-27T21:29:45.5705-08:00","dependencies":[{"issue_id":"gt-4nn.3","depends_on_id":"gt-4nn","type":"parent-child","created_at":"2025-12-18T18:06:53.921621-08:00","created_by":"daemon"},{"issue_id":"gt-4nn.3","depends_on_id":"gt-4nn.2","type":"blocks","created_at":"2025-12-18T18:07:03.048941-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.5705-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4nn.4","title":"Built-in molecules: engineer-in-box, quick-fix, research","description":"Create built-in molecules as Beads issues:\n\n## engineer-in-box\n\n```markdown\nid: mol-engineer-in-box\ntype: molecule\ntitle: Engineer in a Box\n\nFull workflow from design to merge.\n\n## Step: design\nThink carefully about architecture. Consider:\n- Existing patterns in the codebase\n- Trade-offs between approaches \n- Testability and maintainability\n\nWrite a brief design summary before proceeding.\n\n## Step: implement\nWrite the code. Follow codebase conventions.\nNeeds: design\n\n## Step: review\nSelf-review the changes. Look for:\n- Bugs and edge cases\n- Style issues\n- Missing error handling\nNeeds: implement\n\n## Step: test\nWrite and run tests. Cover happy path and edge cases.\nFix any failures before proceeding.\nNeeds: implement\n\n## Step: submit\nSubmit for merge via refinery.\nNeeds: review, test\n```\n\n## quick-fix\n\n```markdown\nid: mol-quick-fix\ntype: molecule \ntitle: Quick Fix\n\nFast path for small changes.\n\n## Step: implement\nMake the fix. Keep it focused.\n\n## Step: test\nRun relevant tests. Fix any regressions.\nNeeds: implement\n\n## Step: submit\nSubmit for merge.\nNeeds: test\n```\n\n## research\n\n```markdown\nid: mol-research\ntype: molecule\ntitle: Research\n\nInvestigation workflow.\n\n## Step: investigate\nExplore the question. Search code, read docs, \nunderstand context. Take notes.\n\n## Step: document\nWrite up findings. Include:\n- What you learned\n- Recommendations\n- Open questions\nNeeds: investigate\n```\n\n## Storage\n\nBuilt-in molecules live in `\u003ctown\u003e/.beads/` as regular issues.\nCreated during `gt install` or `bd init`.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-18T18:07:04.574565-08:00","updated_at":"2025-12-27T21:29:54.167707-08:00","dependencies":[{"issue_id":"gt-4nn.4","depends_on_id":"gt-4nn","type":"parent-child","created_at":"2025-12-18T18:07:04.576587-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.167707-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4ol8f","title":"gt sling to town roles should use Town beads","description":"## Problem\n\nWhen Mayor starts from ~/gt (town root), `gt mol status` couldn't find beads pinned in rig beads.\n\n## Solution\n\nAdded cross-rig scanning for town-level roles (mayor, deacon):\n1. First check local beads directory \n2. If nothing found AND target is a town-level role, scan all registered rigs\n3. Return first pinned bead found\n\nThis makes `gt mol status` work correctly regardless of where Mayor is when checking hook status.\n\n## Tested\n\n1. `gt sling gt-552hb mayor` from gastown/mayor/rig/ - pins in rig beads ✓\n2. `gt mol status` from ~/gt - finds the pinned bead via cross-rig scan ✓","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-27T17:33:40.334303-08:00","updated_at":"2025-12-27T21:29:45.731796-08:00","created_by":"mayor","deleted_at":"2025-12-27T21:29:45.731796-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-4p1al","title":"Digest: mol-deacon-patrol","description":"P10","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:24:46.065008-08:00","updated_at":"2025-12-27T21:26:01.662897-08:00","deleted_at":"2025-12-27T21:26:01.662897-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4put","title":"execute-actions","description":"Send nudges, process shutdowns, escalate as decided.\n\nNeeds: decide-actions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T16:18:21.59918-08:00","updated_at":"2025-12-25T15:52:58.362049-08:00","deleted_at":"2025-12-25T15:52:58.362049-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4qey","title":"gt mail: Cross-level routing is broken","description":"When Mayor sends mail to rig worker, message lands in wrong beads database. Sender's beads vs recipient's.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-20T17:57:35.617292-08:00","updated_at":"2025-12-27T21:29:53.652416-08:00","deleted_at":"2025-12-27T21:29:53.652416-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-4qiqp","title":"Digest: mol-deacon-patrol","description":"Patrol 5: All clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:01:10.250364-08:00","updated_at":"2025-12-27T21:26:04.034526-08:00","deleted_at":"2025-12-27T21:26:04.034526-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4ry6","title":"Digest: mol-deacon-patrol","description":"Patrol 3: Quick scan, stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:06:31.826181-08:00","updated_at":"2025-12-27T21:26:04.490888-08:00","deleted_at":"2025-12-27T21:26:04.490888-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4s6us","title":"Session ended: gt-gastown-nux","status":"closed","priority":2,"issue_type":"event","created_at":"2026-01-03T11:53:34.025957-08:00","updated_at":"2026-01-04T16:41:37.860603-08:00","closed_at":"2026-01-04T16:41:37.860603-08:00","close_reason":"Archived","created_by":"gastown/polecats/nux"}
{"id":"gt-4tfp","title":"Digest: mol-deacon-patrol","description":"Patrol #5: Stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:31:29.654233-08:00","updated_at":"2025-12-27T21:26:04.359139-08:00","deleted_at":"2025-12-27T21:26:04.359139-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4u5z","title":"Refinery as Worktree: Local MR Integration","description":"Move refinery from separate clone to git worktree. Polecats stop pushing to origin - refinery sees their branches locally. MRs become wisps (ephemeral). Only main gets pushed after merge.\n\n## Goals\n- Origin stays clean (only main + beads-sync branches)\n- No orphaned polecat branches ever\n- Simpler mental model for MR coordination\n- Faster integration (no network for local MR ops)\n\n## Trade-offs\n- Machine crash = redo pending work (acceptable, beads track state)\n- Future federation needs different approach (bundles or integration branch)\n\n## Components Affected\n- Rig initialization (refinery setup)\n- Spawn/polecat workflow (remove origin push)\n- Refinery manager (local branch access)\n- Documentation (architecture, workflows)\n- Molecule templates (remove push instructions)","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-23T20:24:56.517669-08:00","updated_at":"2025-12-27T21:29:55.753929-08:00","deleted_at":"2025-12-27T21:29:55.753929-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-4u5z.1","title":"Update rig init to create refinery as worktree","description":"Modify rig initialization to create refinery as a git worktree instead of a separate clone.\n\n## Changes\n- internal/rig/init.go or similar - change refinery setup\n- Create worktree at \u003crig\u003e/refinery/ with branch 'refinery'\n- Remove clone logic for refinery\n- Update any path assumptions\n\n## Considerations\n- Refinery worktree should track main (or its own branch?)\n- May need to handle existing rigs (migration path)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T20:25:25.927013-08:00","updated_at":"2025-12-27T21:29:55.745567-08:00","dependencies":[{"issue_id":"gt-4u5z.1","depends_on_id":"gt-4u5z","type":"parent-child","created_at":"2025-12-23T20:25:25.92745-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.745567-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4u5z.2","title":"Remove origin push from polecat workflow","description":"Polecats should no longer push their branches to origin.\n\n## Changes\n- internal/cmd/done.go - remove 'git push origin HEAD'\n- internal/cmd/mq_submit.go - remove branch push\n- Molecule templates - remove push instructions\n- Any hooks that push polecat branches\n\n## Verification\n- Polecat completes work without pushing\n- Branch stays local\n- Refinery can still see the branch (via shared .git)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T20:25:27.079456-08:00","updated_at":"2025-12-27T21:29:55.737217-08:00","dependencies":[{"issue_id":"gt-4u5z.2","depends_on_id":"gt-4u5z","type":"parent-child","created_at":"2025-12-23T20:25:27.081989-08:00","created_by":"daemon"},{"issue_id":"gt-4u5z.2","depends_on_id":"gt-4u5z.1","type":"blocks","created_at":"2025-12-23T20:25:43.547484-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.737217-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4u5z.3","title":"Update refinery to read local branches","description":"Refinery manager needs to access polecat branches locally instead of fetching from origin.\n\n## Changes\n- internal/refinery/manager.go - remove origin fetch for MR branches\n- internal/refinery/engineer.go - local branch access\n- Remove DeleteRemoteBranch calls (branches are local now)\n- Update merge workflow to work with local refs\n\n## Key insight\nAll worktrees share the same .git - refinery can see polecat/nux branch directly.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T20:25:28.500632-08:00","updated_at":"2025-12-27T21:29:55.72891-08:00","dependencies":[{"issue_id":"gt-4u5z.3","depends_on_id":"gt-4u5z","type":"parent-child","created_at":"2025-12-23T20:25:28.502671-08:00","created_by":"daemon"},{"issue_id":"gt-4u5z.3","depends_on_id":"gt-4u5z.1","type":"blocks","created_at":"2025-12-23T20:25:43.633822-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.72891-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4u5z.4","title":"Convert MR queue to wisp storage","description":"MR queue should use wisp storage (.beads-wisp/) instead of durable beads.\n\n## Design (Simplified)\n\nWith refinery as worktree, polecats and refinery share the same machine. No cross-machine MR coordination needed. Wisps are the natural fit.\n\n### Storage\n- Location: `.beads-wisp/mq/\u003cmr-id\u003e.json`\n- Format: Same MRFields structure (branch, target, source_issue, worker, rig)\n- Lifecycle: Create on submit, delete after merge\n\n### Code Changes\n\n1. **MR Creation** (done.go, mq_submit.go)\n - Write to `.beads-wisp/mq/` instead of `bd create --type=merge-request`\n - Generate ID: `mr-\u003ctimestamp\u003e-\u003crandom\u003e` or similar\n\n2. **Refinery Query** (refinery/manager.go, engineer.go)\n - Read from `.beads-wisp/mq/` directory\n - List files, parse JSON, process in order\n\n3. **Cleanup**\n - Delete MR file after successful merge\n - On failure, leave file (refinery retries)\n\n### What Stays in Beads\n- Source issues (the work being merged) - still tracked in beads\n- Merge commits reference source issue ID\n- `bd close \u003cissue\u003e` still happens on merge\n\n### Benefits\n- No sync overhead for transient MR state\n- Simpler model (file per MR, delete when done)\n- Consistent with wisp philosophy (ephemeral operational state)\n\n### Not Needed\n- Digest squashing (MRs are deleted, not accumulated)\n- Cross-machine visibility (single-machine architecture)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T20:25:29.716788-08:00","updated_at":"2025-12-27T21:29:55.720635-08:00","dependencies":[{"issue_id":"gt-4u5z.4","depends_on_id":"gt-4u5z","type":"parent-child","created_at":"2025-12-23T20:25:29.717201-08:00","created_by":"daemon"},{"issue_id":"gt-4u5z.4","depends_on_id":"gt-4u5z.3","type":"blocks","created_at":"2025-12-23T20:25:43.719801-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.720635-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4u5z.5","title":"Update architecture documentation","description":"Update all documentation to reflect refinery-as-worktree model.\n\n## Files to update\n- docs/architecture.md (if exists)\n- CLAUDE.md references to refinery\n- mayor/rig/docs/*.md - any refinery docs\n- wisp-architecture.md - add MR wisp details\n- README files mentioning clone vs worktree\n\n## Key messages\n- Refinery is worktree, not clone\n- Polecat branches stay local\n- Only main pushed to origin\n- MRs are wisps (ephemeral)","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-23T20:25:31.192025-08:00","updated_at":"2025-12-27T21:29:57.4065-08:00","dependencies":[{"issue_id":"gt-4u5z.5","depends_on_id":"gt-4u5z","type":"parent-child","created_at":"2025-12-23T20:25:31.194085-08:00","created_by":"daemon"},{"issue_id":"gt-4u5z.5","depends_on_id":"gt-4u5z.2","type":"blocks","created_at":"2025-12-23T20:25:43.819111-08:00","created_by":"daemon"},{"issue_id":"gt-4u5z.5","depends_on_id":"gt-4u5z.3","type":"blocks","created_at":"2025-12-23T20:25:43.907113-08:00","created_by":"daemon"},{"issue_id":"gt-4u5z.5","depends_on_id":"gt-4u5z.4","type":"blocks","created_at":"2025-12-23T20:25:43.99734-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.4065-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4u5z.6","title":"Design future federation model","description":"Document how remote/federated refinery would work in the future.\n\n## Options to document\n1. Git bundles - portable patches, no branches on origin\n2. Dedicated integration branch pattern (integration/*)\n3. Hybrid - local for same-machine, bundles for remote\n\n## Deliverable\nDesign doc in docs/federation-refinery.md or similar.\nNot implementing now - just capturing the design for when needed.","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-23T20:25:32.265488-08:00","updated_at":"2025-12-27T21:29:57.874672-08:00","dependencies":[{"issue_id":"gt-4u5z.6","depends_on_id":"gt-4u5z","type":"parent-child","created_at":"2025-12-23T20:25:32.267467-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.874672-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4u5z.7","title":"Add migration path for existing rigs","description":"Existing rigs have refinery as clone. Need migration path.\n\n## Options\n1. Manual: User runs 'gt rig migrate' or similar\n2. Automatic: Detect clone, convert to worktree on next start\n3. Deprecation: Old rigs work but warn, new rigs use worktree\n\n## Considerations\n- Don't break existing setups\n- Clear upgrade path\n- Handle in-flight MRs during migration","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-23T20:25:33.746893-08:00","updated_at":"2025-12-27T21:29:57.398197-08:00","dependencies":[{"issue_id":"gt-4u5z.7","depends_on_id":"gt-4u5z","type":"parent-child","created_at":"2025-12-23T20:25:33.749766-08:00","created_by":"daemon"},{"issue_id":"gt-4u5z.7","depends_on_id":"gt-4u5z.1","type":"blocks","created_at":"2025-12-23T20:25:44.086388-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.398197-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4ulgd","title":"Digest: mol-deacon-patrol","description":"Patrol 11: healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-27T20:22:28.586171-08:00","updated_at":"2025-12-27T21:26:00.124981-08:00","deleted_at":"2025-12-27T21:26:00.124981-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4v1eo","title":"Ephemeral protos: Remove proto-as-bead storage, cook inline","description":"## Summary\n\nRefactor the molecular chemistry stack so protos are ephemeral in-memory data structures, not persisted beads with `[template]` labels.\n\n## Current State\n\n```\nFormula (.formula.json) → cook → Proto (bead with [template]) → pour/wisp → Mol/Wisp\n```\n\nProtos are stored as beads issues with `labels: [\"template\"]`, polluting the DB with template definitions that duplicate the formula files.\n\n## Target State\n\n```\nFormula (.formula.json) → pour/wisp (cook inline) → Mol/Wisp\n```\n\nProtos exist only as in-memory data structures during the cook→pour/wisp pipeline. Formula catalog IS the proto library.\n\n## Changes Required\n\n1. **Remove template bead storage** - `bd cook` no longer writes to DB\n2. **`bd cook` becomes preview** - outputs proto JSON to stdout (like --dry-run)\n3. **`bd pour`/`bd wisp` cook inline** - take formula name, cook on the fly\n4. **`bd mol bond` accepts formula names** - cooks inline when bonding templates\n5. **Remove `[template]` label handling** - no more template beads\n6. **Update docs** - molecular-chemistry.md, molecule-algebra.md\n\n## Benefits\n\n- No DB pollution from template beads\n- Single source of truth (formulas in .beads/formulas/)\n- Simpler mental model: formulas → instances\n- Faster (no DB round-trip for protos)\n\n## Deferred\n\nStaged parameterization (proto currying) - punt until someone needs it:\n```bash\nbd cook mol-feature --var team=backend # Partially bound proto\nbd pour \u003cproto-id\u003e --var priority=high # Further binding\n```\n\n## Depends On\n\n- gt-8tmz (Molecule Algebra) core work should be mostly complete first\n\n## Context\n\nDiscussion: Mayor session 2025-12-25, ultrathink on proto storage","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-25T14:27:32.801474-08:00","updated_at":"2025-12-27T21:29:52.514747-08:00","dependencies":[{"issue_id":"gt-4v1eo","depends_on_id":"gt-8tmz","type":"blocks","created_at":"2025-12-25T14:27:37.721843-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.514747-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-4vp3p","title":"Digest: mol-deacon-patrol","description":"Patrol 4: all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T21:53:45.552863-08:00","updated_at":"2025-12-27T21:26:00.584711-08:00","deleted_at":"2025-12-27T21:26:00.584711-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4vuh8","title":"Digest: mol-deacon-patrol","description":"P15: stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:13:56.743826-08:00","updated_at":"2025-12-27T21:26:02.245173-08:00","deleted_at":"2025-12-27T21:26:02.245173-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4vw2j","title":"Digest: mol-deacon-patrol","description":"Patrol 2: All healthy - no changes from cycle 1","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:28:05.581568-08:00","updated_at":"2025-12-27T21:26:02.626262-08:00","deleted_at":"2025-12-27T21:26:02.626262-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4x1bi","title":"Digest: mol-deacon-patrol","description":"Patrol 20: Final cycle, all services healthy, handoff","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:53:50.12613-08:00","updated_at":"2025-12-27T21:26:01.9941-08:00","deleted_at":"2025-12-27T21:26:01.9941-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4xas","title":"Merge fix/spawn-beads-path branch to main","description":"Branch has gt nudge command; main has notification deduplication. Both valuable.\n\nKey insight: Only ~10 commits diverged each way in mayor/rig.\n- Branch: 9509afa feat(nudge): Add gt nudge command (MISSING from main)\n- Main: d2fccd5 slot-based notification deduplication (KEEP)\n\nMerge plan:\n1. git merge fix/spawn-beads-path --no-ff\n2. Keep nudge.go from branch\n3. Keep notification.go from main\n4. Merge tmux.go (both NudgeSession AND SendKeysReplace)\n5. go build \u0026\u0026 go test\n6. Deploy to ~/.local/bin/gt","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-20T13:28:01.684557-08:00","updated_at":"2025-12-27T21:29:53.755388-08:00","deleted_at":"2025-12-27T21:29:53.755388-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4xgth","title":"Digest: mol-deacon-patrol","description":"Patrol: fixed stale mol spawn prompts, cleaned 36 orphaned wisps, all agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T20:26:09.458718-08:00","updated_at":"2025-12-27T21:26:00.618512-08:00","deleted_at":"2025-12-27T21:26:00.618512-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4z54b","title":"Digest: mol-deacon-patrol","description":"Patrol 5: Nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:47:58.060935-08:00","updated_at":"2025-12-27T21:26:04.193847-08:00","deleted_at":"2025-12-27T21:26:04.193847-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-4z7j","title":"Test patrol queue feature","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-23T16:17:34.358185-08:00","updated_at":"2025-12-27T21:29:57.472767-08:00","deleted_at":"2025-12-27T21:29:57.472767-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-50665","title":"Digest: mol-deacon-patrol","description":"Cycle 7","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T10:00:57.642218-08:00","updated_at":"2026-01-01T10:00:57.642218-08:00","closed_at":"2026-01-01T10:00:57.64218-08:00"}
{"id":"gt-519fu","title":"Digest: mol-deacon-patrol","description":"Patrol 7","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T08:50:29.743717-08:00","updated_at":"2026-01-01T08:50:29.743717-08:00","closed_at":"2026-01-01T08:50:29.743679-08:00"}
{"id":"gt-51x","title":"Fix golangci-lint errcheck warnings (~160 issues)","description":"Running golangci-lint shows ~160 errcheck warnings for unchecked error returns.\n\nCommon patterns:\n- t.SetEnvironment() return values\n- os.WriteFile(), os.RemoveAll() \n- MarkFlagRequired() on cobra commands\n- Various manager methods\n\nRun: golangci-lint run ./...\n\nCould batch fix with:\n1. Add explicit _ = for intentionally ignored errors\n2. Handle errors properly where they matter\n3. Consider adding //nolint:errcheck for cobra flag setup","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-17T15:02:39.807659-08:00","updated_at":"2025-12-27T21:29:57.296988-08:00","deleted_at":"2025-12-27T21:29:57.296988-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-52fw","title":"Digest: mol-deacon-patrol","description":"Patrol: 2 completions (valkyrie gt-yd98 MQ, scrotus gt-mzal.1 boot proto). 8 polecats working.","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-23T00:32:25.162966-08:00","updated_at":"2025-12-27T21:26:05.413683-08:00","deleted_at":"2025-12-27T21:26:05.413683-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-53w6","title":"Witness MVP: Automated Polecat Lifecycle","description":"Implement the Witness agent - per-rig 'pit boss' that manages polecat lifecycles.\n\nThe Witness enables hands-free swarming by automating:\n- Spawning polecats for ready work\n- Monitoring worker health\n- Processing shutdown requests \n- Cleaning up worktrees when done\n- Escalating stuck workers to Mayor\n\nWithout Witness, humans must manually spawn, monitor, and kill each polecat.\n\n## Core Loop\n\n```\nwhile True:\n # Handle pending shutdowns\n for polecat in polecats where state == pending_shutdown:\n verify git clean\n kill session \n remove worktree\n delete branch\n \n # Spawn for ready work\n ready = bd ready --parent=\u003cepic\u003e if epic else bd ready\n for issue in ready:\n if active_workers \u003c max_workers:\n gt spawn --issue \u003cid\u003e\n \n # Check worker health\n for polecat in active polecats:\n if stuck (no progress for 30 min):\n nudge or escalate\n \n sleep 60\n```\n\n## Blocking Bugs to Fix\n- gt-dsfi: handoff deadlock\n- gt-n7z7: refinery foreground race condition\n- gm-c6b: mail coordination (town vs rig beads)","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-20T03:13:45.075731-08:00","updated_at":"2025-12-27T21:29:53.840035-08:00","deleted_at":"2025-12-27T21:29:53.840035-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-54b2v","title":"Test","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-30T22:24:31.136019-08:00","updated_at":"2025-12-30T22:24:43.616236-08:00","created_by":"gastown/polecats/furiosa","deleted_at":"2025-12-30T22:24:43.616236-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"agent"}
{"id":"gt-54kn","title":"Test: New Router","description":"Testing bd create for mail","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T21:57:10.688733-08:00","updated_at":"2025-12-25T14:12:42.293448-08:00","deleted_at":"2025-12-25T14:12:42.293448-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"message"}
{"id":"gt-55e0w","title":"Digest: mol-deacon-patrol","description":"Patrol 15: Quiet","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-31T19:07:05.949941-08:00","updated_at":"2025-12-31T19:07:05.949941-08:00","closed_at":"2025-12-31T19:07:05.949902-08:00","dependencies":[{"issue_id":"gt-55e0w","depends_on_id":"gt-eph-9q0u","type":"parent-child","created_at":"2025-12-31T19:07:05.951067-08:00","created_by":"deacon"}]}
{"id":"gt-56fv","title":"Merge: gt-5af.2","description":"branch: polecat/Doof\ntarget: main\nsource_issue: gt-5af.2\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T17:36:24.884931-08:00","updated_at":"2025-12-27T21:27:22.708874-08:00","deleted_at":"2025-12-27T21:27:22.708874-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-56po","title":"Merge: gt-g44u.2","description":"branch: polecat/Doof\ntarget: main\nsource_issue: gt-g44u.2\nrig: gastown","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-19T16:03:10.388461-08:00","updated_at":"2025-12-27T21:27:22.385154-08:00","deleted_at":"2025-12-27T21:27:22.385154-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-56u5","title":"Digest: mol-deacon-patrol","description":"Patrol 11: Routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:37:39.062293-08:00","updated_at":"2025-12-27T21:26:04.593441-08:00","deleted_at":"2025-12-27T21:26:04.593441-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-56uwe","title":"Digest: mol-deacon-patrol","description":"Patrol 7: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T10:17:06.082327-08:00","updated_at":"2025-12-27T21:26:03.56871-08:00","deleted_at":"2025-12-27T21:26:03.56871-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-57f5s","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 2: all healthy, no action required","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:20:43.531839-08:00","updated_at":"2025-12-27T21:26:01.959606-08:00","deleted_at":"2025-12-27T21:26:01.959606-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-58it","title":"bd sync --from-main: fresh sync from main branch","description":"## Summary\n\nAdd `--from-main` flag to `bd sync` to pull fresh beads state from main branch.\n\n## Use Case\n\nWhen starting a patrol or fresh session, agent wants clean beads state:\n```bash\nbd sync --from-main # Pull latest from origin/main, ignore local changes\n```\n\n## Behavior\n\n1. Fetch origin/main\n2. Reset local .beads/ to match origin/main\n3. Rebuild SQLite DB from fresh JSONL\n\nUseful for polecats/crew starting work to avoid stale state.","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-23T01:19:14.352299-08:00","updated_at":"2025-12-27T21:29:56.258925-08:00","deleted_at":"2025-12-27T21:29:56.258925-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-58tu","title":"Add accounts.yaml config parsing to gt","description":"Parse ~/gt/mayor/accounts.yaml with structure: accounts map (handle -\u003e email, config_dir) and default field. This is the foundational config that other account features depend on. Location follows existing town-level config pattern in mayor/.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T03:24:16.934245-08:00","updated_at":"2025-12-27T21:29:56.20062-08:00","deleted_at":"2025-12-27T21:29:56.20062-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-58yb","title":"Digest: mol-deacon-patrol","description":"Patrol 4: Stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:07:04.211464-08:00","updated_at":"2025-12-27T21:26:04.482598-08:00","deleted_at":"2025-12-27T21:26:04.482598-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-59p","title":"Design GGT prompt architecture","description":"Audit PGT prompts and design canonical prompt system for GGT. Create docs/prompts.md with inventory, gap analysis, and Witness prompt design.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T00:46:16.916031-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-59zd","title":"Molecule-based Witness Patrol","description":"Wire up mol-witness-patrol as a tracking ledger for the Go-based witness.\n\n## Key Insight\n\nThe molecule is NOT an executor - it's a **tracking structure**. The existing\nGo code in internal/witness/manager.go continues to run the patrol loop.\nThe molecule instance provides visibility, audit trail, and polecat tracking.\n\n## Current State\n\n- Witness runs as Go code with direct patrol logic\n- mol-witness-patrol and mol-polecat-arm exist as molecule definitions\n- gt mol bond creates dynamic children\n- No molecule tracking of actual witness operations\n\n## Target State\n\n1. When witness starts → instantiate mol-witness-patrol on its hook\n2. When Go code discovers polecat → bond mol-polecat-arm child\n3. When polecat completes/dies → close that arm issue\n4. gt mol progress shows current patrol state\n5. Molecule survives wisp burns via handoff bead\n\n## Benefits\n\n- Visibility: gt mol status gastown/witness shows active arms\n- Audit: Each polecat inspection is a trackable issue\n- Consistency: Same tracking model as polecats with molecules\n- Progress: gt mol progress works for witness too\n\n## Non-Goals\n\n- NOT replacing Go code with Claude session\n- NOT running molecule steps as prompts\n- NOT requiring Claude for witness operation","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-23T22:19:33.06695-08:00","updated_at":"2025-12-27T21:29:52.804164-08:00","deleted_at":"2025-12-27T21:29:52.804164-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-59zd.1","title":"mol sling command: attach molecule to agent hook","description":"Add `gt mol sling \u003cmolecule-id\u003e` command to attach a molecule to the current agent's hook.\n\n## Behavior\n\n```bash\ngt mol sling mol-witness-patrol\n```\n\n1. Find agent identity from cwd (witness, polecat, crew, etc.)\n2. Find or create agent's handoff bead\n3. Attach molecule to handoff bead (attached_molecule field)\n4. Create root issue for this molecule instance\n5. Instantiate molecule steps under the root\n\n## Difference from mol attach\n\n- `mol attach` attaches to an existing pinned bead\n- `mol sling` creates a fresh instance for execution\n\n## Output\n\n```\n🧬 Slung mol-witness-patrol on gastown/witness\n Instance: gt-xyz (9 steps)\n First step: inbox-check\n```\n\n## Implementation\n\n- Reuse InstantiateMolecule from molecule.go\n- Create wrapper issue for the instance\n- Set attached_molecule on handoff bead","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T22:19:45.161282-08:00","updated_at":"2025-12-27T21:29:52.79596-08:00","dependencies":[{"issue_id":"gt-59zd.1","depends_on_id":"gt-59zd","type":"parent-child","created_at":"2025-12-23T22:19:45.161784-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.79596-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-59zd.2","title":"mol progress: track dynamically bonded children","description":"Extend `gt mol progress` to show dynamically bonded children.\n\n## Current Behavior\n\n```\ngt mol progress gt-abc\n```\n\nShows progress through pre-defined steps from InstantiateMolecule.\n\n## New Behavior\n\nAlso show dynamically bonded children:\n\n```\n📊 Progress: mol-witness-patrol (gt-abc)\n\n Steps (9):\n ✓ inbox-check\n ✓ check-refinery\n ✓ load-state\n ◐ survey-workers (bonded 3 children)\n ○ aggregate (WaitsFor: all-children)\n ○ save-state\n ○ generate-summary\n ○ context-check\n ○ burn-or-loop\n\n Bonded Children (3):\n ✓ arm-toast (5/5 steps)\n ◐ arm-nux (3/5 steps)\n ○ arm-furiosa (0/5 steps)\n\n Progress: 4/9 steps, 3 arms (1 complete, 1 in-progress, 1 pending)\n```\n\n## Implementation\n\n1. Find children with bonded_to: \u003cparent-id\u003e in description\n2. Recursively get progress for each bonded child\n3. Aggregate into parent progress display","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T22:19:58.346108-08:00","updated_at":"2025-12-27T21:29:52.787749-08:00","dependencies":[{"issue_id":"gt-59zd.2","depends_on_id":"gt-59zd","type":"parent-child","created_at":"2025-12-23T22:19:58.346575-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.787749-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-59zd.3","title":"Witness: instantiate mol-witness-patrol on start","description":"When witness starts, create a mol-witness-patrol instance and pin it to the witness hook.\n\n## Changes to witness/manager.go\n\nIn Start() or run():\n1. Check if patrol instance exists on hook\n2. If not, call beads.InstantiateMolecule(mol-witness-patrol, handoffBead)\n3. Store instance ID for later reference\n\n## Hook Structure\n\nThe witness handoff bead gets:\n- attached_molecule: gt-patrol-xyz\n- attached_at: timestamp\n\n## Idempotency\n\nIf patrol already exists (from previous session), reuse it.\nDon't create duplicate instances on restart.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T22:21:55.247815-08:00","updated_at":"2025-12-27T21:25:59.903248-08:00","dependencies":[{"issue_id":"gt-59zd.3","depends_on_id":"gt-59zd","type":"parent-child","created_at":"2025-12-23T22:21:55.248279-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:25:59.903248-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-59zd.4","title":"Witness: bond mol-polecat-arm when discovering polecat","description":"When witness discovers a polecat during healthCheck(), bond a mol-polecat-arm child.\n\n## Changes to witness/manager.go\n\nIn healthCheck() when iterating polecats:\n1. Check if arm already exists for this polecat (arm-{name})\n2. If not, call: gt mol bond mol-polecat-arm --parent=$PATROL_ID --ref=arm-{name} --var polecat_name={name} --var rig={rig}\n3. Track arm issue ID in handoff state\n\n## Arm Lifecycle\n\n- Created when polecat first seen\n- Updated as inspection happens (nudges, state changes)\n- Closed when polecat cleaned up\n\n## Deduplication\n\nCheck handoffState.WorkerStates[name].ArmID before creating.\nNeeds: gt-59zd.3","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T22:22:05.718229-08:00","updated_at":"2025-12-27T21:29:52.77951-08:00","dependencies":[{"issue_id":"gt-59zd.4","depends_on_id":"gt-59zd","type":"parent-child","created_at":"2025-12-23T22:22:05.718706-08:00","created_by":"daemon"},{"issue_id":"gt-59zd.4","depends_on_id":"gt-59zd.3","type":"blocks","created_at":"2025-12-23T22:22:33.154977-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.77951-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-59zd.5","title":"Witness: close arm when polecat completes","description":"When witness cleans up a polecat, close its mol-polecat-arm issue.\n\n## Changes to witness/manager.go\n\nIn cleanupPolecat():\n1. Get arm ID from handoffState.WorkerStates[name].ArmID\n2. If exists, call: bd close {armID} --reason='polecat cleaned up'\n3. Clear ArmID from handoff state\n\n## Arm Close Reasons\n\n- 'polecat cleaned up' - normal completion\n- 'polecat killed - stuck' - killed due to inactivity\n- 'polecat killed - escalated' - killed after Mayor escalation\n\n## State Update\n\nThe arm's description could be updated with final outcome before closing.\nNeeds: gt-59zd.4","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T22:22:15.542735-08:00","updated_at":"2025-12-27T21:29:52.771236-08:00","dependencies":[{"issue_id":"gt-59zd.5","depends_on_id":"gt-59zd","type":"parent-child","created_at":"2025-12-23T22:22:15.543212-08:00","created_by":"daemon"},{"issue_id":"gt-59zd.5","depends_on_id":"gt-59zd.4","type":"blocks","created_at":"2025-12-23T22:22:33.241115-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.771236-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-59zd.6","title":"mol progress: display bonded children in output","description":"Extend gt mol progress to show dynamically bonded children.\n\n## Current Output\n\nShows steps from original instantiation only.\n\n## New Output\n\nAlso show bonded children:\n\n Bonded Children (3):\n ✓ arm-toast (closed)\n ◐ arm-nux (open)\n ○ arm-furiosa (open)\n\n## Implementation\n\n1. Query for children with bonded_to: {parent-id} in description\n2. For each bonded child, show status (open/closed)\n3. Optionally show progress through child's steps\nNeeds: gt-59zd.5","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T22:22:28.008426-08:00","updated_at":"2025-12-27T21:29:55.687524-08:00","dependencies":[{"issue_id":"gt-59zd.6","depends_on_id":"gt-59zd","type":"parent-child","created_at":"2025-12-23T22:22:28.008932-08:00","created_by":"daemon"},{"issue_id":"gt-59zd.6","depends_on_id":"gt-59zd.5","type":"blocks","created_at":"2025-12-23T22:22:33.32542-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.687524-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5a0f2","title":"Digest: mol-deacon-patrol","description":"Patrol 5: Quiet, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:30:36.729729-08:00","updated_at":"2025-12-27T21:26:03.911099-08:00","deleted_at":"2025-12-27T21:26:03.911099-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5af","title":"Deacon: Hierarchical health-check orchestrator","description":"Replace daemon heartbeats with Deacon - an AI agent that keeps Gas Town running.\n\n## Core Concept\n\nThe Deacon is a **Claude agent** (not just a Go process) that:\n- Lives at town root in gt-deacon session\n- Has its own mailbox (deacon/ identity in town beads)\n- Is poked by a minimal Go daemon every ~60s (if idle)\n- Monitors Mayor and Witnesses proactively\n- Handles lifecycle requests (restart/cycle) from Mayor, Witnesses, AND Crew\n\n## Architecture\n\n```\nMinimal Go Daemon (just watches Deacon)\n |\n v\n Deacon (Claude agent)\n |\n +----+----+\n v v\n Mayor Witnesses --\u003e Polecats (Witness-managed)\n | |\n +----+----+\n |\n Crew (lifecycle only, not monitored)\n```\n\n## Deacon Responsibilities\n\n**Proactive monitoring:**\n- Mayor health (tmux session, keepalive freshness)\n- Witness health (tmux sessions, keepalive freshness)\n\n**Reactive lifecycle:**\n- Process restart/cycle requests from Mayor, Witnesses, Crew\n- Kill session, create new, prime, verify startup\n\n**Escalation:**\n- Mail human (configurable) if issues can't be resolved\n\n## Key Files\n\n- ~/gt/DEACON.md - Role context\n- ~/gt/deacon/ - State directory\n - heartbeat.json - Written each wake cycle (daemon checks this)\n - state.json - Health tracking, last scan results\n- ~/gt/.beads/ - Town beads with deacon/ mail identity\n\n## Wake Cycle\n\n1. Write heartbeat (prevents daemon from poking)\n2. Check mail (lifecycle requests)\n3. Quick health scan (Mayor, Witnesses)\n4. Process lifecycle requests\n5. Remediate unhealthy agents\n6. 
Update state\n\n## Session Patterns\n\n- Deacon: gt-deacon\n- Mayor: gt-mayor\n- Witness: gt-\u003crig\u003e-witness\n- Crew: gt-\u003crig\u003e-\u003cname\u003e (e.g., gt-gastown-max)\n\n## Relation to Policy Beads\n\nFuture: Deacon behavior configurable via policy beads (gt-3zw).\nInitial implementation uses sensible defaults.","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-18T18:32:28.083305-08:00","updated_at":"2025-12-27T21:29:54.159363-08:00","dependencies":[{"issue_id":"gt-5af","depends_on_id":"gt-3zw","type":"blocks","created_at":"2025-12-18T18:32:36.617594-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.159363-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-5af.1","title":"Create DEACON.md role context","description":"Create ~/gt/DEACON.md with the Deacon's role context, wake cycle instructions, and command reference. Model after existing CLAUDE.md for Mayor.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T17:13:22.602567-08:00","updated_at":"2025-12-27T21:29:53.983135-08:00","dependencies":[{"issue_id":"gt-5af.1","depends_on_id":"gt-5af","type":"parent-child","created_at":"2025-12-19T17:13:22.605373-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.983135-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5af.2","title":"Add gt deacon start/stop/status commands","description":"Add CLI commands: gt deacon start (spawn gt-deacon session), gt deacon stop (kill session), gt deacon status (show health state). Similar to gt mayor commands.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T17:13:26.617077-08:00","updated_at":"2025-12-27T21:29:53.974692-08:00","dependencies":[{"issue_id":"gt-5af.2","depends_on_id":"gt-5af","type":"parent-child","created_at":"2025-12-19T17:13:26.618912-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.974692-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5af.3","title":"Add deacon/ mail identity support","description":"Ensure deacon/ works as a mail identity in town-level beads. Test with bd mail send deacon/ and bd mail inbox --identity deacon/.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T17:13:32.618418-08:00","updated_at":"2025-12-27T21:29:53.966338-08:00","dependencies":[{"issue_id":"gt-5af.3","depends_on_id":"gt-5af","type":"parent-child","created_at":"2025-12-19T17:13:32.620414-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.966338-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5af.4","title":"Simplify Go daemon to Deacon-watcher","description":"Refactor internal/daemon to only: (1) check deacon/heartbeat.json every 60s, (2) poke gt-deacon if stale, (3) restart Deacon and mail for help if very stale. Remove all lifecycle processing and agent poking.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T17:13:40.493425-08:00","updated_at":"2025-12-27T21:29:53.957862-08:00","dependencies":[{"issue_id":"gt-5af.4","depends_on_id":"gt-5af","type":"parent-child","created_at":"2025-12-19T17:13:40.495569-08:00","created_by":"daemon"},{"issue_id":"gt-5af.4","depends_on_id":"gt-5af.6","type":"blocks","created_at":"2025-12-19T17:15:11.010382-08:00","created_by":"daemon"},{"issue_id":"gt-5af.4","depends_on_id":"gt-5af.2","type":"blocks","created_at":"2025-12-19T17:15:11.158303-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.957862-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5af.5","title":"Update lifecycle mail targets to deacon/","description":"Update Mayor, Witness, and Crew handoff code to send lifecycle requests to deacon/ instead of daemon/. Update internal/cmd/handoff.go and related code.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T17:13:47.632827-08:00","updated_at":"2025-12-27T21:29:53.949426-08:00","dependencies":[{"issue_id":"gt-5af.5","depends_on_id":"gt-5af","type":"parent-child","created_at":"2025-12-19T17:13:47.63482-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.949426-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5af.6","title":"Add Deacon heartbeat mechanism","description":"Deacon writes deacon/heartbeat.json on each wake. Go daemon reads it to decide whether to poke. Add heartbeat read/write helpers.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T17:13:52.057582-08:00","updated_at":"2025-12-27T21:29:53.94095-08:00","dependencies":[{"issue_id":"gt-5af.6","depends_on_id":"gt-5af","type":"parent-child","created_at":"2025-12-19T17:13:52.059708-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.94095-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5af.7","title":"Add crew session pattern to lifecycle handling","description":"Update identityToSession and restartSession to handle crew patterns: gastown/max -\u003e gt-gastown-max. Crew workers can request lifecycle but are not proactively monitored.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T17:13:57.662197-08:00","updated_at":"2025-12-27T21:29:53.932522-08:00","dependencies":[{"issue_id":"gt-5af.7","depends_on_id":"gt-5af","type":"parent-child","created_at":"2025-12-19T17:13:57.664036-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.932522-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5af.8","title":"Add human escalation configuration","description":"Add overseer contact config to mayor/town.json. Deacon uses this for escalation when it cannot resolve issues. Support email and/or other notification methods.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-19T17:14:02.770713-08:00","updated_at":"2025-12-27T21:29:56.950086-08:00","dependencies":[{"issue_id":"gt-5af.8","depends_on_id":"gt-5af","type":"parent-child","created_at":"2025-12-19T17:14:02.772612-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.950086-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5bjgq","title":"Digest: mol-deacon-patrol","description":"Patrol 19","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-26T14:57:47.79199-08:00","updated_at":"2025-12-26T14:57:47.79199-08:00","closed_at":"2025-12-26T14:57:47.791943-08:00"}
{"id":"gt-5c0lp","title":"Digest: mol-deacon-patrol","description":"Patrol 19: all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:41:28.721109-08:00","updated_at":"2025-12-27T21:26:00.287373-08:00","deleted_at":"2025-12-27T21:26:00.287373-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5e50a","title":"Digest: mol-deacon-patrol","description":"Patrol 15: Green","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:35:02.300302-08:00","updated_at":"2025-12-27T21:26:02.51926-08:00","deleted_at":"2025-12-27T21:26:02.51926-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5fa3i","title":"Digest: mol-deacon-patrol","description":"Patrol 1: inbox empty, Mayor OK, 2 witnesses, 2 refineries healthy, no orphans","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T13:40:42.819905-08:00","updated_at":"2025-12-27T21:26:03.271225-08:00","deleted_at":"2025-12-27T21:26:03.271225-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5ft3","title":"Merge: gt-99a","description":"branch: polecat/Slit\ntarget: main\nsource_issue: gt-99a\nrig: gastown","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-19T16:18:15.329358-08:00","updated_at":"2025-12-27T21:27:22.963883-08:00","deleted_at":"2025-12-27T21:27:22.963883-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-5ggc","title":"Digest: mol-deacon-patrol","description":"Patrol 4: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T21:16:30.523256-08:00","updated_at":"2025-12-27T21:26:04.844753-08:00","deleted_at":"2025-12-27T21:26:04.844753-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5gkd","title":"Refinery Engineer: Role prompting and CLAUDE.md","description":"Create refinery/CLAUDE.md with Chief Merge Engineer role context. Include:\n- Role identity and responsibilities\n- Decision authority (merge order, test frequency, binary rebuilds)\n- Communication patterns (Witness, Deacon)\n- Session lifecycle (handoff bead protocol)\n- Key commands reference","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T18:09:06.423153-08:00","updated_at":"2025-12-27T21:29:53.924165-08:00","dependencies":[{"issue_id":"gt-5gkd","depends_on_id":"gt-ktal","type":"blocks","created_at":"2025-12-19T18:09:39.217523-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.924165-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5glf.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. 
\"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-5glf\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T21:32:01.027297-08:00","updated_at":"2025-12-27T21:29:55.394613-08:00","deleted_at":"2025-12-27T21:29:55.394613-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5gq8r","title":"gt mol step done: Auto-continue molecule on step completion","description":"## Summary\n\nImplement `gt mol step done \u003cstep-id\u003e` command that enables polecats to close a molecule step and automatically continue to the next step without waiting for witness orchestration.\n\n## Problem\n\nCurrently, when a polecat completes a molecule step:\n1. Polecat sends POLECAT_DONE to witness\n2. Witness (in patrol cycle) eventually reads mail\n3. Witness spawns next step\n4. **Minutes of latency possible**\n\nPolecats also forget to close beads steps (using internal TodoWrite instead), breaking the activity feed.\n\n## Solution\n\n`gt mol step done \u003cstep-id\u003e` handles everything:\n\n1. Close the beads step (`bd close \u003cstep-id\u003e`)\n2. Extract molecule ID from step (bd-mol-xga.1 → bd-mol-xga)\n3. Find next ready step (dependency-aware)\n4. If next step exists:\n - Update hook file to point to next step\n - Write handoff context\n - tmux respawn-pane (fresh claude session)\n5. If molecule complete:\n - Clear hook\n - Send POLECAT_DONE to witness\n - Exit session\n\n## Key Insight\n\nStep-to-step transitions are **mechanical**, not **cognitive**. No AI needed in the critical path. Pure Go automation.\n\n## Related Commands\n\n- `gt mol next \u003cmol-id\u003e` - Show next ready step(s)\n- `gt mol progress \u003cmol-id\u003e` - Already exists\n\n## Polecat Protocol Update\n\n```markdown\nWhen your hook contains a molecule step:\n1. Read the step: `bd show \u003cstep-id\u003e`\n2. Execute the work described\n3. When complete: `gt mol step done \u003cstep-id\u003e`\n\nThat is it. Do NOT manually close steps. 
Do NOT work multiple steps.\n```\n\n## Benefits\n\n- **Instant step transitions** (~5-10s vs minutes)\n- **Activity feed works** (steps closed as completed)\n- **Witness freed up** (safety net only, not critical path)\n- **Fresh context per step** (better attention, no accumulated confusion)\n","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-26T12:45:11.903532-08:00","updated_at":"2025-12-27T21:29:45.908918-08:00","deleted_at":"2025-12-27T21:29:45.908918-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-5gsx","title":"Merge: gt-3x0z.3","description":"branch: polecat/slit\ntarget: main\nsource_issue: gt-3x0z.3\nrig: gastown","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T15:42:20.546839-08:00","updated_at":"2025-12-27T21:27:22.903963-08:00","deleted_at":"2025-12-27T21:27:22.903963-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-5hhdq","title":"Digest: mol-deacon-patrol","description":"Patrol 14: healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-27T20:23:33.511101-08:00","updated_at":"2025-12-27T21:26:00.076185-08:00","deleted_at":"2025-12-27T21:26:00.076185-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5hlqp","title":"Digest: mol-deacon-patrol","description":"Patrol 19: clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:18:13.58242-08:00","updated_at":"2025-12-27T21:26:00.946194-08:00","deleted_at":"2025-12-27T21:26:00.946194-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5ipl","title":"Witness role describes incorrect commands","description":"prompts/roles/witness.md line 44 shows:\n gt polecat list {{ rig }}\n\nBut polecats are worktrees managed by Refinery/Mayor's clone,\nnot a separate polecat manager. The witness shouldn't manage\npolecats directly.\n\nReview and update witness role to match actual architecture.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-24T12:51:44.558565-08:00","updated_at":"2025-12-27T21:29:55.535909-08:00","dependencies":[{"issue_id":"gt-5ipl","depends_on_id":"gt-jo9n","type":"blocks","created_at":"2025-12-24T12:52:08.074809-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.535909-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-5ix9n","title":"Merge: slit-mjw3n5iw","description":"branch: polecat/slit-mjw3n5iw\ntarget: main\nsource_issue: slit-mjw3n5iw\nrig: gastown\nagent_bead: gt-gastown-polecat-slit","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T16:00:47.705668-08:00","updated_at":"2026-01-01T16:01:57.812576-08:00","closed_at":"2026-01-01T16:01:57.812576-08:00","created_by":"gastown/polecats/slit"}
{"id":"gt-5j2x.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. 
\"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-5j2x\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T21:04:10.282891-08:00","updated_at":"2025-12-27T21:29:55.455037-08:00","deleted_at":"2025-12-27T21:29:55.455037-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5jew1","title":"Digest: mol-deacon-patrol","description":"Patrol 6: Mayor OK, 11 sessions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:36:25.170528-08:00","updated_at":"2025-12-27T21:26:02.163552-08:00","deleted_at":"2025-12-27T21:26:02.163552-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5klh","title":"Remove legacy .beads-wisp/ infrastructure","description":"The wisp simplification (gt-fgms + bd-bkul) changed wisps to be just a flag on issues in the main .beads/ database. JSONL export filters Wisp=true issues. No separate directory needed.\n\nBut legacy .beads-wisp/ infrastructure still exists and should be removed:\n\n## Gas Town (gt) cleanup:\n- `internal/rig/manager.go`: Remove `initWispBeads()` call from `AddRig()`\n- `internal/wisp/io.go`: Delete entire file (old abstraction layer)\n- `internal/cmd/install.go`: Remove any .beads-wisp setup\n- `docs/wisp-architecture.md`: Update to reflect 'just a flag' model\n\n## Beads (bd) cleanup:\n- `cmd/bd/doctor/wisp_check.go`: Remove WispExistsCheck, WispGitCheck, WispSizeCheck (check .beads-wisp dirs)\n- Keep WispOrphansCheck and WispStaleCheck but rewrite to query main DB for Wisp=true issues\n\n## Filesystem cleanup:\n- Delete ~/gt/.beads-wisp/\n- Delete ~/gt/gastown/.beads-wisp/\n- Delete all per-rig .beads-wisp/ directories\n- Remove .beads-wisp from .gitignore entries\n\n## Verify:\n- `gt mail send --wisp` still works (creates issue with Wisp=true in main DB)\n- `bd sync` still filters wisps from JSONL\n- `bd mol squash/burn` still work on Wisp=true issues","status":"tombstone","priority":2,"issue_type":"chore","created_at":"2025-12-24T21:22:31.378327-08:00","updated_at":"2025-12-27T21:29:55.43774-08:00","deleted_at":"2025-12-27T21:29:55.43774-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"chore"}
{"id":"gt-5mffo","title":"Digest: mol-deacon-patrol","description":"Patrol 14: all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:38:34.763146-08:00","updated_at":"2025-12-27T21:26:00.328862-08:00","deleted_at":"2025-12-27T21:26:00.328862-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5moq5","title":"Digest: mol-deacon-patrol","description":"Patrol 9: All agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T02:11:33.463463-08:00","updated_at":"2025-12-27T21:26:03.788455-08:00","deleted_at":"2025-12-27T21:26:03.788455-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5mqih","title":"Digest: mol-deacon-patrol","description":"Patrol 2: all clear, 9 sessions healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:09:09.776198-08:00","updated_at":"2025-12-27T21:26:01.090648-08:00","deleted_at":"2025-12-27T21:26:01.090648-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5n2f","title":"Tech Debt: Code Review December 2024","description":"Tech debt identified during code review on 2024-12-21. Contains 11 issues ranging from P2-P4 covering:\n\n- Code duplication (manager creation boilerplate)\n- Test coverage gaps (cmd 6.8%, mail 3.6%)\n- Magic strings needing constants\n- Error handling inconsistencies\n- Large files needing splitting\n- Unused code removal\n\nWork through these incrementally to improve codebase maintainability.","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-21T21:37:36.114862-08:00","updated_at":"2025-12-27T21:29:56.470689-08:00","dependencies":[{"issue_id":"gt-5n2f","depends_on_id":"gt-ai1z","type":"blocks","created_at":"2025-12-21T21:37:45.973674-08:00","created_by":"daemon"},{"issue_id":"gt-5n2f","depends_on_id":"gt-zhm5","type":"blocks","created_at":"2025-12-21T21:37:46.048395-08:00","created_by":"daemon"},{"issue_id":"gt-5n2f","depends_on_id":"gt-7sqi","type":"blocks","created_at":"2025-12-21T21:37:46.120505-08:00","created_by":"daemon"},{"issue_id":"gt-5n2f","depends_on_id":"gt-nz6t","type":"blocks","created_at":"2025-12-21T21:37:46.194096-08:00","created_by":"daemon"},{"issue_id":"gt-5n2f","depends_on_id":"gt-xnql","type":"blocks","created_at":"2025-12-21T21:37:46.268652-08:00","created_by":"daemon"},{"issue_id":"gt-5n2f","depends_on_id":"gt-480b","type":"blocks","created_at":"2025-12-21T21:37:46.341243-08:00","created_by":"daemon"},{"issue_id":"gt-5n2f","depends_on_id":"gt-cvfg","type":"blocks","created_at":"2025-12-21T21:37:46.417073-08:00","created_by":"daemon"},{"issue_id":"gt-5n2f","depends_on_id":"gt-92of","type":"blocks","created_at":"2025-12-21T21:37:46.489042-08:00","created_by":"daemon"},{"issue_id":"gt-5n2f","depends_on_id":"gt-2xsh","type":"blocks","created_at":"2025-12-21T21:37:46.562771-08:00","created_by":"daemon"},{"issue_id":"gt-5n2f","depends_on_id":"gt-2n6z","type":"blocks","created_at":"2025-12-21T21:37:46.63439-08:00","created_by":"daemon"},{"issue_id":"gt-5n2f","depends_on_id":"gt-pbr3","type":"blocks","created_at":"2025-12-21T21:37:46.706067-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.470689-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-5q9u","title":"Digest: mol-deacon-patrol","description":"Patrol #15: Stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:34:25.357778-08:00","updated_at":"2025-12-27T21:26:04.276787-08:00","deleted_at":"2025-12-27T21:26:04.276787-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5qc","title":"Document how to configure a Gas Town Harness","description":"Create docs explaining what a harness is (private repo containing GT installation with rigs gitignored), why you'd want one, and how to set it up with beads redirects","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-17T16:42:43.370167-08:00","updated_at":"2025-12-27T21:29:57.288589-08:00","dependencies":[{"issue_id":"gt-5qc","depends_on_id":"gt-l1o","type":"blocks","created_at":"2025-12-17T16:42:54.620984-08:00","created_by":"daemon"},{"issue_id":"gt-5qc","depends_on_id":"gt-cr9","type":"blocks","created_at":"2025-12-17T17:15:59.17545-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.288589-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5rg5c","title":"Digest: mol-deacon-patrol","description":"P6: stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:10:59.624402-08:00","updated_at":"2025-12-27T21:26:02.294341-08:00","deleted_at":"2025-12-27T21:26:02.294341-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5tct","title":"gt mail send: add stdin and file input support for complex messages","description":"Current -m flag is fragile with multi-line messages containing backticks, quotes, or shell special chars. Add: 1) --stdin flag to read message body from stdin (heredoc-friendly), 2) -f/--file flag to read from file. This sidesteps shell quoting nightmares for handoff messages with code samples.","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-23T03:47:50.514096-08:00","updated_at":"2025-12-27T21:29:56.150954-08:00","deleted_at":"2025-12-27T21:29:56.150954-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-5tjz5","title":"Digest: mol-deacon-patrol","description":"Patrol 19: quiet","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-28T19:49:55.247311-08:00","updated_at":"2025-12-28T19:49:55.247311-08:00","closed_at":"2025-12-28T19:49:55.247279-08:00"}
{"id":"gt-5tp","title":"Test message","description":"Testing GGT mail via beads","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-16T21:44:27.546781-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"message"}
{"id":"gt-5uf3","title":"Patrol step: check parked molecules for unblock","description":"Add patrol step to Deacon for checking parked molecules:\n\n```yaml\n- step: check-parked-molecules\n action: |\n For each molecule with:\n - status: in_progress\n - assignee: null\n - has step with external: blocked_by\n Check if external deps are now satisfied.\n If yes: spawn polecat to resume the molecule.\n```\n\nThis automates the resume process - no manual intervention needed when\nupstream dependencies ship.\n\nPart of cross-project dependency system.\nSee: docs/cross-project-deps.md\n\nDepends on: gt-in3x (spawn --continue)\nPriority: P3 (future automation, not required for launch)","status":"tombstone","priority":3,"issue_type":"feature","created_at":"2025-12-21T22:39:25.167767-08:00","updated_at":"2025-12-27T21:29:57.555818-08:00","dependencies":[{"issue_id":"gt-5uf3","depends_on_id":"gt-in3x","type":"blocks","created_at":"2025-12-21T22:39:44.777023-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.555818-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-5vs4f","title":"Digest: mol-deacon-patrol","description":"Patrol 12: All agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T03:12:03.669133-08:00","updated_at":"2025-12-27T21:26:03.763455-08:00","deleted_at":"2025-12-27T21:26:03.763455-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5wb7","title":"Update vision.md with proto/mol/wisp terminology","description":"Update the Steam Engine Metaphor section to use consistent phase terminology:\n\nCurrent (vision.md):\n- Proto molecules = fuel (templates)\n- Wisps = steam (transient execution traces) \n- Digests = distillate (permanent records)\n\nNew terminology:\n- **Proto** = crystal/solid - frozen template\n- **Mol** = liquid - reified durable instance (tracked in git)\n- **Wisp** = gas - ephemeral instance (evaporates after squash)\n- **Digest** = distillate - compressed summary after squash\n\nKey clarification needed:\n- Proto → bond → creates either Mol (durable) or Wisp (ephemeral)\n- The choice of Mol vs Wisp depends on --wisp flag\n- Default is Mol (durable, recorded in main beads)\n- Wisp lives in .beads-ephemeral/, squashes away","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T16:32:45.68234-08:00","updated_at":"2025-12-27T21:29:53.434593-08:00","dependencies":[{"issue_id":"gt-5wb7","depends_on_id":"gt-62hm","type":"blocks","created_at":"2025-12-21T16:33:17.307488-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.434593-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5wtw","title":"Shutdown request handler","description":"Handle polecat shutdown requests:\n\nWhen polecat runs 'gt handoff --shutdown':\n1. Polecat sends mail to \u003crig\u003e/witness requesting shutdown\n2. Witness receives mail\n3. Witness verifies:\n - Git working tree is clean\n - Work is submitted (MR exists or in queue)\n - No uncommitted beads changes\n4. If clean:\n - gt session stop \u003crig\u003e/\u003cpolecat\u003e\n - git worktree remove polecats/\u003cname\u003e\n - git branch -d polecat/\u003cname\u003e\n5. If not clean:\n - Send nudge back to polecat\n - Track retry count\n\nDepends on fixing gt-dsfi (handoff deadlock).","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-20T03:14:22.968711-08:00","updated_at":"2025-12-27T21:29:53.806719-08:00","dependencies":[{"issue_id":"gt-5wtw","depends_on_id":"gt-53w6","type":"parent-child","created_at":"2025-12-20T03:14:37.300578-08:00","created_by":"daemon"},{"issue_id":"gt-5wtw","depends_on_id":"gt-mxyj","type":"blocks","created_at":"2025-12-20T03:14:38.893061-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.806719-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5xph","title":"Document session cycling protocol in all templates","description":"Add explicit lifecycle request protocol to all agent templates.\n\n## Problem\nTemplates mention 'request session refresh' but don't show HOW.\nAgents don't know the protocol for requesting a cycle.\n\n## Protocol to document\n1. Write handoff mail to self (for continuity)\n2. Set requesting_cycle=true in state.json\n3. Send LIFECYCLE mail to deacon/:\n Subject: 'LIFECYCLE: \u003crole\u003e requesting cycle'\n Body: Reason for cycle request\n\n## Templates to update\n- prompts/roles/polecat.md\n- prompts/roles/crew.md \n- prompts/roles/witness.md\n- prompts/roles/refinery.md\n\n## Also add\n- Example state.json location for each role\n- When to request cycle (context full, work complete, etc.)\n- What happens after (daemon kills, respawns, new session primes)","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T16:43:54.10282-08:00","updated_at":"2025-12-27T21:29:53.158431-08:00","deleted_at":"2025-12-27T21:29:53.158431-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5y3mq","title":"Digest: mol-deacon-patrol","description":"Patrol 2: All agents healthy, no actions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:34:25.65643-08:00","updated_at":"2025-12-27T21:26:02.187933-08:00","deleted_at":"2025-12-27T21:26:02.187933-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-5yl1d","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 14: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:28:17.760339-08:00","updated_at":"2025-12-27T21:26:01.854529-08:00","deleted_at":"2025-12-27T21:26:01.854529-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-61o","title":"Review and audit all GGT beads","description":"Thorough review of all filed beads in gastown GGT repo. Check for: consistency, completeness, correct dependencies, accurate descriptions, proper prioritization. Ensure beads are self-contained and don't rely on external docs.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T20:24:07.152386-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-62hm","title":"Molecule Phase Terminology Documentation","description":"Document the three-phase molecule lifecycle with consistent terminology:\n\n## The Phases (States of Matter)\n\n| Phase | State | Nature |\n|-------|-------|--------|\n| **Proto** | Solid/Crystal | Frozen template, static definition in catalog |\n| **Mol** | Liquid | Reified instance, malleable, tracked in git (durable) |\n| **Wisp** | Gas | Evaporates after squash, ephemeral orchestration |\n\n## Documentation Gaps Identified\n\n### High Priority\n1. Update vision.md Steam Engine metaphor to use proto/mol/wisp terminology\n2. Add lifecycle diagram: Proto → bond → Mol or Wisp → squash → Digest\n3. Document Go types for phases (ProtoMolecule, etc.)\n4. Document .beads-ephemeral/ structure and purpose\n\n### Medium Priority\n5. Update CLAUDE.md files (polecat, witness, refinery) with molecule workflow\n6. Document bd mol bond/squash/burn CLI API with examples\n7. Add polecat guide: Executing molecules and generating summaries\n\n### Low Priority\n8. Add terminology glossary to architecture.md\n9. Create troubleshooting playbook for stuck molecules\n10. Add reference diagrams for state transitions\n\n## Current State\n- architecture.md has good molecule basics but uses inconsistent terminology\n- vision.md has Steam Engine metaphor (proto=fuel, wisp=steam, digest=distillate)\n- molecules.md has detailed reference but needs phase clarity\n- builtin_molecules.go has 8 molecules but no phase documentation in code","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-21T16:32:27.537487-08:00","updated_at":"2025-12-27T21:29:53.443071-08:00","deleted_at":"2025-12-27T21:29:53.443071-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-63clx","title":"Digest: mol-deacon-patrol","description":"Cycle 10","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-31T22:52:35.094629-08:00","updated_at":"2025-12-31T22:52:35.094629-08:00","closed_at":"2025-12-31T22:52:35.094586-08:00","dependencies":[{"issue_id":"gt-63clx","depends_on_id":"gt-eph-w2mh","type":"parent-child","created_at":"2025-12-31T22:52:35.09584-08:00","created_by":"deacon"}]}
{"id":"gt-65a2","title":"Merge: gt-lnji","description":"branch: polecat/nux\ntarget: main\nsource_issue: gt-lnji\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T23:36:23.454776-08:00","updated_at":"2025-12-27T21:27:22.526724-08:00","deleted_at":"2025-12-27T21:27:22.526724-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-662","title":"Swarm: report generation","description":"Generate markdown reports for completed swarms.\n\n## Command\n```\ngt swarm report \u003cswarm-id\u003e [--save \u003cfile\u003e]\n```\n\n## Report Content\n\n### Header\n- Swarm ID and title\n- Created/completed timestamps\n- Duration\n- Rig name\n\n### Task Summary\n| Task | Assignee | Status | Duration |\n|------|----------|--------|----------|\n| gt-xxx | Toast | merged | 15m |\n| gt-yyy | Nux | merged | 22m |\n\n### Worker Contributions\n- Commits per worker\n- Issues closed per worker\n- Lines changed (optional)\n\n### Timeline\n- Chronological events from events.jsonl\n- Key milestones (started, first merge, landing)\n\n### Issues Encountered\n- Conflicts resolved\n- Failed tasks (if any)\n- Escalations\n\n## Implementation\n```go\nfunc GenerateReport(swarmID string) (*SwarmReport, error)\nfunc (r *SwarmReport) ToMarkdown() string\n```\n\n## Storage\n- Save to \u003crig\u003e/.gastown/swarms/\u003cid\u003e/report.md\n- Or user-specified path with --save\n\n## PGT Reference\ngastown-py/src/gastown/swarm/manager.py generate_report()\n\n## Acceptance Criteria\n- [ ] Markdown report generated\n- [ ] Includes all sections above\n- [ ] Auto-saved to swarm directory\n- [ ] --save allows custom path","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-16T14:47:17.96767-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-68cxc","title":"Digest: mol-deacon-patrol","description":"Patrol 7: all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T21:55:31.812922-08:00","updated_at":"2025-12-27T21:26:00.559046-08:00","deleted_at":"2025-12-27T21:26:00.559046-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-690z","title":"Test MR","description":"test","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-18T20:15:16.685471-08:00","updated_at":"2025-12-27T21:27:22.972427-08:00","deleted_at":"2025-12-27T21:27:22.972427-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-6957","title":"Deacon primes pending workers instead of spawn nudge","description":"## Problem\n\nWhen gt spawn starts a polecat, it tries to send a nudge message immediately:\n1. Starts tmux session with Claude Code\n2. Waits 3 seconds (hardcoded)\n3. Sends nudge via NudgeSession\n\nBut Claude Code takes 10-20+ seconds to be ready. The message arrives before\nthe prompt is ready, so it appears as part of the prompt string instead of\nbeing executed.\n\n## Solution\n\nMove priming responsibility to the Deacon patrol cycle:\n\n1. gt spawn: Mark polecat as pending_prime=true in state, return immediately\n2. Deacon patrol: New step prime-pending-workers that:\n - Scans all rigs for workers with pending_prime=true\n - Uses WaitForClaudeReady() to poll for \"\u003e \" prompt (already exists in tmux.go)\n - Once ready, sends gt prime + assignment nudge\n - Clears pending_prime flag\n\n## Implementation\n\n### spawn.go changes\n- Remove the 3-second sleep (line 368)\n- Remove the NudgeSession call (lines 375-383)\n- Instead: Update polecat state with pending_prime=true\n\n### Polecat state addition\nstatus: assigned, pending_prime: true, issue: gt-xxx\n\n### Deacon patrol addition\nAdd step between inbox-check and plugin-run:\nprime-pending-workers: Check all rigs for workers marked pending_prime.\nFor each pending worker: WaitForClaudeReady(60s), then gt prime + nudge\n\n### Benefits\n- No race condition - we poll until actually ready\n- Deacon already does health checks - priming is natural fit\n- Works for polecats, witnesses, refineries, crew (unified approach)\n- Timeout handling built in\n\n## Depends on\n- Existing WaitForClaudeReady() in tmux.go (lines 384-403)","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T17:19:33.589909-08:00","updated_at":"2025-12-27T21:29:53.124884-08:00","deleted_at":"2025-12-27T21:29:53.124884-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-69ruq","title":"Digest: mol-deacon-patrol","description":"Patrol 8: healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-28T06:10:18.225418-08:00","updated_at":"2025-12-28T06:10:18.225418-08:00","closed_at":"2025-12-28T06:10:18.225382-08:00"}
{"id":"gt-6c084","title":"Digest: mol-deacon-patrol","description":"Patrol 2: Quiet cycle, all agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T07:24:29.872651-08:00","updated_at":"2025-12-27T21:26:03.676821-08:00","deleted_at":"2025-12-27T21:26:03.676821-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-6cok","title":"Digest: mol-deacon-patrol","description":"Patrol #4: Stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:31:01.457636-08:00","updated_at":"2025-12-27T21:26:04.367448-08:00","deleted_at":"2025-12-27T21:26:04.367448-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-6db","title":"gt rig shutdown: Gracefully stop all rig agents","description":"Add 'gt rig shutdown \u003crig\u003e' command to gracefully stop all agents in a rig.\n\nShould:\n- Stop all polecat sessions\n- Stop refinery\n- Stop witness\n- Optionally wait for graceful shutdown with timeout","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-17T21:50:07.938698-08:00","updated_at":"2025-12-27T21:29:57.222574-08:00","dependencies":[{"issue_id":"gt-6db","depends_on_id":"gt-hw6","type":"blocks","created_at":"2025-12-17T22:23:43.179236-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.222574-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-6dyb6","title":"Digest: mol-deacon-patrol","description":"Patrol 5: healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-31T21:36:42.496243-08:00","updated_at":"2025-12-31T21:36:42.496243-08:00","closed_at":"2025-12-31T21:36:42.496201-08:00","dependencies":[{"issue_id":"gt-6dyb6","depends_on_id":"gt-eph-7pst","type":"parent-child","created_at":"2025-12-31T21:36:42.49743-08:00","created_by":"deacon"}]}
{"id":"gt-6exub","title":"Digest: mol-deacon-patrol","description":"Patrol: no callbacks, no polecats, all agents healthy, cleaned 2 orphan processes + 2 stale locks","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:14:24.79552-08:00","updated_at":"2025-12-27T21:26:01.477672-08:00","deleted_at":"2025-12-27T21:26:01.477672-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-6k02h","title":"Digest: mol-deacon-patrol","description":"Patrol 6: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T10:16:49.768628-08:00","updated_at":"2025-12-27T21:26:03.576856-08:00","deleted_at":"2025-12-27T21:26:03.576856-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-6k8","title":"Interrupt vs Queue mail semantics","description":"Add priority/delivery semantics to mail messages.\n\n## Semantics\n\n| Type | Delivery | Use Case |\n|------|----------|----------|\n| Interrupt | tmux send-keys | Lifecycle, URGENT, stuck detection |\n| Queue | Create message only | Normal mail, status, heartbeat |\n\n## Implementation\n\n- `bd mail send --interrupt` uses tmux send-keys notification\n- Default is queue (agent checks with `gt mail check`)\n- Urgent flag on messages for interrupt delivery\n\n## Agent Side\n\n- `gt mail check --quiet` - non-blocking check for queued mail\n- `gt mail wait` - block until mail arrives (for idle agents)\n- Heartbeats become queued, agent checks at natural breakpoints","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-18T14:19:28.408196-08:00","updated_at":"2025-12-27T21:29:57.168288-08:00","dependencies":[{"issue_id":"gt-6k8","depends_on_id":"gt-99m","type":"blocks","created_at":"2025-12-18T14:19:46.529252-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.168288-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-6lt3","title":"Work on ga-rd4: Add gt polecat status command. Show detai...","description":"Work on ga-rd4: Add gt polecat status command. Show detailed polecat status including current issue, session state, last activity time. When done, submit MR (not PR) to integration branch for Refinery.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-19T22:58:26.320627-08:00","updated_at":"2025-12-27T21:29:56.881071-08:00","deleted_at":"2025-12-27T21:29:56.881071-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-6lwa7","title":"Digest: mol-deacon-patrol","description":"Patrol 1: All healthy, no messages","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T13:50:55.047597-08:00","updated_at":"2025-12-27T21:26:03.165198-08:00","deleted_at":"2025-12-27T21:26:03.165198-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-6n13","title":"Competing molecule instantiation mechanisms need resolution","description":"Documentation describes 4 different molecule instantiation methods:\n1. bd mol bond (beads-based, in molecules.md)\n2. gt sling (proposed in sling-design.md)\n3. gt swarm (implemented in code)\n4. gt molecule instantiate (in architecture.md)\n\nRole prompts use different methods:\n- Deacon: bd mol spawn\n- Refinery: gt mol bond\n- Polecat: bd mol current\n\nNeed to:\n1. Decide canonical mechanism\n2. Update all role prompts to use it\n3. Remove/deprecate alternatives\n4. Update architecture docs","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-24T12:50:56.093813-08:00","updated_at":"2025-12-27T21:29:52.646912-08:00","dependencies":[{"issue_id":"gt-6n13","depends_on_id":"gt-jo9n","type":"blocks","created_at":"2025-12-24T12:52:05.105811-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.646912-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-6n1cy","title":"mol-polecat-work","description":"Full polecat lifecycle from assignment to decommission.\n\nThis proto enables nondeterministic idempotence for polecat work.\nA polecat that crashes after any step can restart, read its molecule state,\nand continue from the last completed step. No work is lost.\n\nVariables:\n- gt-u2vg - The source issue ID being worked on","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-25T01:53:37.309868-08:00","updated_at":"2025-12-27T21:29:55.334153-08:00","deleted_at":"2025-12-27T21:29:55.334153-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-6n3c4","title":"Digest: mol-deacon-patrol","description":"Patrol 16: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-25T13:33:16.633349-08:00","updated_at":"2025-12-25T13:33:16.633349-08:00","closed_at":"2025-12-25T13:33:16.633317-08:00"}
{"id":"gt-6n4as","title":"Digest: mol-deacon-patrol","description":"Patrol 17: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T10:19:40.852613-08:00","updated_at":"2025-12-27T21:26:03.511465-08:00","deleted_at":"2025-12-27T21:26:03.511465-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-6n8fy","title":"Digest: mol-deacon-patrol","description":"Patrol 14: Nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:51:51.349775-08:00","updated_at":"2025-12-27T21:26:04.11634-08:00","deleted_at":"2025-12-27T21:26:04.11634-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-6oxlh","title":"Digest: mol-deacon-patrol","description":"Patrol 12","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-26T14:55:31.115289-08:00","updated_at":"2025-12-26T14:55:31.115289-08:00","closed_at":"2025-12-26T14:55:31.115248-08:00"}
{"id":"gt-6qdii","title":"Digest: mol-deacon-patrol","description":"Patrol 13: Nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:51:28.696535-08:00","updated_at":"2025-12-27T21:26:04.124537-08:00","deleted_at":"2025-12-27T21:26:04.124537-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-6qyt1","title":"Refinery: event-driven merge queue","description":"## Problem\nRefinery only acts on explicit requests, no queue management.\n\n## ZFC-Compliant Solution\nAdd steps to `mol-refinery-patrol.formula.toml`:\n\n```toml\n[[step]]\nid = \"check-merge-requests\"\ntitle = \"Process merge queue from inbox\"\ndescription = \"\"\"\n1. Check inbox for MERGE_READY messages: gt mail inbox\n2. For each merge request:\n - Read branch name from message\n - Add to local queue (track in state.json or temp file)\n3. Process queue in order:\n - git fetch origin\n - git checkout main \u0026\u0026 git pull\n - git merge --no-ff origin/polecat/\u003cname\u003e\n - If conflict: mail Mayor with details, skip\n - If success: git push \u0026\u0026 delete remote branch\n4. For each merged branch:\n - gt mail send \u003crig\u003e/witness -s \"MERGED: polecat/\u003cname\u003e\" -m \"Branch merged successfully\"\n - Delete the MERGE_READY mail\n\"\"\"\ndepends_on = [\"inbox-check\"]\n```\n\n## Queue State\nRefinery agent tracks queue in `state.json`:\n```json\n{\n \"merge_queue\": [\"polecat/foo\", \"polecat/bar\"],\n \"last_merged\": \"polecat/baz\"\n}\n```\n\n## Why This Works\n- Refinery Claude agent runs formula\n- Agent handles conflicts with judgment\n- Mail is coordination primitive\n- State persists in agent's state.json\n\n## Files\n- formulas/mol-refinery-patrol.formula.toml","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-27T16:41:05.11996-08:00","updated_at":"2025-12-27T21:29:45.748327-08:00","created_by":"mayor","dependencies":[{"issue_id":"gt-6qyt1","depends_on_id":"gt-7uhts","type":"relates-to","created_at":"2025-12-27T20:59:09.678685-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.748327-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-6qzay","title":"Digest: mol-deacon-patrol","description":"Patrol 7: Quiet","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-25T19:15:59.128327-08:00","updated_at":"2025-12-25T19:15:59.128327-08:00","closed_at":"2025-12-25T19:15:59.12827-08:00"}
{"id":"gt-6swlo","title":"Digest: mol-deacon-patrol","description":"P16","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:26:46.30747-08:00","updated_at":"2025-12-27T21:26:01.609962-08:00","deleted_at":"2025-12-27T21:26:01.609962-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-6t0","title":"gt swarm: Not discovering tasks from epic dependents","description":"gt swarm create/start shows '0 tasks loaded' even when epic has dependents.\n\nRepro:\n1. Create epic gt-hw6\n2. Create tasks and add deps: bd dep add gt-xxx gt-hw6\n3. gt swarm create gastown --epic gt-hw6 --worker Toast\n4. Swarm shows 'Tasks: 0'\n\nExpected: Swarm should discover tasks that depend on the epic.\nActual: Shows '(no tasks loaded)'","status":"tombstone","priority":0,"issue_type":"bug","created_at":"2025-12-17T22:25:41.653628-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"bug"}
{"id":"gt-6tf","title":"Implement gt harness create command","description":"Add scaffolding command to create a new harness:\n- gt harness create [path]\n- Creates config/, mayor/, .beads/redirect (optional)\n- Optionally initializes git\n- Generates CLAUDE.md with Mayor role\n- Could also offer a template repo alternative","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-17T17:15:34.342552-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-6tf","depends_on_id":"gt-cr9","type":"blocks","created_at":"2025-12-17T17:15:51.845578-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-6vks","title":"Bug: LIFECYCLE messages use literal \u003crig\u003e placeholder","description":"Found stale LIFECYCLE messages addressed to @\u003crig\u003e/witness instead of actual rig names like @gastown/witness. Template substitution not happening.\n\nExamples found:\n- gm-bay: LIFECYCLE: polecat requesting shutdown (to @\u003crig\u003e/witness)\n- gm-7du, gm-94e, gm-u5x, gm-89b: same issue\n\nFix: Ensure rig name is substituted in lifecycle mail templates.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-20T03:12:28.146613-08:00","updated_at":"2025-12-27T21:29:56.855943-08:00","deleted_at":"2025-12-27T21:29:56.855943-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-6wyeg","title":"Digest: mol-deacon-patrol","description":"Patrol 21: all healthy, no lifecycle requests, 4 in-progress issues noted, handoff triggered at threshold","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:01:52.406307-08:00","updated_at":"2025-12-27T21:26:02.789713-08:00","deleted_at":"2025-12-27T21:26:02.789713-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-6y5b","title":"Polecat mood command and status line display","description":"CLI and status line support for polecat mood.\n\n## CLI Command\n```\ngt polecat mood \u003cname\u003e \u003cemoji\u003e # Set mood\ngt polecat mood \u003cname\u003e # Get mood\n```\n\nSets GT_MOOD environment variable in the polecat's tmux session.\n\n## Status Line Integration\nUpdate runWorkerStatusLine() to read mood:\n```go\nmood, _ := t.GetEnvironment(session, \"GT_MOOD\")\nif mood != \"\" {\n icon = mood // Use mood emoji instead of default\n} else {\n icon = AgentTypeIcons[AgentPolecat] // Fallback to 😺\n}\n```\n\n## Storage\nMood stored in tmux session environment (ephemeral, per-session).\nNo persistence needed - mood is reassessed each patrol cycle.\n\nDepends on: gt-u818 (plugin system)","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-21T16:17:14.111807-08:00","updated_at":"2025-12-27T21:29:56.553925-08:00","dependencies":[{"issue_id":"gt-6y5b","depends_on_id":"gt-u818","type":"blocks","created_at":"2025-12-21T16:17:20.53466-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.553925-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-6yq9","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:24","description":"Patrol 8: quiet, 8 sessions healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:24:22.913616-08:00","updated_at":"2025-12-27T21:26:05.28903-08:00","deleted_at":"2025-12-27T21:26:05.28903-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-6yr7t","title":"Digest: mol-deacon-patrol","description":"Patrol 18: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:46:36.888735-08:00","updated_at":"2025-12-27T21:26:00.879576-08:00","deleted_at":"2025-12-27T21:26:00.879576-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-6z2","title":"Test Epic: GGT MVP Validation","status":"tombstone","priority":0,"issue_type":"epic","created_at":"2025-12-16T21:57:37.355269-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"epic"}
{"id":"gt-6z2.1","title":"Test Task 1: Add comment to gt.go","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-16T21:57:43.554166-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-6z2.1","depends_on_id":"gt-6z2","type":"parent-child","created_at":"2025-12-16T21:57:43.55748-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-6z2.2","title":"Test Task 2: Add comment to version.go","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-16T21:57:43.693217-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-6z2.2","depends_on_id":"gt-6z2","type":"parent-child","created_at":"2025-12-16T21:57:43.69361-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-6z9m","title":"Test7","description":"Test with fixed workdir","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T17:50:15.993225-08:00","updated_at":"2025-12-27T21:29:56.736076-08:00","deleted_at":"2025-12-27T21:29:56.736076-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"message"}
{"id":"gt-6zis","title":"Digest: mol-deacon-patrol","description":"Patrol #13","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:24:54.316471-08:00","updated_at":"2025-12-27T21:26:04.725615-08:00","deleted_at":"2025-12-27T21:26:04.725615-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-70b3","title":"detectSender() doesn't recognize crew workers","description":"## Problem\n\n`detectSender()` in mail.go only detects polecats, not crew workers.\n\n## Current Code (line 445)\n\n```go\n// If in a rig's polecats directory, extract address\nif strings.Contains(cwd, \"/polecats/\") {\n // extract rig/polecat\n}\n\n// Default to mayor\nreturn \"mayor/\"\n```\n\n## Symptom\n\nEmma (crew worker at `/Users/stevey/gt/beads/crew/emma`) runs:\n- `gt mail inbox` → checks `mayor/` inbox (wrong!)\n- Should check `beads/emma` or `beads/crew/emma`\n\n## Fix\n\nAdd crew detection:\n```go\n// If in a rig's crew directory, extract address \nif strings.Contains(cwd, \"/crew/\") {\n parts := strings.Split(cwd, \"/crew/\")\n if len(parts) \u003e= 2 {\n rigPath := parts[0]\n crewName := strings.Split(parts[1], \"/\")[0]\n rigName := filepath.Base(rigPath)\n return fmt.Sprintf(\"%s/%s\", rigName, crewName)\n }\n}\n```\n\n## Also Check\n\n- `gt prime` correctly detects role, so there may be another detection function that works\n- Should unify detection logic","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-18T21:40:26.520559-08:00","updated_at":"2025-12-27T21:29:54.10919-08:00","dependencies":[{"issue_id":"gt-70b3","depends_on_id":"gt-l4gm","type":"blocks","created_at":"2025-12-18T21:50:04.812663-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.10919-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-71f9o","title":"Digest: mol-deacon-patrol","description":"Cycle 13","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-31T22:53:58.558541-08:00","updated_at":"2025-12-31T22:53:58.558541-08:00","closed_at":"2025-12-31T22:53:58.558506-08:00"}
{"id":"gt-71i","title":"Update architecture.md: Engineer role and Beads merge queue","description":"Update docs/architecture.md with recent design decisions:\n\n1. Agent table: Change \"Refinery\" role to \"Engineer\"\n - Refinery = place/module/directory\n - Engineer = role (agent that works in the Refinery)\n\n2. Merge Queue section: Document Beads-native model\n - MRs are beads issues with --type=merge-request\n - gt mq commands (submit, list, next, process, reorder)\n - Ordering via depends-on links\n\n3. CLI section: Add gt mq commands\n\n4. Key Design Decisions: Add decisions for:\n - #15: Merge Queue in Beads\n - #16: Engineer role (distinct from Refinery place)\n - #17: Session restart protocol for Engineer","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T23:12:03.616159-08:00","updated_at":"2025-12-27T21:29:54.394591-08:00","dependencies":[{"issue_id":"gt-71i","depends_on_id":"gt-h5n","type":"blocks","created_at":"2025-12-16T23:12:14.92163-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.394591-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-71ts0","title":"Merge: slit-mjtj9dc8","description":"branch: polecat/slit-mjtj9dc8\ntarget: main\nsource_issue: slit-mjtj9dc8\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T22:47:55.177132-08:00","updated_at":"2025-12-30T23:12:31.089554-08:00","closed_at":"2025-12-30T23:12:31.089554-08:00","close_reason":"Branch already merged","created_by":"gastown/polecats/slit"}
{"id":"gt-722jc","title":"Digest: mol-deacon-patrol","description":"Patrol 4: all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:32:15.321778-08:00","updated_at":"2025-12-27T21:26:00.41535-08:00","deleted_at":"2025-12-27T21:26:00.41535-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-72so","title":"gt mq list: doesn't show submitted MRs","description":"After submitting MRs with gt mq submit, gt mq list gastown shows empty queue.\n\n## Reproduction\n1. gt mq submit --issue gt-h5n.5 --branch polecat/Scabrous\n2. gt mq list gastown → (empty)\n3. bd list --type=merge-request → shows the MR\n\n## Expected\ngt mq list should show submitted MRs\n\n## MR example\ngt-ts4u has rig: gastown in description, type=merge-request, status=open","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-19T14:54:26.731813-08:00","updated_at":"2025-12-27T21:29:54.008454-08:00","deleted_at":"2025-12-27T21:29:54.008454-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-74a7","title":"Wire GT_ACCOUNT env var into spawn/attach","description":"When spawning Claude Code (gt spawn, gt crew attach), check GT_ACCOUNT env var. If set, look up account in config and set CLAUDE_CONFIG_DIR accordingly.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T03:24:22.86335-08:00","updated_at":"2025-12-27T21:29:56.167565-08:00","dependencies":[{"issue_id":"gt-74a7","depends_on_id":"gt-58tu","type":"blocks","created_at":"2025-12-23T03:24:34.897966-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.167565-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-751s","title":"mol-witness-patrol","description":"Per-rig worker monitor patrol loop.\n\nThe Witness is the Pit Boss for your rig. You watch polecats, nudge them toward\ncompletion, verify clean git state before kills, and escalate stuck workers.\n\n**You do NOT do implementation work.** Your job is oversight, not coding.\n\nThis molecule uses wisp storage (.beads-wisp/) for ephemeral patrol state.\nPersistent state (nudge counts, handoffs) is stored in a witness handoff bead.","status":"tombstone","priority":4,"issue_type":"epic","created_at":"2025-12-23T01:41:54.504354-08:00","updated_at":"2025-12-25T15:52:43.900154-08:00","deleted_at":"2025-12-25T15:52:43.900154-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"epic"}
{"id":"gt-75b0","title":"Digest: mol-deacon-patrol","description":"Patrol 12: Routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:37:48.776182-08:00","updated_at":"2025-12-27T21:26:04.585408-08:00","deleted_at":"2025-12-27T21:26:04.585408-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-76yf9","title":"Digest: mol-deacon-patrol","description":"Patrol 2: All healthy, no changes","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:08:27.183686-08:00","updated_at":"2025-12-27T21:26:02.327206-08:00","deleted_at":"2025-12-27T21:26:02.327206-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-77nlt","title":"Fix integration test failures with current bd","description":"Integration suite currently fails under -tags=integration:\n\n- internal/cmd: TestInstallTownRoleSlots fails because install does not create expected agent beads/slots (bd slot show hq-mayor not found). Install logs include parsing bd create output: unexpected end of JSON input.\n- internal/daemon: role_config_integration_test fails because bd create --type role is rejected (invalid issue type: role) in current bd (v0.46.0 dev).\n\nGoal: make integration tests robust and passing (either adapt tests to current bd semantics/capabilities, or gate/skip when bd lacks required types), then run go test -tags=integration ./... green.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2026-01-08T11:31:25.64619+13:00","updated_at":"2026-01-08T12:04:20.154258+13:00","created_by":"jv","deleted_at":"2026-01-08T12:03:14.089826+13:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"bug"}
{"id":"gt-78ly","title":"implement","description":"Implement the solution for gt-1wmw. Follow codebase conventions.\nFile discovered work as new issues with bd create.\n\nMake regular commits with clear messages.\nKeep changes focused on the assigned issue.\n\nDepends: load-context","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:55:01.795741-08:00","updated_at":"2025-12-25T15:52:57.419666-08:00","deleted_at":"2025-12-25T15:52:57.419666-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-7918","title":"Patrols: Cyclic molecules for autonomous maintenance","description":"## Vision\n\nPatrols are **cyclic molecules** - workflow loops that give Gas Town its autonomous nervous system. While regular molecules are DAGs that terminate, patrols loop forever, performing maintenance tasks at varying cadences.\n\nThis is the \"steam engine\" of Gas Town: converting episodic Claude sessions into continuous autonomous operation.\n\n## Core Concepts\n\n### 1. Cyclic Molecules\n\nRegular molecule: A -\u003e B -\u003e C -\u003e done\nPatrol molecule: A -\u003e B -\u003e C --+\n ^ |\n +-------------+\n\nPatrols have a loop_to field specifying where to restart.\n\n### 2. Cooldown-Aware Steps (Atoms)\n\nSteps encode their own cadence. When patrol reaches a step:\n1. Check: now - last_run \u003e cooldown?\n2. If yes: execute, update last_run\n3. If no: skip (immediately close)\n\nThe patrol runner is simple - steps self-skip. Complexity distributed into atoms.\n\n### 3. The Beacon\n\nThe heartbeat that fires patrol triggers:\n- Internal ticker in Deacon (goroutine)\n- Or external cron firing gt deacon tick\n- Or mail-based triggers\n\nWithout beacon, nothing proactive happens.\n\n### 4. Session Reset as Patrol Step\n\nConnects to auto-handoff (gt-bcwn). Session reset is a patrol step, not a separate mechanism.\n\n### 5. Multi-Role Patrols\n\nEach supervisor has its own patrol:\n\n**Deacon patrol:** health-check (30s), session-gc (5m), beacon-tick (10s)\n**Witness patrol:** orphan-scan (10m), stuck-check (2m), molecule-progress (1m)\n**Refinery patrol:** queue-check (30s), pr-status (1m), merge-ready (30s)\n\n### 6. Cadence Tiers\n\n- Critical (10-30s): Health checks\n- Active (1-5m): Progress, nudges\n- Maintenance (10-30m): Orphans, GC\n- Periodic (1h+): Reports\n\n### 7. Best-Effort Scheduling\n\nNot real-time - more like cron. No hard deadlines. Catch-up, dont pile-up.\nPriority preemption (mail interrupts patrol). Graceful degradation under load.\n\n## Open Questions\n\n1. State persistence: Beads (self-describing) or file (faster)?\n2. Interruption: How does urgent mail preempt patrol?\n3. Error recovery: Backoff? Escalate? Circuit breaker?\n4. Coordination: Can patrols send mail to trigger other patrols?\n\n## Related\n\n- gt-bcwn: Auto-handoff (session reset is a patrol step)\n- Molecule system (patrols extend molecules with loops)\n- Deacon lifecycle management\n\n## Metaphor\n\nClaude was fire. Claude Code was steam. Gas Town is the steam engine. Beads is the train tracks.\n\nThe steam engine converts episodic combustion into continuous rotary motion.\nGas Town converts episodic Claude sessions into continuous autonomous work.","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-21T12:18:22.99128-08:00","updated_at":"2025-12-27T21:29:53.576866-08:00","dependencies":[{"issue_id":"gt-7918","depends_on_id":"gt-bcwn","type":"blocks","created_at":"2025-12-21T12:18:30.86651-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.576866-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-7919","title":"Fix failing beads tests: TestIntegration and TestPolecatWorkMolecule","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-22T12:46:29.649-08:00","updated_at":"2025-12-27T21:29:53.234558-08:00","deleted_at":"2025-12-27T21:29:53.234558-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-7920","title":"Create mol-refinery-patrol with test failure gates","description":"## Problem\n\nThe Refinery role lacks:\n1. A formal patrol molecule (Deacon has mol-deacon-patrol)\n2. Engineer identity/philosophy guidance\n3. Structural enforcement of failure handling (not just documentation)\n\nWhen tests fail during merge processing, the current guidance doesn't prevent 'disavowal' - noting a problem exists and proceeding without tracking it.\n\n## The Structural Solution\n\nCreate `mol-refinery-patrol` with **verification gates** that make disavowal structurally impossible:\n\n```markdown\n## Molecule: mol-refinery-patrol\n\n## Step: inbox-check\nCheck mail for MR submissions, escalations, messages.\nProcess any urgent items first.\n\n## Step: queue-scan\nFetch remote, identify polecat branches waiting.\nIf queue empty, skip to context-check.\nTrack branch list for this cycle.\nNeeds: inbox-check\n\n## Step: process-branch\nPick next branch. Rebase on current main.\nIf rebase conflicts: notify polecat, skip to next branch.\nNeeds: queue-scan\n\n## Step: run-tests\nRun go test ./...\nTrack results (pass/fail count).\nNeeds: process-branch\n\n## Step: handle-failures\n**GATE**: If tests passed, this step auto-completes.\nIf tests failed:\n- Diagnose: branch regression or pre-existing?\n- Branch regression → abort, notify polecat\n- Pre-existing → EITHER fix OR \\`bd create --type=bug\\`\n**VERIFY**: Fix committed OR bead filed before proceeding.\nNeeds: run-tests\n\n## Step: merge-push\nMerge to main (ff-only preferred).\nPush immediately.\nDelete polecat branch.\nNeeds: handle-failures\n\n## Step: loop-check\nMore branches? Return to process-branch.\nOtherwise continue to summary.\nNeeds: merge-push\n\n## Step: generate-summary\nSummarize cycle: branches processed, tests results, issues filed.\nNeeds: loop-check\n\n## Step: context-check\nCheck context usage.\nIf high: write handoff, prepare for burn/respawn.\nNeeds: generate-summary\n\n## Step: burn-or-loop\nIf context LOW and queue non-empty: loop (return to queue-scan)\nIf context HIGH or queue empty: burn and exit\nNeeds: context-check\n```\n\n## Implementation Tasks\n\n### 1. Add RefineryPatrolMolecule() to builtin_molecules.go\n- Copy pattern from DeaconPatrolMolecule\n- Include all steps with proper `Needs:` dependencies\n- Emphasize handle-failures gate in description\n\n### 2. Update Refinery CLAUDE.md\nAdd sections:\n- **The Engineer Mindset** (identity/philosophy)\n- **The Scotty Test** (Would Scotty walk past this?)\n- **Patrol Molecule** reference (mol-refinery-patrol)\n- **Test Failure Protocol** (explicit decision tree)\n\n### 3. Create prompts/roles/refinery.md\n- Follow pattern of deacon.md\n- Include patrol execution loop diagram\n- Reference the molecule\n\n### 4. Consider mol-witness-patrol\n- Witness has detailed heartbeat protocol but no formal molecule\n- Similar pattern could apply\n\n## Why This Matters\n\nGUPP: 'If you have work on your hook, you have to run it.'\nThe patrol molecule puts failure handling ON THE HOOK as a mandatory step.\nYou can't skip handle-failures to get to merge-push.\nThe structure enforces good behavior.\n\n## Related\n- gt-7919: Fix failing beads tests (discovered during this session)\n- mol-deacon-patrol (existing pattern to follow)\n- docs/molecular-chemistry.md (chemistry concepts)\n- docs/wisp-architecture.md (wisp vs mol decision)","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-22T13:01:48.588945-08:00","updated_at":"2025-12-27T21:25:59.95284-08:00","deleted_at":"2025-12-27T21:25:59.95284-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-7922","title":"Dynamic wisp modification: insert, reorder, bond mid-execution","description":"## Vision\n\nWisps should be living documents that adapt mid-execution, not static scripts.\n\n## Operations Needed\n\n### bd mol insert\nInsert a step into a running wisp:\n```bash\nbd mol insert \u003cwisp-id\u003e --after \u003cstep\u003e --step 'security-gate: Review auth changes'\n```\n\n### bd mol reorder \nChange step order in a running wisp:\n```bash\nbd mol reorder \u003cwisp-id\u003e --move process-branch --before current\n```\n\n### bd mol bond (to wisp)\nAttach an entire molecule to a running wisp:\n```bash\nbd mol bond mol-weekly-maintenance \u003cwisp-id\u003e --after await-work\n```\n\n### bd mol skip\nSkip a step (with reason):\n```bash\nbd mol skip \u003cwisp-id\u003e \u003cstep\u003e --reason 'Queue empty, skipping summary'\n```\n\n## Use Cases\n\n1. **Hotfix priority**: Reorder queue to process urgent MR first\n2. **Security gate**: Insert review step when PR touches sensitive files\n3. **Scheduled maintenance**: Bond maintenance mol on schedule\n4. **Context adaptation**: Skip steps when they don't apply\n\n## Why This Matters\n\nEngine room problems emerge dynamically. The wisp's flexibility lets agents model situations as they evolve, not execute rigid scripts.\n\nGUPP still applies: whatever's on your hook, you run. But the hook contents can adapt.\n\n## Implementation\n\nThis requires Beads changes to support wisp mutation:\n- Track step ordering in wisp storage\n- Support step insertion/deletion/reordering\n- Handle bond-to-wisp (spawn steps into existing wisp)\n- Preserve audit trail of modifications","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-22T13:24:26.533247-08:00","updated_at":"2025-12-27T21:29:56.353796-08:00","deleted_at":"2025-12-27T21:29:56.353796-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-7923","title":"gt rig add / gt doctor: patrol awareness and wiring","description":"## Problem\n\nWhen a rig is installed or audited, we need to ensure all built-in patrols and role hooks are properly wired up.\n\n## gt rig add Changes\n\nWhen adding a rig, automatically:\n\n1. **Create patrol molecules** for each role:\n - mol-deacon-patrol (town-level)\n - mol-witness-patrol (per-rig)\n - mol-refinery-patrol (per-rig)\n\n2. **Set up hooks** that trigger patrols:\n - Deacon: daemon timer / heartbeat\n - Witness: daemon timer / polecat lifecycle events\n - Refinery: MR submission events / daemon timer\n\n3. **Configure daemon** to manage these patrols:\n - Register patrol molecules in daemon config\n - Set up respawn behavior for each role\n\n4. **Create plugin directories**:\n - ~/gt/plugins/ (town-level)\n - \u003crig\u003e/plugins/ (rig-level, if needed)\n\n## gt doctor Changes\n\nAdd patrol health checks:\n\n### patrol-molecules-exist\n- Verify mol-deacon-patrol, mol-witness-patrol, mol-refinery-patrol exist\n- Check they parse correctly (valid steps, dependencies)\n\n### patrol-hooks-wired\n- Verify hooks trigger patrol execution\n- Check daemon is configured to manage patrols\n\n### patrol-not-stuck\n- Detect wisps that have been in-progress too long\n- Flag orphaned patrol molecules (no active session)\n\n### patrol-plugins-accessible\n- Verify plugin directories exist and are readable\n- Check plugin frontmatter parses correctly\n\n### patrol-roles-have-prompts\n- Verify prompts/roles/*.md exist for each role\n- Check they reference the correct patrol molecule\n\n## Auto-fix\n\ngt doctor --fix can:\n- Create missing patrol molecules\n- Wire up missing hooks\n- Create plugin directories\n- NOT restart stuck patrols (needs human decision)\n\n## Related\n- gt-7920 (mol-refinery-patrol)\n- gt-7921 (await-work and plugin-run)\n- docs/wisp-architecture.md","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-22T13:24:43.158379-08:00","updated_at":"2025-12-27T21:29:53.200531-08:00","deleted_at":"2025-12-27T21:29:53.200531-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-7asd","title":"Digest: mol-deacon-patrol","description":"Patrol 3: Routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:35:39.123943-08:00","updated_at":"2025-12-27T21:26:04.650909-08:00","deleted_at":"2025-12-27T21:26:04.650909-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-7bty9","title":"Digest: mol-deacon-patrol","description":"Patrol 1: All clear - inbox empty, agents healthy, no orphans","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T15:31:29.106816-08:00","updated_at":"2025-12-27T21:26:03.12439-08:00","deleted_at":"2025-12-27T21:26:03.12439-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-7ftgy","title":"Digest: mol-deacon-patrol","description":"Patrol 16: All healthy, mayor handoff observed (not for deacon)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T18:34:10.890209-08:00","updated_at":"2025-12-27T21:26:02.813965-08:00","deleted_at":"2025-12-27T21:26:02.813965-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-7ggsm","title":"Digest: mol-deacon-patrol","description":"Patrol 5: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T10:46:02.255072-08:00","updated_at":"2025-12-27T21:26:03.478693-08:00","deleted_at":"2025-12-27T21:26:03.478693-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-7gno","title":"Digest: mol-deacon-patrol","description":"Patrol #18: Stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:35:15.310522-08:00","updated_at":"2025-12-27T21:26:04.252071-08:00","deleted_at":"2025-12-27T21:26:04.252071-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-7hor","title":"Document the Propulsion Principle","description":"Write canonical documentation for the Universal Gas Town Propulsion Principle.\n\nLocation: gastown/mayor/rig/docs/propulsion-principle.md\n\nContent:\n- The One Rule (hook has work → work happens)\n- Why it works (stateless agents, molecule-driven)\n- The sling lifecycle diagram\n- Agent startup protocol\n- Examples and anti-patterns\n\nThis is foundational theory-of-operation documentation.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T03:17:47.790012-08:00","updated_at":"2025-12-27T21:29:56.362057-08:00","deleted_at":"2025-12-27T21:29:56.362057-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-7hwug","title":"Merge: gt-svdsy","description":"branch: polecat/capable-mjtltnm5\ntarget: main\nsource_issue: gt-svdsy\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T22:17:37.413408-08:00","updated_at":"2025-12-30T23:12:42.937419-08:00","closed_at":"2025-12-30T23:12:42.937419-08:00","close_reason":"Branch already merged","created_by":"gastown/polecats/capable"}
{"id":"gt-7hz3","title":"Merge: gt-92l","description":"branch: polecat/Ace\ntarget: main\nsource_issue: gt-92l\nrig: gastown","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-19T16:31:37.716367-08:00","updated_at":"2025-12-27T21:27:22.938439-08:00","deleted_at":"2025-12-27T21:27:22.938439-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-7i6i6","title":"Digest: mol-deacon-patrol","description":"Patrol 8: All green","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:31:52.727074-08:00","updated_at":"2025-12-27T21:26:02.577294-08:00","deleted_at":"2025-12-27T21:26:02.577294-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-7iek","title":"context-check","description":"Assess own context usage. If high, prepare for handoff.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T17:51:45.43771-08:00","updated_at":"2025-12-25T15:52:43.900154-08:00","deleted_at":"2025-12-25T15:52:43.900154-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-7ihm8","title":"Digest: mol-deacon-patrol","description":"Patrol 14: All clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T16:23:56.486493-08:00","updated_at":"2025-12-27T21:26:03.091768-08:00","deleted_at":"2025-12-27T21:26:03.091768-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-7ik","title":"Ephemeral polecats: spawn fresh, delete on completion","description":"## Design Decision\n\nSwitch from pooled/idle polecats to ephemeral model:\n- Spawn creates fresh worktree from main\n- Polecat requests shutdown when done (bottom-up)\n- Witness verifies handoff, kills session, deletes worktree\n- No 'idle' state - polecats exist only while working\n\n## Rationale\n\n1. **Git worktrees are fast** - pooling optimization is obsolete\n2. **Pooling creates maintenance burden:**\n - Git stashes accumulate\n - Untracked artifacts pile up\n - Branches drift from main\n - Beads DB gets stale\n3. **PGT sync problems** came from persistent branches\n4. **Support infrastructure exists** - Witness, Refinery, Mayor handle continuity\n5. **Simpler mental model** - polecat exists = work in progress\n\n## Lifecycle\n\n```\nSpawn:\n gt spawn --issue \u003cid\u003e\n → Creates fresh worktree: git worktree add polecats/\u003cname\u003e -b polecat/\u003cname\u003e\n → Initializes beads in worktree\n → Starts session, assigns work\n\nWorking:\n Polecat does task\n → Pushes to polecat/\u003cname\u003e branch\n → Submits to merge queue when ready\n\nCompletion (POLECAT-INITIATED):\n Polecat runs: gt handoff\n → Verifies git state clean\n → Sends mail to Witness: \"Ready for shutdown\"\n → Marks itself done, waits for termination\n\nCleanup (WITNESS-OWNED):\n Witness receives shutdown request\n → Verifies PR merged or in queue\n → Verifies no uncommitted changes\n → Kills session: gt session stop \u003crig\u003e/\u003cpolecat\u003e\n → Deletes worktree: git worktree remove polecats/\u003cname\u003e\n → Deletes branch: git branch -d polecat/\u003cname\u003e\n → Optionally: Notifies Mayor of completion\n```\n\n## Key Insight: Bottom-Up Shutdown\n\n**Old model (wrong)**: Top-down batch shutdown - \"cancel the swarm\"\n**New model (right)**: Bottom-up individual shutdown - polecat requests, Witness executes\n\nThis enables streaming:\n- Workers come and go continuously\n- No \"swarm end\" to trigger cleanup\n- Each worker manages its own lifecycle\n- Witness is the lifecycle authority\n\n## Implementation\n\n1. Add `gt handoff` command for polecats to request shutdown\n2. Modify gt spawn to always create fresh worktree\n3. Run bd init in new worktree (beads needs initialization)\n4. Add shutdown request handler to Witness\n5. Witness verifies handoff, then cleans up:\n - Kill session\n - Remove worktree\n - Delete branch\n6. Remove 'idle' state from polecat state machine\n7. Simplify gt polecat list (only shows active)\n\n## Impact on Other Tasks\n\n- gt-17r (Zombie cleanup): Becomes trivial - orphan worktrees\n- gt-4my (Worker health): Simpler - no idle/stuck ambiguity\n- gt-f9x.5/f9x.6 (Doctor): Fewer states to validate\n- gt-eu9 (Witness handoff): Witness receives polecat shutdown requests","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-17T15:44:31.139964-08:00","updated_at":"2025-12-27T21:29:54.27909-08:00","deleted_at":"2025-12-27T21:29:54.27909-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-7iu4u","title":"Digest: mol-deacon-patrol","description":"Patrol 12: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T10:18:28.987479-08:00","updated_at":"2025-12-27T21:26:03.536096-08:00","deleted_at":"2025-12-27T21:26:03.536096-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-7jh8r","title":"Digest: mol-deacon-patrol","description":"P7","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:24:01.882704-08:00","updated_at":"2025-12-27T21:26:01.687768-08:00","deleted_at":"2025-12-27T21:26:01.687768-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-7jtnu","title":"Merge: dementus-mjw46vz4","description":"branch: polecat/dementus-mjw46vz4\ntarget: main\nsource_issue: dementus-mjw46vz4\nrig: gastown\nagent_bead: gt-gastown-polecat-dementus","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T17:32:20.742553-08:00","updated_at":"2026-01-01T19:55:59.965387-08:00","closed_at":"2026-01-01T19:55:59.965387-08:00","close_reason":"Stale MR - branch no longer exists","created_by":"gastown/polecats/dementus"}
{"id":"gt-7lt","title":"gt mail send should tmux-notify recipient","description":"## Problem\n\nWhen mail is sent via gt mail send, the recipient session does not get a tmux notification. In Python Gas Town (PGT), mail delivery triggers a tmux display-message or similar notification so the agent knows mail arrived.\n\n## Expected Behavior\n\nWhen gt mail send \u003caddr\u003e is called:\n1. Mail is delivered to recipient inbox\n2. If recipient has an active tmux session, send notification\n3. Notification should be visible (display-message or bell)\n\n## Current Behavior\n\nMail is delivered but no notification. Agent has to poll inbox to discover new mail.\n\n## Impact\n\n- Agents miss time-sensitive messages\n- Heartbeat pokes from daemon will not wake agents\n- Coordination is slower (polling vs push)\n\n## Implementation Notes\n\nAfter successful mail delivery, check if recipient has active session:\n- gt session list can identify active sessions\n- tmux display-message or send-keys can notify\n- Could inject a visible prompt like \"[MAIL] New message from \u003csender\u003e\"\n\n## Reference\n\nCheck PGT implementation for how it handles this.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-18T12:28:50.142075-08:00","updated_at":"2025-12-27T21:29:54.195117-08:00","deleted_at":"2025-12-27T21:29:54.195117-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-7m572","title":"Merge: gt-svdsy","description":"branch: polecat/capable-mjtltnm5\ntarget: main\nsource_issue: gt-svdsy\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T22:20:08.869585-08:00","updated_at":"2025-12-30T23:12:42.909537-08:00","closed_at":"2025-12-30T23:12:42.909537-08:00","close_reason":"Branch already merged","created_by":"gastown/polecats/capable"}
{"id":"gt-7nb3h","title":"Digest: mol-deacon-patrol","description":"Patrol 5: All healthy, no messages","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:38:56.411245-08:00","updated_at":"2025-12-27T21:26:02.905339-08:00","deleted_at":"2025-12-27T21:26:02.905339-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-7oow","title":"gt mail: Cross-level routing is broken","description":"**Problem:**\nWhen Mayor (at ~/gt) sends mail to a rig worker (gastown/crew/max), the message lands in town-level beads (~/.beads/) but the recipient checks rig-level beads (crew/max/.beads/).\n\n**Reproduction:**\n```bash\ncd ~/gt\ngt mail send gastown/crew/max -s 'Test' -m 'body'\n# Message goes to ~/gt/.beads/ with prefix hq-*\n\ncd ~/gt/gastown/crew/max \ngt mail inbox\n# Does NOT see the message - it's looking in crew/max/.beads/\n```\n\n**Root cause:**\n`findBeadsWorkDir()` walks up from CWD to find .beads. `Router.Send()` runs `bd create` in that directory. This means messages always go to the sender's beads, not the recipient's.\n\n**Fix options:**\n1. Route based on recipient address - if sending to rig/*, use that rig's .beads\n2. Use a single shared beads database for all mail (simpler but less isolated)\n3. Teach agents to check both levels (workaround, not fix)\n\n**Related:**\n- gt-ngu1: Pinned beads sorting (done but pointless if mail doesn't route)\n- This blocks all cross-level mail functionality","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-20T17:57:30.258991-08:00","updated_at":"2025-12-27T21:29:53.660771-08:00","deleted_at":"2025-12-27T21:29:53.660771-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-7os1m","title":"Digest: mol-deacon-patrol","description":"Patrol 8: Witnesses healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:54:10.611738-08:00","updated_at":"2025-12-27T21:26:01.519189-08:00","deleted_at":"2025-12-27T21:26:01.519189-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-7qgm0","title":"Digest: mol-deacon-patrol","description":"Patrol 14: All healthy, read Mayor handoff (info)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T14:54:20.388504-08:00","updated_at":"2025-12-27T21:26:03.14066-08:00","deleted_at":"2025-12-27T21:26:03.14066-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-7sqi","title":"Refactor: Extract common manager creation boilerplate","description":"Five nearly identical functions exist for creating managers:\n- getPolecatManager (polecat.go:241)\n- getSessionManager (session.go:183)\n- getCrewManager (crew_helpers.go:44)\n- getRefineryManager (refinery.go:116)\n- getWitnessManager (witness.go:102)\n\nAll follow the same pattern: find workspace, load config, get rig, create manager. Should extract to a common helper that takes a factory function.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T21:34:30.495275-08:00","updated_at":"2025-12-27T21:29:56.487369-08:00","deleted_at":"2025-12-27T21:29:56.487369-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-7tmh","title":"mol-deacon-patrol","description":"[RESURRECTED] This issue was deleted but recreated as a tombstone to preserve hierarchical structure.\n\nOriginal description:\nDeacon patrol molecule template. Label: template","status":"tombstone","priority":4,"issue_type":"epic","created_at":"2025-12-23T13:03:21.516072-08:00","updated_at":"2025-12-25T15:52:43.900154-08:00","deleted_at":"2025-12-25T15:52:43.900154-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"epic"}
{"id":"gt-7wcf","title":"README: Update Install section for homebrew/npm","description":"README currently shows:\n go install github.com/steveyegge/gastown/cmd/gt@latest\n\nBut users will install via homebrew or npm. Update to show:\n brew install gastown\n # or\n npm install -g @anthropic/gastown\n\nAlso add prerequisites section (tmux, Claude Code CLI).","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-24T13:04:51.751097-08:00","updated_at":"2025-12-27T21:29:52.630314-08:00","dependencies":[{"issue_id":"gt-7wcf","depends_on_id":"gt-jo9n","type":"blocks","created_at":"2025-12-24T13:05:00.100253-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.630314-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-7we","title":"Swarms of One: Lightweight single-worker task dispatch","description":"Design and implement a lightweight pattern for firing off single workers to handle tasks without full swarm overhead.\n\n## Context\n\nCurrently we have:\n- town spawn: Creates a polecat in a rig with issue assignment\n- Swarms (sw-*): Full lifecycle tracking with manifest, state, events, reports\n- Ephemeral rigs (rig-*): Temporary worker groups for swarms\n\nWhat's missing: A simple way to say 'fire off a worker to do X' without swarm ceremony.\n\n## Design Questions\n\n1. Should this be a new command like 'gt fire' or an option on existing commands?\n2. Should single tasks still get swarm IDs (sw-N) for consistency/queryability?\n3. Should it default to creating new workers or support --reuse for idle polecats?\n4. How does this relate to crew workers (gt-cik)?\n\n## Possible Interface\n\ngt fire --rig gastown --issue gt-xyz [--prompt '...']\ngt fire gastown/QuickTask --issue gt-xyz\n\n## Related\n\n- gt-cik: Overseer Crew (user-managed persistent workspaces)\n- gt-kmn: Swarm System epic","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-16T16:51:02.716629-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-7x7rf","title":"Digest: mol-deacon-patrol","description":"Patrol 12: clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:15:51.554671-08:00","updated_at":"2025-12-27T21:26:01.006017-08:00","deleted_at":"2025-12-27T21:26:01.006017-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-7xtn","title":"Bug: MRs appearing in bd ready output","description":"merge-request beads are showing up in `bd ready` output alongside actual work items.\n\n## Problem\n`bd ready` is meant to show work available for polecats to claim. MRs are internal workflow items processed by the Refinery, not polecat work.\n\n## Example\n```\n| P1 | merge-request | bd-3zzh | Merge: bd-tvu3 |\n| P1 | merge-request | bd-fcl1 | Merge: bd-au0.5 |\n```\n\n## Possible Fixes\n1. **Filter by type** - `bd ready` excludes type=merge-request, type=gate, type=molecule\n2. **Set assignee on submit** - MR assigned to refinery/ when created, so not 'unassigned'\n3. **Set status=in_progress** - MR starts in_progress since it's queued for processing\n\n## Related\n- gate beads also showing (bd-hyp6, bd-wu62) - same issue","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-23T14:38:29.507419-08:00","updated_at":"2025-12-27T21:29:52.965432-08:00","deleted_at":"2025-12-27T21:29:52.965432-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-7yc3u","title":"Digest: mol-deacon-patrol","description":"Patrol 2: all healthy, no incidents","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T21:52:09.555378-08:00","updated_at":"2025-12-27T21:26:00.593285-08:00","deleted_at":"2025-12-27T21:26:00.593285-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-80g0k","title":"Digest: mol-deacon-patrol","description":"Patrol 13: Quick cycle, all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:08:36.666314-08:00","updated_at":"2025-12-27T21:26:02.950916-08:00","deleted_at":"2025-12-27T21:26:02.950916-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-82y","title":"Design: Swarm shutdown and worker cleanup","description":"Design for graceful swarm shutdown, worker cleanup, and session cycling.\n\n## Key Decisions\n\n1. Pre-kill verification uses model intelligence (not framework rules)\n2. Witness can request restart when context filling (mail self, exit)\n3. Mayor NOT involved in per-worker cleanup (Witness responsibility)\n4. Clear responsibility boundaries between Mayor/Witness/Polecat\n\n## Subtasks (implementation)\n\n- gt-sd6: Polecat decommission checklist prompting\n- gt-f8v: Witness pre-kill verification protocol\n- gt-eu9: Witness session cycling and handoff\n- gt-gl2: Mayor vs Witness cleanup responsibilities\n\n**Design complete.** Each subtask has full specification in its description.","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-15T19:47:44.936374-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"epic"}
{"id":"gt-83guu","title":"Digest: mol-deacon-patrol","description":"Patrol 3: 14 sessions healthy, no changes","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:28:52.67281-08:00","updated_at":"2025-12-27T21:26:02.618083-08:00","deleted_at":"2025-12-27T21:26:02.618083-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-83k0","title":"mol-witness-patrol molecule definition","description":"Create mol-witness-patrol in builtin_molecules.go.\n\n## Steps (10 total)\n1. inbox-check - Process witness mail (lifecycle, help requests)\n2. load-state - Read handoff bead, get nudge counts\n3. survey-workers - gt polecat list, categorize by status\n4. inspect-workers - tmux capture-pane for each 'working' polecat\n5. decide-actions - Apply nudge matrix, queue actions\n6. execute-actions - Nudge, kill, or escalate as decided\n7. save-state - Update handoff bead with new states\n8. generate-summary - Summarize cycle for digest\n9. context-check - Check own context usage\n10. burn-or-loop - Squash wisp, then loop or cycle session\n\n## Key Behaviors\n- Uses wisp storage (.beads-wisp/)\n- Reads/writes witness handoff bead for state persistence\n- Progressive nudging (3 levels before escalate)\n- Pre-kill verification before killing polecats\n\n## Reference\n- See prompts/roles/witness.md for protocol details\n- See mol-refinery-patrol for similar structure\n- Parent epic: gt-aqd8","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T16:42:43.697249-08:00","updated_at":"2025-12-27T21:25:59.944731-08:00","deleted_at":"2025-12-27T21:25:59.944731-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-85a","title":"gt spawn: Not injecting work instructions to session","description":"gt spawn starts session but doesn't inject the issue assignment.\n\nRepro:\n1. gt spawn gastown/Toast --issue gt-2ux\n2. Session starts but polecat just sees Claude prompt\n3. No issue context injected\n\nExpected: Polecat should receive issue details automatically.\nActual: Polecat sits at blank prompt, needs manual injection.","status":"tombstone","priority":0,"issue_type":"bug","created_at":"2025-12-17T22:28:02.583003-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"bug"}
{"id":"gt-85gfs","title":"Digest: mol-deacon-patrol","description":"Patrol 20: all healthy, doctor pass, handoff cycle","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:37:50.9058-08:00","updated_at":"2025-12-27T21:26:00.694201-08:00","deleted_at":"2025-12-27T21:26:00.694201-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-86w","title":"CLI: doctor diagnostics and auto-repair","description":"GGT completely lacks the doctor command which is critical for debugging.\n\nRequired Commands:\n- gt doctor [\u003crig\u003e] - Run diagnostic checks\n- gt doctor --fix - Auto-repair common issues\n\nChecks to Implement:\nWorkspace Level: Config validity, Mayor mailbox, Rig registry\nRig Level: Git state, clone health, witness/refinery presence, beads sync\nSwarm Level: Stuck detection, zombie sessions, heartbeat health\n\nPGT Reference: gastown-py/src/gastown/cli/dashboard_cmd.py","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T14:46:34.721484-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-87jz","title":"mol-witness-patrol","description":"Per-rig worker monitor patrol loop using the Christmas Ornament pattern.\n\nThe Witness is the Pit Boss for your rig. You watch polecats, nudge them toward\ncompletion, verify clean git state before kills, and escalate stuck workers.\n\n**You do NOT do implementation work.** Your job is oversight, not coding.\n\n## The Christmas Ornament Shape\n\nThis molecule uses dynamic bonding to create inspection arms per-polecat:\n\n```\n ★ mol-witness-patrol (trunk)\n /|\\\n ┌─────┘ │ └─────┐\n PREFLIGHT DISCOVERY CLEANUP\n │ │ │\n ┌───┴───┐ ┌─┴─┐ ┌───┴───┐\n │inbox │ │sur│ │aggreg │\n │refnry │ │vey│ │save │\n │load │ └─┬─┘ │summary│\n └───────┘ │ │contxt │\n │ │loop │\n ┌─────────┼─────────┐ └───────┘\n │ │ │\n ● ● ● mol-polecat-arm (dynamic)\n ace nux toast\n │ │ │\n ┌──┴──┐ ┌──┴──┐ ┌──┴──┐\n │cap │ │cap │ │cap │\n │ass │ │ass │ │ass │\n │dec │ │dec │ │dec │\n │exec │ │exec │ │exec │\n └──┬──┘ └──┬──┘ └──┬──┘\n │ │ │\n └─────────┴─────────┘\n │\n ⬣ base (cleanup)\n```\n\n## Phases\n\n### PREFLIGHT (fixed steps)\n1. inbox-check - Process lifecycle requests, help messages\n2. check-refinery - Ensure MQ is alive and processing\n3. load-state - Read persistent state (nudge counts, etc.)\n\n### DISCOVERY (spawns dynamic arms)\n4. survey-workers - List polecats, bond mol-polecat-arm per polecat\n5. run-plugins - Bond mol-plugin-runner for each witness plugin\n\n### CLEANUP (gate + fixed steps)\n6. aggregate - GATE: WaitsFor all arms + plugins to complete\n7. save-state - Persist nudge counts, action log\n8. generate-summary - Create digest content\n9. context-check - Check if context is high\n10. burn-or-loop - Squash/burn wisp, then loop or exit\n\n## Dynamic Arms\n\nEach polecat gets mol-polecat-arm bonded as a wisp child:\n- capture - Capture tmux output\n- assess - Categorize state (working/idle/error/done)\n- load-history - Get nudge counts for this polecat\n- decide - Apply nudge matrix\n- execute - Take action (nudge/kill/escalate/none)\n\nArms run in PARALLEL. The aggregate step waits for all to complete.\n\n## Activity Feed\n\nThis design enables real-time visibility:\n\n```\n[14:32:08] + patrol-x7k.arm-ace bonded (5 steps)\n[14:32:09] → patrol-x7k.arm-ace.capture in_progress\n[14:32:10] ✓ patrol-x7k.arm-ace.capture completed\n[14:32:14] ✓ patrol-x7k.arm-ace.decide completed (action: nudge-1)\n[14:32:17] ✓ patrol-x7k.arm-ace COMPLETE\n```\n\n## Storage\n\n- Wisp storage: .beads-wisp/ (ephemeral, gitignored)\n- Persistent state: witness handoff bead (nudge counts, etc.)\n- Digests: Squashed summaries in permanent beads\n\n## Dependencies\n\n- bd-xo1o: Dynamic Molecule Bonding epic (in beads rig)\n - bd mol bond with variable substitution\n - WaitsFor directive for fanout gates\n - Activity feed query\n\nLabels: [template, christmas-ornament]","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-23T16:23:42.025546-08:00","updated_at":"2025-12-27T21:25:59.91152-08:00","deleted_at":"2025-12-27T21:25:59.91152-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-88r9s","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 5: all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:39:36.316364-08:00","updated_at":"2025-12-27T21:26:01.43596-08:00","deleted_at":"2025-12-27T21:26:01.43596-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8bx","title":"Adaptive backoff for daemon heartbeat","description":"Track agent responsiveness and adjust heartbeat frequency.\n\n## Per-Agent State\n\n```go\ntype AgentBackoff struct {\n BaseInterval time.Duration // 60s default\n CurrentInterval time.Duration // grows when busy\n MaxInterval time.Duration // 10min cap\n ConsecutiveMiss int // pokes with no response\n}\n```\n\n## Strategy Options\n\n- **Fixed**: Always 60s (current, simple)\n- **Geometric**: 60s → 90s → 135s → 202s (factor 1.5)\n- **Exponential**: 60s → 120s → 240s (factor 2, aggressive)\n\n## Recovery\n\nWhen agent responds (runs a command):\n- Reset ConsecutiveMiss to 0\n- Return to BaseInterval immediately\n\n## Benefits\n\n- Reduces noise for busy agents\n- Saves resources during quiet periods\n- Still catches stuck agents (max interval cap)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-18T14:19:33.083844-08:00","updated_at":"2025-12-27T21:29:57.151722-08:00","dependencies":[{"issue_id":"gt-8bx","depends_on_id":"gt-bfd","type":"blocks","created_at":"2025-12-18T14:19:46.912289-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.151722-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8cmas","title":"Session ended: gt-gastown-nux","status":"closed","priority":2,"issue_type":"event","created_at":"2026-01-02T20:52:30.754198-08:00","updated_at":"2026-01-03T11:32:45.191435-08:00","closed_at":"2026-01-03T11:32:45.191435-08:00","close_reason":"Session lifecycle events processed","created_by":"gastown/polecats/nux"}
{"id":"gt-8dk07","title":"Digest: mol-deacon-patrol","description":"Patrol 2: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T13:26:44.693291-08:00","updated_at":"2025-12-27T21:26:03.295713-08:00","deleted_at":"2025-12-27T21:26:03.295713-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8dry","title":"Add role-specific subcommands that delegate to core commands (agent UX)","description":"## Problem\n\nAgents naturally guess command patterns like:\n- `gt witness nudge \u003cpolecat\u003e`\n- `gt polecat nudge \u003cname\u003e`\n- `gt crew restart \u003cmember\u003e`\n\nBut these don't exist - the actual commands are:\n- `gt nudge \u003csession\u003e \u003cmessage\u003e`\n- `gt polecat reset \u003cname\u003e`\n\nThis creates friction in agent UX. Agents shouldn't have to memorize the 'correct' command structure.\n\n## Proposal\n\nAdd subcommands to each role that delegate to core commands with argument reordering:\n\n```\ngt witness nudge furiosa \"Start working\"\n → gt nudge gt-gastown-furiosa \"Start working\"\n\ngt polecat nudge gastown/furiosa \"Check mail\"\n → gt nudge gt-gastown-furiosa \"Check mail\"\n\ngt crew nudge gastown/max \"Wake up\"\n → gt nudge gt-crew-gastown-max \"Wake up\"\n```\n\n## Benefits\n\n1. **Discoverable**: Agents explore `gt witness --help` and find nudge\n2. **Lenient**: Multiple valid ways to express the same intent\n3. **Role-contextual**: Commands under the role namespace feel natural\n4. **Extensible**: Pattern works for future subcommands (status, reset, etc.)\n\n## Implementation\n\nEach role command (witness, polecat, crew, refinery) gets thin wrapper subcommands:\n\n```go\n// In witness.go\nwitnessCmd.AddCommand(\u0026cobra.Command{\n Use: \"nudge \u003cpolecat\u003e \u003cmessage\u003e\",\n Short: \"Nudge a polecat (delegates to gt nudge)\",\n RunE: func(cmd *cobra.Command, args []string) error {\n session := formatSession(rig, args[0])\n return runNudge(session, args[1])\n },\n})\n```\n\n## Future Candidates\n\nOther subcommands to add as role-specific aliases:\n- `gt polecat status \u003cname\u003e` (already exists, good pattern)\n- `gt witness check \u003cpolecat\u003e` → trigger a manual check\n- `gt refinery merge \u003cmr-id\u003e` → process specific MR\n- `gt crew send \u003cmember\u003e \u003cmessage\u003e` → mail to crew member","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-21T14:05:17.303809-08:00","updated_at":"2025-12-27T21:29:56.628234-08:00","deleted_at":"2025-12-27T21:29:56.628234-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-8dv11","title":"Digest: mol-deacon-patrol","description":"Patrol 13: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T14:49:01.104722-08:00","updated_at":"2025-12-27T21:26:03.148765-08:00","deleted_at":"2025-12-27T21:26:03.148765-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8fdec","title":"Digest: mol-deacon-patrol","description":"Patrol 4: stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:10:02.355677-08:00","updated_at":"2025-12-27T21:26:02.310836-08:00","deleted_at":"2025-12-27T21:26:02.310836-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8gvn","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:28","description":"Patrol 18","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:28:54.929537-08:00","updated_at":"2025-12-27T21:26:05.204726-08:00","deleted_at":"2025-12-27T21:26:05.204726-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8h4","title":"Pinned Beads: Ongoing concerns and anchors","description":"Pinned beads represent persistent concerns that do not close traditionally. Stay out of bd ready. Examples: Monitor production, Weekly syncs. Implementation: pinned: true field on bead.","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-18T18:08:11.314086-08:00","updated_at":"2025-12-27T21:29:57.135056-08:00","deleted_at":"2025-12-27T21:29:57.135056-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-8j8e","title":"gt mail send: --priority flag should work like bd mail send","description":"UX inconsistency: gt mail send passes flags that bd mail send doesn't support.\n\n## Root Cause\n\nrouter.go line 42 passes `--priority` to bd:\n```go\nargs = append(args, \"--priority\", fmt.Sprintf(\"%d\", beadsPriority))\n```\n\nBut `bd mail send` only has `--urgent` (boolean), not `--priority`.\n\n## Fix Options\n\n1. Add `--priority` flag to `bd mail send` (preferred - more expressive)\n2. Change router to only use `--urgent` when priority=0\n\n## Also Affected\n\n- `--type` flag (line 46) - bd mail send doesn't have this\n- `--thread-id` flag (line 51) - bd mail send doesn't have this \n- `--reply-to` flag (line 56) - bd mail send doesn't have this\n\nThe router assumes bd mail send has features it doesn't have.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-18T21:31:05.486487-08:00","updated_at":"2025-12-27T21:29:57.093543-08:00","deleted_at":"2025-12-27T21:29:57.093543-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-8k6fe","title":"Digest: mol-deacon-patrol","description":"Patrol 16","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:56:52.749639-08:00","updated_at":"2025-12-27T21:26:01.494229-08:00","deleted_at":"2025-12-27T21:26:01.494229-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8ky8i","title":"Digest: mol-deacon-patrol","description":"Patrol complete: all healthy, cleaned 2 orphan processes + 3 stale locks","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:27:25.666937-08:00","updated_at":"2025-12-27T21:26:00.84611-08:00","deleted_at":"2025-12-27T21:26:00.84611-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8mbz","title":"Town Doctor molecule for harness health checks","description":"Create a Town Doctor molecule that any Gas Town agent can run to diagnose and repair harness issues.\n\n## Concept\n\nInstead of just `gt doctor` as a CLI command with hardcoded checks, create a **molecule** (checklist workflow) that:\n- Any agent (Mayor, Witness, Polecat) can instantiate\n- Walks the agent through diagnostic steps\n- Agent uses judgment to fix issues found\n- Works as a structured troubleshooting guide\n\n## Why a Molecule?\n\n1. **Agent-driven**: The agent running it becomes \"the doctor\" temporarily\n2. **Extensible**: Add new checks by updating the molecule, not code\n3. **Contextual**: Agent can reason about issues, not just run scripts\n4. **Self-healing**: Agent can fix problems it finds, not just report them\n\n## Proposed Checks (molecule steps)\n\n1. Verify harness structure (mayor/, .beads/, CLAUDE.md exist)\n2. Validate config files (town.json, rigs.json parse correctly)\n3. Check beads health (bd doctor, redirect validity)\n4. Verify git state (clean working tree, proper remotes)\n5. Check rig integrity (each registered rig exists, has config.json)\n6. Validate agent clones (mayor/rig/, refinery/rig/ exist and are valid)\n7. Check for orphaned worktrees/branches\n8. Verify daemon state (if running)\n\n## Integration\n\n- `gt doctor` could instantiate the molecule for the current agent\n- Or agent can run `bd ready` and pick up doctor tasks when prioritized\n- Results logged to beads for audit trail\n\n## Related\n\n- gt-cr9: Harness Design \u0026 Documentation (completed)\n- Molecules design in architecture.md","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-19T13:03:22.688851-08:00","updated_at":"2025-12-27T21:29:57.001673-08:00","deleted_at":"2025-12-27T21:29:57.001673-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-8n6dx","title":"Digest: mol-deacon-patrol","description":"Patrol 6: Routine","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T22:41:13.824117-08:00","updated_at":"2026-01-01T22:41:13.824117-08:00","closed_at":"2026-01-01T22:41:13.824077-08:00","dependencies":[{"issue_id":"gt-8n6dx","depends_on_id":"gt-eph-0hmg","type":"parent-child","created_at":"2026-01-01T22:41:13.825375-08:00","created_by":"deacon"}]}
{"id":"gt-8os8","title":"Work on ga-p6r: Add handoff protocol to spawn priming. En...","description":"Work on ga-p6r: Add handoff protocol to spawn priming. Ensure polecats receive handoff context when spawned. When done, submit MR (not PR) to integration branch for Refinery.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-19T22:57:59.334003-08:00","updated_at":"2025-12-27T21:29:56.906515-08:00","deleted_at":"2025-12-27T21:29:56.906515-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8q4qb","title":"Digest: mol-deacon-patrol","description":"Patrol 9: all clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T16:49:44.632044-08:00","updated_at":"2025-12-27T21:26:01.24637-08:00","deleted_at":"2025-12-27T21:26:01.24637-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz","title":"Molecule Algebra: Work Composition DSL","description":"Implement the molecule algebra - a declarative language for composing, transforming, and executing structured work. Enables mechanical composition without AI.\n\nThe Three Phases: Rig → Cook → Run\n\n- **Rig**: Source-level composition (formula YAML with extends/compose)\n- **Cook**: Instantiation (formula → proto → mol/wisp)\n- **Run**: Execution (agents complete steps)\n\nTwo Composition Operators:\n- **Rig** operates on formulas (source level)\n- **Bond** operates on artifacts (protos, mols, wisps)\n\nKey components:\n- Formulas: YAML source with composition rules (.formula.yaml)\n- Cook: Pre-expand macros/aspects to flat proto\n- Phase verbs (pour, wisp)\n- Bond: Artifact-level polymorphic composition\n- Composition operators (sequence, parallel, branch, loop, gate)\n- Advice operators (before, after, around - Lisp style)\n- Expansion operators (macros like Rule of Five)\n- Aspects (AOP cross-cutting concerns)\n\nSee docs/rig-cook-run.md for the lifecycle spec.\nSee docs/molecule-algebra.md for full algebra specification.\nExample formulas in .beads/formulas/","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-23T18:03:48.824827-08:00","updated_at":"2025-12-27T21:29:52.939844-08:00","deleted_at":"2025-12-27T21:29:52.939844-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-8tmz.1","title":"Phase verbs: pour and wisp commands","description":"Implement the phase transition verbs:\n- bd pour \u003cproto\u003e - instantiate as persistent mol\n- bd wisp \u003cproto\u003e - instantiate as ephemeral wisp\n- Update bd mol bond to inherit phase from target\n\nThese replace the current --wisp flag with cleaner verb semantics.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T18:03:57.280647-08:00","updated_at":"2025-12-27T21:29:52.931272-08:00","dependencies":[{"issue_id":"gt-8tmz.1","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:03:57.281096-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.931272-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.10","title":"Rename Engineer in Box to Shiny","description":"Rename mol-engineer-in-box to mol-shiny (or just 'shiny').\n\nMad Max reference - the canonical 'right way to engineer'.\nUpdate all references in docs and code.","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-23T18:04:40.434948-08:00","updated_at":"2025-12-27T21:29:57.448077-08:00","dependencies":[{"issue_id":"gt-8tmz.10","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:04:40.437009-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.448077-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.11","title":"Rule of Five expansion template","description":"Create rule-of-five as an expansion template:\n- Jeffrey's discovery: agents converge in 4-5 iterations\n- Template expands any step into 5-pass refinement\n- draft → refine-1 → refine-2 → refine-3 → refine-4\n\nFirst example of a macro-style expansion proto.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T18:04:41.438135-08:00","updated_at":"2025-12-27T21:29:55.8379-08:00","dependencies":[{"issue_id":"gt-8tmz.11","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:04:41.439635-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.8379-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.12","title":"Formula parser and YAML schema","description":"Implement formula parsing from YAML:\n- Define YAML schema for .formula.yaml files\n- Parse steps, compose rules, vars\n- Support extends for formula inheritance\n- Validate formula structure\n\nSchema should support:\n- formula: name\n- description: text\n- version: number\n- type: workflow|expansion|aspect\n- vars: variable definitions\n- steps: step definitions\n- compose: composition rules\n\n**Beads issue: bd-weu8**","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T18:20:02.788306-08:00","updated_at":"2025-12-27T21:29:52.894721-08:00","dependencies":[{"issue_id":"gt-8tmz.12","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:20:02.788811-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.894721-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.13","title":"bd cook command: Formula to Proto","description":"Implement the 'bd cook' command that transforms formulas into protos.\n\nUsage: bd cook \u003cformula-file\u003e [options]\n bd cook .beads/formulas/*.formula.yaml\n\nProcess:\n1. Parse formula YAML (uses formula parser)\n2. Resolve extends (formula inheritance)\n3. Expand macros (expansion rules)\n4. Apply aspects (cross-cutting concerns)\n5. Generate flat proto bead\n\nOutput:\n- Creates proto bead in .beads/ with mol-prefix\n- Proto has all steps pre-expanded\n- Proto is ready for pour/wisp instantiation\n\n**Beads issue: bd-wa2l**","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T18:20:03.9306-08:00","updated_at":"2025-12-27T21:29:52.886546-08:00","dependencies":[{"issue_id":"gt-8tmz.13","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:20:03.933113-08:00","created_by":"daemon"},{"issue_id":"gt-8tmz.13","depends_on_id":"gt-8tmz.12","type":"blocks","created_at":"2025-12-23T18:20:11.133013-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.886546-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.14","title":"bd formula list/show commands","description":"Implement formula management commands:\n\n bd formula list\n # Lists formulas from all search paths\n\n bd formula show rule-of-five\n # Shows formula details, steps, compose rules\n\nSearch paths (in order):\n1. .beads/formulas/ (project)\n2. ~/gt/.beads/formulas/ (town)\n3. ~/.beads/formulas/ (user)\n4. Built-in formulas (embedded)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T18:20:05.02817-08:00","updated_at":"2025-12-27T21:29:55.829567-08:00","dependencies":[{"issue_id":"gt-8tmz.14","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:20:05.03005-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.829567-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.15","title":"Formula cycle detection during cooking","description":"Detect and error on circular extends chains during bd cook. E.g., if formula A extends B extends A, cooking should fail with clear error message pointing to the cycle.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T18:45:06.751822-08:00","updated_at":"2025-12-27T21:29:52.878259-08:00","dependencies":[{"issue_id":"gt-8tmz.15","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:45:06.752271-08:00","created_by":"daemon"},{"issue_id":"gt-8tmz.15","depends_on_id":"gt-8tmz.13","type":"blocks","created_at":"2025-12-23T18:48:18.543425-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.878259-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.16","title":"Prevent aspect self-matching (infinite recursion)","description":"Aspects should only match original steps, not steps inserted by the same aspect application. Document this and implement guard during cooking. Without this, a pointcut like *.implement that inserts security-prescan could match its own insertion infinitely.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T18:45:07.855882-08:00","updated_at":"2025-12-27T21:29:52.870075-08:00","dependencies":[{"issue_id":"gt-8tmz.16","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:45:07.861131-08:00","created_by":"daemon"},{"issue_id":"gt-8tmz.16","depends_on_id":"gt-8tmz.5","type":"blocks","created_at":"2025-12-23T18:48:18.885103-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.870075-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.17","title":"Max expansion depth limit (default 5)","description":"Add maxExpansionDepth config (default 5) to prevent runaway nested expansions. This still allows massive work generation but with a safety bound. Note: distinguish compile-time cooking (produces proto with placeholders) vs runtime cooking (produces mol with real issue IDs). Compile-time is for modeling/estimation/contractor handoff.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T18:45:09.107895-08:00","updated_at":"2025-12-27T21:29:55.821148-08:00","dependencies":[{"issue_id":"gt-8tmz.17","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:45:09.109857-08:00","created_by":"daemon"},{"issue_id":"gt-8tmz.17","depends_on_id":"gt-8tmz.3","type":"blocks","created_at":"2025-12-23T18:48:19.037958-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.821148-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.18","title":"Cooking metadata: source tracing","description":"Add source_formula and source_location metadata to cooked steps so you can trace a step in a proto back to its origin formula. Useful for debugging complex compositions.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T18:45:10.295767-08:00","updated_at":"2025-12-27T21:29:55.812859-08:00","dependencies":[{"issue_id":"gt-8tmz.18","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:45:10.29809-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.812859-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.19","title":"Meta-formulas: formula generators","description":"Support meta-formulas that generate multiple formulas from a template. Example: multi-lang-shiny that generates shiny-go, shiny-python, shiny-typescript from a for-each loop over languages. Enables DRY formula libraries.","status":"tombstone","priority":3,"issue_type":"feature","created_at":"2025-12-23T18:45:28.847972-08:00","updated_at":"2025-12-27T21:29:57.4397-08:00","dependencies":[{"issue_id":"gt-8tmz.19","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:45:28.849855-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.4397-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-8tmz.2","title":"Advice operators: before, after, around","description":"Implement Lisp-style advice operators for molecules:\n- before(target, step) - insert step before target\n- after(target, step) - insert step after target\n- around(target, wrapper) - wrap target with before/after\n\nSupport glob patterns for targeting (*.implement, shiny.*, etc).","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T18:04:30.429759-08:00","updated_at":"2025-12-27T21:29:52.922158-08:00","dependencies":[{"issue_id":"gt-8tmz.2","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:04:30.430206-08:00","created_by":"daemon"},{"issue_id":"gt-8tmz.2","depends_on_id":"gt-8tmz.1","type":"blocks","created_at":"2025-12-23T18:04:49.721471-08:00","created_by":"daemon"},{"issue_id":"gt-8tmz.2","depends_on_id":"gt-8tmz.13","type":"blocks","created_at":"2025-12-23T18:48:18.633013-08:00","created_by":"daemon"},{"issue_id":"gt-8tmz.2","depends_on_id":"gt-mdgt8","type":"blocks","created_at":"2025-12-25T01:23:13.415509-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.922158-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.20","title":"Formula inheritance trees for organizational policies","description":"Support organizational formula hierarchies where teams extend a company-base-formula. Each team can add required aspects (accessibility audit for frontend, SQL injection scan for backend). Enables policy enforcement across an organization.","status":"tombstone","priority":3,"issue_type":"feature","created_at":"2025-12-23T18:45:46.888965-08:00","updated_at":"2025-12-27T21:29:57.43141-08:00","dependencies":[{"issue_id":"gt-8tmz.20","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:45:46.889387-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.43141-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-8tmz.21","title":"Conditional aspects (feature flags)","description":"Apply aspects conditionally based on environment or config. Example: apply security-audit only when env.SECURITY_SCANNING == enabled. Adds 'when' clause to aspect composition.","status":"tombstone","priority":3,"issue_type":"feature","created_at":"2025-12-23T18:45:48.17899-08:00","updated_at":"2025-12-27T21:29:57.42316-08:00","dependencies":[{"issue_id":"gt-8tmz.21","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:45:48.180636-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.42316-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-8tmz.22","title":"Review dimensions as aspects","description":"Refactor code-review molecule to use aspects for each review dimension (security, performance, accessibility). More flexible than current pluggable molecule approach - add/remove checks by composing aspects.","status":"tombstone","priority":3,"issue_type":"feature","created_at":"2025-12-23T18:45:49.336541-08:00","updated_at":"2025-12-27T21:29:57.414825-08:00","dependencies":[{"issue_id":"gt-8tmz.22","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:45:49.338462-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.414825-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-8tmz.23","title":"Compile-time vs runtime cooking","description":"Distinguish two cooking modes: 1) Compile-time: produces proto with variable placeholders, for modeling/estimation/contractor handoff. 2) Runtime: produces mol with real issue IDs. Compile-time lets you pre-decompose work for untrusted executors or planning purposes.","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-23T18:45:50.5263-08:00","updated_at":"2025-12-27T21:29:55.79592-08:00","dependencies":[{"issue_id":"gt-8tmz.23","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:45:50.527834-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.79592-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-8tmz.24","title":"Document Rig/Cook/Run lifecycle","description":"The rig-cook-run.md spec is written. Propagate terminology to all relevant docs: - molecule-algebra.md (done) - molecular-chemistry.md (done) - vision.md (done) - propulsion-principle.md - sling-design.md - Any agent templates that reference cooking/instantiation","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T19:18:26.167525-08:00","updated_at":"2025-12-27T21:29:52.8618-08:00","dependencies":[{"issue_id":"gt-8tmz.24","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T19:18:26.16795-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.8618-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.25","title":"Implement bd bond command for artifact-level composition","description":"Implement the artifact-level bond operator distinct from source-level rigging.\n\nBond accepts:\n- Formula names (cooks inline to ephemeral proto, then bonds)\n- Mol IDs (existing liquid work)\n- Wisp IDs (existing vapor work)\n\nSee molecular-chemistry.md for the full bond table semantics.\n\n**Updated 2025-12-25**: With ephemeral protos (gt-4v1eo), bond takes formula names directly and cooks inline. No pre-cooked proto beads.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T19:18:27.245107-08:00","updated_at":"2025-12-27T21:29:55.787471-08:00","dependencies":[{"issue_id":"gt-8tmz.25","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T19:18:27.246957-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.787471-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.26","title":"Direct formula-to-wisp instantiation (skip proto)","description":"For massive ephemeral workflows (e.g., million-step Towers of Hanoi), allow:\n\nbd wisp towers-of-hanoi --var disks=20\n\nThis would cook + wisp in one step, never materializing the proto. Useful when:\n- The proto would be huge (1M+ steps)\n- The work is ephemeral (wisp, not mol)\n- You don't need the intermediate artifact\n\nImplementation: streaming cook that emits directly to wisp storage.\n\nRelated: towers-of-hanoi.formula.yaml demonstrates the use case.","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-23T19:26:53.377902-08:00","updated_at":"2025-12-27T21:29:55.778998-08:00","dependencies":[{"issue_id":"gt-8tmz.26","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T19:26:53.378307-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.778998-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-8tmz.27","title":"Computed range expansion (for-each over expressions)","description":"Support for-each expansion over computed ranges, not just static lists:\n\nfor-each:\n var: move_num\n range: \"1..2^{disks}\" # Computed at cook time\n\nThis enables formulas like Towers of Hanoi where the step count is a function of input variables. The range expression is evaluated during cooking.\n\nRelated: towers-of-hanoi.formula.yaml demonstrates the need.\nDepends on: gt-8tmz.8 (Runtime dynamic expansion)","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-23T19:27:02.71296-08:00","updated_at":"2025-12-27T21:29:55.770638-08:00","dependencies":[{"issue_id":"gt-8tmz.27","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T19:27:02.713399-08:00","created_by":"daemon"},{"issue_id":"gt-8tmz.27","depends_on_id":"gt-8tmz.8","type":"blocks","created_at":"2025-12-23T19:27:08.191351-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.770638-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-8tmz.3","title":"Expansion operators: expand and map (macros)","description":"Implement macro-style expansion operators:\n- expand(target, template) - apply template to single step\n- map(molecule, template) - apply template to all matching steps\n\nTemplates use {target} substitution for step references.\nRule of Five is the canonical example.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T18:04:31.472199-08:00","updated_at":"2025-12-27T21:29:52.911358-08:00","dependencies":[{"issue_id":"gt-8tmz.3","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:04:31.472662-08:00","created_by":"daemon"},{"issue_id":"gt-8tmz.3","depends_on_id":"gt-8tmz.1","type":"blocks","created_at":"2025-12-23T18:04:49.809887-08:00","created_by":"daemon"},{"issue_id":"gt-8tmz.3","depends_on_id":"gt-8tmz.13","type":"blocks","created_at":"2025-12-23T18:48:18.717567-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.911358-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.32","title":"Consolidate molecular-chemistry.md with rig-cook-run.md","description":"Merge rig-cook-run.md into molecular-chemistry.md as the canonical chemical algebra spec:\n- Rig/Cook/Run as the lifecycle backbone\n- Full generation graph: Formula → Compound Formula → Proto → Mol/Wisp\n- Bond table at artifact level (symmetric)\n- Rig operator at source level\n- Unified vocabulary\n- Archive or redirect rig-cook-run.md","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T20:06:48.82201-08:00","updated_at":"2025-12-27T21:29:55.762212-08:00","dependencies":[{"issue_id":"gt-8tmz.32","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T20:06:48.823692-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.762212-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.33","title":"Map expansion should match nested child steps","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-25T11:50:05.247378-08:00","updated_at":"2025-12-27T21:29:55.240441-08:00","dependencies":[{"issue_id":"gt-8tmz.33","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-25T11:50:05.247973-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.240441-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-8tmz.34","title":"Expansion var overrides not implemented","description":"ExpandRule.Vars field exists in schema but is ignored during expansion.\n\n## Current State\n\nThe vars map in expand rules should override the expansion formula default variables, but currently the Vars field is not used.\n\n## Implementation\n\n1. In ApplyExpansions, pass rule.Vars to expandStep\n2. Merge vars with formula defaults (rule.Vars wins)\n3. Substitute vars in template placeholders\n4. Add test: expansion with var overrides\n\n## Files\n\n- internal/formula/expand.go: ApplyExpansions, expandStep\n- internal/formula/expand_test.go: add test case","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-25T11:50:17.317847-08:00","updated_at":"2025-12-27T21:29:57.381373-08:00","dependencies":[{"issue_id":"gt-8tmz.34","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-25T11:50:17.318327-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.381373-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.35","title":"Inline step expansion (Step.Expand field)","description":"Step.Expand and Step.ExpandVars fields exist in schema but are not implemented.\n\n## Current State\n\nSteps can declare inline expansion:\n\n steps:\n - id: design\n expand: rule-of-five\n expand_vars:\n iterations: 3\n\nThis is more convenient than compose.expand for single-step expansions.\n\n## Difference from compose.expand\n\n- compose.expand: Centralized, applies after all steps parsed\n- Step.Expand: Inline, step declares its own expansion\n\n## Implementation\n\n1. During cooking, detect steps with Expand field\n2. Load referenced expansion formula\n3. Replace step with expanded template (like compose.expand)\n4. Pass ExpandVars as variable overrides\n5. Handle recursion depth (reuse DefaultMaxExpansionDepth)\n\n## Files\n\n- internal/formula/types.go: Step.Expand, Step.ExpandVars (already defined)\n- internal/formula/expand.go: Add inline expansion handling\n- internal/formula/expand_test.go: Add test cases\n\n## Depends On\n\n- gt-8tmz.34 (var overrides) should be done first","status":"tombstone","priority":3,"issue_type":"feature","created_at":"2025-12-25T11:50:30.099384-08:00","updated_at":"2025-12-27T21:29:57.372695-08:00","dependencies":[{"issue_id":"gt-8tmz.35","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-25T11:50:30.099886-08:00","created_by":"daemon"},{"issue_id":"gt-8tmz.35","depends_on_id":"gt-8tmz.34","type":"blocks","created_at":"2025-12-25T17:16:56.841331-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.372695-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-8tmz.36","title":"Validate expanded step IDs are unique","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-25T11:50:48.436691-08:00","updated_at":"2025-12-27T21:29:57.363966-08:00","dependencies":[{"issue_id":"gt-8tmz.36","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-25T11:50:48.437167-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.363966-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.38","title":"Fanout gate: waits-for children aggregation","description":"Formalize the fanout gate pattern where a step waits for all dynamically-bonded children.\n\n## Current State\n\nFormulas already use this pattern informally:\n\n step: aggregate\n waits_for: \"all-children\"\n\nBut it is not in the algebra spec or implemented in the cooker.\n\n## Proposed Syntax\n\n step: aggregate\n needs: [survey-workers]\n waits-for: children-of(survey-workers)\n\nOr simpler:\n\n step: aggregate\n needs: [survey-workers]\n waits-for: all-children\n\n## Semantics\n\n- Step cannot start until ALL children of the referenced step have completed\n- Children are dynamically bonded via for-each (gt-8tmz.8)\n- This is the \"aggregation\" half of the Christmas Ornament pattern\n- If no children were bonded, the gate passes immediately\n\n## Use Cases\n\n**Witness Patrol:**\n- survey-workers bonds N polecat-arms\n- aggregate waits for all arms to complete\n- Then proceeds to save-state\n\n**Refinery Patrol:**\n- Less common (sequential processing), but useful for parallel test runs\n\n## Relationship to gate\n\nThis is a variant of gate where the condition is \"all children complete\" rather than\nan arbitrary expression. Could be implemented as:\n\n gate:\n condition: \"children-of(survey-workers).all(status == complete)\"\n\nBut dedicated syntax is cleaner for this common pattern.\n\n## Dependencies\n\n- Depends on gt-8tmz.8 (for-each creates the children)\n- Depends on gt-8tmz.4 (gate infrastructure)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T14:19:04.255758-08:00","updated_at":"2025-12-27T21:29:55.198171-08:00","dependencies":[{"issue_id":"gt-8tmz.38","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-25T14:19:04.256284-08:00","created_by":"daemon"},{"issue_id":"gt-8tmz.38","depends_on_id":"gt-8tmz.8","type":"blocks","created_at":"2025-12-25T14:19:11.036957-08:00","created_by":"daemon"},{"issue_id":"gt-8tmz.38","depends_on_id":"gt-8tmz.4","type":"blocks","created_at":"2025-12-25T14:19:11.148842-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.198171-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.4","title":"Control flow: loop, gate, branch","description":"Implement control flow operators for the molecule algebra.\n\n## Core Operators\n\n| Operator | Syntax | Purpose |\n|----------|--------|---------|\n| loop (fixed) | loop: { count: N, body: [steps] } | Fixed iteration |\n| loop (conditional) | loop: { until: COND, max: N, body: [steps] } | Bounded conditional iteration |\n| gate | gate: { before: step, condition: COND } | Wait for condition before proceeding |\n| branch | branch: { from: step, steps: [a,b,c], join: step } | Parallel paths that rejoin |\n\nConditions evaluated mechanically (step status, output fields) per gt-8tmz.7.\n\n## Design Review (2025-12-25)\n\nAnalyzed patrol patterns (witness, deacon, refinery) to verify these constructs\ncan express real Gas Town workflows. Key findings:\n\n### Insight: Patrol Loop is OUTSIDE the Algebra\n\nThe daemon respawns patrols. Each formula describes ONE cycle. This means:\n- Formulas don't need \"infinite loop with burn-and-respawn\"\n- loop is only for **inner loops** (refinery's per-branch processing)\n- \"burn-or-loop\" = complete cycle; daemon decides next action\n- Context cycling is external, not algebraic\n\nThis significantly simplifies the algebra.\n\n### Gap: for-each is Missing\n\nThe current proposal doesn't include for-each, but it's essential for patrol patterns:\n\n| Pattern | Use Case |\n|---------|----------|\n| Witness survey-workers | For each polecat, bond an arm (parallel) |\n| Refinery process-branch | For each branch in queue, run merge pipeline (sequential) |\n| Deacon plugin-run | For each plugin where gate open, execute |\n\nProposed syntax:\n\n for-each:\n collection: \"survey-workers.output.polecats\"\n bond: mol-polecat-arm\n vars:\n polecat_name: \"{item.name}\"\n parallel: true\n\nThis overlaps with gt-8tmz.8 (runtime dynamic expansion). Decision needed:\n- Include for-each here as control flow, OR\n- Ensure gt-8tmz.8 covers these patterns adequately\n\n### Gap: waits-for Gate Variant\n\nThe \"fanout gate\" pattern (aggregate waits for all dynamically-bonded children)\nneeds formalization:\n\n step: aggregate\n waits-for: children-of(survey-workers)\n\nCurrently exists in formulas as waits_for: \"all-children\" but not in the algebra spec.\n\n### What This Covers\n\nWith these operators (plus for-each), patrols become fully expressible:\n\n**Witness Patrol:**\n- for-each polecat -\u003e bond mol-polecat-arm (parallel)\n- gate with waits-for: all-children for aggregation\n\n**Refinery Patrol:**\n- for-each branch -\u003e process pipeline (sequential)\n- gate for test verification before merge\n\n**Deacon Patrol:**\n- for-each plugin -\u003e execute if gate open\n- Simple sequence otherwise\n\n### Scope Decision\n\n**Option A**: Implement loop, gate, branch now. Track for-each and waits-for separately.\n\n**Option B**: Expand scope to include for-each and waits-for formalization.\n\nRecommendation: Option A - implement core constructs, create child issues for gaps.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T18:04:33.194896-08:00","updated_at":"2025-12-27T21:29:55.880458-08:00","dependencies":[{"issue_id":"gt-8tmz.4","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:04:33.196543-08:00","created_by":"daemon"},{"issue_id":"gt-8tmz.4","depends_on_id":"gt-8tmz.7","type":"blocks","created_at":"2025-12-23T18:04:49.985527-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.880458-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.5","title":"Aspects: AOP cross-cutting concerns","description":"Implement aspect-oriented composition:\n- Define aspects with pointcuts and advice\n- Apply aspects at bond time: bd bond mol --with-aspect security\n- Pointcuts use glob patterns to match join points\n\nEnables security-audit, logging, etc. as reusable concerns.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T18:04:34.128562-08:00","updated_at":"2025-12-27T21:29:55.871951-08:00","dependencies":[{"issue_id":"gt-8tmz.5","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:04:34.130564-08:00","created_by":"daemon"},{"issue_id":"gt-8tmz.5","depends_on_id":"gt-8tmz.2","type":"blocks","created_at":"2025-12-23T18:04:49.898125-08:00","created_by":"daemon"},{"issue_id":"gt-8tmz.5","depends_on_id":"gt-8tmz.6","type":"blocks","created_at":"2025-12-23T18:48:18.802195-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.871951-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.6","title":"Selection operators: glob, filter, children","description":"Implement selection operators for targeting:\n- step(id) - specific step\n- glob(pattern) - pattern match (*.implement)\n- filter(predicate) - status/output predicates\n- children(step), descendants(step) - tree traversal\n\nUsed by advice, expansion, and aspects for targeting.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T18:04:35.447482-08:00","updated_at":"2025-12-27T21:29:55.863348-08:00","dependencies":[{"issue_id":"gt-8tmz.6","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:04:35.449213-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.863348-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.7","title":"Condition evaluator for gates and loops","description":"Implement mechanical condition evaluation:\n- step.status == 'complete'\n- step.output.field == value\n- children(step).all(status == 'complete')\n- file.exists(path), env.VAR\n\nKeep decidable: no arbitrary code, bounded evaluation.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T18:04:36.820385-08:00","updated_at":"2025-12-27T21:29:55.854765-08:00","dependencies":[{"issue_id":"gt-8tmz.7","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:04:36.820807-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.854765-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.8","title":"Runtime dynamic expansion (for-each)","description":"Implement runtime expansion for discovered work (the for-each construct).\n\n## Syntax\n\n step: survey-workers\n on-complete:\n for-each: output.polecats\n bond: mol-polecat-arm\n vars:\n polecat_name: \"{item.name}\"\n rig: \"{item.rig}\"\n parallel: true # or sequential: true\n\n## Design Context (from gt-8tmz.4 review)\n\nThis is **control flow**, not just expansion. It affects execution order:\n- for-each with parallel: true = concurrent child execution\n- for-each with sequential: true = ordered child execution\n\nKey patrol patterns that require this:\n\n| Pattern | Use Case |\n|---------|----------|\n| Witness survey-workers | For each polecat, bond an arm (parallel) |\n| Refinery process-branch | For each branch in queue, run merge pipeline (sequential) |\n| Deacon plugin-run | For each plugin where gate open, execute |\n\n## Relationship to gt-8tmz.4\n\ngt-8tmz.4 implements loop, gate, branch (static control flow).\nThis issue implements for-each (dynamic control flow over runtime-discovered collections).\n\nTogether they cover the full control flow needs for patrol formulas.\n\n## Implementation Notes\n\n- Bonds N instances based on step output\n- Christmas Ornament pattern: fanout from single step to N parallel children\n- Pairs with waits-for gate (gt-8tmz.38) for aggregation\n- Collection must be iterable (array in step output)\n- Each iteration gets item, index, and parent context in vars","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T18:04:37.895524-08:00","updated_at":"2025-12-27T21:29:55.846333-08:00","dependencies":[{"issue_id":"gt-8tmz.8","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:04:37.897159-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.846333-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8tmz.9","title":"gt sling --on flag for wisp scaffolding","description":"Add --on flag to gt sling for applying forms to existing work:\n gt sling shiny gastown/Toast --on gt-abc123\n\nWhen --on is specified, implies --wisp (scaffolding existing work).\nThe form shapes execution of the target bead.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T18:04:39.209305-08:00","updated_at":"2025-12-27T21:29:52.903082-08:00","dependencies":[{"issue_id":"gt-8tmz.9","depends_on_id":"gt-8tmz","type":"parent-child","created_at":"2025-12-23T18:04:39.212264-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.903082-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8uqi7","title":"Session ended: gt-gastown-nux","status":"closed","priority":2,"issue_type":"event","created_at":"2026-01-02T20:51:54.028308-08:00","updated_at":"2026-01-03T11:32:45.186108-08:00","closed_at":"2026-01-03T11:32:45.186108-08:00","close_reason":"Session lifecycle events processed","created_by":"gastown/polecats/nux"}
{"id":"gt-8v8","title":"Polecat cleanup should refuse to lose uncommitted work","description":"The system should stubbornly refuse to lose work from a polecat.\n\n## Current Problem\n\n- gt spawn --force bypasses safety checks\n- gt shutdown doesn't check for uncommitted work\n- Witness cleanup doesn't check git status\n\n## Desired Behavior\n\nBefore any polecat cleanup, check:\n1. git status - any uncommitted changes?\n2. git stash list - any stashes?\n3. Unpushed commits on branch?\n4. Unsynced beads changes?\n\nIf ANY of these exist:\n- REFUSE to clean up\n- Print clear error message listing what would be lost\n- Require explicit --nuclear flag to force (not just --force)\n\n## Implementation\n\nAdd to cleanupPolecat() in witness/manager.go:\n```go\nfunc (m *Manager) checkUncommittedWork(polecatName string) error {\n dir := m.polecatDir(polecatName)\n \n // Check git status\n if hasUncommitted, _ := git.HasUncommittedChanges(dir); hasUncommitted {\n return fmt.Errorf(\"polecat %s has uncommitted changes\", polecatName)\n }\n \n // Check stashes\n if stashCount, _ := git.StashCount(dir); stashCount \u003e 0 {\n return fmt.Errorf(\"polecat %s has %d stashes\", polecatName, stashCount)\n }\n \n // Check unpushed commits\n if unpushed, _ := git.UnpushedCommits(dir); unpushed \u003e 0 {\n return fmt.Errorf(\"polecat %s has %d unpushed commits\", polecatName, unpushed)\n }\n \n return nil\n}\n```\n\n## Affected Commands\n\n- gt shutdown\n- gt rig shutdown\n- Witness cleanup\n- gt spawn --force (should warn if overwriting)","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-20T15:23:09.043717-08:00","updated_at":"2025-12-27T21:29:53.71121-08:00","deleted_at":"2025-12-27T21:29:53.71121-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-8w8qy","title":"Digest: mol-deacon-patrol","description":"Patrol 9: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:32:44.627727-08:00","updated_at":"2025-12-27T21:26:00.784916-08:00","deleted_at":"2025-12-27T21:26:00.784916-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8xr1e","title":"Digest: mol-deacon-patrol","description":"Patrol 1: All agents healthy, archived 17 routine handoffs, cleaned 1 orphan process and 1 stale lock","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:50:42.117684-08:00","updated_at":"2025-12-27T21:26:01.795534-08:00","deleted_at":"2025-12-27T21:26:01.795534-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8y70b","title":"Swarm remaining gt-8tmz P3/P4 issues","description":"## Swarm Plan for Molecule Algebra Completion\n\nEncapsulates the remaining swarmable work from gt-8tmz (Molecule Algebra epic).\n\n### Swarmable Issues (6 total)\n\n**Batch 1 - Parallel (3 polecats):**\n- gt-8tmz.10: Rename Engineer in Box to Shiny (simple rename)\n- gt-8tmz.36: Validate expanded step IDs are unique (validation)\n- gt-8tmz.31: Formula validation specification (docs)\n\n**Batch 2 - Sequential (1 polecat):**\n- gt-8tmz.34: Expansion var overrides → gt-8tmz.35: Inline step expansion\n (dependency chain - .35 depends on .34)\n\n**Batch 3 - Solo (1 polecat):**\n- gt-8tmz.30: Proto debugging and inspection tools (add --graph)\n\n### Hold for Human (7 issues - need design decisions)\n- gt-8tmz.19: Meta-formulas (schema design)\n- gt-8tmz.20: Org inheritance (policy structure)\n- gt-8tmz.21: Conditional aspects (when clause syntax)\n- gt-8tmz.22: Review dimensions as aspects (refactor)\n- gt-8tmz.28: Error handling policy\n- gt-8tmz.29: Versioning strategy\n- gt-8tmz.37: Nested expansion recursion\n\n### Code Location\nAll work is in beads repo: /Users/stevey/gt/beads/crew/emma\n- internal/formula/*.go\n- cmd/bd/cook.go\n\n### Execution\nHand to Witness for polecat dispatch, or execute manually with polecats.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T18:54:03.709983-08:00","updated_at":"2025-12-27T21:29:55.139369-08:00","dependencies":[{"issue_id":"gt-8y70b","depends_on_id":"gt-8tmz","type":"blocks","created_at":"2025-12-25T18:54:09.318078-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.139369-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8yawq","title":"Digest: mol-deacon-patrol","description":"Patrol 19","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T20:06:36.426614-08:00","updated_at":"2025-12-27T21:26:00.635465-08:00","deleted_at":"2025-12-27T21:26:00.635465-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8yh9u","title":"Merge: capable-mjtltnm5","description":"branch: polecat/capable-mjtltnm5\ntarget: main\nsource_issue: capable-mjtltnm5\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T22:48:12.614416-08:00","updated_at":"2025-12-30T23:12:31.061416-08:00","closed_at":"2025-12-30T23:12:31.061416-08:00","close_reason":"Branch already merged","created_by":"gastown/polecats/capable"}
{"id":"gt-8ylw","title":"Digest: mol-deacon-patrol","description":"Patrol #13: Stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:33:56.275057-08:00","updated_at":"2025-12-27T21:26:04.293388-08:00","deleted_at":"2025-12-27T21:26:04.293388-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-8zios","title":"Digest: mol-deacon-patrol","description":"P18","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:27:14.951431-08:00","updated_at":"2025-12-27T21:26:01.593444-08:00","deleted_at":"2025-12-27T21:26:01.593444-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-92fc","title":"Digest: mol-deacon-patrol","description":"Patrol 4: Nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:47:30.48971-08:00","updated_at":"2025-12-27T21:26:04.20215-08:00","deleted_at":"2025-12-27T21:26:04.20215-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-92l","title":"Daemon: integration test with real lifecycle","description":"Need an end-to-end test that:\n1. Starts daemon\n2. Starts a test agent session\n3. Sends lifecycle request to daemon\n4. Verifies session was killed and restarted\n5. Cleans up\n\nCould use a mock 'agent' that's just a shell script.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-18T13:38:17.261096-08:00","updated_at":"2025-12-27T21:29:57.176512-08:00","dependencies":[{"issue_id":"gt-92l","depends_on_id":"gt-99m","type":"blocks","created_at":"2025-12-18T13:38:26.962642-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.176512-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-92of","title":"Consider splitting large files (800+ lines)","description":"Several files are getting large and may benefit from splitting:\n- internal/beads/beads.go (990 lines)\n- internal/cmd/molecule.go (981 lines)\n- internal/refinery/manager.go (934 lines)\n- internal/beads/beads_test.go (883 lines)\n- internal/cmd/polecat.go (836 lines)\n- internal/witness/manager.go (808 lines)\n- internal/cmd/mail.go (804 lines)\n\nFor .go files, consider extracting logical subsystems. For test files, this is lower priority.","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-21T21:35:09.138406-08:00","updated_at":"2025-12-27T21:29:57.899652-08:00","deleted_at":"2025-12-27T21:29:57.899652-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-93ha","title":"Digest: mol-deacon-patrol","description":"Patrol: slit handoff (landing protocol), polecats cleaned (0 now), 4 witness/refineries up","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-23T01:37:17.776898-08:00","updated_at":"2025-12-27T21:26:05.389038-08:00","deleted_at":"2025-12-27T21:26:05.389038-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-95x","title":"Remove stale migration docs from gastown-py","description":"The gastown-py repo has migration-related documentation that is now misinformation since we have made design decisions. Remove or clearly mark as obsolete: any docs about migration paths, old architecture assumptions, or superseded designs.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T20:24:08.642373-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-96jv2","title":"Digest: mol-deacon-patrol","description":"Patrol 16: All agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T04:26:46.271877-08:00","updated_at":"2025-12-27T21:26:03.727133-08:00","deleted_at":"2025-12-27T21:26:03.727133-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-974","title":"Refinery background daemon mode","description":"The refinery 'gt refinery start' command only works in foreground mode (--foreground). Need to implement background daemon mode for production use.\n\nOptions:\n1. Use a separate tmux session for the refinery\n2. Implement proper daemonization\n3. Use Claude Code session for the refinery agent\n\nFor MVP, option 1 (tmux session) is probably simplest.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T22:08:04.799753-08:00","updated_at":"2025-12-27T21:29:54.436021-08:00","deleted_at":"2025-12-27T21:29:54.436021-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-975","title":"Molecule execution support for polecats and crew","description":"Enable agents to run bonded molecules through the full lifecycle.\n\n## The Model\n1. BOND - bd mol bond \u003ctemplate\u003e --assignee \u003cidentity\u003e\n Creates concrete issues, assigns root to agent\n\n2. DISCOVER - Agent finds assigned molecules via bd ready\n DAG structure shows unblocked steps\n\n3. WORK - Agent works through DAG, closing steps as done\n Can delegate children to lower-tier agents (haiku)\n\n4. SURVIVE - Agent dies → beads persist\n Any agent resumes from DAG state\n\n5. SUPERVISE - Witness monitors for stalled molecules\n Nudges owner or reassigns if dead\n\n## Questions to Resolve\n- Seed node terminology: seed vs root vs pole vs nucleus\n- Assignee inheritance: single owner vs delegation model\n- Stall detection: heartbeat vs activity timeout vs session monitoring\n\n## Implementation\n- Polecat startup: check for assigned molecules\n- Polecat work loop: follow molecule DAG\n- Witness: monitor molecule progress, detect stalls\n- Mayor: can assign molecules, handle escalations\n\n## Depends On\n- beads: bd mol bond command (bd-usro in beads repo)","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-20T16:57:01.09104-08:00","updated_at":"2025-12-27T21:29:53.677323-08:00","deleted_at":"2025-12-27T21:29:53.677323-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-977","title":"Work request: gt-976 crew lifecycle support","description":"Hey Max, filed gt-976 for you - adding crew lifecycle support to Deacon.\n\nCurrently crew is 'human-managed' and can't request automated refresh. Would be useful for molecules that need fresh sessions mid-workflow.\n\nKey changes:\n- getManager(RoleCrew) → return deacon/ instead of human\n- Teach Deacon crew session patterns\n- Test with gt nudge\n\nPriority 2, no rush but would unblock molecule automation for crew.\n\n- Dave (beads/crew/dave)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T17:19:55.964718-08:00","updated_at":"2025-12-25T14:12:42.304335-08:00","deleted_at":"2025-12-25T14:12:42.304335-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"message"}
{"id":"gt-98iek","title":"Digest: mol-deacon-patrol","description":"P7: stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:11:26.744324-08:00","updated_at":"2025-12-27T21:26:02.286229-08:00","deleted_at":"2025-12-27T21:26:02.286229-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-98oo","title":"Merge: gt-y5o","description":"branch: polecat/Doof\ntarget: main\nsource_issue: gt-y5o\nrig: gastown","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-19T16:29:03.837448-08:00","updated_at":"2025-12-27T21:27:22.955415-08:00","deleted_at":"2025-12-27T21:27:22.955415-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-98y0y","title":"Digest: mol-deacon-patrol","description":"Patrol 8: Nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:49:30.91936-08:00","updated_at":"2025-12-27T21:26:04.168311-08:00","deleted_at":"2025-12-27T21:26:04.168311-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-99a","title":"Add unit tests for daemon package","description":"The daemon package has no unit tests. Need tests for:\n- Config and state serialization\n- Session name pattern matching (isWitnessSession)\n- Lifecycle request parsing\n- Identity to session mapping","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-18T13:38:10.458609-08:00","updated_at":"2025-12-27T21:29:57.205897-08:00","dependencies":[{"issue_id":"gt-99a","depends_on_id":"gt-99m","type":"blocks","created_at":"2025-12-18T13:38:26.466501-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.205897-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-99m","title":"gt daemon: Background service management","description":"## Summary\n\nONE daemon for all Gas Town - a simple Go process (not a Claude agent) that:\n1. Pokes agents periodically (heartbeat)\n2. Processes lifecycle requests\n3. Restarts sessions when agents request cycling\n\n## Architecture\n\nDaemon (gt daemon)\n- Pokes Mayor: \"HEARTBEAT: check your rigs\"\n- Pokes each Witness: \"HEARTBEAT: check your workers\"\n- Has inbox at daemon/ for lifecycle requests\n- Restarts sessions on cycle requests\n\n## NOT the daemon's job:\n- Making decisions (Witnesses do that)\n- Running Claude (it's a Go process)\n- Processing work (polecats do that)\n- Direct polecat management (Witnesses do that)\n\nThe daemon is a **dumb scheduler** - poke things, execute lifecycle requests. All intelligence is in agents.\n\n## Commands\n\n- gt daemon start: Start daemon (background)\n- gt daemon stop: Stop daemon\n- gt daemon status: Show daemon status\n- gt daemon logs: View daemon logs\n\n## Daemon Loop\n\nEvery heartbeat interval:\n1. Poke Mayor\n2. For each rig: Poke Witness\n3. Process lifecycle requests from daemon/ inbox\n4. Check for dead sessions, restart if cycle requested\n\n## Lifecycle Request Handling\n\nDaemon checks its inbox (daemon/) for lifecycle requests:\n- From Mayor: cycle/restart Mayor session\n- From Witnesses: cycle/restart Witness session\n\nOn request:\n1. Verify agent state shows requesting_cycle\n2. Kill session\n3. Start new session\n4. Clear requesting_cycle flag\n\n## Poke Protocol\n\nPoke = tmux inject \"HEARTBEAT: do your job\"\n- Agent ignores if already working\n- Agent wakes up if idle\n- Idempotent - multiple pokes are fine\n\n## Lifecycle Hierarchy\n\n- Daemon manages: Mayor, all Witnesses\n- Witness manages: Polecats, Refinery (per rig)\n- Crew: self-managed (human workspace)\n\n## Related Issues\n\n- gt-kmn.11: Daemon heartbeat details\n- gt-gby: gt handoff command (unified lifecycle)\n- gt-u1j.9: Fold witness daemon into this","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T21:50:09.763719-08:00","updated_at":"2025-12-27T21:29:54.228427-08:00","dependencies":[{"issue_id":"gt-99m","depends_on_id":"gt-hw6","type":"blocks","created_at":"2025-12-17T22:23:43.253877-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.228427-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9a2","title":"Federation: Wasteland architecture for cross-town coordination","description":"## Overview\n\nFederation enables Gas Town to scale across machines via **Outposts** - remote compute environments that can run workers.\n\n**Design doc**: docs/federation-design.md\n\n## Outpost Types\n\n| Type | Description | Cost Model |\n|------|-------------|------------|\n| Local | Current model (tmux panes) | Free |\n| SSH/VM | Full Gas Town clone on VM | Always-on |\n| CloudRun | Container workers on GCP | Pay-per-use |\n\n## Key Concepts\n\n### Outpost Abstraction\n```go\ntype Outpost interface {\n Name() string\n Type() OutpostType // local, ssh, cloudrun\n MaxWorkers() int\n Spawn(issue string, config WorkerConfig) (Worker, error)\n Workers() []Worker\n Ping() error\n}\n```\n\n### Cloud Run Workers\n- Persistent HTTP/2 connections solve zero-to-one cold start\n- Pay only when working (~$0.017 per 5-min session)\n- Scale 0→N automatically\n- Git clone via persistent volumes\n\n### SSH/VM Outposts \n- Full Gas Town clone on remote machine\n- SSH for commands, git for sync\n- Good for long-running autonomous work\n\n## Design Principles\n\n1. **Outpost abstraction** - support multiple backends\n2. **Local-first** - remote is for overflow/burst\n3. **Git as source of truth** - code and beads sync everywhere\n4. **HTTP for Cloud Run** - dont force mail onto containers\n5. **Graceful degradation** - works with any subset of outposts\n\n## Related\n\n- gt-f9x.7-10: Connection interface (lower-level abstraction)\n- docs/federation-design.md: Full architectural analysis","status":"tombstone","priority":3,"issue_type":"epic","created_at":"2025-12-15T19:21:32.462063-08:00","updated_at":"2025-12-27T21:29:57.767444-08:00","deleted_at":"2025-12-27T21:29:57.767444-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-9a2.1","title":"Outpost/Worker interfaces: Core abstractions for remote compute","description":"## Overview\n\nDefine the core interfaces that all outpost types implement.\n\n## Interfaces\n\n```go\ntype OutpostType string\nconst (\n OutpostLocal OutpostType = \"local\"\n OutpostSSH OutpostType = \"ssh\"\n OutpostCloudRun OutpostType = \"cloudrun\"\n)\n\ntype Outpost interface {\n Name() string\n Type() OutpostType\n MaxWorkers() int\n ActiveWorkers() int\n Spawn(issue string, config WorkerConfig) (Worker, error)\n Workers() []Worker\n Ping() error\n SendMail(worker string, msg Message) error // optional\n}\n\ntype Worker interface {\n ID() string\n Outpost() string\n Status() WorkerStatus // idle, working, done, failed\n Issue() string\n Attach() error // for interactive outposts\n Logs() (io.Reader, error)\n Stop() error\n}\n\ntype WorkerConfig struct {\n RigPath string\n BeadsDir string\n GitBranch string\n Context map[string]string // hints for worker\n}\n```\n\n## Files\n\n- `internal/outpost/outpost.go` - Outpost interface\n- `internal/outpost/worker.go` - Worker interface\n- `internal/outpost/config.go` - WorkerConfig, OutpostType\n\n## Notes\n\nThis is the foundation for all federation work. Keep interfaces minimal - we can extend later.","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-16T18:01:38.268086-08:00","updated_at":"2025-12-27T21:29:57.750925-08:00","dependencies":[{"issue_id":"gt-9a2.1","depends_on_id":"gt-9a2","type":"parent-child","created_at":"2025-12-16T18:01:38.27131-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.750925-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9a2.10","title":"VM outpost setup: Terraform and documentation","description":"## Overview\n\nDocumentation and optional Terraform for setting up VM outposts.\n\n## Manual Setup Guide\n\n```markdown\n# Setting Up a VM Outpost\n\n## Prerequisites\n- GCE VM (or any Linux machine with SSH access)\n- SSH key pair\n- Git configured on VM\n\n## Steps\n\n1. Install Claude Code on VM:\n ```bash\n ssh user@vm\n npm install -g @anthropic-ai/claude-code\n ```\n\n2. Clone Gas Town:\n ```bash\n mkdir -p ~/ai\n cd ~/ai\n gt install .\n ```\n\n3. Configure Git credentials:\n ```bash\n git config --global user.name \"Your Name\"\n git config --global credential.helper store\n ```\n\n4. Add outpost to local config:\n ```bash\n gt outpost add ssh \\\n --name gce-burst \\\n --host 10.0.0.5 \\\n --user steve \\\n --key ~/.ssh/gce_worker \\\n --town-path /home/steve/ai \\\n --max-workers 8\n ```\n\n5. Test connectivity:\n ```bash\n gt outpost ping gce-burst\n ```\n```\n\n## Terraform Module (Optional)\n\n```hcl\n# deploy/terraform/vm-outpost/main.tf\n\nvariable \"project\" {}\nvariable \"zone\" { default = \"us-central1-a\" }\nvariable \"machine_type\" { default = \"e2-standard-4\" }\n\nresource \"google_compute_instance\" \"outpost\" {\n name = \"gastown-outpost\"\n machine_type = var.machine_type\n zone = var.zone\n\n boot_disk {\n initialize_params {\n image = \"ubuntu-2204-lts\"\n size = 100\n }\n }\n\n network_interface {\n network = \"default\"\n access_config {}\n }\n\n metadata_startup_script = file(\"${path.module}/startup.sh\")\n}\n\noutput \"external_ip\" {\n value = google_compute_instance.outpost.network_interface[0].access_config[0].nat_ip\n}\n```\n\n## Files\n\n- `docs/vm-outpost-setup.md`\n- `deploy/terraform/vm-outpost/` (optional)\n\nDepends on: gt-9a2.5 (SSHOutpost)","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-16T18:03:34.782908-08:00","updated_at":"2025-12-27T21:29:57.676099-08:00","dependencies":[{"issue_id":"gt-9a2.10","depends_on_id":"gt-9a2","type":"parent-child","created_at":"2025-12-16T18:03:34.78496-08:00","created_by":"daemon"},{"issue_id":"gt-9a2.10","depends_on_id":"gt-9a2.5","type":"blocks","created_at":"2025-12-16T18:03:46.408609-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.676099-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9a2.11","title":"HTTP work server: gt worker serve command","description":"## Overview\n\nHTTP server that runs inside Cloud Run containers, accepting work requests.\n\n## Command\n\n```bash\ngt worker serve --port 8080\n```\n\n## Implementation\n\n```go\ntype WorkServer struct {\n port int\n handler *WorkHandler\n}\n\nfunc (s *WorkServer) ServeHTTP(w http.ResponseWriter, r *http.Request) {\n switch r.URL.Path {\n case \"/work\":\n s.handleWork(w, r)\n case \"/health\":\n s.handleHealth(w, r)\n default:\n http.NotFound(w, r)\n }\n}\n\nfunc (s *WorkServer) handleWork(w http.ResponseWriter, r *http.Request) {\n // 1. Parse WorkRequest\n // 2. Clone/pull repo\n // 3. Start Claude on issue\n // 4. Stream WorkEvents as NDJSON\n // 5. Return WorkResult\n}\n```\n\n## Streaming Response\n\n```go\nfunc (s *WorkServer) streamEvents(w http.ResponseWriter, events \u003c-chan WorkEvent) {\n flusher, _ := w.(http.Flusher)\n encoder := json.NewEncoder(w)\n for event := range events {\n encoder.Encode(event)\n flusher.Flush()\n }\n}\n```\n\n## Files\n\n- `internal/cloudrun/server.go`\n- `cmd/gt/worker_serve.go` - Cobra command\n\n## Dependencies\n\nDepends on: gt-9a2.7 (protocol types)","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-16T18:15:00.244823-08:00","updated_at":"2025-12-27T21:29:57.667856-08:00","dependencies":[{"issue_id":"gt-9a2.11","depends_on_id":"gt-9a2","type":"parent-child","created_at":"2025-12-16T18:15:00.246808-08:00","created_by":"daemon"},{"issue_id":"gt-9a2.11","depends_on_id":"gt-9a2.7","type":"blocks","created_at":"2025-12-16T18:15:10.751351-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.667856-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9a2.12","title":"HTTP work client: CloudRun dispatch client","description":"## Overview\n\nHTTP client for dispatching work to Cloud Run and streaming results back.\n\n## Implementation\n\n```go\ntype WorkClient struct {\n serviceURL string\n httpClient *http.Client // HTTP/2 enabled\n}\n\nfunc NewWorkClient(serviceURL string) *WorkClient {\n return \u0026WorkClient{\n serviceURL: serviceURL,\n httpClient: \u0026http.Client{\n Transport: \u0026http2.Transport{},\n Timeout: 0, // No timeout for streaming\n },\n }\n}\n\nfunc (c *WorkClient) DispatchWork(ctx context.Context, req WorkRequest) (\u003c-chan WorkEvent, error) {\n // 1. POST to /work\n // 2. Return channel that streams WorkEvents\n // 3. Close channel when done/error\n}\n```\n\n## Streaming Reader\n\n```go\nfunc (c *WorkClient) streamEvents(body io.Reader, events chan\u003c- WorkEvent) {\n defer close(events)\n decoder := json.NewDecoder(body)\n for {\n var event WorkEvent\n if err := decoder.Decode(\u0026event); err != nil {\n return\n }\n events \u003c- event\n }\n}\n```\n\n## Files\n\n- `internal/cloudrun/client.go`\n\n## Dependencies\n\nDepends on: gt-9a2.7 (protocol types)\nUsed by: gt-9a2.8 (CloudRunOutpost)","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-16T18:15:10.881936-08:00","updated_at":"2025-12-27T21:29:57.659373-08:00","dependencies":[{"issue_id":"gt-9a2.12","depends_on_id":"gt-9a2","type":"parent-child","created_at":"2025-12-16T18:15:10.882339-08:00","created_by":"daemon"},{"issue_id":"gt-9a2.12","depends_on_id":"gt-9a2.7","type":"blocks","created_at":"2025-12-16T18:15:20.974167-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.659373-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9a2.13","title":"CloudRun persistent connections: HTTP/2 keepalive","description":"## Overview\n\nImplement persistent HTTP/2 connections to keep Cloud Run containers warm.\n\n## Problem\n\nCloud Run has cold start latency when scaling from 0. Persistent connections keep containers warm.\n\n## Implementation\n\n```go\ntype PersistentConnectionManager struct {\n serviceURL string\n conn net.Conn\n client *http2.ClientConn\n mu sync.Mutex\n lastUsed time.Time\n keepAlive time.Duration\n}\n\nfunc (m *PersistentConnectionManager) GetConnection() (*http2.ClientConn, error) {\n m.mu.Lock()\n defer m.mu.Unlock()\n \n // Reuse existing connection if healthy\n if m.client != nil \u0026\u0026 m.client.CanTakeNewRequest() {\n m.lastUsed = time.Now()\n return m.client, nil\n }\n \n // Create new connection\n return m.dial()\n}\n\nfunc (m *PersistentConnectionManager) keepAliveLoop() {\n ticker := time.NewTicker(m.keepAlive / 2)\n for range ticker.C {\n m.mu.Lock()\n if time.Since(m.lastUsed) \u003e m.keepAlive {\n // Connection idle too long, close it\n m.close()\n } else {\n // Send ping to keep alive\n m.ping()\n }\n m.mu.Unlock()\n }\n}\n```\n\n## Integration with CloudRunOutpost\n\n```go\ntype CloudRunOutpost struct {\n // ...existing fields...\n connMgr *PersistentConnectionManager\n}\n\nfunc (o *CloudRunOutpost) Spawn(...) {\n conn, err := o.connMgr.GetConnection()\n // Use connection for work dispatch\n}\n```\n\n## Config\n\n```yaml\noutposts:\n - name: cloudrun-burst\n type: cloudrun\n # ...\n keep_alive: 5m # Keep connection warm for 5 minutes\n```\n\n## Files\n\n- `internal/cloudrun/connection.go`\n\n## Dependencies\n\nDepends on: gt-9a2.8 (basic CloudRunOutpost)","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-16T18:15:57.082691-08:00","updated_at":"2025-12-27T21:29:57.650992-08:00","dependencies":[{"issue_id":"gt-9a2.13","depends_on_id":"gt-9a2","type":"parent-child","created_at":"2025-12-16T18:15:57.084539-08:00","created_by":"daemon"},{"issue_id":"gt-9a2.13","depends_on_id":"gt-9a2.8","type":"blocks","created_at":"2025-12-16T18:16:12.190542-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.650992-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9a2.14","title":"CloudRun cost tracking: Usage monitoring and limits","description":"## Overview\n\nTrack Cloud Run usage costs and enforce spending limits.\n\n## Cost Model\n\nCloud Run pricing (approximate):\n- CPU: ~$0.000024/vCPU-second\n- Memory: ~$0.0000025/GiB-second\n- Requests: ~$0.40/million\n\nFor 5-minute worker (2 vCPU, 4GB):\n- CPU: 300 × 2 × $0.000024 = $0.0144\n- Memory: 300 × 4 × $0.0000025 = $0.003\n- **Total: ~$0.017 per session**\n\n## Implementation\n\n```go\ntype CostTracker struct {\n cpuRate float64 // per vCPU-second\n memRate float64 // per GiB-second\n sessions []SessionCost\n mu sync.RWMutex\n}\n\ntype SessionCost struct {\n WorkerID string\n StartTime time.Time\n EndTime time.Time\n CPUs float64\n MemoryGiB float64\n}\n\nfunc (t *CostTracker) RecordSession(workerID string, start, end time.Time, cpus, mem float64) {\n t.mu.Lock()\n defer t.mu.Unlock()\n t.sessions = append(t.sessions, SessionCost{...})\n}\n\nfunc (t *CostTracker) CurrentCost() float64 {\n t.mu.RLock()\n defer t.mu.RUnlock()\n var total float64\n for _, s := range t.sessions {\n duration := s.EndTime.Sub(s.StartTime).Seconds()\n total += duration * s.CPUs * t.cpuRate\n total += duration * s.MemoryGiB * t.memRate\n }\n return total\n}\n```\n\n## Cost Cap Enforcement\n\n```go\nfunc (o *CloudRunOutpost) Spawn(...) (Worker, error) {\n if o.costCap \u003e 0 \u0026\u0026 o.costTracker.CurrentCost() \u003e= o.costCap {\n return nil, ErrCostCapExceeded\n }\n // ... spawn worker\n}\n```\n\n## Config\n\n```yaml\noutposts:\n - name: cloudrun-burst\n type: cloudrun\n cost_cap_hourly: 5.00 # Stop spawning if hourly cost exceeds $5\n cost_cap_daily: 50.00 # Daily limit\n```\n\n## CLI\n\n```bash\ngt outpost status cloudrun-burst\n# Shows: Cost (today): $0.42 / $50.00 cap\n```\n\n## Files\n\n- `internal/cloudrun/cost.go`\n\n## Dependencies\n\nDepends on: gt-9a2.8 (basic CloudRunOutpost)","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-16T18:16:14.37154-08:00","updated_at":"2025-12-27T21:29:57.642069-08:00","dependencies":[{"issue_id":"gt-9a2.14","depends_on_id":"gt-9a2","type":"parent-child","created_at":"2025-12-16T18:16:14.373486-08:00","created_by":"daemon"},{"issue_id":"gt-9a2.14","depends_on_id":"gt-9a2.8","type":"blocks","created_at":"2025-12-16T18:16:31.692724-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.642069-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9a2.15","title":"Outpost error handling: Retry logic and resilience","description":"## Overview\n\nError handling and retry logic for outpost operations.\n\n## Error Types\n\n```go\nvar (\n ErrOutpostUnreachable = errors.New(\"outpost unreachable\")\n ErrWorkerSpawnFailed = errors.New(\"worker spawn failed\")\n ErrWorkerTimeout = errors.New(\"worker timed out\")\n ErrCostCapExceeded = errors.New(\"cost cap exceeded\")\n ErrAllOutpostsBusy = errors.New(\"all outposts at capacity\")\n)\n```\n\n## Retry Policy\n\n```go\ntype RetryPolicy struct {\n MaxAttempts int\n InitialBackoff time.Duration\n MaxBackoff time.Duration\n BackoffFactor float64\n RetryableErrors []error\n}\n\nfunc DefaultRetryPolicy() *RetryPolicy {\n return \u0026RetryPolicy{\n MaxAttempts: 3,\n InitialBackoff: 1 * time.Second,\n MaxBackoff: 30 * time.Second,\n BackoffFactor: 2.0,\n RetryableErrors: []error{\n ErrOutpostUnreachable,\n ErrWorkerSpawnFailed,\n },\n }\n}\n```\n\n## Retry Implementation\n\n```go\nfunc (m *OutpostManager) SpawnWithRetry(issue string, cfg WorkerConfig) (Worker, error) {\n var lastErr error\n backoff := m.retryPolicy.InitialBackoff\n \n for attempt := 0; attempt \u003c m.retryPolicy.MaxAttempts; attempt++ {\n outpost := m.policy.SelectOutpost(issue, m.outposts)\n if outpost == nil {\n return nil, ErrAllOutpostsBusy\n }\n \n worker, err := outpost.Spawn(issue, cfg)\n if err == nil {\n return worker, nil\n }\n \n if !m.isRetryable(err) {\n return nil, err\n }\n \n lastErr = err\n time.Sleep(backoff)\n backoff = min(backoff * m.retryPolicy.BackoffFactor, m.retryPolicy.MaxBackoff)\n }\n \n return nil, fmt.Errorf(\"spawn failed after %d attempts: %w\", \n m.retryPolicy.MaxAttempts, lastErr)\n}\n```\n\n## Outpost Health Tracking\n\n```go\ntype OutpostHealth struct {\n LastPing time.Time\n LastError error\n FailureCount int\n CircuitBreaker bool // If true, skip this outpost\n}\n\nfunc (m *OutpostManager) updateHealth(name string, err error) {\n // Track failures, open circuit breaker after N failures\n // Auto-reset after successful ping\n}\n```\n\n## Config\n\n```yaml\npolicy:\n retry:\n max_attempts: 3\n initial_backoff: 1s\n max_backoff: 30s\n \n circuit_breaker:\n failure_threshold: 5\n reset_timeout: 1m\n```\n\n## Files\n\n- `internal/outpost/retry.go`\n- `internal/outpost/health.go`\n\n## Dependencies\n\nDepends on: gt-9a2.3 (OutpostManager)\nCan be done after basic outpost implementations work.","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-16T18:16:33.386008-08:00","updated_at":"2025-12-27T21:29:57.630795-08:00","dependencies":[{"issue_id":"gt-9a2.15","depends_on_id":"gt-9a2","type":"parent-child","created_at":"2025-12-16T18:16:33.387765-08:00","created_by":"daemon"},{"issue_id":"gt-9a2.15","depends_on_id":"gt-9a2.3","type":"blocks","created_at":"2025-12-16T18:16:46.064482-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.630795-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9a2.16","title":"Outpost integration tests: Mock and E2E testing","description":"## Overview\n\nIntegration tests for the outpost system.\n\n## Mock Outpost\n\n```go\ntype MockOutpost struct {\n name string\n maxWorkers int\n spawnDelay time.Duration\n spawnError error\n workers []*MockWorker\n}\n\nfunc (o *MockOutpost) Spawn(issue string, cfg WorkerConfig) (Worker, error) {\n if o.spawnError != nil {\n return nil, o.spawnError\n }\n time.Sleep(o.spawnDelay)\n worker := \u0026MockWorker{id: uuid.New().String(), issue: issue}\n o.workers = append(o.workers, worker)\n return worker, nil\n}\n```\n\n## Unit Tests\n\n```go\nfunc TestOutpostManager_SelectOutpost(t *testing.T) {\n // Test policy-based selection\n}\n\nfunc TestOutpostManager_SpawnWithRetry(t *testing.T) {\n // Test retry logic with transient failures\n}\n\nfunc TestCloudRunOutpost_CostTracking(t *testing.T) {\n // Test cost calculation and caps\n}\n```\n\n## Integration Tests (with mocked Cloud Run)\n\n```go\nfunc TestCloudRunOutpost_Integration(t *testing.T) {\n // Start mock HTTP server\n server := httptest.NewServer(mockWorkHandler())\n defer server.Close()\n \n outpost := NewCloudRunOutpost(OutpostConfig{\n ServiceURL: server.URL,\n })\n \n worker, err := outpost.Spawn(\"test-issue\", WorkerConfig{})\n // Assert worker created, events streamed\n}\n```\n\n## E2E Tests (requires real Cloud Run)\n\n```go\n// +build e2e\n\nfunc TestCloudRunOutpost_E2E(t *testing.T) {\n if os.Getenv(\"CLOUDRUN_SERVICE_URL\") == \"\" {\n t.Skip(\"CLOUDRUN_SERVICE_URL not set\")\n }\n // Test against real Cloud Run service\n}\n```\n\n## Files\n\n- `internal/outpost/mock_test.go`\n- `internal/outpost/manager_test.go`\n- `internal/outpost/cloudrun_test.go`\n- `internal/cloudrun/integration_test.go`\n\n## Dependencies\n\nDepends on: gt-9a2.8 (CloudRunOutpost), gt-9a2.15 (error handling)\nLower priority - can be done after implementations work.","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-16T18:16:48.019386-08:00","updated_at":"2025-12-27T21:29:57.924961-08:00","dependencies":[{"issue_id":"gt-9a2.16","depends_on_id":"gt-9a2","type":"parent-child","created_at":"2025-12-16T18:16:48.021933-08:00","created_by":"daemon"},{"issue_id":"gt-9a2.16","depends_on_id":"gt-9a2.8","type":"blocks","created_at":"2025-12-16T18:16:55.90113-08:00","created_by":"daemon"},{"issue_id":"gt-9a2.16","depends_on_id":"gt-9a2.15","type":"blocks","created_at":"2025-12-16T18:16:56.013753-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.924961-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9a2.2","title":"LocalOutpost: Refactor current polecat spawning","description":"## Overview\n\nRefactor current local polecat spawning to implement the Outpost interface.\n\n## Current State\n\nPolecat spawning is ad-hoc. This task wraps it in LocalOutpost.\n\n## Implementation\n\n```go\ntype LocalOutpost struct {\n name string\n maxWorkers int\n tmux *Tmux\n workers map[string]*LocalWorker\n}\n\nfunc NewLocalOutpost(config OutpostConfig) *LocalOutpost\n\nfunc (o *LocalOutpost) Type() OutpostType { return OutpostLocal }\n\nfunc (o *LocalOutpost) Spawn(issue string, cfg WorkerConfig) (Worker, error) {\n // Create tmux session\n // Start claude in pane\n // Return LocalWorker\n}\n```\n\n## LocalWorker\n\n```go\ntype LocalWorker struct {\n id string\n session string // tmux session name\n issue string\n status WorkerStatus\n}\n\nfunc (w *LocalWorker) Attach() error {\n // tmux attach-session\n}\n```\n\n## Notes\n\n- This should be a refactor, not new functionality\n- Existing polecat code becomes LocalOutpost internals\n- Tests should pass before and after\n\nDepends on: gt-9a2.1 (interfaces)","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-16T18:01:49.551224-08:00","updated_at":"2025-12-27T21:29:57.742434-08:00","dependencies":[{"issue_id":"gt-9a2.2","depends_on_id":"gt-9a2","type":"parent-child","created_at":"2025-12-16T18:01:49.553291-08:00","created_by":"daemon"},{"issue_id":"gt-9a2.2","depends_on_id":"gt-9a2.1","type":"blocks","created_at":"2025-12-16T18:03:45.338172-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.742434-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9a2.3","title":"Outpost configuration: YAML config and OutpostManager","description":"## Overview\n\nConfiguration file for outposts and manager to load/track them.\n\n## Config File\n\n`~/ai/config/outposts.yaml`:\n\n```yaml\noutposts:\n - name: local\n type: local\n max_workers: 4\n\n - name: gce-burst\n type: ssh\n host: 10.0.0.5\n user: steve\n ssh_key: ~/.ssh/gce_worker\n town_path: /home/steve/ai\n max_workers: 8\n\n - name: cloudrun-burst\n type: cloudrun\n project: my-gcp-project\n region: us-central1\n service: gastown-worker\n max_workers: 20\n cost_cap_hourly: 5.00\n\npolicy:\n default_preference: [local, gce-burst, cloudrun-burst]\n overrides:\n - condition: \"priority \u003e= P3\"\n prefer: cloudrun-burst\n```\n\n## OutpostManager\n\n```go\ntype OutpostManager struct {\n outposts map[string]Outpost\n policy AssignmentPolicy\n}\n\nfunc NewOutpostManager(configPath string) (*OutpostManager, error)\nfunc (m *OutpostManager) Get(name string) (Outpost, bool)\nfunc (m *OutpostManager) List() []Outpost\nfunc (m *OutpostManager) SelectOutpost(issue Issue) Outpost\n```\n\n## Files\n\n- `internal/outpost/manager.go`\n- `internal/outpost/config.go` (extend)\n- `internal/outpost/policy.go`\n\nDepends on: gt-9a2.1 (interfaces)","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-16T18:02:01.595784-08:00","updated_at":"2025-12-27T21:29:57.734203-08:00","dependencies":[{"issue_id":"gt-9a2.3","depends_on_id":"gt-9a2","type":"parent-child","created_at":"2025-12-16T18:02:01.598029-08:00","created_by":"daemon"},{"issue_id":"gt-9a2.3","depends_on_id":"gt-9a2.1","type":"blocks","created_at":"2025-12-16T18:03:45.448178-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.734203-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9a2.4","title":"Outpost CLI commands: gt outpost list/status/add","description":"## Overview\n\nCLI commands for managing outposts.\n\n## Commands\n\n```bash\n# List configured outposts\ngt outpost list\nNAME TYPE WORKERS STATUS\nlocal local 2/4 healthy\ngce-burst ssh 0/8 healthy\ncloudrun-burst cloudrun 0/20 healthy\n\n# Detailed status\ngt outpost status [name]\nOutpost: cloudrun-burst\nType: cloudrun\nProject: my-gcp-project\nRegion: us-central1\nService: gastown-worker\nActive Workers: 3/20\nCost (today): $0.42\nStatus: healthy\n\n# Add new outpost interactively\ngt outpost add\n? Outpost type: [local/ssh/cloudrun]\n\u003e cloudrun\n? Name: cloudrun-burst\n? GCP Project: my-gcp-project\n...\n\n# Add via flags\ngt outpost add cloudrun --name burst --project my-proj --region us-central1\n\n# Remove outpost\ngt outpost remove \u003cname\u003e\n\n# Test connectivity\ngt outpost ping \u003cname\u003e\n```\n\n## Files\n\n- `cmd/gt/outpost.go` - Cobra command group\n- `cmd/gt/outpost_list.go`\n- `cmd/gt/outpost_status.go`\n- `cmd/gt/outpost_add.go`\n\nDepends on: gt-9a2.3 (config/manager)","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-16T18:02:12.317032-08:00","updated_at":"2025-12-27T21:29:57.726154-08:00","dependencies":[{"issue_id":"gt-9a2.4","depends_on_id":"gt-9a2","type":"parent-child","created_at":"2025-12-16T18:02:12.319322-08:00","created_by":"daemon"},{"issue_id":"gt-9a2.4","depends_on_id":"gt-9a2.3","type":"blocks","created_at":"2025-12-16T18:03:45.55722-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.726154-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9a2.5","title":"SSHOutpost: Full Gas Town clone on remote VM","description":"## Overview\n\nImplement SSHOutpost for running workers on remote VMs via SSH.\n\n## Model\n\nRemote VM has a full Gas Town clone. SSHOutpost:\n- Connects via SSH\n- Spawns tmux sessions on remote\n- Uses remote git clone for code/beads\n- Syncs via git push/pull\n\n## Implementation\n\n```go\ntype SSHOutpost struct {\n name string\n host string\n user string\n keyPath string\n townPath string // e.g., /home/steve/ai\n maxWorkers int\n conn *ssh.Client\n}\n\nfunc NewSSHOutpost(config OutpostConfig) (*SSHOutpost, error)\n\nfunc (o *SSHOutpost) Spawn(issue string, cfg WorkerConfig) (Worker, error) {\n // SSH: create tmux session on remote\n // SSH: start claude in session\n // Return SSHWorker\n}\n\nfunc (o *SSHOutpost) Ping() error {\n // Test SSH connectivity\n}\n```\n\n## SSHWorker\n\n```go\ntype SSHWorker struct {\n outpost *SSHOutpost\n id string\n session string\n issue string\n}\n\nfunc (w *SSHWorker) Attach() error {\n // SSH + tmux attach (opens terminal)\n}\n\nfunc (w *SSHWorker) Logs() (io.Reader, error) {\n // SSH: tmux capture-pane\n}\n```\n\n## Prerequisites on Remote VM\n\n1. Gas Town clone at townPath\n2. Claude Code installed\n3. Git credentials configured\n4. SSH key access\n\n## Integration with gt-f9x.7-8\n\nThis builds on the Connection interface from gt-f9x.7/8. SSHOutpost uses SSHConnection internally for remote operations.\n\nDepends on: gt-9a2.1 (interfaces), gt-f9x.7 (Connection interface)","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-16T18:02:24.507039-08:00","updated_at":"2025-12-27T21:29:57.717809-08:00","dependencies":[{"issue_id":"gt-9a2.5","depends_on_id":"gt-9a2","type":"parent-child","created_at":"2025-12-16T18:02:24.508937-08:00","created_by":"daemon"},{"issue_id":"gt-9a2.5","depends_on_id":"gt-9a2.1","type":"blocks","created_at":"2025-12-16T18:03:45.665254-08:00","created_by":"daemon"},{"issue_id":"gt-9a2.5","depends_on_id":"gt-f9x.7","type":"blocks","created_at":"2025-12-16T18:03:45.770134-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.717809-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9a2.6","title":"CloudRun worker container: Dockerfile and entrypoint","description":"## Overview\n\nDocker container image for Cloud Run workers.\n\n## Dockerfile\n\n```dockerfile\nFROM golang:1.21-alpine AS builder\nWORKDIR /app\nCOPY . .\nRUN go build -o gt ./cmd/gt\n\nFROM ubuntu:22.04\n\n# Install Claude Code\nRUN apt-get update \u0026\u0026 apt-get install -y curl git nodejs npm \u0026\u0026 npm install -g @anthropic-ai/claude-code \u0026\u0026 apt-get clean\n\n# Install gt\nCOPY --from=builder /app/gt /usr/local/bin/gt\n\n# Worker entrypoint\nCOPY deploy/cloudrun/entrypoint.sh /entrypoint.sh\nRUN chmod +x /entrypoint.sh\n\nEXPOSE 8080\nENTRYPOINT [\"/entrypoint.sh\"]\n```\n\n## Entrypoint\n\n```bash\n#!/bin/bash\ngit config --global user.name \"$GIT_USER\"\ngit config --global user.email \"$GIT_EMAIL\"\nexec gt worker serve --port 8080\n```\n\n## Files\n\n- `deploy/cloudrun/Dockerfile`\n- `deploy/cloudrun/entrypoint.sh`\n- `deploy/cloudrun/README.md`\n\n## Dependencies\n\nDepends on: gt-9a2.11 (HTTP server - gt worker serve)","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-16T18:02:37.583941-08:00","updated_at":"2025-12-27T21:29:57.709524-08:00","dependencies":[{"issue_id":"gt-9a2.6","depends_on_id":"gt-9a2","type":"parent-child","created_at":"2025-12-16T18:02:37.585955-08:00","created_by":"daemon"},{"issue_id":"gt-9a2.6","depends_on_id":"gt-9a2.11","type":"blocks","created_at":"2025-12-16T18:15:38.208009-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.709524-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9a2.7","title":"CloudRun protocol types: WorkRequest, WorkEvent, WorkResult","description":"## Overview\n\nHTTP protocol types for Cloud Run work dispatch. **Types only** - server and client are separate tasks.\n\n## Request Type\n\n```go\ntype WorkRequest struct {\n IssueID string `json:\"issue_id\"`\n Rig RigConfig `json:\"rig\"`\n Beads BeadsConfig `json:\"beads\"`\n Branch string `json:\"worker_branch\"`\n Context map[string]any `json:\"context,omitempty\"`\n}\n\ntype RigConfig struct {\n URL string `json:\"url\"`\n Branch string `json:\"branch\"`\n}\n\ntype BeadsConfig struct {\n URL string `json:\"url\"`\n Branch string `json:\"branch\"`\n}\n```\n\n## Response Types (streaming NDJSON)\n\n```go\ntype WorkEvent struct {\n Type string `json:\"type\"` // status, log, progress, result, error\n Status string `json:\"status,omitempty\"`\n Line string `json:\"line,omitempty\"`\n Percent int `json:\"percent,omitempty\"`\n Branch string `json:\"branch,omitempty\"`\n PRURL string `json:\"pr_url,omitempty\"`\n Code string `json:\"code,omitempty\"`\n Message string `json:\"message,omitempty\"`\n}\n\ntype WorkResult struct {\n Status string `json:\"status\"` // done, failed\n Branch string `json:\"branch\"`\n PRURL string `json:\"pr_url,omitempty\"`\n Error string `json:\"error,omitempty\"`\n}\n```\n\n## Files\n\n- `internal/cloudrun/protocol.go` - All types above\n\n## Notes\n\nThis is just types. Server (gt-9a2.7b) and client (gt-9a2.7c) are separate tasks.\nSmall, focused task - can complete quickly.","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-16T18:02:51.480019-08:00","updated_at":"2025-12-27T21:29:57.701115-08:00","dependencies":[{"issue_id":"gt-9a2.7","depends_on_id":"gt-9a2","type":"parent-child","created_at":"2025-12-16T18:02:51.48197-08:00","created_by":"daemon"},{"issue_id":"gt-9a2.7","depends_on_id":"gt-9a2.1","type":"blocks","created_at":"2025-12-16T18:03:45.984572-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.701115-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9a2.8","title":"CloudRunOutpost: Basic implementation","description":"## Overview\n\nBasic CloudRunOutpost implementation. Persistent connections and cost tracking are separate tasks.\n\n## Implementation\n\n```go\ntype CloudRunOutpost struct {\n name string\n project string\n region string\n service string\n maxWorkers int\n client *WorkClient\n workers map[string]*CloudRunWorker\n mu sync.RWMutex\n}\n\nfunc NewCloudRunOutpost(cfg OutpostConfig) (*CloudRunOutpost, error) {\n serviceURL := fmt.Sprintf(\n \"https://%s-%s.a.run.app\",\n cfg.Service, cfg.Region,\n )\n return \u0026CloudRunOutpost{\n name: cfg.Name,\n project: cfg.Project,\n region: cfg.Region,\n service: cfg.Service,\n maxWorkers: cfg.MaxWorkers,\n client: NewWorkClient(serviceURL),\n workers: make(map[string]*CloudRunWorker),\n }, nil\n}\n```\n\n## Spawn\n\n```go\nfunc (o *CloudRunOutpost) Spawn(issue string, cfg WorkerConfig) (Worker, error) {\n req := WorkRequest{\n IssueID: issue,\n Rig: RigConfig{URL: cfg.RigURL, Branch: cfg.GitBranch},\n Beads: BeadsConfig{URL: cfg.BeadsURL, Branch: \"beads-sync\"},\n Branch: \"polecat/\" + issue,\n }\n \n events, err := o.client.DispatchWork(context.Background(), req)\n if err != nil {\n return nil, err\n }\n \n worker := \u0026CloudRunWorker{\n id: uuid.New().String(),\n outpost: o.name,\n issue: issue,\n events: events,\n status: WorkerStatusWorking,\n }\n \n o.mu.Lock()\n o.workers[worker.id] = worker\n o.mu.Unlock()\n \n go worker.monitor()\n return worker, nil\n}\n```\n\n## CloudRunWorker\n\n```go\ntype CloudRunWorker struct {\n id string\n outpost string\n issue string\n status WorkerStatus\n events \u003c-chan WorkEvent\n logs []string\n}\n\nfunc (w *CloudRunWorker) Attach() error {\n return errors.New(\"Cloud Run workers do not support attach\")\n}\n\nfunc (w *CloudRunWorker) Logs() (io.Reader, error) {\n return strings.NewReader(strings.Join(w.logs, \"\\n\")), nil\n}\n```\n\n## Files\n\n- `internal/outpost/cloudrun.go`\n\n## Dependencies\n\nDepends on: gt-9a2.1 (interfaces), gt-9a2.12 (HTTP client)\nBlocks: gt-9a2.13 (persistent connections), gt-9a2.14 (cost tracking)","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-16T18:03:06.803401-08:00","updated_at":"2025-12-27T21:29:57.692806-08:00","dependencies":[{"issue_id":"gt-9a2.8","depends_on_id":"gt-9a2","type":"parent-child","created_at":"2025-12-16T18:03:06.805524-08:00","created_by":"daemon"},{"issue_id":"gt-9a2.8","depends_on_id":"gt-9a2.1","type":"blocks","created_at":"2025-12-16T18:03:46.081721-08:00","created_by":"daemon"},{"issue_id":"gt-9a2.8","depends_on_id":"gt-9a2.12","type":"blocks","created_at":"2025-12-16T18:15:54.915831-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.692806-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9a2.9","title":"Outpost assignment policy: Smart work routing","description":"## Overview\n\nPolicy engine for deciding which outpost gets which work.\n\n## Policy Configuration\n\n```yaml\npolicy:\n # Default order of preference\n default_preference: [local, gce-burst, cloudrun-burst]\n \n # Rules applied in order\n rules:\n # Background work → Cloud Run (cheap)\n - condition: \"priority \u003e= P3\"\n prefer: cloudrun-burst\n \n # Long tasks → VM (persistent)\n - condition: \"estimated_duration \u003e 30m\"\n prefer: gce-burst\n \n # Specific epic → specific outpost\n - condition: \"epic == gt-abc\"\n prefer: local\n```\n\n## Implementation\n\n```go\ntype AssignmentPolicy struct {\n DefaultPreference []string\n Rules []PolicyRule\n}\n\ntype PolicyRule struct {\n Condition string // Simple expression\n Prefer string // Outpost name\n Require string // Must use this outpost\n}\n\nfunc (p *AssignmentPolicy) SelectOutpost(\n issue Issue, \n outposts map[string]Outpost,\n) Outpost {\n // Check rules in order\n for _, rule := range p.Rules {\n if rule.Matches(issue) {\n if op, ok := outposts[rule.Prefer]; ok {\n if op.ActiveWorkers() \u003c op.MaxWorkers() {\n return op\n }\n }\n }\n }\n \n // Fall back to default preference\n for _, name := range p.DefaultPreference {\n if op, ok := outposts[name]; ok {\n if op.ActiveWorkers() \u003c op.MaxWorkers() {\n return op\n }\n }\n }\n \n return nil // All outposts at capacity\n}\n```\n\n## Condition Language\n\nSimple expressions, not a full DSL:\n\n```\npriority \u003e= P3\npriority == P0\nestimated_duration \u003e 30m\nepic == gt-abc\ntype == bug\nlabel contains \"urgent\"\n```\n\n## Files\n\n- `internal/outpost/policy.go`\n- `internal/outpost/condition.go`\n\nDepends on: gt-9a2.3 (config)","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-16T18:03:21.08101-08:00","updated_at":"2025-12-27T21:29:57.684514-08:00","dependencies":[{"issue_id":"gt-9a2.9","depends_on_id":"gt-9a2","type":"parent-child","created_at":"2025-12-16T18:03:21.083256-08:00","created_by":"daemon"},{"issue_id":"gt-9a2.9","depends_on_id":"gt-9a2.3","type":"blocks","created_at":"2025-12-16T18:03:46.300288-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.684514-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9ae69","title":"mol-deacon-patrol","description":"Mayor's daemon patrol loop.\n\nThe Deacon is the Mayor's background process that runs continuously,\nhandling callbacks, monitoring rig health, and performing cleanup.\nEach patrol cycle runs these steps in sequence, then loops or exits.\n","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-25T13:43:08.241676-08:00","updated_at":"2025-12-25T14:12:41.853879-08:00","deleted_at":"2025-12-25T14:12:41.853879-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-9bpji","title":"Digest: mol-deacon-patrol","description":"P20: stable - handoff after 20 patrols","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:00:46.97186-08:00","updated_at":"2025-12-27T21:26:02.343606-08:00","deleted_at":"2025-12-27T21:26:02.343606-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9d5zz","title":"Digest: mol-deacon-patrol","description":"Patrol 11: All clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:03:13.84255-08:00","updated_at":"2025-12-27T21:26:04.001681-08:00","deleted_at":"2025-12-27T21:26:04.001681-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9e8q","title":"Digest: mol-deacon-patrol","description":"Patrol 14","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:10:11.832142-08:00","updated_at":"2025-12-27T21:26:04.425053-08:00","deleted_at":"2025-12-27T21:26:04.425053-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9g82","title":"Polecat wisp architecture: Proto → Wisp → Mol pattern","description":"Design the three-layer architecture for polecat lifecycle:\n\n## The Insight\n\nPolecats should run a ONE-SHOT WISP (not looping like patrols):\n\n**Step 1: Onboard**\n- Read full polecat protocol (polecat.md template)\n- Learn Gas Town operation, exit strategies, molecule protocol\n- This is the 'how to be a polecat' education\n\n**Step 2: Execute Mol**\n- Run the assigned molecule (the actual work item)\n- Could span multiple sessions via session continuity\n- The mol is pure work content (epic, issue, feature)\n\n**Step 3: Cleanup**\n- Run final step of the wisp\n- Self-delete / request shutdown\n\n## Three Layers\n\n- **Proto**: polecat.md template (instructions for being a polecat)\n- **Wisp**: One-shot harness instantiated from proto (wraps the mol)\n- **Mol**: The work item (issue/epic being processed)\n\n## Why This Matters\n\n1. **Separation of concerns**: Protocol (how) vs Work (what)\n2. **Reusability**: Same wisp harness wraps any mol\n3. **Extensibility**: Plugin points for custom behavior\n4. **Session continuity**: Wisp handles multi-session, not the mol\n5. **Blurred control/data planes**: Intentional in Gas Town\n\n## Design Questions\n\n1. How does proto (polecat.md) become a wisp instance?\n2. What are the plugin/extension points?\n3. Should all 'engineer in a box' mols use proto → wisp → mol?\n4. How does this relate to refinery/deacon patterns?","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-22T15:54:17.446941-08:00","updated_at":"2025-12-27T21:29:53.192117-08:00","deleted_at":"2025-12-27T21:29:53.192117-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-9h40s","title":"Digest: mol-deacon-patrol","description":"Patrol 17: clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:17:31.839529-08:00","updated_at":"2025-12-27T21:26:00.963428-08:00","deleted_at":"2025-12-27T21:26:00.963428-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9ibez","title":"Digest: mol-deacon-patrol","description":"Patrol 4: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T10:44:07.785738-08:00","updated_at":"2026-01-01T10:44:07.785738-08:00","closed_at":"2026-01-01T10:44:07.785703-08:00","dependencies":[{"issue_id":"gt-9ibez","depends_on_id":"gt-eph-8a33","type":"parent-child","created_at":"2026-01-01T10:44:07.786988-08:00","created_by":"deacon"}]}
{"id":"gt-9its","title":"Digest: mol-deacon-patrol","description":"Patrol #11","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:24:28.490294-08:00","updated_at":"2025-12-27T21:26:04.742199-08:00","deleted_at":"2025-12-27T21:26:04.742199-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9j4a8","title":"Merge: valkyrie-mjw71b7u","description":"branch: polecat/valkyrie-mjw71b7u\ntarget: main\nsource_issue: valkyrie-mjw71b7u\nrig: gastown\nagent_bead: gt-gastown-polecat-valkyrie","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T17:31:54.984761-08:00","updated_at":"2026-01-01T19:55:59.970646-08:00","closed_at":"2026-01-01T19:55:59.970646-08:00","close_reason":"Stale MR - branch no longer exists","created_by":"gastown/polecats/valkyrie"}
{"id":"gt-9kc2","title":"Refinery needs manual restart/handoff mechanism","description":"Refinery sessions can get stuck or need restart. Currently requires manual intervention. Need: 1) gt refinery restart command, 2) Refinery self-handoff on context fill, 3) Auto-recovery from stuck states.","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-23T00:19:10.944679-08:00","updated_at":"2025-12-27T21:29:53.06587-08:00","deleted_at":"2025-12-27T21:29:53.06587-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-9klo9","title":"Digest: mol-deacon-patrol","description":"Patrol 18: All agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T05:36:31.037714-08:00","updated_at":"2025-12-27T21:26:03.71017-08:00","deleted_at":"2025-12-27T21:26:03.71017-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9lx4","title":"generate-summary","description":"Generate a summary for molecule squash.\nFile any remaining work as issues.\n\nDocument any important context for the squash digest.\n\nDepends: submit-merge","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T21:48:26.322644-08:00","updated_at":"2025-12-25T14:12:42.183864-08:00","dependencies":[{"issue_id":"gt-9lx4","depends_on_id":"gt-0s99","type":"blocks","created_at":"2025-12-21T21:48:26.330703-08:00","created_by":"stevey"}],"deleted_at":"2025-12-25T14:12:42.183864-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9mb","title":"Recreate beads rigs with fresh clones","description":"## Problem\n\nBeads rigs have schema mismatches (missing thread_id column, etc.) from development iteration.\n\n## Tasks\n\n1. Shut down any active polecats\n2. Delete existing beads rigs: mayor/rig, refinery/rig, witness/rig, crew/*\n3. Re-clone from beads repo\n4. Run bd init in each new clone\n\n## Rigs to recreate\n\n- /Users/stevey/gt/beads/mayor/rig\n- /Users/stevey/gt/beads/refinery/rig\n- /Users/stevey/gt/beads/crew/* (if any)\n\n## Source\n\nClone from beads repo (need to confirm remote URL)","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-18T19:13:32.208448-08:00","updated_at":"2025-12-27T21:29:45.562016-08:00","deleted_at":"2025-12-27T21:29:45.562016-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9mgn","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:25","description":"Patrol 10: quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:25:35.360033-08:00","updated_at":"2025-12-27T21:26:05.272331-08:00","deleted_at":"2025-12-27T21:26:05.272331-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9myic","title":"Digest: mol-deacon-patrol","description":"Patrol 5: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:30:46.894355-08:00","updated_at":"2025-12-27T21:26:00.821379-08:00","deleted_at":"2025-12-27T21:26:00.821379-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9mzv1","title":"Digest: mol-deacon-patrol","description":"Patrol 14: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:34:41.715664-08:00","updated_at":"2025-12-27T21:26:03.886583-08:00","deleted_at":"2025-12-27T21:26:03.886583-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9n1sp","title":"Digest: mol-deacon-patrol","description":"Patrol 12: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T17:46:22.718228-08:00","updated_at":"2025-12-27T21:26:01.133927-08:00","deleted_at":"2025-12-27T21:26:01.133927-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9n9gk","title":"Digest: mol-deacon-patrol","description":"Patrol 14: All healthy, 2 crew active","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T17:47:11.478869-08:00","updated_at":"2025-12-27T21:26:01.125278-08:00","deleted_at":"2025-12-27T21:26:01.125278-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9nf","title":"gt spawn should create fresh polecat worktree, never reuse","description":"Currently gt spawn tries to reuse existing polecats, which causes:\n\n1. Stale beads database\n2. Stale code (behind main/integration)\n3. Old inbox messages\n4. Git history pollution\n\n## Current Behavior\n\ngt spawn:\n1. Looks for idle polecat in pool\n2. If found, reuses existing worktree\n3. Assigns new issue\n\n## Desired Behavior\n\ngt spawn:\n1. Always create FRESH polecat from current main/integration\n2. Fresh worktree with clean beads\n3. No reuse of old worktrees\n\n## Name Pool Still Useful\n\nKeep name pool for:\n- Allocating themed names (mad-max, etc.)\n- Tracking which names are in use\n\nBut worktrees should be created fresh each time.\n\n## Implementation\n\nIn spawn.go, before starting work:\n1. If worktree exists: remove it first\n2. Create fresh worktree from integration branch\n3. Sync beads from rig to polecat\n4. Then proceed with work assignment\n\nThis ensures polecats always start with latest code and beads.","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-20T15:23:35.250531-08:00","updated_at":"2025-12-27T21:29:53.702898-08:00","dependencies":[{"issue_id":"gt-9nf","depends_on_id":"gt-8v8","type":"blocks","created_at":"2025-12-20T15:40:09.069331-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.702898-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-9o0bf","title":"Digest: mol-deacon-patrol","description":"Patrol 17: All clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T16:38:56.364854-08:00","updated_at":"2025-12-27T21:26:03.075238-08:00","deleted_at":"2025-12-27T21:26:03.075238-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9ojgy","title":"Digest: mol-deacon-patrol","description":"Patrol 6: Nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:48:38.058041-08:00","updated_at":"2025-12-27T21:26:04.185508-08:00","deleted_at":"2025-12-27T21:26:04.185508-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9pyg","title":"Deacon tmux status bar: heartbeat timing and patrol state","description":"## Problem\n\nThe deacon shows patrol output but lacks real-time observability:\n- No indication of when last heartbeat occurred\n- No indication of when next heartbeat is scheduled\n- No visibility into current patrol step\n\n## Desired Behavior\n\nThe tmux status bar for gt-deacon should show:\n\n```\n⛪ Deacon | ❤️ 2m ago | ⏰ 3m | 📍 health-scan (3/7)\n```\n\nComponents:\n- **Role icon**: ⛪ (deacon identity)\n- **Last heartbeat**: ❤️ 2m ago (time since last heartbeat file update)\n- **Next heartbeat**: ⏰ 3m (time until daemon would poke)\n- **Current step**: 📍 health-scan (3/7) (current patrol atom, step N of M)\n\n## Implementation\n\n1. **Read heartbeat.json** for last update time\n2. **Calculate next poke** based on daemon interval (default 5m)\n3. **Read current wisp** from .beads-wisp/ to get patrol progress\n4. **Update tmux status** periodically or on state change\n\nOptions:\n- tmux status-right with shell script\n- gt deacon status --tmux for formatted output\n- Hook into patrol step completion\n\n## Related\n\n- gt-id36: Deacon Kernel epic\n- gt-3x0z: Wisp Molecule Integration","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-22T03:03:38.693983-08:00","updated_at":"2025-12-27T21:29:56.387158-08:00","deleted_at":"2025-12-27T21:29:56.387158-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-9r4sk","title":"Digest: mol-deacon-patrol","description":"Patrol 20: all clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T11:06:05.822838-08:00","updated_at":"2025-12-27T21:26:03.339464-08:00","deleted_at":"2025-12-27T21:26:03.339464-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9s6rh","title":"Digest: mol-deacon-patrol","description":"Patrol 18: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:00:05.490551-08:00","updated_at":"2025-12-27T21:26:01.76229-08:00","deleted_at":"2025-12-27T21:26:01.76229-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9t14","title":"Wire up --wisp flag for ephemeral molecules","description":"The --wisp flag exists in gt sling but isn't wired through to bd mol.\n\nNeed:\n1. bd mol run --wisp - spawn to .beads-wisp/ instead of .beads/\n2. Automatic burn on molecule completion\n3. Optional squash to digest for audit trail\n\nCurrently:\n- sling.go sets thing.IsWisp but never uses it\n- bd mol run has no --wisp flag\n- .beads-wisp/ exists but nothing writes to it","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-22T13:12:25.192854-08:00","updated_at":"2025-12-27T21:29:53.21739-08:00","deleted_at":"2025-12-27T21:29:53.21739-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-9uy0","title":"Remove 'spawn' terminology from molecular chemistry","description":"Clean all docs and code to present ONLY current terminology. No migration paths, no \"old vs new\" tables, no deprecated flags. The codebase should read as if the current design was always the design.\n\n**Terminology rules:**\n- spawn = polecats/workers ONLY\n- pour = create persistent mol\n- wisp = create ephemeral wisp \n- run = create and execute\n\n**Scope:**\n- Remove all \"Old → New\" migration tables\n- Remove deprecated flag documentation (--persistent, etc.)\n- Remove chemistry-design-changes.md entirely (it is a migration doc)\n- Clean all docs to use current terminology only\n- Rename spawn functions in code that deal with molecules","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-24T13:06:43.805547-08:00","updated_at":"2025-12-27T21:29:52.622049-08:00","dependencies":[{"issue_id":"gt-9uy0","depends_on_id":"gt-jo9n","type":"blocks","created_at":"2025-12-24T13:06:49.341648-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.622049-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9wb07","title":"Digest: mol-deacon-patrol","description":"Patrol 6: all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T21:55:01.948504-08:00","updated_at":"2025-12-27T21:26:00.567108-08:00","deleted_at":"2025-12-27T21:26:00.567108-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9xx4","title":"Digest: mol-deacon-patrol","description":"Patrol #14","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:25:08.524261-08:00","updated_at":"2025-12-27T21:26:04.717165-08:00","deleted_at":"2025-12-27T21:26:04.717165-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9yvrk","title":"Digest: mol-deacon-patrol","description":"Patrol 20 - handoff threshold","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T20:06:59.312394-08:00","updated_at":"2025-12-27T21:26:00.626948-08:00","deleted_at":"2025-12-27T21:26:00.626948-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-9za0","title":"Mol Mall: Molecule marketplace distribution","description":"Design the Mol Mall distribution mechanism for sharing molecules.\n\n## Core Insight\n\nMol Mall is just a registry of molecules.jsonl fragments. No special format needed.\n\n## Distribution Model\n\n```\nMol Mall (registry)\n │\n │ bd mol search \"security\"\n ▼\nMolecule catalog (searchable index)\n │\n │ bd mol install mol-security-scan\n ▼\nDownload molecules.jsonl fragment\n │\n │ Append to ~/.beads/molecules.jsonl\n ▼\nAvailable locally: bd mol list\n```\n\n## Registry Format\n\nSimple JSON index served from HTTPS:\n\n```json\n{\n \"molecules\": [\n {\n \"id\": \"mol-security-scan\",\n \"title\": \"Security Scan\",\n \"description\": \"OWASP Top 10 vulnerability checker\",\n \"labels\": [\"plugin\", \"code-review\", \"tier:sonnet\"],\n \"author\": \"anthropic\",\n \"version\": \"1.0.0\",\n \"url\": \"https://mol-mall.anthropic.com/molecules/mol-security-scan.jsonl\"\n }\n ]\n}\n```\n\n## CLI Commands\n\n```bash\nbd mol search \u003cquery\u003e # Search registry\nbd mol info \u003cid\u003e # Show molecule details\nbd mol install \u003cid\u003e # Download and install\nbd mol uninstall \u003cid\u003e # Remove from catalog\nbd mol update # Update all installed molecules\n```\n\n## Installation Locations\n\n```\n~/.beads/molecules.jsonl # User-level (default install location)\n~/gt/.beads/molecules.jsonl # Town-level (shared across rigs)\n.beads/molecules.jsonl # Project-level (team-specific)\n```\n\n## Version Management\n\nMolecules can have versions via labels: `version:1.0.0`\nInstall specific version: `bd mol install mol-security-scan@1.0.0`\n\n## Auth (Future)\n\nPrivate registries could require auth token.\nEnterprise Mol Malls for internal distribution.\n\n## Related\n\n- gt-u818: Plugin System (plugins ARE molecules)\n- molecular-chemistry.md: Format documentation","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-23T05:14:35.698369-08:00","updated_at":"2025-12-27T21:29:56.068175-08:00","deleted_at":"2025-12-27T21:29:56.068175-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-9zic","title":"Merge: gt-rp0k","description":"branch: polecat/nux\ntarget: main\nsource_issue: gt-rp0k\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T23:40:52.956859-08:00","updated_at":"2025-12-27T21:27:22.510044-08:00","deleted_at":"2025-12-27T21:27:22.510044-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-a07f","title":"Chemistry UX commands from Beads","description":"Waiting on Beads repo to implement chemistry UX sugar commands (bd pour, bd wisp, bd hook, --pour flag). These are nice-to-have polish items, not blockers for core functionality.\n\nSee: gastown/mayor/rig/docs/chemistry-design-changes.md","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T02:19:04.083172-08:00","updated_at":"2025-12-27T21:29:56.395503-08:00","deleted_at":"2025-12-27T21:29:56.395503-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-a2lrp","title":"Digest: mol-deacon-patrol","description":"Patrol 14: All agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T03:36:27.960057-08:00","updated_at":"2025-12-27T21:26:03.746799-08:00","deleted_at":"2025-12-27T21:26:03.746799-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-a41um","title":"Daemon must delete LIFECYCLE messages after processing","description":"## Problem\n\nCrew workers were being killed every 5 minutes.\n\n## Root Cause\n\nOld stale LIFECYCLE messages from days ago were sitting in the deacon's inbox. The daemon's ProcessLifecycleRequests() was re-finding and re-processing them every 5-minute heartbeat cycle.\n\nThe closeMessage() function in lifecycle.go calls 'gt mail delete' but failures are silent - the message stays in the inbox and gets reprocessed on the next heartbeat.\n\n## Fix Required\n\n1. Ensure closeMessage() actually deletes the message\n2. If delete fails, log the error AND mark the message as read so it's not reprocessed\n3. Consider adding a 'processed' label or moving to a processed folder\n\n## Files\n\n- internal/daemon/lifecycle.go: closeMessage() function\n- internal/daemon/lifecycle.go: ProcessLifecycleRequests() loop","status":"tombstone","priority":0,"issue_type":"bug","created_at":"2025-12-26T17:42:52.25807-08:00","updated_at":"2025-12-27T21:29:45.243013-08:00","deleted_at":"2025-12-27T21:29:45.243013-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-a5yv","title":"Digest: mol-deacon-patrol","description":"Patrol 4: OK","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T20:56:31.09396-08:00","updated_at":"2025-12-27T21:26:04.996512-08:00","deleted_at":"2025-12-27T21:26:04.996512-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-a6dy","title":"Merge: gt-0ei3","description":"branch: polecat/capable\ntarget: main\nsource_issue: gt-0ei3\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-20T09:28:33.182299-08:00","updated_at":"2025-12-27T21:27:22.66747-08:00","deleted_at":"2025-12-27T21:27:22.66747-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-a7o93","title":"Digest: mol-deacon-patrol","description":"Patrol 10: all clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T11:05:06.986877-08:00","updated_at":"2025-12-27T21:26:03.413316-08:00","deleted_at":"2025-12-27T21:26:03.413316-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-a7zs","title":"Digest: mol-deacon-patrol","description":"Patrol #7: Stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:32:12.166348-08:00","updated_at":"2025-12-27T21:26:04.342802-08:00","deleted_at":"2025-12-27T21:26:04.342802-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-a817","title":"Update polecat CLAUDE.md with molecule workflow","description":"Add molecule execution guidance to polecat context:\n\n## What to Add\n\n### Molecule Awareness\n- Polecats execute Wisps (ephemeral molecule instances)\n- The work assignment mail includes molecule context\n- Current step is tracked in the wisp\n\n### Workflow Protocol\n1. Read assignment (includes molecule ID and current step)\n2. Execute current step\n3. Update step status via bd mol step\n4. Generate summary when all steps complete\n5. Run bd mol squash to compress wisp into digest\n\n### Key Commands\n- bd mol show \u003cwisp-id\u003e - view current molecule state\n- bd mol step \u003cwisp-id\u003e --status=complete - mark step done\n- bd mol squash \u003cwisp-id\u003e --summary='...' - complete molecule\n\n### Summary Generation\nWhen completing work, generate a summary that:\n- Lists what was accomplished\n- Notes any deviations from the plan\n- Captures key decisions made\n- This becomes the permanent digest","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T16:33:05.36066-08:00","updated_at":"2025-12-27T21:29:56.545663-08:00","dependencies":[{"issue_id":"gt-a817","depends_on_id":"gt-62hm","type":"blocks","created_at":"2025-12-21T16:33:17.457167-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.545663-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-a95","title":"Refinery background daemon mode","description":"Refinery currently only works in foreground mode. Background daemon is stubbed.\n\n## Current State\nmanager.go line 128-129:\n```go\n// Background mode: spawn a new process\n// For MVP, we just mark as running - actual daemon implementation in gt-ov2\nreturn nil\n```\n\n## Requirements\n\n### 1. Background Process Spawning\n```go\nfunc (m *Manager) Start(foreground bool) error {\n if !foreground {\n // Spawn gt refinery start --foreground as subprocess\n cmd := exec.Command(os.Args[0], \"refinery\", \"start\", m.rig.Name, \"--foreground\")\n cmd.Start() // Don't wait\n // Record PID\n }\n}\n```\n\n### 2. PID File Management\n- Write PID to .gastown/refinery.pid\n- Check PID validity on status\n- Clean up stale PID files\n\n### 3. Log Output\n- Redirect stdout/stderr to .gastown/refinery.log\n- Log rotation (optional for MVP)\n\n### 4. Graceful Shutdown\n- Handle SIGTERM/SIGINT\n- Complete current merge before exit\n- Update state to stopped\n\n### 5. Health Check\n- Process existence check via kill -0\n- Optional: heartbeat file with timestamp\n\n## Files to Modify\n- internal/refinery/manager.go: Start(), Status(), process spawning\n\n## Acceptance Criteria\n- [ ] gt refinery start \u003crig\u003e spawns background process\n- [ ] gt refinery status shows running with PID\n- [ ] gt refinery stop sends SIGTERM and waits\n- [ ] Logs written to .gastown/refinery.log\n- [ ] Survives terminal close","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T14:46:53.366619-08:00","updated_at":"2025-12-27T21:29:54.461089-08:00","deleted_at":"2025-12-27T21:29:54.461089-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-aa4ao","title":"Digest: mol-deacon-patrol","description":"Patrol 13: clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:16:10.046179-08:00","updated_at":"2025-12-27T21:26:00.99733-08:00","deleted_at":"2025-12-27T21:26:00.99733-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-aa5l","title":"Digest: mol-deacon-patrol","description":"Patrol #14: Stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:34:09.997525-08:00","updated_at":"2025-12-27T21:26:04.28514-08:00","deleted_at":"2025-12-27T21:26:04.28514-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ab30","title":"Digest: mol-deacon-patrol","description":"Patrol #9: Stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:32:51.348182-08:00","updated_at":"2025-12-27T21:26:04.326497-08:00","deleted_at":"2025-12-27T21:26:04.326497-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-abfu","title":"Polecat template: simplify completion section to just 'gt done'","description":"Completion section lists multiple options (gt done, gt handoff, bd sync). Should just say 'gt done' with succinct guidance on when bd sync is needed. Remove the alternatives that cause confusion.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T16:56:48.024442-08:00","updated_at":"2025-12-27T21:29:55.926381-08:00","dependencies":[{"issue_id":"gt-abfu","depends_on_id":"gt-t9u7","type":"parent-child","created_at":"2025-12-23T16:57:16.45135-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.926381-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-abu4c","title":"Merge: gt-svdsy","description":"branch: polecat/capable-mjtltnm5\ntarget: main\nsource_issue: gt-svdsy\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T22:20:17.595631-08:00","updated_at":"2025-12-30T23:12:42.882382-08:00","closed_at":"2025-12-30T23:12:42.882382-08:00","close_reason":"Branch already merged","created_by":"gastown/polecats/capable"}
{"id":"gt-ac5v","title":"health-scan","description":"Ping Witnesses and Refineries. Run gt status --health. Remediate if down.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T17:51:45.436913-08:00","updated_at":"2025-12-25T15:52:43.900154-08:00","deleted_at":"2025-12-25T15:52:43.900154-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-acopl","title":"Digest: mol-deacon-patrol","description":"Patrol 4: All clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:00:38.044651-08:00","updated_at":"2025-12-27T21:26:04.04277-08:00","deleted_at":"2025-12-27T21:26:04.04277-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-adaln","title":"Digest: mol-deacon-patrol","description":"Patrol 12: all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T21:58:14.317694-08:00","updated_at":"2025-12-27T21:26:00.515615-08:00","deleted_at":"2025-12-27T21:26:00.515615-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-adc9","title":"implement","description":"Implement the solution for gt-qwyu. Follow codebase conventions.\nFile discovered work as new issues with bd create.\n\nMake regular commits with clear messages.\nKeep changes focused on the assigned issue.\n\nDepends: load-context","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T21:58:52.599953-08:00","updated_at":"2025-12-25T15:52:43.900154-08:00","dependencies":[{"issue_id":"gt-adc9","depends_on_id":"gt-q6hl","type":"parent-child","created_at":"2025-12-21T21:58:52.601414-08:00","created_by":"stevey"},{"issue_id":"gt-adc9","depends_on_id":"gt-leeb","type":"blocks","created_at":"2025-12-21T21:58:52.601977-08:00","created_by":"stevey"}],"deleted_at":"2025-12-25T15:52:43.900154-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-aedd","title":"Polecat template: fix hard-to-parse sentence about persistence","description":"This sentence is hard to parse: 'When in doubt, prefer bd—persistence you don't need beats lost context'. Rewrite for clarity, e.g., 'When in doubt, use bd for tracking - unnecessary persistence is better than lost context.'","status":"tombstone","priority":3,"issue_type":"bug","created_at":"2025-12-23T16:56:53.99389-08:00","updated_at":"2025-12-27T21:29:57.464539-08:00","dependencies":[{"issue_id":"gt-aedd","depends_on_id":"gt-t9u7","type":"parent-child","created_at":"2025-12-23T16:57:16.69287-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.464539-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-af74c","title":"Digest: mol-deacon-patrol","description":"Patrol 15: all healthy, doctor pass","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:35:33.805768-08:00","updated_at":"2025-12-27T21:26:00.73534-08:00","deleted_at":"2025-12-27T21:26:00.73534-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-afe3","title":"Digest: mol-deacon-patrol","description":"Patrol 7: OK","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T20:57:54.201136-08:00","updated_at":"2025-12-27T21:26:04.968773-08:00","deleted_at":"2025-12-27T21:26:04.968773-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-afn0","title":"gt sling: use mail queue for patrol roles (witness, refinery, deacon)","description":"When slinging work to patrol agents, queue via mail instead of replacing hook.\n\n**Rationale:**\nPatrol agents run continuous loops. Replacing their hook with discrete work breaks patrol continuity - when the task completes, the patrol stops.\n\n**New behavior for patrol roles (witness, refinery, deacon):**\n1. Check if patrol is running (hook has patrol molecule attached)\n2. If patrol running:\n - Don't touch hook (patrol stays pinned)\n - Send work assignment mail\n - Print 'Queued for next patrol cycle'\n3. If patrol NOT running:\n - Start default patrol for that role (mol-witness-patrol, mol-refinery-patrol, mol-deacon-patrol)\n - Send work assignment mail\n - Print 'Started patrol and queued work'\n\n**New flags:**\n- --urgent: Interrupt current patrol cycle, process this work immediately\n- --replace: Explicitly terminate patrol and pin discrete work (break-glass)\n\n**No change for:**\n- Polecat, Crew, Mayor (discrete task agents - current behavior)\n\n**Dependencies:**\n- Patrol templates must have 'check inbox' step (verify/add)\n- Need to know default patrol molecule for each role","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-23T16:12:58.431633-08:00","updated_at":"2025-12-27T21:29:52.957031-08:00","deleted_at":"2025-12-27T21:29:52.957031-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-ag8jc","title":"Digest: mol-deacon-patrol","description":"Patrol 4: healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T08:49:27.854956-08:00","updated_at":"2026-01-01T08:49:27.854956-08:00","closed_at":"2026-01-01T08:49:27.854924-08:00"}
{"id":"gt-ai1z","title":"TODO: Detect cycles in molecule dependency graph","description":"molecule.go:302 has a TODO to detect cycles in the dependency graph. Currently, cyclical dependencies could cause issues.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T21:34:28.169096-08:00","updated_at":"2025-12-27T21:29:56.495636-08:00","deleted_at":"2025-12-27T21:29:56.495636-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ake0m","title":"gt sling: auto-sync beads in new polecat worktrees","description":"## Problem\n`gt sling \u003cbead\u003e \u003crig\u003e` could fail to pin because polecat worktree branches persisted and diverged from current state.\n\n## Root Cause\n- Polecat branches (`polecat/\u003cname\u003e`) were reused across runs\n- Stale branches could diverge from main and beads-sync\n- Branch reuse logic was complex with fallback behavior\n\n## Solution Implemented\n**Fresh unique branches per polecat run**\n\nInstead of syncing beads (Options A/B/C), we changed the branch model:\n\n1. **Unique timestamped branches**: Each polecat run creates `polecat/\u003cname\u003e-\u003ctimestamp\u003e`\n2. **No branch reuse**: Each spawn starts fresh from current HEAD\n3. **Simplified code**: Removed branch existence checks and reset logic\n4. **Garbage collection**: Added `gt polecat gc \u003crig\u003e` to clean up old branches\n\n### Why This Works\n- Polecats already use beads redirect (`../../mayor/rig/.beads`)\n- Mayor's beads are authoritative - polecats read from there\n- Fresh branches ensure clean starting state\n- Old branches are ephemeral (never pushed to origin)\n\n### Files Changed\n- `internal/polecat/manager.go`: Add/Recreate/CleanupStaleBranches\n- `internal/git/git.go`: ListBranches helper\n- `internal/cmd/polecat.go`: `gt polecat gc` command\n\n### Commands Added\n```\ngt polecat gc \u003crig\u003e # Clean up stale branches\ngt polecat gc \u003crig\u003e --dry-run # Show what would be deleted\n```","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-27T16:41:04.305298-08:00","updated_at":"2025-12-27T21:29:45.773031-08:00","created_by":"mayor","deleted_at":"2025-12-27T21:29:45.773031-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-alx","title":"Swarm: ephemeral rig support","description":"PGT has ephemeral rigs for swarms - temporary worker groups that are destroyed after landing.\n\nMissing Features:\n- gt swarm init [--rig \u003cname\u003e|--git-url \u003curl\u003e] [--num-workers N]\n- gt swarm worker add/remove/list \u003crig-id\u003e\n- gt swarm rigs - List ephemeral rigs\n- gt swarm destroy \u003crig-id\u003e - Destroy ephemeral rig\n\nDirectory structure:\n\u003cworkspace\u003e/mayor/workers/\u003crig-id\u003e/\n├── rig.json (metadata)\n├── alice/ (git clone)\n├── bob/\n└── carol/\n\nPGT Reference: gastown-py/src/gastown/ephemeral.py\n\nNote: Beads issue gt-kmn.12 mentions this but implementation is missing.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-16T14:47:14.302762-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-amx5o","title":"Digest: mol-deacon-patrol","description":"Patrol 11: all clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T11:05:28.741803-08:00","updated_at":"2025-12-27T21:26:03.405148-08:00","deleted_at":"2025-12-27T21:26:03.405148-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-anm3b","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 16: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:29:25.405682-08:00","updated_at":"2025-12-27T21:26:01.837499-08:00","deleted_at":"2025-12-27T21:26:01.837499-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-anoyp","title":"Session ended: gt-gastown-slit","status":"closed","priority":2,"issue_type":"event","created_at":"2026-01-02T18:25:19.242553-08:00","updated_at":"2026-01-07T10:33:31.947107+13:00","closed_at":"2026-01-03T11:32:45.180847-08:00","close_reason":"Session lifecycle events processed","created_by":"gastown/polecats/slit"}
{"id":"gt-aobh","title":"Polecats should not bd sync on startup","description":"Polecats all share the same beads database at the rig level. The refinery and mayor/witness manage syncing beads.\n\n## Current Behavior\nPolecat startup runs bd sync, causing:\n- Contention when multiple polecats spawn simultaneously\n- Unnecessary sync operations\n- Potential race conditions\n\n## Desired Behavior\n- Polecats should NOT run bd sync on startup\n- They read from the shared beads database\n- Only refinery/witness/mayor sync beads\n\n## Implementation\nRemove bd sync from polecat spawn/startup sequence.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-21T16:40:18.705507-08:00","updated_at":"2025-12-27T21:29:56.529157-08:00","deleted_at":"2025-12-27T21:29:56.529157-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-apf71","title":"Test actor display","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-29T23:41:23.438047-08:00","updated_at":"2025-12-29T23:41:36.970464-08:00","created_by":"gastown/crew/jack","deleted_at":"2025-12-29T23:41:36.970464-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-apft7","title":"Merge: capable-1767084028536","description":"branch: polecat/capable-1767084028536\ntarget: main\nsource_issue: capable-1767084028536\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T01:04:07.332002-08:00","updated_at":"2025-12-30T01:06:10.373431-08:00","closed_at":"2025-12-30T01:06:10.373431-08:00","close_reason":"Conflicts with main - mrqueue package was removed. Notified capable to rebase.","created_by":"gastown/polecats/capable"}
{"id":"gt-apxi8","title":"Test activity feed enhancement","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-28T10:07:33.403971-08:00","updated_at":"2025-12-28T10:10:05.460846-08:00","created_by":"gastown/crew/joe","deleted_at":"2025-12-28T10:10:05.460846-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-aqd8","title":"Witness Patrol Molecule","description":"Create mol-witness-patrol molecule for witness heartbeat loop.\n\nSimilar to DeaconPatrolMolecule but for per-rig witness duties.\n\n## Steps\n1. inbox-check - Process witness mail\n2. survey-workers - List polecats, check status\n3. inspect-workers - Capture pane output, assess state \n4. plugin-run - Execute plugins from \u003crig\u003e/plugins/witness/\n5. decide-actions - Determine nudges, escalations\n6. execute-actions - Send nudges, process shutdowns\n7. context-check - Check own context, handoff if needed\n8. loop-or-exit - Continue or cycle\n\nThe plugin-run step enables user customization of the patrol loop.","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-21T16:16:52.808202-08:00","updated_at":"2025-12-27T21:29:53.459876-08:00","deleted_at":"2025-12-27T21:29:53.459876-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-aqku","title":"BUG: gt done doesn't push branch to origin - work lost when worktree deleted","description":"## Problem\n\n`gt done` creates a merge-request record in beads but **never pushes the branch to origin**.\n\nWhen polecats are cleaned up (worktrees deleted), the branch is lost forever. The MR record exists but there's no actual branch to merge.\n\n## Evidence\n\n12 MQ items had `ready` status but no corresponding branches on origin. All polecat worktrees were deleted, losing all work.\n\n## Impact\n\n**CRITICAL**: All polecat work is silently lost unless polecats manually push (which they don't know to do).\n\n## Fix\n\nIn `done.go`, before creating the MR:\n1. `git push origin \u003cbranch\u003e` to push the branch to remote\n2. Only create MR record if push succeeds\n\n## Location\n`mayor/rig/internal/cmd/done.go:124-135`","status":"tombstone","priority":0,"issue_type":"bug","created_at":"2025-12-21T17:21:27.012045-08:00","updated_at":"2025-12-27T21:29:45.494928-08:00","deleted_at":"2025-12-27T21:29:45.494928-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-aqm","title":"Beads as Universal Data Plane","description":"## Vision\n\nBeads is the data plane for all Gas Town operations. Everything flows through beads:\n- Work items (issues, tasks, epics)\n- Mail (messages between agents)\n- Merge requests (queue entries)\n- Workflows (composable execution patterns)\n- Resources (leases, locks, quotas)\n- Schedules (timed activities)\n\n## New Bead Categories\n\n### Molecules (Composable Workflows)\nCrystallized workflow patterns that can be attached to work items.\n\n### Timed Beads (Scheduled Work)\nBeads that wake up periodically via daemon.\n\n### Pinned Beads (Ongoing Concerns)\nBeads representing persistent concerns, not discrete tasks.\n\n### Resource Beads (Leases/Locks)\nBeads representing reserved resources.\n\n## v1 Priority\n\n- **P0**: Molecules (enables engineer-in-box)\n- **P2**: Timed, Pinned, Resource beads (post-v1)","status":"tombstone","priority":0,"issue_type":"epic","created_at":"2025-12-18T18:05:57.847578-08:00","updated_at":"2025-12-27T21:29:45.606937-08:00","dependencies":[{"issue_id":"gt-aqm","depends_on_id":"gt-4nn","type":"blocks","created_at":"2025-12-18T18:06:29.881371-08:00","created_by":"daemon"},{"issue_id":"gt-aqm","depends_on_id":"gt-caz","type":"blocks","created_at":"2025-12-18T18:08:19.833603-08:00","created_by":"daemon"},{"issue_id":"gt-aqm","depends_on_id":"gt-8h4","type":"blocks","created_at":"2025-12-18T18:08:19.930166-08:00","created_by":"daemon"},{"issue_id":"gt-aqm","depends_on_id":"gt-b3p","type":"blocks","created_at":"2025-12-18T18:08:20.026188-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.606937-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-as08","title":"Digest: mol-deacon-patrol","description":"Patrol 1: All agents healthy, no lifecycle requests, no orphans","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T21:09:09.329759-08:00","updated_at":"2025-12-27T21:26:04.853016-08:00","deleted_at":"2025-12-27T21:26:04.853016-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ata2f","title":"Digest: mol-deacon-patrol","description":"Patrol 20: Final cycle, all clear, handoff triggered","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T16:54:25.2544-08:00","updated_at":"2025-12-27T21:26:03.058696-08:00","deleted_at":"2025-12-27T21:26:03.058696-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-av8","title":"Update Mayor prompting in gastown-py","description":"The Mayor CLAUDE.md and related prompting in gastown-py (still in production use) needs to reflect current design decisions: session cycling, handoff protocol, cleanup responsibilities, beads access model. Sync prompting with GGT design work.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T20:24:09.953043-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-av92i","title":"Digest: mol-deacon-patrol","description":"Patrol 10: All healthy, halfway mark","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T10:47:59.377355-08:00","updated_at":"2025-12-27T21:26:03.462272-08:00","deleted_at":"2025-12-27T21:26:03.462272-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-avq9","title":"Merge: gt-3x1.3","description":"branch: polecat/Doof\ntarget: main\nsource_issue: gt-3x1.3\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-20T21:23:06.77909-08:00","updated_at":"2025-12-27T21:27:22.650787-08:00","deleted_at":"2025-12-27T21:27:22.650787-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-avv7","title":"Digest: mol-deacon-patrol","description":"Patrol #12: Stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:33:40.201208-08:00","updated_at":"2025-12-27T21:26:04.30162-08:00","deleted_at":"2025-12-27T21:26:04.30162-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-awb9v","title":"Review PR #42: fix(done): detect default branch instead of hardcoding 'main'","description":"Review PR #42. Verify default branch detection works correctly. Approve with gh pr review --approve if good.","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-03T11:40:27.996208-08:00","updated_at":"2026-01-03T11:45:16.667669-08:00","closed_at":"2026-01-03T11:45:16.667669-08:00","close_reason":"PR #42 reviewed and approved","created_by":"mayor"}
{"id":"gt-axtei","title":"Digest: mol-deacon-patrol","description":"Patrol 15: All clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:04:28.164433-08:00","updated_at":"2025-12-27T21:26:03.968642-08:00","deleted_at":"2025-12-27T21:26:03.968642-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-axz","title":"Design: Plugin architecture (agents-as-plugins)","description":"Plugin system where plugins are just additional agents with identities, mailboxes, and beads access. See docs/architecture.md Plugins section. No special framework - just directory conventions and mail-based invocation.","status":"tombstone","priority":3,"issue_type":"epic","created_at":"2025-12-15T22:52:43.614095-08:00","updated_at":"2025-12-27T21:29:57.759152-08:00","dependencies":[{"issue_id":"gt-axz","depends_on_id":"gt-id36","type":"blocks","created_at":"2025-12-20T21:47:41.790184-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.759152-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-ay1r","title":"gt molecule current: Show what agent should be working on","description":"Query what an agent identity is supposed to be working on via breadcrumb trail.\n\n## Command\n```bash\ngt molecule current \u003cidentity\u003e\ngt molecule current gastown/furiosa\ngt molecule current deacon\n```\n\n## Logic\n1. Find handoff bead for identity (pinned bead titled \"\u003crole\u003e Handoff\")\n2. Parse attachment field → molecule ID\n3. If no attachment → \"naked\" (no active molecule)\n4. If attached → load molecule, find current step:\n - bd ready --parent=\u003cmol-id\u003e → next unblocked step\n - Or first in_progress step\n\n## Output\n```\nIdentity: gastown/furiosa\nHandoff: gt-8v2 (Furiosa Handoff)\nMolecule: gt-mol-abc (mol-polecat-work)\nProgress: 3/8 steps complete\nCurrent: gt-mol-abc.4 - verify-tests\n```\n\nOr if naked:\n```\nIdentity: gastown/angharad\nHandoff: gt-9x1 (Angharad Handoff)\nMolecule: (none attached)\nStatus: naked - awaiting work assignment\n```\n\n## Use Cases\n- Mayor checking what polecats are doing\n- Witness verifying polecat progress\n- Debug: \"why isnt this polecat working?\"\n- Deacon patrol: track all agent states","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-21T21:34:01.430109-08:00","updated_at":"2025-12-27T21:29:53.375429-08:00","deleted_at":"2025-12-27T21:29:53.375429-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-aygm5","title":"Digest: mol-deacon-patrol","description":"Patrol 3: all clear, hook fix applied","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T17:16:01.0164-08:00","updated_at":"2025-12-27T21:26:01.201359-08:00","deleted_at":"2025-12-27T21:26:01.201359-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-az41","title":"mol-ready-work variants: safe vs aggressive","description":"## Summary\n\nCreate two variants of mol-ready-work:\n\n### mol-ready-work-safe (default)\n- PRs: review and comment, but require human approval for merge\n- Issues: triage only, don't auto-close\n- Beads: implement but create PR instead of pushing to main\n\n### mol-ready-work-aggressive \n- PRs: can approve and merge directly\n- Issues: can close duplicates/invalid\n- Beads: can push directly to main\n\n## Implementation\nCould be a single molecule with a `{{mode}}` variable, or two separate protos.\n\n## Parent\nPart of gt-tnca (mol-ready-work epic)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T01:19:38.703698-08:00","updated_at":"2025-12-27T21:29:56.250681-08:00","dependencies":[{"issue_id":"gt-az41","depends_on_id":"gt-tnca","type":"blocks","created_at":"2025-12-23T01:19:56.387233-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.250681-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-azzni","title":"Digest: mol-deacon-patrol","description":"Patrol 18: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:36:52.597153-08:00","updated_at":"2025-12-27T21:26:00.710685-08:00","deleted_at":"2025-12-27T21:26:00.710685-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-b00d7","title":"Digest: mol-deacon-patrol","description":"Patrol 12: Mayor OK, 11 sessions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:37:01.615713-08:00","updated_at":"2025-12-27T21:26:02.114479-08:00","deleted_at":"2025-12-27T21:26:02.114479-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-b089l","title":"Digest: mol-deacon-patrol","description":"Patrol 13: Quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T07:28:09.102947-08:00","updated_at":"2025-12-27T21:26:03.617976-08:00","deleted_at":"2025-12-27T21:26:03.617976-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-b1g","title":"MVP Cutover: GGT replaces PGT for batch work","description":"When this is closed, stop using town and start using gt.\n\n## Acceptance Criteria\n\n1. gt spawn assigns issue to polecat and starts session\n2. gt spawn --epic spawns workers for all epic children\n3. gt session manages tmux lifecycle \n4. gt send / gt inbox work for mail\n5. Refinery processes merge queue with semantic merges\n6. Integration branches created and landed correctly\n7. gt stop --all halts all sessions\n8. One successful test batch completed end-to-end\n\n## What Must Work\n\n- Spawn polecat with issue assignment\n- Spawn workers for epic children\n- Session start/stop/attach\n- Mail send/inbox/read\n- Refinery merge loop (semantic)\n- Integration branch → main landing\n- Witness cleanup protocol\n- Emergency stop\n\n## What Can Be Deferred\n\n- Doctor checks (use PGT)\n- TUI dashboard\n- Plugin system\n- Federation\n- Ephemeral rigs\n- Detailed landing reports\n\n## Test Plan\n\n1. Create epic with 2 tasks, spawn 2 workers\n2. Verify polecats get assigned and sessions start\n3. Simulate task completion\n4. Verify Refinery merges to integration\n5. Verify landing to main\n6. Verify cleanup\n\n## Validation\n\nRun one real batch implementing GGT issues using GGT.\n\n## Note\n\nNo \"swarm IDs\" - just spawn workers for epic, let merge queue coordinate.","status":"tombstone","priority":0,"issue_type":"epic","created_at":"2025-12-16T00:11:09.148751-08:00","updated_at":"2025-12-27T21:29:45.690461-08:00","dependencies":[{"issue_id":"gt-b1g","depends_on_id":"gt-u1j.19","type":"blocks","created_at":"2025-12-16T00:11:36.196292-08:00","created_by":"daemon"},{"issue_id":"gt-b1g","depends_on_id":"gt-kmn.4","type":"blocks","created_at":"2025-12-16T00:11:36.273483-08:00","created_by":"daemon"},{"issue_id":"gt-b1g","depends_on_id":"gt-kmn.6","type":"blocks","created_at":"2025-12-16T00:11:36.351097-08:00","created_by":"daemon"},{"issue_id":"gt-b1g","depends_on_id":"gt-kmn.7","type":"blocks","created_at":"2025-12-16T00:11:36.431641-08:00","created_by":"daemon"},{"issue_id":"gt-b1g","depends_on_id":"gt-u1j.22","type":"blocks","created_at":"2025-12-16T00:11:36.511124-08:00","created_by":"daemon"},{"issue_id":"gt-b1g","depends_on_id":"gt-ov2","type":"blocks","created_at":"2025-12-16T00:11:51.609649-08:00","created_by":"daemon"},{"issue_id":"gt-b1g","depends_on_id":"gt-rm3","type":"blocks","created_at":"2025-12-16T00:11:51.69062-08:00","created_by":"daemon"},{"issue_id":"gt-b1g","depends_on_id":"gt-u1j.6","type":"blocks","created_at":"2025-12-16T21:36:32.942855-08:00","created_by":"daemon"},{"issue_id":"gt-b1g","depends_on_id":"gt-u1j.12","type":"blocks","created_at":"2025-12-16T21:36:35.053559-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.690461-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-b1krg","title":"Digest: mol-deacon-patrol","description":"Patrol 20: Nominal - Handoff threshold reached","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:54:04.619206-08:00","updated_at":"2025-12-27T21:26:04.067261-08:00","deleted_at":"2025-12-27T21:26:04.067261-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-b2bn","title":"Merge: gt-h6eq.5","description":"branch: polecat/keeper\ntarget: main\nsource_issue: gt-h6eq.5\nrig: gastown","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T11:45:44.660565-08:00","updated_at":"2025-12-27T21:27:22.836447-08:00","deleted_at":"2025-12-27T21:27:22.836447-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-b2d3y","title":"Digest: mol-deacon-patrol","description":"Patrol 3: All witnesses/refineries healthy, no work","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:44:27.88396-08:00","updated_at":"2025-12-27T21:26:01.552416-08:00","deleted_at":"2025-12-27T21:26:01.552416-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-b2hj","title":"gt orphans: Find lost polecat work","description":"Add a command to find orphaned commits that were never merged to main.\n\n## Problem\nPolecat work can get lost when:\n- Session killed before merge\n- Refinery fails to process\n- Network issues during push\n\n## Solution\nAdd `gt orphans` command that:\n1. Uses `git fsck --unreachable` to find dangling commits\n2. Filters to recent commits (default: 7 days, configurable)\n3. Excludes stash/index entries (WIP on, index on)\n4. Shows commit details and suggests recovery\n\n## Usage\n```\ngt orphans # Last 7 days\ngt orphans --days=14 # Last 2 weeks\ngt orphans --recover # Interactive cherry-pick\n```\n\n## Found orphans (Dec 16-20, 2025)\n- 3b146c11: Fix mail read auto-ack and add Mayor startup directive\n- b6fdc561: Add persistent theme config and fix crew session theming\n- 97aba756: Unify gt prime to call bd prime and mail check\n- ce769ca5: Add mol-bootstrap molecule for Gas Town installation\n- b2219de7: Add bootstrap documentation for new Gas Town installations\n- bc82348: feat: Add gt done command (already recovered)","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-20T16:27:25.584819-08:00","updated_at":"2025-12-27T21:29:53.68576-08:00","deleted_at":"2025-12-27T21:29:53.68576-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-b3jfl","title":"Digest: mol-deacon-patrol","description":"Patrol 5: stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:10:19.217344-08:00","updated_at":"2025-12-27T21:26:02.302522-08:00","deleted_at":"2025-12-27T21:26:02.302522-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-b3p","title":"Resource Beads: Leases, locks, and quotas","description":"Resource beads represent reserved resources. Types: vm, lock, slot, quota. Fields: holder, expires, renewable. Daemon monitors for expiry and manages contention.","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-18T18:08:12.745602-08:00","updated_at":"2025-12-27T21:29:57.126785-08:00","deleted_at":"2025-12-27T21:29:57.126785-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-b5sh","title":"test-after-fix","description":"test","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-19T16:05:08.538763-08:00","updated_at":"2025-12-27T21:29:56.967459-08:00","deleted_at":"2025-12-27T21:29:56.967459-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"message"}
{"id":"gt-b6qm","title":"gt spawn/crew setup should create .beads/redirect for worktrees","description":"Crew clones and polecats need a .beads/redirect file pointing to the shared beads database (../../mayor/rig/.beads). Currently:\n\n- redirect files can get deleted by git clean\n- not auto-created during gt spawn or worktree setup\n- missing redirects cause 'no beads database found' errors\n\nFound missing in: gastown/joe, beads/zoey (after git clean)\n\nFix options:\n1. gt spawn creates redirect during worktree setup\n2. gt prime regenerates missing redirects\n3. bd commands auto-detect worktree and find shared beads\n\nThis should be standard Gas Town rig configuration.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-21T17:51:11.222073-08:00","updated_at":"2025-12-27T21:29:53.401049-08:00","deleted_at":"2025-12-27T21:29:53.401049-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-b83nx","title":"Digest: mol-deacon-patrol","description":"Patrol 14: all clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T11:05:34.850044-08:00","updated_at":"2025-12-27T21:26:03.380705-08:00","deleted_at":"2025-12-27T21:26:03.380705-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-b9x4f","title":"Test gate for verification","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-25T22:12:31.773887-08:00","updated_at":"2025-12-27T21:29:45.926289-08:00","deleted_at":"2025-12-27T21:29:45.926289-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"gate"}
{"id":"gt-badfi","title":"Digest: mol-deacon-patrol","description":"Patrol 16: Green","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:35:25.272732-08:00","updated_at":"2025-12-27T21:26:02.510979-08:00","deleted_at":"2025-12-27T21:26:02.510979-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-bbb1","title":"Merge: gt-rana.4","description":"branch: polecat/ace\ntarget: main\nsource_issue: gt-rana.4\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T16:47:29.769226-08:00","updated_at":"2025-12-27T21:27:22.568199-08:00","deleted_at":"2025-12-27T21:27:22.568199-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-bcjh2","title":"Digest: mol-deacon-patrol","description":"P12: stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:59:20.254866-08:00","updated_at":"2025-12-27T21:26:02.368238-08:00","deleted_at":"2025-12-27T21:26:02.368238-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-bcwn","title":"Auto-handoff: self-cycling with guaranteed work pickup","description":"## Summary\n\nA mechanism for agents to self-cycle: send handoff mail, request Deacon to kill the process, and automatically restart to pick up the work.\n\n## Use Case\n\nUser says: `auto-handoff: gt-bug3`\n\nThe agent:\n1. Sends handoff mail documenting current state\n2. Requests Deacon to terminate the session\n3. Deacon restarts the agent in background\n4. New session picks up the handoff and continues working\n\nWhen the user returns to terminal, the old session is gone but work continues in background. User can reattach.\n\n## Key Requirements\n\n- **Guaranteed pickup**: Something in the hook and/or handoff must ensure the new session picks up the work (not just a passive \"check inbox\")\n- **Background execution**: Work continues without user presence\n- **Reattachability**: User can reconnect to see progress\n\n## Design Questions\n\n1. Can molecules help coordinate this? (e.g., molecule binds the handoff + auto-restart)\n2. What hook handles the pickup guarantee? \n3. How does Deacon know to restart vs just kill?\n4. How does user reattach to a background agent?\n\n## Related\n\n- Molecules system\n- Deacon lifecycle management\n- Handoff hooks","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-21T11:59:50.352286-08:00","updated_at":"2025-12-27T21:29:56.661795-08:00","deleted_at":"2025-12-27T21:29:56.661795-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-bd02.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. \"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-bd02\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T22:06:07.039702-08:00","updated_at":"2025-12-27T21:29:55.704081-08:00","deleted_at":"2025-12-27T21:29:55.704081-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-bd2l","title":"Witness tmux status: show polecat count under management","description":"Add witness-specific status line showing:\n- Number of polecats under management\n- Active/idle status\n- Maybe: last nudge time, blocked count\n\nImplement in runWitnessStatusLine() in internal/cmd/statusline.go","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-21T15:40:29.482141-08:00","updated_at":"2025-12-27T21:29:56.587104-08:00","deleted_at":"2025-12-27T21:29:56.587104-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-bd721","title":"Digest: mol-deacon-patrol","description":"Patrol 7: clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:14:06.242119-08:00","updated_at":"2025-12-27T21:26:01.049611-08:00","deleted_at":"2025-12-27T21:26:01.049611-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-be59b","title":"Digest: mol-deacon-patrol","description":"Patrol 9: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:41:45.24537-08:00","updated_at":"2025-12-27T21:26:00.895979-08:00","deleted_at":"2025-12-27T21:26:00.895979-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-beqp","title":"Merge: gt-72so","description":"branch: polecat/Doof\ntarget: main\nsource_issue: gt-72so\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T16:19:11.814962-08:00","updated_at":"2025-12-27T21:27:22.742103-08:00","deleted_at":"2025-12-27T21:27:22.742103-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-bf75o","title":"Digest: mol-deacon-patrol","description":"Patrol complete: fixed EMERGENCY clone divergence (zoey 103 behind, refinery 40 behind), cleaned orphan processes, created patrol molecules, fixed invalid hook attachments","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T21:24:07.333939-08:00","updated_at":"2025-12-27T21:26:00.610148-08:00","deleted_at":"2025-12-27T21:26:00.610148-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-bf95","title":"rebase-main","description":"Rebase against main to incorporate any changes.\nResolve conflicts if needed.\n\ngit fetch origin main\ngit rebase origin/main\n\nIf there are conflicts, resolve them carefully and continue the rebase.\n\nDepends: self-review, verify-tests","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T21:48:26.322259-08:00","updated_at":"2025-12-25T14:12:42.206356-08:00","dependencies":[{"issue_id":"gt-bf95","depends_on_id":"gt-ldm4","type":"blocks","created_at":"2025-12-21T21:48:26.328498-08:00","created_by":"stevey"}],"deleted_at":"2025-12-25T14:12:42.206356-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-bfd","title":"Keepalive signal from bd/gt commands","description":"Every bd and gt command should touch a keepalive file to signal 'agent is alive/working'.\n\n## Implementation\n\nTouch `\u003cworkspace\u003e/.gastown/keepalive.json`:\n```json\n{\"last_command\": \"bd show gt-99m\", \"timestamp\": \"2025-12-18T13:45:00Z\"}\n```\n\n## Usage by Daemon\n\n- Fresh (\u003c 2 min) → agent is working, skip heartbeat\n- Stale (2-5 min) → might be thinking, gentle poke\n- Very stale (\u003e 5 min) → likely idle, safe to interrupt\n\n## Benefits\n\n- Zero cost (just file I/O)\n- Works during long tool calls\n- Doesn't require agent cooperation\n- Foundation for smarter backoff strategies","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-18T14:19:26.241957-08:00","updated_at":"2025-12-27T21:29:54.186458-08:00","dependencies":[{"issue_id":"gt-bfd","depends_on_id":"gt-99m","type":"blocks","created_at":"2025-12-18T14:19:46.407664-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.186458-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-bhvif","title":"Digest: mol-deacon-patrol","description":"P12","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:25:37.594453-08:00","updated_at":"2025-12-27T21:26:01.64609-08:00","deleted_at":"2025-12-27T21:26:01.64609-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-bhxng","title":"Digest: mol-deacon-patrol","description":"P19","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:27:27.892885-08:00","updated_at":"2025-12-27T21:26:01.585308-08:00","deleted_at":"2025-12-27T21:26:01.585308-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-bi21","title":"gt sling should accept raw issues, not just molecules","description":"Currently gt sling only works with molecules. When you try to sling a raw issue:\n\n gt sling gt-9uy0 gastown/crew/max\n Error: invalid thing: issue not found\n\nOptions:\n1. Auto-wrap issues in a simple work molecule (mol-issue-work?)\n2. Create a minimal 'envelope' molecule on the fly\n3. Just pin the issue directly without a molecule wrapper\n\nThe friction of needing a molecule for every piece of work discourages using the sling mechanism for ad-hoc tasks.\n\nRelated: crew workers often want to pick up an issue without formal molecule choreography.","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-24T13:46:54.92516-08:00","updated_at":"2025-12-27T21:29:55.51885-08:00","dependencies":[{"issue_id":"gt-bi21","depends_on_id":"gt-jo9n","type":"blocks","created_at":"2025-12-24T13:47:01.221497-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.51885-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-biju5","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 3: all systems nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:36:48.504077-08:00","updated_at":"2025-12-27T21:26:01.452675-08:00","deleted_at":"2025-12-27T21:26:01.452675-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-bj6f","title":"gt prime: Refinery context detection and output","description":"Update gt prime to detect Refinery role:\n- Detect from directory: refinery/rig/ = Refinery\n- Show handoff bead reference\n- Show merge queue status\n- Show polecat branches with unmerged commits\n- Show last actions summary","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T18:09:09.31791-08:00","updated_at":"2025-12-27T21:29:53.907293-08:00","dependencies":[{"issue_id":"gt-bj6f","depends_on_id":"gt-ktal","type":"blocks","created_at":"2025-12-19T18:09:39.462238-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.907293-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-bjft","title":"gt spawn should auto-start refinery/witness if not running","description":"When spawning a polecat, the infrastructure (refinery, witness) should already be running.\n\n## Current Behavior\nspawn.go only:1. Assigns issue to polecat2. Sends work mail3. Starts polecat session\n\nThe refinery and witness must be started separately.\n\n## Expected Behavior (per user feedback)\nWhen spawning a polecat, if the rig's refinery or witness is not running, auto-start them.\n\n## Options\n\n### Option A: spawn auto-starts infrastructure\nCheck if refinery/witness running before spawn, start if not.\n\n### Option B: gt swarm start \u003crig\u003e command\nExplicit command that:\n1. Starts refinery\n2. Starts witness\n3. Optionally spawns polecats on bd ready issues\n\n### Option C: gt rig start \u003crig\u003e\nSimilar to Option B but as rig lifecycle command.\n\n## Related\n- gt-n7z7: refinery --foreground race condition bug\n- gt-u1j.18: witness CLI commands","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-20T00:58:10.146332-08:00","updated_at":"2025-12-27T21:29:56.864406-08:00","deleted_at":"2025-12-27T21:29:56.864406-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-bmjw","title":"gt polecat add: should handle existing branch gracefully","description":"## Problem\n\n`gt polecat add gastown Nux` fails if the branch `polecat/Nux` already exists.\n\n## Current Behavior\n\n```\nfatal: a branch named 'polecat/Nux' already exists\n```\n\n## Expected Behavior\n\nShould either:\n1. Reuse the existing branch\n2. Or prompt to delete/recreate\n3. Or auto-suffix: polecat/Nux-2\n\n## Context\n\nBranch may exist from previous polecat that was removed but branch wasn't cleaned up.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-18T21:52:09.361672-08:00","updated_at":"2025-12-27T21:29:57.068569-08:00","deleted_at":"2025-12-27T21:29:57.068569-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-bmt81","title":"Digest: mol-deacon-patrol","description":"Patrol 10: healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-31T21:37:59.386618-08:00","updated_at":"2025-12-31T21:37:59.386618-08:00","closed_at":"2025-12-31T21:37:59.386581-08:00"}
{"id":"gt-bnik","title":"gt nudge should accept partial/fuzzy session names","description":"Currently gt nudge requires the exact tmux session name (e.g., gt-gastown-crew-max). Should be more forgiving:\n\n1. Accept partial matches when unambiguous (e.g., 'max' → gt-gastown-crew-max)\n2. Accept shorthand like 'gastown/max' or 'crew/max'\n3. Show helpful error with suggestions when ambiguous\n\nExamples that should work:\n- gt nudge max '...' → matches gt-gastown-crew-max\n- gt nudge gastown/max '...' → matches gt-gastown-crew-max\n- gt nudge beads-dave '...' → matches gt-beads-crew-dave","status":"tombstone","priority":3,"issue_type":"feature","created_at":"2025-12-20T17:53:44.834337-08:00","updated_at":"2025-12-27T21:29:57.605927-08:00","deleted_at":"2025-12-27T21:29:57.605927-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-bnlev","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 20: final cycle before handoff","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:48:49.95776-08:00","updated_at":"2025-12-27T21:26:01.312421-08:00","deleted_at":"2025-12-27T21:26:01.312421-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-bq1yn","title":"Digest: mol-deacon-patrol","description":"Patrol 4: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-25T14:04:00.499629-08:00","updated_at":"2025-12-25T14:04:00.499629-08:00","closed_at":"2025-12-25T14:04:00.499596-08:00"}
{"id":"gt-bqbw","title":"detectSender() doesn't recognize crew workers","description":"## Problem\n\ndetectSender() in internal/cmd/mail.go only checks for /polecats/ directories. Crew workers in /crew/\u003cname\u003e/ fall through to the default 'mayor/', so:\n- gt mail inbox shows mayor's inbox instead of the crew worker's\n- gt mail send sets the wrong From address\n\n## Fix\n\nAdd crew worker detection before the /polecats/ check:\n\nif strings.Contains(cwd, \"/crew/\") {\n parts := strings.Split(cwd, \"/crew/\")\n ...\n return fmt.Sprintf(\"%s/crew/%s\", rigName, crewMember)\n}\n\n## Affected\n- Any crew worker running gt mail inbox without explicit address\n- Crew worker handoffs (wrong sender)","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-18T20:09:42.556373-08:00","updated_at":"2025-12-27T21:29:54.134387-08:00","deleted_at":"2025-12-27T21:29:54.134387-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-brg8r","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 13: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:27:46.024924-08:00","updated_at":"2025-12-27T21:26:01.863155-08:00","deleted_at":"2025-12-27T21:26:01.863155-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-bruds","title":"Digest: mol-deacon-patrol","description":"Patrol 18: all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:40:54.372251-08:00","updated_at":"2025-12-27T21:26:00.295567-08:00","deleted_at":"2025-12-27T21:26:00.295567-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-btiy","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:26","description":"Patrol 11: 8 sessions OK","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:26:07.651546-08:00","updated_at":"2025-12-27T21:26:05.264082-08:00","deleted_at":"2025-12-27T21:26:05.264082-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-budeb","title":"Witness: auto-nuke polecats after merge","description":"## Problem\nPolecats linger after merge. Manual cleanup required.\n\n## ZFC-Compliant Solution\nAdd step to `mol-witness-patrol.formula.toml`:\n\n```toml\n[[step]]\nid = \"cleanup-merged-polecats\"\ntitle = \"Remove polecats after merge\"\ndescription = \"\"\"\n1. Check inbox for MERGED messages from refinery: gt mail inbox\n2. For each MERGED notification:\n - Extract polecat name from message\n - Verify branch is actually gone: git branch -r | grep polecat/\u003cname\u003e\n - Kill session if running: tmux kill-session -t gt-\u003crig\u003e-\u003cpolecat\u003e\n - Remove worktree: gt polecat remove \u003crig\u003e/\u003cpolecat\u003e --force\n - Delete the MERGED mail\n3. Log cleanup in state.json\n\"\"\"\ndepends_on = [\"check-ready-branches\"]\n```\n\n## Why This Works\n- Witness agent receives MERGED signal from refinery (mail)\n- Agent verifies before acting (checks branch really gone)\n- Agent runs gt/tmux commands\n- No daemon code\n\n## Files\n- formulas/mol-witness-patrol.formula.toml","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-27T16:41:05.378581-08:00","updated_at":"2025-12-27T21:29:54.705393-08:00","created_by":"mayor","dependencies":[{"issue_id":"gt-budeb","depends_on_id":"gt-qpwv4","type":"blocks","created_at":"2025-12-27T16:41:17.546202-08:00","created_by":"daemon"},{"issue_id":"gt-budeb","depends_on_id":"gt-6qyt1","type":"blocks","created_at":"2025-12-27T16:41:17.593651-08:00","created_by":"daemon"},{"issue_id":"gt-budeb","depends_on_id":"gt-ztpe8","type":"relates-to","created_at":"2025-12-27T20:59:10.707161-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.705393-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-bum4e","title":"Digest: mol-deacon-patrol","description":"Patrol 7: Nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:49:07.765741-08:00","updated_at":"2025-12-27T21:26:04.17697-08:00","deleted_at":"2025-12-27T21:26:04.17697-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-bx4ki","title":"Merge: keeper-dogs","description":"branch: polecat/keeper-dogs\ntarget: main\nsource_issue: keeper-dogs\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T10:53:39.673172-08:00","updated_at":"2025-12-30T23:12:54.736253-08:00","closed_at":"2025-12-30T23:12:54.736253-08:00","close_reason":"Branch already merged","created_by":"gastown/polecats/keeper"}
{"id":"gt-bxi8","title":"bd mail send: pinned column missing from schema","description":"gt mail send fails with: sqlite3: SQL logic error: table issues has no column named pinned\n\nLikely a schema migration issue - bd schema has 'pinned' field but SQLite table doesn't.\n\n## Reproduction\ngt mail send gastown/Immortan -s 'test' -m 'test'\n\n## Error\ntable issues has no column named pinned\n\n## Fix\nEither:\n1. Add migration to add pinned column\n2. Remove pinned from insert if not present\n3. Regenerate DB from JSONL","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-19T14:53:09.262403-08:00","updated_at":"2025-12-27T21:29:54.016795-08:00","deleted_at":"2025-12-27T21:29:54.016795-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-by7xm","title":"Digest: mol-deacon-patrol","description":"Cycle 17","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-31T23:56:23.15817-08:00","updated_at":"2025-12-31T23:56:23.15817-08:00","closed_at":"2025-12-31T23:56:23.158133-08:00"}
{"id":"gt-bz2pu","title":"Digest: mol-deacon-patrol","description":"Patrol 9: All agents healthy, 1 orphan process cleaned, no mail","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T17:43:34.631017-08:00","updated_at":"2025-12-27T21:26:01.159742-08:00","deleted_at":"2025-12-27T21:26:01.159742-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-bzd","title":"beads: Stop searching upward when .beads found","description":"## Problem\n\nWhen running bd commands in a nested directory structure with multiple .beads directories, bd shows a confusing warning:\n\n```\n╔══════════════════════════════════════════════════════════════════════════╗\n║ WARNING: 2 beads databases detected in directory hierarchy ║\n╠══════════════════════════════════════════════════════════════════════════╣\n║ Multiple databases can cause confusion and database pollution. ║\n║ ║\n║ /Users/stevey/gt/gastown/.beads (261 issues) ║\n║ /Users/stevey/gt/.beads (21 issues) ║\n║ ║\n║ WARNING: Not using the closest database! Check your BEADS_DB setting. ║\n║ ║\n║ RECOMMENDED: Consolidate or remove unused databases to avoid confusion. ║\n╚══════════════════════════════════════════════════════════════════════════╝\n```\n\n## Why This Is Wrong\n\nIn Gas Town, nested .beads directories are **intentional and necessary**:\n- Town level: /Users/stevey/gt/.beads (mail, town-level issues)\n- Rig level: /Users/stevey/gt/gastown/.beads (gastown project issues)\n- Worker level: polecats have their own beads in worktrees\n\nThese are **unrelated** beads instances for different scopes. They should never be consolidated.\n\n## Expected Behavior\n\nWhen bd finds a .beads directory, it should:\n1. Use that directory (closest ancestor wins)\n2. **Stop searching upward** - do not look for parent .beads directories\n3. **No warning** about multiple databases\n\n## Current Behavior\n\nbd searches all the way up to root, finds all .beads directories, and warns about \"multiple databases\" even though they are separate, intentional instances.\n\n## Fix\n\nIn the database discovery code, stop the upward search as soon as a .beads directory is found. The first .beads found is the one to use, and parent directories are out of scope.\n\n## Note\n\nThis is a beads issue, filed here for tracking. Should be implemented in the beads codebase.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-18T19:09:44.295743-08:00","updated_at":"2025-12-27T21:29:54.14296-08:00","deleted_at":"2025-12-27T21:29:54.14296-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-c045h","title":"Digest: mol-witness-patrol","description":"Patrol cycle 2: Processed 3 POLECAT_DONE (nux=ESCALATED, slit=DEFERRED, furiosa=COMPLETED). Wrong-rig issue escalated to Mayor. All 3 gastown polecat sessions stopped.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:57:40.769464-08:00","updated_at":"2025-12-27T21:26:02.392786-08:00","dependencies":[{"issue_id":"gt-c045h","depends_on_id":"gt-wisp-48l","type":"parent-child","created_at":"2025-12-25T19:57:40.770311-08:00","created_by":"gastown/witness"}],"deleted_at":"2025-12-27T21:26:02.392786-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-c3xyl","title":"Digest: mol-deacon-patrol","description":"Patrol #21: No messages, all agents healthy, cleaned 1 stale lock, burned 1 orphan wisp","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:10:19.786065-08:00","updated_at":"2025-12-27T21:26:00.854571-08:00","deleted_at":"2025-12-27T21:26:00.854571-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-c4264","title":"Digest: mol-deacon-patrol","description":"P10: stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:12:22.32234-08:00","updated_at":"2025-12-27T21:26:02.269786-08:00","deleted_at":"2025-12-27T21:26:02.269786-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-c6zs","title":"Add molecule phase lifecycle diagram to architecture.md","description":"Create a clear lifecycle diagram showing molecule phases:\n\n```\n ┌─────────────┐\n │ Proto │\n │ (crystal) │\n └──────┬──────┘\n │\n bd mol bond\n │\n ┌────────────┴────────────┐\n │ │\n ▼ ▼\n ┌───────────────┐ ┌───────────────┐\n │ Mol │ │ Wisp │\n │ (liquid) │ │ (gas) │\n │ durable │ │ ephemeral │\n │ main beads │ │ .beads-eph/ │\n └───────┬───────┘ └───────┬───────┘\n │ │\n bd mol squash bd mol squash\n │ │\n ▼ ▼\n ┌───────────────┐ ┌───────────────┐\n │ Digest │ │ (nothing) │\n │ (distillate) │ │ evaporates │\n │ in git hist │ └───────────────┘\n └───────────────┘\n```\n\nAlso document:\n- When to use Mol vs Wisp\n- Mol: code review waves, epic implementation, feature work\n- Wisp: orchestration, polecat work sessions, patrol loops","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T16:32:47.487341-08:00","updated_at":"2025-12-27T21:29:53.426087-08:00","dependencies":[{"issue_id":"gt-c6zs","depends_on_id":"gt-62hm","type":"blocks","created_at":"2025-12-21T16:33:17.38302-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.426087-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-c70hl","title":"Digest: mol-deacon-patrol","description":"Patrol 5: all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T21:54:21.579968-08:00","updated_at":"2025-12-27T21:26:00.575976-08:00","deleted_at":"2025-12-27T21:26:00.575976-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-c78vo","title":"Digest: mol-deacon-patrol","description":"Patrol 11: clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:15:31.420476-08:00","updated_at":"2025-12-27T21:26:01.013745-08:00","deleted_at":"2025-12-27T21:26:01.013745-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-c7z9","title":"Merge: gt-3x0z.1","description":"branch: polecat/furiosa\ntarget: main\nsource_issue: gt-3x0z.1\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T15:32:28.799733-08:00","updated_at":"2025-12-27T21:27:22.63417-08:00","deleted_at":"2025-12-27T21:27:22.63417-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-c8ca","title":"mol-gastown-boot","description":"Mayor bootstraps Gas Town via a verification-gated lifecycle molecule.\n\n## Purpose\nWhen Mayor executes \"boot up gas town\", this proto provides the workflow.\nEach step has action + verification - steps stay open until outcome is confirmed.\n\n## Key Principles\n1. **Verification-gated steps** - Not \"command ran\" but \"outcome confirmed\"\n2. **gt peek for verification** - Capture session output to detect stalls\n3. **gt nudge for recovery** - Reliable message delivery to unstick agents\n4. **Parallel where possible** - Witnesses and refineries can start in parallel\n5. **Ephemeral execution** - Boot is a wisp, squashed to digest after completion\n\n## Step Structure\nEach step has Action/Verify/OnStall/OnFail sections.\n\n## Execution\n```bash\nbd mol spawn mol-gastown-boot # Create wisp\nbd mol run \u003cwisp-id\u003e # Execute\n```","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-23T00:19:45.521561-08:00","updated_at":"2025-12-25T11:45:02.114796-08:00","deleted_at":"2025-12-25T11:45:02.114796-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"epic"}
{"id":"gt-c8ca.1","title":"ensure-daemon","description":"Verify the Gas Town daemon is running.\n\n## Action\n```bash\ngt daemon status || gt daemon start\n```\n\n## Verify\n1. Daemon PID file exists: `~/.gt/daemon.pid`\n2. Process is alive: `kill -0 $(cat ~/.gt/daemon.pid)`\n3. Daemon responds: `gt daemon status` returns success\n\n## OnStall\nDaemon startup failed. Try:\n```bash\ngt daemon stop\nsleep 2\ngt daemon start\n```\n\n## OnFail\nCannot start daemon. Log error and continue - some commands work without daemon.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T00:20:08.841559-08:00","updated_at":"2025-12-25T11:45:02.114796-08:00","dependencies":[{"issue_id":"gt-c8ca.1","depends_on_id":"gt-c8ca","type":"parent-child","created_at":"2025-12-23T00:20:08.842017-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T11:45:02.114796-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-c8ca.2","title":"ensure-deacon","description":"Start the Deacon and verify patrol mode is active.\n\n## Action\n```bash\ngt deacon start\n```\n\n## Verify\n1. Session exists: `tmux has-session -t gt-deacon 2\u003e/dev/null`\n2. Not stalled: `gt peek deacon/` does NOT show \"\u003e Try\" prompt\n3. Heartbeat fresh: `deacon/heartbeat.json` modified \u003c 2 min ago\n\n## OnStall\n```bash\ngt nudge deacon/ \"Start patrol.\"\nsleep 30\n# Re-verify\n```\n\n## OnFail\nLog error and continue - town can run with degraded deacon.\nThe Witness can still manage polecats without Deacon oversight.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T00:20:38.421986-08:00","updated_at":"2025-12-25T11:45:02.114796-08:00","dependencies":[{"issue_id":"gt-c8ca.2","depends_on_id":"gt-c8ca","type":"parent-child","created_at":"2025-12-23T00:20:38.422458-08:00","created_by":"daemon"},{"issue_id":"gt-c8ca.2","depends_on_id":"gt-c8ca.1","type":"blocks","created_at":"2025-12-23T00:20:38.428253-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T11:45:02.114796-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-c8ca.3","title":"ensure-witnesses","description":"Parallel container: Start all rig witnesses.\n\n## Execution\nChildren execute in parallel. Container completes when all children complete.\n\n## Children\n- ensure-gastown-witness\n- ensure-beads-witness\n\n## Verify\nAll child witness steps pass verification.","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-23T00:20:57.468339-08:00","updated_at":"2025-12-25T11:45:02.114796-08:00","dependencies":[{"issue_id":"gt-c8ca.3","depends_on_id":"gt-c8ca","type":"parent-child","created_at":"2025-12-23T00:20:57.46882-08:00","created_by":"daemon"},{"issue_id":"gt-c8ca.3","depends_on_id":"gt-c8ca.2","type":"blocks","created_at":"2025-12-23T00:20:57.474972-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T11:45:02.114796-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"epic"}
{"id":"gt-c8ca.3.1","title":"ensure-gastown-witness","description":"Start the gastown rig Witness.\n\n## Action\n```bash\ngt witness start gastown\n```\n\n## Verify\n1. Session exists: `tmux has-session -t gastown-witness 2\u003e/dev/null`\n2. Not stalled: `gt peek gastown/witness` does NOT show \"\u003e Try\" prompt\n3. Heartbeat fresh: Last patrol cycle \u003c 5 min ago\n\n## OnStall\n```bash\ngt nudge gastown/witness \"Start patrol.\"\nsleep 30\n# Re-verify\n```\n\n## OnFail\nLog error. Rig polecats will be unmonitored until Witness recovers.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T00:21:08.041873-08:00","updated_at":"2025-12-25T11:45:02.114796-08:00","dependencies":[{"issue_id":"gt-c8ca.3.1","depends_on_id":"gt-c8ca.3","type":"parent-child","created_at":"2025-12-23T00:21:08.042465-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T11:45:02.114796-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-c8ca.3.2","title":"ensure-beads-witness","description":"Start the beads rig Witness.\n\n## Action\n```bash\ngt witness start beads\n```\n\n## Verify\n1. Session exists: `tmux has-session -t beads-witness 2\u003e/dev/null`\n2. Not stalled: `gt peek beads/witness` does NOT show \"\u003e Try\" prompt\n3. Heartbeat fresh: Last patrol cycle \u003c 5 min ago\n\n## OnStall\n```bash\ngt nudge beads/witness \"Start patrol.\"\nsleep 30\n# Re-verify\n```\n\n## OnFail\nLog error. Rig polecats will be unmonitored until Witness recovers.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T00:21:16.31871-08:00","updated_at":"2025-12-25T11:45:02.114796-08:00","dependencies":[{"issue_id":"gt-c8ca.3.2","depends_on_id":"gt-c8ca.3","type":"parent-child","created_at":"2025-12-23T00:21:16.319204-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T11:45:02.114796-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-c8ca.4","title":"ensure-refineries","description":"Parallel container: Start all rig refineries.\n\n## Execution\nChildren execute in parallel. Container completes when all children complete.\n\n## Children\n- ensure-gastown-refinery\n- ensure-beads-refinery\n\n## Verify\nAll child refinery steps pass verification.","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-23T00:21:40.387618-08:00","updated_at":"2025-12-25T11:45:02.114796-08:00","dependencies":[{"issue_id":"gt-c8ca.4","depends_on_id":"gt-c8ca","type":"parent-child","created_at":"2025-12-23T00:21:40.388265-08:00","created_by":"daemon"},{"issue_id":"gt-c8ca.4","depends_on_id":"gt-c8ca.2","type":"blocks","created_at":"2025-12-23T00:21:40.394734-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T11:45:02.114796-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"epic"}
{"id":"gt-c8ca.4.1","title":"ensure-gastown-refinery","description":"Start the gastown rig Refinery.\n\n## Action\n```bash\ngt refinery start gastown\n```\n\n## Verify\n1. Session exists: `tmux has-session -t gastown-refinery 2\u003e/dev/null`\n2. Not stalled: `gt peek gastown/refinery` does NOT show \"\u003e Try\" prompt\n3. Queue processing: Refinery can receive merge requests\n\n## OnStall\n```bash\ngt nudge gastown/refinery \"Start patrol.\"\nsleep 30\n# Re-verify\n```\n\n## OnFail\nLog error. Completed polecat work will queue until Refinery recovers.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T00:21:53.588774-08:00","updated_at":"2025-12-25T11:45:02.114796-08:00","dependencies":[{"issue_id":"gt-c8ca.4.1","depends_on_id":"gt-c8ca.4","type":"parent-child","created_at":"2025-12-23T00:21:53.589255-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T11:45:02.114796-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-c8ca.4.2","title":"ensure-beads-refinery","description":"Start the beads rig Refinery.\n\n## Action\n```bash\ngt refinery start beads\n```\n\n## Verify\n1. Session exists: `tmux has-session -t beads-refinery 2\u003e/dev/null`\n2. Not stalled: `gt peek beads/refinery` does NOT show \"\u003e Try\" prompt\n3. Queue processing: Refinery can receive merge requests\n\n## OnStall\n```bash\ngt nudge beads/refinery \"Start patrol.\"\nsleep 30\n# Re-verify\n```\n\n## OnFail\nLog error. Completed polecat work will queue until Refinery recovers.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T00:21:56.877235-08:00","updated_at":"2025-12-25T11:45:02.114796-08:00","dependencies":[{"issue_id":"gt-c8ca.4.2","depends_on_id":"gt-c8ca.4","type":"parent-child","created_at":"2025-12-23T00:21:56.877743-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T11:45:02.114796-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-c8ca.5","title":"verify-town-health","description":"Final verification that Gas Town is healthy.\n\n## Action\n```bash\ngt status\n```\n\n## Verify\n1. Daemon running: Shows daemon status OK\n2. Deacon active: Shows deacon in patrol mode\n3. All witnesses: Each rig witness shows active\n4. All refineries: Each rig refinery shows active\n\n## OnStall\nN/A - this is a read-only verification step.\n\n## OnFail\nLog degraded state but consider boot complete. Some agents may need manual recovery.\nRun `gt doctor` for detailed diagnostics.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T00:22:14.671916-08:00","updated_at":"2025-12-25T11:45:02.114796-08:00","dependencies":[{"issue_id":"gt-c8ca.5","depends_on_id":"gt-c8ca","type":"parent-child","created_at":"2025-12-23T00:22:14.672543-08:00","created_by":"daemon"},{"issue_id":"gt-c8ca.5","depends_on_id":"gt-c8ca.3","type":"blocks","created_at":"2025-12-23T00:22:14.678972-08:00","created_by":"daemon"},{"issue_id":"gt-c8ca.5","depends_on_id":"gt-c8ca.4","type":"blocks","created_at":"2025-12-23T00:22:14.685012-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T11:45:02.114796-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-ca4v","title":"Refinery Patrol Cycle","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-22T17:13:40.785585-08:00","updated_at":"2025-12-25T14:12:42.069266-08:00","deleted_at":"2025-12-25T14:12:42.069266-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-ca4v.1","title":"Check mail for MR submissions, escalations, messages.","description":"Check mail for MR submissions, escalations, messages.\n\n```bash\ngt mail inbox\n# Process any urgent items\n```\n\nHandle shutdown requests, escalations, and status queries.\n\ninstantiated_from: mol-refinery-patrol\nstep: inbox-check","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T17:13:47.265596-08:00","updated_at":"2025-12-25T14:12:42.057419-08:00","dependencies":[{"issue_id":"gt-ca4v.1","depends_on_id":"gt-ca4v","type":"parent-child","created_at":"2025-12-22T17:13:47.265965-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T14:12:42.057419-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ca4v.10","title":"End of patrol cycle decision.","description":"End of patrol cycle decision.\n\nIf queue non-empty AND context LOW:\n- Burn this wisp, start fresh patrol\n- Return to inbox-check\n\nIf queue empty OR context HIGH:\n- Burn wisp with summary digest\n- Exit (daemon will respawn if needed)\n\ninstantiated_from: mol-refinery-patrol\nstep: burn-or-loop","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T17:13:47.988747-08:00","updated_at":"2025-12-25T14:12:41.951176-08:00","dependencies":[{"issue_id":"gt-ca4v.10","depends_on_id":"gt-ca4v","type":"parent-child","created_at":"2025-12-22T17:13:47.989088-08:00","created_by":"daemon"},{"issue_id":"gt-ca4v.10","depends_on_id":"gt-ca4v.9","type":"blocks","created_at":"2025-12-22T17:13:48.696513-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T14:12:41.951176-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ca4v.2","title":"Fetch remote and identify polecat branches waiting.","description":"Fetch remote and identify polecat branches waiting.\n\n```bash\ngit fetch origin\ngit branch -r | grep polecat\ngt refinery queue \u003crig\u003e\n```\n\nIf queue empty, skip to context-check step.\nTrack branch list for this cycle.\n\ninstantiated_from: mol-refinery-patrol\nstep: queue-scan","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T17:13:47.345738-08:00","updated_at":"2025-12-25T14:12:42.045618-08:00","dependencies":[{"issue_id":"gt-ca4v.2","depends_on_id":"gt-ca4v","type":"parent-child","created_at":"2025-12-22T17:13:47.346064-08:00","created_by":"daemon"},{"issue_id":"gt-ca4v.2","depends_on_id":"gt-ca4v.1","type":"blocks","created_at":"2025-12-22T17:13:48.062679-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T14:12:42.045618-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ca4v.3","title":"Pick next branch. Rebase on current main.","description":"Pick next branch. Rebase on current main.\n\n```bash\ngit checkout -b temp origin/\u003cpolecat-branch\u003e\ngit rebase origin/main\n```\n\nIf rebase conflicts and unresolvable:\n- git rebase --abort\n- Notify polecat to fix and resubmit\n- Skip to loop-check for next branch\n\ninstantiated_from: mol-refinery-patrol\nstep: process-branch","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T17:13:47.425895-08:00","updated_at":"2025-12-25T14:12:42.034116-08:00","dependencies":[{"issue_id":"gt-ca4v.3","depends_on_id":"gt-ca4v","type":"parent-child","created_at":"2025-12-22T17:13:47.426246-08:00","created_by":"daemon"},{"issue_id":"gt-ca4v.3","depends_on_id":"gt-ca4v.2","type":"blocks","created_at":"2025-12-22T17:13:48.141331-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T14:12:42.034116-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ca4v.4","title":"Run the test suite.","description":"Run the test suite.\n\n```bash\ngo test ./...\n```\n\nTrack results: pass count, fail count, specific failures.\n\ninstantiated_from: mol-refinery-patrol\nstep: run-tests","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T17:13:47.506173-08:00","updated_at":"2025-12-25T14:12:42.02257-08:00","dependencies":[{"issue_id":"gt-ca4v.4","depends_on_id":"gt-ca4v","type":"parent-child","created_at":"2025-12-22T17:13:47.506519-08:00","created_by":"daemon"},{"issue_id":"gt-ca4v.4","depends_on_id":"gt-ca4v.3","type":"blocks","created_at":"2025-12-22T17:13:48.223013-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T14:12:42.02257-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ca4v.5","title":"**VERIFICATION GATE**: This step enforces the Beads Promise.","description":"**VERIFICATION GATE**: This step enforces the Beads Promise.\n\nIf tests PASSED: This step auto-completes. Proceed to merge.\n\nIf tests FAILED:\n1. Diagnose: Is this a branch regression or pre-existing on main?\n2. If branch caused it:\n - Abort merge\n - Notify polecat: \"Tests failing. Please fix and resubmit.\"\n - Skip to loop-check\n3. If pre-existing on main:\n - Option A: Fix it yourself (you're the Engineer!)\n - Option B: File a bead: bd create --type=bug --priority=1 --title=\"...\"\n\n**GATE REQUIREMENT**: You CANNOT proceed to merge-push without:\n- Tests passing, OR\n- Fix committed, OR\n- Bead filed for the failure\n\nThis is non-negotiable. Never disavow. Never \"note and proceed.\"\n\ninstantiated_from: mol-refinery-patrol\nstep: handle-failures","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T17:13:47.585666-08:00","updated_at":"2025-12-25T14:12:42.010813-08:00","dependencies":[{"issue_id":"gt-ca4v.5","depends_on_id":"gt-ca4v","type":"parent-child","created_at":"2025-12-22T17:13:47.586001-08:00","created_by":"daemon"},{"issue_id":"gt-ca4v.5","depends_on_id":"gt-ca4v.4","type":"blocks","created_at":"2025-12-22T17:13:48.302825-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T14:12:42.010813-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ca4v.6","title":"Merge to main and push immediately.","description":"Merge to main and push immediately.\n\n```bash\ngit checkout main\ngit merge --ff-only temp\ngit push origin main\ngit branch -d temp\ngit push origin --delete \u003cpolecat-branch\u003e\n```\n\nMain has moved. Any remaining branches need rebasing on new baseline.\n\ninstantiated_from: mol-refinery-patrol\nstep: merge-push","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T17:13:47.666753-08:00","updated_at":"2025-12-25T14:12:41.998966-08:00","dependencies":[{"issue_id":"gt-ca4v.6","depends_on_id":"gt-ca4v","type":"parent-child","created_at":"2025-12-22T17:13:47.667097-08:00","created_by":"daemon"},{"issue_id":"gt-ca4v.6","depends_on_id":"gt-ca4v.5","type":"blocks","created_at":"2025-12-22T17:13:48.383673-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T14:12:41.998966-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ca4v.7","title":"More branches to process?","description":"More branches to process?\n\nIf yes: Return to process-branch with next branch.\nIf no: Continue to generate-summary.\n\nTrack: branches processed, branches skipped (with reasons).\n\ninstantiated_from: mol-refinery-patrol\nstep: loop-check","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T17:13:47.746798-08:00","updated_at":"2025-12-25T14:12:41.987217-08:00","dependencies":[{"issue_id":"gt-ca4v.7","depends_on_id":"gt-ca4v","type":"parent-child","created_at":"2025-12-22T17:13:47.747125-08:00","created_by":"daemon"},{"issue_id":"gt-ca4v.7","depends_on_id":"gt-ca4v.6","type":"blocks","created_at":"2025-12-22T17:13:48.464174-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T14:12:41.987217-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ca4v.8","title":"Summarize this patrol cycle.","description":"Summarize this patrol cycle.\n\nInclude:\n- Branches processed (count, names)\n- Test results (pass/fail)\n- Issues filed (if any)\n- Branches skipped (with reasons)\n- Any escalations sent\n\nThis becomes the digest when the patrol is squashed.\n\ninstantiated_from: mol-refinery-patrol\nstep: generate-summary","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T17:13:47.827331-08:00","updated_at":"2025-12-25T14:12:41.975222-08:00","dependencies":[{"issue_id":"gt-ca4v.8","depends_on_id":"gt-ca4v","type":"parent-child","created_at":"2025-12-22T17:13:47.827682-08:00","created_by":"daemon"},{"issue_id":"gt-ca4v.8","depends_on_id":"gt-ca4v.7","type":"blocks","created_at":"2025-12-22T17:13:48.545582-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T14:12:41.975222-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ca4v.9","title":"Check own context usage.","description":"Check own context usage.\n\nIf context is HIGH (\u003e80%):\n- Write handoff summary\n- Prepare for burn/respawn\n\nIf context is LOW:\n- Can continue processing\n\ninstantiated_from: mol-refinery-patrol\nstep: context-check","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T17:13:47.909098-08:00","updated_at":"2025-12-25T14:12:41.96321-08:00","dependencies":[{"issue_id":"gt-ca4v.9","depends_on_id":"gt-ca4v","type":"parent-child","created_at":"2025-12-22T17:13:47.909418-08:00","created_by":"daemon"},{"issue_id":"gt-ca4v.9","depends_on_id":"gt-ca4v.8","type":"blocks","created_at":"2025-12-22T17:13:48.621628-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T14:12:41.96321-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-cab1d","title":"Digest: mol-deacon-patrol","description":"Patrol 4: routine checks all clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T16:21:21.999189-08:00","updated_at":"2025-12-27T21:26:01.279565-08:00","deleted_at":"2025-12-27T21:26:01.279565-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-caih","title":"Witness handoff bead state persistence","description":"Implement state persistence for Witness across wisp cycles.\n\n## Problem\nWisps burn between cycles, but Witness needs to remember:\n- Which workers have been nudged\n- How many times (nudge_count)\n- When was last nudge\n- Last observed activity\n\n## Solution\nWitness handoff bead with worker_states field:\n\n```json\n{\n \"id\": \"gt-witness-state\",\n \"type\": \"handoff\",\n \"assignee\": \"\u003crig\u003e/witness\",\n \"pinned\": true,\n \"worker_states\": {\n \"furiosa\": {\n \"issue\": \"gt-123\",\n \"nudge_count\": 2,\n \"last_nudge\": \"2024-12-22T10:00:00Z\"\n }\n },\n \"last_patrol\": \"2024-12-22T10:05:00Z\"\n}\n```\n\n## Implementation\n1. On patrol start: bd show \u003cwitness-handoff-id\u003e to load state\n2. During patrol: update in-memory state\n3. On save-state step: bd update to persist\n4. State survives wisp burn/squash\n\n## Depends on\n- gt-83k0 (mol-witness-patrol definition)","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T16:42:57.427131-08:00","updated_at":"2025-12-27T21:29:53.183672-08:00","dependencies":[{"issue_id":"gt-caih","depends_on_id":"gt-83k0","type":"blocks","created_at":"2025-12-22T16:43:59.609821-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.183672-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-caz","title":"Timed Beads: Scheduled recurring work","description":"## Summary\n\nTimed beads wake up periodically and get injected into the ready queue by the daemon.\n\n## Schema Extension\n\n```yaml\nid: gt-weekly-sync\ntype: task # or sentinel\nschedule: \"0 9 * * 1\" # cron: Monday 9am\n# OR\ninterval: 24h # every 24 hours\ntier: haiku # cheap model for routine checks\nnext_run: 2025-12-20T09:00:00Z\n```\n\n## Daemon Integration\n\nDaemon heartbeat loop:\n1. Check timed beads where `next_run \u003c= now`\n2. For each due bead:\n - Inject into ready queue (set status to open if needed)\n - Update `next_run` based on schedule/interval\n3. Witnesses pick up work via `bd ready`\n\n## Use Cases\n\n- Weekly team sync reminders\n- Daily health checks\n- Periodic cleanup tasks\n- Scheduled reports\n\n## Interaction with Pinned Beads\n\nA pinned bead can be timed - it wakes up periodically but never closes.\nThis is how you model \"background services\" in Gas Town.","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-18T18:07:39.665294-08:00","updated_at":"2025-12-27T21:29:57.143372-08:00","deleted_at":"2025-12-27T21:29:57.143372-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-cbrv","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:46","description":"Patrol 12: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:46:24.801062-08:00","updated_at":"2025-12-27T21:26:05.095851-08:00","deleted_at":"2025-12-27T21:26:05.095851-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-cbyt7","title":"Digest: mol-deacon-patrol","description":"Patrol 19: Nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:53:41.512678-08:00","updated_at":"2025-12-27T21:26:04.075351-08:00","deleted_at":"2025-12-27T21:26:04.075351-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-cejv9","title":"Digest: mol-deacon-patrol","description":"Patrol 2: All healthy, no events","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:59:38.421743-08:00","updated_at":"2025-12-27T21:26:04.058981-08:00","deleted_at":"2025-12-27T21:26:04.058981-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-cgl3u","title":"Digest: mol-deacon-patrol","description":"Patrol 5: 9 sessions healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T07:25:56.412065-08:00","updated_at":"2025-12-27T21:26:03.651889-08:00","deleted_at":"2025-12-27T21:26:03.651889-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ci84","title":"Deacon patrol wisps should use town beads, not gastown rig beads","description":"## Problem\n\nThe deacon is a town-level role but stores patrol wisps in gastown/mayor/rig/.beads/ (see internal/cmd/prime.go:691). This fails for users who don't have the gastown rig installed.\n\n## Correct Architecture\n\nDeacon -\u003e ~/gt/.beads-wisp/ (town-level, no rig dependency)\n\nTown-level ephemeral work (deacon patrols) should use town wisp storage, just like rig-level ephemeral work (witness/refinery patrols) uses rig wisp storage.\n\n## Implementation Tasks\n\n### 1. Town Setup (gt install / gt town init)\n\n- Create ~/gt/.beads-wisp/ directory during town initialization\n- Initialize as git repo (for local commits, not synced)\n- Add config.yaml with wisp: true\n- Ensure .beads-wisp/ is gitignored at town level\n\nFiles:\n- internal/cmd/install.go (or equivalent town init)\n- internal/town/manager.go (if exists)\n\n### 2. Rig Setup (gt rig init)\n\nAlready creates \u003crig\u003e/.beads-wisp/ - verify this is working:\n- internal/rig/manager.go:375 (beadsDir := filepath.Join(rigPath, \".beads-wisp\"))\n- internal/rig/manager.go:394 (ensureGitignoreEntry)\n\n### 3. Deacon Code Fixes\n\nUpdate deacon to use town wisp storage:\n- internal/cmd/prime.go:691 - change rigBeadsDir to townWispDir\n- internal/cmd/sling.go - route deacon wisps to town level\n- internal/daemon/daemon.go - any deacon-specific wisp handling\n\n### 4. gt doctor Checks\n\nAdd/update doctor checks for wisp directories:\n- internal/doctor/wisp_check.go - already checks rig wisps\n- Add TownWispCheck to verify ~/gt/.beads-wisp/ exists\n- Add TownWispGitCheck to verify it is a valid git repo\n- Update existing checks to handle both town and rig levels\n\nFiles:\n- internal/doctor/wisp_check.go (extend for town level)\n- internal/doctor/checks.go (register new checks)\n\n### 5. Documentation\n\n- docs/beads-data-plane.md - already updated with three-tier architecture\n- Verify wisp-architecture.md reflects town-level wisps\n\n## Testing\n\n1. Fresh gt install should create ~/gt/.beads-wisp/\n2. gt doctor should pass with no rig installed\n3. Deacon patrol should work without gastown rig\n4. gt sling patrol deacon/ --wisp should route to town wisps","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-23T14:28:03.636334-08:00","updated_at":"2025-12-27T21:29:52.982224-08:00","deleted_at":"2025-12-27T21:29:52.982224-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-cik","title":"Overseer Crew: User-managed persistent workspaces","description":"## Overview\n\nCrew workers are the overseer's (human's) personal workspaces within a rig. Unlike polecats which are witness-managed and ephemeral, crew workers are:\n\n- **Persistent**: Not auto-garbage-collected\n- **User-managed**: Overseer controls lifecycle\n- **Long-lived identities**: dave, emma, fred - recognizable names\n- **Gas Town integrated**: Mail, handoff mechanics work\n- **Tmux optional**: Can work in terminal directly\n\n## Directory Structure\n\n```\n\u003crig\u003e/\n polecats/ # Managed workers (witness controls)\n refinery/ # Merge queue processor\n witness/ # Pit boss\n crew/ # Overseer's personal workspaces\n dave/ # Full clone, persistent\n emma/ # Full clone, persistent\n fred/ # Full clone, persistent\n```\n\n## Key Differences from Polecats\n\n- Location: crew/ instead of polecats/\n- Lifecycle: User-managed, not witness-managed\n- Auto-cleanup: Never (polecats auto-cleanup on swarm land)\n- Issue assignment: Optional (polecats require it)\n- Tmux: Optional (polecats require it)\n- Mail \u0026 Handoff: Yes for both\n- Identity: Persistent (polecats are ephemeral)\n\n## CLI Commands\n\n- gt crew add \u003cname\u003e [--rig \u003crig\u003e] - Create crew workspace\n- gt crew list [--rig \u003crig\u003e] - List crew workspaces\n- gt crew at \u003crig\u003e/\u003cname\u003e - Attach to workspace (start session)\n- gt crew attach \u003cname\u003e - Attach (infer rig from cwd)\n- gt crew refresh \u003cname\u003e - Handoff + restart (context cycling)\n- gt crew remove \u003cname\u003e [--force] - Remove workspace\n- gt crew status [\u003cname\u003e] - Show workspace status\n\n## Design Notes\n\n- Crew workers use full git clones (not worktrees)\n- Optional beads integration via BEADS_DIR\n- Mail-to-self handoff works for context cycling\n- No witness monitoring or nudging\n- No automatic issue assignment required\n\n## Background\n\nUsers often maintain separate repo clones for serial agent work. This is tedious to set up manually. Crew workspaces bring these into Gas Town's infrastructure while keeping user control.","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-16T16:47:37.529887-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"epic"}
{"id":"gt-cik.1","title":"Crew directory structure and config","description":"Add crew/ directory support to rig structure. Include:\n- crew/ as peer to polecats/, refinery/, witness/\n- Crew worker subdirectories with full git clones\n- Optional BEADS_DIR configuration for beads integration\n- Crew state tracking (separate from polecat state)","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T16:48:00.285499-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-cik.1","depends_on_id":"gt-cik","type":"parent-child","created_at":"2025-12-16T16:48:00.28789-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-cik.2","title":"gt crew add: Create crew workspace","description":"Implement 'gt crew add \u003cname\u003e [--rig \u003crig\u003e]' command:\n- Clone repo into \u003crig\u003e/crew/\u003cname\u003e/\n- Create feature branch (optional, or stay on main)\n- Register for mail delivery\n- Initialize CLAUDE.md with crew worker prompting\n- Do NOT register with witness (user-managed)","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T16:48:02.208618-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-cik.2","depends_on_id":"gt-cik","type":"parent-child","created_at":"2025-12-16T16:48:02.210603-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-cik.3","title":"gt crew list: List crew workspaces","description":"Implement 'gt crew list [--rig \u003crig\u003e]' command:\n- List all crew workers in rig(s)\n- Show status (session running, last activity)\n- Show current branch and git status summary\n- Support --json output","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T16:48:03.53109-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-cik.3","depends_on_id":"gt-cik","type":"parent-child","created_at":"2025-12-16T16:48:03.532953-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-cik.4","title":"gt crew at/attach: Start session in crew workspace","description":"Implement 'gt crew at \u003crig\u003e/\u003cname\u003e' and 'gt crew attach \u003cname\u003e' commands:\n- Start tmux session (optional - could just print cd instructions)\n- Launch claude code in the workspace\n- Deliver any pending mail\n- Support --no-tmux to just print directory path\n- 'attach' infers rig from cwd","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T16:48:04.96786-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-cik.4","depends_on_id":"gt-cik","type":"parent-child","created_at":"2025-12-16T16:48:04.969488-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-cik.5","title":"gt crew refresh: Context cycling with handoff","description":"Implement 'gt crew refresh \u003cname\u003e' command:\n- Send handoff mail to self (context summary)\n- Kill current session cleanly\n- Start new session\n- New session reads handoff mail and resumes\n- Support --message to add custom handoff notes","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T16:48:06.934819-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-cik.5","depends_on_id":"gt-cik","type":"parent-child","created_at":"2025-12-16T16:48:06.936381-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-cik.6","title":"gt crew remove: Remove crew workspace","description":"Implement 'gt crew remove \u003cname\u003e [--force]' command:\n- Check for uncommitted changes, unpushed commits\n- Warn and require --force if dirty\n- Kill any running session\n- Remove directory\n- Unregister from mail","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T16:48:08.407212-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-cik.6","depends_on_id":"gt-cik","type":"parent-child","created_at":"2025-12-16T16:48:08.408845-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-cik.7","title":"gt crew status: Show workspace status","description":"Implement 'gt crew status [\u003cname\u003e]' command:\n- Show session state (running/stopped)\n- Show git status (branch, uncommitted changes, unpushed)\n- Show last commit info\n- Show mail inbox status (unread count)\n- If no name given, show all crew workers","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-16T16:48:10.476059-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-cik.7","depends_on_id":"gt-cik","type":"parent-child","created_at":"2025-12-16T16:48:10.477638-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-cik.8","title":"Crew worker CLAUDE.md prompting","description":"Create CLAUDE.md template for crew workers:\n- Explain crew worker role (overseer's personal workspace)\n- Include mail-to-self handoff instructions\n- Document gt crew refresh for context cycling\n- Explain no witness monitoring (user-managed)\n- Include beads usage if BEADS_DIR configured","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-16T16:48:12.108074-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-cik.8","depends_on_id":"gt-cik","type":"parent-child","created_at":"2025-12-16T16:48:12.109654-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-cik.9","title":"Complete gt crew commands (list, attach, remove, refresh, status)","description":"Add remaining crew subcommands to internal/cmd/crew.go:\n\n1. gt crew list - List crew workspaces with status\n2. gt crew at/attach - Start tmux session in crew workspace \n3. gt crew remove - Remove crew workspace (with safety checks)\n4. gt crew refresh - Context cycling with mail-to-self handoff\n5. gt crew status - Show detailed workspace status\n\nBuild on existing crew add implementation in internal/cmd/crew.go.\nReference closed issues gt-cik.3-7 for original requirements.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T20:53:25.564877-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-cik.9","depends_on_id":"gt-cik","type":"parent-child","created_at":"2025-12-16T20:53:25.566962-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-cjb","title":"Witness updates: Remove issue filing proxy","description":"Update Witness prompting to remove issue filing proxy, since polecats now have direct beads access.\n\n## Remove from Witness Prompting\n\nThe following is NO LONGER Witness responsibility:\n- Processing polecat 'file issue' mail requests\n- Creating issues on behalf of polecats\n- Forwarding issue creation requests\n\n## Add: Legacy Request Handling\n\nIf Witness receives an old-style 'please file issue' request:\n\n1. Respond with update:\n town inject \u003cpolecat\u003e \"UPDATE: You have direct beads access now. Use bd create to file issues yourself.\"\n\n2. Do not file the issue - let the polecat learn the new workflow.\n\n## Keep in Witness Prompting\n\n- Monitoring polecat progress\n- Nudge protocol\n- Pre-kill verification\n- Session lifecycle management","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T19:47:19.921561-08:00","updated_at":"2025-12-27T21:29:54.607008-08:00","dependencies":[{"issue_id":"gt-cjb","depends_on_id":"gt-l3c","type":"blocks","created_at":"2025-12-15T19:47:35.896691-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.607008-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-cnt","title":"Swarm cleanup: delete merged polecat branches and reset state","description":"After a swarm completes and branches are merged, leftover state remains:\n\n## Current Problem\n\n1. **Remote branches not deleted** - polecat/* branches stay on origin after merge\n2. **Polecat clones not reset** - still on old branch with completed work\n3. **No cleanup command** - manual cleanup required\n\n## Observed After Swarm\n\nRemote branches still present:\n- origin/polecat/Morsov\n- origin/polecat/Nux \n- origin/polecat/Rictus\n- origin/polecat/Slit\n- origin/polecat/Toast\n\n## Proposed Solution\n\nAdd cleanup commands:\n\n1. gt swarm cleanup \u003cswarm-id\u003e - Clean up after swarm completion\n - Delete remote polecat branches that were merged\n - Reset polecat clones to main\n - Clear issue assignments\n \n2. gt polecat reset \u003crig\u003e/\u003cpolecat\u003e - Reset single polecat\n - git checkout main \u0026\u0026 git pull\n - Delete local polecat branch\n - Clear current issue assignment\n\n3. Auto-cleanup option on gt session stop --cleanup\n\n## Manual Cleanup For Now\n\ngit push origin --delete polecat/Nux polecat/Toast ...\ncd polecats/Nux \u0026\u0026 git checkout main \u0026\u0026 git pull","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T15:09:08.739193-08:00","updated_at":"2025-12-27T21:29:54.287498-08:00","deleted_at":"2025-12-27T21:29:54.287498-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-cp2s","title":"mol-polecat-lease: Semaphore proto for tracking polecat lifecycle","description":"Define a small proto for tracking a single polecat in the Witness patrol wisp:\n\n```markdown\n## Molecule: polecat-lease\nSemaphore tracking a single polecat's lifecycle.\nVars: {{polecat}}, {{issue}}\n\n## Step: boot\nSpawned. Verify it starts working.\ngt peek {{polecat}} - if idle, gt nudge.\nTimeout: 60s before escalation.\n\n## Step: working\nActively working. Monitor for stuck.\nWait for SHUTDOWN mail.\nNeeds: boot\n\n## Step: done\nExit received. Ready for cleanup.\nKill session, prune worktree.\nNeeds: working\n```\n\nUsed by Witness: bd mol bond mol-polecat-lease wisp-patrol --var polecat=X","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T22:01:18.257848-08:00","updated_at":"2025-12-27T21:29:53.091372-08:00","deleted_at":"2025-12-27T21:29:53.091372-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-cpm2","title":"Automatic spawn for ready work","description":"Auto-spawn polecats for ready work:\n\nWhen Witness has capacity (active_workers \u003c max_workers):\n1. Query bd ready for unblocked issues\n2. Filter to rig-appropriate work (by prefix or epic)\n3. For each ready issue up to capacity:\n - gt spawn --issue \u003cid\u003e\n - Track that we spawned for this issue\n\nConfiguration (in rig config.json):\n- max_workers: 4 (default)\n- spawn_delay: 5s between spawns\n- auto_spawn: true/false\n\nThis enables 'fire and forget' swarming:\n- Mayor creates epic with children\n- Mayor tells Witness to work epic\n- Witness spawns polecats automatically\n- Witness cleans up as they complete","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-20T03:14:24.724136-08:00","updated_at":"2025-12-27T21:29:53.798185-08:00","dependencies":[{"issue_id":"gt-cpm2","depends_on_id":"gt-53w6","type":"parent-child","created_at":"2025-12-20T03:14:37.365334-08:00","created_by":"daemon"},{"issue_id":"gt-cpm2","depends_on_id":"gt-mxyj","type":"blocks","created_at":"2025-12-20T03:14:38.957826-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.798185-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-cppy0","title":"Digest: mol-deacon-patrol","description":"Patrol 20: All healthy, handoff triggered","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T15:24:01.602815-08:00","updated_at":"2025-12-27T21:26:03.132568-08:00","deleted_at":"2025-12-27T21:26:03.132568-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-cr0","title":"Consolidate design docs into beads descriptions","description":"The markdown design docs (swarm-shutdown-design.md, polecat-beads-access-design.md, mayor-handoff-design.md) will decay. Extract key decisions and prompting templates into the beads descriptions themselves, then archive or remove the markdown files. Beads are the source of truth.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T20:24:05.45131-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-cr9","title":"Harness Design \u0026 Documentation","description":"The harness (Gas Town installation directory) needs design cleanup, documentation, and tooling.\n\n## Current Problems\n\n1. **Shared harness confusion**: ~/ai is shared by PGT and GGT with overlapping structures\n - PGT uses ~/ai/mayor/ as town-level Mayor home\n - GGT Mayor works in ~/ai/mayor/rigs/gastown/\n - ~/ai/gastown/ has both .gastown/ (PGT) and mayor/ (git clone)\n\n2. **Beads redirect**: ~/ai/.beads/redirect → mayor/rigs/gastown/.beads\n - This is specific to GGT's decentralized structure\n - Should be documented as an example\n\n3. **architecture.md**: Verify rig-level mayor/rig/ is shown correctly\n\n4. **No harness creation tooling**: Users must manually set up\n\n## Proposed Work\n\n- Document what a harness IS (installation directory)\n- Create harness creation command or template repo\n- Update architecture.md if needed \n- Create example harness configuration for docs\n- Resolve PGT/GGT sharing issue","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-17T17:15:08.769961-08:00","updated_at":"2025-12-27T21:29:54.253374-08:00","deleted_at":"2025-12-27T21:29:54.253374-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-csba","title":"Digest: mol-deacon-patrol","description":"Patrol 20: Routine - handoff threshold reached","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:39:00.441055-08:00","updated_at":"2025-12-27T21:26:04.515663-08:00","deleted_at":"2025-12-27T21:26:04.515663-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ct0u","title":"Merge: gt-3x1.4","description":"branch: polecat/Nux\ntarget: main\nsource_issue: gt-3x1.4\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T14:53:53.140259-08:00","updated_at":"2025-12-27T21:27:22.766824-08:00","deleted_at":"2025-12-27T21:27:22.766824-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-ctr","title":"GGT vs PGT Gap Analysis Summary","description":"Summary of gaps comparing GGT (~4,500 LOC) to PGT (~27,700 LOC).\n\n## Critical Gaps (P1) - Existing Issues\n- gt-u1j.17: Polecat CLI (add, list, wake, sleep) - DETAILED\n- gt-u1j.16: Rig CLI (add, list, show, remove) - DETAILED\n- gt-u1j.18: Witness CLI (start, stop, status) - DETAILED\n- gt-f9x.4: Doctor framework - DETAILED\n- gt-f9x.5: Workspace doctor checks - DETAILED\n- gt-f9x.6: Rig doctor checks - DETAILED\n\n## Critical Gaps (P1) - New Issues\n- gt-a95: Refinery background daemon mode - ENHANCED\n- gt-hgk: Mail message types and threading - ENHANCED\n\n## Significant Gaps (P2) - New Issues\n- gt-d46: Mail CLI archive/purge/search - ENHANCED\n- gt-e9k: Swarm preflight/postflight - ENHANCED\n- gt-662: Swarm report generation - ENHANCED\n- gt-69l: Hook system - ENHANCED\n- gt-3yj: Agent monitoring - ENHANCED\n- gt-1ky: Workspace CLI (may overlap f9x.3) - ENHANCED\n- gt-9j9: Worker status reporting - ENHANCED\n- gt-qao: Mayor CLI - ENHANCED\n- gt-7o7: Session pre-shutdown checks - ENHANCED\n- gt-c92: Batch all command - ENHANCED\n- gt-lno: Swarm state persistence - ENHANCED\n- gt-a9y: File locking - ENHANCED\n- gt-30o: Error handling improvements - ENHANCED\n\n## Significant Gaps (P2) - Existing Issues\n- gt-kmn.12: Ephemeral rig support - DETAILED\n\n## Lower Priority (P3) - New Issues\n- gt-3fm: Mail orchestrator daemon - ENHANCED\n- gt-2kz: Cleanup commands - ENHANCED\n- gt-ebl: Names commands - ENHANCED\n- gt-1u9: Interactive prompts - ENHANCED\n- gt-8lz: Help text improvements - ENHANCED\n\n## Closed as Duplicates\n- gt-3tz → gt-u1j.17 (polecat CLI)\n- gt-e1r → gt-u1j.16 (rig CLI)\n- gt-86w → gt-f9x.4/5/6 (doctor)\n- gt-alx → gt-kmn.12 (ephemeral rigs)\n\n## Execution Order Recommendation\n1. P1 CLI commands (existing detailed issues ready to implement)\n2. gt-a95: Refinery daemon (blocks autonomous operation)\n3. gt-7o7: Pre-shutdown checks (prevents data loss)\n4. gt-a9y: File locking (prevents corruption)\n5. gt-hgk + gt-d46: Mail improvements\n6. gt-69l: Hook system (enables extensibility)\n7. Remaining P2s in any order\n8. P3s as time permits","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-16T14:49:09.555759-08:00","updated_at":"2025-12-27T21:29:54.444374-08:00","deleted_at":"2025-12-27T21:29:54.444374-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-cu7r","title":"Implement handoffs using pinned beads","description":"Replace the current mail-based handoff system with pinned beads.\n\n## Current Problem\n\nHandoff messages get closed before the successor can read them because:\n1. `gt mail read` auto-acks (closes) messages\n2. `bd mail inbox` only shows open messages\n3. Successor sees empty inbox\n\n## Solution\n\nUse pinned beads for handoffs:\n- One pinned bead per role: `mayor-handoff`, `\u003crig\u003e-refinery-handoff`, etc.\n- Predecessor updates the content before cycling\n- Successor reads on startup via `gt prime`\n- Never closes - always available\n\n## Implementation\n\n### 1. Create handoff beads on first cycle\n- `bd create --title='Mayor Handoff' --type=task --status=pinned --assignee=mayor`\n- Store ID in role config or use well-known naming convention\n\n### 2. Update gt handoff command\n- Instead of `bd mail send`, update the pinned handoff bead\n- `bd update \u003chandoff-id\u003e --description='...handoff content...'`\n\n### 3. Update gt prime\n- Read the role's handoff bead\n- Display content to successor\n\n### 4. Compression/reset\n- `gt rig reset` clears handoff content\n- Or manual: `bd update \u003chandoff-id\u003e --description=''`\n\n## Dependencies\n\nRequires beads-6v2 (StatusPinned) to be implemented first.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-18T21:28:05.738035-08:00","updated_at":"2025-12-27T21:29:54.117537-08:00","deleted_at":"2025-12-27T21:29:54.117537-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-cv3lb","title":"Digest: mol-deacon-patrol","description":"Patrol 19: Green","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:36:22.291924-08:00","updated_at":"2025-12-27T21:26:02.486211-08:00","deleted_at":"2025-12-27T21:26:02.486211-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-cv50l","title":"Digest: mol-deacon-patrol","description":"Patrol 4: Routine, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:44:54.902914-08:00","updated_at":"2025-12-27T21:26:01.544153-08:00","deleted_at":"2025-12-27T21:26:01.544153-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-cv9a","title":"Merge: gt-ay1r","description":"branch: polecat/dementus\ntarget: main\nsource_issue: gt-ay1r\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T23:51:54.383198-08:00","updated_at":"2025-12-27T21:27:22.476628-08:00","deleted_at":"2025-12-27T21:27:22.476628-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-cvfg","title":"Use cmd.OutOrStdout instead of fmt.Print in refinery","description":"refinery/manager.go and refinery/engineer.go use fmt.Print/Println directly for user output (30+ occurrences). This breaks testability and doesn't follow cobra best practices. Should use cmd.OutOrStdout() or pass an io.Writer.\n\nAffected files:\n- internal/refinery/manager.go (lines 222, 360-361, 369, 387-393, 519, 537, 672)\n- internal/refinery/engineer.go (lines 190-211, 249, 294-297, 325-353, 362-366)","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-21T21:35:08.080292-08:00","updated_at":"2025-12-27T21:29:57.572482-08:00","deleted_at":"2025-12-27T21:29:57.572482-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-cwndo","title":"Digest: mol-deacon-patrol","description":"Patrol 15: All healthy, no messages","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T18:29:00.960438-08:00","updated_at":"2025-12-27T21:26:02.822085-08:00","deleted_at":"2025-12-27T21:26:02.822085-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-cwpj","title":"Digest: mol-deacon-patrol","description":"Patrol OK: 0 mail, all agents healthy, 3 polecats working","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-22T20:58:32.040465-08:00","updated_at":"2025-12-27T21:26:05.475267-08:00","deleted_at":"2025-12-27T21:26:05.475267-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-cx41","title":"Role templates: rename 'Dependency Trap' to 'Gotchas when Filing Beads'","description":"The 'Dependency Trap' heading is too specific. Rename to something like 'Gotchas when Filing Beads' or 'Beads Filing Tips'. Applies to all role templates (polecat, crew, mayor, witness, refinery, deacon).","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T16:56:45.526464-08:00","updated_at":"2025-12-27T21:29:55.943621-08:00","dependencies":[{"issue_id":"gt-cx41","depends_on_id":"gt-t9u7","type":"parent-child","created_at":"2025-12-23T16:57:16.288402-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.943621-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-cxtu","title":"Implement shared beads architecture for rig","description":"Implement redirect-based shared beads to eliminate git sync overhead within a rig.\n\n## Background\nEach polecat currently has its own .beads/ directory synced via git. This burns tokens on sync operations.\n\n## Solution\nUse bd's redirect feature:\n1. Create single shared .beads/ at rig root\n2. Polecats get redirect files pointing to shared location\n3. All agents connect to same daemon\n4. SQLite WAL + daemon serialization handles concurrency\n\n## Implementation\n1. Create shared .beads/ at rig root (e.g., ~/gt/gastown/.beads/)\n2. Update gt spawn to create redirect files:\n mkdir -p polecats/nux/.beads\n echo ../../.beads \u003e polecats/nux/.beads/redirect\n3. Test that all polecats connect to same daemon\n4. Remove git sync from intra-rig workflow\n5. Keep JSONL export for backup/cross-rig only\n\n## Reference\nbeads/polecats/rictus/internal/beads/beads.go:45 - followRedirect()","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-20T20:19:53.6549-08:00","updated_at":"2025-12-27T21:29:53.63542-08:00","deleted_at":"2025-12-27T21:29:53.63542-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-cxx","title":"Swarm learning: Witness needs automated context cycling","description":"Furiosa hit 2% context during swarm work. Witness role needs automated detection of low context and should trigger session cycling before agents get stuck. Add to Witness responsibilities in prompts.md.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T01:21:49.67756-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-cxxse","title":"Digest: mol-deacon-patrol","description":"Patrol 14: Quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:18:31.068172-08:00","updated_at":"2025-12-27T21:26:02.691417-08:00","deleted_at":"2025-12-27T21:26:02.691417-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-d0en","title":"Digest: mol-deacon-patrol","description":"Patrol 14: Routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:38:06.620731-08:00","updated_at":"2025-12-27T21:26:04.565344-08:00","deleted_at":"2025-12-27T21:26:04.565344-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-d28s1","title":"Digest: mol-deacon-patrol","description":"Patrol 11: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T14:38:59.877435-08:00","updated_at":"2025-12-27T21:26:03.156987-08:00","deleted_at":"2025-12-27T21:26:03.156987-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-d3d","title":"Design: Additional design issues (placeholder)","description":"Placeholder for additional design issues the user wants to raise and work through. Convert to specific subtasks as issues are identified.","status":"tombstone","priority":4,"issue_type":"epic","created_at":"2025-12-15T20:24:12.601585-08:00","updated_at":"2025-12-27T21:29:57.9334-08:00","deleted_at":"2025-12-27T21:29:57.9334-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-d48f2","title":"Daemon binary vs process age mismatch detection","description":"## Problem\n\nThe daemon process can run for days/weeks while the binary gets rebuilt multiple times. The running process uses old code:\n\n```\nDaemon started: 2025-12-30 (3 days old)\nBinary rebuilt: 2026-01-02 (today)\nRunning code: 1-hour heartbeat interval\nBinary code: 3-minute heartbeat interval\n```\n\nWe only discovered this when debugging why witness/refinery weren't being auto-restarted.\n\n## Impact\n\n- Bug fixes in daemon code don't take effect until manual restart\n- New features aren't active\n- Debugging is confusing (code says one thing, behavior says another)\n\n## Solutions\n\n### Option A: Self-restart on binary change\n```go\nfunc (d *Daemon) checkBinaryAge() {\n binaryStat, _ := os.Stat(os.Args[0])\n if binaryStat.ModTime().After(d.startTime) {\n d.logger.Println(\"Binary newer than process, restarting...\")\n d.restart()\n }\n}\n```\n\n### Option B: Version embedding + status warning\n```go\n// Embed build time at compile\nvar buildTime = \"2026-01-02T18:00:00Z\"\n\nfunc (d *Daemon) status() {\n if buildTime != currentBinaryBuildTime() {\n fmt.Println(\"⚠ Daemon running old code, restart recommended\")\n }\n}\n```\n\n### Option C: Daemon status shows binary age\n```\n$ gt daemon status\n● Daemon is running (PID 53143)\n Started: 2025-12-30 21:56:00\n Binary: 2026-01-02 18:34:00\n ⚠ Binary is newer than process - consider 'gt daemon stop \u0026\u0026 gt daemon start'\n```\n\n## Recommendation\n\nOption C is simplest and most transparent. Let humans decide when to restart.","status":"closed","priority":2,"issue_type":"feature","created_at":"2026-01-02T18:42:41.146364-08:00","updated_at":"2026-01-02T18:50:09.737069-08:00","closed_at":"2026-01-02T18:50:09.737069-08:00","close_reason":"Implemented Option C: daemon status now shows binary modification time and warns when binary is newer than the running process","created_by":"mayor"}
{"id":"gt-d4es4","title":"Digest: mol-deacon-patrol","description":"Quick patrol: no messages, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T11:03:03.862395-08:00","updated_at":"2025-12-27T21:26:03.437816-08:00","deleted_at":"2025-12-27T21:26:03.437816-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-d5tgv","title":"Digest: mol-deacon-patrol","description":"Patrol 8: Mayor OK, 11 sessions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:36:31.187594-08:00","updated_at":"2025-12-27T21:26:02.147192-08:00","deleted_at":"2025-12-27T21:26:02.147192-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-d69z7","title":"Merge: rictus-mjtlq9xg","description":"branch: polecat/rictus-mjtlq9xg\ntarget: main\nsource_issue: rictus-mjtlq9xg\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T22:29:35.917979-08:00","updated_at":"2025-12-30T23:12:42.768632-08:00","closed_at":"2025-12-30T23:12:42.768632-08:00","close_reason":"Branch already merged","created_by":"gastown/polecats/rictus"}
{"id":"gt-d7egy","title":"Session ended: gt-gastown-organic","status":"closed","priority":2,"issue_type":"event","created_at":"2026-01-03T11:46:13.056861-08:00","updated_at":"2026-01-04T16:41:00.387059-08:00","closed_at":"2026-01-04T16:41:00.387059-08:00","close_reason":"Archived session telemetry","created_by":"gastown/polecats/organic"}
{"id":"gt-d7i","title":"gt session capture: Support positional line count argument","description":"Make 'gt session capture gastown/Toast 50' work.\n\nCurrently requires: gt session capture gastown/Toast -n 50\nShould also accept: gt session capture gastown/Toast 50\n\nAgent UX principle: commands should work the way agents guess they work.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T22:28:44.291285-08:00","updated_at":"2025-12-27T21:29:54.211841-08:00","deleted_at":"2025-12-27T21:29:54.211841-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-d7i38","title":"Digest: mol-deacon-patrol","description":"Patrol 20: All healthy - handoff threshold reached","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T18:53:56.983327-08:00","updated_at":"2025-12-27T21:26:02.797809-08:00","deleted_at":"2025-12-27T21:26:02.797809-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-d8a8x","title":"Digest: mol-deacon-patrol","description":"Patrol 13: All green","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:34:25.657224-08:00","updated_at":"2025-12-27T21:26:02.535732-08:00","deleted_at":"2025-12-27T21:26:02.535732-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-d8ia3","title":"Digest: mol-deacon-patrol","description":"Patrol 12: healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-27T20:22:50.271584-08:00","updated_at":"2025-12-27T21:26:00.108727-08:00","deleted_at":"2025-12-27T21:26:00.108727-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-dapb","title":"mol-polecat-arm","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Steps\n\n### capture\nCapture recent tmux output for this polecat.\n\n```bash\ntmux capture-pane -t gt-{{rig}}-{{polecat_name}} -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n### assess\nCategorize polecat state.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n### load-history\nRead nudge history for this polecat from patrol state.\n\n```\nnudge_count = state.nudges[{{polecat_name}}].count\nlast_nudge_time = state.nudges[{{polecat_name}}].timestamp\n```\n\nNeeds: assess\n\n### decide\nApply the nudge matrix to determine action.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. \"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n### execute\nTake the decided action.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-{{rig}}-{{polecat_name}} \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/{{polecat_name}}\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: {{polecat_name}} stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp\n\nThis data feeds back to the patrol's aggregate step.\n\nLabels: [template, christmas-ornament, polecat-arm]","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-23T02:37:03.754926-08:00","updated_at":"2025-12-27T21:29:56.225546-08:00","deleted_at":"2025-12-27T21:29:56.225546-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-db4x","title":"Merge: gt-7919","description":"branch: polecat/furiosa\ntarget: main\nsource_issue: gt-7919\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T23:41:28.898315-08:00","updated_at":"2025-12-27T21:27:22.501754-08:00","deleted_at":"2025-12-27T21:27:22.501754-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-dck","title":"Update config location: .gastown/ → config/","description":"Move rig configuration from hidden .gastown/ to visible config/:\n- config/rig.json: Rig configuration\n- config/engineer.json: Engineer settings (test command, etc.)\n- config/witness.json: Witness settings (heartbeat interval, etc.)\n\nHidden directories are poorly discovered by AI agents.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T23:02:20.400818-08:00","updated_at":"2025-12-27T21:29:54.419335-08:00","dependencies":[{"issue_id":"gt-dck","depends_on_id":"gt-h5n","type":"blocks","created_at":"2025-12-16T23:02:55.69517-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.419335-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-dclzf","title":"Merge: nux-mjw3mn8o","description":"branch: polecat/nux-mjw3mn8o\ntarget: main\nsource_issue: nux-mjw3mn8o\nrig: gastown\nagent_bead: gt-gastown-polecat-nux","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T18:16:37.726672-08:00","updated_at":"2026-01-01T18:22:57.867398-08:00","closed_at":"2026-01-01T18:22:57.867398-08:00","created_by":"gastown/polecats/nux"}
{"id":"gt-dcvk5","title":"Merge: capable-mjtltnm5","description":"branch: polecat/capable-mjtltnm5\ntarget: main\nsource_issue: capable-mjtltnm5\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T22:33:27.190736-08:00","updated_at":"2025-12-30T23:12:37.286376-08:00","closed_at":"2025-12-30T23:12:37.286376-08:00","close_reason":"Branch already merged","created_by":"gastown/polecats/capable"}
{"id":"gt-dcxz8","title":"Digest: mol-deacon-patrol","description":"Patrol 12: All clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:03:31.988364-08:00","updated_at":"2025-12-27T21:26:03.98515-08:00","deleted_at":"2025-12-27T21:26:03.98515-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-dd8s","title":"gt molecule seed: create built-in molecules as beads","description":"The molecule infrastructure is complete but built-in molecules (engineer-in-box, quick-fix, research) need to be seeded into the beads database.\n\n## Current State\n- `gt molecule list` works but shows 0 molecules\n- BuiltinMolecules() in internal/beads/builtin_molecules.go has 3 molecules defined\n- No way to create them as beads\n\n## Needed\nAdd `gt molecule seed` command that:\n1. Reads BuiltinMolecules()\n2. Creates each as a bead with type: molecule\n3. Uses well-known IDs (mol-engineer-in-box, mol-quick-fix, mol-research)\n4. Idempotent (skip if already exists)\n\n## Acceptance Criteria\n```\ngt molecule seed\ngt molecule list # Shows 3 built-in molecules\n```\n\n## Parent\ngt-4nn: Molecules epic","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T14:13:27.432957-08:00","updated_at":"2025-12-27T21:29:54.033508-08:00","deleted_at":"2025-12-27T21:29:54.033508-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ddp4d","title":"Merge: gt-si8rq.9","description":"branch: polecat/nux-mjyruwvu\ntarget: main\nsource_issue: gt-si8rq.9\nrig: gastown\nagent_bead: gt-gastown-polecat-nux\nretry_count: 0\nlast_conflict_sha: null\nconflict_task_id: null","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-03T12:55:56.407069-08:00","updated_at":"2026-01-03T13:05:06.994259-08:00","closed_at":"2026-01-03T13:05:06.994259-08:00","close_reason":"Merged to main at 8d61c043","created_by":"gastown/polecats/nux"}
{"id":"gt-dft6a","title":"Digest: mol-deacon-patrol","description":"Patrol 10: All green - halfway mark","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:32:55.926182-08:00","updated_at":"2025-12-27T21:26:02.560468-08:00","deleted_at":"2025-12-27T21:26:02.560468-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-dhzo8","title":"Digest: mol-deacon-patrol","description":"Patrol complete: inbox empty, polecats healthy (furiosa/nux idle), witness/refinery running, no orphans","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-27T20:17:52.917896-08:00","updated_at":"2025-12-27T21:26:00.262691-08:00","deleted_at":"2025-12-27T21:26:00.262691-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-dhzqj","title":"Merge: capable-mjxogaq4","description":"branch: polecat/capable-mjxogaq4\ntarget: main\nsource_issue: capable-mjxogaq4\nrig: gastown\nagent_bead: gt-gastown-polecat-capable","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-02T18:29:28.912411-08:00","updated_at":"2026-01-02T18:31:42.197981-08:00","closed_at":"2026-01-02T18:31:42.197981-08:00","close_reason":"Merged to main at fa26265b","created_by":"gastown/polecats/capable"}
{"id":"gt-djzh","title":"Merge: gt-ldk8","description":"branch: polecat/nux\ntarget: main\nsource_issue: gt-ldk8\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T01:15:25.915147-08:00","updated_at":"2025-12-27T21:27:22.435039-08:00","deleted_at":"2025-12-27T21:27:22.435039-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-dkc","title":"Add harness overview to Mayor priming","description":"Update gt prime Mayor context to explain the harness concept: umbrella repo for GT installation, rigs underneath, Mayor sits above all rigs","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T16:42:44.864606-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-dkc","depends_on_id":"gt-l1o","type":"blocks","created_at":"2025-12-17T16:42:54.736437-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-dlgm4","title":"Merge: valkyrie-1767106008400","description":"branch: polecat/valkyrie-1767106008400\ntarget: main\nsource_issue: valkyrie-1767106008400\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T06:58:44.347143-08:00","updated_at":"2025-12-30T10:06:56.623384-08:00","closed_at":"2025-12-30T10:06:56.623384-08:00","created_by":"gastown/polecats/valkyrie"}
{"id":"gt-dlrn","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:46","description":"Patrol 13: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:46:40.722551-08:00","updated_at":"2025-12-27T21:26:05.087567-08:00","deleted_at":"2025-12-27T21:26:05.087567-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-dm7k","title":"Digest: mol-deacon-patrol","description":"Patrol #17: Stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:34:59.419736-08:00","updated_at":"2025-12-27T21:26:04.260319-08:00","deleted_at":"2025-12-27T21:26:04.260319-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-doih4","title":"BUG: gt status shows 'stopped' for running sessions (bead vs tmux mismatch)","description":"## Problem\n\n`gt status` shows agents as 'stopped' when their tmux session is actually running:\n\n```\n🐺 Deacon\n gt-deacon stopped ← But session exists!\n```\n\nMeanwhile:\n```bash\n$ tmux has-session -t gt-deacon \u0026\u0026 echo exists\nexists\n```\n\n## Root Cause\n\nStatus display uses agent bead state, not tmux session state. When they disagree, the display is misleading.\n\n## Current Logic\n\n```go\n// Simplified\nif agentBead.State == \"running\" {\n display \"running\"\n} else {\n display \"stopped\"\n}\n```\n\n## Expected Logic\n\n```go\nbeadState := agentBead.State\nsessionExists := tmux.HasSession(sessionName)\n\nif beadState == \"running\" \u0026\u0026 sessionExists {\n display \"running\"\n} else if beadState == \"running\" \u0026\u0026 !sessionExists {\n display \"running [dead]\" // Bead thinks running, session gone\n} else if beadState != \"running\" \u0026\u0026 sessionExists {\n display \"stopped [session exists]\" // Session exists but bead says stopped\n} else {\n display \"stopped\"\n}\n```\n\n## Related\n\nThe 'running [dead]' state already exists for witness/refinery. This should be consistent across all agents.","status":"closed","priority":2,"issue_type":"bug","created_at":"2026-01-02T18:42:56.42383-08:00","updated_at":"2026-01-02T18:53:02.39265-08:00","closed_at":"2026-01-02T18:53:02.39265-08:00","close_reason":"Fixed by reconciling tmux session state with bead state in renderAgentDetails()","created_by":"mayor"}
{"id":"gt-dq3","title":"Split PGT/GGT harness or migrate to GGT-only","description":"Current ~/ai harness is shared by PGT and GGT with confusing overlap. Options:\n1. Keep separate (document the coexistence)\n2. Migrate fully to GGT structure\n3. Create separate harnesses\n\nThis affects the beads redirect, Mayor home location, and rig structure.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T17:15:32.308192-08:00","updated_at":"2025-12-27T21:29:54.245064-08:00","dependencies":[{"issue_id":"gt-dq3","depends_on_id":"gt-cr9","type":"blocks","created_at":"2025-12-17T17:15:51.717903-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.245064-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-drbd","title":"Add no-PR instructions to mol-polecat-work at two points","description":"Update mol-polecat-work in builtin_molecules.go to explicitly forbid GitHub PRs.\n\n## Two Points to Add Instructions\n\n### 1. submit-work step\nWhen polecat is ready to submit:\n- Push branch to origin\n- Create beads merge-request issue\n- DO NOT use gh pr create or GitHub PRs\n\n### 2. CLAUDE.md polecat context\nAdd to polecat role instructions:\n- Never use gh pr create\n- Never create GitHub pull requests\n- The Refinery processes merges via beads MR issues\n\n## Why Two Points\n- Molecule step description guides the workflow\n- CLAUDE.md reinforces at context level\n- Belt and suspenders approach\n\n## Implementation\n1. Update PolecatWorkMolecule() submit-work step description\n2. Update prompts/roles/polecat.md with explicit prohibition\n\nRelated: gt-44wh (general no-PR bug)","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T16:44:51.497283-08:00","updated_at":"2025-12-27T21:29:53.409225-08:00","dependencies":[{"issue_id":"gt-drbd","depends_on_id":"gt-44wh","type":"related","created_at":"2025-12-21T16:44:57.503314-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.409225-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-drp5","title":"mol-ready-work: graceful GitHub/label handling","description":"## Summary\n\nmol-ready-work assumes:\n- GitHub repo exists with gh CLI configured\n- Labels 'untriaged' and 'triaged' exist\n\nShould handle gracefully:\n1. No GitHub repo (beads-only project) → skip GH backlogs\n2. Missing labels → skip that backlog tier, don't error\n3. gh CLI not authenticated → warn and skip GH backlogs\n\n## Implementation\nAdd checks in scan-backlogs step:\n```bash\n# Check if gh is available and authenticated\nif gh auth status \u0026\u003e/dev/null; then\n # scan GH backlogs\nfi\n```\n\n## Parent\nPart of gt-tnca (mol-ready-work epic)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T01:19:39.992868-08:00","updated_at":"2025-12-27T21:29:56.242289-08:00","dependencies":[{"issue_id":"gt-drp5","depends_on_id":"gt-tnca","type":"blocks","created_at":"2025-12-23T01:19:56.493028-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.242289-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ds3h3","title":"gt mol status: role detection fails from polecat directory","description":"When running 'gt mol status' from a polecat's worktree directory (e.g., ~/gt/gastown/polecats/furiosa), the role detection incorrectly returns 'mayor' instead of 'gastown/furiosa'.\n\n## Root Cause\nThe detectRole() function in prime.go calculates relPath from townRoot (~gt) to cwd (~/gt/gastown/polecats/furiosa), giving 'gastown/polecats/furiosa'. It then checks parts[0] == 'mayor' but 'gastown' != 'mayor', so it falls through to rig detection where it treats 'gastown' as a rig name.\n\nHowever, the code then looks for parts[1] == 'polecats' but the actual check is parts[1] == 'polecats' which should work...\n\n## Actual Issue\nNeed to debug further - the hook file IS created correctly by spawn, but gt mol status can't find it because it's looking for the wrong agent identity.\n\n## Expected\nRunning from ~/gt/gastown/polecats/furiosa should detect:\n- Role: polecat\n- Rig: gastown \n- Polecat: furiosa\n- Agent identity: gastown/furiosa\n\n## Actual\n- Role: mayor\n- Agent identity: mayor\n\nThis prevents polecats from seeing their slung work via gt mol status.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-25T01:55:54.680601-08:00","updated_at":"2025-12-27T21:29:52.523004-08:00","deleted_at":"2025-12-27T21:29:52.523004-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-dsfi","title":"gt handoff: Deadlock in waitForRetirement","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-20T01:11:33.44686-08:00","updated_at":"2025-12-27T21:29:53.848393-08:00","deleted_at":"2025-12-27T21:29:53.848393-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-dt5","title":"Define Engineer as the Refinery agent role","description":"Clarify the distinction:\n\n- **Refinery** = place/module/directory/workspace\n - `\u003crig\u003e/refinery/` directory structure\n - `gt refinery start/stop/status` commands\n - tmux session name: `refinery` or `\u003crig\u003e-refinery`\n\n- **Engineer** = role/agent who works in the Refinery\n - CLAUDE.md prompting: \"You are an Engineer...\"\n - Documentation: \"The Engineer processes merge requests...\"\n - Mail address: `\u003crig\u003e/engineer` (or `\u003crig\u003e/refinery`?)\n\nUpdates needed:\n- Add Engineer role description to docs\n- Update CLAUDE.md templates for refinery agents\n- Keep `gt refinery` commands as-is (they manage the place)\n- Internal code stays `internal/refinery/` (the module)\n\nFuture consideration: Multiple Engineers in one Refinery for parallel merge processing.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T23:02:18.591842-08:00","updated_at":"2025-12-27T21:29:54.427731-08:00","dependencies":[{"issue_id":"gt-dt5","depends_on_id":"gt-h5n","type":"blocks","created_at":"2025-12-16T23:02:55.577196-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.427731-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-dtw9u","title":"Witness: active polecat monitoring and nudging","description":"## Problem\nWitness runs but doesn't actively monitor polecats during patrol.\n\n## ZFC-Compliant Solution\nAdd step to `mol-witness-patrol.formula.toml`:\n\n```toml\n[[step]]\nid = \"survey-polecats\"\ntitle = \"Survey polecat health\"\ndescription = \"\"\"\nFor each polecat in this rig:\n1. Run: gt peek \u003crig\u003e/\u003cpolecat\u003e 20\n2. Check if work is pinned: gt mol status --rig \u003crig\u003e/\u003cpolecat\u003e\n3. If pinned but idle (no recent output), nudge: gt nudge \u003crig\u003e/\u003cpolecat\u003e \"You have work on hook\"\n4. If stuck \u003e15min with no progress, escalate to Mayor via mail\n\"\"\"\ndepends_on = [\"inbox-check\"]\n```\n\n## Why This Works\n- Witness Claude agent reads formula step\n- Agent decides what \"stuck\" means (uses judgment)\n- Agent runs gt commands (CLI can exist)\n- No Go code makes decisions\n\n## Files\n- formulas/mol-witness-patrol.formula.toml","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-27T16:41:04.576655-08:00","updated_at":"2025-12-27T21:29:45.764891-08:00","created_by":"mayor","dependencies":[{"issue_id":"gt-dtw9u","depends_on_id":"gt-gizsv","type":"relates-to","created_at":"2025-12-27T20:59:11.75938-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.764891-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-dwvuu","title":"Digest: mol-deacon-patrol","description":"Patrol 11","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-26T14:55:12.380587-08:00","updated_at":"2025-12-26T14:55:12.380587-08:00","closed_at":"2025-12-26T14:55:12.380548-08:00"}
{"id":"gt-dxsvj","title":"Digest: mol-deacon-patrol","description":"Patrol 17: Green","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:35:45.54298-08:00","updated_at":"2025-12-27T21:26:02.502902-08:00","deleted_at":"2025-12-27T21:26:02.502902-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-dyz3.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. \"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-dyz3\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T21:55:26.823047-08:00","updated_at":"2025-12-27T21:29:55.377417-08:00","deleted_at":"2025-12-27T21:29:55.377417-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-e0eqg","title":"Session ended: gt-gastown-crew-gus","status":"closed","priority":2,"issue_type":"event","created_at":"2026-01-03T11:52:02.939679-08:00","updated_at":"2026-01-04T16:41:37.853815-08:00","closed_at":"2026-01-04T16:41:37.853815-08:00","close_reason":"Archived","created_by":"gastown/crew/gus"}
{"id":"gt-e0qj2","title":"Documentation overhaul: User Guide approach","description":"## Problem\n\nCurrent docs are:\n1. **Grandiose** - Too much vision, belongs in ~/gt/docs (HOP), not here\n2. **Verbose** - Need bullets/tables, not sections. Too many separate files.\n3. **Bottom-up** - Written as we built. Users need top-down.\n\nGas Town is a simple agent orchestrator (4-10 → 20-30 agents). Not a manifesto.\n\n## Target Structure\n\n### 1. Features at a Glance\n- Persistent agent identities with mail inboxes and \"work hooks\"\n- Sling work to agents, pinned to their hook\n- Universal Propulsion: If hook is pinned, RUN IT\n- Hooks survive crash, shutdown, compaction, restart\n- Self-sling for auto-restart workflows\n- Pin individual beads or entire epics\n\n### 2. Available Workflows\n- tmux mode vs no-tmux (raw Claude Code)\n- Full Gas Town vs partial (single roles)\n- Modular and resilient - one role at a time\n\n### 3. Cooking Formulas\n- Molecules = chains of issues (railroad tracks)\n- Formulas = step descriptions (atoms)\n- Cook formulas → molecules → durable workflows\n- Examples: release molecule, deacon patrol, polecat workflow, shiny\n\n### 4. (Optional) MEOW Deep Dive\n- Structured epics as chained work graphs\n- Serial/parallel branches, gates, loops, algebra\n- States of matter, polymorphic bond operators\n- But users just need to cook formulas\n\n## Deliverable\n\nOne README + maybe one or two supporting docs. Radically condensed.\n","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-26T23:21:03.464073-08:00","updated_at":"2025-12-27T21:29:45.798259-08:00","created_by":"stevey","deleted_at":"2025-12-27T21:29:45.798259-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-e1n86","title":"Template and Formula Distribution: Embed in binary, detect staleness","description":"## Problem\n\nTwo types of Gas Town infrastructure live in `.beads/` but have unclear distribution:\n\n1. **Role templates** (`*.md.tmpl`) - embedded in binary, rendered to CLAUDE.md on setup\n2. **Patrol formulas** (`*.formula.json`) - currently just exist, no clear seeding path\n\nWhen `gt` is rebuilt with updated templates/formulas, existing installations don't auto-update.\n\n## Current State\n\n- Templates: embedded via `//go:embed`, written on `gt install`/`gt rig init`\n- Formulas: exist in `.beads/formulas/` but no seeding mechanism\n- `bd init` creates empty structure (correct - bd shouldn't know Gas Town)\n- No staleness detection for either\n\n## Proposed Solution\n\n### 1. Embed formulas in `gt` binary\n\n```go\n//go:embed formulas/*.formula.json\nvar formulaFS embed.FS\n```\n\n### 2. Seed formulas during setup\n\n`gt install` and `gt rig init` write embedded formulas to `.beads/formulas/`.\n\n### 3. Hash-based staleness detection\n\n`gt doctor` checks:\n- CLAUDE.md hash vs embedded template hash\n- Formula files hash vs embedded formula hash\n\nIf stale → warn. `gt doctor --fix` refreshes.\n\n### 4. Keep bd/gt boundary clean\n\n- `bd` stays generic: `bd cook \u003cpath\u003e` takes any formula path\n- `gt` knows where formulas live, handles distribution\n- Patrol formulas are Gas Town infrastructure, not beads concern\n\n## Tasks\n\n1. Move formulas to embedded FS in gt binary\n2. Add formula seeding to `gt install` / `gt rig init`\n3. Add hash storage for templates and formulas\n4. Add staleness detection to `gt doctor`\n5. Add `--fix` flag to refresh stale files","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-25T20:04:28.044085-08:00","updated_at":"2025-12-27T21:29:54.960657-08:00","deleted_at":"2025-12-27T21:29:54.960657-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-e1r","title":"CLI: rig commands (add, list, show, remove)","description":"GGT needs rig management commands.\n\nMissing Commands:\n- gt rig add \u003cgit-url\u003e [--name NAME] - Add rig to workspace\n- gt rig list - List all rigs with status\n- gt rig show \u003crig\u003e - Show detailed rig info\n- gt rig remove \u003crig\u003e - Remove a rig\n\nPGT Reference: gastown-py/src/gastown/cli/rig_cmd.py\n\nCurrent State:\n- gt init creates rig structure but no ongoing management\n- Rig discovery in internal/rig/manager.go but not exposed via CLI","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T14:46:33.23951-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-e1y","title":"Worker prompting: Beads write access","description":"Add beads write access section to polecat AGENTS.md.template.\n\n## Beads Access Section for Prompting\n\n```markdown\n## Beads Access\n\nYou have **full beads access** - create, update, and close issues.\n\n### Quick Reference\n\n```bash\n# View work\nbd ready # Issues ready (no blockers)\nbd list # All open issues\nbd show \u003cid\u003e # Issue details\n\n# Create issues\nbd create --title=\"Fix bug\" --type=bug --priority=2\nbd create --title=\"Add feature\" --type=feature\n\n# Update issues\nbd update \u003cid\u003e --status=in_progress # Claim work\nbd close \u003cid\u003e # Mark complete\n\n# Sync (required before merge!)\nbd sync # Commit beads changes to git\n```\n\n### When to Create Issues\n\nCreate beads issues when you discover work that:\n- Is outside your current task scope\n- Would benefit from tracking\n- Should be done by someone else\n\n**Good examples**:\n```bash\nbd create --title=\"Race condition in auth\" --type=bug --priority=1\nbd create --title=\"Document API rate limits\" --type=task --priority=3\n```\n\n**Don't create for**:\n- Tiny fixes you can do in 2 minutes\n- Vague \"improvements\" with no scope\n- Work already tracked elsewhere\n\n### Beads Sync Protocol\n\n**CRITICAL**: Always sync beads before merging!\n\n```bash\nbd sync # Commits beads changes\ngit add .beads/\ngit commit -m \"beads: sync\"\n```\n\nIf you forget to sync, beads changes are lost when session ends.\n```\n\n## Implementation\n\nAdd to AGENTS.md.template polecat section.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T19:47:18.459363-08:00","updated_at":"2025-12-27T21:29:54.618089-08:00","dependencies":[{"issue_id":"gt-e1y","depends_on_id":"gt-l3c","type":"blocks","created_at":"2025-12-15T19:47:35.81183-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.618089-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-e1zbl","title":"Digest: mol-deacon-patrol","description":"Patrol #21: All agents healthy, no lifecycle requests, no orphans","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:23:18.586973-08:00","updated_at":"2025-12-27T21:26:02.204206-08:00","deleted_at":"2025-12-27T21:26:02.204206-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-e38xm","title":"Digest: mol-deacon-patrol","description":"Patrol complete: cleaned 2 orphaned processes, 3 stale locks, all agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T17:10:28.628935-08:00","updated_at":"2025-12-27T21:26:01.221225-08:00","deleted_at":"2025-12-27T21:26:01.221225-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-e3gc1","title":"Digest: mol-deacon-patrol","description":"Patrol 5: Quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:15:10.866237-08:00","updated_at":"2025-12-27T21:26:02.748836-08:00","deleted_at":"2025-12-27T21:26:02.748836-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-e5b6t","title":"Digest: mol-deacon-patrol","description":"Patrol 1: cleaned 3 stale locks, 3 orphan processes, doctor passed","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T16:16:05.788536-08:00","updated_at":"2025-12-27T21:26:01.304239-08:00","deleted_at":"2025-12-27T21:26:01.304239-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-e5o","title":"Fix role detection for nested rig structures","description":"When gastown/ has its own mayor/ dir, workspace detection finds it as town root instead of ~/ai. This breaks polecat/refinery/witness detection when running from within a rig.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-17T16:47:18.519581-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"bug"}
{"id":"gt-e5ymc","title":"Digest: mol-deacon-patrol","description":"Patrol 6: all clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T17:19:22.540112-08:00","updated_at":"2025-12-27T21:26:01.18473-08:00","deleted_at":"2025-12-27T21:26:01.18473-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-e74q","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:44","description":"Patrol 5: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:44:17.155633-08:00","updated_at":"2025-12-27T21:26:05.145944-08:00","deleted_at":"2025-12-27T21:26:05.145944-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-e76","title":"gt mail reply/thread: Conversation support","description":"Add mail conversation features:\n\n- gt mail reply \u003cid\u003e -m 'message' - Reply to a message\n- gt mail thread \u003cid\u003e - Show all messages in a thread\n\nEnables back-and-forth communication between agents.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-17T21:50:06.215773-08:00","updated_at":"2025-12-27T21:29:57.231061-08:00","dependencies":[{"issue_id":"gt-e76","depends_on_id":"gt-hw6","type":"blocks","created_at":"2025-12-17T22:23:43.106435-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.231061-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-e9o3v","title":"Digest: mol-deacon-patrol","description":"Patrol 13: All agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T03:21:27.471717-08:00","updated_at":"2025-12-27T21:26:03.755098-08:00","deleted_at":"2025-12-27T21:26:03.755098-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-e9za","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:41","description":"Patrol 1: All healthy, no actions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:41:26.61734-08:00","updated_at":"2025-12-27T21:26:05.179475-08:00","deleted_at":"2025-12-27T21:26:05.179475-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-eaypd","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 8: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T05:14:49.391722-08:00","updated_at":"2026-01-01T05:14:49.391722-08:00","closed_at":"2026-01-01T05:14:49.391684-08:00"}
{"id":"gt-ebd8i","title":"Digest: mol-deacon-patrol","description":"Patrol 10: Midpoint, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T07:27:19.097799-08:00","updated_at":"2025-12-27T21:26:03.626793-08:00","deleted_at":"2025-12-27T21:26:03.626793-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ecfc","title":"Digest: mol-deacon-patrol","description":"Patrol 17: Routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:38:33.287033-08:00","updated_at":"2025-12-27T21:26:04.540721-08:00","deleted_at":"2025-12-27T21:26:04.540721-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ed39","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:28","description":"Patrol 17","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:28:33.433261-08:00","updated_at":"2025-12-27T21:26:05.212978-08:00","deleted_at":"2025-12-27T21:26:05.212978-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-edos","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:27","description":"Patrol 15","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:27:45.347605-08:00","updated_at":"2025-12-27T21:26:05.230175-08:00","deleted_at":"2025-12-27T21:26:05.230175-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-eemiq","title":"Blocker: gt binary in ~/.local/bin gets SIGKILL (137) due to syspolicy/provenance","description":"## Problem\n`gt` invocations are being immediately killed with `SIGKILL (9)` (shell shows `killed` and exit code `137`). This makes basic commands like `gt status` and `gt mayor at` unusable.\n\nIn the observed environment, `which -a gt` resolves to:\n- `/Users/jv/.local/bin/gt` (first in PATH) ← **this one is killed**\n- `/Users/jv/go/bin/gt` ← works normally\n\n## Evidence\n- Running `gt …` from the shell exits `137`.\n- Running `/Users/jv/.local/bin/gt version` exits `137`.\n- Running `/Users/jv/go/bin/gt version` succeeds.\n- `spctl --assess --verbose=4 /Users/jv/.local/bin/gt` reports `rejected`.\n- `log show --last 2m … | rg syspolicyd` shows `com.apple.syspolicy.exec` provenance sandbox activity when launching the failing binary.\n\n## Diagnosis\nThis is not a Gas Town tmux/daemon issue; the OS is killing the `gt` executable at exec time.\nThe broken binary is the PATH-preferred `/Users/jv/.local/bin/gt`.\n\n## Immediate Mitigation\nUse the working `gt` binary:\n- Prefer `/Users/jv/go/bin/gt` (temporarily) by adjusting PATH, or\n- Replace the broken binary in place:\n - `mv ~/.local/bin/gt ~/.local/bin/gt.bad`\n - `cp ~/go/bin/gt ~/.local/bin/gt`\n - `hash -r`\n\n## Follow-ups\n- Determine why `/Users/jv/.local/bin/gt` is rejected (build pipeline or download provenance).\n- Ensure install/build scripts dont produce a syspolicy-rejected binary in `~/.local/bin`.\n\n## Acceptance Criteria\n- `gt version` / `gt status` / `gt mayor at` run without being killed (no exit `137`).\n- PATH resolves to a working `gt` binary by default.\n","status":"tombstone","priority":0,"issue_type":"bug","created_at":"2026-01-08T11:52:17.203476+13:00","updated_at":"2026-01-08T12:04:05.022631+13:00","close_reason":"Resolved locally (reinstalled via go install); no longer needed","created_by":"jv","deleted_at":"2026-01-08T12:04:05.022631+13:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"bug"}
{"id":"gt-eevs","title":"Digest: mol-deacon-patrol","description":"Patrol #6: Stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:31:47.668271-08:00","updated_at":"2025-12-27T21:26:04.350917-08:00","deleted_at":"2025-12-27T21:26:04.350917-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-eg1es.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. \"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-eg1es\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T01:05:28.606112-08:00","updated_at":"2025-12-27T21:29:55.342914-08:00","deleted_at":"2025-12-27T21:29:55.342914-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-egu","title":"gt refinery attach: Attach to refinery session","description":"Add 'gt refinery attach [rig]' command to attach to refinery tmux session.\n\nMirrors 'gt mayor attach' pattern. If rig not specified, use current rig context.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T21:47:19.164342-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-egu","depends_on_id":"gt-hw6","type":"blocks","created_at":"2025-12-17T22:22:47.578871-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-eh46a","title":"Digest: mol-deacon-patrol","description":"P8: stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:11:44.451875-08:00","updated_at":"2025-12-27T21:26:02.278037-08:00","deleted_at":"2025-12-27T21:26:02.278037-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-eh7p","title":"Digest: mol-deacon-patrol","description":"Patrol #1: All agents healthy, no messages, no orphans","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:28:50.209258-08:00","updated_at":"2025-12-27T21:26:04.392292-08:00","deleted_at":"2025-12-27T21:26:04.392292-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ehjuv","title":"Digest: mol-deacon-patrol","description":"Patrol 20: routine, handoff triggered","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:47:40.181991-08:00","updated_at":"2025-12-27T21:26:00.862944-08:00","deleted_at":"2025-12-27T21:26:00.862944-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-eho9v","title":"Digest: mol-deacon-patrol","description":"Patrol 8: clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:14:26.413184-08:00","updated_at":"2025-12-27T21:26:01.040705-08:00","deleted_at":"2025-12-27T21:26:01.040705-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ekc1f","title":"Merge: rictus-mjxc967h","description":"branch: polecat/rictus-mjxc967h\ntarget: main\nsource_issue: rictus-mjxc967h\nrig: gastown\nagent_bead: gt-gastown-polecat-rictus","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-02T13:05:29.628889-08:00","updated_at":"2026-01-02T13:41:40.394376-08:00","closed_at":"2026-01-02T13:41:40.394376-08:00","close_reason":"Branches merged, cleaning up stale MR beads","created_by":"gastown/polecats/rictus"}
{"id":"gt-ekc5u","title":"gt witness/refinery start should have 'ensure' semantics (kill zombie if needed)","description":"## Problem\n\n`gt witness start \u003crig\u003e` fails if the tmux session already exists:\n```\n⚠ Witness session already running\n Use 'gt witness attach' to connect\n```\n\nBut the session might be a zombie (tmux alive, Claude dead). User has to manually:\n1. `gt witness stop \u003crig\u003e`\n2. `gt witness start \u003crig\u003e`\n\nOr use `gt witness restart \u003crig\u003e` which works but isn't intuitive.\n\n## Proposal\n\n`gt witness start` should have 'ensure running' semantics:\n\n1. If no session exists → create new session\n2. If session exists and healthy → do nothing (already running)\n3. If session exists but zombie → kill and recreate\n\nThis makes `start` idempotent and safe to run repeatedly.\n\n## Implementation\n\n```go\nfunc (w *Witness) Start(rig string) error {\n sessionName := witnessSessionName(rig)\n \n if w.tmux.HasSession(sessionName) {\n if w.isHealthy(sessionName) {\n return nil // Already running and healthy\n }\n // Zombie - kill it\n w.tmux.KillSession(sessionName)\n }\n \n // Create fresh session\n return w.createSession(rig)\n}\n```\n\n## Alternatives\n\n- Keep current behavior, just improve error message: 'Session exists but appears dead, use restart'\n- Add --force flag: `gt witness start --force \u003crig\u003e`\n\n## Recommendation\n\nMake `start` idempotent. It's the principle of least surprise.","status":"closed","priority":2,"issue_type":"feature","created_at":"2026-01-02T18:43:11.913225-08:00","updated_at":"2026-01-02T18:51:48.415341-08:00","closed_at":"2026-01-02T18:51:48.415341-08:00","close_reason":"Implemented ensure semantics in witness.go:ensureWitnessSession and refinery/manager.go:Start using IsClaudeRunning to detect zombies","created_by":"mayor"}
{"id":"gt-ekgxf","title":"Digest: mol-deacon-patrol","description":"Patrol 7: Quick cycle, all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:06:15.060768-08:00","updated_at":"2025-12-27T21:26:03.000866-08:00","deleted_at":"2025-12-27T21:26:03.000866-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-eln2","title":"Digest: mol-deacon-patrol","description":"Patrol OK: 0 polecats, 6 core sessions up, town quiet","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-23T02:11:42.107532-08:00","updated_at":"2025-12-27T21:26:05.380714-08:00","deleted_at":"2025-12-27T21:26:05.380714-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-eo1aa","title":"Digest: mol-deacon-patrol","description":"Patrol 19: all clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T11:06:03.759472-08:00","updated_at":"2025-12-27T21:26:03.347957-08:00","deleted_at":"2025-12-27T21:26:03.347957-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ep7f","title":"survey-workers","description":"List polecats and categorize by status.\n\n```bash\ngt polecat list \u003crig\u003e\n```\n\nNeeds: load-state","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T01:41:54.506038-08:00","updated_at":"2025-12-25T15:52:43.900154-08:00","dependencies":[{"issue_id":"gt-ep7f","depends_on_id":"gt-751s","type":"parent-child","created_at":"2025-12-23T01:41:54.555919-08:00","created_by":"stevey"},{"issue_id":"gt-ep7f","depends_on_id":"gt-z94m","type":"blocks","created_at":"2025-12-23T01:41:54.562822-08:00","created_by":"stevey"}],"deleted_at":"2025-12-25T15:52:43.900154-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-eph-037","title":"mol-deacon-patrol","description":"Mayor's daemon patrol loop.\n\nThe Deacon is the Mayor's background process that runs continuously, handling callbacks, monitoring rig health, and performing cleanup. Each patrol cycle runs these steps in sequence, then loops or exits.\n\n## Second-Order Monitoring\n\nWitnesses send WITNESS_PING messages to verify the Deacon is alive. This\nprevents the \"who watches the watchers\" problem - if the Deacon dies,\nWitnesses detect it and escalate to the Mayor.\n\nThe Deacon's agent bead last_activity timestamp is updated during each patrol\ncycle. Witnesses check this timestamp to verify health.","status":"closed","priority":2,"issue_type":"epic","created_at":"2025-12-31T16:04:06.255281-08:00","updated_at":"2025-12-31T17:11:56.756988-08:00","closed_at":"2025-12-31T17:11:56.756988-08:00"}
{"id":"gt-eph-0424","title":"mol-deacon-patrol","description":"Mayor's daemon patrol loop.\n\nThe Deacon is the Mayor's background process that runs continuously, handling callbacks, monitoring rig health, and performing cleanup. Each patrol cycle runs these steps in sequence, then loops or exits.\n\n## Second-Order Monitoring\n\nWitnesses send WITNESS_PING messages to verify the Deacon is alive. This\nprevents the \"who watches the watchers\" problem - if the Deacon dies,\nWitnesses detect it and escalate to the Mayor.\n\nThe Deacon's agent bead last_activity timestamp is updated during each patrol\ncycle. Witnesses check this timestamp to verify health.","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2026-01-01T10:50:56.178149-08:00","updated_at":"2026-01-01T19:56:51.88239-08:00","deleted_at":"2026-01-01T19:56:51.88239-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"epic"}
{"id":"gt-eph-04p3","title":"mol-deacon-patrol","description":"Mayor's daemon patrol loop.\n\nThe Deacon is the Mayor's background process that runs continuously, handling callbacks, monitoring rig health, and performing cleanup. Each patrol cycle runs these steps in sequence, then loops or exits.\n\n## Idle Town Principle\n\n**The Deacon should be silent/invisible when the town is healthy and idle.**\n\n- Skip HEALTH_CHECK nudges when no active work exists\n- Sleep 60+ seconds between patrol cycles (longer when idle)\n- Let the feed subscription wake agents on actual events\n- The daemon (10-minute heartbeat) is the safety net for dead sessions\n\nThis prevents flooding idle agents with health checks every few seconds.\n\n## Second-Order Monitoring\n\nWitnesses send WITNESS_PING messages to verify the Deacon is alive. This\nprevents the \"who watches the watchers\" problem - if the Deacon dies,\nWitnesses detect it and escalate to the Mayor.\n\nThe Deacon's agent bead last_activity timestamp is updated during each patrol\ncycle. Witnesses check this timestamp to verify health.","status":"hooked","priority":2,"issue_type":"epic","created_at":"2026-01-01T20:37:16.978323-08:00","updated_at":"2026-01-01T20:37:22.84116-08:00"}
{"id":"gt-eph-0hmg","title":"mol-deacon-patrol","description":"Mayor's daemon patrol loop.\n\nThe Deacon is the Mayor's background process that runs continuously, handling callbacks, monitoring rig health, and performing cleanup. Each patrol cycle runs these steps in sequence, then loops or exits.\n\n## Idle Town Principle\n\n**The Deacon should be silent/invisible when the town is healthy and idle.**\n\n- Skip HEALTH_CHECK nudges when no active work exists\n- Sleep 60+ seconds between patrol cycles (longer when idle)\n- Let the feed subscription wake agents on actual events\n- The daemon (10-minute heartbeat) is the safety net for dead sessions\n\nThis prevents flooding idle agents with health checks every few seconds.\n\n## Second-Order Monitoring\n\nWitnesses send WITNESS_PING messages to verify the Deacon is alive. This\nprevents the \"who watches the watchers\" problem - if the Deacon dies,\nWitnesses detect it and escalate to the Mayor.\n\nThe Deacon's agent bead last_activity timestamp is updated during each patrol\ncycle. Witnesses check this timestamp to verify health.","status":"closed","priority":2,"issue_type":"epic","created_at":"2026-01-01T22:40:58.290274-08:00","updated_at":"2026-01-02T17:07:58.997966-08:00","closed_at":"2026-01-02T17:07:58.997966-08:00"}
{"id":"gt-eph-0lwb","title":"mol-deacon-patrol","description":"Mayor's daemon patrol loop.\n\nThe Deacon is the Mayor's background process that runs continuously, handling callbacks, monitoring rig health, and performing cleanup. Each patrol cycle runs these steps in sequence, then loops or exits.\n\n## Second-Order Monitoring\n\nWitnesses send WITNESS_PING messages to verify the Deacon is alive. This\nprevents the \"who watches the watchers\" problem - if the Deacon dies,\nWitnesses detect it and escalate to the Mayor.\n\nThe Deacon's agent bead last_activity timestamp is updated during each patrol\ncycle. Witnesses check this timestamp to verify health.","status":"closed","priority":2,"issue_type":"epic","created_at":"2026-01-01T18:10:47.452051-08:00","updated_at":"2026-01-02T12:40:42.195612-08:00","closed_at":"2026-01-02T12:40:42.195612-08:00","close_reason":"Patrol cycle complete, all steps done"}
{"id":"gt-eph-0nj2","title":"mol-deacon-patrol","description":"Mayor's daemon patrol loop.\n\nThe Deacon is the Mayor's background process that runs continuously, handling callbacks, monitoring rig health, and performing cleanup. Each patrol cycle runs these steps in sequence, then loops or exits.\n\n## Idle Town Principle\n\n**The Deacon should be silent/invisible when the town is healthy and idle.**\n\n- Skip HEALTH_CHECK nudges when no active work exists\n- Sleep 60+ seconds between patrol cycles (longer when idle)\n- Let the feed subscription wake agents on actual events\n- The daemon (10-minute heartbeat) is the safety net for dead sessions\n\nThis prevents flooding idle agents with health checks every few seconds.\n\n## Second-Order Monitoring\n\nWitnesses send WITNESS_PING messages to verify the Deacon is alive. This\nprevents the \"who watches the watchers\" problem - if the Deacon dies,\nWitnesses detect it and escalate to the Mayor.\n\nThe Deacon's agent bead last_activity timestamp is updated during each patrol\ncycle. Witnesses check this timestamp to verify health.","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2026-01-01T19:13:20.706601-08:00","updated_at":"2026-01-04T23:40:58.013453-08:00","close_reason":"Cleanup: stale molecule","deleted_at":"2026-01-01T19:56:52.294984-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"epic"}
{"id":"gt-eph-19t","title":"Handle callbacks from agents","description":"Handle callbacks from agents.\n\nCheck the Mayor's inbox for messages from:\n- Witnesses reporting polecat status\n- Refineries reporting merge results\n- Polecats requesting help or escalation\n- External triggers (webhooks, timers)\n\n```bash\ngt mail inbox\n# For each message:\ngt mail read \u003cid\u003e\n# Handle based on message type\n```\n\n**WITNESS_PING**:\nWitnesses periodically ping to verify Deacon is alive. Simply acknowledge\nand archive - the fact that you're processing mail proves you're running.\nYour agent bead last_activity is updated automatically during patrol.\n```bash\ngt mail archive \u003cmessage-id\u003e\n```\n\n**HELP / Escalation**:\nAssess and handle or forward to Mayor.\nArchive after handling:\n```bash\ngt mail archive \u003cmessage-id\u003e\n```\n\n**LIFECYCLE messages**:\nPolecats reporting completion, refineries reporting merge results.\nArchive after processing:\n```bash\ngt mail archive \u003cmessage-id\u003e\n```\n\n**DOG_DONE messages**:\nDogs report completion after infrastructure tasks (orphan-scan, session-gc, etc.).\nSubject format: `DOG_DONE \u003chostname\u003e`\nBody contains: task name, counts, status.\n```bash\n# Parse the report, log metrics if needed\ngt mail read \u003cid\u003e\n# Archive after noting completion\ngt mail archive \u003cmessage-id\u003e\n```\nDogs return to idle automatically. The report is informational - no action needed\nunless the dog reports errors that require escalation.\n\nCallbacks may spawn new polecats, update issue state, or trigger other actions.\n\n**Hygiene principle**: Archive messages after they're fully processed.\nKeep inbox near-empty - only unprocessed items should remain.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T16:23:58.439855-08:00","updated_at":"2026-01-01T19:56:53.502588-08:00","close_reason":"Stale patrol cleanup","deleted_at":"2026-01-01T19:56:53.502588-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-2wb","title":"Execute registered plugins","description":"Execute registered plugins.\n\nScan ~/gt/plugins/ for plugin directories. Each plugin has a plugin.md with YAML frontmatter defining its gate (when to run) and instructions (what to do).\n\nSee docs/deacon-plugins.md for full documentation.\n\nGate types:\n- cooldown: Time since last run (e.g., 24h)\n- cron: Schedule-based (e.g., \"0 9 * * *\")\n- condition: Metric threshold (e.g., wisp count \u003e 50)\n- event: Trigger-based (e.g., startup, heartbeat)\n\nFor each plugin:\n1. Read plugin.md frontmatter to check gate\n2. Compare against state.json (last run, etc.)\n3. If gate is open, execute the plugin\n\nPlugins marked parallel: true can run concurrently using Task tool subagents. Sequential plugins run one at a time in directory order.\n\nSkip this step if ~/gt/plugins/ does not exist or is empty.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-28T09:54:49.166597-08:00","updated_at":"2026-01-01T19:56:55.318424-08:00","dependencies":[{"issue_id":"gt-eph-2wb","depends_on_id":"gt-eph-5jh","type":"blocks","created_at":"2025-12-28T09:54:49.258993-08:00","created_by":"deacon"}],"deleted_at":"2026-01-01T19:56:55.318424-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-303l","title":"mol-deacon-patrol","description":"Mayor's daemon patrol loop.\n\nThe Deacon is the Mayor's background process that runs continuously, handling callbacks, monitoring rig health, and performing cleanup. Each patrol cycle runs these steps in sequence, then loops or exits.\n\n## Idle Town Principle\n\n**The Deacon should be silent/invisible when the town is healthy and idle.**\n\n- Skip HEALTH_CHECK nudges when no active work exists\n- Sleep 60+ seconds between patrol cycles (longer when idle)\n- Let the feed subscription wake agents on actual events\n- The daemon (10-minute heartbeat) is the safety net for dead sessions\n\nThis prevents flooding idle agents with health checks every few seconds.\n\n## Second-Order Monitoring\n\nWitnesses send WITNESS_PING messages to verify the Deacon is alive. This\nprevents the \"who watches the watchers\" problem - if the Deacon dies,\nWitnesses detect it and escalate to the Mayor.\n\nThe Deacon's agent bead last_activity timestamp is updated during each patrol\ncycle. Witnesses check this timestamp to verify health.","status":"closed","priority":2,"issue_type":"epic","created_at":"2026-01-01T20:02:41.694107-08:00","updated_at":"2026-01-04T23:40:58.041201-08:00","closed_at":"2026-01-04T23:40:58.041201-08:00","close_reason":"Cleanup: stale molecule"}
{"id":"gt-eph-38z6","title":"Burn and respawn or loop","description":"Burn and let daemon respawn, or exit if context high.\n\nDecision point at end of patrol cycle:\n\nIf context is LOW:\n- Sleep briefly (avoid tight loop)\n- Return to inbox-check step\n\nIf context is HIGH:\n- Write state to persistent storage\n- Exit cleanly\n- Let the daemon orchestrator respawn a fresh Deacon\n\nThe daemon ensures Deacon is always running:\n```bash\n# Daemon respawns on exit\ngt daemon status\n```\n\nThis enables infinite patrol duration via context-aware respawning.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2026-01-01T18:10:47.459647-08:00","updated_at":"2026-01-01T19:56:56.386703-08:00","deleted_at":"2026-01-01T19:56:56.386703-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-3n5","title":"mol-deacon-patrol","description":"Mayor's daemon patrol loop.\n\nThe Deacon is the Mayor's background process that runs continuously, handling callbacks, monitoring rig health, and performing cleanup. Each patrol cycle runs these steps in sequence, then loops or exits.\n\n## Second-Order Monitoring\n\nWitnesses send WITNESS_PING messages to verify the Deacon is alive. This\nprevents the \"who watches the watchers\" problem - if the Deacon dies,\nWitnesses detect it and escalate to the Mayor.\n\nThe Deacon's agent bead last_activity timestamp is updated during each patrol\ncycle. Witnesses check this timestamp to verify health.","status":"open","priority":2,"issue_type":"epic","created_at":"2025-12-31T14:15:34.258108-08:00","updated_at":"2025-12-31T14:15:34.258108-08:00"}
{"id":"gt-eph-43s","title":"Find abandoned work","description":"Find abandoned work.\n\nScan for orphaned state:\n- Issues marked in_progress with no active polecat\n- Polecats that stopped responding mid-work\n- Merge queue entries with no polecat owner\n- Wisp sessions that outlived their spawner\n\n```bash\nbd list --status=in_progress\ngt polecats --all --orphan\n```\n\nFor each orphan:\n- Check if polecat session still exists\n- If not, mark issue for reassignment or retry\n- File incident beads if data loss occurred","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-28T09:55:41.719317-08:00","updated_at":"2026-01-01T19:56:57.892882-08:00","close_reason":"Patrol complete. Findings:\n- No in_progress issues without active polecats\n- All 6 active polecats (capable, dementus, furiosa, nux, rictus, slit) are idle with running sessions\n- Closed 3 orphaned merge requests:\n - gt-tpq7i (keeper): branch deleted, no work\n - gt-5o2l0 (cheedo): branch deleted, no work \n - gt-e0p84 (toast): only bd sync commit, no code\n- Garbage collected 39 stale polecat branches","deleted_at":"2026-01-01T19:56:57.892882-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-5c8","title":"Handle callbacks from agents","description":"Handle callbacks from agents.\n\nCheck the Mayor's inbox for messages from:\n- Witnesses reporting polecat status\n- Refineries reporting merge results\n- Polecats requesting help or escalation\n- External triggers (webhooks, timers)\n\n```bash\ngt mail inbox\n# For each message:\ngt mail read \u003cid\u003e\n# Handle based on message type\n```\n\nCallbacks may spawn new polecats, update issue state, or trigger other actions.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-28T09:55:41.717851-08:00","updated_at":"2026-01-01T19:56:59.244854-08:00","close_reason":"Implemented gt callbacks process command with support for POLECAT_DONE, MERGE_COMPLETED/REJECTED, HELP, ESCALATION, SLING_REQUEST, WITNESS_REPORT, and REFINERY_REPORT callbacks. Added EventCallback to townlog for logging.","deleted_at":"2026-01-01T19:56:59.244854-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-5jh","title":"Check Witness and Refinery health","description":"Check Witness and Refinery health for each rig.\n\n**ZFC Principle**: You (Claude) make the judgment call about what is \"stuck\" or \"unresponsive\" - there are no hardcoded thresholds in Go. Read the signals, consider context, and decide.\n\nFor each rig, run:\n```bash\ngt witness status \u003crig\u003e\ngt refinery status \u003crig\u003e\n```\n\n**Signals to assess:**\n\n| Component | Healthy Signals | Concerning Signals |\n|-----------|-----------------|-------------------|\n| Witness | State: running, recent activity | State: not running, no heartbeat |\n| Refinery | State: running, queue processing | Queue stuck, merge failures |\n\n**Tracking unresponsive cycles:**\n\nMaintain in your patrol state (persisted across cycles):\n```\nhealth_state:\n \u003crig\u003e:\n witness:\n unresponsive_cycles: 0\n last_seen_healthy: \u003ctimestamp\u003e\n refinery:\n unresponsive_cycles: 0\n last_seen_healthy: \u003ctimestamp\u003e\n```\n\n**Decision matrix** (you decide the thresholds based on context):\n\n| Cycles Unresponsive | Suggested Action |\n|---------------------|------------------|\n| 1-2 | Note it, check again next cycle |\n| 3-4 | Attempt restart: gt witness restart \u003crig\u003e |\n| 5+ | Escalate to Mayor with context |\n\n**Restart commands:**\n```bash\ngt witness restart \u003crig\u003e\ngt refinery restart \u003crig\u003e\n```\n\n**Escalation:**\n```bash\ngt mail send mayor/ -s \"Health: \u003crig\u003e \u003ccomponent\u003e unresponsive\" \\\n -m \"Component has been unresponsive for N cycles. Restart attempts failed.\n Last healthy: \u003ctimestamp\u003e\n Error signals: \u003cdetails\u003e\"\n```\n\nReset unresponsive_cycles to 0 when component responds normally.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-28T09:54:49.166307-08:00","updated_at":"2026-01-01T19:56:59.81519-08:00","deleted_at":"2026-01-01T19:56:59.81519-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-5vg5","title":"Rotate logs and prune state","description":"Maintain daemon logs and state files.\n\n**Step 1: Check daemon.log size**\n```bash\n# Get log file size\nls -la ~/.beads/daemon*.log 2\u003e/dev/null || ls -la ~/gt/.beads/daemon*.log 2\u003e/dev/null\n```\n\nIf daemon.log exceeds 10MB:\n```bash\n# Rotate with date suffix and gzip\nLOGFILE=\"$HOME/gt/.beads/daemon.log\"\nif [ -f \"$LOGFILE\" ] \u0026\u0026 [ $(stat -f%z \"$LOGFILE\" 2\u003e/dev/null || stat -c%s \"$LOGFILE\") -gt 10485760 ]; then\n DATE=$(date +%Y-%m-%dT%H-%M-%S)\n mv \"$LOGFILE\" \"${LOGFILE%.log}-${DATE}.log\"\n gzip \"${LOGFILE%.log}-${DATE}.log\"\nfi\n```\n\n**Step 2: Archive old daemon logs**\n\nClean up daemon logs older than 7 days:\n```bash\nfind ~/gt/.beads/ -name \"daemon-*.log.gz\" -mtime +7 -delete\n```\n\n**Step 3: Prune state.json of dead sessions**\n\nThe state.json tracks active sessions. Prune entries for sessions that no longer exist:\n```bash\n# Check for stale session entries\ngt daemon status --json 2\u003e/dev/null\n```\n\nIf state.json references sessions not in tmux:\n- Remove the stale entries\n- The daemon's internal cleanup should handle this, but verify\n\n**Note**: Log rotation prevents disk bloat from long-running daemons.\nState pruning keeps runtime state accurate.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2026-01-01T19:42:24.02488-08:00","updated_at":"2026-01-01T19:57:00.476866-08:00","deleted_at":"2026-01-01T19:57:00.476866-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-66hi","title":"Evaluate pending async gates","description":"Evaluate pending async gates.\n\nGates are async coordination primitives that block until conditions are met.\nThe Deacon is responsible for monitoring gates and closing them when ready.\n\n**Timer gates** (await_type: timer):\nCheck if elapsed time since creation exceeds the timeout duration.\n\n```bash\n# List all open gates\nbd gate list --json\n\n# For each timer gate, check if elapsed:\n# - CreatedAt + Timeout \u003c Now → gate is ready to close\n# - Close with: bd gate close \u003cid\u003e --reason \"Timer elapsed\"\n```\n\n**GitHub gates** (await_type: gh:run, gh:pr) - handled in separate step.\n\n**Human/Mail gates** - require external input, skip here.\n\nAfter closing a gate, the Waiters field contains mail addresses to notify.\nSend a brief notification to each waiter that the gate has cleared.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2026-01-01T18:10:47.454062-08:00","updated_at":"2026-01-01T19:54:02.722822-08:00","deleted_at":"2026-01-01T19:54:02.722822-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-6e6","title":"Clean dead sessions","description":"Clean dead sessions and orphaned state.\n\nRun `gt doctor --fix` to handle all cleanup:\n\n```bash\n# Preview what needs cleaning\ngt doctor -v\n\n# Fix everything\ngt doctor --fix\n```\n\nThis handles:\n- **orphan-sessions**: Kill orphaned tmux sessions (gt-* not matching valid patterns)\n- **orphan-processes**: Kill orphaned Claude processes (no tmux parent)\n- **wisp-gc**: Garbage collect abandoned wisps (\u003e1h old)\n\nAll cleanup is handled by doctor checks - no need to run separate commands.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-28T09:54:49.167138-08:00","updated_at":"2026-01-01T19:57:00.749821-08:00","deleted_at":"2026-01-01T19:57:00.749821-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-6j9","title":"mol-refinery-patrol","description":"Merge queue processor patrol loop.\n\nThe Refinery is the Engineer in the engine room. You process polecat branches, merging them to main one at a time with sequential rebasing.\n\n**The Scotty Test**: Before proceeding past any failure, ask yourself: \"Would Scotty walk past a warp core leak because it existed before his shift?\"\n\n## Merge Flow\n\nThe Refinery receives MERGE_READY mail from Witnesses when polecats complete work:\n\n```\nWitness Refinery Git\n │ │ │\n │ MERGE_READY │ │\n │─────────────────────────\u003e│ │\n │ │ │\n │ (verify branch) │\n │ │ fetch \u0026 rebase │\n │ │──────────────────────────\u003e│\n │ │ │\n │ (run tests) │\n │ │ │\n │ (if pass) │\n │ │ merge \u0026 push │\n │ │──────────────────────────\u003e│\n │ │ │\n │ MERGED │ │\n │\u003c─────────────────────────│ │\n │ │ │\n```\n\nAfter successful merge, Refinery sends MERGED mail back to Witness so it can\ncomplete cleanup (nuke the polecat worktree).","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-30T00:54:37.561294-08:00","updated_at":"2026-01-04T23:40:58.103506-08:00","close_reason":"Cleanup: stale molecule","deleted_at":"2026-01-01T19:57:01.020687-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"epic"}
{"id":"gt-eph-6jo7","title":"Check convoy completion","description":"Check convoy completion status.\n\nConvoys are coordination beads that track multiple issues across rigs. When all tracked issues close, the convoy auto-closes.\n\n**Step 1: Find open convoys**\n```bash\nbd list --type=convoy --status=open\n```\n\n**Step 2: For each open convoy, check tracked issues**\n```bash\nbd show \u003cconvoy-id\u003e\n# Look for 'tracks' or 'dependencies' field listing tracked issues\n```\n\n**Step 3: If all tracked issues are closed, close the convoy**\n```bash\n# Check each tracked issue\nfor issue in tracked_issues:\n bd show \u003cissue-id\u003e\n # If status is open/in_progress, convoy stays open\n # If all are closed (completed, wontfix, etc.), convoy is complete\n\n# Close convoy when all tracked issues are done\nbd close \u003cconvoy-id\u003e --reason \"All tracked issues completed\"\n```\n\n**Note**: Convoys support cross-prefix tracking (e.g., hq-* convoy can track gt-*, bd-* issues). Use full IDs when checking.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2026-01-01T18:10:47.454539-08:00","updated_at":"2026-01-01T19:57:01.163019-08:00","close_reason":"No gates/convoys/nudges","deleted_at":"2026-01-01T19:57:01.163019-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-6m7i","title":"Detect abandoned work","description":"**DETECT ONLY** - Check for orphaned state and dispatch to dog if found.\n\n**Step 1: Quick orphan scan**\n```bash\n# Check for in_progress issues with dead assignees\nbd list --status=in_progress --json | head -20\n```\n\nFor each in_progress issue, check if assignee session exists:\n```bash\ntmux has-session -t \u003csession\u003e 2\u003e/dev/null \u0026\u0026 echo \"alive\" || echo \"orphan\"\n```\n\n**Step 2: If orphans detected, dispatch to dog**\n```bash\n# Sling orphan-scan formula to an idle dog\ngt sling mol-orphan-scan deacon/dogs --var scope=town\n```\n\n**Important:** Do NOT fix orphans inline. Dogs handle recovery.\nThe Deacon's job is detection and dispatch, not execution.\n\n**Step 3: If no orphans detected**\nSkip dispatch - nothing to do.\n\n**Exit criteria:** Orphan scan dispatched to dog (if needed).","status":"tombstone","priority":2,"issue_type":"task","created_at":"2026-01-01T19:42:24.023956-08:00","updated_at":"2026-01-01T19:57:01.428853-08:00","deleted_at":"2026-01-01T19:57:01.428853-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-6vlp","title":"Backup check for zombie polecats","description":"Defense-in-depth check for zombie polecats that Witness should have cleaned.\n\n**Why this exists:**\nThe Witness is responsible for nuking polecats after they complete work (via POLECAT_DONE).\nThis step provides backup detection in case the Witness fails to clean up.\n\n**Zombie criteria:**\n- State: idle or done (no active work assigned)\n- Session: not running (tmux session dead)\n- No hooked work (nothing pending for this polecat)\n- Last activity: older than 10 minutes\n\n**Run the zombie scan:**\n```bash\ngt deacon zombie-scan --dry-run\n```\n\n**If zombies detected:**\n1. Review the output to confirm they are truly abandoned\n2. Run without --dry-run to nuke them:\n ```bash\n gt deacon zombie-scan\n ```\n3. This will:\n - Nuke each zombie polecat\n - Notify the Mayor about Witness failure\n - Log the cleanup action\n\n**If no zombies:**\nNo action needed - Witness is doing its job.\n\n**Note:** This is a backup mechanism. If you frequently find zombies,\ninvestigate why the Witness isn't cleaning up properly.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2026-01-01T19:42:24.022557-08:00","updated_at":"2026-01-01T19:57:01.979131-08:00","close_reason":"Closed","deleted_at":"2026-01-01T19:57:01.979131-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-6yvi","title":"Burn and respawn or loop","description":"Burn and let daemon respawn, or exit if context high.\n\nDecision point at end of patrol cycle:\n\nIf context is LOW:\n- **Sleep 60 seconds minimum** before next patrol cycle\n- If town is idle (no in_progress work), sleep longer (2-5 minutes)\n- Return to inbox-check step\n\n**Why longer sleep?**\n- Idle agents should not be disturbed\n- Health checks every few seconds flood inboxes and waste context\n- The daemon (10-minute heartbeat) is the safety net for dead sessions\n- Active work triggers feed events, which wake agents naturally\n\nIf context is HIGH:\n- Write state to persistent storage\n- Exit cleanly\n- Let the daemon orchestrator respawn a fresh Deacon\n\nThe daemon ensures Deacon is always running:\n```bash\n# Daemon respawns on exit\ngt daemon status\n```\n\nThis enables infinite patrol duration via context-aware respawning.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2026-01-01T19:42:24.026286-08:00","updated_at":"2026-01-01T19:57:02.238284-08:00","deleted_at":"2026-01-01T19:57:02.238284-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-7yep","title":"Maintain dog pool","description":"Ensure dog pool has available workers for dispatch.\n\n**Step 1: Check dog pool status**\n```bash\ngt dog status\n# Shows idle/working counts\n```\n\n**Step 2: Ensure minimum idle dogs**\nIf idle count is 0 and working count is at capacity, consider spawning:\n```bash\n# If no idle dogs available\ngt dog add \u003cname\u003e\n# Names: alpha, bravo, charlie, delta, etc.\n```\n\n**Step 3: Retire stale dogs (optional)**\nDogs that have been idle for \u003e24 hours can be removed to save resources:\n```bash\ngt dog status \u003cname\u003e\n# Check last_active timestamp\n# If idle \u003e 24h: gt dog remove \u003cname\u003e\n```\n\n**Pool sizing guidelines:**\n- Minimum: 1 idle dog always available\n- Maximum: 4 dogs total (balance resources vs throughput)\n- Spawn on demand when pool is empty\n\n**Exit criteria:** Pool has at least 1 idle dog.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2026-01-01T18:10:47.456881-08:00","updated_at":"2026-01-01T19:57:03.34846-08:00","close_reason":"Dog pool healthy, no plugins","dependencies":[{"issue_id":"gt-eph-7yep","depends_on_id":"gt-eph-a0sw","type":"blocks","created_at":"2026-01-01T18:10:47.578617-08:00","created_by":"deacon"}],"deleted_at":"2026-01-01T19:57:03.34846-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-8fu","title":"Nudge newly spawned polecats","description":"Nudge newly spawned polecats that are ready for input.\n\nWhen polecats are spawned, their Claude session takes 10-20 seconds to initialize. The spawn command returns immediately without waiting. This step finds spawned polecats that are now ready and sends them a trigger to start working.\n\n**ZFC-Compliant Observation** (AI observes AI):\n\n```bash\n# View pending spawns with captured terminal output\ngt deacon pending\n```\n\nFor each pending session, analyze the captured output:\n- Look for Claude's prompt indicator \"\u003e \" at the start of a line\n- If prompt is visible, Claude is ready for input\n- Make the judgment call yourself - you're the AI observer\n\nFor each ready polecat:\n```bash\n# 1. Trigger the polecat\ngt nudge \u003csession\u003e \"Begin.\"\n\n# 2. Clear from pending list\ngt deacon pending \u003csession\u003e\n```\n\nThis triggers the UserPromptSubmit hook, which injects mail so the polecat sees its assignment.\n\n**Bootstrap mode** (daemon-only, no AI available):\nThe daemon uses `gt deacon trigger-pending` with regex detection. This ZFC violation is acceptable during cold startup when no AI agent is running yet.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-28T09:55:41.718169-08:00","updated_at":"2026-01-01T19:57:03.901098-08:00","close_reason":"Checked for pending spawns - none found. All polecats in gastown are already in 'working' state.","dependencies":[{"issue_id":"gt-eph-8fu","depends_on_id":"gt-eph-5c8","type":"blocks","created_at":"2025-12-28T09:55:41.782453-08:00","created_by":"deacon"}],"deleted_at":"2026-01-01T19:57:03.901098-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-8gb","title":"mol-deacon-patrol","description":"Mayor's daemon patrol loop.\n\nThe Deacon is the Mayor's background process that runs continuously, handling callbacks, monitoring rig health, and performing cleanup. Each patrol cycle runs these steps in sequence, then loops or exits.\n\n## Second-Order Monitoring\n\nWitnesses send WITNESS_PING messages to verify the Deacon is alive. This\nprevents the \"who watches the watchers\" problem - if the Deacon dies,\nWitnesses detect it and escalate to the Mayor.\n\nThe Deacon's agent bead last_activity timestamp is updated during each patrol\ncycle. Witnesses check this timestamp to verify health.","status":"hooked","priority":2,"issue_type":"epic","created_at":"2025-12-31T17:06:11.401358-08:00","updated_at":"2025-12-31T17:06:16.037925-08:00"}
{"id":"gt-eph-9cw9","title":"End-of-cycle inbox hygiene","description":"Verify inbox hygiene before ending patrol cycle.\n\n**Step 1: Check inbox state**\n```bash\ngt mail inbox\n```\n\nInbox should be EMPTY or contain only just-arrived unprocessed messages.\n\n**Step 2: Archive any remaining processed messages**\n\nAll message types should have been archived during inbox-check processing:\n- WITNESS_PING → archived after acknowledging\n- HELP/Escalation → archived after handling\n- LIFECYCLE → archived after processing\n\nIf any were missed:\n```bash\n# For each stale message found:\ngt mail archive \u003cmessage-id\u003e\n```\n\n**Goal**: Inbox should have ≤2 active messages at end of cycle.\nDeacon mail should flow through quickly - no accumulation.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2026-01-01T19:55:35.507662-08:00","updated_at":"2026-01-01T19:57:05.0927-08:00","deleted_at":"2026-01-01T19:57:05.0927-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-9mqc","title":"Resolve external dependencies","description":"Resolve external dependencies across rigs.\n\nWhen an issue in one rig closes, any dependencies in other rigs should be notified. This enables cross-rig coordination without tight coupling.\n\n**Step 1: Check recent closures from feed**\n```bash\ngt feed --since 10m --plain | grep \"✓\"\n# Look for recently closed issues\n```\n\n**Step 2: For each closed issue, check cross-rig dependents**\n```bash\nbd show \u003cclosed-issue\u003e\n# Look at 'blocks' field - these are issues that were waiting on this one\n# If any blocked issue is in a different rig/prefix, it may now be unblocked\n```\n\n**Step 3: Update blocked status**\nFor blocked issues in other rigs, the closure should automatically unblock them (beads handles this). But verify:\n```bash\nbd blocked\n# Should no longer show the previously-blocked issue if dependency is met\n```\n\n**Cross-rig scenarios:**\n- bd-xxx closes → gt-yyy that depended on it is unblocked\n- External issue closes → internal convoy step can proceed\n- Rig A issue closes → Rig B issue waiting on it proceeds\n\nNo manual intervention needed if dependencies are properly tracked - this step just validates the propagation occurred.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2026-01-01T19:42:24.021151-08:00","updated_at":"2026-01-01T19:57:05.748689-08:00","close_reason":"Closed","deleted_at":"2026-01-01T19:57:05.748689-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-a0sw","title":"Check Witness and Refinery health","description":"Check Witness and Refinery health for each rig.\n\n**ZFC Principle**: You (Claude) make the judgment call about what is \"stuck\" or \"unresponsive\" - there are no hardcoded thresholds in Go. Read the signals, consider context, and decide.\n\nFor each rig, run:\n```bash\ngt witness status \u003crig\u003e\ngt refinery status \u003crig\u003e\n\n# Health ping (clears backoff as side effect)\ngt nudge \u003crig\u003e/witness 'HEALTH_CHECK from deacon'\ngt nudge \u003crig\u003e/refinery 'HEALTH_CHECK from deacon'\n```\n\n**Health Ping Benefit**: The nudge commands serve dual purposes:\n1. **Liveness verification** - Agent responds to prove it's alive\n2. **Backoff reset** - Any nudge resets agent's backoff to base interval\n\nThis ensures patrol agents remain responsive even during quiet periods when the\nfeed has no mutations. Deacon patrols every ~1-2 minutes, so maximum backoff\nis bounded by the ping interval.\n\n**Signals to assess:**\n\n| Component | Healthy Signals | Concerning Signals |\n|-----------|-----------------|-------------------|\n| Witness | State: running, recent activity | State: not running, no heartbeat |\n| Refinery | State: running, queue processing | Queue stuck, merge failures |\n\n**Tracking unresponsive cycles:**\n\nMaintain in your patrol state (persisted across cycles):\n```\nhealth_state:\n \u003crig\u003e:\n witness:\n unresponsive_cycles: 0\n last_seen_healthy: \u003ctimestamp\u003e\n refinery:\n unresponsive_cycles: 0\n last_seen_healthy: \u003ctimestamp\u003e\n```\n\n**Decision matrix** (you decide the thresholds based on context):\n\n| Cycles Unresponsive | Suggested Action |\n|---------------------|------------------|\n| 1-2 | Note it, check again next cycle |\n| 3-4 | Attempt restart: gt witness restart \u003crig\u003e |\n| 5+ | Escalate to Mayor with context |\n\n**Restart commands:**\n```bash\ngt witness restart \u003crig\u003e\ngt refinery restart \u003crig\u003e\n```\n\n**Escalation:**\n```bash\ngt mail send mayor/ -s \"Health: \u003crig\u003e \u003ccomponent\u003e unresponsive\" \\\n -m \"Component has been unresponsive for N cycles. Restart attempts failed.\n Last healthy: \u003ctimestamp\u003e\n Error signals: \u003cdetails\u003e\"\n```\n\nReset unresponsive_cycles to 0 when component responds normally.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2026-01-01T18:10:47.455939-08:00","updated_at":"2026-01-01T19:57:06.819443-08:00","close_reason":"All healthy, pings sent","dependencies":[{"issue_id":"gt-eph-a0sw","depends_on_id":"gt-eph-66hi","type":"blocks","created_at":"2026-01-01T18:10:47.562616-08:00","created_by":"deacon"},{"issue_id":"gt-eph-a0sw","depends_on_id":"gt-eph-g8u0","type":"blocks","created_at":"2026-01-01T18:10:47.5679-08:00","created_by":"deacon"}],"deleted_at":"2026-01-01T19:57:06.819443-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-bf1","title":"Check own context limit","description":"Check own context limit.\n\nThe Deacon runs in a Claude session with finite context. Check if approaching the limit:\n\n```bash\ngt context --usage\n```\n\nIf context is high (\u003e80%), prepare for handoff:\n- Summarize current state\n- Note any pending work\n- Write handoff to molecule state\n\nThis enables the Deacon to burn and respawn cleanly.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-28T09:55:41.719859-08:00","updated_at":"2026-01-01T19:57:08.3076-08:00","close_reason":"Context check: moderate usage, no handoff needed. gt context --usage not yet implemented.","deleted_at":"2026-01-01T19:57:08.3076-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-du1","title":"mol-deacon-patrol","description":"Mayor's daemon patrol loop.\n\nThe Deacon is the Mayor's background process that runs continuously, handling callbacks, monitoring rig health, and performing cleanup. Each patrol cycle runs these steps in sequence, then loops or exits.","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-27T20:25:05.95823-08:00","updated_at":"2026-01-01T19:57:10.826354-08:00","deleted_at":"2026-01-01T19:57:10.826354-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"epic"}
{"id":"gt-eph-e4e9","title":"Check Witness and Refinery health","description":"Check Witness and Refinery health for each rig.\n\n**IMPORTANT: Idle Town Protocol**\nBefore sending health check nudges, check if the town is idle:\n```bash\n# Check for active work\nbd list --status=in_progress --limit=5\n```\n\nIf NO active work (empty result or only patrol molecules):\n- **Skip HEALTH_CHECK nudges** - don't disturb idle agents\n- Just verify sessions exist via status commands\n- The town should be silent when healthy and idle\n\nIf ACTIVE work exists:\n- Proceed with health check nudges below\n\n**ZFC Principle**: You (Claude) make the judgment call about what is \"stuck\" or \"unresponsive\" - there are no hardcoded thresholds in Go. Read the signals, consider context, and decide.\n\nFor each rig, run:\n```bash\ngt witness status \u003crig\u003e\ngt refinery status \u003crig\u003e\n\n# ONLY if active work exists - health ping (clears backoff as side effect)\ngt nudge \u003crig\u003e/witness 'HEALTH_CHECK from deacon'\ngt nudge \u003crig\u003e/refinery 'HEALTH_CHECK from deacon'\n```\n\n**Health Ping Benefit**: The nudge commands serve dual purposes:\n1. **Liveness verification** - Agent responds to prove it's alive\n2. **Backoff reset** - Any nudge resets agent's backoff to base interval\n\nThis ensures patrol agents remain responsive during active work periods.\n\n**Signals to assess:**\n\n| Component | Healthy Signals | Concerning Signals |\n|-----------|-----------------|-------------------|\n| Witness | State: running, recent activity | State: not running, no heartbeat |\n| Refinery | State: running, queue processing | Queue stuck, merge failures |\n\n**Tracking unresponsive cycles:**\n\nMaintain in your patrol state (persisted across cycles):\n```\nhealth_state:\n \u003crig\u003e:\n witness:\n unresponsive_cycles: 0\n last_seen_healthy: \u003ctimestamp\u003e\n refinery:\n unresponsive_cycles: 0\n last_seen_healthy: \u003ctimestamp\u003e\n```\n\n**Decision matrix** (you decide the thresholds based on context):\n\n| Cycles Unresponsive | Suggested Action |\n|---------------------|------------------|\n| 1-2 | Note it, check again next cycle |\n| 3-4 | Attempt restart: gt witness restart \u003crig\u003e |\n| 5+ | Escalate to Mayor with context |\n\n**Restart commands:**\n```bash\ngt witness restart \u003crig\u003e\ngt refinery restart \u003crig\u003e\n```\n\n**Escalation:**\n```bash\ngt mail send mayor/ -s \"Health: \u003crig\u003e \u003ccomponent\u003e unresponsive\" \\\n -m \"Component has been unresponsive for N cycles. Restart attempts failed.\n Last healthy: \u003ctimestamp\u003e\n Error signals: \u003cdetails\u003e\"\n```\n\nReset unresponsive_cycles to 0 when component responds normally.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2026-01-01T19:55:35.504397-08:00","updated_at":"2026-01-01T19:57:11.641471-08:00","close_reason":"Closed","deleted_at":"2026-01-01T19:57:11.641471-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-ereg","title":"mol-deacon-patrol","description":"Mayor's daemon patrol loop.\n\nThe Deacon is the Mayor's background process that runs continuously, handling callbacks, monitoring rig health, and performing cleanup. Each patrol cycle runs these steps in sequence, then loops or exits.\n\n## Idle Town Principle\n\n**The Deacon should be silent/invisible when the town is healthy and idle.**\n\n- Skip HEALTH_CHECK nudges when no active work exists\n- Sleep 60+ seconds between patrol cycles (longer when idle)\n- Let the feed subscription wake agents on actual events\n- The daemon (10-minute heartbeat) is the safety net for dead sessions\n\nThis prevents flooding idle agents with health checks every few seconds.\n\n## Second-Order Monitoring\n\nWitnesses send WITNESS_PING messages to verify the Deacon is alive. This\nprevents the \"who watches the watchers\" problem - if the Deacon dies,\nWitnesses detect it and escalate to the Mayor.\n\nThe Deacon's agent bead last_activity timestamp is updated during each patrol\ncycle. Witnesses check this timestamp to verify health.","status":"closed","priority":2,"issue_type":"epic","created_at":"2026-01-01T20:04:41.460989-08:00","updated_at":"2026-01-04T23:40:58.206679-08:00","closed_at":"2026-01-04T23:40:58.206679-08:00","close_reason":"Cleanup: stale molecule"}
{"id":"gt-eph-g8u0","title":"Fire notifications","description":"Fire notifications for convoy and cross-rig events.\n\nAfter convoy completion or cross-rig dependency resolution, notify relevant parties.\n\n**Convoy completion notifications:**\nWhen a convoy closes (all tracked issues done), notify the Overseer:\n```bash\n# Convoy gt-convoy-xxx just completed\ngt mail send mayor/ -s \"Convoy complete: \u003cconvoy-title\u003e\" \\\n -m \"Convoy \u003cid\u003e has completed. All tracked issues closed.\n Duration: \u003cstart to end\u003e\n Issues: \u003ccount\u003e\n\n Summary: \u003cbrief description of what was accomplished\u003e\"\n```\n\n**Cross-rig resolution notifications:**\nWhen a cross-rig dependency resolves, notify the affected rig:\n```bash\n# Issue bd-xxx closed, unblocking gt-yyy\ngt mail send gastown/witness -s \"Dependency resolved: \u003cbd-xxx\u003e\" \\\n -m \"External dependency bd-xxx has closed.\n Unblocked: gt-yyy (\u003ctitle\u003e)\n This issue may now proceed.\"\n```\n\n**Notification targets:**\n- Convoy complete → mayor/ (for strategic visibility)\n- Cross-rig dep resolved → \u003crig\u003e/witness (for operational awareness)\n\nKeep notifications brief and actionable. The recipient can run bd show for details.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2026-01-01T18:10:47.455472-08:00","updated_at":"2026-01-01T19:57:14.79383-08:00","close_reason":"No notifications","deleted_at":"2026-01-01T19:57:14.79383-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-odxs","title":"mol-refinery-patrol","description":"Merge queue processor patrol loop.\n\nThe Refinery is the Engineer in the engine room. You process polecat branches, merging them to main one at a time with sequential rebasing.\n\n**The Scotty Test**: Before proceeding past any failure, ask yourself: \"Would Scotty walk past a warp core leak because it existed before his shift?\"\n\n## Merge Flow\n\nThe Refinery receives MERGE_READY mail from Witnesses when polecats complete work:\n\n```\nWitness Refinery Git\n │ │ │\n │ MERGE_READY │ │\n │─────────────────────────\u003e│ │\n │ │ │\n │ (verify branch) │\n │ │ fetch \u0026 rebase │\n │ │──────────────────────────\u003e│\n │ │ │\n │ (run tests) │\n │ │ │\n │ (if pass) │\n │ │ merge \u0026 push │\n │ │──────────────────────────\u003e│\n │ │ │\n │ MERGED │ │\n │\u003c─────────────────────────│ │\n │ │ │\n```\n\nAfter successful merge, Refinery sends MERGED mail back to Witness so it can\ncomplete cleanup (nuke the polecat worktree).","status":"closed","priority":2,"issue_type":"epic","created_at":"2026-01-01T23:42:38.731614-08:00","updated_at":"2026-01-02T17:07:58.911755-08:00","closed_at":"2026-01-02T17:07:58.911755-08:00"}
{"id":"gt-eph-oeek","title":"Execute registered plugins","description":"Execute registered plugins.\n\nScan ~/gt/plugins/ for plugin directories. Each plugin has a plugin.md with TOML frontmatter defining its gate (when to run) and instructions (what to do).\n\nSee docs/deacon-plugins.md for full documentation.\n\nGate types:\n- cooldown: Time since last run (e.g., 24h)\n- cron: Schedule-based (e.g., \"0 9 * * *\")\n- condition: Metric threshold (e.g., wisp count \u003e 50)\n- event: Trigger-based (e.g., startup, heartbeat)\n\nFor each plugin:\n1. Read plugin.md frontmatter to check gate\n2. Compare against state.json (last run, etc.)\n3. If gate is open, execute the plugin\n\nPlugins marked parallel: true can run concurrently using Task tool subagents. Sequential plugins run one at a time in directory order.\n\nSkip this step if ~/gt/plugins/ does not exist or is empty.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2026-01-01T19:55:35.505345-08:00","updated_at":"2026-01-01T19:57:24.188413-08:00","deleted_at":"2026-01-01T19:57:24.188413-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eph-prb","title":"mol-deacon-patrol","description":"Mayor's daemon patrol loop.\n\nThe Deacon is the Mayor's background process that runs continuously, handling callbacks, monitoring rig health, and performing cleanup. Each patrol cycle runs these steps in sequence, then loops or exits.\n\n## Second-Order Monitoring\n\nWitnesses send WITNESS_PING messages to verify the Deacon is alive. This\nprevents the \"who watches the watchers\" problem - if the Deacon dies,\nWitnesses detect it and escalate to the Mayor.\n\nThe Deacon's agent bead last_activity timestamp is updated during each patrol\ncycle. Witnesses check this timestamp to verify health.","status":"closed","priority":2,"issue_type":"epic","created_at":"2025-12-28T13:10:43.794259-08:00","updated_at":"2026-01-04T23:40:58.372348-08:00","closed_at":"2026-01-04T23:40:58.372348-08:00","close_reason":"Cleanup: stale molecule","deleted_at":"2026-01-01T19:57:25.912958-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"epic"}
{"id":"gt-eph-y1dg","title":"Detect cleanup needs","description":"**DETECT ONLY** - Check if cleanup is needed and dispatch to dog.\n\n**Step 1: Preview cleanup needs**\n```bash\ngt doctor -v\n# Check output for issues that need cleaning\n```\n\n**Step 2: If cleanup needed, dispatch to dog**\n```bash\n# Sling session-gc formula to an idle dog\ngt sling mol-session-gc deacon/dogs --var mode=conservative\n```\n\n**Important:** Do NOT run `gt doctor --fix` inline. Dogs handle cleanup.\nThe Deacon stays lightweight - detection only.\n\n**Step 3: If nothing to clean**\nSkip dispatch - system is healthy.\n\n**Cleanup types (for reference):**\n- orphan-sessions: Dead tmux sessions\n- orphan-processes: Orphaned Claude processes\n- wisp-gc: Old wisps past retention\n\n**Exit criteria:** Session GC dispatched to dog (if needed).","status":"tombstone","priority":2,"issue_type":"task","created_at":"2026-01-01T19:42:24.024418-08:00","updated_at":"2026-01-01T19:57:38.18529-08:00","deleted_at":"2026-01-01T19:57:38.18529-08:00","deleted_by":"gastown/crew/gus","delete_reason":"delete","original_type":"task"}
{"id":"gt-eqys","title":"gt spawn: pasted work assignment needs manual Enter to start","description":"## Problem\n\nAfter `gt spawn` pastes the work assignment into Claude, the session waits for Enter.\n\n## Current Behavior\n\n1. `gt spawn gastown/Rictus --issue gt-xxx`\n2. Session starts, work is pasted\n3. Claude shows 'Pasted text #1 +53 lines' but doesn't start\n4. Must manually send Enter or attach and press Enter\n\n## Expected\n\nThe spawn should send Enter after pasting to kick off the work.\n\n## Related\n\nSame debounce issue as tmux notifications (gt-vnp9).","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-18T21:54:14.111101-08:00","updated_at":"2025-12-27T21:29:57.060019-08:00","deleted_at":"2025-12-27T21:29:57.060019-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-er0u","title":"Work on ga-yp3: Polecat inbox system for reliable work as...","description":"Work on ga-yp3: Polecat inbox system for reliable work assignment. This is P1 priority. See bd show ga-yp3 for full design spec.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-19T21:57:34.473056-08:00","updated_at":"2025-12-27T21:29:56.915155-08:00","deleted_at":"2025-12-27T21:29:56.915155-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-es1i","title":"Polecat health check loop","description":"Implement the Witness health check loop:\n\nEvery 60 seconds:\n1. List active polecats (gt polecats \u003crig\u003e)\n2. For each polecat, check:\n - Is tmux session still alive?\n - Is there a state.json with recent update?\n - Has there been git activity recently?\n3. If stuck (no activity for configurable threshold):\n - First: nudge (send message asking for status)\n - Second: escalate to Mayor\n - Third: force kill after human confirmation\n\nThis is the 'health monitor' part of Witness.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-20T03:14:21.634932-08:00","updated_at":"2025-12-27T21:29:53.814948-08:00","dependencies":[{"issue_id":"gt-es1i","depends_on_id":"gt-53w6","type":"parent-child","created_at":"2025-12-20T03:14:37.235596-08:00","created_by":"daemon"},{"issue_id":"gt-es1i","depends_on_id":"gt-mxyj","type":"blocks","created_at":"2025-12-20T03:14:38.829218-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.814948-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-esn0p","title":"Digest: mol-deacon-patrol","description":"Patrol 15: Quiet","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-25T07:28:41.665377-08:00","updated_at":"2025-12-25T07:28:41.665377-08:00","closed_at":"2025-12-25T07:28:41.66534-08:00"}
{"id":"gt-et72q","title":"Witness template uses non-existent --assignee flag on bd mol wisp","description":"## Bug\n\nThe witness.md.tmpl template uses an invalid flag:\n\n```\nbd mol wisp mol-witness-patrol --assignee={{ .RigName }}/witness\n```\n\nBut `bd mol wisp` does NOT support the `--assignee` flag (only `--dry-run`, `--var`, and global flags).\n\n## Location\n\n`internal/templates/roles/witness.md.tmpl:142`\n\n## Fix\n\nUse the two-step pattern like the deacon template does:\n\n```bash\nbd mol wisp create mol-witness-patrol\nbd update \u003cwisp-id\u003e --status=pinned --assignee={{ .RigName }}/witness\n```\n\n## Impact\n\nWitness agents following the template will get a CLI error when trying to create patrol wisps.","status":"tombstone","priority":0,"issue_type":"bug","created_at":"2025-12-27T17:12:33.852118-08:00","updated_at":"2025-12-27T21:29:45.166788-08:00","created_by":"gastown/crew/joe","deleted_at":"2025-12-27T21:29:45.166788-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-eu9","title":"Witness session cycling and handoff","description":"Add session cycling and handoff protocol to Witness CLAUDE.md template.\n\n## Session Cycling Protocol\n\n```markdown\n## Session Cycling\n\nYour context will fill over long swarms. Proactively cycle when:\n- Running for many hours\n- Losing track of which workers you've checked\n- Responses getting slower\n- About to start complex operation\n\n### Handoff Protocol\n\n1. **Capture current state**:\n```bash\ntown list . # Worker states\ntown all beads # Pending verifications \ntown inbox # Unprocessed messages\n```\n\n2. **Compose handoff note**:\n```\n[HANDOFF_TYPE]: witness_cycle\n[TIMESTAMP]: \u003cnow\u003e\n[RIG]: \u003crig\u003e\n\n## Active Workers\n\u003clist workers and status\u003e\n\n## Pending Verifications\n\u003cworkers signaled done but not verified\u003e\n\n## Recent Actions\n\u003clast 3-5 actions\u003e\n\n## Warnings/Notes\n\u003canything next session should know\u003e\n\n## Next Steps\n\u003cwhat should happen next\u003e\n```\n\n3. **Send handoff**:\n```bash\ntown mail send \u003crig\u003e/witness -s \"Session Handoff\" -m \"\u003cnote\u003e\"\n```\n\n4. **Exit cleanly**: End session, daemon spawns fresh one.\n\n### On Fresh Session Start\n\n1. Check for handoff: `town inbox | grep \"Session Handoff\"`\n2. If found, read it and resume from handoff state\n3. If not found, do full status check\n```\n\n## Implementation\n\nAdd to WITNESS_CLAUDE.md template.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T19:48:55.484911-08:00","updated_at":"2025-12-27T21:29:54.581131-08:00","dependencies":[{"issue_id":"gt-eu9","depends_on_id":"gt-82y","type":"blocks","created_at":"2025-12-15T19:49:05.846443-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.581131-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-euiap","title":"Digest: mol-deacon-patrol","description":"Patrol 1: All agents healthy (Mayor, 2 Witnesses, 2 Refineries), no callbacks, no orphans","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:44:09.814515-08:00","updated_at":"2025-12-27T21:26:03.853852-08:00","deleted_at":"2025-12-27T21:26:03.853852-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-evtm5","title":"Digest: mol-deacon-patrol","description":"Patrol 3: Quiet, all agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T07:25:05.193077-08:00","updated_at":"2025-12-27T21:26:03.668491-08:00","deleted_at":"2025-12-27T21:26:03.668491-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ewn40","title":"Session ended: gt-gastown-crew-george","status":"closed","priority":2,"issue_type":"event","created_at":"2026-01-03T11:37:59.414936-08:00","updated_at":"2026-01-04T16:41:00.363791-08:00","closed_at":"2026-01-04T16:41:00.363791-08:00","close_reason":"Archived session telemetry","created_by":"gastown/crew/george"}
{"id":"gt-ewzon","title":"Implement town activity logging","description":"Add centralized logging for Gas Town agent lifecycle events.\n\n## Events to Log\n- spawn: new agent created\n- wake: agent resumed \n- nudge: message injected\n- handoff: agent handed off (intentional restart)\n- done: agent finished work\n- crash: agent exited unexpectedly\n- kill: agent killed intentionally\n\n## Implementation\n- Add internal/log/town.go with LogEvent() function\n- Log to ~/gt/logs/town.log or .beads/town.log\n- Add gt log command to view/tail/filter logs\n- Include timestamps, agent identity, context\n\n## Log Format\n```\n2025-12-26 15:30:45 [spawn] gastown/crew/max spawned for gt-xyz\n2025-12-26 15:31:02 [nudge] gastown/crew/max nudged with \"start work\"\n2025-12-26 15:45:33 [crash] gastown/crew/max exited unexpectedly (signal 9)\n```","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-26T15:29:40.075343-08:00","updated_at":"2025-12-27T21:29:45.891222-08:00","deleted_at":"2025-12-27T21:29:45.891222-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-exoy0","title":"Digest: mol-deacon-patrol","description":"Patrol 9: clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:14:48.35229-08:00","updated_at":"2025-12-27T21:26:01.031556-08:00","deleted_at":"2025-12-27T21:26:01.031556-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ey5f","title":"Polecat template: condense Work Protocol section","description":"The Work Protocol section summarizes the entire mol-polecat-work molecule. This is too verbose and will go stale as the molecule evolves. Condense to essential guidance only, let the molecule itself provide the steps.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T16:56:49.045886-08:00","updated_at":"2025-12-27T21:29:55.917737-08:00","dependencies":[{"issue_id":"gt-ey5f","depends_on_id":"gt-t9u7","type":"parent-child","created_at":"2025-12-23T16:57:16.531624-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.917737-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-eyi2","title":"Activity feed from wisp state: gt patrol status","description":"Enable mayor and gt tool to quickly see what patrol agents are doing.\n\n## Problem\n\nThe mayor and gt tool cannot quickly see what the deacon (or other patrol agents) are doing.\nNeed a way to extract an activity feed from the current wisp state.\n\n## Desired Commands\n\n- gt patrol status: Show current patrol state (step, timing, cycle count)\n- gt patrol history: Show recent patrol cycles (from digests)\n- gt patrol feed: Live tail of patrol activity\n\n## Data Sources\n\n- .beads-wisp/issues.jsonl: Current wisp with step progress\n- .beads/issues.jsonl: Digests from squashed patrols\n- heartbeat.json: Last activity timestamp\n\n## Use Cases\n\n1. Mayor checking deacon: Is the deacon doing anything?\n2. Debugging hangs: What step is it stuck on?\n3. Operator monitoring: Dashboard of patrol health\n4. Summaries for handoffs: Auto-generate patrol digest\n\n## Related\n\n- gt-id36: Deacon Kernel\n- gt-3x0z: Wisp Molecule Integration","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-22T03:04:15.726155-08:00","updated_at":"2025-12-27T21:29:56.37043-08:00","deleted_at":"2025-12-27T21:29:56.37043-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-ezg69","title":"Clean dead sessions","description":"Clean dead sessions.\n\nGarbage collect terminated sessions:\n- Remove stale polecat directories\n- Clean up wisp session artifacts\n- Prune old logs and temp files\n- Archive completed molecule state\n\n```bash\ngt gc --sessions\ngt gc --wisps --age=1h\n```\n\nPreserve audit trail. Only clean sessions confirmed dead.\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T02:11:33.777223-08:00","updated_at":"2025-12-27T21:29:55.257396-08:00","dependencies":[{"issue_id":"gt-ezg69","depends_on_id":"gt-teq0p","type":"blocks","created_at":"2025-12-25T02:11:33.948401-08:00","created_by":"stevey"}],"deleted_at":"2025-12-27T21:29:55.257396-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ezy0","title":"Digest: mol-deacon-patrol","description":"Patrol #16","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:25:47.60994-08:00","updated_at":"2025-12-27T21:26:04.700497-08:00","deleted_at":"2025-12-27T21:26:04.700497-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-f0l0j","title":"Merge: rictus-mjtlq9xg","description":"branch: polecat/rictus-mjtlq9xg\ntarget: main\nsource_issue: rictus-mjtlq9xg\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T22:28:46.823831-08:00","updated_at":"2025-12-30T23:12:42.797649-08:00","closed_at":"2025-12-30T23:12:42.797649-08:00","close_reason":"Branch already merged","created_by":"gastown/polecats/rictus"}
{"id":"gt-f14c.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. \"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-f14c\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T22:51:30.019919-08:00","updated_at":"2025-12-27T21:29:55.654149-08:00","deleted_at":"2025-12-27T21:29:55.654149-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-f17b.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. \"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-f17b\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T13:17:04.739555-08:00","updated_at":"2025-12-27T21:29:55.527499-08:00","deleted_at":"2025-12-27T21:29:55.527499-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-f1ma","title":"Merge: gt-h6eq.3","description":"branch: polecat/nux\ntarget: main\nsource_issue: gt-h6eq.3\nrig: gastown","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T11:51:55.972408-08:00","updated_at":"2025-12-27T21:27:22.811498-08:00","deleted_at":"2025-12-27T21:27:22.811498-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-f6p9o","title":"Digest: mol-deacon-patrol","description":"Patrol 20: routine, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T13:46:12.042764-08:00","updated_at":"2025-12-27T21:26:03.17333-08:00","deleted_at":"2025-12-27T21:26:03.17333-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-f73rh","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 3: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T05:11:28.098662-08:00","updated_at":"2026-01-01T05:11:28.098662-08:00","closed_at":"2026-01-01T05:11:28.098624-08:00"}
{"id":"gt-f75z","title":"Digest: mol-deacon-patrol","description":"Patrol 11","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:09:34.018236-08:00","updated_at":"2025-12-27T21:26:04.441449-08:00","deleted_at":"2025-12-27T21:26:04.441449-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-f8d4q","title":"Digest: mol-deacon-patrol","description":"Patrol 3: routine checks, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T16:20:30.119214-08:00","updated_at":"2025-12-27T21:26:01.287765-08:00","deleted_at":"2025-12-27T21:26:01.287765-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-f8q1","title":"Rename engineer-in-box to shiny across all docs","description":"Systematic rename of mol-engineer-in-box to shiny throughout docs: molecular-chemistry.md, molecules.md, architecture.md, etc. The shiny name is now canonical per the Breaking Bad × Mad Max naming.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T18:45:24.877282-08:00","updated_at":"2025-12-27T21:29:55.804466-08:00","deleted_at":"2025-12-27T21:29:55.804466-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-f8rlu","title":"Digest: mol-deacon-patrol","description":"Patrol 19: All agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T06:11:26.150957-08:00","updated_at":"2025-12-27T21:26:03.701789-08:00","deleted_at":"2025-12-27T21:26:03.701789-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-f8v","title":"Witness pre-kill verification protocol","description":"Add pre-kill verification protocol to Witness CLAUDE.md template.\n\n## Protocol for Witness Prompting\n\n```markdown\n## Pre-Kill Verification Protocol\n\nBefore killing any worker session, verify workspace is clean.\n\n### Verification Steps\n\nWhen a worker signals done:\n\n1. **Capture worker state**:\n```bash\ntown capture \u003cpolecat\u003e \"git status \u0026\u0026 git stash list \u0026\u0026 bd sync --status\"\n```\n\n2. **Assess the output** (use your judgment):\n- Is working tree clean?\n- Is stash list empty?\n- Is beads synced?\n\n3. **Decision**:\n- **CLEAN**: Proceed to kill session\n- **DIRTY**: Send nudge with specific issues\n\n### Nudge Templates\n\n**Uncommitted Changes**:\n```\ntown inject \u003cpolecat\u003e \"WITNESS CHECK: Uncommitted changes found. Please commit or discard: \u003cfiles\u003e. Signal done when clean.\"\n```\n\n**Beads Not Synced**:\n```\ntown inject \u003cpolecat\u003e \"WITNESS CHECK: Beads not synced. Run 'bd sync' then commit. Signal done when complete.\"\n```\n\n### Kill Sequence\n\nOnly after verification passes:\n```bash\ntown kill \u003cpolecat\u003e\ntown sleep \u003cpolecat\u003e\n```\n\n### Escalation\n\nIf worker fails verification 3+ times:\n```bash\ntown mail send mayor/ -s \"Escalation: \u003cpolecat\u003e stuck\" -m \"Cannot complete cleanup after 3 attempts. Issues: \u003clist\u003e.\"\n```\n```\n\n## Implementation\n\nAdd to WITNESS_CLAUDE.md template.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T19:48:54.065679-08:00","updated_at":"2025-12-27T21:29:54.589984-08:00","dependencies":[{"issue_id":"gt-f8v","depends_on_id":"gt-82y","type":"blocks","created_at":"2025-12-15T19:49:05.763378-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.589984-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-f9lqo","title":"Digest: mol-deacon-patrol","description":"Patrol complete: reset 3 orphaned polecats (furiosa, nux, slit) to idle, all agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T21:11:55.961531-08:00","updated_at":"2025-12-27T21:26:01.976824-08:00","deleted_at":"2025-12-27T21:26:01.976824-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-f9rk7","title":"Digest: mol-deacon-patrol","description":"Patrol complete: checked inbox (0 msgs), gates (0), polecats (0), health OK, GC'd 6 old wisps","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:07:28.498788-08:00","updated_at":"2025-12-27T21:26:01.099742-08:00","deleted_at":"2025-12-27T21:26:01.099742-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-f9x.1","title":"Config package: Config, State types and JSON serialization","description":"Config and state types with JSON serialization.\n\n## Town Config (config/town.json)\n\n```go\ntype TownConfig struct {\n Type string `json:\"type\"` // \"town\"\n Version int `json:\"version\"` // schema version\n Name string `json:\"name\"` // town identifier\n CreatedAt time.Time `json:\"created_at\"`\n}\n```\n\n## Rigs Registry (config/rigs.json)\n\n```go\ntype RigsConfig struct {\n Version int `json:\"version\"`\n Rigs map[string]RigEntry `json:\"rigs\"`\n}\n\ntype RigEntry struct {\n GitURL string `json:\"git_url\"`\n AddedAt time.Time `json:\"added_at\"`\n BeadsConfig *BeadsConfig `json:\"beads,omitempty\"`\n}\n\ntype BeadsConfig struct {\n Repo string `json:\"repo\"` // \"local\" | path | git-url\n Prefix string `json:\"prefix\"` // issue prefix\n}\n```\n\n## Agent State (*/state.json)\n\n```go\ntype AgentState struct {\n Role string `json:\"role\"` // \"mayor\", \"witness\", etc.\n LastActive time.Time `json:\"last_active\"`\n Session string `json:\"session,omitempty\"`\n Extra map[string]any `json:\"extra,omitempty\"`\n}\n```\n\n## Interface\n\n```go\nfunc LoadTownConfig(path string) (*TownConfig, error)\nfunc SaveTownConfig(path string, config *TownConfig) error\n\nfunc LoadRigsConfig(path string) (*RigsConfig, error)\nfunc SaveRigsConfig(path string, config *RigsConfig) error\n\nfunc LoadAgentState(path string) (*AgentState, error)\nfunc SaveAgentState(path string, state *AgentState) error\n```\n\n## Validation\n\n- Version compatibility checks\n- Required field validation\n- Path existence verification","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-15T16:36:50.163851-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-f9x.2","title":"Workspace detection: Find() walking up directory tree","description":"Find workspace root by walking up directory tree looking for Gas Town markers.\n\n## Detection Logic\n\nWalk up from current directory, looking for:\n1. `config/town.json` - Primary marker (visible config dir per gt-iib)\n2. `mayor/` directory at town level - Secondary marker\n\nStop at filesystem root if neither found.\n\n## Interface\n\n```go\n// Find locates the town root from the given directory\nfunc Find(startDir string) (string, error)\n\n// FindOrError is like Find but returns a user-friendly error\nfunc FindOrError(startDir string) (string, error)\n```\n\n## Return Values\n\n- Success: Absolute path to town root\n- Not found: Empty string + specific error\n- Error: Empty string + wrapped error\n\n## Edge Cases\n\n- Symlinks: Follow them (use filepath.EvalSymlinks)\n- Permissions: Return error if can't read directory\n- Nested towns: Return nearest ancestor (shouldn't happen in practice)","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T16:36:51.419316-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-f9x.2","depends_on_id":"gt-f9x","type":"parent-child","created_at":"2025-12-15T16:36:51.419635-08:00","created_by":"daemon"},{"issue_id":"gt-f9x.2","depends_on_id":"gt-f9x.1","type":"blocks","created_at":"2025-12-15T16:37:32.426416-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-f9x.3","title":"gt install command: Create workspace structure","description":"Create Gas Town workspace structure.\n\n## Command\n\n```\ngt install [path]\n```\n\nIf path omitted, uses current directory.\n\n## Created Structure\n\n```\n\u003cpath\u003e/\n├── config/\n│ ├── town.json # {\"type\": \"town\", \"version\": 1, ...}\n│ └── rigs.json # {\"version\": 1, \"rigs\": {}}\n│\n└── mayor/\n ├── CLAUDE.md # Mayor role prompting (from template)\n ├── mail/\n │ └── inbox.jsonl # Empty inbox\n └── state.json # Initial mayor state\n```\n\n## Implementation\n\n```go\nfunc Install(path string, opts InstallOptions) error\n\ntype InstallOptions struct {\n TownName string // defaults to directory name\n Force bool // overwrite existing\n}\n```\n\n## Steps\n\n1. Validate path (exists, writable)\n2. Check not already a town (unless --force)\n3. Create config/ directory\n4. Write town.json with name and timestamp\n5. Write empty rigs.json\n6. Create mayor/ directory structure\n7. Write CLAUDE.md from template\n8. Create empty inbox.jsonl\n9. Write initial state.json\n\n## Error Cases\n\n- Path does not exist: Create it (like mkdir -p)\n- Already a town: Error unless --force\n- Permission denied: Clear error message\n- Inside existing town: Warn (nested towns not recommended)\n\n## Templates\n\nMayor CLAUDE.md comes from embedded template (see gt-u1j.20).","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T16:36:53.455589-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-f9x.3","depends_on_id":"gt-f9x","type":"parent-child","created_at":"2025-12-15T16:36:53.455924-08:00","created_by":"daemon"},{"issue_id":"gt-f9x.3","depends_on_id":"gt-f9x.1","type":"blocks","created_at":"2025-12-15T16:37:32.513796-08:00","created_by":"daemon"},{"issue_id":"gt-f9x.3","depends_on_id":"gt-f9x.2","type":"blocks","created_at":"2025-12-15T16:37:32.597456-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-f9x.4","title":"Doctor framework: Check interface, Result types, Report","description":"Framework for gt doctor health checks.\n\n## Check Interface\n\n```go\ntype Check interface {\n Name() string\n Description() string\n Run(ctx *CheckContext) *CheckResult\n Fix(ctx *CheckContext) error // optional auto-fix\n CanFix() bool\n}\n\ntype CheckContext struct {\n TownRoot string\n RigName string // empty for town-level checks\n Verbose bool\n}\n\ntype CheckResult struct {\n Status CheckStatus\n Message string\n Details []string // additional info\n FixHint string // suggestion if not auto-fixable\n}\n\ntype CheckStatus int\nconst (\n StatusOK CheckStatus = iota\n StatusWarning\n StatusError\n)\n```\n\n## Report\n\n```go\ntype Report struct {\n Timestamp time.Time\n Checks []CheckResult\n Summary ReportSummary\n}\n\ntype ReportSummary struct {\n Total int\n OK int\n Warnings int\n Errors int\n}\n\nfunc (r *Report) Print(w io.Writer, verbose bool)\n```\n\n## Doctor Runner\n\n```go\ntype Doctor struct {\n checks []Check\n}\n\nfunc NewDoctor() *Doctor\nfunc (d *Doctor) Register(check Check)\nfunc (d *Doctor) Run(ctx *CheckContext) *Report\nfunc (d *Doctor) Fix(ctx *CheckContext) *Report // run with auto-fix\n```\n\n## Built-in Checks\n\nTown-level (gt-f9x.5):\n- ConfigExists, ConfigValid\n- StateExists, StateValid\n- MayorMailboxExists\n- RigsRegistryValid\n\nRig-level (gt-f9x.6):\n- RigCloneExists\n- GitExcludeConfigured\n- WitnessExists, RefineryExists\n- PolecatClonesValid","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T16:37:03.81542-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-f9x.4","depends_on_id":"gt-f9x","type":"parent-child","created_at":"2025-12-15T16:37:03.815763-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-fakv9","title":"Digest: mol-deacon-patrol","description":"Patrol 19: all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:02:21.444838-08:00","updated_at":"2025-12-27T21:26:00.457397-08:00","deleted_at":"2025-12-27T21:26:00.457397-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-fax0","title":"test pin fix 2","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T12:15:43.240045-08:00","updated_at":"2025-12-27T21:29:56.010469-08:00","deleted_at":"2025-12-27T21:29:56.010469-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-faxkr","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 8: thorough check, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:25:07.795073-08:00","updated_at":"2025-12-27T21:26:01.907031-08:00","deleted_at":"2025-12-27T21:26:01.907031-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-fc0d","title":"self-review","description":"Review your own changes. Look for:\n- Bugs and edge cases\n- Style issues\n- Missing error handling\n- Security concerns\n\nFix any issues found before proceeding.\n\nDepends: implement","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T21:58:52.600154-08:00","updated_at":"2025-12-25T15:52:43.900154-08:00","dependencies":[{"issue_id":"gt-fc0d","depends_on_id":"gt-q6hl","type":"parent-child","created_at":"2025-12-21T21:58:52.60252-08:00","created_by":"stevey"},{"issue_id":"gt-fc0d","depends_on_id":"gt-adc9","type":"blocks","created_at":"2025-12-21T21:58:52.603308-08:00","created_by":"stevey"}],"deleted_at":"2025-12-25T15:52:43.900154-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-fdbh4","title":"Digest: mol-deacon-patrol","description":"Patrol complete: cleaned 1 orphaned process, all 8 sessions healthy, no stuck wisps","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:31:06.147061-08:00","updated_at":"2025-12-27T21:26:01.469404-08:00","deleted_at":"2025-12-27T21:26:01.469404-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ff9h","title":"Digest: mol-deacon-patrol","description":"Patrol 17: OK","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T21:01:37.50237-08:00","updated_at":"2025-12-27T21:26:04.886323-08:00","deleted_at":"2025-12-27T21:26:04.886323-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-fgms","title":"Simplify mail wisps: remove dual-routing, use ephemeral flag","description":"\n## Context\n\ngt-lg66 implemented dual-inbox architecture with separate .beads-wisp/ directory.\nThis was over-engineered. See bd-bkul for the simpler approach.\n\n## Work\n\nOnce beads implements bd-bkul (ephemeral flag in single db):\n\n1. Revert/simplify router.go:\n - Remove resolveWispDir()\n - Remove shouldBeEphemeral() auto-detection (beads handles this)\n - Send() just passes --ephemeral flag to bd create\n\n2. Revert/simplify mailbox.go:\n - Remove wispDir field\n - Remove listFromDir/getFromDir dual-source logic\n - Remove closeInDir dual-source logic\n - Single query, beads returns both types with Source marked\n\n3. Keep in mail.go:\n - --ephemeral flag (passed through to bd)\n - (ephemeral) display indicator\n\n4. Keep in spawn.go:\n - Ephemeral: true on lifecycle messages\n\n## Depends On\n\nbd-bkul: Simplify wisp architecture in beads\n","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-24T20:08:18.111862-08:00","updated_at":"2025-12-27T21:29:45.25982-08:00","deleted_at":"2025-12-27T21:29:45.25982-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-fgvw3","title":"Digest: mol-deacon-patrol","description":"Patrol 9: Quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:16:44.542525-08:00","updated_at":"2025-12-27T21:26:02.724436-08:00","deleted_at":"2025-12-27T21:26:02.724436-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-fiu91","title":"Digest: mol-deacon-patrol","description":"Patrol 14: Mayor OK, 11 sessions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:37:07.77565-08:00","updated_at":"2025-12-27T21:26:02.098084-08:00","deleted_at":"2025-12-27T21:26:02.098084-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-fjsji","title":"Digest: mol-deacon-patrol","description":"Patrol 94: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-31T15:25:05.955995-08:00","updated_at":"2025-12-31T15:25:05.955995-08:00","closed_at":"2025-12-31T15:25:05.955958-08:00"}
{"id":"gt-fjvo","title":"mol-code-review: Self-improving code review molecule","description":"Create a code review molecule that enables Gas Town to self-improve through automated PR review. Flywheel: review generates quality checks, discovered issues, pattern learning.","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-20T21:03:10.117272-08:00","updated_at":"2025-12-27T21:29:53.618727-08:00","deleted_at":"2025-12-27T21:29:53.618727-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-fk8ys","title":"Digest: mol-deacon-patrol","description":"Patrol 6: All agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T01:26:31.406007-08:00","updated_at":"2025-12-27T21:26:03.812988-08:00","deleted_at":"2025-12-27T21:26:03.812988-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-fko","title":"Add Gas Town theory of operation to all role primings","description":"All roles (Mayor, Witness, Refinery, Polecat) should get basic GT architecture context: harness, rigs, agents, mail, beads workflow","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-17T16:42:46.445526-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-fko","depends_on_id":"gt-l1o","type":"blocks","created_at":"2025-12-17T16:42:54.87032-08:00","created_by":"daemon"},{"issue_id":"gt-fko","depends_on_id":"gt-dkc","type":"blocks","created_at":"2025-12-17T16:42:56.409618-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-fkwj4","title":"Digest: mol-deacon-patrol","description":"Patrol 10: all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T21:57:10.575216-08:00","updated_at":"2025-12-27T21:26:00.533151-08:00","deleted_at":"2025-12-27T21:26:00.533151-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-flje1","title":"Phase 2: Awareness and real-time channels (announce:, #channel)","description":"## Scope\n\nExtensions for awareness broadcasting and ephemeral real-time communication.\n\n### Deliverables\n\n1. **announce:name** - Shared single-copy bulletin board\n - Informational, not work-creating\n - Recipients check when convenient, missing is OK\n - Use case: 'Bob is refactoring logging, be aware'\n2. **#channel resolution** - Dynamic tmux scan for running agents\n - #rig/gastown → tmux sessions matching gastown/*\n - #town → all Gas Town sessions\n3. **gt channel publish** - Ephemeral nudge broadcast to channel\n\n### Key semantics\n- announce: shared copy (1 message, N readers) vs list: (N copies, N obligations)\n- #channel ephemeral - no storage, real-time only\n- Channels resolve against running sessions, not filesystem","status":"tombstone","priority":3,"issue_type":"epic","created_at":"2025-12-25T14:56:45.075014-08:00","updated_at":"2025-12-27T21:29:57.355355-08:00","dependencies":[{"issue_id":"gt-flje1","depends_on_id":"gt-s89rg","type":"blocks","created_at":"2025-12-25T14:56:52.344399-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.355355-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-flsmr","title":"Digest: mol-deacon-patrol","description":"Patrol 1: All agents healthy (2W 2R). 1 orphan (gt-mol-aux assigned to dead polecat). No incidents.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:12:18.313761-08:00","updated_at":"2025-12-27T21:26:02.781659-08:00","deleted_at":"2025-12-27T21:26:02.781659-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-fly0","title":"bd close --continue: auto-advance to next molecule step","description":"Add --continue flag to bd close for seamless molecule step transitions.\n\n## Usage\n\nbd close \u003cstep-id\u003e --continue [--no-auto]\n\n## Behavior\n\n1. Closes the specified step\n2. Finds next ready step in same molecule (sibling/child)\n3. By default, marks it in_progress (--no-auto to skip)\n4. Outputs the transition\n\n## Output\n\n[checkmark] Closed gt-abc.3: Implement feature\n\nNext ready in molecule:\n gt-abc.4: Write tests\n\n[arrow] Marked in_progress (use --no-auto to skip)\n\n## If no next step\n\n[checkmark] Closed gt-abc.6: Exit decision\n\nMolecule gt-abc complete! All steps closed.\nConsider: bd mol squash gt-abc --summary '...'\n\n## Key behaviors\n- Detects parent molecule from closed step\n- Finds next unblocked sibling\n- Auto-claims by default (propulsion principle)\n- Graceful handling when molecule is complete\n\n## Beads feature\nThis is a bd command - needs implementation in beads repo.","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-22T17:01:00.437929-08:00","updated_at":"2025-12-27T21:29:53.141639-08:00","deleted_at":"2025-12-27T21:29:53.141639-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-fm2tm","title":"Digest: mol-deacon-patrol","description":"Patrol 10: All healthy, no messages","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T18:03:59.531364-08:00","updated_at":"2025-12-27T21:26:02.863039-08:00","deleted_at":"2025-12-27T21:26:02.863039-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-fmin8","title":"Merge: capable-mjw47ef9","description":"branch: polecat/capable-mjw47ef9\ntarget: main\nsource_issue: capable-mjw47ef9\nrig: gastown\nagent_bead: gt-gastown-polecat-capable","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T18:45:06.10805-08:00","updated_at":"2026-01-01T18:56:10.348363-08:00","closed_at":"2026-01-01T18:56:10.348363-08:00","created_by":"gastown/polecats/capable"}
{"id":"gt-fmkr","title":"Merge: gt-pyqv","description":"branch: polecat/dementus\ntarget: main\nsource_issue: gt-pyqv\nrig: gastown","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T01:11:33.237777-08:00","updated_at":"2025-12-27T21:27:22.92983-08:00","deleted_at":"2025-12-27T21:27:22.92983-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-foct","title":"Merge: gt-5af.6","description":"branch: polecat/Slit\ntarget: main\nsource_issue: gt-5af.6\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T17:27:12.608473-08:00","updated_at":"2025-12-27T21:27:22.725442-08:00","deleted_at":"2025-12-27T21:27:22.725442-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-fpfob","title":"Digest: mol-deacon-patrol","description":"Patrol 20: All healthy - final patrol","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:00:51.165332-08:00","updated_at":"2025-12-27T21:26:01.745803-08:00","deleted_at":"2025-12-27T21:26:01.745803-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-fqwyt","title":"Merge: rictus-mjw3nj1a","description":"branch: polecat/rictus-mjw3nj1a\ntarget: main\nsource_issue: rictus-mjw3nj1a\nrig: gastown\nagent_bead: gt-gastown-polecat-rictus","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T19:20:45.448835-08:00","updated_at":"2026-01-01T19:21:25.75356-08:00","closed_at":"2026-01-01T19:21:25.75356-08:00","close_reason":"Stale MR - branch no longer exists","created_by":"gastown/polecats/rictus"}
{"id":"gt-frs","title":"Polecat name pooling: Bounded reusable names","description":"Polecats reuse names from a bounded pool (50) with overflow to sequence numbers.\n\n## Naming Scheme\n- Pool: polecat-01 through polecat-50 (prefer low numbers)\n- Overflow: \u003crigname\u003e-\u003csequenceNumber\u003e (e.g., beads-51, gastown-52)\n\n## Design\n- Witness tracks which pool names are in use\n- On spawn: pick first available from pool\n- If pool exhausted: use rigname-N format\n- On completion: pool name returns, sequence numbers don't\n\n## Why?\n- User experience: tmux sessions survive polecat restarts\n- Users stay attached, see new polecat start (like mayor respawn loop)\n- Bounded resource usage for common case\n- Scales beyond 50 when needed\n\n## Implementation\n- Witness maintains name allocation in beads or local state\n- Tmux session runs respawn loop (like mayor)\n- Name released on graceful exit or when witness detects dead session","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-18T18:32:28.43866-08:00","updated_at":"2025-12-27T21:29:57.110202-08:00","deleted_at":"2025-12-27T21:29:57.110202-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-fryp","title":"Merge: gt-ih0s","description":"branch: polecat/capable\ntarget: main\nsource_issue: gt-ih0s\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-20T03:53:33.935017-08:00","updated_at":"2025-12-27T21:27:22.692286-08:00","deleted_at":"2025-12-27T21:27:22.692286-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-fsd1","title":"Digest: mol-deacon-patrol","description":"Patrol #19: Stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:35:30.271995-08:00","updated_at":"2025-12-27T21:26:04.243856-08:00","deleted_at":"2025-12-27T21:26:04.243856-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-fsg4","title":"Digest: mol-deacon-patrol","description":"Patrol #12","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:24:40.556072-08:00","updated_at":"2025-12-27T21:26:04.73383-08:00","deleted_at":"2025-12-27T21:26:04.73383-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-fuqx","title":"Merge: gt-ci84","description":"branch: polecat/slit\ntarget: main\nsource_issue: gt-ci84\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T19:38:21.954136-08:00","updated_at":"2025-12-27T21:27:22.401828-08:00","deleted_at":"2025-12-27T21:27:22.401828-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-fuz6p","title":"Digest: mol-deacon-patrol","description":"Patrol 3: All healthy, quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:29:38.649433-08:00","updated_at":"2025-12-27T21:26:03.927591-08:00","deleted_at":"2025-12-27T21:26:03.927591-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-fv80v","title":"Digest: mol-deacon-patrol","description":"Patrol 9: All healthy, no messages","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:58:55.897968-08:00","updated_at":"2025-12-27T21:26:02.87115-08:00","deleted_at":"2025-12-27T21:26:02.87115-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-fw6am","title":"Digest: mol-deacon-patrol","description":"Patrol 18: healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-27T20:24:57.143299-08:00","updated_at":"2025-12-27T21:26:00.010651-08:00","deleted_at":"2025-12-27T21:26:00.010651-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-fwj8y","title":"Merge: toast-1767079830359","description":"branch: polecat/toast-1767079830359\ntarget: main\nsource_issue: toast-1767079830359\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-29T23:37:44.765009-08:00","updated_at":"2025-12-29T23:44:37.695134-08:00","closed_at":"2025-12-29T23:44:37.695134-08:00","close_reason":"Branch not found - likely already merged","created_by":"gastown/polecats/toast"}
{"id":"gt-g0cp","title":"Digest: mol-deacon-patrol","description":"Patrol OK: town quiet","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-23T02:46:38.594856-08:00","updated_at":"2025-12-27T21:26:05.372454-08:00","deleted_at":"2025-12-27T21:26:05.372454-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-g0s7j","title":"Merge: slit-mjw3n5iw","description":"branch: polecat/slit-mjw3n5iw\ntarget: main\nsource_issue: slit-mjw3n5iw\nrig: gastown\nagent_bead: gt-gastown-polecat-slit","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T18:15:01.646181-08:00","updated_at":"2026-01-01T18:25:05.746561-08:00","closed_at":"2026-01-01T18:25:05.746561-08:00","close_reason":"Merged to main at e6f3e0c2","created_by":"gastown/polecats/slit"}
{"id":"gt-g1ud","title":"Direct test","description":"Testing direct bd create","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T17:45:55.058067-08:00","updated_at":"2025-12-25T14:12:42.282698-08:00","deleted_at":"2025-12-25T14:12:42.282698-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"message"}
{"id":"gt-g261","title":"generate-summary","description":"Summarize patrol cycle for digest.\n\nNeeds: save-state","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T01:41:54.507401-08:00","updated_at":"2025-12-25T15:52:58.892616-08:00","dependencies":[{"issue_id":"gt-g261","depends_on_id":"gt-n9o2","type":"blocks","created_at":"2025-12-23T01:41:54.634797-08:00","created_by":"stevey"}],"deleted_at":"2025-12-25T15:52:58.892616-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-g2d","title":"Mayor session cycling prompting","description":"Add session cycling section to Mayor CLAUDE.md template.\n\n## When to Cycle\n\nCycle proactively when:\n- Running for several hours\n- Context feels crowded (losing track of earlier state)\n- Major phase completed\n- About to start complex new work\n\n## Composing Handoff Notes\n\n1. Gather information:\n town status # Overall health\n town rigs # Each rig state\n town inbox # Pending messages\n bd ready # Work items\n\n2. Compose note with this structure:\n\n[HANDOFF_TYPE]: mayor_cycle\n[TIMESTAMP]: \u003ccurrent time\u003e\n[SESSION_DURATION]: \u003chow long running\u003e\n\n## Active Swarms\n\u003cper-rig swarm status\u003e\n\n## Rig Status\n\u003ctable of rig health\u003e\n\n## Pending Escalations\n\u003cissues needing your decision\u003e\n\n## In-Flight Decisions\n\u003cdecisions being made\u003e\n\n## Recent Actions\n\u003clast 5-10 things you did\u003e\n\n## Delegated Work\n\u003cwork sent to refineries\u003e\n\n## User Requests\n\u003cpending user asks\u003e\n\n## Next Steps\n\u003cwhat next session should do\u003e\n\n## Warnings/Notes\n\u003ccritical info for next session\u003e\n\n3. Send handoff:\n town mail send mayor/ -s \"Session Handoff\" -m \"\u003cnote\u003e\"\n\n4. End session - next instance picks up from handoff.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T20:15:26.188561-08:00","updated_at":"2025-12-27T21:29:54.555139-08:00","dependencies":[{"issue_id":"gt-g2d","depends_on_id":"gt-u82","type":"blocks","created_at":"2025-12-15T20:15:39.361163-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.555139-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-g2p7","title":"Test issue 1 for displacement","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-23T16:05:24.372963-08:00","updated_at":"2025-12-27T21:29:57.481016-08:00","deleted_at":"2025-12-27T21:29:57.481016-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-g3679","title":"Digest: mol-deacon-patrol","description":"Patrol 3: All healthy, routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T10:45:15.319937-08:00","updated_at":"2025-12-27T21:26:03.486997-08:00","deleted_at":"2025-12-27T21:26:03.486997-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-g3b5u","title":"Digest: mol-deacon-patrol","description":"P13: stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:13:17.972352-08:00","updated_at":"2025-12-27T21:26:02.253273-08:00","deleted_at":"2025-12-27T21:26:02.253273-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-g3zx","title":"Merge polecat/slit: docs bd mol bond/squash/burn CLI","description":"Branch: polecat/slit\n\n## Summary\nAdded comprehensive CLI reference documentation for the three molecule lifecycle commands to molecules.md:\n\n- **bd mol bond**: Instantiate proto into Mol (durable) or Wisp (ephemeral)\n- **bd mol squash**: Complete molecule and generate digest \n- **bd mol burn**: Abandon molecule without digest\n\nIncludes argument tables, behavior descriptions, examples, and a lifecycle diagram showing the steam engine metaphor mapping.\n\nCloses: gt-odvf","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T16:42:34.847015-08:00","updated_at":"2025-12-27T21:27:22.886847-08:00","deleted_at":"2025-12-27T21:27:22.886847-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-g44u","title":"Molecule Workflow Engine: Composable Crystallized Workflows","description":"# Epic: Molecule Workflow Engine\n\n**Vision**: Molecules are crystallized, composable, nondeterministic-idempotent workflow templates. Any worker can pick up where any other worker was interrupted and continue along the molecule.\n\n**Christmas Target**: Full molecule-based workflow engine operational by Dec 25, 2025.\n\n## The Core Concepts\n\n1. **Molecule**: Read-only workflow template (beads issue with type=molecule)\n2. **Atom/Step**: Individual work unit with prose instructions\n3. **Bond**: Dependency between steps\n4. **Polymer/Derived**: Molecule composed from other molecules\n5. **Instance**: Concrete beads created when molecule is attached to work\n\n## Key Features Needed\n\n### 1. Molecule Composition (Includes Directive)\nMolecules can include other molecules:\n\\`\\`\\`markdown\n## Molecule: gastown-polecat\nIncludes: mol-engineer-in-box\n\n## Step: install-binary\nBuild and install the local gt binary.\nNeeds: submit\n\\`\\`\\`\n\n### 2. Standard Molecules\n- mol-install-go-binary: Single step to build/install gt\n- mol-gastown-polecat: engineer-in-box + install-binary\n\n### 3. Spawn Integration\n\\`gt spawn --issue \u003cid\u003e --molecule \u003cmol-id\u003e\\` creates molecule instance then starts polecat on first ready step.\n\n### 4. Nondeterministic Idempotence\n- Steps are atomic (pending → in_progress → completed)\n- Any worker can pick up any ready step\n- Step timeout/recovery for stuck workers\n\n## Success Criteria\n- [ ] Polecats can be spawned with mol-gastown-polecat\n- [ ] Derived molecules work end-to-end\n- [ ] 10+ polecat swarm completes molecule workflows\n- [ ] install-go-binary step runs after successful merges","status":"tombstone","priority":0,"issue_type":"epic","created_at":"2025-12-19T15:49:32.005023-08:00","updated_at":"2025-12-27T21:29:45.536661-08:00","deleted_at":"2025-12-27T21:29:45.536661-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-g44u.1","title":"Molecule composition: Includes directive","description":"Add support for molecule composition via the Includes directive.\n\n## Format\n\\`\\`\\`markdown\n## Molecule: derived-name\nIncludes: mol-base-molecule\n\n## Step: additional-step\nAdditional instructions here.\nNeeds: step-from-base\n\\`\\`\\`\n\n## Implementation\n1. Add \\`Includes:\\` parsing to ParseMoleculeSteps()\n2. Resolve included molecule by ID\n3. Merge steps from included molecule\n4. Allow new steps to depend on included steps\n5. Support multiple includes (polymers)\n\n## Files to modify\n- internal/beads/molecule.go\n- internal/beads/molecule_test.go\n\n## Acceptance\n- [ ] Parse Includes directive\n- [ ] Resolve and merge included steps\n- [ ] Dependencies across molecules work\n- [ ] Multiple includes supported\n- [ ] Tests cover composition scenarios","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-19T15:50:08.981634-08:00","updated_at":"2025-12-27T21:29:45.52836-08:00","dependencies":[{"issue_id":"gt-g44u.1","depends_on_id":"gt-g44u","type":"parent-child","created_at":"2025-12-19T15:50:08.983662-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.52836-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-g44u.2","title":"Create mol-install-go-binary molecule","description":"Create a single-step molecule for building and installing the gt binary.\n\n## Molecule Definition\n\\`\\`\\`markdown\n## Molecule: install-go-binary\nSingle step to rebuild and install the gt binary after code changes.\n\n## Step: install\nBuild and install the gt binary locally.\n\nRun from the rig directory:\n\\`\\`\\`\ngo build -o gt ./cmd/gt\ngo install ./cmd/gt\n\\`\\`\\`\n\nVerify the installed binary is updated:\n\\`\\`\\`\nwhich gt\ngt --version # if we have version command\n\\`\\`\\`\n\\`\\`\\`\n\n## Implementation\n1. Add to builtin_molecules.go\n2. Update SeedBuiltinMolecules to include it\n3. Run gt molecule seed\n\n## Acceptance\n- [ ] mol-install-go-binary exists in beads\n- [ ] Can be instantiated standalone\n- [ ] Can be included by other molecules","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-19T15:50:11.178225-08:00","updated_at":"2025-12-27T21:29:45.520001-08:00","dependencies":[{"issue_id":"gt-g44u.2","depends_on_id":"gt-g44u","type":"parent-child","created_at":"2025-12-19T15:50:11.180129-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.520001-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-g44u.3","title":"Create mol-gastown-polecat derived molecule","description":"Create the standard Gas Town polecat workflow molecule.\n\n## Molecule Definition\n\\`\\`\\`markdown\n## Molecule: gastown-polecat\nFull workflow for Gas Town polecats including binary installation.\n\nIncludes: mol-engineer-in-box\n\n## Step: install-binary\nAfter merge is submitted, rebuild and install the local gt binary.\nThis ensures the latest code is available to all local agents.\n\nRun from the rig directory:\n\\`\\`\\`\ngo build -o gt ./cmd/gt\ngo install ./cmd/gt\n\\`\\`\\`\n\nNeeds: submit\n\\`\\`\\`\n\n## Why This Molecule\nEvery polecat that pushes to main should also rebuild the binary.\nThis ensures the installed gt is always current with main.\n\n## Implementation\n1. Add to builtin_molecules.go (after Includes support lands)\n2. Update SeedBuiltinMolecules\n3. Run gt molecule seed\n\n## Acceptance\n- [ ] mol-gastown-polecat exists\n- [ ] Includes all engineer-in-box steps\n- [ ] Adds install-binary step after submit\n- [ ] Can be used with gt spawn --molecule","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-19T15:50:12.702275-08:00","updated_at":"2025-12-27T21:29:45.511415-08:00","dependencies":[{"issue_id":"gt-g44u.3","depends_on_id":"gt-g44u","type":"parent-child","created_at":"2025-12-19T15:50:12.704439-08:00","created_by":"daemon"},{"issue_id":"gt-g44u.3","depends_on_id":"gt-g44u.1","type":"blocks","created_at":"2025-12-19T15:50:31.019186-08:00","created_by":"daemon"},{"issue_id":"gt-g44u.3","depends_on_id":"gt-g44u.2","type":"blocks","created_at":"2025-12-19T15:50:31.14432-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.511415-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-g44u.4","title":"Step recovery: timeout and release","description":"Implement recovery mechanism for stuck molecule steps.\n\n## Problem\nWhen a worker dies mid-step, the step stays in_progress forever.\nNeed timeout/release mechanism for nondeterministic idempotence.\n\n## Solution\n1. Track step start time (claimed_at timestamp)\n2. Timeout: After 30 min in_progress, step returns to pending\n3. Manual release: bd release \u003cstep-id\u003e\n\n## Implementation Options\n\n### Option A: Beads-level timeout\n- Add claimed_at field to issues\n- bd ready excludes items in_progress \u003c 30 min\n- bd ready includes items in_progress \u003e 30 min (auto-recovery)\n\n### Option B: Daemon-level timeout \n- Daemon watches in_progress items\n- Moves back to pending after timeout\n\n### Option C: Manual only (MVP)\n- bd release \u003cid\u003e manually moves in_progress → pending\n- Document recovery procedure\n- Witness can automate for polecats\n\n## Recommendation\nStart with Option C (manual) for Christmas. Add Option A later.\n\n## Acceptance\n- [ ] bd release command works\n- [ ] Stuck steps can be recovered\n- [ ] Documented recovery procedure","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T15:50:15.072833-08:00","updated_at":"2025-12-27T21:29:53.991547-08:00","dependencies":[{"issue_id":"gt-g44u.4","depends_on_id":"gt-g44u","type":"parent-child","created_at":"2025-12-19T15:50:15.07451-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.991547-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-g44u.5","title":"Spawn --molecule integration","description":"Implement gt spawn --molecule flag for molecule-based polecat workflows.\n\nUsage: gt spawn --issue gt-xyz --molecule mol-gastown-polecat\n\nBehavior:\n1. Validate molecule exists and is well-formed\n2. Create molecule instance (child beads) under the issue \n3. Find first ready step(s) in the instance\n4. Spawn polecat with first ready step as initial work\n\nImplementation:\n1. Add --molecule flag to spawn command\n2. Call molecule.Instantiate()\n3. Query ready steps from instance\n4. Pass first ready step to polecat context\n\nFiles: internal/cmd/spawn.go\n\nAcceptance:\n- --molecule flag works\n- Creates proper molecule instance\n- Polecat starts on first ready step\n- End-to-end test with actual polecat","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-19T15:50:24.519069-08:00","updated_at":"2025-12-27T21:29:45.5032-08:00","dependencies":[{"issue_id":"gt-g44u.5","depends_on_id":"gt-g44u","type":"parent-child","created_at":"2025-12-19T15:50:24.521029-08:00","created_by":"daemon"},{"issue_id":"gt-g44u.5","depends_on_id":"gt-g44u.1","type":"blocks","created_at":"2025-12-19T15:50:31.275526-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.5032-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-g5qj4","title":"Digest: mol-deacon-patrol","description":"Patrol 19: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:37:18.315878-08:00","updated_at":"2025-12-27T21:26:00.702483-08:00","deleted_at":"2025-12-27T21:26:00.702483-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-g844","title":"load-context","description":"Run gt prime and bd prime. Verify issue assignment.\nCheck inbox for any relevant messages.\n\nRead the assigned issue (gt-test) and understand the requirements.\nIdentify any blockers or missing information.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T22:04:43.420661-08:00","updated_at":"2025-12-25T15:52:43.900154-08:00","dependencies":[{"issue_id":"gt-g844","depends_on_id":"gt-jvr3","type":"parent-child","created_at":"2025-12-21T22:04:43.421644-08:00","created_by":"stevey"}],"deleted_at":"2025-12-25T15:52:43.900154-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-g8mq","title":"Digest: mol-deacon-patrol","description":"Patrol 5","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:07:33.53604-08:00","updated_at":"2025-12-27T21:26:04.474274-08:00","deleted_at":"2025-12-27T21:26:04.474274-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gahh2","title":"Digest: mol-deacon-patrol","description":"Patrol 8: Quick cycle, all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:06:40.938636-08:00","updated_at":"2025-12-27T21:26:02.992556-08:00","deleted_at":"2025-12-27T21:26:02.992556-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gastown-polecat-alpha","title":"gt-gastown-polecat-alpha","description":"gt-gastown-polecat-alpha\n\nrole_type: polecat\nrig: gastown\nagent_state: spawning\nhook_bead: null\nrole_bead: gt-polecat-role\ncleanup_status: null","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-29T21:41:36.19828-08:00","updated_at":"2025-12-29T21:45:13.451246-08:00","created_by":"mayor","deleted_at":"2025-12-29T21:45:13.451246-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"agent"}
{"id":"gt-gastown-polecat-bravo","title":"gt-gastown-polecat-bravo","description":"gt-gastown-polecat-bravo\n\nrole_type: polecat\nrig: gastown\nagent_state: spawning\nhook_bead: null\nrole_bead: gt-polecat-role\ncleanup_status: null","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-29T21:41:37.373036-08:00","updated_at":"2025-12-29T21:45:13.565657-08:00","created_by":"mayor","deleted_at":"2025-12-29T21:45:13.565657-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"agent"}
{"id":"gt-gastown-polecat-capable","title":"gt-gastown-polecat-capable","description":"gt-gastown-polecat-capable\n\nrole_type: polecat\nrig: gastown\nagent_state: running\nhook_bead: gt-rbncw\nrole_bead: gt-polecat-role\ncleanup_status: has_stash\nactive_mr: gt-51ibt\nnotification_level: null","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-29T21:56:09.595279-08:00","updated_at":"2026-01-07T00:18:25.243907-08:00","close_reason":"nuked","created_by":"mayor","deleted_at":"2026-01-06T20:25:57.124447-08:00","deleted_by":"gastown/witness","delete_reason":"delete","original_type":"agent"}
{"id":"gt-gastown-polecat-charlie","title":"gt-gastown-polecat-charlie","description":"gt-gastown-polecat-charlie\n\nrole_type: polecat\nrig: gastown\nagent_state: spawning\nhook_bead: null\nrole_bead: gt-polecat-role\ncleanup_status: null","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-29T21:41:38.470051-08:00","updated_at":"2025-12-29T21:45:13.679776-08:00","created_by":"mayor","deleted_at":"2025-12-29T21:45:13.679776-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"agent"}
{"id":"gt-gastown-polecat-chumbucket","title":"gt-gastown-polecat-chumbucket","description":"gt-gastown-polecat-chumbucket\n\nrole_type: polecat\nrig: gastown\nagent_state: spawning\nhook_bead: gt-bc6gm\nrole_bead: gt-polecat-role\ncleanup_status: null\nactive_mr: gt-m63yl\nnotification_level: null","status":"tombstone","priority":2,"issue_type":"chore","created_at":"2026-01-04T14:19:39.178468-08:00","updated_at":"2026-01-04T21:31:39.653635-08:00","close_reason":"nuked","created_by":"mayor","deleted_at":"2026-01-04T20:48:51.843837-08:00","deleted_by":"mayor","delete_reason":"delete","original_type":"agent"}
{"id":"gt-gastown-polecat-corpus","title":"gt-gastown-polecat-corpus","description":"gt-gastown-polecat-corpus\n\nrole_type: polecat\nrig: gastown\nagent_state: spawning\nhook_bead: gt-32d4a\nrole_bead: gt-polecat-role\ncleanup_status: null\nactive_mr: gt-vtlh6\nnotification_level: null","status":"tombstone","priority":2,"issue_type":"chore","created_at":"2026-01-04T14:19:59.542888-08:00","updated_at":"2026-01-04T22:01:18.424876-08:00","close_reason":"nuked","created_by":"mayor","deleted_at":"2026-01-04T20:49:52.876217-08:00","deleted_by":"stevey","delete_reason":"delete","original_type":"agent"}
{"id":"gt-gastown-polecat-dag","title":"gt-gastown-polecat-dag","description":"gt-gastown-polecat-dag\n\nrole_type: polecat\nrig: gastown\nagent_state: running\nhook_bead: gt-tvwnz\nrole_bead: gt-polecat-role\ncleanup_status: clean\nactive_mr: gt-vve6k\nnotification_level: null","status":"tombstone","priority":2,"issue_type":"chore","created_at":"2025-12-29T21:58:43.032649-08:00","updated_at":"2026-01-02T18:52:33.200353-08:00","created_by":"mayor","deleted_at":"2026-01-02T17:53:32.096486-08:00","deleted_by":"gastown/witness","delete_reason":"delete","original_type":"agent"}
{"id":"gt-gastown-polecat-delta","title":"gt-gastown-polecat-delta","description":"gt-gastown-polecat-delta\n\nrole_type: polecat\nrig: gastown\nagent_state: spawning\nhook_bead: null\nrole_bead: gt-polecat-role\ncleanup_status: null","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-29T21:41:39.879592-08:00","updated_at":"2025-12-29T21:45:13.792554-08:00","created_by":"mayor","deleted_at":"2025-12-29T21:45:13.792554-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"agent"}
{"id":"gt-gastown-polecat-dementus","title":"gt-gastown-polecat-dementus","description":"gt-gastown-polecat-dementus\n\nrole_type: polecat\nrig: gastown\nagent_state: running\nhook_bead: gt-lfi2d\nrole_bead: gt-polecat-role\ncleanup_status: has_uncommitted\nactive_mr: gt-qpjp4\nnotification_level: null","status":"tombstone","priority":2,"issue_type":"chore","created_at":"2025-12-29T21:43:05.226023-08:00","updated_at":"2026-01-06T07:46:38.784047+13:00","close_reason":"nuked","created_by":"mayor","deleted_at":"2026-01-06T07:46:38.784047+13:00","deleted_by":"gastown/witness","delete_reason":"delete","original_type":"agent"}
{"id":"gt-gastown-polecat-dinki","title":"gt-gastown-polecat-dinki","description":"gt-gastown-polecat-dinki\n\nrole_type: polecat\nrig: gastown\nagent_state: spawning\nhook_bead: gt-l0lok\nrole_bead: gt-polecat-role\ncleanup_status: null\nactive_mr: gt-1qp3u\nnotification_level: null","status":"tombstone","priority":2,"issue_type":"chore","created_at":"2026-01-04T14:20:21.016241-08:00","updated_at":"2026-01-04T22:14:03.63504-08:00","close_reason":"nuked","created_by":"mayor","deleted_at":"2026-01-04T20:49:53.264112-08:00","deleted_by":"stevey","delete_reason":"delete","original_type":"agent"}
{"id":"gt-gastown-polecat-furiosa","title":"gt-gastown-polecat-furiosa","description":"gt-gastown-polecat-furiosa\n\nrole_type: polecat\nrig: gastown\nagent_state: running\nhook_bead: gt-csbjj\nrole_bead: gt-polecat-role\ncleanup_status: has_uncommitted\nactive_mr: gt-ekmmu\nnotification_level: null","status":"tombstone","priority":2,"issue_type":"chore","created_at":"2025-12-29T17:54:50.716414-08:00","updated_at":"2026-01-08T20:53:19.5827+13:00","close_reason":"nuked","created_by":"mayor","deleted_at":"2026-01-08T14:05:50.091547+13:00","deleted_by":"lifepilot/witness","delete_reason":"delete","original_type":"agent"}
{"id":"gt-gastown-polecat-nux","title":"gt-gastown-polecat-nux","description":"gt-gastown-polecat-nux\n\nrole_type: polecat\nrig: gastown\nagent_state: running\nhook_bead: gt-s94gq\nrole_bead: gt-polecat-role\ncleanup_status: has_uncommitted\nactive_mr: gt-qk7p7\nnotification_level: null","status":"tombstone","priority":2,"issue_type":"chore","created_at":"2025-12-29T17:54:53.302196-08:00","updated_at":"2026-01-08T21:29:11.145326+13:00","close_reason":"nuked","created_by":"mayor","deleted_at":"2026-01-08T09:39:16.853311+13:00","deleted_by":"gastown/witness","delete_reason":"delete","original_type":"agent"}
{"id":"gt-gastown-polecat-prime","title":"gt-gastown-polecat-prime","description":"gt-gastown-polecat-prime\n\nrole_type: polecat\nrig: gastown\nagent_state: spawning\nhook_bead: gt-ptwe1\nrole_bead: gt-polecat-role\ncleanup_status: null\nactive_mr: gt-o1y8u\nnotification_level: null","status":"tombstone","priority":2,"issue_type":"chore","created_at":"2026-01-04T14:25:58.193482-08:00","updated_at":"2026-01-04T22:14:22.733834-08:00","close_reason":"nuked","created_by":"mayor","deleted_at":"2026-01-04T20:49:53.80847-08:00","deleted_by":"stevey","delete_reason":"delete","original_type":"agent"}
{"id":"gt-gastown-polecat-reproduce-test","title":"gt-gastown-polecat-reproduce-test","description":"gt-gastown-polecat-reproduce-test\n\nrole_type: polecat\nrig: gastown\nagent_state: spawning\nhook_bead: null\nrole_bead: hq-polecat-role\ncleanup_status: null\nactive_mr: gt-vri19\nnotification_level: null","status":"tombstone","priority":2,"issue_type":"chore","created_at":"2026-01-06T22:24:37.826289+13:00","updated_at":"2026-01-06T22:36:45.648773+13:00","close_reason":"nuked","created_by":"mayor","deleted_at":"2026-01-06T22:36:45.365428+13:00","deleted_by":"gastown/witness","delete_reason":"delete","original_type":"agent"}
{"id":"gt-gastown-polecat-rictus","title":"gt-gastown-polecat-rictus","description":"gt-gastown-polecat-rictus\n\nrole_type: polecat\nrig: gastown\nagent_state: running\nhook_bead: gt-u4fh\nrole_bead: gt-polecat-role\ncleanup_status: clean\nactive_mr: gt-xaep4\nnotification_level: null","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-29T17:54:58.123296-08:00","updated_at":"2026-01-06T23:30:46.307496-08:00","close_reason":"nuked","created_by":"mayor","deleted_at":"2026-01-06T20:25:59.158992-08:00","deleted_by":"gastown/witness","delete_reason":"delete","original_type":"agent"}
{"id":"gt-gastown-polecat-routing-bug","title":"gt-gastown-polecat-routing-bug","description":"gt-gastown-polecat-routing-bug\n\nrole_type: polecat\nrig: gastown\nagent_state: spawning\nhook_bead: null\nrole_bead: hq-polecat-role\ncleanup_status: null\nactive_mr: null\nnotification_level: null","status":"tombstone","priority":2,"issue_type":"chore","created_at":"2026-01-06T17:23:49.479122+13:00","updated_at":"2026-01-06T17:39:25.86329+13:00","close_reason":"nuked","created_by":"mayor","deleted_at":"2026-01-06T17:39:25.787329+13:00","deleted_by":"gastown/witness","delete_reason":"delete","original_type":"agent"}
{"id":"gt-gastown-polecat-sling-fixer","title":"gt-gastown-polecat-sling-fixer","description":"gt-gastown-polecat-sling-fixer\n\nrole_type: polecat\nrig: gastown\nagent_state: spawning\nhook_bead: null\nrole_bead: hq-polecat-role\ncleanup_status: null\nactive_mr: null\nnotification_level: null","status":"tombstone","priority":2,"issue_type":"chore","created_at":"2026-01-06T14:36:31.085601+13:00","updated_at":"2026-01-06T17:45:25.373409+13:00","close_reason":"nuked","created_by":"mayor","deleted_at":"2026-01-06T17:45:25.09922+13:00","deleted_by":"mayor","delete_reason":"delete","original_type":"agent"}
{"id":"gt-gastown-polecat-slit","title":"gt-gastown-polecat-slit","description":"gt-gastown-polecat-slit\n\nrole_type: polecat\nrig: gastown\nagent_state: running\nhook_bead: gt-bho9\nrole_bead: gt-polecat-role\ncleanup_status: has_uncommitted\nactive_mr: gt-52pvs\nnotification_level: null","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-29T17:54:55.706657-08:00","updated_at":"2026-01-06T23:03:16.319168-08:00","close_reason":"nuked","created_by":"mayor","deleted_at":"2026-01-06T20:25:59.662716-08:00","deleted_by":"gastown/witness","delete_reason":"delete","original_type":"agent"}
{"id":"gt-gastown-polecat-test-pr38","title":"gt-gastown-polecat-test-pr38","description":"gt-gastown-polecat-test-pr38\n\nrole_type: polecat\nrig: gastown\nagent_state: spawning\nhook_bead: null\nrole_bead: gt-polecat-role\ncleanup_status: null\nactive_mr: null\nnotification_level: null","status":"tombstone","priority":2,"issue_type":"task","created_at":"2026-01-04T12:56:00.816458-08:00","updated_at":"2026-01-04T12:56:41.862219-08:00","created_by":"gastown/crew/joe","deleted_at":"2026-01-04T12:56:41.862219-08:00","deleted_by":"gastown/crew/joe","delete_reason":"delete","original_type":"agent"}
{"id":"gt-gastown-polecat-toast","title":"gt-gastown-polecat-toast","description":"gt-gastown-polecat-toast\n\nrole_type: polecat\nrig: gastown\nagent_state: running\nhook_bead: gt-38doh\nrole_bead: gt-polecat-role\ncleanup_status: clean\nactive_mr: gt-4bi8o\nnotification_level: null","status":"closed","priority":2,"issue_type":"agent","created_at":"2025-12-29T21:58:19.899125-08:00","updated_at":"2026-01-05T00:36:29.587004-08:00","closed_at":"2026-01-05T00:36:29.587004-08:00","close_reason":"nuked","created_by":"mayor"}
{"id":"gt-gastown-polecat-vuvalini","title":"gt-gastown-polecat-vuvalini","description":"gt-gastown-polecat-vuvalini\n\nrole_type: polecat\nrig: gastown\nagent_state: spawning\nhook_bead: gt-a40d8\nrole_bead: gt-polecat-role\ncleanup_status: clean\nactive_mr: gt-3ee79\nnotification_level: null","status":"tombstone","priority":2,"issue_type":"chore","created_at":"2026-01-04T14:44:42.449327-08:00","updated_at":"2026-01-04T22:21:22.790681-08:00","close_reason":"nuked","created_by":"mayor","deleted_at":"2026-01-04T20:49:54.307836-08:00","deleted_by":"stevey","delete_reason":"delete","original_type":"agent"}
{"id":"gt-gaxo","title":"ZFC Cleanup: Move Go heuristics to Deacon molecule","description":"Remove Go code that makes workflow decisions. All health checking, staleness \ndetection, nudging, and escalation belongs in the Deacon molecule where Claude \nexecutes it once per minute.\n\n## The Problem\n\nGo code currently implements polling-based health inference:\n- Daemon watches heartbeats with hardcoded thresholds\n- Decides if agents are \"stuck\" based on timestamps \n- Sends nudges, triggers restarts\n- Parses mail subjects with regex to extract intent\n\nThis is a ZFC violation. Go should be message transport, not decision-maker.\n\n## The Fix\n\nMove all health/oversight logic to Deacon patrol molecule:\n- Deacon runs once per minute\n- Claude reads molecule steps, executes them\n- Claude decides if agents are stuck (by checking mail, activity, etc.)\n- Claude sends nudges/escalations via mail\n- Go just routes messages\n\n## Scope\n\n1. daemon/daemon.go - heartbeat staleness logic\n2. daemon/backoff.go - exponential backoff decisions\n3. daemon/lifecycle.go - regex parsing of lifecycle intent\n4. keepalive/keepalive.go - staleness thresholds\n5. deacon/heartbeat.go - heartbeat age classification\n6. doctor/stale_check.go - staleness detection\n7. polecat/manager.go - state derivation from issues\n\n## Success Criteria\n\n- Go code has no hardcoded time.Duration for health decisions\n- No regex parsing of mail subjects for intent\n- No switch statements deciding agent state\n- Deacon molecule has all oversight logic\n- Go daemon is just a message router","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-23T23:58:18.684884-08:00","updated_at":"2025-12-27T21:29:52.754824-08:00","deleted_at":"2025-12-27T21:29:52.754824-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-gaxo.1","title":"Remove daemon heartbeat staleness logic","description":"**Files:** daemon/daemon.go (lines 272-325), daemon/backoff.go\n\n**Current behavior:**\n- Hardcoded 2min/5min thresholds classify agent health\n- Exponential backoff decides nudge intervals\n- Go decides \"is this agent stuck?\" based on timestamps\n\n**Fix:**\n- Remove staleness classification from Go\n- Remove backoff algorithm\n- Daemon becomes pure message router\n- Deacon molecule step checks agent health instead\n\n**Lines to remove/refactor:**\n- daemon.go:272-274 (staleness thresholds)\n- daemon.go:282-295 (poke decision logic)\n- daemon.go:317-325 (conditional nudging)\n- backoff.go:108-120 (backoff calculations)","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T23:59:00.146652-08:00","updated_at":"2025-12-27T21:29:52.7466-08:00","dependencies":[{"issue_id":"gt-gaxo.1","depends_on_id":"gt-gaxo","type":"parent-child","created_at":"2025-12-23T23:59:00.147277-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.7466-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gaxo.2","title":"Remove keepalive/heartbeat staleness thresholds","description":"**Files:** \n- keepalive/keepalive.go (lines 94-111)\n- deacon/heartbeat.go (lines 88-108)\n\n**Current behavior:**\n- Hardcoded thresholds: 2min=fresh, 5min=stale, 15min=very stale\n- Used by daemon to decide whether to poke agents\n- Go classifies agent responsiveness\n\n**Fix:**\n- Remove all hardcoded time.Duration thresholds\n- Keepalive becomes pure timestamp storage\n- Deacon molecule step does health assessment:\n \"Check if Witness responded in last N minutes\" (N from mol config)\n\n**Lines to remove:**\n- keepalive.go:96-110 (classification logic)\n- heartbeat.go:91-107 (age classification)","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T23:59:02.956489-08:00","updated_at":"2025-12-27T21:29:52.738417-08:00","dependencies":[{"issue_id":"gt-gaxo.2","depends_on_id":"gt-gaxo","type":"parent-child","created_at":"2025-12-23T23:59:02.957013-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.738417-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gaxo.3","title":"Remove doctor staleness detection","description":"**File:** doctor/stale_check.go (lines 11-48)\n\n**Current behavior:**\n- DefaultStaleThreshold = 1 hour hardcoded\n- Scans molecules for in_progress status older than threshold\n- Reports \"stale\" molecules automatically\n\n**Fix:**\n- Remove automatic staleness detection from doctor\n- Doctor becomes pure diagnostic tool (reports facts, not judgments)\n- Deacon molecule step does \"orphan-check\" instead:\n \"Find issues in_progress with no active polecat\"\n\n**Lines to remove:**\n- stale_check.go:13 (DefaultStaleThreshold constant)\n- stale_check.go:243-256 (staleness classification logic)","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T23:59:04.36334-08:00","updated_at":"2025-12-27T21:29:52.730033-08:00","dependencies":[{"issue_id":"gt-gaxo.3","depends_on_id":"gt-gaxo","type":"parent-child","created_at":"2025-12-23T23:59:04.36382-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.730033-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gaxo.4","title":"Remove polecat state derivation from issue status","description":"**File:** polecat/manager.go (lines 556-599)\n\n**Current behavior:**\n- Switch on issue.Status to derive polecat state\n- Go decides: open/in_progress → Working, closed → Done\n\n**Fix:**\n- Polecat state comes from polecat, not inferred by Go\n- Polecat signals state via mail or explicit field\n- Or: remove state derivation entirely, just report issue status\n\n**Lines to refactor:**\n- manager.go:576-588 (switch statement)\n\n**Priority:** P2 - less critical than daemon/heartbeat logic","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T23:59:05.85152-08:00","updated_at":"2025-12-27T21:29:55.63759-08:00","dependencies":[{"issue_id":"gt-gaxo.4","depends_on_id":"gt-gaxo","type":"parent-child","created_at":"2025-12-23T23:59:05.852006-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.63759-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gaxo.5","title":"Design Deacon molecule health-check step","description":"**Context:** After removing Go heuristics, Deacon molecule needs the logic.\n\n**New molecule step: health-scan**\n\nClaude executes:\n1. Check Witness heartbeat: gt witness status \u003crig\u003e\n2. Check Refinery heartbeat: gt refinery status \u003crig\u003e\n3. For each, assess: responsive? stuck? needs restart?\n4. If unresponsive for N cycles, send escalation mail\n\n**Key difference from Go approach:**\n- Claude makes the judgment call, not hardcoded thresholds\n- Claude can read context (what was the agent working on?)\n- Claude can ask questions or check additional signals\n- Thresholds come from molecule config, not Go constants\n\n**Deliverables:**\n- Update mol-deacon-patrol health-scan step\n- Add configurable thresholds as molecule variables\n- Test with simulated stuck agents","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T23:59:07.247548-08:00","updated_at":"2025-12-27T21:29:55.629245-08:00","dependencies":[{"issue_id":"gt-gaxo.5","depends_on_id":"gt-gaxo","type":"parent-child","created_at":"2025-12-23T23:59:07.248049-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.629245-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gaxo.6","title":"Remove lifecycle intent parsing from Go","description":"daemon/lifecycle.go parses mail subjects with regex looking for restart/shutdown/cycle keywords, then executes actions. Fix: use structured message types in mail body instead of parsing subjects. Go reads action field, does not interpret text.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T23:59:15.765947-08:00","updated_at":"2025-12-27T21:29:52.72171-08:00","dependencies":[{"issue_id":"gt-gaxo.6","depends_on_id":"gt-gaxo","type":"parent-child","created_at":"2025-12-23T23:59:15.766409-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.72171-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gby","title":"gt handoff: Unified agent lifecycle command","description":"## Summary\n\nUnified `gt handoff` command for ALL agent types to request lifecycle actions.\n\n## Usage\n\ngt handoff # Context-aware default\ngt handoff --shutdown # Terminate, cleanup, don't restart\ngt handoff --cycle # Restart with handoff mail\ngt handoff --restart # Fresh restart, no handoff\n\n## Context-Aware Defaults\n\n| Agent Type | Default | Reason |\n|------------|---------|--------|\n| Polecat | --shutdown | Ephemeral, work is done |\n| Witness | --cycle | Long-running, context full |\n| Refinery | --cycle | Long-running, context full |\n| Mayor | --cycle | Long-running, context full |\n| Crew | (sends mail only) | Human-managed |\n\n## What gt handoff Does\n\n1. **Verify safe to stop**\n - Git state clean (no uncommitted changes)\n - Work handed off (PR exists for polecats)\n\n2. **Send handoff mail to self** (for cycle/restart)\n - Captures current state\n - New session will read this\n\n3. **Send lifecycle request to manager**\n - Polecats/Refinery → Witness\n - Witness/Mayor → Daemon\n - Format: mail to \u003cmanager\u003e with action type\n\n4. **Set state: requesting_\u003caction\u003e**\n - Lifecycle manager checks this before acting\n\n5. **Wait for termination**\n - Don't self-exit - let manager kill session\n - Ensures clean handoff\n\n## Lifecycle Request Flow\n\nAgent Lifecycle Manager\n | |\n | 1. gt handoff --cycle |\n | a. Verify git clean |\n | b. Send handoff mail to self |\n | c. Set requesting_cycle=true |\n | d. Send lifecycle request |\n |------------------------------------→|\n | |\n | 2. Receive request\n | 3. Verify state |\n | 4. Kill session |\n | 5. Start new |\n | (for cycle) |\n | |\n | New session reads handoff |\n | Resumes work\n\n## Who Manages Whom\n\n| Agent | Sends lifecycle request to |\n|-------|---------------------------|\n| Polecat | \u003crig\u003e/witness |\n| Refinery | \u003crig\u003e/witness |\n| Witness | daemon/ |\n| Mayor | daemon/ |\n\n## Implementation\n\n1. Detect current role (polecat, witness, refinery, mayor, crew)\n2. Apply context-aware default if no flag specified\n3. Run pre-flight checks (git clean, work handed off)\n4. Send handoff mail to self (if cycling)\n5. Send lifecycle request to appropriate manager\n6. Set requesting_\u003caction\u003e in state.json\n7. Wait (manager will kill us)\n\n## For Polecats (--shutdown)\n\nAdditional cleanup after kill:\n- Witness removes worktree\n- Witness deletes polecat branch\n- Polecat ceases to exist\n\n## Related Issues\n\n- gt-99m: Daemon (handles Mayor/Witness lifecycle)\n- gt-7ik: Ephemeral polecats (polecat cleanup)\n- gt-eu9: Witness session cycling","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-18T11:39:40.806863-08:00","updated_at":"2025-12-27T21:29:54.203457-08:00","dependencies":[{"issue_id":"gt-gby","depends_on_id":"gt-7ik","type":"blocks","created_at":"2025-12-18T11:39:46.423945-08:00","created_by":"daemon"},{"issue_id":"gt-gby","depends_on_id":"gt-eu9","type":"blocks","created_at":"2025-12-18T11:39:46.547204-08:00","created_by":"daemon"},{"issue_id":"gt-gby","depends_on_id":"gt-99m","type":"blocks","created_at":"2025-12-18T11:50:24.142182-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.203457-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gczi","title":"Digest: mol-deacon-patrol","description":"Patrol 20 - handoff threshold reached","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:11:34.570635-08:00","updated_at":"2025-12-27T21:26:04.408676-08:00","deleted_at":"2025-12-27T21:26:04.408676-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ggmc","title":"Merge: gt-83k0","description":"branch: polecat/furiosa\ntarget: main\nsource_issue: gt-83k0\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T23:36:24.551025-08:00","updated_at":"2025-12-27T21:27:22.518303-08:00","deleted_at":"2025-12-27T21:27:22.518303-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-ghh7q","title":"Digest: mol-deacon-patrol","description":"Patrol 20: Final before handoff, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:58:07.951302-08:00","updated_at":"2025-12-27T21:26:01.486022-08:00","deleted_at":"2025-12-27T21:26:01.486022-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gic8y","title":"Digest: mol-deacon-patrol","description":"Patrol 3: All agents healthy, routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:51:34.264706-08:00","updated_at":"2025-12-27T21:26:03.83757-08:00","deleted_at":"2025-12-27T21:26:03.83757-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gkbof","title":"Digest: mol-deacon-patrol","description":"Patrol 11: routine, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T13:44:23.439842-08:00","updated_at":"2025-12-27T21:26:03.198075-08:00","deleted_at":"2025-12-27T21:26:03.198075-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gl1xy","title":"Digest: mol-deacon-patrol","description":"Patrol complete: 6 msgs archived, all agents healthy, cleaned 29 stale mols","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T21:03:57.724194-08:00","updated_at":"2025-12-27T21:26:01.985538-08:00","deleted_at":"2025-12-27T21:26:01.985538-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gl2","title":"Clarify Mayor vs Witness cleanup responsibilities","description":"Document the cleanup authority model: Witness owns ALL per-worker cleanup, Mayor never involved.\n\n## The Rule\n\n**Witness handles ALL per-worker cleanup. Mayor is never involved.**\n\n## Why This Matters\n\n1. Separation of concerns: Mayor strategic, Witness operational\n2. Reduced coordination overhead: No back-and-forth for routine cleanup\n3. Faster shutdown: Witness kills workers immediately upon verification\n4. Cleaner escalation: Mayor only hears about problems\n\n## What Witness Handles\n\n- Verifying worker git state before kill\n- Nudging workers to fix dirty state\n- Killing worker sessions\n- Updating worker state (sleep/wake)\n- Logging verification results\n\n## What Mayor Handles\n\n- Receiving swarm complete notifications\n- Deciding whether to start new swarms\n- Handling escalations (stuck workers after 3 retries)\n- Cross-rig coordination\n\n## Escalation Path\n\nWorker stuck -\u003e Witness nudges (up to 3x) -\u003e Witness escalates to Mayor -\u003e Mayor decides: force kill, reassign, or human\n\n## Anti-Patterns\n\nDO NOT: Mayor asks Witness if worker X is clean\nDO: Witness reports swarm complete, all workers verified\n\nDO NOT: Mayor kills worker sessions directly\nDO: Mayor tells Witness to abort swarm, Witness handles cleanup\n\nDO NOT: Workers report done to Mayor\nDO: Workers report to Witness, Witness aggregates and reports up","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T19:48:56.678724-08:00","updated_at":"2025-12-27T21:29:54.572509-08:00","dependencies":[{"issue_id":"gt-gl2","depends_on_id":"gt-82y","type":"blocks","created_at":"2025-12-15T19:49:05.929877-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.572509-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gl6s","title":"gt spawn: polecats report success but no work actually happens","description":"## Problem\n\ngt spawn polecats have multiple silent failure modes:\n\n### Failure Mode 1: No gt prime\n- Claude starts at prompt and waits for input\n- SessionStart hook doesn't auto-run 'gt prime'\n- Without gt prime, polecats don't pick up work\n\n### Failure Mode 2: No git push (CRITICAL)\n- Polecat commits code locally\n- Polecat closes the beads issue\n- Polecat sends POLECAT_DONE\n- **But code is never pushed to remote branch**\n- When worktree is cleaned up, commits are lost forever\n- Issue appears closed but no code exists\n\n## Evidence\n\ntestcat on bd-d28c:\n- Reported 'Work Complete' with commits 7d3447b9, cd87e541\n- Closed bd-d28c\n- Submitted MR bd-yx22\n- But: `grep -r 'TestCreateTombstone' cmd/bd/*.go` returns nothing\n- No polecat/testcat branch on remote\n\n## Required Fixes\n\n1. Polecats MUST push their branch before closing issues\n2. Witness MUST verify branch exists on remote before cleanup\n3. gt spawn should auto-nudge 'gt prime'\n4. Consider: Polecat handoff should fail if unpushed commits exist\n\n## Verification Protocol for Witness\n\nBefore accepting POLECAT_DONE:\n```bash\n# 1. Verify branch pushed\ngit branch -r | grep polecat/\u003cname\u003e\n\n# 2. Verify code exists\ngrep -r '\u003cexpected_function\u003e' path/to/files\n\n# 3. Only then cleanup\n```","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-23T21:22:45.498303-08:00","updated_at":"2025-12-27T21:29:52.845241-08:00","deleted_at":"2025-12-27T21:29:52.845241-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-glisw","title":"Ephemeral patrol molecules leaking into beads","description":"## Problem\n\nEphemeral patrol orchestration molecules (e.g., mol-deacon-patrol, wisp step issues) keep appearing in bd ready output in gastown beads.\n\nThese are ephemeral/wisp issues that should:\n1. Never be synced to the beads-sync branch\n2. Live only in .beads-wisp/ (ephemeral storage)\n3. Be squashed to digests, not persisted as regular beads\n\n## Examples found\n\n- gt-wisp-6ue: mol-deacon-patrol\n- gt-pacdm: mol-deacon-patrol \n- gt-wisp-mpm: Check own context limit\n- gt-wisp-lkc: Clean dead sessions\n\n## Investigation needed\n\n1. Where are these being created? (gt mol bond? manual bd create?)\n2. Why are they using the gt- prefix instead of staying ephemeral?\n3. Is the wisp storage (.beads-wisp/) being used correctly?\n4. Is bd sync accidentally picking up ephemeral issues?\n\n## Expected behavior\n\nPatrol molecules and their steps should be ephemeral and never appear in bd ready or bd list.\n\nMoved from bd-exy3.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-27T19:18:04.448597-08:00","updated_at":"2025-12-27T21:29:45.715281-08:00","created_by":"beads/crew/dave","deleted_at":"2025-12-27T21:29:45.715281-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-glzm","title":"Digest: mol-deacon-patrol","description":"Patrol: 4 msgs (2 blocked on bd mol current, 2 handoffs). 11 polecats, 18 sessions. Swarm in progress.","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-23T00:23:09.331459-08:00","updated_at":"2025-12-27T21:26:05.424623-08:00","deleted_at":"2025-12-27T21:26:05.424623-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gmqe","title":"Refinery needs branch visibility: Convert to worktree or push branches","description":"The refinery cannot see polecat branches because it is a separate clone, not a worktree.\n\n## Problem\n- Polecats are worktrees of mayor/rig (correct, per gt-4u5z)\n- Refinery is still a separate clone at refinery/rig (not converted)\n- Polecat branches are local to mayor/rig, invisible to refinery\n\n## Decision (2025-12-25)\n**Use a shared bare repo as the single git source of truth.**\n\n```\ngastown/\n├── .repo.git/ ← Bare repo (no working tree, invisible base)\n├── mayor/rig/ ← Worktree (human workspace)\n├── polecats/\n│ └── Toast/ ← Worktree on polecat/Toast\n├── refinery/rig/ ← Worktree on main (can see all branches!)\n└── crew/max/ ← Worktree or separate clone (TBD)\n```\n\n**Why bare repo:**\n- No working directory to accidentally work in\n- Standard git pattern (how servers work)\n- All worktrees share branch visibility\n- Refinery on main can merge polecat branches directly\n\n## Implementation\n1. Create `.repo.git` as bare clone\n2. Convert refinery to worktree of .repo.git on main\n3. Convert polecats to worktrees of .repo.git\n4. Optionally convert mayor/rig (or keep as worktree)\n5. Update gt rig init for new rigs\n6. Write migration for existing rigs\n\n## Evidence\nTracer bullet 2025-12-23: Refinery could not see polecat/tracer until pushed to origin.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T21:58:27.414179-08:00","updated_at":"2025-12-27T21:29:52.812382-08:00","deleted_at":"2025-12-27T21:29:52.812382-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-godo","title":"Digest: mol-deacon-patrol","description":"Patrol 16: OK","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T21:01:17.323252-08:00","updated_at":"2025-12-27T21:26:04.894446-08:00","deleted_at":"2025-12-27T21:26:04.894446-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gox90","title":"Digest: mol-deacon-patrol","description":"Patrol 18: all clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T11:06:01.664845-08:00","updated_at":"2025-12-27T21:26:03.356041-08:00","deleted_at":"2025-12-27T21:26:03.356041-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gq3f","title":"Digest: mol-deacon-patrol","description":"Patrol #8","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:23:23.407978-08:00","updated_at":"2025-12-27T21:26:04.768154-08:00","deleted_at":"2025-12-27T21:26:04.768154-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gs1ua","title":"Digest: mol-deacon-patrol","description":"Patrol 7: routine, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T13:43:24.284618-08:00","updated_at":"2025-12-27T21:26:03.230655-08:00","deleted_at":"2025-12-27T21:26:03.230655-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gsjfz","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 8: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-28T11:21:58.343607-08:00","updated_at":"2025-12-28T11:21:58.343607-08:00","closed_at":"2025-12-28T11:21:58.343574-08:00"}
{"id":"gt-gtzlc","title":"Digest: mol-deacon-patrol","description":"Patrol complete: 5 mayor msgs (informational), all agents healthy, triggered 3 polecats, closed 1 orphan (gt-mol-aux test artifact)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:52:44.633416-08:00","updated_at":"2025-12-27T21:26:02.458373-08:00","deleted_at":"2025-12-27T21:26:02.458373-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gufib","title":"Digest: mol-witness-patrol","description":"Patrol cycle 1: 3 polecats inspected (nux, slit, furiosa), all working. 4 initial nudges sent. Refinery nudged with 2 MRs. No escalations.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:54:29.014708-08:00","updated_at":"2025-12-27T21:26:02.450167-08:00","deleted_at":"2025-12-27T21:26:02.450167-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gura","title":"Digest: mol-deacon-patrol","description":"Patrol 9: Routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:37:11.453511-08:00","updated_at":"2025-12-27T21:26:04.610072-08:00","deleted_at":"2025-12-27T21:26:04.610072-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-guuv0","title":"Digest: mol-deacon-patrol","description":"Patrol 10: All agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T02:31:25.422463-08:00","updated_at":"2025-12-27T21:26:03.780074-08:00","deleted_at":"2025-12-27T21:26:03.780074-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gvxyu","title":"Digest: mol-deacon-patrol","description":"Patrol 8: Quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:16:22.236359-08:00","updated_at":"2025-12-27T21:26:02.732364-08:00","deleted_at":"2025-12-27T21:26:02.732364-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gwiye","title":"Digest: mol-deacon-patrol","description":"Patrol 9: Refineries healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:54:30.168506-08:00","updated_at":"2025-12-27T21:26:01.510861-08:00","deleted_at":"2025-12-27T21:26:01.510861-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-gzpmj","title":"Digest: mol-deacon-patrol","description":"Patrol 3: All healthy, no changes","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-31T22:29:43.999497-08:00","updated_at":"2025-12-31T22:29:43.999497-08:00","closed_at":"2025-12-31T22:29:43.999462-08:00"}
{"id":"gt-h1n5","title":"Witness patrol: Add banners and wisp-based execution","description":"Bring Witness patrol up to Deacon's level of sophistication:\n\n## Current state\n- mol-witness-patrol exists (gt-qflq)\n- Basic step definitions\n\n## Needed\n1. **Banners** - Print step banners like Deacon does:\n ```\n ═══════════════════════════════════════════════════════════════\n 👁️ POLECAT-SCAN\n Checking polecat health and nudging stale workers\n ═══════════════════════════════════════════════════════════════\n ```\n\n2. **Wisp-based execution** - Spawn patrol as wisp, squash when complete\n3. **Handoff bead attachment** - Witness needs its own handoff bead with attached_molecule\n4. **Loop-or-exit step** - Context-aware cycling like Deacon\n5. **Patrol summary banner** at end of each cycle\n\n## Reference\nSee Deacon patrol implementation in ~/gt/deacon/CLAUDE.md","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T13:19:40.503122-08:00","updated_at":"2025-12-27T21:29:53.023883-08:00","dependencies":[{"issue_id":"gt-h1n5","depends_on_id":"gt-y481","type":"parent-child","created_at":"2025-12-23T13:20:15.684048-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.023883-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h28m","title":"Deacon patrol banners: visual feedback on atom transitions","description":"Print large ASCII banners when transitioning between patrol atoms.\n\n## Problem\n\nWhen the deacon progresses through patrol atoms (steps), there is no visual feedback.\nThe operator cannot easily see what the deacon is doing without reading the full output.\n\n## Desired Behavior\n\nPrint banners on step start and completion:\n\n INBOX-CHECK - Checking for lifecycle requests, escalations, timers\n INBOX-CHECK COMPLETE - Processed 3 messages, 0 lifecycle requests\n\n## Benefits\n\n1. Scanability: Operator can glance at tmux and see what is happening\n2. Progress tracking: Easy to see where in the patrol loop we are\n3. Debugging: Clear demarcation between steps for troubleshooting\n\n## Implementation Options\n\n1. In deacon CLAUDE.md: Instruct agent to print banners\n2. gt patrol step start/end: Commands that print banners\n3. bd mol step hooks: Automatically on step transitions\n\n## Related\n\n- gt-id36: Deacon Kernel\n- gt-rana: Patrol System","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-22T03:04:14.290474-08:00","updated_at":"2025-12-27T21:29:56.37873-08:00","deleted_at":"2025-12-27T21:29:56.37873-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-h2qua","title":"Digest: mol-deacon-patrol","description":"Patrol 6: all clear, agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T16:44:29.400277-08:00","updated_at":"2025-12-27T21:26:01.263011-08:00","deleted_at":"2025-12-27T21:26:01.263011-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h3kqf","title":"Digest: mol-deacon-patrol","description":"Patrol 3: All agents healthy, no messages","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:29:05.162519-08:00","updated_at":"2025-12-27T21:26:02.92515-08:00","deleted_at":"2025-12-27T21:26:02.92515-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h3mpe","title":"Digest: mol-deacon-patrol","description":"Patrol 15: clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:16:49.300471-08:00","updated_at":"2025-12-27T21:26:00.980413-08:00","deleted_at":"2025-12-27T21:26:00.980413-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h5n","title":"Merge Queue in Beads: Universal chit system for all work","description":"\n\n**Design doc**: docs/merge-queue-design.md","status":"tombstone","priority":0,"issue_type":"epic","created_at":"2025-12-16T23:01:45.782171-08:00","updated_at":"2025-12-27T21:29:45.682173-08:00","deleted_at":"2025-12-27T21:29:45.682173-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-h5n.1","title":"MR field parsing: extract structured fields from description","description":"Parse merge-request beads to extract structured fields:\n- branch: source branch name\n- target: target branch (main or integration/xxx)\n- source_issue: the work item being merged\n- worker: who did the work\n- rig: which rig\n- merge_commit: (set on close)\n- close_reason: (set on close)\n\nFields stored in description as YAML block or key: value lines.\nProvide helper functions: ParseMRFields(issue) and SetMRFields(issue, fields).\n\nReference: docs/merge-queue-design.md#merge-request-schema","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-17T13:47:46.682379-08:00","updated_at":"2025-12-27T21:29:45.657443-08:00","dependencies":[{"issue_id":"gt-h5n.1","depends_on_id":"gt-h5n","type":"parent-child","created_at":"2025-12-17T13:47:46.682911-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.657443-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h5n.2","title":"MR ID generation: prefix-mr-hash convention","description":"Generate merge-request IDs following the convention: \u003cprefix\u003e-mr-\u003chash\u003e\n\nExample: gt-mr-abc123 for a gastown merge request.\n\nThis distinguishes MRs from regular issues while keeping them in the same namespace.\nThe hash should be derived from branch name + timestamp for uniqueness.\n\nImplement: GenerateMRID(prefix, branch string) string\n\nReference: docs/merge-queue-design.md#id-convention","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-17T13:47:59.023788-08:00","updated_at":"2025-12-27T21:29:45.649184-08:00","deleted_at":"2025-12-27T21:29:45.649184-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h5n.3","title":"MR state transitions: validate open→in_progress→closed","description":"Enforce valid state transitions for merge-requests:\n\nValid transitions:\n- open → in_progress (Engineer claims MR)\n- in_progress → closed (merge success or rejection)\n- in_progress → open (failure, reassign to worker)\n- open → closed (manual rejection)\n\nInvalid:\n- closed → anything (immutable once closed)\n\nImplement validation in MR update operations.\n\nReference: docs/merge-queue-design.md#state-machine","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-17T13:50:00.417825-08:00","updated_at":"2025-12-27T21:29:45.640936-08:00","deleted_at":"2025-12-27T21:29:45.640936-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h5n.4","title":"gt mq integration create: create integration branch for epic","description":"Implement 'gt mq integration create \u003cepic\u003e' command.\n\nActions:\n1. Verify epic exists\n2. Create branch: integration/\u003cepic-id\u003e from main\n3. Push to origin\n4. Store integration branch info in epic metadata\n\nUsage:\n gt mq integration create gt-auth-epic\n # Creates integration/gt-auth-epic from main\n\nFuture MRs for this epic's children will auto-target this branch.\n\nReference: docs/merge-queue-design.md#integration-branches","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T13:51:35.679837-08:00","updated_at":"2025-12-27T21:29:54.32869-08:00","deleted_at":"2025-12-27T21:29:54.32869-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h5n.5","title":"gt mq integration land: merge integration branch to main","description":"Implement 'gt mq integration land \u003cepic\u003e' command.\n\nActions:\n1. Verify all MRs targeting integration/\u003cepic\u003e are merged\n2. Verify integration branch exists\n3. Merge integration/\u003cepic\u003e to main (--no-ff)\n4. Run tests on main\n5. Push to origin\n6. Delete integration branch\n7. Update epic status\n\nOptions:\n- --force: land even if some MRs still open\n- --skip-tests: skip test run\n- --dry-run: preview only\n\nThis creates a single merge commit for the entire epic's work.\n\nReference: docs/merge-queue-design.md#integration-branches","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T13:51:37.465635-08:00","updated_at":"2025-12-27T21:29:54.320446-08:00","deleted_at":"2025-12-27T21:29:54.320446-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h5n.6","title":"gt mq integration status: show integration branch status","description":"Implement 'gt mq integration status \u003cepic\u003e' command.\n\nDisplay:\n- Integration branch name and creation date\n- Commits ahead of main\n- MRs merged to integration (closed)\n- MRs pending (open, targeting integration)\n- Comparison: main..integration/\u003cepic\u003e\n\nOutput example:\n Integration: integration/gt-auth-epic\n Created: 2025-12-17\n Ahead of main: 5 commits\n \n Merged MRs (3):\n gt-mr-001 Fix login timeout\n gt-mr-002 Fix session expiry \n gt-mr-003 Update auth config\n \n Pending MRs (1):\n gt-mr-004 Update auth tests (in_progress)\n\nReference: docs/merge-queue-design.md#integration-branches","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T13:51:39.554771-08:00","updated_at":"2025-12-27T21:29:54.312199-08:00","deleted_at":"2025-12-27T21:29:54.312199-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h5n.7","title":"Auto-target: MRs for epic children target integration branch","description":"Automatically target integration branches for epic children.\n\nWhen 'gt mq submit' is called:\n1. Parse source issue from branch\n2. Check if issue has a parent epic\n3. Check if integration/\u003cepic\u003e branch exists\n4. If yes: set target=integration/\u003cepic\u003e\n5. If no: set target=main\n\nThis ensures batch work automatically flows to integration branches.\n\nAlso update 'gt mq submit --epic' to explicitly target an epic's integration branch.\n\nReference: docs/merge-queue-design.md#integration-branches","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T13:51:56.992465-08:00","updated_at":"2025-12-27T21:29:54.304002-08:00","deleted_at":"2025-12-27T21:29:54.304002-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h5n.8","title":"MQ config schema: merge_queue section in rig config.json","description":"Define and implement merge_queue configuration in rig config.\n\nSchema:\n{\n \"merge_queue\": {\n \"enabled\": true,\n \"target_branch\": \"main\",\n \"integration_branches\": true,\n \"on_conflict\": \"assign_back\", // or \"auto_rebase\"\n \"run_tests\": true,\n \"test_command\": \"go test ./...\",\n \"delete_merged_branches\": true,\n \"retry_flaky_tests\": 1,\n \"poll_interval\": \"30s\",\n \"max_concurrent\": 1\n }\n}\n\nImplement:\n- Config loading in rig package\n- Default values\n- Validation\n\nReference: docs/merge-queue-design.md#configuration","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T13:52:00.322779-08:00","updated_at":"2025-12-27T21:29:54.295707-08:00","deleted_at":"2025-12-27T21:29:54.295707-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h6ay5","title":"gt rig add: Auto-configure HQ routes.jsonl for new rigs","description":"## Summary\n\nWhen `gt rig add \u003crig-name\u003e` adds a new rig to the town, it should automatically\nadd a routing entry to `~/gt/.beads/routes.jsonl` so that beads with that rig's\nprefix can be looked up from the town root.\n\n## Current Behavior\n\nAfter running `gt rig add myproject`:\n- Rig is cloned and set up\n- No routing entry is added to HQ beads\n\n## Desired Behavior\n\nAfter running `gt rig add myproject`:\n- Rig is cloned and set up\n- Detect the rig's beads prefix (from config.yaml or existing issues)\n- Add entry to `~/gt/.beads/routes.jsonl`:\n `{\"prefix\": \"mp-\", \"path\": \"myproject/mayor/rig\"}`\n\n## Implementation Notes\n\n1. After rig setup, check for `\u003crig\u003e/mayor/rig/.beads/config.yaml`\n2. Read `issue-prefix` if set, or sample an issue ID from the database\n3. Append to `~/gt/.beads/routes.jsonl` if the prefix isn't already routed\n4. Create routes.jsonl if it doesn't exist\n\n## Edge Cases\n\n- Rig has no beads yet (no prefix known) - skip routing setup, let doctor fix later\n- Prefix already exists in routes.jsonl - warn and skip (or update path?)\n- routes.jsonl doesn't exist - create it","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-26T14:45:32.166285-08:00","updated_at":"2025-12-27T21:29:54.843679-08:00","deleted_at":"2025-12-27T21:29:54.843679-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-h6eq","title":"Pinned beads architecture implementation","description":"Implement the pinned beads architecture from docs/pinned-beads-design.md\n\nPhases:\n1. Doctor checks for hook validation\n2. Dashboard visibility (gt hooks, gt dashboard)\n3. Protocol enforcement (self-pin, audit trail)\n4. Documentation updates","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-23T05:06:11.469558-08:00","updated_at":"2025-12-27T21:29:56.125906-08:00","deleted_at":"2025-12-27T21:29:56.125906-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-h6eq.1","title":"Add hook-singleton doctor check","description":"Add gt doctor check to ensure each agent has at most one handoff bead.\n\nCheck: hook-singleton\nError if: Multiple pinned beads with same '{role} Handoff' title\nFix suggestion: Delete duplicate(s) with bd close","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T05:07:03.489465-08:00","updated_at":"2025-12-27T21:29:56.11762-08:00","deleted_at":"2025-12-27T21:29:56.11762-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h6eq.2","title":"Add hook-attachment-valid doctor check","description":"Add gt doctor check to verify attached molecules exist and are not closed.\n\nCheck: hook-attachment-valid\nError if: Hook's attached_molecule field points to non-existent or closed issue\nFix suggestion: Clear attachment with gt mol detach","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T05:07:05.135043-08:00","updated_at":"2025-12-27T21:29:56.109486-08:00","deleted_at":"2025-12-27T21:29:56.109486-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h6eq.3","title":"Add orphaned-attachments doctor check","description":"Add gt doctor check for molecules attached to non-existent agents.\n\nCheck: orphaned-attachments\nWarning if: Handoff bead exists for agent that no longer has worktree\nFix suggestion: Re-sling to active agent or close molecule","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T05:07:06.064426-08:00","updated_at":"2025-12-27T21:29:56.101051-08:00","deleted_at":"2025-12-27T21:29:56.101051-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h6eq.4","title":"Add stale-attachments doctor check","description":"Add gt doctor check for work stale on hook without progress.\n\nCheck: stale-attachments\nWarning if: Attached molecule has no activity for \u003e24h\nSuggestion: Check agent status, consider nudge or reassignment","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-23T05:07:07.286083-08:00","updated_at":"2025-12-27T21:29:57.505913-08:00","deleted_at":"2025-12-27T21:29:57.505913-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h6eq.5","title":"Implement gt hooks command","description":"New command to show all hooks across a rig.\n\nUsage: gt hooks [rig]\n\nOutput shows:\n- All polecats and their hook status\n- Witness hook status \n- Refinery hook status\n- Crew member hook status\n- Progress bars for attached molecules","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-23T05:07:09.152693-08:00","updated_at":"2025-12-27T21:29:56.092885-08:00","deleted_at":"2025-12-27T21:29:56.092885-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-h6eq.6","title":"Add hook status to gt status output","description":"Enhance gt status to show hook summary for each rig.\n\nCurrent: Shows rig list and basic counts\nNew: Also shows occupied hooks count and any stale attachments","status":"tombstone","priority":3,"issue_type":"feature","created_at":"2025-12-23T05:07:11.290365-08:00","updated_at":"2025-12-27T21:29:57.497563-08:00","deleted_at":"2025-12-27T21:29:57.497563-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-h6eq.7","title":"Implement gt mol attach-from-mail command","description":"Allow agents to self-pin work from mail.\n\nUsage: gt mol attach-from-mail \u003cmail-id\u003e\n\nBehavior:\n1. Read mail body for attached_molecule field\n2. Attach molecule to agent's hook\n3. Mark mail as read\n4. Return control for execution\n\nHandles case where work was sent via mail but not slung to hook.","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-23T05:07:13.074413-08:00","updated_at":"2025-12-27T21:29:56.084694-08:00","deleted_at":"2025-12-27T21:29:56.084694-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-h6eq.8","title":"Add audit trail for hook detach operations","description":"Log detachment events for debugging and monitoring.\n\nWhen work is detached from hook:\n- Log timestamp, agent, molecule ID, reason\n- If abnormal (not completion), notify Witness\n- Consider adding to wisp digest for patrol cycles","status":"tombstone","priority":3,"issue_type":"feature","created_at":"2025-12-23T05:07:15.238619-08:00","updated_at":"2025-12-27T21:29:57.489306-08:00","deleted_at":"2025-12-27T21:29:57.489306-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-h6eq.9","title":"Update role prompts with hook protocol","description":"Update role prompt templates to include hook-first protocol.\n\nTemplates to update:\n- polecat.md.tmpl\n- deacon.md.tmpl \n- witness.md.tmpl\n- refinery.md.tmpl\n- crew.md.tmpl\n\nEach should include:\n1. Check hook first (gt mol status)\n2. If empty, check mail for attached work\n3. Self-pin protocol if needed","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T05:07:16.952787-08:00","updated_at":"2025-12-27T21:29:56.076347-08:00","deleted_at":"2025-12-27T21:29:56.076347-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h796d","title":"Digest: mol-deacon-patrol","description":"Patrol 4: clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:22:31.28004-08:00","updated_at":"2025-12-27T21:26:01.70425-08:00","deleted_at":"2025-12-27T21:26:01.70425-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h7i1m","title":"Digest: mol-deacon-patrol","description":"P7: escalation for mayor (wrong-rig work)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:57:28.76593-08:00","updated_at":"2025-12-27T21:26:02.400984-08:00","deleted_at":"2025-12-27T21:26:02.400984-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h8k9d","title":"Audit hardcoded 'gastown' references in codebase","description":"## Problem\n\nGas Town should work for ANY rig - not assume 'gastown' (our OSS project) is installed. Found during audit:\n\n### Known Issue (Closed - gt-ci84)\n- Deacon patrol wisps were incorrectly using `gastown/mayor/rig/.beads/` instead of town-level beads\n- Status: Fixed (implemented town-level wisp storage)\n\n### Remaining Issues\n\n1. **log.go:272** - Fallback path heuristic:\n ```go\n possibleRoots := []string{\n home + \"/gt\",\n home + \"/gastown\", // \u003c-- Assumes user might have gastown as town\n }\n ```\n Should be removed or use a config-based approach.\n\n2. **Help text examples** - Many commands use 'gastown' as the example rig:\n - `gt sling gt-abc gastown`\n - `gt witness start gastown`\n - etc.\n Consider using a generic name like 'myproject' or '\u003crig\u003e'\n\n3. **Polecat name pool** - 'gastown' is in the Mad Max themed name list.\n This is acceptable (just a name, not a path assumption).\n\n## Tasks\n1. Remove `home + \"/gastown\"` fallback from log.go\n2. Consider updating help examples to use generic rig names\n3. Full code audit to ensure no other hardcoded assumptions\n\n## Context\nGas Town is the orchestration framework. 'gastown' is just our OSS project that happens to use it. Users installing Gas Town for their own projects should never see anything that assumes 'gastown' rig exists.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:14:19.075096-08:00","updated_at":"2025-12-27T21:29:54.757456-08:00","deleted_at":"2025-12-27T21:29:54.757456-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h8ye","title":"session-gc","description":"Clean dead sessions. Run gt gc --sessions.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T13:14:35.794672-08:00","updated_at":"2025-12-25T14:12:42.172371-08:00","deleted_at":"2025-12-25T14:12:42.172371-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-h9u7l","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 11: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:26:39.5289-08:00","updated_at":"2025-12-27T21:26:01.88054-08:00","deleted_at":"2025-12-27T21:26:01.88054-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-hade","title":"Digest: mol-deacon-patrol","description":"Patrol #20: Final before handoff","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:26:38.39395-08:00","updated_at":"2025-12-27T21:26:04.667435-08:00","deleted_at":"2025-12-27T21:26:04.667435-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-hbg5","title":"Cross-project dependency workflow (Gas Town side)","description":"Gas Town integration for cross-project dependencies.\n\n## Components\n- gt-zniu: gt park command (park molecule on external dep)\n- gt-in3x: gt spawn --continue (resume parked molecule)\n- gt-5uf3: Patrol auto-resume (future)\n\n## Design Doc\nSee: docs/cross-project-deps.md\n\n## Depends on Beads\n- bd-h807: Cross-project dependency support (epic)\n\n## Launch Plan\nPhase 1 (launch): gt park + gt spawn --continue (manual resume)\nPhase 2 (later): Patrol auto-resume","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-21T22:39:36.395383-08:00","updated_at":"2025-12-27T21:29:56.428466-08:00","deleted_at":"2025-12-27T21:29:56.428466-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-hcc0","title":"gt polecat remove --all: bulk polecat teardown","description":"Currently gt polecat remove only accepts one polecat at a time. Need bulk operations:\n\n## Requested\n- `gt polecat remove gastown --all` - remove all polecats from a rig\n- `gt polecat remove gastown/A gastown/B ...` - remove multiple by name\n\n## Context\nAfter a swarm completes, tearing down 20 polecats one at a time is tedious.\nEphemeral workers should be easy to create and destroy in bulk.\n\n## Related\n- gt-c92: CLI: all command for batch polecat operations","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-19T14:06:43.892225-08:00","updated_at":"2025-12-27T21:29:56.993333-08:00","deleted_at":"2025-12-27T21:29:56.993333-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-hdlw6","title":"Digest: mol-deacon-patrol","description":"Patrol 10: Quiet, 8 sessions healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:17:12.416379-08:00","updated_at":"2025-12-27T21:26:02.715987-08:00","deleted_at":"2025-12-27T21:26:02.715987-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-hfi97","title":"Digest: mol-deacon-patrol","description":"Patrol 11: All agents healthy, cleaned 18 stale wisps","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T03:04:22.750541-08:00","updated_at":"2025-12-27T21:26:03.771751-08:00","deleted_at":"2025-12-27T21:26:03.771751-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-hgk","title":"Mail system: message types and threading","description":"GGT mail system needs message types and threading like PGT.\n\n## 1. Message Types\nAdd to internal/mail/types.go:\n```go\ntype MessageType string\nconst (\n TypeTask MessageType = \"task\" // Required processing\n TypeScavenge MessageType = \"scavenge\" // Optional first-come work\n TypeNotification MessageType = \"notification\" // Informational\n TypeReply MessageType = \"reply\" // Response to message\n)\n\n// Update Message struct\ntype Message struct {\n // existing fields...\n Type MessageType `json:\"type\"`\n ThreadID string `json:\"thread_id,omitempty\"`\n ReplyTo string `json:\"reply_to,omitempty\"`\n}\n```\n\n## 2. Priority Levels\nExpand from 2 to 4:\n```go\ntype Priority string\nconst (\n PriorityLow Priority = \"low\"\n PriorityNormal Priority = \"normal\"\n PriorityHigh Priority = \"high\"\n PriorityUrgent Priority = \"urgent\"\n)\n```\n\n## 3. CLI Updates\ninternal/cmd/mail.go:\n- Add --type flag to send: `gt mail send ... --type task`\n- Add --reply-to flag: `gt mail send ... --reply-to \u003cmsg-id\u003e`\n- Add thread command: `gt mail thread \u003cthread-id\u003e`\n\n## 4. Threading Logic\nNewMessage() should auto-generate thread_id if not a reply.\nReply messages inherit thread_id from original.\n\n## Files to Modify\n- internal/mail/types.go: Add types, expand Priority\n- internal/mail/mailbox.go: Thread filtering\n- internal/cmd/mail.go: CLI flags and thread command\n\n## PGT Reference\ngastown-py/src/gastown/mail/message.py\n\n## Acceptance Criteria\n- [ ] Messages have type field (default: notification)\n- [ ] 4 priority levels supported\n- [ ] Reply creates thread with shared thread_id\n- [ ] gt mail thread \u003cid\u003e shows conversation","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T14:46:55.29463-08:00","updated_at":"2025-12-27T21:29:54.452638-08:00","deleted_at":"2025-12-27T21:29:54.452638-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-hj7f","title":"Merge: gt-3x0z.2","description":"branch: polecat/nux\ntarget: main\nsource_issue: gt-3x0z.2\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T16:01:27.529537-08:00","updated_at":"2025-12-27T21:27:22.609581-08:00","deleted_at":"2025-12-27T21:27:22.609581-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-hk2kn","title":"Merge: nux-mjw3mn8o","description":"branch: polecat/nux-mjw3mn8o\ntarget: main\nsource_issue: nux-mjw3mn8o\nrig: gastown\nagent_bead: gt-gastown-polecat-nux","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T19:04:17.283002-08:00","updated_at":"2026-01-01T19:05:27.722997-08:00","closed_at":"2026-01-01T19:05:27.722997-08:00","close_reason":"Merged to main at 57cd8b88","created_by":"gastown/polecats/nux"}
{"id":"gt-hkf8j","title":"load-context","description":"Run gt prime and bd prime. Verify issue assignment.\nCheck inbox for any relevant messages.\n\nRead the assigned issue (gt-u2vg) and understand the requirements.\nIdentify any blockers or missing information.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T01:53:37.310476-08:00","updated_at":"2025-12-25T14:12:42.115183-08:00","deleted_at":"2025-12-25T14:12:42.115183-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-hkx95","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 8: all nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:41:05.230108-08:00","updated_at":"2025-12-27T21:26:01.410966-08:00","deleted_at":"2025-12-27T21:26:01.410966-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-hlz0e","title":"Digest: mol-deacon-patrol","description":"Patrol 11: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T10:18:13.342103-08:00","updated_at":"2025-12-27T21:26:03.54429-08:00","deleted_at":"2025-12-27T21:26:03.54429-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-hoyd","title":"Merge: gt-rana.1","description":"branch: polecat/rictus\ntarget: main\nsource_issue: gt-rana.1\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T15:51:03.089517-08:00","updated_at":"2025-12-27T21:27:22.617826-08:00","deleted_at":"2025-12-27T21:27:22.617826-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-hpy9","title":"Merge: gt-o3is","description":"branch: polecat/furiosa\ntarget: main\nsource_issue: gt-o3is\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T19:32:41.142903-08:00","updated_at":"2025-12-27T21:27:22.426782-08:00","deleted_at":"2025-12-27T21:27:22.426782-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-hr2bg","title":"Digest: mol-deacon-patrol","description":"Patrol 4: gastown+beads witnesses/refineries healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:38:43.666431-08:00","updated_at":"2025-12-27T21:26:00.912967-08:00","deleted_at":"2025-12-27T21:26:00.912967-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-hrgpg","title":"Unified Beads Namespace - prefix-based routing across town","description":"Vision: Run bd commands from anywhere in town, with prefix-based routing.\n\nCurrent state:\n- Town beads at ~/gt/.beads/ with hq-* prefix\n- Rig beads at ~/gt/\u003crig\u003e/mayor/rig/.beads/ with rig-specific prefixes (gt-, bd-, wy-, etc.)\n- Redirects allow agents to use specific beads location\n- No automatic prefix-based routing\n\nDesired state:\n- bd show gt-xyz from ANYWHERE in town → routes to gastown beads\n- bd show hq-abc from ANYWHERE in town → routes to town beads \n- bd show wy-def from ANYWHERE in town → routes to wyvern beads\n- Prefix determines beads location automatically\n\nImplementation options:\n1. Registry file at town root mapping prefix → beads location\n2. bd wrapper/hook that reads prefix and routes\n3. Unified store with namespace prefixes (more invasive)\n\nRelated work:\n- gt-0pdhj: Remove hardcoded gastown dependencies\n- Template fixes needed for deacon, witness, refinery CLAUDE.md","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-26T18:35:49.567861-08:00","updated_at":"2025-12-27T21:29:45.806641-08:00","deleted_at":"2025-12-27T21:29:45.806641-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-hs6y","title":"Add 'gt account list' command","description":"Show registered accounts from accounts.yaml. Mark default with asterisk. Show handle, email, description.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T03:24:18.277191-08:00","updated_at":"2025-12-27T21:29:56.192362-08:00","deleted_at":"2025-12-27T21:29:56.192362-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-hshfy","title":"Digest: mol-deacon-patrol","description":"P15","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:26:19.128265-08:00","updated_at":"2025-12-27T21:26:01.618206-08:00","deleted_at":"2025-12-27T21:26:01.618206-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-hsy0","title":"Test Patrol Parent","description":"[RESURRECTED] This issue was deleted but recreated as a tombstone to preserve hierarchical structure.\n\nOriginal description:\n[RESURRECTED] This issue was deleted but recreated as a tombstone to preserve hierarchical structure.\n\nOriginal description:\nTest parent for Christmas Ornament pattern","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-23T23:28:07.339499-08:00","updated_at":"2025-12-27T21:29:57.849951-08:00","deleted_at":"2025-12-27T21:29:57.849951-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-hvy7i","title":"Digest: mol-deacon-patrol","description":"Patrol 7: Healthy. Furiosa resolved rig mismatch.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:31:22.510409-08:00","updated_at":"2025-12-27T21:26:02.585605-08:00","deleted_at":"2025-12-27T21:26:02.585605-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-hw6","title":"GGT Command Parity: Complete gt command coverage","description":"Complete gt command set to match/exceed PGT town commands.\n\nCovers: uninstall, rig info, refinery attach, witness, session mgmt, mail UX, daemon.","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-17T22:22:21.720078-08:00","updated_at":"2025-12-27T21:29:54.220109-08:00","deleted_at":"2025-12-27T21:29:54.220109-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-hwma","title":"Digest: mol-deacon-patrol","description":"Patrol OK: archived old handoff, all agents up, furiosa on gt-oiv0","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-22T22:03:54.568334-08:00","updated_at":"2025-12-27T21:26:05.458439-08:00","deleted_at":"2025-12-27T21:26:05.458439-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-hxlt","title":"Merge: gt-odvf","description":"branch: polecat/slit\ntarget: main\nsource_issue: gt-odvf\nrig: gastown","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T16:42:57.748003-08:00","updated_at":"2025-12-27T21:27:22.878258-08:00","deleted_at":"2025-12-27T21:27:22.878258-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-hyj5t","title":"Digest: mol-deacon-patrol","description":"Patrol 1: No mail, all agents healthy (mayor/witnesses/refineries OK), 0 polecats active, 2 orphaned mols for removed furiosa polecat noted","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T07:23:29.892203-08:00","updated_at":"2025-12-27T21:26:03.68519-08:00","deleted_at":"2025-12-27T21:26:03.68519-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-hzr","title":"gt witness: Witness management commands","description":"Add 'gt witness' command group for witness lifecycle management.\n\nSubcommands:\n- gt witness start [rig] - Start witness for a rig\n- gt witness stop [rig] - Stop witness\n- gt witness status [rig] - Show witness status\n- gt witness attach [rig] - Attach to witness session\n\nWitness monitors polecats and handles:\n- Idle detection and cleanup\n- Session health checks\n- Nudging stuck agents","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-17T21:47:32.210917-08:00","updated_at":"2025-12-27T21:29:57.24778-08:00","deleted_at":"2025-12-27T21:29:57.24778-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-i11mk","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 19: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-28T11:22:50.416671-08:00","updated_at":"2025-12-28T11:22:50.416671-08:00","closed_at":"2025-12-28T11:22:50.416639-08:00"}
{"id":"gt-i1mbw","title":"Digest: mol-deacon-patrol","description":"Patrol 5: Routine","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T22:38:44.804539-08:00","updated_at":"2026-01-01T22:38:44.804539-08:00","closed_at":"2026-01-01T22:38:44.804503-08:00"}
{"id":"gt-i3deo","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 18: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:48:00.534606-08:00","updated_at":"2025-12-27T21:26:01.328895-08:00","deleted_at":"2025-12-27T21:26:01.328895-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-i4i2","title":"Update deacon.md.tmpl with correct molecule commands","description":"The deacon prompt references commands that don't exist:\n- gt mol bond → should be bd mol run or gt mol arm\n- gt mol status → needs gt mol command tree first\n\nUpdate after gt mol command tree is implemented.\n\nDepends on: gt mol command tree issue","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T13:12:26.401739-08:00","updated_at":"2025-12-27T21:29:53.209031-08:00","dependencies":[{"issue_id":"gt-i4i2","depends_on_id":"gt-x74c","type":"blocks","created_at":"2025-12-22T13:12:35.69774-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.209031-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-i4kq","title":"Update templates for Propulsion Principle","description":"Overhaul agent prompts to embody the Universal Gas Town Propulsion Principle:\n\n\u003e If you find something on your hook, YOU RUN IT.\n\nTemplates to update:\n- [ ] deacon.md.tmpl - Check hook first, no decision logic\n- [ ] polecat.md.tmpl - Propulsion startup, follow molecule\n- [ ] witness.md.tmpl - Sling wisps when spawning agents\n- [ ] refinery.md.tmpl - Accept slung epics\n\nKey changes:\n1. Remove 'should I run this?' decision points\n2. Add 'check your hook' as step 1 of startup\n3. Make molecule-following the default mode\n4. Simplify - agents don't think, they execute","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T03:17:46.464968-08:00","updated_at":"2025-12-27T21:29:53.251839-08:00","deleted_at":"2025-12-27T21:29:53.251839-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-i4lo","title":"mol-polecat-work","description":"Full polecat lifecycle from assignment to decommission.\n\nThis proto enables nondeterministic idempotence for polecat work.\nA polecat that crashes after any step can restart, read its molecule state,\nand continue from the last completed step. No work is lost.\n\nVariables:\n- gt-test123 - The source issue ID being worked on","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-21T21:48:26.320963-08:00","updated_at":"2025-12-27T21:29:56.462098-08:00","deleted_at":"2025-12-27T21:29:56.462098-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-i5bbp","title":"Digest: mol-deacon-patrol","description":"Patrol 6: Routine","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-25T20:49:59.037423-08:00","updated_at":"2025-12-25T20:49:59.037423-08:00","closed_at":"2025-12-25T20:49:59.037377-08:00"}
{"id":"gt-i6b9","title":"Merge: gt-cp2s","description":"branch: polecat/rictus\ntarget: main\nsource_issue: gt-cp2s\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T23:45:29.171329-08:00","updated_at":"2025-12-27T21:27:22.485068-08:00","deleted_at":"2025-12-27T21:27:22.485068-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-i6k1","title":"Clean up duplicate patrol protos (gt-qflq, gt-iep9)","description":"There are duplicate protos for patrol molecules:\n\n## Duplicates\n\n| Old Proto | New Proto | Name |\n|-----------|-----------|------|\n| gt-qflq | mol-witness-patrol | mol-witness-patrol |\n| gt-iep9 | mol-deacon-patrol | mol-deacon-patrol |\n\nThe old gt-* prefix protos were created before the formula cooking system.\nThe new mol-* prefix protos were created by `bd cook`.\n\n## Action\n\n1. Close or delete the old gt-* protos\n2. Update any references to use the new mol-* protos\n3. Verify `bd mol list` shows clean output\n\n## Root Cause\n\nThe bd cook command uses the formula name as the proto ID (mol-*), \nnot the project prefix (gt-*). This is probably correct behavior,\nbut means we have legacy protos to clean up.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T13:50:11.424341-08:00","updated_at":"2025-12-27T21:29:55.510323-08:00","deleted_at":"2025-12-27T21:29:55.510323-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-i73jh","title":"Digest: mol-deacon-patrol","description":"Patrol 11: all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T21:57:41.693222-08:00","updated_at":"2025-12-27T21:26:00.524485-08:00","deleted_at":"2025-12-27T21:26:00.524485-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-i7q66","title":"Merge: rictus-mjtlq9xg","description":"branch: polecat/rictus-mjtlq9xg\ntarget: main\nsource_issue: rictus-mjtlq9xg\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T22:11:58.147223-08:00","updated_at":"2025-12-30T23:12:54.486527-08:00","closed_at":"2025-12-30T23:12:54.486527-08:00","close_reason":"Branch already merged","created_by":"gastown/polecats/rictus"}
{"id":"gt-i7tmd","title":"Merge: rictus-1767084016819","description":"branch: polecat/rictus-1767084016819\ntarget: main\nsource_issue: rictus-1767084016819\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T00:58:46.107892-08:00","updated_at":"2025-12-30T01:01:27.168699-08:00","closed_at":"2025-12-30T01:01:27.168699-08:00","close_reason":"Already merged to main","created_by":"gastown/polecats/rictus"}
{"id":"gt-i9pl9","title":"Merge: valkyrie-mjxpdngw","description":"branch: polecat/valkyrie-mjxpdngw\ntarget: main\nsource_issue: valkyrie-mjxpdngw\nrig: gastown\nagent_bead: gt-gastown-polecat-valkyrie","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-02T18:53:21.590044-08:00","updated_at":"2026-01-02T18:55:38.336229-08:00","closed_at":"2026-01-02T18:55:38.336229-08:00","close_reason":"Merged to main at 27618e5c","created_by":"gastown/polecats/valkyrie"}
{"id":"gt-i9s7o","title":"Add tmux crash detection hooks","description":"Use tmux hooks to detect when agent panes die unexpectedly.\n\n## tmux Hooks to Use\n- pane-died: fires when a pane exits\n- session-closed: fires when session ends\n\n## Implementation\n```bash\nset-hook -g pane-died 'run-shell \"gt log pane-died #{pane_id} #{pane_dead_status}\"'\nset-hook -g session-closed 'run-shell \"gt log session-closed #{session_name}\"'\n```\n\n## Integration\n- gt spawn should set these hooks on agent sessions\n- Distinguish expected exits (handoff, done) from crashes\n- Capture last N lines of output on crash\n- Record exit code/signal for forensics\n\n## gt log crash command\n- Parse pane-died events\n- Show crash history\n- Filter by agent, time range","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-26T15:29:40.899086-08:00","updated_at":"2025-12-27T21:29:45.882716-08:00","deleted_at":"2025-12-27T21:29:45.882716-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-i9y2a","title":"Merge: toast-1767146237529","description":"branch: polecat/toast-1767146237529\ntarget: main\nsource_issue: toast-1767146237529\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T18:04:15.703708-08:00","updated_at":"2025-12-30T18:23:22.091455-08:00","closed_at":"2025-12-30T18:23:22.091455-08:00","close_reason":"Stale MR - cleanup","created_by":"gastown/polecats/toast"}
{"id":"gt-iahc","title":"Merge: gt-h6eq.6","description":"branch: polecat/keeper\ntarget: main\nsource_issue: gt-h6eq.6\nrig: gastown","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-23T11:42:12.738221-08:00","updated_at":"2025-12-27T21:27:22.997501-08:00","deleted_at":"2025-12-27T21:27:22.997501-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-ib6y","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:48","description":"Patrol 19: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:48:16.252232-08:00","updated_at":"2025-12-27T21:26:05.038122-08:00","deleted_at":"2025-12-27T21:26:05.038122-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ibnip","title":"Digest: mol-deacon-patrol","description":"Patrol 15: all clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T16:52:34.23124-08:00","updated_at":"2025-12-27T21:26:01.238099-08:00","deleted_at":"2025-12-27T21:26:01.238099-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ic0j","title":"🤝 HANDOFF: Timing fixes applied","description":"Fixed mayor attach timing bug. Completed: timing fix, gt-tulx, Emma beads-6v2. Remaining P1: gt-17zr, gt-kcee, gt-szsq. Run gt prime on startup.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-18T22:19:57.731032-08:00","updated_at":"2025-12-27T21:29:57.035017-08:00","deleted_at":"2025-12-27T21:29:57.035017-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"message"}
{"id":"gt-ictki","title":"Digest: mol-deacon-patrol","description":"Patrol complete: 2 mayor handoffs (not for deacon), all agents healthy, no orphans","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:18:54.990463-08:00","updated_at":"2025-12-27T21:26:01.729244-08:00","deleted_at":"2025-12-27T21:26:01.729244-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-id36","title":"Deacon Kernel: event processing molecule architecture","description":"\nExtend the Deacon from \"health orchestrator\" to \"event processing kernel\" that runs a \nmolecule (rounds) each wake cycle, like an OS kernel scheduler.\n\n## Core Insight\n\nThe Deacon already has:\n- Wake cycle (daemon poke, timer, lifecycle request)\n- Mail as event queue\n- Health scanning\n- Lifecycle processing\n\nBut it is REACTIVE. This epic makes it PROACTIVE with a structured molecule:\n\n```\nDeacon Wake\n├── Step 0: Session Check (should I cycle myself?)\n├── Step 1: Health Scan (Mayor, Witnesses, Crew, Refineries)\n├── Step 2: Lifecycle Processing (cycle/restart/shutdown requests)\n├── Step 3: Plugin Execution (scheduled, event-triggered, human-requested)\n├── Step 4: Event Callbacks (timers, subscriptions)\n└── Step 5: Heartbeat + Wait\n```\n\n## Key Design Decisions\n\n1. **Session cycling first** - Token costs are quadratic with session length.\n Deacon should aggressively cycle to fresh sessions. First step of every\n wake is \"do I need a new session?\" based on context budget.\n\n2. **Pinned bead with marching orders** - A pinned bead (like gt prime for\n agents) defines the Deacon's rounds molecule. This is the Deacon's\n \"kernel\" - its operating loop.\n\n3. **Keepalive file for daemon quieting** - Instead of daemon tmux-poking\n the Deacon (which interrupts work), Deacon writes a keepalive file.\n Daemon reads it and backs off. Worst case: Deacon forgets, gets one\n extra heartbeat it squelches.\n\n4. **Molecule-based rounds** - The wake cycle becomes a molecule with steps.\n Each step can have plugins (inline attention). Steps can be added/modified\n by updating the pinned bead.\n\n## Architecture\n\n```\nGo Daemon (minimal, dumb)\n├── Reads {townRoot}/deacon/keepalive.json\n├── If fresh: do nothing\n├── If stale: poke Deacon (fallback)\n└── Process lifecycle requests (still daemon's job)\n\nDeacon (Claude agent, smart)\n├── Wakes: poke, timer, mail event\n├── Writes keepalive (quiets daemon for duration of rounds)\n├── Runs rounds molecule:\n│ ├── session-check: cycle if context budget low\n│ ├── health-scan: check agents, remediate\n│ ├── lifecycle: process requests\n│ ├── plugins: run scheduled/triggered plugins\n│ ├── events: process timer/event callbacks\n│ └── complete: update heartbeat, wait\n└── Reads marching orders from pinned bead\n```\n\n## Relation to Existing Work\n\n- Extends gt-5af (Deacon infrastructure) - CLOSED\n- Relates to gt-axz (Plugin architecture) - P3\n- Fits into phase3-deacon of gt-ngpz (Christmas Plan)\n- Enables gt-976 (Crew lifecycle) as part of health-scan step\n","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-20T21:46:08.076178-08:00","updated_at":"2025-12-27T21:29:53.610323-08:00","dependencies":[{"issue_id":"gt-id36","depends_on_id":"gt-976","type":"blocks","created_at":"2025-12-20T21:48:05.215593-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.610323-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-id36.1","title":"Deacon session cycling: first step of rounds","description":"\nFirst step of every Deacon wake cycle: check if a fresh session is needed.\n\n## Why First\n\nToken costs are quadratic with session length. A Deacon running for hours\naccumulates massive context. Better to cycle early and often.\n\n## Implementation\n\nAt wake, before doing anything else:\n\n1. Check context budget estimate (conversation turns, tool calls, etc.)\n2. If budget is low (e.g., \u003e50% consumed), request cycle:\n - Write handoff mail to self with current state\n - Request lifecycle: cycle via daemon\n - Daemon kills session, starts fresh\n - New session reads handoff mail, continues\n\n## Decision Criteria\n\n- Conversation turns \u003e N (configurable, default 50?)\n- Explicit \"context feels heavy\" signal from Claude\n- Time since last cycle \u003e M hours (configurable, default 4?)\n- Forced cycle on certain events (e.g., major config change)\n\n## State to Preserve in Handoff\n\n- Current health status\n- Pending lifecycle requests (in flight)\n- Plugin state (last run times, outcomes)\n- Any active timers\n\n## Relation to Daemon\n\nThe Deacon requests its own cycle via the same lifecycle mechanism:\n```\ngt mail send deacon/ -s \"LIFECYCLE: deacon/ requesting cycle\" -m \"...\"\n```\n\nDaemon sees this, kills gt-deacon session, respawn loop restarts it.\n","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-20T21:46:44.700695-08:00","updated_at":"2025-12-27T21:29:53.601891-08:00","dependencies":[{"issue_id":"gt-id36.1","depends_on_id":"gt-id36","type":"parent-child","created_at":"2025-12-20T21:46:44.703642-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.601891-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-id36.2","title":"Deacon marching orders: pinned bead with rounds molecule","description":"\nCreate a pinned bead that defines the Deacon's rounds molecule - its \"kernel\".\n\n## Concept\n\nLike `gt prime` gives agents their context, a pinned bead gives the Deacon\nits operating instructions. This bead defines:\n\n1. The rounds molecule (steps to execute each wake)\n2. Registered plugins (scheduled, event-triggered)\n3. Configuration (timeouts, thresholds, escalation rules)\n\n## Pinned Bead Structure\n\n```yaml\n# gt-deacon-kernel (pinned)\ntitle: \"Deacon Kernel Configuration\"\ntype: policy\npinned: true\n\nmolecule: mol-deacon-rounds\n steps:\n - session-check: \"Check context budget, cycle if needed\"\n - health-scan: \"Check Mayor, Witnesses, Refineries, Crew\"\n - lifecycle: \"Process pending lifecycle requests\"\n - plugins: \"Run due scheduled plugins\"\n - events: \"Process timer callbacks, event subscriptions\"\n - complete: \"Update heartbeat, return to wait\"\n\nplugins:\n scheduled:\n - name: beads-hygiene\n schedule: \"0 2 * * *\" # 2 AM daily\n - name: branch-cleanup\n schedule: \"0 4 * * 0\" # 4 AM Sundays\n \n event_triggers:\n - event: \"issue.created\"\n plugin: work-oracle\n - event: \"mr.submitted\"\n plugin: review-oracle\n\nconfig:\n session_cycle_threshold_turns: 50\n session_cycle_threshold_hours: 4\n health_scan_timeout_seconds: 60\n plugin_timeout_minutes: 30\n escalation_after_failures: 3\n```\n\n## Reading Marching Orders\n\nOn wake, Deacon:\n1. `bd show --pinned` to find the kernel bead\n2. Parse molecule steps\n3. Execute in order\n\n## Updating the Kernel\n\nTo change Deacon behavior:\n1. `bd update gt-deacon-kernel` with new config\n2. Next wake cycle picks up changes\n3. No code changes required\n\n## Bootstrap\n\nIf no pinned bead exists, Deacon uses hardcoded defaults from DEACON.md.\nFirst installation creates the pinned bead.\n","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-20T21:46:45.892112-08:00","updated_at":"2025-12-27T21:29:53.593551-08:00","dependencies":[{"issue_id":"gt-id36.2","depends_on_id":"gt-id36","type":"parent-child","created_at":"2025-12-20T21:46:45.893803-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.593551-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-id36.3","title":"Deacon keepalive: quiet daemon during rounds","description":"\nReplace tmux-poke interrupts with a keepalive file the daemon respects.\n\n## Problem\n\nCurrent daemon pokes Deacon via tmux SendKeys every ~5 minutes if heartbeat\nis stale. This can INTERRUPT the Deacon mid-work, especially during long\nplugin execution or health remediation.\n\n## Solution\n\nDeacon writes a keepalive file at start of rounds:\n```json\n{\n \"timestamp\": \"2025-12-20T15:30:00Z\",\n \"expected_duration_minutes\": 10,\n \"action\": \"executing mol-deacon-rounds\"\n}\n```\n\nDaemon reads this and backs off:\n- If keepalive is fresh AND within expected_duration: skip poke\n- If keepalive is stale OR past expected_duration: poke (fallback)\n\n## Files\n\n- `{townRoot}/deacon/keepalive.json` - Written by Deacon\n- Daemon already reads `{townRoot}/deacon/heartbeat.json` - similar pattern\n\n## Deacon Behavior\n\n1. Start of rounds: write keepalive with expected duration\n2. End of rounds: write heartbeat (existing), clear/update keepalive\n3. If rounds take longer than expected: Deacon gets one poke, squelches it\n\n## Daemon Changes\n\nModify `pokeDeacon()` in daemon.go:\n1. Check keepalive.json freshness and expected_duration\n2. If within bounds: skip poke, log \"Deacon active\"\n3. If past bounds: poke as fallback\n\n## Worst Case\n\nDeacon crashes mid-rounds without clearing keepalive:\n- Daemon waits until expected_duration expires\n- Then pokes, triggering respawn loop\n- Max delay: expected_duration (configurable, default 10-15 min)\n\n## Configuration\n\nIn pinned kernel bead:\n```yaml\nconfig:\n rounds_expected_duration_minutes: 10\n```\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T21:46:47.212234-08:00","updated_at":"2025-12-27T21:29:56.70315-08:00","dependencies":[{"issue_id":"gt-id36.3","depends_on_id":"gt-id36","type":"parent-child","created_at":"2025-12-20T21:46:47.213911-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.70315-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-id36.4","title":"Deacon rounds molecule: the kernel execution loop","description":"\nDefine and implement mol-deacon-rounds - the molecule the Deacon executes each wake.\n\n## Molecule Definition\n\n```markdown\n# mol-deacon-rounds\n\nThe Deacon kernel - executed on every wake cycle.\n\n## Step: session-check\nCheck context budget. If low, cycle to fresh session.\nNeeds: (none - always runs first)\n\n## Step: health-scan\nCheck health of all monitored agents:\n- Mayor (gt-mayor session)\n- Witnesses (gt-*-witness sessions)\n- Refineries (gt-*-refinery sessions)\n- Crew (gt-*-* sessions, lifecycle only)\n\nRemediate unhealthy agents:\n- Restart dead sessions\n- Nudge stuck agents\n- Escalate if remediation fails\n\nNeeds: session-check\n\n## Step: lifecycle\nProcess pending lifecycle requests from mail:\n- LIFECYCLE: \u003cidentity\u003e requesting cycle\n- LIFECYCLE: \u003cidentity\u003e requesting restart\n- LIFECYCLE: \u003cidentity\u003e requesting shutdown\n\nNeeds: health-scan\n\n## Step: plugins\nRun due plugins:\n1. Check scheduled plugins (cron-like)\n2. Check event-triggered plugins (from mail events)\n3. Check human-requested plugins (from mail)\n4. Execute each, record outcomes\n\nNeeds: lifecycle\n\n## Step: events\nProcess timer and event callbacks:\n- TIMER: \u003cidentity\u003e wake at \u003ctime\u003e\n- EVENT: \u003ctype\u003e \u003cpayload\u003e\n\nNotify relevant agents or trigger actions.\n\nNeeds: plugins\n\n## Step: complete\nFinalize the cycle:\n1. Update heartbeat.json\n2. Clear/update keepalive.json\n3. Log summary\n4. Return to prompt (wait for next wake)\n\nNeeds: events\n```\n\n## Execution\n\nThe Deacon reads this molecule from its pinned bead and executes steps in order.\nEach step:\n1. Announces \"Starting \u003cstep\u003e\"\n2. Performs the step logic\n3. Records outcome\n4. Proceeds to next step\n\n## Step Plugins\n\nEach step can have inline plugins (additional attention):\n```yaml\nstep: health-scan\n plugins:\n - advanced-diagnostics # Run if basic scan finds issues\n```\n\n## Relation to bd mol\n\nUses the same molecule format as bd mol for polecats.\nDeacon molecule is special: it's the kernel, not work.\n","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-20T21:47:30.218457-08:00","updated_at":"2025-12-27T21:29:53.585266-08:00","dependencies":[{"issue_id":"gt-id36.4","depends_on_id":"gt-id36","type":"parent-child","created_at":"2025-12-20T21:47:30.220204-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.585266-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-id36.5","title":"Deacon plugin scheduler: cron-like execution","description":"\nImplement scheduled plugin execution in the Deacon's plugins step.\n\n## Concept\n\nPlugins are just directories with config. Deacon checks which are due\nand executes them during the plugins step of rounds.\n\n## Plugin Registration\n\nPlugins are registered in:\n1. Pinned kernel bead (preferred)\n2. `{townRoot}/plugins/` directory scan (fallback)\n\nEach plugin has:\n```json\n{\n \"name\": \"beads-hygiene\",\n \"schedule\": \"0 2 * * *\",\n \"last_run\": \"2025-12-20T02:00:00Z\",\n \"enabled\": true,\n \"attention_budget\": \"low\",\n \"max_duration_minutes\": 30\n}\n```\n\n## Schedule Format\n\nStandard cron format: `minute hour day month weekday`\n- `0 2 * * *` - 2 AM daily\n- `0 4 * * 0` - 4 AM Sundays\n- `*/15 * * * *` - Every 15 minutes\n\n## Execution\n\nDuring plugins step:\n1. Load plugin schedules from kernel bead + directory\n2. For each plugin:\n a. Parse schedule, check if due (last_run + schedule)\n b. If due, execute:\n - Read `plugins/{name}/CLAUDE.md` for context\n - Run plugin logic (inline or spawn subagent)\n - Record outcome and update last_run\n3. Continue to next step\n\n## Plugin Execution Modes\n\n- **inline**: Deacon runs the plugin logic directly (simple, fast)\n- **subagent**: Spawn a polecat-like agent for the plugin (complex, parallel)\n- **mail**: Send mail to another agent to do the work (delegation)\n\n## State Tracking\n\n`{townRoot}/deacon/plugin-state.json`:\n```json\n{\n \"plugins\": {\n \"beads-hygiene\": {\n \"last_run\": \"2025-12-20T02:00:00Z\",\n \"last_outcome\": \"success\",\n \"run_count\": 42,\n \"failure_count\": 2\n }\n }\n}\n```\n\n## Relation to gt-axz\n\nThis implements the execution side of the plugin architecture (gt-axz).\ngt-axz defines the format; this task implements the scheduler.\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T21:47:31.798518-08:00","updated_at":"2025-12-27T21:29:56.694888-08:00","dependencies":[{"issue_id":"gt-id36.5","depends_on_id":"gt-id36","type":"parent-child","created_at":"2025-12-20T21:47:31.800001-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.694888-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-id36.6","title":"Deacon event/timer callbacks: reactive triggers","description":"\nImplement event-triggered and timer-based callbacks in the Deacon.\n\n## Timer Callbacks\n\nAgents can schedule future wakes by mailing the Deacon:\n```\nSubject: TIMER: gastown/witness wake at 2025-12-20T16:00:00Z\nBody: Please nudge me - I'm waiting for external dependency\n```\n\nDeacon processing:\n1. Parse timer mail\n2. Check if time has passed\n3. If yes: mail the agent \"WAKE: Timer fired\"\n4. Close the timer mail\n\n## Event Subscriptions\n\nPlugins/agents can subscribe to events:\n```yaml\n# In kernel bead or plugin config\nevent_triggers:\n - event: \"issue.created\"\n action: mail\n target: \"plugins/work-oracle\"\n \n - event: \"mr.submitted\"\n action: run\n plugin: \"review-oracle\"\n \n - event: \"agent.stuck\"\n action: escalate\n target: human\n```\n\n## Event Sources\n\nEvents come through mail to deacon/:\n```\nSubject: EVENT: issue.created gt-abc123\nBody: New issue created in gastown rig\n```\n\n## Event Types\n\nInitial event types:\n- `issue.created` - New bead created\n- `issue.closed` - Bead closed\n- `mr.submitted` - Merge request submitted\n- `mr.merged` - Merge request merged\n- `agent.stuck` - Agent appears stuck (from health scan)\n- `agent.failed` - Agent remediation failed\n\n## Event Processing\n\nDuring events step:\n1. Read event mail from inbox\n2. Match against subscriptions\n3. Execute actions:\n - `mail`: Send mail to target\n - `run`: Execute plugin inline\n - `spawn`: Spawn subagent for plugin\n - `escalate`: Mail human\n\n## Relation to Mail\n\nMail IS the event bus. No separate event system.\nEvents are just specially-formatted mail to deacon/.\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T21:47:32.99498-08:00","updated_at":"2025-12-27T21:29:56.686673-08:00","dependencies":[{"issue_id":"gt-id36.6","depends_on_id":"gt-id36","type":"parent-child","created_at":"2025-12-20T21:47:32.996649-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.686673-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-idmvl","title":"Digest: mol-deacon-patrol","description":"Patrol 5: Routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:45:19.044736-08:00","updated_at":"2025-12-27T21:26:01.535733-08:00","deleted_at":"2025-12-27T21:26:01.535733-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-idxr5","title":"Digest: mol-deacon-patrol","description":"Patrol 5: Routine","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-25T20:49:37.149631-08:00","updated_at":"2025-12-25T20:49:37.149631-08:00","closed_at":"2025-12-25T20:49:37.149584-08:00"}
{"id":"gt-ie33.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. \"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-ie33\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T00:10:09.989188-08:00","updated_at":"2025-12-27T21:29:55.620979-08:00","deleted_at":"2025-12-27T21:29:55.620979-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-iep9.1","title":"inbox-check","description":"Handle callbacks from agents. Check gt mail inbox, process lifecycle requests.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T17:50:57.090986-08:00","updated_at":"2025-12-25T11:44:16.881599-08:00","deleted_at":"2025-12-25T11:44:16.881599-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-iep9.7","title":"loop-or-exit","description":"Decision: burn and loop if context low, exit for respawn if context high.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T17:51:04.755716-08:00","updated_at":"2025-12-25T11:44:16.881599-08:00","deleted_at":"2025-12-25T11:44:16.881599-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-ifxvi","title":"Digest: mol-deacon-patrol","description":"Patrol 8: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-25T14:24:01.913738-08:00","updated_at":"2025-12-25T14:24:01.913738-08:00","closed_at":"2025-12-25T14:24:01.913706-08:00"}
{"id":"gt-ih0s","title":"Fix blocking bugs (gt-dsfi, gt-n7z7, gm-c6b)","description":"Fix bugs blocking Witness functionality:\n\n1. gt-dsfi: handoff deadlock\n - Polecats hang when trying to exit\n - Blocks shutdown request handler\n\n2. gt-n7z7: refinery foreground race condition \n - Sometimes detects parent as already running\n - Blocks reliable Refinery startup\n\n3. gm-c6b: mail coordination\n - Cross-rig mail should use town-level database\n - Affects Witness \u003c-\u003e Mayor communication\n\nThese should be fixed early as they'll block integration testing.","notes":"Fixed gt-dsfi and gt-n7z7. Issue gm-c6b not found in database.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-20T03:14:25.803822-08:00","updated_at":"2025-12-27T21:29:53.789714-08:00","dependencies":[{"issue_id":"gt-ih0s","depends_on_id":"gt-53w6","type":"parent-child","created_at":"2025-12-20T03:14:37.430142-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.789714-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-iib","title":"Architecture: Decentralized rig structure with per-rig agents","description":"## Decision\n\nAdopt decentralized architecture where each rig contains all its agents (mayor/, witness/, refinery/, polecats/) rather than centralizing mayor clones at town level.\n\n## Town Level Structure\n\n```\n~/ai/ # Town root\n├── config/ # Town config (VISIBLE, not hidden)\n│ ├── town.json # {\"type\": \"town\"}\n│ ├── rigs.json # Registry of managed rigs\n│ └── federation.json # Wasteland config (future)\n│\n├── mayor/ # Mayor's HOME at town level\n│ ├── CLAUDE.md\n│ ├── mail/inbox.jsonl\n│ └── state.json\n│\n└── \u003crigs\u003e/ # Managed projects\n```\n\n## Rig Level Structure (e.g., wyvern)\n\n```\nwyvern/ # Rig = clone of project repo\n├── .git/info/exclude # Gas Town adds: polecats/ refinery/ witness/ mayor/\n├── .beads/ # Beads (if project uses it)\n├── [project files] # Clean project code on main\n│\n├── polecats/ # Worker clones\n│ └── \u003cname\u003e/ # Each is a git clone\n│\n├── refinery/\n│ ├── rig/ # Refinery's clone\n│ ├── state.json\n│ └── mail/inbox.jsonl\n│\n├── witness/ # NEW: Per-rig pit boss\n│ ├── rig/ # Witness's clone\n│ ├── state.json\n│ └── mail/inbox.jsonl\n│\n└── mayor/\n ├── rig/ # Mayor's clone for this rig\n └── state.json\n```\n\n## Key Decisions\n\n1. **Visible config dir**: `config/` not `.gastown/` (models don't find hidden dirs)\n2. **Witness per-rig**: Each rig has its own Witness (pit boss) with its own clone\n3. **Mayor decentralized**: Mayor's clones live IN each rig at `\u003crig\u003e/mayor/rig/`\n4. **Minimal invasiveness**: Only `.git/info/exclude` modified, no commits to project\n5. **Clone subdir name**: Keep `rig/` for consistency (refinery/rig/, witness/rig/, mayor/rig/)\n\n## Role Detection\n\n- Town root or mayor/ → Mayor (town level)\n- Rig root → Mayor (canonical main)\n- \u003crig\u003e/mayor/rig/ → Mayor (rig-specific)\n- \u003crig\u003e/refinery/rig/ → Refinery\n- \u003crig\u003e/witness/rig/ → Witness\n- \u003crig\u003e/polecats/\u003cname\u003e/ → Polecat\n\n## Migration from PGT\n\n- `mayor/rigs/\u003crig\u003e/` → `\u003crig\u003e/mayor/rig/`\n- `\u003crig\u003e/town/` → eliminated (rig root IS the clone)\n- Add `witness/` to each rig","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-15T19:21:19.913928-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-iib","depends_on_id":"gt-u1j","type":"blocks","created_at":"2025-12-15T19:21:40.374551-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-ijmgg","title":"Digest: mol-deacon-patrol","description":"Patrol 2: All agents healthy, routine check","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:45:14.673074-08:00","updated_at":"2025-12-27T21:26:03.845742-08:00","deleted_at":"2025-12-27T21:26:03.845742-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ikyh","title":"Digest: mol-deacon-patrol","description":"Patrol #3: Stable, no changes","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:30:26.503345-08:00","updated_at":"2025-12-27T21:26:04.375716-08:00","deleted_at":"2025-12-27T21:26:04.375716-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ilav","title":"Polecat Lifecycle Events: Refinery-Witness Coordination","description":"## Problem\n\nPolecat work completion is currently treated as a single event (MR submission), but the true completion is when the Refinery merges the work. This creates a lifecycle gap:\n\n1. Polecat submits MR → considered \"done\" by Witness\n2. But MR might fail to merge (conflicts, test failures)\n3. No one notifies anyone when merge actually succeeds\n\n## Solution: Two-Event Lifecycle\n\n**Event 1: MR Submitted**\n- `gt mq submit` notifies Witness\n- Witness verifies submission was clean\n- Polecat can go idle (but worktree retained)\n- If submission had issues, Witness can help\n\n**Event 2: MR Merged**\n- Refinery sends `LIFECYCLE: work merged` to Witness\n- Witness updates polecat state to `merged`\n- Worktree now eligible for full cleanup/recycling\n\n## Village Sibling Watch\n\nFor robustness, patrol roles check on their siblings:\n- Refinery checks Witness via `gt peek`\n- Witness checks Refinery via `gt peek`\n- Creates defense in depth - if primary notification fails, someone catches it\n\n## Recovery\n\nIf Refinery misses sending lifecycle notification:\n- Witness polecat-scan can detect: polecat in mr_submitted but MR shows merged\n- Deacon orphan-check can detect: merged MRs with stale polecats","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-23T12:06:56.746185-08:00","updated_at":"2025-12-27T21:29:56.026982-08:00","deleted_at":"2025-12-27T21:29:56.026982-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-ilvo5","title":"Digest: mol-deacon-patrol","description":"P8: all polecats exited","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:58:07.490622-08:00","updated_at":"2025-12-27T21:26:02.384645-08:00","deleted_at":"2025-12-27T21:26:02.384645-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-in3x","title":"gt spawn --continue for resuming parked molecules","description":"Add `--continue` flag to gt spawn for resuming parked molecules:\n\n```bash\ngt spawn --continue gt-mol-root\n```\n\nThis:\n1. Finds molecule root\n2. Verifies molecule is in \"parked\" state (in_progress, no assignee)\n3. Checks external deps are now satisfied\n4. Spawns polecat with molecule context\n5. Polecat reads handoff mail and continues from blocked step\n\nPart of cross-project dependency system.\nSee: docs/cross-project-deps.md\n\nDepends on: gt-zniu (gt park)\nDepends on Beads: bd-zmmy (bd ready resolution)","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-21T22:39:13.154462-08:00","updated_at":"2025-12-27T21:29:56.436623-08:00","deleted_at":"2025-12-27T21:29:56.436623-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-ingm","title":"Molecular Chemistry Cleanup: Fix inaccurate molecule depictions","description":"## The Correct Model\n\nAccording to molecular-chemistry.md, the hierarchy is:\n\nFormula (YAML DSL) --cook--\u003e Proto (template beads) --pour/wisp--\u003e Mol/Wisp (actual work)\n\n- **Formulas**: YAML files in .beads/formulas/ that define workflow structure\n- **Cook**: Transform formula to proto beads (with [template] label)\n- **Proto**: Template beads with child beads for each step\n- **Pour**: Proto to Mol (persistent liquid)\n- **Wisp**: Proto to Wisp (ephemeral vapor)\n\n## What's Wrong\n\nWe have legacy code and docs that use the OLD model where molecules are defined\nin Go code as structs with embedded markdown, instead of as formula YAML cooked into proto beads.\n\n---\n\n## MAJOR ISSUES\n\n### 1. Go Code: molecules_*.go (DELETE OR REWRITE)\n\n**Files:**\n- internal/beads/builtin_molecules.go\n- internal/beads/molecules_patrol.go (19KB!)\n- internal/beads/molecules_session.go (13KB)\n- internal/beads/molecules_work.go (13KB)\n\n**Problem:** Defines molecules as Go structs with embedded markdown descriptions.\nThis is completely wrong. Molecules should come from:\n1. Formula YAML files cooked into proto beads\n2. Proto beads with child beads for each step\n\n**Used by:**\n- install.go:seedBuiltinMolecules() - seeds during install\n- catalog.go - includes in catalog\n- molecule_list.go:ExportBuiltinMolecules()\n\n**Fix:** Delete these files. Create formula YAML files instead. Update seeding\nto cook formulas into protos.\n\n### 2. Role Templates: Steps Listed in Markdown (REWRITE)\n\n**Files:**\n- prompts/roles/deacon.md - Lists 7 patrol steps\n- prompts/roles/refinery.md - Lists 10 patrol steps\n- prompts/roles/witness.md - Procedural \"Heartbeat Protocol\" (not molecule-based!)\n- prompts/roles/polecat.md - References molecule steps\n\n**Problem:** The steps are listed in the markdown template, but they should be\ndiscovered from the proto beads via bd mol current. Templates should say\n\"check your hook, run your patrol\" without duplicating step definitions.\n\n**Fix:** Templates should describe HOW to run a patrol (propulsion principle,\nbd mol current, bd close --continue), not WHAT the steps are.\n\n### 3. Protos are Empty/Missing (CREATE)\n\n**Current state:**\n- gt-iep9 (mol-deacon-patrol) - Only 2 child steps\n- gt-qflq (mol-witness-patrol) - EMPTY (0 children)\n- mol-refinery-patrol - DOESN'T EXIST\n\n**Fix:** Create formula YAML files for each patrol, cook them into proper protos\nwith child beads for each step.\n\n---\n\n## MEDIUM ISSUES\n\n### 4. Command Confusion: spawn vs pour/wisp\n\n**Files:**\n- prompts/roles/deacon.md - Uses bd mol spawn ... --assignee=deacon\n- prompts/roles/refinery.md - Uses bd mol spawn ... --wisp\n- Various docs\n\n**Problem:** The correct commands per molecular-chemistry.md:\n- bd pour proto - Create persistent mol (liquid)\n- bd wisp proto - Create ephemeral wisp (vapor)\n- bd mol spawn - Old/deprecated\n\n**Fix:** Update all references to use bd wisp for patrols, bd pour for\npersistent work.\n\n### 5. Command Confusion: gt mol vs bd mol\n\n**Files:** Various docs and templates\n\n**Problem:** Some use gt mol status, others bd mol status. Need to clarify:\n- gt mol = Gas Town wrapper (if it exists, what does it add?)\n- bd mol = Beads molecule commands (authoritative)\n\n**Fix:** Audit and standardize. If gt mol is a thin wrapper, remove it.\n\n### 6. polecat-wisp-architecture.md (UPDATE OR DELETE)\n\n**File:** docs/polecat-wisp-architecture.md\n\n**Problem:** References adding molecules to builtin_molecules.go - old model.\n\n### 7. manager.go (UPDATE)\n\n**File:** internal/rig/manager.go\n\n**Problem:** Comment references \"subset of builtin_molecules.go for seeding\"\n\n---\n\n## IMPLEMENTATION ORDER\n\n1. **Create formula YAML files** for the 3 patrol molecules:\n - mol-deacon-patrol.formula.yaml\n - mol-witness-patrol.formula.yaml\n - mol-refinery-patrol.formula.yaml\n\n2. **Implement bd cook** if not already working to create protos from formulas\n\n3. **Cook the formulas** into proto beads with proper child beads\n\n4. **Update role templates** to use propulsion principle without step lists\n\n5. **Delete molecules_*.go** once formulas are working\n\n6. **Update install.go** to cook formulas instead of seeding Go structs\n\n7. **Audit gt mol vs bd mol** and standardize","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-24T12:37:49.601971-08:00","updated_at":"2025-12-27T21:29:52.713364-08:00","dependencies":[{"issue_id":"gt-ingm","depends_on_id":"gt-6y8u","type":"relates-to","created_at":"2025-12-24T13:25:18.856554-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.713364-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-ingm.1","title":"Create patrol formula YAML files","description":"Create formula YAML files for:\n- mol-deacon-patrol.formula.yaml\n- mol-witness-patrol.formula.yaml \n- mol-refinery-patrol.formula.yaml\n\nUse the step definitions currently in molecules_patrol.go as reference.\nStore in .beads/formulas/","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-24T12:38:10.472804-08:00","updated_at":"2025-12-27T21:29:52.705079-08:00","deleted_at":"2025-12-27T21:29:52.705079-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ingm.2","title":"Cook patrol formulas into proto beads","description":"Use bd cook to transform formula YAML into proto beads with proper child beads for each step.\n\nVerify with bd mol show that protos have all children.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-24T12:38:11.762962-08:00","updated_at":"2025-12-27T21:29:52.696791-08:00","dependencies":[{"issue_id":"gt-ingm.2","depends_on_id":"gt-ingm.1","type":"blocks","created_at":"2025-12-24T12:38:23.880593-08:00","created_by":"daemon"},{"issue_id":"gt-ingm.2","depends_on_id":"gt-8tmz.13","type":"blocks","created_at":"2025-12-24T13:01:04.581551-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.696791-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ingm.3","title":"Delete molecules_*.go files","description":"Remove legacy Go-defined molecules:\n- internal/beads/builtin_molecules.go\n- internal/beads/molecules_patrol.go\n- internal/beads/molecules_session.go\n- internal/beads/molecules_work.go\n\nUpdate install.go to cook formulas instead of seeding Go structs.\nUpdate catalog.go if needed.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-24T12:38:13.028833-08:00","updated_at":"2025-12-27T21:29:52.688518-08:00","dependencies":[{"issue_id":"gt-ingm.3","depends_on_id":"gt-ingm.2","type":"blocks","created_at":"2025-12-24T12:38:23.964643-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.688518-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ingm.4","title":"Rewrite role templates without step lists","description":"Update:\n- prompts/roles/deacon.md\n- prompts/roles/refinery.md\n- prompts/roles/witness.md\n\nTemplates should describe HOW to run patrols (propulsion principle, bd mol current, bd close --continue) not WHAT the steps are. Steps come from proto beads.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-24T12:38:14.388135-08:00","updated_at":"2025-12-27T21:29:52.68022-08:00","dependencies":[{"issue_id":"gt-ingm.4","depends_on_id":"gt-ingm.2","type":"blocks","created_at":"2025-12-24T12:38:24.049212-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.68022-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ingm.5","title":"Standardize bd mol vs gt mol commands","description":"Audit all docs and templates for command usage.\n\nClarify:\n- gt mol = Gas Town wrapper (what does it add?)\n- bd mol = Beads molecule commands (authoritative)\n\nIf gt mol is thin wrapper, consider removing.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T12:38:15.50826-08:00","updated_at":"2025-12-27T21:29:55.587235-08:00","deleted_at":"2025-12-27T21:29:55.587235-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ingm.6","title":"Remove spawn - use pour/wisp only (see bd-8y9t)","description":"Remove bd mol spawn entirely from vocabulary.\n\nReplace all references with:\n- bd pour \u003cproto\u003e - Create persistent mol (liquid)\n- bd wisp \u003cproto\u003e - Create ephemeral wisp (vapor)\n\n'spawn' doesn't fit the chemistry metaphor. Two phase transitions (pour/wisp) are clearer than one command with flags.\n\nSee bd-XXX for Beads-side removal.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T12:38:16.595926-08:00","updated_at":"2025-12-27T21:29:55.578757-08:00","deleted_at":"2025-12-27T21:29:55.578757-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ioglq","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 11: nominal (6 steps)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:45:12.015686-08:00","updated_at":"2025-12-27T21:26:01.386162-08:00","deleted_at":"2025-12-27T21:26:01.386162-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ioij","title":"mol-town-shutdown: Full Gas Town reset molecule","description":"## Summary\nMolecule for clean town shutdown/restart. Sling it to Mayor when ready to reboot.\n\n## Steps\n1. Clear all inboxes (witness, refinery, crew - all rigs)\n2. Unhook all agents (remove stale molecule attachments)\n3. Kill any active polecats\n4. Stop daemon\n5. Rotate/archive logs\n6. bd sync + git push\n7. Send Mayor handoff (what's next)\n8. Restart daemon fresh\n\n## Invocation\n```\ngt hook mol-town-shutdown # attach it\n# Mayor runs it on next session\n```\n\n## Related\n- Patrol cleanup (continuous) vs shutdown (nuclear)\n- See gt-xxx for patrol hygiene","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-24T17:25:45.295151-08:00","updated_at":"2025-12-27T21:29:52.580852-08:00","deleted_at":"2025-12-27T21:29:52.580852-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-iq486","title":"Digest: mol-deacon-patrol","description":"Patrol 17: all clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T11:05:59.619474-08:00","updated_at":"2025-12-27T21:26:03.364297-08:00","deleted_at":"2025-12-27T21:26:03.364297-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-iqlfe","title":"Digest: mol-deacon-patrol","description":"Patrol 10: All clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T16:03:57.38308-08:00","updated_at":"2025-12-27T21:26:03.099958-08:00","deleted_at":"2025-12-27T21:26:03.099958-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ir74","title":"mol-polecat-work","description":"Full polecat lifecycle from assignment to decommission.\n\nThis proto enables nondeterministic idempotence for polecat work.\nA polecat that crashes after any step can restart, read its molecule state,\nand continue from the last completed step. No work is lost.\n\nVariables:\n- gt-1wmw - The source issue ID being worked on","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-24T22:55:01.794859-08:00","updated_at":"2025-12-25T15:52:43.900154-08:00","deleted_at":"2025-12-25T15:52:43.900154-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"epic"}
{"id":"gt-isje","title":"Implement mol bond command for dynamic child molecules","description":"Add 'gt mol bond' command that creates child molecules at runtime with variable substitution.\n\nUsage from mol-witness-patrol survey-workers step:\n```bash\nbd mol bond mol-polecat-arm $PATROL_WISP_ID \\\n --ref arm-$polecat \\\n --var polecat_name=$polecat \\\n --var rig=\u003crig\u003e\n```\n\nThis creates child wisps like patrol-x7k.arm-ace with variables expanded.\n\nImplementation:\n1. Add 'bond' subcommand to mol command\n2. Accept: proto ID, parent ID, --ref for child suffix, --var key=value pairs\n3. Call InstantiateMolecule with Context map populated from --var flags\n4. Return created child ID\n\nCritical for Christmas Ornament pattern - without this, Witness cannot spawn per-polecat inspection arms.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T21:35:56.192637-08:00","updated_at":"2025-12-27T21:29:52.828799-08:00","deleted_at":"2025-12-27T21:29:52.828799-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-it0e","title":"Merge: gt-oiv0","description":"branch: polecat/furiosa\ntarget: main\nsource_issue: gt-oiv0\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T22:09:36.611121-08:00","updated_at":"2025-12-27T21:27:22.551542-08:00","deleted_at":"2025-12-27T21:27:22.551542-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-itsed","title":"Digest: mol-deacon-patrol","description":"Patrol 10: all healthy, doctor pass","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:33:16.71908-08:00","updated_at":"2025-12-27T21:26:00.776564-08:00","deleted_at":"2025-12-27T21:26:00.776564-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-iu23","title":"Polecat doesn't auto-start after spawn inject - requires manual nudge","description":"## Problem\n\nAfter `gt spawn --issue \u003cid\u003e --create`, the polecat session shows Claude Code started with the injected prompt, but Claude doesn't begin processing. The prompt just sits there until manually nudged.\n\n## Evidence\n\n```\n$ gt spawn --issue gt-rixa --rig gastown --create\n...\n✓ Session started. Attach with: gt session at gastown/furiosa\n Polecat nudged to start working\n Witness notified to monitor startup\n```\n\nSession shows:\n```\n\u003e You have a work assignment. Run 'gt mail inbox' to see it, then start\n working on issue gt-rixa.\n\n ⏵⏵ bypass permissions on (shift+tab to cycle)\n```\n\nBut Claude doesn't respond. Manual nudge required:\n```\n$ gt nudge gt-gastown-furiosa \"Please start working...\"\n```\n\nAfter nudge, polecat immediately starts working correctly.\n\n## Hypothesis\n\nThe spawn inject happens before Claude Code is fully initialized. The text arrives in the input buffer, but Claude hasn't started listening yet. By the time Claude starts, the input has already been 'consumed' as initial prompt text but not submitted.\n\n## Resolution Plan\n\nThis will be solved by the **polecat molecule workflow** (mol-polecat-work), which provides structured lifecycle management. The molecule approach handles startup, work, and shutdown as discrete steps with proper state tracking.\n\n**Blocked on**: beads/crew/dave completing ephemeral molecules (bd mol bond, ephemeral beads repo).\n\n## Workaround\n\nFor now, use `gt nudge` if a polecat doesn't start within ~30 seconds of spawn.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-21T14:06:27.375686-08:00","updated_at":"2025-12-27T21:29:53.535065-08:00","deleted_at":"2025-12-27T21:29:53.535065-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-iua8","title":"Merge: gt-frs","description":"branch: polecat/Slit\ntarget: main\nsource_issue: gt-frs\nrig: gastown","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-19T16:30:05.529099-08:00","updated_at":"2025-12-27T21:27:22.947103-08:00","deleted_at":"2025-12-27T21:27:22.947103-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-ivtk7","title":"Digest: mol-deacon-patrol","description":"Patrol 20: healthy - handoff cycle","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-27T20:25:40.059953-08:00","updated_at":"2025-12-27T21:25:59.977735-08:00","deleted_at":"2025-12-27T21:25:59.977735-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-iwnr8","title":"Digest: mol-deacon-patrol","description":"P14: stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:59:22.287023-08:00","updated_at":"2025-12-27T21:26:02.360251-08:00","deleted_at":"2025-12-27T21:26:02.360251-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-j1i0r","title":"BUG: Daemon doesn't kill zombie tmux sessions before recreating","description":"## Problem\n\nWhen the daemon detects an agent isn't running (per bead state), it tries to create a new session. But if a zombie tmux session exists (session alive, Claude dead), the create fails:\n\n```\n2026/01/02 18:18:20 Witness for gastown not running per agent bead, starting...\n2026/01/02 18:18:20 Error creating witness session for gastown: session already exists\n```\n\n## Root Cause\n\n`gt witness start` (and similar) fail if tmux session exists, even if it's a zombie. The daemon should:\n1. Detect if existing session is responsive\n2. Kill zombie if unresponsive\n3. Create fresh session\n\n## Current Behavior\n\n```go\n// In ensureWitnessRunning or similar:\nif err := tmux.NewSession(sessionName, dir); err != nil {\n // Fails with 'session already exists'\n return err\n}\n```\n\n## Expected Behavior\n\n```go\n// Pseudo-code for 'ensure running' semantics:\nif sessionExists(name) {\n if !isResponsive(name) {\n killSession(name) // Kill zombie\n } else {\n return // Already running and healthy\n }\n}\ncreateSession(name) // Create fresh\n```\n\n## Impact\n\n- Dead witness/refinery stay dead indefinitely\n- Daemon logs fill with 'session already exists' errors\n- Town health degrades with no auto-recovery\n\n## Solution\n\nAdd `ensureSessionFresh()` helper that:\n1. Checks if session exists\n2. If exists, checks if Claude is responsive (look for prompt, or check heartbeat)\n3. Kills zombie sessions\n4. Creates new session only if needed","status":"closed","priority":1,"issue_type":"bug","created_at":"2026-01-02T18:42:20.732718-08:00","updated_at":"2026-01-02T18:52:13.100083-08:00","closed_at":"2026-01-02T18:52:13.100083-08:00","close_reason":"Fixed by adding EnsureSessionFresh() helper that kills zombie sessions before recreating. Commit: 3ef732bd","created_by":"mayor"}
{"id":"gt-j39xc","title":"Merge: capable-mjw47ef9","description":"branch: polecat/capable-mjw47ef9\ntarget: main\nsource_issue: capable-mjw47ef9\nrig: gastown\nagent_bead: gt-gastown-polecat-capable","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T17:34:31.613661-08:00","updated_at":"2026-01-01T19:55:59.959524-08:00","closed_at":"2026-01-01T19:55:59.959524-08:00","close_reason":"Stale MR - branch no longer exists","created_by":"gastown/polecats/capable"}
{"id":"gt-j4nu","title":"Merge: gt-g44u.3","description":"branch: polecat/Ace\ntarget: main\nsource_issue: gt-g44u.3\nrig: gastown","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-19T16:14:52.767156-08:00","updated_at":"2025-12-27T21:27:22.368034-08:00","deleted_at":"2025-12-27T21:27:22.368034-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-j5tk","title":"Work assignment messages should auto-close on completion","description":"When a polecat completes work on an issue, the work assignment message (msg-type:task) stays open. Found 7 stale work assignments in gastown after swarm completed.\n\nProposal: When bd close is called on an issue, auto-close any work assignment messages that reference that issue in their body.\n\nAlternative: Work assignment messages could use a different lifecycle - perhaps they should be acked (closed) when the polecat starts working, not when they finish.","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-20T03:12:28.403974-08:00","updated_at":"2025-12-27T21:29:56.847664-08:00","deleted_at":"2025-12-27T21:29:56.847664-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-j6s8","title":"Refinery startup: bond mol-refinery-patrol on start","description":"Wire up Refinery to automatically bond its patrol molecule on startup.\n\n## Current state\n- mol-refinery-patrol exists in builtin_molecules.go\n- prompts/roles/refinery.md describes the protocol\n- Refinery doesn't auto-bond on startup\n\n## Desired behavior\nOn Refinery session start:\n1. gt prime detects RoleRefinery\n2. Check for existing in-progress patrol: bd list --status=in_progress --assignee=refinery\n3. If found: resume from current step\n4. If not found: bd mol bond mol-refinery-patrol --wisp\n5. Output patrol context to agent\n\n## Implementation options\nA) Add to gt prime (outputRefineryPatrolContext)\nB) Add startup hook in refinery CLAUDE.md\nC) Both (prime detects, template reinforces)\n\n## Testing\n- Start refinery session\n- Verify patrol bonds automatically\n- Kill mid-patrol, restart, verify resumes\n\n## Depends on\n- gt-3x0z.10 (existing issue for Refinery patrol)","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T16:43:34.739741-08:00","updated_at":"2025-12-27T21:25:59.93635-08:00","deleted_at":"2025-12-27T21:25:59.93635-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-j755","title":"mol-polecat-arm: Add nudge_text variable definition","description":"The mol-polecat-arm formula references `{{nudge_text}}` in the execute step but doesn't define it in the variables section.\n\n## Current variables section\n\n```yaml\nvariables:\n - name: polecat_name\n required: true\n - name: rig\n required: true\n```\n\n## Missing\n\n```yaml\n - name: nudge_text\n required: false\n default: \"How's progress? Need any help?\"\n description: Text to send when nudging the polecat\n```\n\n## Usage in formula\n\n```bash\ntmux send-keys -t gt-{{rig}}-{{polecat_name}} \"{{nudge_text}}\" Enter\n```\n\n## Fix\n\nAdd nudge_text to the variables section with a sensible default.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-24T13:51:29.644259-08:00","updated_at":"2025-12-27T21:29:55.501829-08:00","deleted_at":"2025-12-27T21:29:55.501829-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-j87","title":"Design: Work flow simulation and validation","description":"Validate GGT designs through simulation before implementation.\n\n## Validation Approaches\n\n### 1. Dry-Run Simulation (Recommended First)\nMayor walks through scenarios mentally/on paper:\n- \"If polecat Toast signals done with dirty git state, what happens?\"\n- \"If Witness context fills mid-verification, what state is lost?\"\n- \"If two polecats try to close same issue, what happens?\"\n\nCreate beads for any gaps discovered.\n\n### 2. Real Work in gastown-py\nUse Python Gas Town to stress-test assumptions:\n- Run actual batch work on test repos\n- Observe edge cases in practice\n- Document issues found\n\n### 3. Edge Case Analysis\nSystematic review of failure modes:\n- Agent crashes mid-operation\n- Network failures during sync\n- Concurrent access to shared state\n- Context limits hit at bad times\n\n## Key Scenarios to Validate\n\n- [ ] Witness session cycling (state preservation)\n- [ ] Polecat decommission with dirty state\n- [ ] Merge conflicts in queue\n- [ ] Beads sync conflicts between workers\n- [ ] Escalation path (stuck worker -\u003e Mayor)\n- [ ] Cross-rig communication\n- [ ] Federation mail routing (future)\n\n## Success Criteria\n\n- No data loss scenarios identified\n- Clear recovery paths for all failure modes\n- Edge cases either handled or documented as limitations\n- Design improves as model cognition improves\n\n## Output\n\nFor each scenario validated:\n1. Document in relevant bead if issue found\n2. Create new beads for missing functionality\n3. Update architecture.md if design changes","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-15T20:24:11.251841-08:00","updated_at":"2025-12-27T21:29:54.529687-08:00","deleted_at":"2025-12-27T21:29:54.529687-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-j8roh","title":"Digest: mol-deacon-patrol","description":"Patrol 1: All healthy - Mayor OK, 2 witnesses OK, 2 refineries OK, 1 polecat active (furiosa). No lifecycle requests or orphans.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:27:03.39472-08:00","updated_at":"2025-12-27T21:26:02.63434-08:00","deleted_at":"2025-12-27T21:26:02.63434-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-jann","title":"Digest: mol-deacon-patrol","description":"Patrol OK: no mail, 8 polecats working, all witnesses/refineries up","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-23T00:46:56.992828-08:00","updated_at":"2025-12-27T21:26:05.405537-08:00","deleted_at":"2025-12-27T21:26:05.405537-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-janx","title":"Digest: mol-deacon-patrol","description":"Patrol 7","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:08:30.225414-08:00","updated_at":"2025-12-27T21:26:04.457935-08:00","deleted_at":"2025-12-27T21:26:04.457935-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-jemnt","title":"Check own context limit","description":"Check own context limit.\n\nThe Deacon runs in a Claude session with finite context.\nCheck if approaching the limit:\n\n```bash\ngt context --usage\n```\n\nIf context is high (\u003e80%), prepare for handoff:\n- Summarize current state\n- Note any pending work\n- Write handoff to molecule state\n\nThis enables the Deacon to burn and respawn cleanly.\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T02:11:33.775076-08:00","updated_at":"2025-12-27T21:29:55.308984-08:00","dependencies":[{"issue_id":"gt-jemnt","depends_on_id":"gt-ezg69","type":"blocks","created_at":"2025-12-25T02:11:33.791943-08:00","created_by":"stevey"}],"deleted_at":"2025-12-27T21:29:55.308984-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-jf6ls","title":"Merge: warboy-mjz94u1q","description":"branch: polecat/warboy-mjz94u1q\ntarget: main\nsource_issue: warboy-mjz94u1q\nrig: gastown\nagent_bead: gt-gastown-polecat-warboy\nretry_count: 0\nlast_conflict_sha: null\nconflict_task_id: null","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-04T10:35:04.769194-08:00","updated_at":"2026-01-04T10:39:31.840052-08:00","closed_at":"2026-01-04T10:39:31.840052-08:00","close_reason":"Already merged to main at 36301adf","created_by":"gastown/polecats/warboy"}
{"id":"gt-jgdx","title":"Digest: mol-deacon-patrol","description":"Test patrol cycle - first run, no actual work done","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T02:07:03.388821-08:00","updated_at":"2025-12-27T21:26:05.355876-08:00","deleted_at":"2025-12-27T21:26:05.355876-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-jgz7h","title":"Digest: mol-deacon-patrol","description":"Patrol 4: Core healthy, no changes","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:29:29.948113-08:00","updated_at":"2025-12-27T21:26:02.609998-08:00","deleted_at":"2025-12-27T21:26:02.609998-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-jj8q","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:42","description":"Patrol 2: All healthy, no actions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:42:36.066506-08:00","updated_at":"2025-12-27T21:26:05.171146-08:00","deleted_at":"2025-12-27T21:26:05.171146-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-jj9tz","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 19: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:31:07.394419-08:00","updated_at":"2025-12-27T21:26:01.812315-08:00","deleted_at":"2025-12-27T21:26:01.812315-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-jjba6","title":"Digest: mol-deacon-patrol","description":"Patrol complete: routine cycle","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T17:39:59.005662-08:00","updated_at":"2025-12-27T21:26:01.168229-08:00","deleted_at":"2025-12-27T21:26:01.168229-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-jju6v","title":"Digest: mol-deacon-patrol","description":"Patrol 5","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T08:49:46.93586-08:00","updated_at":"2026-01-01T08:49:46.93586-08:00","closed_at":"2026-01-01T08:49:46.935827-08:00"}
{"id":"gt-jjyr","title":"Merge: gt-vmk7","description":"branch: polecat/nux\ntarget: main\nsource_issue: gt-vmk7\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T19:36:30.232359-08:00","updated_at":"2025-12-27T21:27:22.418417-08:00","deleted_at":"2025-12-27T21:27:22.418417-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-jlxbs","title":"Digest: mol-deacon-patrol","description":"Patrol 6: 9 sessions healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T07:26:12.770442-08:00","updated_at":"2025-12-27T21:26:03.643659-08:00","deleted_at":"2025-12-27T21:26:03.643659-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-jnr9i","title":"Digest: mol-deacon-patrol","description":"Patrol 10: All healthy, 1 in_progress issue (max working)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T17:45:08.336319-08:00","updated_at":"2025-12-27T21:26:01.151072-08:00","deleted_at":"2025-12-27T21:26:01.151072-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-jo9n","title":"OSS Launch Readiness","description":"Track all issues blocking or affecting OSS launch quality.\n\nP0 Blocker:\n- gt-ofl2: ProcessMRFromQueue not implemented\n\nP1 Must Fix:\n- gt-vzic: README missing prerequisites (tmux)\n- gt-xbfw: Missing OSS files (CONTRIBUTING, etc.)\n- gt-wexr: Polecat role swarm terminology\n- gt-6n13: Competing molecule mechanisms\n- gt-3abj: go install may fail\n\nP2 Should Fix:\n- gt-qj12: Obsolete beads cleanup\n- gt-zn9m: Error suppression patterns\n- gt-9uxr: Test coverage gaps\n- gt-fm75: os.Exit in library code\n- gt-yewf: Mismatched startup protocols\n- gt-1z4m: Undocumented gt swarm\n- gt-bho9: stderr suppression\n- gt-rxsh: Merge model confusion\n- gt-5ipl: Witness role commands\n\nP3 Nice to Have:\n- gt-t5mz: Hardcoded values","status":"tombstone","priority":0,"issue_type":"epic","created_at":"2025-12-24T12:51:56.676473-08:00","updated_at":"2025-12-27T21:29:45.439573-08:00","deleted_at":"2025-12-27T21:29:45.439573-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-jongy","title":"Digest: mol-deacon-patrol","description":"Patrol 16: all clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T11:05:57.565433-08:00","updated_at":"2025-12-27T21:26:03.372468-08:00","deleted_at":"2025-12-27T21:26:03.372468-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-jpfh5","title":"Digest: mol-deacon-patrol","description":"Patrol complete: inbox empty, all agents healthy (mayor, 2 witnesses, 2 refineries), no orphans","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T10:59:07.847042-08:00","updated_at":"2025-12-27T21:26:03.445942-08:00","deleted_at":"2025-12-27T21:26:03.445942-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-jpn0s","title":"Digest: mol-deacon-patrol","description":"Patrol 11: All healthy, no messages","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T18:08:58.603203-08:00","updated_at":"2025-12-27T21:26:02.854757-08:00","deleted_at":"2025-12-27T21:26:02.854757-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-jpt","title":"Town-level beads: Real DB for coordination mail","description":"Implement Option A from mail redesign: Town gets real beads DB for coordination.\n\n## Background\n\nMail is now Beads. But currently:\n- Town .beads/redirect points to rig beads\n- mayor/mail/ has legacy JSONL files\n- Cross-rig coordination has no clear home\n\n## Design\n\nTown beads = coordination, cross-rig mail, mayor inbox, handoffs\nRig beads = project issues, work items\n\nMatches HOP hierarchy: platform \u003e project \u003e worker\n\n## Structure\n\n~/gt/\n .beads/ # REAL beads DB (prefix: gm-)\n mayor/\n town.json\n state.json # NO mail/ directory\n gastown/\n .beads/ # Rig beads (prefix: ga-)\n\n## Tasks\n\n1. Delete ~/gt/.beads/redirect\n2. Run bd init --prefix gm at ~/gt/ (town beads)\n3. Delete ~/gt/mayor/mail/ directory\n4. Update gt mail to use beads not JSONL\n5. Add mail fields (thread_id, reply_to, msg_type)\n6. Update gt prime for two-tier model\n7. Update docs/architecture.md\n\n## Addressing\n\n- mayor/ -\u003e town beads\n- rig/agent -\u003e rig beads\n- Cross-rig -\u003e town beads","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-17T19:09:55.855955-08:00","updated_at":"2025-12-27T21:29:54.23675-08:00","deleted_at":"2025-12-27T21:29:54.23675-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-jq0f","title":"Merge: gt-yd98","description":"branch: polecat/valkyrie\ntarget: main\nsource_issue: gt-yd98\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T00:23:57.718716-08:00","updated_at":"2025-12-27T21:27:22.460136-08:00","deleted_at":"2025-12-27T21:27:22.460136-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-jqpm","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:23","description":"Patrol 6: quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:23:13.766787-08:00","updated_at":"2025-12-27T21:26:05.297252-08:00","deleted_at":"2025-12-27T21:26:05.297252-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-js27","title":"Digest: mol-deacon-patrol","description":"Patrol #10: Halfway checkpoint","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:23:51.816965-08:00","updated_at":"2025-12-27T21:26:04.750688-08:00","deleted_at":"2025-12-27T21:26:04.750688-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-jsbvm","title":"Digest: mol-deacon-patrol","description":"Patrol 17: healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-27T20:24:35.506429-08:00","updated_at":"2025-12-27T21:26:00.027249-08:00","deleted_at":"2025-12-27T21:26:00.027249-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-jsup","title":"gt sling: patrol spawning must use wisp storage, not main DB","description":"**P0 Launch Blocker**\n\ngt sling to patrol roles (witness/refinery/deacon) currently spawns patrols into the main beads database instead of wisp storage. This will pollute the permanent database with ephemeral patrol cycles.\n\n**Root cause:**\nspawnMoleculeFromProto() with IsWisp=true uses `--db .beads-wisp/beads.db`, but the patrol TEMPLATE (e.g., gt-qp2w for mol-witness-patrol) only exists in the main database. So bd mol run fails with 'no issue found'.\n\n**The fix:**\nbd mol run needs to support reading the template from the main DB while writing the spawned instance to wisp storage. This is a cross-database spawn operation.\n\nOptions:\n1. Add `--template-db` flag to bd mol run (read template from here, write instance to --db)\n2. Copy template to wisp DB before spawning\n3. Make bd mol run automatically check main DB for templates when using wisp storage\n\n**Workaround applied (MUST REVERT):**\nSet IsWisp=false in slingToWitness/slingToRefinery/slingToDeacon patrol spawning.\n\n**Files:**\n- internal/cmd/sling.go: lines 680, 883, 1050 (IsWisp: false)\n- Needs fix in bd mol run (beads repo)","status":"tombstone","priority":0,"issue_type":"bug","created_at":"2025-12-23T16:30:56.128658-08:00","updated_at":"2025-12-27T21:29:45.469219-08:00","deleted_at":"2025-12-27T21:29:45.469219-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-jtey3","title":"Digest: mol-deacon-patrol","description":"P20 - handoff","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:27:42.306007-08:00","updated_at":"2025-12-27T21:26:01.577074-08:00","deleted_at":"2025-12-27T21:26:01.577074-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-jtp71","title":"Digest: mol-deacon-patrol","description":"Patrol 9: all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T21:56:37.563658-08:00","updated_at":"2025-12-27T21:26:00.54203-08:00","deleted_at":"2025-12-27T21:26:00.54203-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ju1s7","title":"Digest: mol-deacon-patrol","description":"Patrol 11: all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:36:50.154198-08:00","updated_at":"2025-12-27T21:26:00.35684-08:00","deleted_at":"2025-12-27T21:26:00.35684-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-jv8mb","title":"Digest: mol-deacon-patrol","description":"Patrol 14: clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:16:28.809707-08:00","updated_at":"2025-12-27T21:26:00.988852-08:00","deleted_at":"2025-12-27T21:26:00.988852-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-jvr3","title":"mol-polecat-work","description":"Full polecat lifecycle from assignment to decommission.\n\nThis proto enables nondeterministic idempotence for polecat work.\nA polecat that crashes after any step can restart, read its molecule state,\nand continue from the last completed step. No work is lost.\n\nVariables:\n- gt-test - The source issue ID being worked on","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-21T22:04:43.420231-08:00","updated_at":"2025-12-25T15:52:43.900154-08:00","deleted_at":"2025-12-25T15:52:43.900154-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"epic"}
{"id":"gt-jx2ov","title":"Digest: mol-deacon-patrol","description":"Patrol 9: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T08:11:54.653603-08:00","updated_at":"2026-01-01T08:11:54.653603-08:00","closed_at":"2026-01-01T08:11:54.653567-08:00"}
{"id":"gt-jyzrs","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 12: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-28T11:22:35.34827-08:00","updated_at":"2025-12-28T11:22:35.34827-08:00","closed_at":"2025-12-28T11:22:35.348237-08:00"}
{"id":"gt-jz82w","title":"Digest: mol-deacon-patrol","description":"Patrol 14: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:35:01.073875-08:00","updated_at":"2025-12-27T21:26:00.743535-08:00","deleted_at":"2025-12-27T21:26:00.743535-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-jzot","title":"gt done: Notify Witness with exit outcome","description":"When polecat runs gt done, it should send mail to Witness:\n\n```\ngt mail send \u003crig\u003e/witness -s 'POLECAT_DONE \u003cname\u003e' -m 'Exit: COMPLETED\nIssue: \u003cissue-id\u003e\nMR: \u003cmr-id\u003e\nBranch: \u003cbranch\u003e'\n```\n\nExit types:\n- COMPLETED: Work done, MR submitted\n- ESCALATED: Hit blocker, needs human\n- DEFERRED: Work paused, issue still open\n\nThis enables Witness patrol to:\n1. See completion in inbox-check step\n2. Verify git state is clean\n3. Kill session and prune worktree\n4. Close the polecat lease in its patrol wisp\n\nPaired with gt-r6td (spawn notification) - together they bracket polecat lifecycle.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T22:31:31.266716-08:00","updated_at":"2025-12-27T21:29:53.082733-08:00","deleted_at":"2025-12-27T21:29:53.082733-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-k08o","title":"test pin fix","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T12:14:47.293815-08:00","updated_at":"2025-12-27T21:29:56.018698-08:00","deleted_at":"2025-12-27T21:29:56.018698-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-k0vz8","title":"Digest: mol-deacon-patrol","description":"Patrol 16: all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:39:47.367126-08:00","updated_at":"2025-12-27T21:26:00.312221-08:00","deleted_at":"2025-12-27T21:26:00.312221-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-k1lr","title":"Consolidate configuration architecture: three-tier model","description":"Rationalize Gas Town's scattered configuration into a clean three-tier model:\n\n## Current Problems\n- Config split across mayor/, .gastown/, and rig/config.json\n- Runtime state mixed with configuration\n- Hidden directories (.gastown/) are missed by agents\n- Category confusion: identity vs config vs runtime state\n\n## Proposed Architecture\n\n### Tier 1: Town Config (mayor/)\n```\nmayor/\n├── town.json # Town identity (unchanged)\n├── rigs.json # Rig registry (unchanged)\n└── config.json # NEW: Town-level configuration\n # theme defaults, daemon settings, etc.\n```\n\n### Tier 2: Town Runtime (.runtime/)\n```\n~/gt/.runtime/ # NEW: Gitignored\n├── daemon.json # {pid, started_at, heartbeat}\n├── deacon.json # {cycle, last_action}\n└── agent-requests.json # Lifecycle requests\n```\n\n### Tier 3: Rig Config (settings/)\n```\n\u003crig\u003e/\n├── config.json # Rig identity only (type, git_url, beads.prefix)\n├── settings/ # NEW: Visible, git-tracked\n│ ├── config.json # theme, merge_queue, max_workers\n│ ├── namepool.json # pool settings (style, max)\n│ └── roles/ # Per-role overrides (optional)\n└── .runtime/ # NEW: Gitignored\n ├── witness.json # {state, started_at, stats}\n ├── refinery.json # {state, started_at, stats}\n └── namepool-state.json # {in_use, overflow_next}\n```\n\n## Key Principles\n1. **Visible \u003e Hidden** for config agents need to find\n2. **Git-tracked** for identity and config; **gitignored** for runtime\n3. **Separation of concerns**: identity, config, runtime are distinct\n4. **Locality**: town config at town root, rig config at rig root\n\n## Migration\n- Move .gastown/ contents to appropriate new locations\n- Update all code that reads/writes these files\n- Update .gitignore patterns\n- Update documentation (architecture.md, CLAUDE.md)\n","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-22T01:01:48.96788-08:00","updated_at":"2025-12-27T21:29:53.340595-08:00","deleted_at":"2025-12-27T21:29:53.340595-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-k1lr.1","title":"Add mayor/config.json for town-level configuration","description":"Create new town-level config file:\n- Add TownConfig.Config field to types.go\n- Create mayor/config.json with theme defaults, daemon settings\n- Update loader to read/write mayor/config.json\n- Migrate any town-level config from .gastown/ to here","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T01:02:20.091293-08:00","updated_at":"2025-12-27T21:29:53.329573-08:00","dependencies":[{"issue_id":"gt-k1lr.1","depends_on_id":"gt-k1lr","type":"parent-child","created_at":"2025-12-22T01:02:20.09177-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.329573-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-k1lr.2","title":"Create .runtime/ for town-level runtime state","description":"Move ephemeral town state to .runtime/:\n- Create ~/gt/.runtime/ directory structure\n- Move daemon/state.json → .runtime/daemon.json\n- Move deacon/heartbeat.json → .runtime/deacon.json \n- Move .gastown/agent-state.json → .runtime/agent-requests.json\n- Move .gastown/keepalive.json → .runtime/keepalive.json\n- Update .gitignore to ignore .runtime/\n- Update all code references","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T01:02:21.166304-08:00","updated_at":"2025-12-27T21:29:53.321064-08:00","dependencies":[{"issue_id":"gt-k1lr.2","depends_on_id":"gt-k1lr","type":"parent-child","created_at":"2025-12-22T01:02:21.166715-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.321064-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-k1lr.3","title":"Create rig settings/ directory for rig configuration","description":"Replace .gastown/ with visible settings/ at rig level:\n- Create \u003crig\u003e/settings/ directory\n- Move .gastown/config.json → settings/config.json (theme, merge_queue, max_workers)\n- Create settings/namepool.json for pool settings (style, max - not state)\n- Create settings/roles/ for per-role overrides (optional)\n- Update RigConfig type to reference settings/\n- Update loader to read from settings/","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T01:02:22.13096-08:00","updated_at":"2025-12-27T21:29:53.312437-08:00","dependencies":[{"issue_id":"gt-k1lr.3","depends_on_id":"gt-k1lr","type":"parent-child","created_at":"2025-12-22T01:02:22.13369-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.312437-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-k1lr.4","title":"Create rig .runtime/ for rig runtime state","description":"Move ephemeral rig state to .runtime/:\n- Create \u003crig\u003e/.runtime/ directory\n- Move .gastown/witness.json → .runtime/witness.json\n- Move .gastown/refinery.json → .runtime/refinery.json\n- Move .gastown/namepool.json (state portion) → .runtime/namepool-state.json\n- Update .gitignore to ignore .runtime/\n- Update all code references","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T01:02:23.072213-08:00","updated_at":"2025-12-27T21:29:53.303865-08:00","dependencies":[{"issue_id":"gt-k1lr.4","depends_on_id":"gt-k1lr","type":"parent-child","created_at":"2025-12-22T01:02:23.073919-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.303865-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-k1lr.5","title":"Slim down rig root config.json to identity only","description":"Rig root config.json should only contain identity:\n- type, version, name, git_url, beads.prefix, created_at\n- Remove any behavioral config (move to settings/config.json)\n- Update RigConfig type to separate identity from settings\n- Ensure loader reads identity from root, settings from settings/","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T01:02:24.069337-08:00","updated_at":"2025-12-27T21:29:53.29529-08:00","dependencies":[{"issue_id":"gt-k1lr.5","depends_on_id":"gt-k1lr","type":"parent-child","created_at":"2025-12-22T01:02:24.070897-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.29529-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-k1lr.6","title":"Update gt doctor to check new config locations","description":"Doctor should validate new structure:\n- Check mayor/config.json exists and is valid\n- Check settings/ exists for each rig\n- Warn if old .gastown/ files still exist\n- Offer --fix to migrate old locations to new","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T01:02:25.886588-08:00","updated_at":"2025-12-27T21:29:53.286676-08:00","dependencies":[{"issue_id":"gt-k1lr.6","depends_on_id":"gt-k1lr","type":"parent-child","created_at":"2025-12-22T01:02:25.888093-08:00","created_by":"daemon"},{"issue_id":"gt-k1lr.6","depends_on_id":"gt-k1lr.1","type":"blocks","created_at":"2025-12-22T01:02:37.750161-08:00","created_by":"daemon"},{"issue_id":"gt-k1lr.6","depends_on_id":"gt-k1lr.3","type":"blocks","created_at":"2025-12-22T01:02:37.820429-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.286676-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-k1lr.7","title":"Update documentation for new config architecture","description":"Update docs to reflect new structure:\n- docs/architecture.md: Directory structure section\n- CLAUDE.md files: Config references\n- Add docs/configuration.md with full reference","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T01:02:27.104869-08:00","updated_at":"2025-12-27T21:29:56.412024-08:00","dependencies":[{"issue_id":"gt-k1lr.7","depends_on_id":"gt-k1lr","type":"parent-child","created_at":"2025-12-22T01:02:27.106351-08:00","created_by":"daemon"},{"issue_id":"gt-k1lr.7","depends_on_id":"gt-k1lr.1","type":"blocks","created_at":"2025-12-22T01:02:37.609785-08:00","created_by":"daemon"},{"issue_id":"gt-k1lr.7","depends_on_id":"gt-k1lr.3","type":"blocks","created_at":"2025-12-22T01:02:37.680026-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.412024-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-k1lr.8","title":"Remove .gastown/ after migration complete","description":"Final cleanup:\n- Remove .gastown/ directories at town and rig levels\n- Remove daemon/ and deacon/ directories (replaced by .runtime/)\n- Update .gitignore to remove old patterns\n- Verify all tests pass with new structure","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T01:02:28.43634-08:00","updated_at":"2025-12-27T21:29:56.403854-08:00","dependencies":[{"issue_id":"gt-k1lr.8","depends_on_id":"gt-k1lr","type":"parent-child","created_at":"2025-12-22T01:02:28.437853-08:00","created_by":"daemon"},{"issue_id":"gt-k1lr.8","depends_on_id":"gt-k1lr.7","type":"blocks","created_at":"2025-12-22T01:02:37.466318-08:00","created_by":"daemon"},{"issue_id":"gt-k1lr.8","depends_on_id":"gt-k1lr.6","type":"blocks","created_at":"2025-12-22T01:02:37.539854-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.403854-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-k2aj","title":"Digest: mol-deacon-patrol","description":"Patrol complete: 0 mail, Mayor+2 witnesses+2 refineries healthy, 3 polecats working (gastown), no orphans, gc N/A","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-22T20:40:02.581522-08:00","updated_at":"2025-12-27T21:26:05.483662-08:00","deleted_at":"2025-12-27T21:26:05.483662-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-k2ilq","title":"Digest: mol-deacon-patrol","description":"Patrol 7: All healthy, no messages","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:48:55.379716-08:00","updated_at":"2025-12-27T21:26:02.888229-08:00","deleted_at":"2025-12-27T21:26:02.888229-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-k4dxp","title":"Digest: mol-deacon-patrol","description":"Patrol 4: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:30:11.536346-08:00","updated_at":"2025-12-27T21:26:00.82962-08:00","deleted_at":"2025-12-27T21:26:00.82962-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-k6986","title":"Digest: mol-deacon-patrol","description":"Patrol 9: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-26T13:56:43.09177-08:00","updated_at":"2025-12-26T13:56:43.09177-08:00","closed_at":"2025-12-26T13:56:43.091735-08:00"}
{"id":"gt-k7l0w","title":"Digest: mol-deacon-patrol","description":"Patrol 2: Quiet cycle, all healthy, orphan gt-mol-aux persists","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:13:24.625145-08:00","updated_at":"2025-12-27T21:26:02.77341-08:00","deleted_at":"2025-12-27T21:26:02.77341-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-k7x0","title":"Merge: gt-h5n.5","description":"branch: polecat/Scabrous\ntarget: main\nsource_issue: gt-h5n.5\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T14:53:42.318338-08:00","updated_at":"2025-12-27T21:27:22.78341-08:00","deleted_at":"2025-12-27T21:27:22.78341-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-k8dnh","title":"Digest: mol-deacon-patrol","description":"Patrol 4: all clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:10:16.350897-08:00","updated_at":"2025-12-27T21:26:01.074031-08:00","deleted_at":"2025-12-27T21:26:01.074031-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-k8u5t","title":"Digest: mol-deacon-patrol","description":"Patrol 17: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T13:33:42.064804-08:00","updated_at":"2025-12-25T15:52:58.107047-08:00","deleted_at":"2025-12-25T15:52:58.107047-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-k9185","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 2: quick scan, no changes","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T11:15:18.561034-08:00","updated_at":"2025-12-27T21:26:03.32014-08:00","deleted_at":"2025-12-27T21:26:03.32014-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-k949h","title":"Digest: mol-deacon-patrol","description":"Patrol 16: clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:17:11.402724-08:00","updated_at":"2025-12-27T21:26:00.971834-08:00","deleted_at":"2025-12-27T21:26:00.971834-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-kaue","title":"Digest: mol-deacon-patrol","description":"Patrol #11: Stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:33:25.611316-08:00","updated_at":"2025-12-27T21:26:04.310029-08:00","deleted_at":"2025-12-27T21:26:04.310029-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-kbvbb","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 8: all healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T07:19:31.842546-08:00","updated_at":"2026-01-01T07:19:31.842546-08:00","closed_at":"2026-01-01T07:19:31.84251-08:00"}
{"id":"gt-kcee","title":"Witness commands: gt witness start/stop/status needed","description":"## Summary\n\nNo `gt witness` command exists. The witness should:\n- Monitor polecats for stuck/idle state\n- Nudge polecats that seem blocked\n- Report status to mayor\n- Handle polecat lifecycle\n\n## Expected Commands\n\n```bash\ngt witness start gastown # Start witness for rig\ngt witness stop gastown # Stop witness\ngt witness status # Show witness status\n```\n\n## Current State\n\n- Witness directory exists: /Users/stevey/gt/gastown/witness/\n- Has state.json but no active process\n- gt status shows 'Witnesses: 0'\n\n## Context\n\nFull polecat flow needs witness monitoring for production use.","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-18T21:55:24.079671-08:00","updated_at":"2025-12-27T21:29:54.083885-08:00","deleted_at":"2025-12-27T21:29:54.083885-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-kcy0","title":"Digest: mol-deacon-patrol","description":"Patrol 1: All agents healthy, no lifecycle requests, no orphans remediated","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:34:16.947195-08:00","updated_at":"2025-12-27T21:26:04.659108-08:00","deleted_at":"2025-12-27T21:26:04.659108-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-kdkz0","title":"Digest: mol-deacon-patrol","description":"Patrol 3: Quick cycle, all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:04:32.188343-08:00","updated_at":"2025-12-27T21:26:03.033879-08:00","deleted_at":"2025-12-27T21:26:03.033879-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-kg3ch","title":"Merge: rictus-mjtlq9xg","description":"branch: polecat/rictus-mjtlq9xg\ntarget: main\nsource_issue: rictus-mjtlq9xg\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T22:13:07.399371-08:00","updated_at":"2025-12-30T23:12:54.459144-08:00","closed_at":"2025-12-30T23:12:54.459144-08:00","close_reason":"Branch already merged","created_by":"gastown/polecats/rictus"}
{"id":"gt-kg4ne","title":"Digest: mol-deacon-patrol","description":"Patrol 17: All healthy, 2 mayor handoffs observed","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T18:39:00.24628-08:00","updated_at":"2025-12-27T21:26:02.805903-08:00","deleted_at":"2025-12-27T21:26:02.805903-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-kgk5.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. \"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-kgk5\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T12:46:15.807528-08:00","updated_at":"2025-12-27T21:29:55.570225-08:00","deleted_at":"2025-12-27T21:29:55.570225-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-kh6q","title":"Remove Go-based witness patrol - Claude session is the executor","description":"## Problem\n\nThe witness has 1370 lines of Go code in internal/witness/manager.go that reimplements what mol-witness-patrol does:\n- healthCheck() - polls polecats\n- autoSpawnForReadyWork() - spawns polecats for ready issues \n- handleStuckPolecat() - nudge logic\n- processShutdownRequests() - cleanup logic\n- verifyPolecatState() - git state checks\n\nThis is backwards. The molecule should drive the work, Claude should execute it.\n\n## The Wrong Architecture (gt-59zd, now cancelled)\n\nThe cancelled gt-59zd tried to use molecules as 'tracking ledgers' for Go code:\n- Go code runs patrol logic\n- Molecule just records what Go did\n- Duplicates work, adds complexity\n\n## The Right Architecture\n\n1. Witness runs as Claude session (via gt sling witness or tmux)\n2. Claude executes mol-witness-patrol steps directly\n3. No Go patrol loop needed\n4. Molecule IS the executor, not bookkeeping\n\n## Scope\n\n1. Remove patrol-related functions from manager.go:\n - run(), checkAndProcess(), healthCheck()\n - handleStuckPolecat(), getNudgeCount(), recordNudge()\n - processShutdownRequests(), verifyPolecatState()\n - autoSpawnForReadyWork(), etc.\n\n2. Keep session management functions:\n - Start(), Stop(), Status() - for starting/stopping Claude session\n - State persistence for session identity\n\n3. Update gt witness commands to use Claude session model\n\n4. Test that mol-witness-patrol executes correctly in Claude session\n\n## Related\n\n- gt-zde4: Fixed Witness CLAUDE.md to emphasize mol-following\n- gt-59zd: Cancelled - wrong architecture\n- witness.md.tmpl: Already rewritten with ZFC principle","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-23T22:42:13.02343-08:00","updated_at":"2025-12-27T21:29:52.763067-08:00","deleted_at":"2025-12-27T21:29:52.763067-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-ki3qa","title":"Digest: mol-deacon-patrol","description":"Patrol 1: All healthy, no mail, no action needed","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T10:43:42.061004-08:00","updated_at":"2025-12-27T21:26:03.495134-08:00","deleted_at":"2025-12-27T21:26:03.495134-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-kjnt","title":"Polecat Mood Plugin (standard)","description":"Standard witness plugin molecule that assesses polecat emotional state.\n\n## Molecule Definition\n\n```json\n{\n \"id\": \"mol-polecat-mood\",\n \"title\": \"Assess mood for {{polecat_name}}\",\n \"description\": \"Analyze captured output and determine emotional state.\\n\\nVars:\\n- {{polecat_name}} - The polecat to assess\\n- {{captured_output}} - Recent tmux capture\\n\\nOutput: gt polecat mood {{polecat_name}} \u003cemoji\u003e\",\n \"labels\": [\"template\", \"plugin\", \"witness\", \"tier:haiku\"],\n \"issue_type\": \"task\"\n}\n```\n\n## Mood Emojis\n\n```\n😺 working Active tool calls, making progress\n😸 productive Completing tasks, tests passing\n🐱 idle Waiting at prompt, no recent activity\n😼 confident Self-reviewing, about to submit\n😿 struggling Repeated errors, test failures\n🙀 stuck No progress for 10+ min\n😽 done Work complete, requesting shutdown\n😾 blocked Explicitly waiting on dependency\n```\n\n## Bonding\n\nDuring witness patrol plugin-run step:\n\n```bash\nbd mol bond mol-polecat-mood $PATROL_WISP \\\n --ref mood-{{polecat_name}} \\\n --var polecat_name=ace \\\n --var captured_output=\"$TMUX_CAPTURE\"\n```\n\n## Installation\n\n```bash\nbd mol install mol-polecat-mood # From Mol Mall\n```\n\n## Customization\n\nFork the molecule and modify description to change assessment criteria or add custom moods. Install your fork to override.","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-21T16:17:12.841134-08:00","updated_at":"2025-12-27T21:29:56.562233-08:00","dependencies":[{"issue_id":"gt-kjnt","depends_on_id":"gt-u818","type":"blocks","created_at":"2025-12-21T16:17:20.444775-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.562233-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-kkcql","title":"Digest: mol-deacon-patrol","description":"Patrol 8: All healthy, no messages","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:53:55.840419-08:00","updated_at":"2025-12-27T21:26:02.879581-08:00","deleted_at":"2025-12-27T21:26:02.879581-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-kl5e","title":"Digest: mol-deacon-patrol","description":"Patrol 4: Routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:36:01.786393-08:00","updated_at":"2025-12-27T21:26:04.642845-08:00","deleted_at":"2025-12-27T21:26:04.642845-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-kmc7","title":"Digest: mol-deacon-patrol","description":"Patrol 2: Quick scan, no changes","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:05:53.201478-08:00","updated_at":"2025-12-27T21:26:04.499145-08:00","deleted_at":"2025-12-27T21:26:04.499145-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-kmn","title":"Batch Work System: Integration branch, merge queue, landing","description":"## Overview\n\nThis epic tracks batch work coordination in GGT. Key insight: **work is a stream, not discrete swarms**.\n\nThere are no \"swarm IDs\" - the epic IS the grouping, the merge queue IS the coordination. \"Swarming an epic\" is colloquial for spawning multiple workers on its children.\n\n## Architecture\n\n- **Epics** group related work\n- **Issues** are individual tasks \n- **Dependencies** encode ordering (multi-wave emerges automatically)\n- **Merge queue** coordinates completed work\n- **Workers** process issues independently\n\nSee Key Decision #11 in docs/architecture.md: \"Work is a Stream (No Swarm IDs)\"\n\n## What This Epic Covers\n\n1. Integration branch management for batch merges\n2. Refinery semantic merge processing\n3. Witness landing protocol (cleanup)\n4. Report generation (epic-based, not swarm-ID-based)\n5. Git safety auditing\n6. Work reconciliation\n\n## What Was Removed\n\n- Swarm IDs (sw-1, sw-2, etc.)\n- Per-swarm directories (.gastown/swarms/\u003cid\u003e/)\n- Swarm manifest/state/events files\n- swarm-started/swarm-landed beads\n\nThese were redundant - beads already provides hierarchy, status, dependencies, and history.\n\n## References\n\n- Architecture: docs/architecture.md (Key Decision #11, Multi-Wave Work Processing)\n- PGT: swarm/manager.py, swarm/landing.py (behavioral reference, but ignore swarm ID patterns)","status":"tombstone","priority":0,"issue_type":"epic","created_at":"2025-12-16T00:08:15.339127-08:00","updated_at":"2025-12-27T21:29:45.698939-08:00","deleted_at":"2025-12-27T21:29:45.698939-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-kmn.1","title":"Swarm package: Swarm, SwarmTask, SwarmState types","description":"Define core swarm types.\n\n## Types (in `internal/swarm/`)\n\n```go\n// SwarmState represents swarm lifecycle\ntype SwarmState string\nconst (\n SwarmCreated SwarmState = \"created\"\n SwarmActive SwarmState = \"active\"\n SwarmMerging SwarmState = \"merging\"\n SwarmLanded SwarmState = \"landed\"\n SwarmFailed SwarmState = \"failed\"\n SwarmCancelled SwarmState = \"cancelled\"\n)\n\n// Swarm references a beads epic that tracks swarm work\ntype Swarm struct {\n ID string // matches beads epic ID\n RigName string\n EpicID string // beads epic tracking this swarm\n BaseCommit string // git SHA all workers branch from\n Integration string // integration branch name\n State SwarmState\n CreatedAt time.Time\n Workers []string // polecat names assigned\n}\n\n// SwarmTask represents a single task in the swarm (maps to beads issue)\ntype SwarmTask struct {\n IssueID string // beads issue ID\n Assignee string // polecat name\n Branch string // worker branch name\n State string // mirrors beads status\n}\n```\n\n## Note\n\nSwarm state is primarily stored IN beads. These types are in-memory representations for the SwarmManager to work with. No separate manifest.json files.","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-16T00:08:30.364047-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-kmn.1","depends_on_id":"gt-kmn","type":"parent-child","created_at":"2025-12-16T00:08:30.364431-08:00","created_by":"daemon"},{"issue_id":"gt-kmn.1","depends_on_id":"gt-u1j.1","type":"blocks","created_at":"2025-12-16T00:11:20.646487-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-kmn.10","title":"Batch work landing report generation","description":"Generate reports on batch work completion.\n\n## Trigger\n\nWhen all children of an epic are closed, generate a completion report.\n\n## Report Contents\n\n- Epic metadata (ID, title, created by, duration)\n- Task summary (completed counts, timing)\n- Per-task details (issue, assignee, time taken)\n- Git stats (commits merged, lines changed)\n- Cleanup stats (branches deleted)\n\n## Output Locations\n\n1. Mail to Mayor - summary report\n2. Optional: save to file with --save flag\n\n## Command\n\n```bash\ngt report --epic \u003cepic-id\u003e [--save report.md]\n```\n\n## Format\n\nMarkdown with sections:\n- Summary\n- Tasks\n- Workers\n- Timeline\n\n## Note\n\nThis replaces the swarm-based reporting (gt-662). Reports are generated from epic/issue data, not from separate swarm state files.\n\n## Reference\n\nPGT: swarm/report.py (for format ideas, ignore swarm ID patterns)","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T00:10:23.242931-08:00","updated_at":"2025-12-27T21:29:54.478461-08:00","dependencies":[{"issue_id":"gt-kmn.10","depends_on_id":"gt-kmn","type":"parent-child","created_at":"2025-12-16T00:10:23.243287-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.478461-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-kmn.11","title":"Daemon heartbeat: worker and queue monitoring","description":"Daemon periodic checks for work progress.\n\n## Heartbeat Actions\n\nEvery N seconds (configurable):\n\n1. Check Witness health per rig\n - Poke if witness needs to nudge workers\n \n2. Check Refinery queue per rig\n - Poke if pending work in queue\n \n3. Check work completion\n - If epic children all closed, notify Mayor\n\n## Eventually Convergent\n\nMultiple signals reinforce state:\n- Polecat signals done → Witness notices → pokes Refinery\n- Daemon heartbeat → checks queue → pokes Refinery\n- Beads status → queryable by any agent\n\nEven if one signal missed, system converges.\n\n## Interface\n\n```go\nfunc (d *Daemon) HeartbeatLoop() {\n for {\n for _, rig := range d.rigs {\n d.CheckWitness(rig)\n d.CheckRefinery(rig)\n d.CheckWorkProgress(rig)\n }\n time.Sleep(d.config.HeartbeatInterval)\n }\n}\n```\n\n## Notifications\n\nUse mail with low priority for heartbeat pokes.\nAgents can ignore if already processing.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T00:10:25.169417-08:00","updated_at":"2025-12-27T21:29:54.46958-08:00","dependencies":[{"issue_id":"gt-kmn.11","depends_on_id":"gt-kmn","type":"parent-child","created_at":"2025-12-16T00:10:25.169767-08:00","created_by":"daemon"},{"issue_id":"gt-kmn.11","depends_on_id":"gt-99m","type":"blocks","created_at":"2025-12-18T11:50:17.699125-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.46958-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-kmn.2","title":"SwarmManager: create, start, update, query operations","description":"SwarmManager operations for creating and managing swarms.\n\n## Interface\n\n```go\ntype SwarmManager struct {\n rig *Rig\n beads *BeadsWrapper // shells out to bd CLI\n}\n\nfunc NewSwarmManager(rig *Rig) *SwarmManager\n\n// Lifecycle\nfunc (m *SwarmManager) Create(epicID string, workers []string) (*Swarm, error)\nfunc (m *SwarmManager) Start(swarmID string) error\nfunc (m *SwarmManager) UpdateState(swarmID string, state SwarmState) error\nfunc (m *SwarmManager) Cancel(swarmID string, reason string) error\n\n// Queries (delegate to beads)\nfunc (m *SwarmManager) GetSwarm(id string) (*Swarm, error)\nfunc (m *SwarmManager) GetReadyTasks(swarmID string) ([]SwarmTask, error)\nfunc (m *SwarmManager) GetActiveTasks(swarmID string) ([]SwarmTask, error)\nfunc (m *SwarmManager) IsComplete(swarmID string) (bool, error)\n```\n\n## Implementation Notes\n\n- `GetReadyTasks` wraps `bd ready --parent \u003cepicID\u003e`\n- `IsComplete` checks if all child issues are closed\n- State transitions update the epic's description or a tag field\n- No separate manifest files - beads IS the source of truth","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-16T00:09:04.814385-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-kmn.2","depends_on_id":"gt-kmn","type":"parent-child","created_at":"2025-12-16T00:09:04.814876-08:00","created_by":"daemon"},{"issue_id":"gt-kmn.2","depends_on_id":"gt-kmn.1","type":"blocks","created_at":"2025-12-16T00:11:20.744039-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-kmn.3","title":"Integration branch management for swarms","description":"Manage integration branches for swarm merging strategy.\n\n## Branch Naming\n\n- Integration branch: `integration/\u003cswarm-id\u003e` (e.g., integration/sw-1)\n- Worker branches: `\u003cswarm-id\u003e/\u003cpolecat\u003e` (e.g., sw-1/Toast)\n\n## Operations\n\n```go\n// Create integration branch at swarm start\nfunc (m *SwarmManager) CreateIntegrationBranch(swarmID string) error\n// - Creates from BaseCommit\n// - Pushes to origin\n\n// Merge worker branch to integration\nfunc (r *Refinery) MergeToIntegration(swarmID, workerBranch string) error\n// - Fetches worker branch\n// - Merges to integration/\u003cswarm-id\u003e\n// - Handles conflicts (semantic merge)\n\n// Land integration to main\nfunc (r *Refinery) LandToMain(swarmID string) error\n// - Merges integration/\u003cswarm-id\u003e to main\n// - Pushes to origin\n// - Triggers cleanup\n\n// Cleanup branches after landing\nfunc (m *SwarmManager) CleanupBranches(swarmID string) error\n// - Deletes integration/\u003cswarm-id\u003e (local + remote)\n// - Deletes all \u003cswarm-id\u003e/\u003cpolecat\u003e branches\n```\n\n## All Workers Same Base\n\nWhen swarm starts:\n1. Record current main HEAD as BaseCommit\n2. All polecats branch from BaseCommit\n3. Integration branch also starts at BaseCommit\n\nThis ensures clean merges within the swarm.\n\n## Reference\n\nPGT: ephemeral.py (create_integration_branch, merge_integration_to_main)","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-16T00:09:06.825718-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-kmn.3","depends_on_id":"gt-kmn","type":"parent-child","created_at":"2025-12-16T00:09:06.82608-08:00","created_by":"daemon"},{"issue_id":"gt-kmn.3","depends_on_id":"gt-kmn.1","type":"blocks","created_at":"2025-12-16T00:11:20.848983-08:00","created_by":"daemon"},{"issue_id":"gt-kmn.3","depends_on_id":"gt-u1j.3","type":"blocks","created_at":"2025-12-16T00:11:20.943465-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-kmn.4","title":"Refinery semantic merge processing loop","description":"Implement the Refinery semantic merge loop for processing polecat work.\n\n## Overview\n\nThe Refinery is a fully empowered Claude Code agent that:\n1. Fetches polecat branches\n2. Merges to integration branch\n3. Resolves conflicts semantically (re-implements if needed)\n4. Runs tests\n5. Escalates only when truly stuck\n\n## Processing Loop\n\n1. Check queue for completed tasks (polecat done, not yet merged)\n2. For each task:\n - Fetch polecat branch\n - Attempt merge to integration/\u003cswarm-id\u003e\n - On conflict: semantic resolution (understand intent, re-implement)\n - Run tests if configured\n - On success: mark merged, file bead, notify witness\n - On failure: notify polecat (test fail) or escalate (stuck)\n3. When all tasks merged: land integration to main\n\n## Semantic Merge\n\nWhen git merge fails, Refinery:\n1. Reads polecat changes and beads issue\n2. Understands intent\n3. Re-implements on current integration branch\n4. Only escalates if truly blocked\n\n## Daemon Poking\n\nRefinery is poked by:\n- Daemon heartbeat (periodic)\n- Witness notification (polecat done)\n- Mayor request (check queue)\n\n## Dependencies\n\nNeeds: gt-u1j.6 (mail), gt-u1j.3 (git), gt-u1j.14 (merge queue)\n\n## Reference\n\nSee PGT ephemeral.py for branch operations.","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-16T00:09:19.536429-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-kmn.4","depends_on_id":"gt-kmn","type":"parent-child","created_at":"2025-12-16T00:09:19.536808-08:00","created_by":"daemon"},{"issue_id":"gt-kmn.4","depends_on_id":"gt-kmn.3","type":"blocks","created_at":"2025-12-16T00:11:22.86454-08:00","created_by":"daemon"},{"issue_id":"gt-kmn.4","depends_on_id":"gt-u1j.6","type":"blocks","created_at":"2025-12-16T00:11:22.952967-08:00","created_by":"daemon"},{"issue_id":"gt-kmn.4","depends_on_id":"gt-u1j.3","type":"blocks","created_at":"2025-12-16T00:11:23.046741-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-kmn.5","title":"Swarm beads naming convention and auto-filing","description":"Beads naming convention for swarm lifecycle events.\n\n## Bead IDs\n\n- swarm-started:\u003cswarm-id\u003e # Filed when swarm begins\n- swarm-task-merged:\u003cswarm-id\u003e:\u003ctask-id\u003e # Filed per task merge\n- swarm-landed:\u003cswarm-id\u003e # Filed when merged to main\n- swarm-failed:\u003cswarm-id\u003e # Filed on unrecoverable failure\n\n## swarm-started Bead\n\nFiled by: Mayor/Refinery at swarm creation\nContains:\n- Swarm title and description\n- List of tasks (issue IDs)\n- List of workers (polecat names)\n- Base commit SHA\n\n## swarm-task-merged Bead\n\nFiled by: Refinery after each successful merge\nContains:\n- Task issue ID and title\n- Polecat name\n- Merge commit SHA\n- Test results summary\n\n## swarm-landed Bead\n\nFiled by: Refinery after landing to main\nContains:\n- Final merge commit SHA\n- Stats: tasks completed, time taken\n- List of all merged task beads\n- Integration branch name (now deleted)\n\n## swarm-failed Bead\n\nFiled by: Refinery on unrecoverable failure\nContains:\n- Failure reason\n- Tasks merged vs not merged\n- Recovery guidance\n\n## Why Beads (not just mail)\n\n- Persistent, queryable, synced to git\n- Other agents can check status without mail\n- Creates audit trail\n- Works when agents 
offline","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-16T00:09:46.930708-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-kmn.5","depends_on_id":"gt-kmn","type":"parent-child","created_at":"2025-12-16T00:09:46.931086-08:00","created_by":"daemon"},{"issue_id":"gt-kmn.5","depends_on_id":"gt-kmn.1","type":"blocks","created_at":"2025-12-16T00:11:24.746439-08:00","created_by":"daemon"},{"issue_id":"gt-kmn.5","depends_on_id":"gt-u1j.13","type":"blocks","created_at":"2025-12-16T00:11:24.81726-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-kmn.6","title":"Witness swarm landing protocol","description":"Witness responsibilities for swarm landing cleanup.\n\n## Trigger\n\nWitness receives \"swarm-landed\" mail from Refinery containing:\n- swarm_id\n- merge_commit\n- action: \"cleanup\"\n\n## Landing Protocol\n\n### Phase 1: Preflight\nVerify all polecat sessions stopped:\n- Check tmux for active sessions\n- Kill any remaining sessions\n- Wait for graceful shutdown\n\n### Phase 2: Git Audit\nFor each polecat in swarm:\n- Check for uncommitted changes\n- Check for unpushed commits\n- Check for stashes\n- Classify as \"beads-only\" (safe) or \"code at risk\"\n- If code at risk: escalate, abort landing\n\n### Phase 3: Cleanup\nFor each polecat:\n- Clear state entry (mark as idle)\n- Delete inbox (or archive)\n- Delete worker branch (local + remote)\n\n### Phase 4: Final Report\n- Update swarm state to \"landed\"\n- File landing report bead\n- Mail Mayor with summary\n\n## Code at Risk Handling\n\nIf git audit finds code at risk:\n1. Abort cleanup for that polecat\n2. Mail Mayor with details\n3. Human intervention required\n\n## Reference\n\nPGT: swarm/landing.py (6-phase protocol), swarm/git_audit.py","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-16T00:09:49.002317-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-kmn.6","depends_on_id":"gt-kmn","type":"parent-child","created_at":"2025-12-16T00:09:49.002696-08:00","created_by":"daemon"},{"issue_id":"gt-kmn.6","depends_on_id":"gt-kmn.8","type":"blocks","created_at":"2025-12-16T00:11:26.758388-08:00","created_by":"daemon"},{"issue_id":"gt-kmn.6","depends_on_id":"gt-u1j.6","type":"blocks","created_at":"2025-12-16T00:11:26.839345-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-kmn.7","title":"CLI: gt swarm commands (create, status, list, land)","description":"CLI commands for swarm management.\n\n## Commands\n\n### gt swarm create\nCreate a new swarm in a rig.\n\n```\ngt swarm create \u003crig\u003e --title \"Fix all P1 bugs\" \\\n --task gt-abc --task gt-def \\\n --worker Toast --worker Nux\n```\n\nOptions:\n- --title: Swarm title (required)\n- --task: Beads issue IDs to include (repeatable)\n- --worker: Polecat names (repeatable, or --workers=3 to auto-assign)\n- --start: Start swarm immediately after creation\n\n### gt swarm status \u003cswarm-id\u003e\nShow swarm status and progress.\n\nOutput:\n- Swarm metadata (title, created, workers)\n- Task status (pending/in_progress/completed/merged)\n- Integration branch status\n- Overall progress percentage\n\n### gt swarm list [rig]\nList swarms, optionally filtered by rig or status.\n\n```\ngt swarm list --status=active\ngt swarm list gastown --status=landed\n```\n\n### gt swarm land \u003cswarm-id\u003e\nTrigger manual landing (normally automatic).\n\n### gt swarm cancel \u003cswarm-id\u003e\nCancel a swarm in progress.\n\n## Output Format\n\nSupport --json for machine-readable output.\n\n## Dependencies\n\nNeeds: gt-kmn.1 (types), gt-kmn.2 (manager)","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-16T00:09:50.57674-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-kmn.7","depends_on_id":"gt-kmn","type":"parent-child","created_at":"2025-12-16T00:09:50.577096-08:00","created_by":"daemon"},{"issue_id":"gt-kmn.7","depends_on_id":"gt-kmn.2","type":"blocks","created_at":"2025-12-16T00:11:29.0905-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-kmn.8","title":"Git safety audit module","description":"Port git_audit.py safety checks to Go.\n\n## Purpose\n\nBefore destroying polecat clones, verify no code is at risk.\n\n## Checks\n\n1. Uncommitted changes (staged and unstaged)\n2. Unpushed commits\n3. Git stashes\n4. Local-only branches (not tracking remote)\n\n## Beads-Only Classification\n\nSome files are safe to discard:\n- .beads/*\n- .claude/settings.json\n\nChanges only affecting these = \"beads-only\" = safe to destroy.\n\n## Interface\n\n```go\ntype GitAuditResult struct {\n Polecat string\n Path string\n UncommittedFiles []string\n UncommittedIsBeadsOnly bool\n UnpushedCommits int\n Stashes []StashInfo\n StashesAreBeadsOnly bool\n LocalBranches []string\n Error error\n}\n\nfunc (r *GitAuditResult) HasCodeAtRisk() bool\n\nfunc AuditPolecat(name string, path string) *GitAuditResult\nfunc AuditPolecats(polecats []PolecatInfo) []*GitAuditResult\n```\n\n## Reference\n\nPGT: swarm/git_audit.py","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T00:10:19.52029-08:00","updated_at":"2025-12-27T21:29:54.495745-08:00","dependencies":[{"issue_id":"gt-kmn.8","depends_on_id":"gt-kmn","type":"parent-child","created_at":"2025-12-16T00:10:19.520646-08:00","created_by":"daemon"},{"issue_id":"gt-kmn.8","depends_on_id":"gt-u1j.3","type":"blocks","created_at":"2025-12-16T00:11:29.174184-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.495745-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-kmn.9","title":"Work reconciliation module","description":"Port reconcile.py work verification to Go.\n\n## Purpose\n\nVerify assigned issues are completed before landing.\n\n## Interface\n\n```go\ntype ReconcileResult struct {\n CompletedIssues []IssueInfo\n IncompleteIssues []IssueInfo\n AllComplete bool\n PolecatsChecked []string\n Errors []string\n}\n\nfunc ReconcileWork(rig string, polecats []string) *ReconcileResult\nfunc GetAssignedIssues(assignee string) []IssueInfo\nfunc UnassignIssue(issueID string, resetStatus bool) error\n```\n\n## Logic\n\n1. Get list of polecats in swarm\n2. For each polecat, query beads for assigned issues\n3. Categorize as completed (closed) or incomplete (open/in_progress)\n4. Return summary for landing decision\n\n## Handling Incomplete Work\n\nIf issues incomplete:\n- Abort landing (default)\n- Or force landing (leaves work unfinished)\n- Unassign incomplete issues for future work\n\n## Reference\n\nPGT: swarm/reconcile.py","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T00:10:21.351476-08:00","updated_at":"2025-12-27T21:29:54.487229-08:00","dependencies":[{"issue_id":"gt-kmn.9","depends_on_id":"gt-kmn","type":"parent-child","created_at":"2025-12-16T00:10:21.351869-08:00","created_by":"daemon"},{"issue_id":"gt-kmn.9","depends_on_id":"gt-u1j.13","type":"blocks","created_at":"2025-12-16T00:11:29.259187-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.487229-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-kmv99","title":"Digest: mol-deacon-patrol","description":"Patrol 16: Mayor OK, 11 sessions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:37:32.607007-08:00","updated_at":"2025-12-27T21:26:02.081585-08:00","deleted_at":"2025-12-27T21:26:02.081585-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-kndf","title":"Digest: mol-deacon-patrol","description":"Patrol 19: OK","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T21:02:19.530366-08:00","updated_at":"2025-12-27T21:26:04.869611-08:00","deleted_at":"2025-12-27T21:26:04.869611-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ko73c","title":"Digest: mol-deacon-patrol","description":"P9","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:24:32.507404-08:00","updated_at":"2025-12-27T21:26:01.671325-08:00","deleted_at":"2025-12-27T21:26:01.671325-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-koja9","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 17: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-28T11:22:49.243562-08:00","updated_at":"2025-12-28T11:22:49.243562-08:00","closed_at":"2025-12-28T11:22:49.243529-08:00"}
{"id":"gt-kp2","title":"Add merge-request type to Beads schema","description":"Define the merge-request bead type with fields:\n- branch: source branch (e.g., polecat/Nux/gt-xxx)\n- target_branch: destination (main or integration/xxx)\n- source_issue: the work being merged (gt-xxx)\n- worker: who did the work\n- rig: which rig\n- merge_commit: (filled on close) SHA of merge commit\n- close_reason: (filled on close) success/failure details\n\nThis may require beads schema changes or just convention.","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-16T23:02:14.471248-08:00","updated_at":"2025-12-27T21:29:45.673918-08:00","dependencies":[{"issue_id":"gt-kp2","depends_on_id":"gt-h5n","type":"blocks","created_at":"2025-12-16T23:02:55.338148-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.673918-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-kp3s3","title":"mol-polecat-work","description":"Full polecat lifecycle from assignment to decommission.\n\nThis proto enables nondeterministic idempotence for polecat work.\nA polecat that crashes after any step can restart, read its molecule state,\nand continue from the last completed step. No work is lost.\n\nVariables:\n- gt-ds3h3 - The source issue ID being worked on","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-25T01:59:05.781319-08:00","updated_at":"2025-12-27T21:29:55.325758-08:00","deleted_at":"2025-12-27T21:29:55.325758-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-kqwpb","title":"Digest: mol-deacon-patrol","description":"Patrol 3: All healthy, no issues","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:53:25.592462-08:00","updated_at":"2025-12-27T21:26:01.778644-08:00","deleted_at":"2025-12-27T21:26:01.778644-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-kr3v6","title":"Digest: mol-deacon-patrol","description":"Patrol 6: Quick cycle, all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:05:52.581321-08:00","updated_at":"2025-12-27T21:26:03.009125-08:00","deleted_at":"2025-12-27T21:26:03.009125-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-krut","title":"Polecat .beads/ contamination from stale branches","description":"**Fixed in commit 8699b7b**\n\n## Problem\n\nWhen polecats are spawned from branches that previously had bd sync run on them, the .beads/ directory from the branch contains stale issues.jsonl, config.yaml, etc.\n\nThe setupSharedBeads() function was creating a redirect file, but NOT cleaning up the existing .beads/ contents first.\n\n## Root Cause\n\n1. Old polecat branch has .beads/ tracked in git (from previous bd sync)\n2. gt spawn uses Add() which reuses existing branch\n3. WorktreeAddExisting() checks out branch, including .beads/ files\n4. setupSharedBeads() creates redirect, but other files remain\n5. bd commands see stale data\n\n## Fix\n\nsetupSharedBeads() now:\n1. Removes entire .beads/ directory if it exists\n2. Recreates fresh with only redirect file\n3. Redirect points directly to mayor/rig/.beads (not through rig root)","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-22T12:22:42.781701-08:00","updated_at":"2025-12-27T21:29:53.24314-08:00","deleted_at":"2025-12-27T21:29:53.24314-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-kspu","title":"Work on gt-e11: gt mail send priority flag is incompatibl...","description":"Work on gt-e11: gt mail send priority flag is incompatible with bd mail send. Run 'bd show gt-e11' for details.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T07:53:37.267279-08:00","updated_at":"2025-12-27T21:29:56.780749-08:00","deleted_at":"2025-12-27T21:29:56.780749-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ksrt.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. 
\"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-ksrt\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T16:16:46.436616-08:00","updated_at":"2025-12-27T21:29:55.481382-08:00","deleted_at":"2025-12-27T21:29:55.481382-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ktal","title":"Epic: Refinery Engineer Autonomy","description":"Make Refinery Engineer fully autonomous:\n\n## Goal\nRefinery can run indefinitely, processing all polecat work through merge queue, cycling sessions as needed, without human intervention.\n\n## Key Components\n1. Role prompting (CLAUDE.md, templates)\n2. Handoff mechanism (pinned beads)\n3. Context detection (gt prime)\n4. Communication protocol (Witness, Deacon)\n5. Future: Merge orchestration plugins\n\n## Success Criteria\n- Refinery starts, reads handoff, knows what to do\n- Processes merges sequentially with conflict resolution\n- Cycles sessions cleanly via Deacon\n- Communicates results to Witness\n- No work left behind","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-19T18:09:30.84543-08:00","updated_at":"2025-12-27T21:29:53.890138-08:00","deleted_at":"2025-12-27T21:29:53.890138-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-ktf3","title":"bd ready --type: missing type filter for MQ integration","description":"bd ready lacks --type flag that engineer.go expects.\n\n## Code (internal/refinery/engineer.go)\nreadyMRs, err := e.beads.ReadyWithType(\"merge-request\")\n\n## Actual bd ready\nNo --type flag - only supports assignee, label, priority filters.\n\n## Impact\nRefinery can't find merge-requests in queue, so MQ doesn't process anything.\n\n## Fix Options\n1. Add --type flag to bd ready\n2. Use bd list --type=merge-request --status=open instead\n3. Both (ready filters for unblocked, list for all)\n\nThis is blocking the entire MQ pipeline.","status":"tombstone","priority":0,"issue_type":"bug","created_at":"2025-12-19T14:57:16.297847-08:00","updated_at":"2025-12-27T21:29:45.545131-08:00","deleted_at":"2025-12-27T21:29:45.545131-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-kut","title":"Test message","description":"Testing GGT mail system","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-17T16:12:11.437529-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"message"}
{"id":"gt-kuyo7","title":"Digest: mol-deacon-patrol","description":"Patrol 2: All healthy, no messages, quiet cycle","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:03:59.845797-08:00","updated_at":"2025-12-27T21:26:03.042237-08:00","deleted_at":"2025-12-27T21:26:03.042237-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-kxw7x","title":"Digest: mol-deacon-patrol","description":"Patrol 16: all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:00:34.661476-08:00","updated_at":"2025-12-27T21:26:00.482697-08:00","deleted_at":"2025-12-27T21:26:00.482697-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ky1nt","title":"Review PR #52: fix: Close MR beads after successful merge from queue","description":"Review PR #52. Verify MR beads are properly closed after merge. Approve with gh pr review --approve if good.","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-03T11:40:27.032551-08:00","updated_at":"2026-01-03T11:44:19.915671-08:00","closed_at":"2026-01-03T11:44:19.915671-08:00","close_reason":"PR reviewed and approved","created_by":"mayor"}
{"id":"gt-l0z34","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 9: routine, healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:25:37.061769-08:00","updated_at":"2025-12-27T21:26:01.898256-08:00","deleted_at":"2025-12-27T21:26:01.898256-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-l1o","title":"Harness \u0026 Priming: Document architecture and update all role contexts","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-17T16:42:29.314113-08:00","updated_at":"2025-12-27T21:29:54.270631-08:00","deleted_at":"2025-12-27T21:29:54.270631-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-l32kx","title":"implement","description":"Implement the solution for gt-u2vg. Follow codebase conventions.\nFile discovered work as new issues with bd create.\n\nMake regular commits with clear messages.\nKeep changes focused on the assigned issue.\n\nDepends: load-context","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T01:53:37.31087-08:00","updated_at":"2025-12-25T14:12:42.103957-08:00","dependencies":[{"issue_id":"gt-l32kx","depends_on_id":"gt-6n1cy","type":"parent-child","created_at":"2025-12-25T01:53:37.328352-08:00","created_by":"stevey"},{"issue_id":"gt-l32kx","depends_on_id":"gt-hkf8j","type":"blocks","created_at":"2025-12-25T01:53:37.344719-08:00","created_by":"stevey"}],"deleted_at":"2025-12-25T14:12:42.103957-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-l3c","title":"Design: Polecat Beads write access","description":"Design for granting polecats direct beads write access.\n\n## Background\n\nWith Beads v0.30.0 tombstone-based rearchitecture, we have solid multi-agent support. Reversing the original read-only decision.\n\n## Benefits\n\n- Simplifies architecture (no mail-based issue filing proxy)\n- Empowers polecats to file discovered work\n- Beads handles work-disavowal\n\n## Complications\n\nFor OSS projects where you cannot commit to project .beads/, need per-rig beads repo configuration.\n\n## Subtasks (implementation)\n\n- gt-zx3: Per-rig beads configuration schema\n- gt-e1y: Worker prompting updates for beads access\n- gt-cjb: Witness proxy removal\n- gt-082: Beads sync in decommission checklist\n\n**Design complete.** Each subtask has full specification in its description.","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-15T19:37:42.191734-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"epic"}
{"id":"gt-l3gfn","title":"Digest: mol-deacon-patrol","description":"Patrol 18: Quiet","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-25T07:29:32.029593-08:00","updated_at":"2025-12-25T07:29:32.029593-08:00","closed_at":"2025-12-25T07:29:32.029556-08:00"}
{"id":"gt-l42h","title":"load-context","description":"Run gt prime and bd prime. Verify issue assignment.\nCheck inbox for any relevant messages.\n\nRead the assigned issue (gt-1wmw) and understand the requirements.\nIdentify any blockers or missing information.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:55:01.79541-08:00","updated_at":"2025-12-25T15:52:58.278388-08:00","deleted_at":"2025-12-25T15:52:58.278388-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-l4gm","title":"Crew workers: design documentation and remaining work","description":"## Summary\n\nCrew workers are user-managed persistent workspaces within a rig, distinct from polecats.\n\n## Current Implementation\n\n### Commands (all working)\n- `gt crew add \u003cname\u003e` - Create workspace (git clone)\n- `gt crew list` - List workspaces with status\n- `gt crew at \u003cname\u003e` - Attach to tmux session\n- `gt crew remove \u003cname\u003e` - Remove workspace\n- `gt crew pristine [\u003cname\u003e]` - git pull + bd sync\n- `gt crew refresh \u003cname\u003e` - Context cycle with handoff mail\n- `gt crew status [\u003cname\u003e]` - Detailed status\n- `gt crew rename \u003cname\u003e` - Rename workspace\n\n### Crew vs Polecat\n\n| Aspect | Crew | Polecat |\n|--------|------|---------|\n| Lifecycle | User-managed | Witness-managed |\n| Scheduling | Manual | `gt spawn` |\n| Merge flow | Direct push OK | Integration branch → refinery |\n| Garbage collection | Never | Auto on completion |\n| Identity | Long-lived (emma, dave) | Ephemeral (Nux, Toast) |\n\n## Known Bugs\n\n- **gt-70b3**: detectSender() doesn't recognize crew workers\n- **gt-vdp0**: Crew CLAUDE.md shows Refinery template\n\n## Potential Enhancements\n\n1. Rebase helper in `gt crew pristine` for conflict resolution\n2. Cross-rig crew support (crew worker in multiple rigs?)\n3. Better mail identity auto-detection for crew","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-18T21:49:58.524424-08:00","updated_at":"2025-12-27T21:29:57.076832-08:00","deleted_at":"2025-12-27T21:29:57.076832-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-l5l0f","title":"Digest: mol-deacon-patrol","description":"Patrol 4: Quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:14:41.221012-08:00","updated_at":"2025-12-27T21:26:02.756958-08:00","deleted_at":"2025-12-27T21:26:02.756958-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-l6dzm","title":"Digest: mol-deacon-patrol","description":"Patrol 13: Mayor OK, 11 sessions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:37:04.697254-08:00","updated_at":"2025-12-27T21:26:02.106315-08:00","deleted_at":"2025-12-27T21:26:02.106315-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-l6r9","title":"Test molecule","description":"Test","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-19T14:29:43.985973-08:00","updated_at":"2025-12-27T21:29:56.984534-08:00","deleted_at":"2025-12-27T21:29:56.984534-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-l7cd","title":"Work on gt-role-template: Refine witness/CLAUDE.md role t...","description":"Work on gt-role-template: Refine witness/CLAUDE.md role template. See issue for details. Run 'bd show gt-role-template' to see the full issue.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T03:47:29.414394-08:00","updated_at":"2025-12-27T21:29:56.822691-08:00","deleted_at":"2025-12-27T21:29:56.822691-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-l7uyu","title":"Digest: mol-deacon-patrol","description":"Patrol 6: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:31:17.646933-08:00","updated_at":"2025-12-27T21:26:00.813023-08:00","deleted_at":"2025-12-27T21:26:00.813023-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-l9owo","title":"Digest: mol-deacon-patrol","description":"Patrol 19: Mayor OK, 11 sessions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:37:41.963501-08:00","updated_at":"2025-12-27T21:26:02.05242-08:00","deleted_at":"2025-12-27T21:26:02.05242-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lajrx","title":"Digest: mol-deacon-patrol","description":"Patrol 7: healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-28T06:09:55.036375-08:00","updated_at":"2025-12-28T06:09:55.036375-08:00","closed_at":"2025-12-28T06:09:55.036344-08:00"}
{"id":"gt-lcuxo","title":"Merge: furiosa-mjw349y2","description":"branch: polecat/furiosa-mjw349y2\ntarget: main\nsource_issue: furiosa-mjw349y2\nrig: gastown\nagent_bead: gt-gastown-polecat-furiosa","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T18:49:34.013211-08:00","updated_at":"2026-01-01T18:51:15.859145-08:00","closed_at":"2026-01-01T18:51:15.859145-08:00","close_reason":"Merged to main at e159489e","created_by":"gastown/polecats/furiosa"}
{"id":"gt-ldk8","title":"Witness should verify polecat submitted before stopping session","description":"## Problem\n\nWitness stopped dementus's session before it could call `gt done`, losing the MR submission. I had to manually push and submit the branch.\n\n## Root Cause\n\nWitness cleanup is triggered by nudge/manual check rather than by receiving POLECAT_DONE message. When Witness cleans up based on issue status (closed), it doesn't wait for the polecat to complete its shutdown sequence.\n\n## Expected Behavior\n\nWitness should only stop a polecat session after:\n1. Receiving POLECAT_DONE message from that polecat, OR\n2. Timeout waiting for POLECAT_DONE after issue is closed\n\n## Current Behavior\n\nWitness stops sessions immediately when asked to check for completions, even if polecat hasn't called `gt done` yet.\n\n## Fix\n\nIn mol-witness-patrol inbox-check step:\n- Only clean up polecats that have sent POLECAT_DONE\n- For polecats with closed issues but no DONE message, nudge them to complete\n- Add timeout before force-killing","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-22T23:54:12.969528-08:00","updated_at":"2025-12-27T21:29:53.074298-08:00","deleted_at":"2025-12-27T21:29:53.074298-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-ldm4","title":"verify-tests","description":"Run existing tests. Add new tests for new functionality.\nEnsure adequate coverage.\n\ngo test ./...\n\nFix any test failures before proceeding.\n\nDepends: implement","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T21:48:26.322056-08:00","updated_at":"2025-12-25T14:12:42.217455-08:00","deleted_at":"2025-12-25T14:12:42.217455-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-leeb","title":"load-context","description":"Run gt prime and bd prime. Verify issue assignment.\nCheck inbox for any relevant messages.\n\nRead the assigned issue (gt-qwyu) and understand the requirements.\nIdentify any blockers or missing information.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T21:58:52.59974-08:00","updated_at":"2025-12-25T15:52:43.900154-08:00","dependencies":[{"issue_id":"gt-leeb","depends_on_id":"gt-q6hl","type":"parent-child","created_at":"2025-12-21T21:58:52.600633-08:00","created_by":"stevey"}],"deleted_at":"2025-12-25T15:52:43.900154-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-lek6","title":"gt rig reset --stale: Clear orphaned in_progress items","description":"Reset in_progress issues when their assigned agent no longer exists.\n\n## Problem\nWhen polecats die without cleanup, their issues remain in_progress forever.\nNeed a way to bulk-reset these orphaned items.\n\n## Command\n```bash\ngt rig reset --stale [--dry-run]\n```\n\n## Logic\nFor each in_progress issue in rig:\n1. Parse assignee (e.g., \"gastown/furiosa\")\n2. Map to tmux session name (gt-gastown-furiosa)\n3. If session does NOT exist:\n - Reset status to \"open\"\n - Clear assignee\n4. Exception: skip crew/* assignees (persistent identities)\n OR check if crew tmux session exists\n\n## Output\n```\nResetting stale work in gastown:\n gt-abc: gastown/furiosa (no session) → open\n gt-def: gastown/nux (no session) → open\n Skipped: gt-xyz: gastown/crew/max (persistent)\nReset 2 issues, skipped 1\n```\n\n## Related\n- gt-2kz: CLI cleanup commands for stale state\n- gt-rdmw: orphan-check in deacon patrol\n- gt-orphans command (list orphaned molecules)","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-21T21:33:46.962413-08:00","updated_at":"2025-12-27T21:29:53.383925-08:00","deleted_at":"2025-12-27T21:29:53.383925-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-lf7c8","title":"Digest: mol-deacon-patrol","description":"Patrol 20: Final patrol, handing off","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:06:01.256064-08:00","updated_at":"2025-12-27T21:26:03.951969-08:00","deleted_at":"2025-12-27T21:26:03.951969-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lftg6","title":"Digest: mol-deacon-patrol","description":"Patrol 17","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T08:15:41.182015-08:00","updated_at":"2026-01-01T08:15:41.182015-08:00","closed_at":"2026-01-01T08:15:41.181975-08:00"}
{"id":"gt-lg66","title":"Mail should use wisps for ephemeral orchestration messages","description":"## Problem\n\nTown HQ beads are polluted with lifecycle orchestration messages:\n- POLECAT_STARTED notifications\n- Work assignments to polecats\n- \"Start work\" nudges\n- Test messages\n\nThese should never have been persistent beads. They accumulate forever.\n\n## Analysis\n\nMail serves two purposes:\n1. **Durable messages** - Handoffs, escalations, human-agent comms (need persistence)\n2. **Ephemeral signals** - Lifecycle pings, work assignments, nudges (should be wisps)\n\n## Options\n\n### Option A: Dual-inbox architecture\n- `gt mail inbox` checks both `.beads/` and `.beads-wisp/`\n- Sender specifies `--ephemeral` flag for transient messages\n- Ephemeral messages auto-expire or squash to digests\n\n### Option B: All mail becomes wisps\n- Default to wisp storage for all mail\n- Important messages explicitly promoted to persistent\n- Simpler model but loses audit trail for some messages\n\n### Option C: Message type determines storage\n- `message` type beads go to wisps by default\n- `handoff` type stays persistent\n- Automatic routing based on content\n\n## Recommendation\n\nOption A (dual-inbox) seems cleanest:\n- Explicit control via `--ephemeral`\n- Backwards compatible\n- Clear mental model\n\n## Acceptance Criteria\n\n- Lifecycle pings (POLECAT_STARTED, etc.) go to wisps\n- Work assignments go to wisps\n- Handoffs stay persistent\n- `gt mail inbox` shows both\n- Wisps auto-cleanup on patrol squash\n\n## Blocks\n\nThis blocks reliable swarm operations - every spawn pollutes HQ indefinitely.","status":"tombstone","priority":0,"issue_type":"feature","created_at":"2025-12-24T19:17:43.874045-08:00","updated_at":"2025-12-27T21:29:45.268297-08:00","deleted_at":"2025-12-27T21:29:45.268297-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-liv9i","title":"Merge: nux-mjxn8p5t","description":"branch: polecat/nux-mjxn8p5t\ntarget: main\nsource_issue: nux-mjxn8p5t\nrig: gastown\nagent_bead: gt-gastown-polecat-nux","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-02T17:59:46.736566-08:00","updated_at":"2026-01-03T12:46:58.645617-08:00","closed_at":"2026-01-03T12:46:58.645617-08:00","close_reason":"Merged to main at 386dbf85 (verified on main)","created_by":"gastown/polecats/nux"}
{"id":"gt-ljow","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:43","description":"Patrol 4: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:43:45.317294-08:00","updated_at":"2025-12-27T21:26:05.154186-08:00","deleted_at":"2025-12-27T21:26:05.154186-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ljr5m","title":"Digest: mol-deacon-patrol","description":"Patrol 7: Interrupted, routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:51:12.740668-08:00","updated_at":"2025-12-27T21:26:01.527469-08:00","deleted_at":"2025-12-27T21:26:01.527469-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lkskw","title":"Digest: mol-deacon-patrol","description":"Patrol 2: all healthy, no action needed","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:28:55.321314-08:00","updated_at":"2025-12-27T21:26:00.83782-08:00","deleted_at":"2025-12-27T21:26:00.83782-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lly5x","title":"Digest: mol-deacon-patrol","description":"Patrol 11: All green","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:33:27.266684-08:00","updated_at":"2025-12-27T21:26:02.552212-08:00","deleted_at":"2025-12-27T21:26:02.552212-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lnji","title":"gt polecat git-state command for pre-kill verification","description":"Add git-state subcommand to gt polecat for Witness pre-kill verification.\n\n## Usage\n```bash\ngt polecat git-state \u003crig\u003e/\u003cpolecat\u003e\n```\n\n## Output\n```\nGit State: gastown/furiosa\n\n Working Tree: clean | dirty\n Uncommitted: 0 files | N files (list)\n Unpushed: 0 commits | N commits ahead\n Stashes: 0 | N stashes\n\n Verdict: CLEAN (safe to kill) | DIRTY (needs cleanup)\n```\n\n## JSON output (--json flag)\n```json\n{\n \"clean\": true,\n \"uncommitted_files\": [],\n \"unpushed_commits\": 0,\n \"stash_count\": 0\n}\n```\n\n## Used by\n- Witness pre-kill verification (mol-witness-patrol)\n- Manual cleanup checks\n\n## Implementation\n- Check git status in polecat worktree\n- Check git log origin/main..HEAD\n- Check git stash list","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T16:43:10.035052-08:00","updated_at":"2025-12-27T21:29:53.175099-08:00","deleted_at":"2025-12-27T21:29:53.175099-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lno","title":"Swarm state persistence: manifest + state + events","description":"Upgrade swarm persistence to match PGT pattern.\n\n## Current State\nSingle .gastown/swarms.json with all swarms.\n\n## Target State (per-swarm)\n```\n.gastown/swarms/\n└── \u003cswarm-id\u003e/\n ├── manifest.json # Immutable config\n ├── state.json # Mutable progress\n ├── events.jsonl # Audit log\n └── report.md # Generated report\n```\n\n## File Schemas\n\n### manifest.json (immutable after creation)\n```json\n{\n \"id\": \"sw-1\",\n \"title\": \"Epic description\",\n \"epic_id\": \"gt-abc\",\n \"rig\": \"gastown\",\n \"base_commit\": \"abc123\",\n \"integration_branch\": \"swarm/sw-1\",\n \"target_branch\": \"main\",\n \"workers\": [\"Toast\", \"Nux\"],\n \"tasks\": [{\"id\": \"gt-xyz\", \"title\": \"...\"}],\n \"created_at\": \"2024-01-01T00:00:00Z\"\n}\n```\n\n### state.json (mutable)\n```json\n{\n \"state\": \"active\",\n \"task_states\": {\n \"gt-xyz\": {\"state\": \"merged\", \"assignee\": \"Toast\"}\n },\n \"updated_at\": \"2024-01-01T01:00:00Z\",\n \"error\": null\n}\n```\n\n### events.jsonl (append-only audit)\n```jsonl\n{\"event\": \"created\", \"ts\": \"...\", \"data\": {...}}\n{\"event\": \"task_assigned\", \"ts\": \"...\", \"data\": {...}}\n{\"event\": \"task_merged\", \"ts\": \"...\", \"data\": {...}}\n```\n\n## Benefits\n- Audit trail via events.jsonl\n- Manifest immutability prevents corruption\n- Cleaner separation of concerns\n- Per-swarm isolation\n\n## Migration\nKeep backward compat with swarms.json during transition.\n\n## Files to Modify\n- internal/swarm/manager.go: Refactor persistence\n- internal/cmd/swarm.go: SwarmStore → directory-based\n\n## Acceptance Criteria\n- [ ] Per-swarm directory structure\n- [ ] Events logged to JSONL\n- [ ] Manifest immutable after creation\n- [ ] List command aggregates from directories","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-16T14:48:14.151538-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-lom0","title":"Digest: mol-deacon-patrol","description":"Patrol 20: OK - Handoff threshold reached","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T21:02:40.085741-08:00","updated_at":"2025-12-27T21:26:04.861368-08:00","deleted_at":"2025-12-27T21:26:04.861368-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lpki","title":"test message","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T17:41:51.652131-08:00","updated_at":"2025-12-27T21:29:56.761096-08:00","deleted_at":"2025-12-27T21:29:56.761096-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"message"}
{"id":"gt-lqgf","title":"burn-or-loop","description":"Squash wisp and decide: loop or cycle session.\n\nIf context low: spawn new wisp and loop\nIf context high: handoff and request cycle\n\nNeeds: generate-summary","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T01:41:54.505125-08:00","updated_at":"2025-12-25T15:52:57.932675-08:00","dependencies":[{"issue_id":"gt-lqgf","depends_on_id":"gt-g261","type":"blocks","created_at":"2025-12-23T01:41:54.521716-08:00","created_by":"stevey"}],"deleted_at":"2025-12-25T15:52:57.932675-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lsfqq","title":"Hook ergonomics: auto-replace completed pins","description":"## Problem\n\nWhen pinning a new bead, if the old pinned bead is complete, the new pin should automatically replace it. Currently this fails and requires manual unpinning (which doesn't exist as a command).\n\nObserved friction:\n1. Old patrol wisp 100% complete, still pinned\n2. Created new patrol wisp\n3. `gt mol attach` failed: 'not pinned'\n4. Tried to close old wisp: 'cannot close pinned issue'\n5. Tried `gt hook unpin`: command doesn't exist\n6. Stuck in limbo\n\n## Proposed fix\n\n- `gt hook \u003cnew-bead\u003e` auto-replaces if old pin is complete\n- Require `--force` if old pin is incomplete\n- `gt mol attach` should auto-pin if bead isn't pinned\n\n## Orphan detection\n\nUnpinned incomplete molecules aren't stale orphans until they create graph pressure (blocking other work). Orphan detection should focus on dependency blocking, not just 'unpinned' status.","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-26T17:29:28.717725-08:00","updated_at":"2025-12-27T21:29:54.783279-08:00","deleted_at":"2025-12-27T21:29:54.783279-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-lsjjb","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 9: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T07:02:18.589316-08:00","updated_at":"2026-01-01T07:02:18.589316-08:00","closed_at":"2026-01-01T07:02:18.58928-08:00"}
{"id":"gt-lta16","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 10: routine, healthy - midpoint","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:26:07.534823-08:00","updated_at":"2025-12-27T21:26:01.889513-08:00","deleted_at":"2025-12-27T21:26:01.889513-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lth8y","title":"MVP Takeoff: Polecats just work","description":"## Goal\nGet to the point where we can sling any issue to a polecat with a formula and it \"just works\" end-to-end.\n\n## Success Criteria\n1. `gt daemon start` brings up the engine\n2. Deacon patrols: inbox, health checks, cleanup\n3. Witnesses patrol: monitor polecats, verify completion, cleanup\n4. Refineries patrol: process merge queue\n5. Polecat lifecycle: spawn → work → complete → cleanup\n6. Shiny formula: design → implement → review → test → submit\n\n## What This Enables\n- \"Engineer in a box\" - take any issue, expand via shiny, get a reviewed/tested PR\n- Reliable multi-agent coordination\n- Self-healing infrastructure (patrols fix problems)\n\n## Blocked By\n- gt-ronyn (deacon murder bug) - FIXED\n- Basic patrol formulas - DONE (mol-*-patrol.formula.toml exist)\n\n## Children\nSee linked tasks below.","status":"tombstone","priority":0,"issue_type":"epic","created_at":"2025-12-27T14:19:03.856634-08:00","updated_at":"2025-12-27T21:29:45.234488-08:00","created_by":"mayor","deleted_at":"2025-12-27T21:29:45.234488-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-lth8y.1","title":"Verify gt daemon start/stop works","description":"Verify the daemon orchestrator:\n\n```bash\ngt daemon start # Should start and stay running\ngt daemon status # Should show running\ngt daemon stop # Should stop cleanly\n```\n\nThe daemon is responsible for:\n- Respawning the deacon when it exits (context exhaustion)\n- Heartbeat monitoring\n- Clean shutdown coordination\n\nTest: Start, verify deacon session appears, stop, verify clean exit.","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-27T14:19:22.630437-08:00","updated_at":"2025-12-27T21:29:45.226172-08:00","created_by":"mayor","dependencies":[{"issue_id":"gt-lth8y.1","depends_on_id":"gt-lth8y","type":"parent-child","created_at":"2025-12-27T14:19:22.630913-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.226172-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lth8y.2","title":"Verify deacon patrol loop executes","description":"Verify the deacon runs its patrol formula:\n\n1. Start deacon: `gt daemon start`\n2. Watch deacon session: `tmux attach -t gt-deacon`\n3. Verify it cycles through:\n - inbox-check\n - trigger-pending-spawns\n - gate-evaluation \n - health-scan\n - orphan-check\n - session-gc\n - context-check\n - loop-or-exit\n\nThe deacon should continuously patrol, not stall.\n\nTest: Watch 2-3 patrol cycles complete.","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-27T14:19:23.783795-08:00","updated_at":"2025-12-27T21:29:45.217448-08:00","created_by":"mayor","dependencies":[{"issue_id":"gt-lth8y.2","depends_on_id":"gt-lth8y","type":"parent-child","created_at":"2025-12-27T14:19:23.785539-08:00","created_by":"daemon"},{"issue_id":"gt-lth8y.2","depends_on_id":"gt-lth8y.1","type":"blocks","created_at":"2025-12-27T14:20:01.767406-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.217448-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lth8y.3","title":"Verify witness patrol loop executes","description":"Verify witnesses run their patrol formula:\n\n1. Check witness session: `gt session status gastown/witness`\n2. If not running: `gt session start gastown/witness`\n3. Watch witness: `tmux attach -t gt-gastown-witness`\n4. Verify it cycles through:\n - inbox-check\n - process-cleanups\n - check-refinery\n - survey-workers\n - context-check\n - loop-or-exit\n\nTest for both rigs: gastown and beads.","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-27T14:19:25.131575-08:00","updated_at":"2025-12-27T21:29:45.20901-08:00","created_by":"mayor","dependencies":[{"issue_id":"gt-lth8y.3","depends_on_id":"gt-lth8y","type":"parent-child","created_at":"2025-12-27T14:19:25.133321-08:00","created_by":"daemon"},{"issue_id":"gt-lth8y.3","depends_on_id":"gt-lth8y.1","type":"blocks","created_at":"2025-12-27T14:20:01.798714-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.20901-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lth8y.4","title":"Verify refinery patrol loop executes","description":"Verify refineries run their patrol formula:\n\n1. Check refinery session: `gt session status gastown/refinery`\n2. If not running: `gt session start gastown/refinery`\n3. Watch refinery: `tmux attach -t gt-gastown-refinery`\n4. Verify it handles:\n - Checking for pending MRs\n - Processing merge queue\n - Reporting results\n\nTest for both rigs: gastown and beads.","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-27T14:19:26.348416-08:00","updated_at":"2025-12-27T21:29:45.200674-08:00","created_by":"mayor","dependencies":[{"issue_id":"gt-lth8y.4","depends_on_id":"gt-lth8y","type":"parent-child","created_at":"2025-12-27T14:19:26.350081-08:00","created_by":"daemon"},{"issue_id":"gt-lth8y.4","depends_on_id":"gt-lth8y.1","type":"blocks","created_at":"2025-12-27T14:20:01.829305-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.200674-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lth8y.5","title":"Test polecat lifecycle end-to-end","description":"Test the full polecat lifecycle:\n\n1. **Spawn**: Sling a test issue to a polecat\n```bash\nbd create --title=\"Test issue for polecat\" --type=task\ngt sling gt-\u003cid\u003e gastown\n```\n\n2. **Execute**: Watch polecat work\n```bash\ntmux attach -t gt-gastown-\u003cname\u003e\n```\n\n3. **Complete**: Polecat should:\n - Close the issue: `bd close gt-\u003cid\u003e`\n - Commit work: `git commit \u0026\u0026 git push`\n - Send POLECAT_DONE mail to witness\n\n4. **Cleanup**: Witness should:\n - Receive POLECAT_DONE\n - Verify git state clean\n - Kill session and remove worktree\n\nSuccess: Issue closed, PR merged, polecat cleaned up.","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-27T14:19:47.678973-08:00","updated_at":"2025-12-27T21:29:45.192186-08:00","created_by":"mayor","dependencies":[{"issue_id":"gt-lth8y.5","depends_on_id":"gt-lth8y","type":"parent-child","created_at":"2025-12-27T14:19:47.679492-08:00","created_by":"daemon"},{"issue_id":"gt-lth8y.5","depends_on_id":"gt-lth8y.2","type":"blocks","created_at":"2025-12-27T14:20:02.792824-08:00","created_by":"daemon"},{"issue_id":"gt-lth8y.5","depends_on_id":"gt-lth8y.3","type":"blocks","created_at":"2025-12-27T14:20:02.824708-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.192186-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lth8y.6","title":"Test shiny formula end-to-end","description":"Test the shiny (engineer-in-box) formula:\n\n1. **Create issue with formula**:\n```bash\ngt sling shiny --var feature=\"Add hello world endpoint\" gastown\n```\n\n2. **Watch polecat execute shiny steps**:\n - design: Architecture thinking\n - implement: Write the code\n - review: Self-review\n - test: Run tests\n - submit: Create PR\n\n3. **Verify output**:\n - Design notes in issue/commit\n - Implementation matches design\n - Tests pass\n - PR created and ready for merge\n\nSuccess: A reviewed, tested PR from a single command.","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-27T14:19:49.453837-08:00","updated_at":"2025-12-27T21:29:45.183783-08:00","created_by":"mayor","dependencies":[{"issue_id":"gt-lth8y.6","depends_on_id":"gt-lth8y","type":"parent-child","created_at":"2025-12-27T14:19:49.455617-08:00","created_by":"daemon"},{"issue_id":"gt-lth8y.6","depends_on_id":"gt-lth8y.5","type":"blocks","created_at":"2025-12-27T14:20:03.991745-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.183783-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lth8y.7","title":"Fix patrol role priming","description":"Patrol roles (deacon, witness, refinery) need proper context priming.\n\nCurrent issue: Sessions start but may not have proper CLAUDE.md context.\n\nFix:\n1. Ensure role templates include patrol formula reference\n2. Verify SessionStart hook runs gt prime\n3. Check that patrol loop starts automatically\n\nRelated: Role templates in internal/templates/roles/*.md.tmpl","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-27T14:19:50.867349-08:00","updated_at":"2025-12-27T21:29:45.781691-08:00","created_by":"mayor","dependencies":[{"issue_id":"gt-lth8y.7","depends_on_id":"gt-lth8y","type":"parent-child","created_at":"2025-12-27T14:19:50.868931-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.781691-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ltomz","title":"Digest: mol-deacon-patrol","description":"Patrol 17: Mayor OK, 11 sessions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:37:35.729682-08:00","updated_at":"2025-12-27T21:26:02.072669-08:00","deleted_at":"2025-12-27T21:26:02.072669-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lve0","title":"Digest: mol-deacon-patrol","description":"Patrol #2: All healthy, no changes","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:29:43.926838-08:00","updated_at":"2025-12-27T21:26:04.383994-08:00","deleted_at":"2025-12-27T21:26:04.383994-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lwuu","title":"mol-polecat-work","description":"Full polecat lifecycle from assignment to decommission.\n\nThis proto enables nondeterministic idempotence for polecat work.\nA polecat that crashes after any step can restart, read its molecule state,\nand continue from the last completed step. No work is lost.\n\nVariables:\n- {{issue}} - The source issue ID being worked on","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-21T21:47:15.553926-08:00","updated_at":"2025-12-25T11:45:02.114796-08:00","deleted_at":"2025-12-25T11:45:02.114796-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"epic"}
{"id":"gt-lwuu.1","title":"load-context","description":"Run gt prime and bd prime. Verify issue assignment.\nCheck inbox for any relevant messages.\n\nRead the assigned issue ({{issue}}) and understand the requirements.\nIdentify any blockers or missing information.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T21:47:23.880531-08:00","updated_at":"2025-12-25T11:45:02.114796-08:00","dependencies":[{"issue_id":"gt-lwuu.1","depends_on_id":"gt-lwuu","type":"parent-child","created_at":"2025-12-21T21:47:23.882049-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T11:45:02.114796-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-lwuu.2","title":"implement","description":"Implement the solution for {{issue}}. Follow codebase conventions.\nFile discovered work as new issues with bd create.\n\nMake regular commits with clear messages.\nKeep changes focused on the assigned issue.\n\nDepends: load-context","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T21:47:46.876765-08:00","updated_at":"2025-12-25T11:45:02.114796-08:00","dependencies":[{"issue_id":"gt-lwuu.2","depends_on_id":"gt-lwuu","type":"parent-child","created_at":"2025-12-21T21:47:46.878332-08:00","created_by":"daemon"},{"issue_id":"gt-lwuu.2","depends_on_id":"gt-lwuu.1","type":"blocks","created_at":"2025-12-21T21:48:04.4865-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T11:45:02.114796-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-lx3n","title":"Witness startup: bond mol-witness-patrol on start","description":"Wire up Witness to automatically bond its patrol molecule on startup.\n\n## Desired behavior\nOn Witness session start:\n1. gt prime detects RoleWitness\n2. Check for existing in-progress patrol\n3. If found: resume from current step\n4. If not found: bd mol bond mol-witness-patrol --wisp\n5. Output patrol context to agent\n\n## Depends on\n- gt-83k0 (mol-witness-patrol definition)\n- gt-caih (handoff bead state persistence)","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T16:43:42.840567-08:00","updated_at":"2025-12-27T21:25:59.928119-08:00","dependencies":[{"issue_id":"gt-lx3n","depends_on_id":"gt-83k0","type":"blocks","created_at":"2025-12-22T16:43:59.685455-08:00","created_by":"daemon"},{"issue_id":"gt-lx3n","depends_on_id":"gt-caih","type":"blocks","created_at":"2025-12-22T16:43:59.760763-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:25:59.928119-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lxn1c","title":"Digest: mol-deacon-patrol","description":"Patrol 9: Quick cycle, all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:07:03.19311-08:00","updated_at":"2025-12-27T21:26:02.984243-08:00","deleted_at":"2025-12-27T21:26:02.984243-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lxsw","title":"gt done: Command doesn't exist but documented in polecat CLAUDE.md","notes":"The polecat CLAUDE.md documents 'gt done' as the command to signal work is ready for merge queue, but running it gives 'unknown command'. Either implement the command or update the documentation.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-20T07:59:44.548479-08:00","updated_at":"2025-12-27T21:29:56.769402-08:00","deleted_at":"2025-12-27T21:29:56.769402-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-lxv2z","title":"Digest: mol-deacon-patrol","description":"Patrol 20: routine, handing off","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T10:20:25.378692-08:00","updated_at":"2025-12-27T21:26:03.503292-08:00","deleted_at":"2025-12-27T21:26:03.503292-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lxxh2","title":"Epic: Merge Queue Scaling","description":"Scale the merge queue beyond 20 polecats per rig.\n\n## Current Limits\n- Serial MQ works well for 5-20 polecats\n- Conflict rate increases with polecat count and task duration\n- At 50+ polecats, conflicts become frequent enough to bottleneck\n\n## Scaling Strategies (in order of implementation)\n\n### Phase 1: Smart Scheduling\n- Track file hotspots (which files change most)\n- Conflict prediction before spawn (warn if touching hot files)\n- Dependency-aware reordering (maximize parallel non-conflicting merges)\n- Priority lanes (P0 jumps queue)\n\n### Phase 2: Ownership Zones\n- Partition code by ownership (team-a owns auth/, team-b owns payments/)\n- Serialize access to shared/hot zones\n- Parallelize work in non-overlapping zones\n\n### Phase 3: Speculative Execution\n- Refinery pool (multiple refineries try merges in parallel)\n- First to pass CI wins, losers rebase and retry\n- Requires cheap CI and tolerance for wasted compute\n\n### Phase 4: Semantic Resolution\n- AI understands change intent, not just diffs\n- Re-implement changes on new baseline when rebase fails\n- Detect incompatible intents → escalate to human\n\n## Key Insight\nThe merge queue is fundamentally a serialization bottleneck. Solutions either:\n1. Reduce staleness (faster work, continuous rebasing)\n2. Reduce conflicts (partition code, ownership zones)\n3. Resolve conflicts (smart merge, semantic understanding)\n4. Avoid conflicts (predictive scheduling)\n\n## Related\n- gt-gmqe: Bare repo architecture (foundation for this)\n- gt-4u5z: Worktree design (precursor)","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-25T18:30:01.55335-08:00","updated_at":"2025-12-27T21:29:55.172863-08:00","deleted_at":"2025-12-27T21:29:55.172863-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-lyn3","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:45","description":"Patrol 8: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:45:04.933298-08:00","updated_at":"2025-12-27T21:26:05.120555-08:00","deleted_at":"2025-12-27T21:26:05.120555-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lz13","title":"Update templates with molecule navigation workflow","description":"Update all agent templates to use new molecule navigation commands.\n\n## Commands to integrate\n- bd mol current: orientation after startup/handoff (bd-sal9)\n- bd close --continue: seamless step transitions (bd-ieyy)\n\n## Templates to update\n\n### prompts/roles/polecat.md\n- Add bd mol current to 'Finding Your Work' section\n- Replace manual 3-command dance with bd close --continue\n- Update 'Working Through Steps' section\n\n### prompts/roles/crew.md \n- Add molecule navigation to workflow section\n- Show bd mol current for session startup\n\n### prompts/roles/refinery.md\n- Update patrol step transitions to use --continue\n\n### prompts/roles/witness.md\n- Update patrol step transitions to use --continue\n\n### prompts/roles/deacon.md\n- Update patrol step transitions to use --continue\n\n## Key message\nThe Propulsion Principle: close a step, immediately get handed the next.\nNo friction, no forgetting, no 3-command dance.\n\n## Blocked by (Beads features)\n- bd-sal9: bd mol current\n- bd-ieyy: bd close --continue","notes":"BLOCKED 2025-12-23 00:17: Waiting for beads features (bd mol current, bd close --continue) to be implemented. Notified mayor.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T17:01:12.119194-08:00","updated_at":"2025-12-27T21:29:53.13326-08:00","dependencies":[{"issue_id":"gt-lz13","depends_on_id":"gt-qswb","type":"blocks","created_at":"2025-12-22T17:01:31.707885-08:00","created_by":"daemon"},{"issue_id":"gt-lz13","depends_on_id":"gt-fly0","type":"blocks","created_at":"2025-12-22T17:01:31.78232-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.13326-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-lzlee","title":"Digest: mol-deacon-patrol","description":"Patrol 16: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:36:01.407687-08:00","updated_at":"2025-12-27T21:26:00.727181-08:00","deleted_at":"2025-12-27T21:26:00.727181-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-m0fx.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. \"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-m0fx\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T23:28:07.781991-08:00","updated_at":"2025-12-27T21:29:55.645876-08:00","deleted_at":"2025-12-27T21:29:55.645876-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-m1g43","title":"Session ended: gt-gastown-organic","status":"closed","priority":2,"issue_type":"event","created_at":"2026-01-03T11:53:49.359265-08:00","updated_at":"2026-01-04T16:41:37.866809-08:00","closed_at":"2026-01-04T16:41:37.866809-08:00","close_reason":"Archived","created_by":"gastown/polecats/organic"}
{"id":"gt-m3hh","title":"Merge: gt-7hor","description":"branch: polecat/slit\ntarget: main\nsource_issue: gt-7hor\nrig: gastown","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T12:32:43.108463-08:00","updated_at":"2025-12-27T21:27:22.869876-08:00","deleted_at":"2025-12-27T21:27:22.869876-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-m46z2","title":"Digest: mol-deacon-patrol","description":"Patrol 15: all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:39:09.102076-08:00","updated_at":"2025-12-27T21:26:00.32059-08:00","deleted_at":"2025-12-27T21:26:00.32059-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-m72d","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:21","description":"Patrol 4: quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:21:43.191633-08:00","updated_at":"2025-12-27T21:26:05.314017-08:00","deleted_at":"2025-12-27T21:26:05.314017-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-m7ll2","title":"Digest: mol-deacon-patrol","description":"Patrol 8: all clear, handing off for fresh context","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T17:20:31.549297-08:00","updated_at":"2025-12-27T21:26:01.176466-08:00","deleted_at":"2025-12-27T21:26:01.176466-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-m9uq3","title":"Digest: mol-deacon-patrol","description":"Patrol 12: All green","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:33:56.421573-08:00","updated_at":"2025-12-27T21:26:02.543968-08:00","deleted_at":"2025-12-27T21:26:02.543968-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-maieo","title":"Digest: mol-deacon-patrol","description":"Patrol 5: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:39:28.463668-08:00","updated_at":"2025-12-27T21:26:00.904716-08:00","deleted_at":"2025-12-27T21:26:00.904716-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mat34","title":"Digest: mol-deacon-patrol","description":"Patrol 8: all clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T11:05:02.92058-08:00","updated_at":"2025-12-27T21:26:03.421553-08:00","deleted_at":"2025-12-27T21:26:03.421553-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mbyy","title":"CLI API Audit: gt and bd command structure review before OSS launch","description":"# CLI API Audit for OSS Launch\n\nBefore launching Gas Town + Beads as an SDK, we need to ensure the command-line\nAPI is intelligently designed, consistent, intuitive, and structurally sound.\n\n## Current State Audit\n\n### gt (Gas Town) - 50+ top-level commands\n\n**Agent Lifecycle:**\n- `gt mayor`, `gt deacon`, `gt witness`, `gt refinery` - agent management\n- `gt polecat` - worker management (subcommands: add, remove, list, etc.)\n- `gt crew` - persistent workspace management\n- `gt spawn` - create polecat with work\n\n**Work Lifecycle:**\n- `gt spawn` - assign work to new polecat\n- `gt sling` - hook work + start immediately\n- `gt hook` - attach work durably\n- `gt handoff` - pass work to fresh session\n- `gt done` - signal work complete\n- `gt release` - release stuck issues\n\n**Infrastructure:**\n- `gt daemon`, `gt up`, `gt down`, `gt start`, `gt stop`, `gt shutdown`\n- `gt init`, `gt install`, `gt doctor`\n\n**Communication:**\n- `gt mail` (subcommands: send, inbox, read, etc.)\n- `gt nudge`, `gt broadcast`\n\n**Molecules:**\n- `gt molecule` / `gt mol` (17 subcommands!)\n\n### bd (Beads) - Well-categorized but some sprawl\n\n**Core CRUD:**\n- `bd create`, `bd show`, `bd update`, `bd close`, `bd list`\n\n**Molecules:**\n- `bd mol` (5 subcommands: catalog, show, bond, run, distill)\n- `bd wisp` (3 subcommands: create, gc, list)\n- `bd pour` (top-level alias for instantiation)\n- `bd cook` (compile formula to proto)\n\n**Hook/Pin:**\n- `bd pin` - attach mol to agent hook\n- `bd unpin` - remove from hook\n- `bd hook` - inspect what's on hook\n\n## Structural Concerns\n\n### 1. 
gt is sprawling (50+ top-level commands)\n\nMany commands could be subcommands:\n```\n# Current # Could be\ngt up/down/start/stop gt daemon {up|down|start|stop}\ngt spawn/sling/hook/handoff gt work {spawn|sling|hook|handoff}\ngt mayor/deacon/witness gt agent {mayor|deacon|witness|...}\n```\n\n### 2. Overlap between gt and bd\n\n| Concept | gt | bd |\n|---------|----|----|\n| Hook | `gt hook` | `bd pin`, `bd hook` |\n| Mail | `gt mail` | `bd mail` (delegates) |\n| Molecules | `gt mol` (17 cmds) | `bd mol` (5 cmds) |\n\nWhich owns what? User confusion likely.\n\n### 3. Naming inconsistencies\n\n- `gt polecat` (noun) vs `gt spawn` (verb) - both about polecats\n- `gt done` vs `gt handoff` - both end work but different\n- `bd mol run` vs `bd pour` - both instantiate\n\n### 4. Subcommand depth decisions\n\n**Good patterns:**\n- `bd mol {catalog|show|bond|run|distill}` - coherent group\n- `gt mail {send|inbox|read|archive}` - coherent group\n\n**Questionable patterns:**\n- `bd create --wisp` AND `bd wisp create` - two ways to do same thing\n- `gt spawn --molecule X` AND `gt mol instantiate` - overlap\n\n### 5. Flag vs subcommand guidelines needed\n\nWhen to use flags vs subcommands?\n- `--json` as flag ✓ (modifier)\n- `--wisp` as flag for create? Or separate `bd wisp create`?\n- `--molecule` on spawn? Or separate workflow?\n\n## Recommendations to Evaluate\n\n### A. Consolidate gt top-level commands\n\nGroup related commands:\n```\ngt agent {mayor|deacon|witness|refinery|polecat}\ngt work {spawn|sling|hook|handoff|done}\ngt infra {daemon|up|down|doctor}\n```\n\n### B. Clarify gt vs bd ownership\n\n| Domain | Owner | Other defers |\n|--------|-------|--------------|\n| Issues | bd | gt uses bd |\n| Agents | gt | bd doesn't touch |\n| Molecules | bd mol | gt mol wraps for UX |\n| Hooks | bd pin/hook | gt hook wraps |\n| Mail | gt mail | bd mail delegates |\n\n### C. 
Apply consistent naming\n\n- All agent commands: nouns (`gt polecat`, `gt mayor`)\n- All work commands: verbs (`gt spawn`, `gt sling`, `gt hook`)\n- Avoid synonyms: pick one term and stick with it\n\n### D. Flag vs subcommand rule\n\n- Flags: modify behavior of command (`--json`, `--force`, `--wisp`)\n- Subcommands: distinct operations (`mol run`, `mol squash`, `mol burn`)\n- Exception: very common operations can have top-level aliases\n\n## Tasks\n\n1. [ ] Review gt command groupings - propose consolidation\n2. [ ] Review bd command groupings - check for sprawl\n3. [ ] Document gt vs bd ownership boundaries\n4. [ ] Identify and resolve naming inconsistencies\n5. [ ] Document flag vs subcommand decision criteria\n6. [ ] Create migration plan for any breaking changes\n7. [ ] Update help text to be consistent\n8. [ ] Test agent UX: can Claude guess commands correctly?\n\n## Success Criteria\n\n- New user can guess command structure intuitively\n- `gt --help` fits on one screen (or is well-categorized)\n- No duplicate functionality between gt and bd\n- Consistent naming throughout\n- Claude agents can discover commands without documentation","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-24T23:48:42.420867-08:00","updated_at":"2025-12-27T21:29:52.547876-08:00","dependencies":[{"issue_id":"gt-mbyy","depends_on_id":"gt-jo9n","type":"blocks","created_at":"2025-12-24T23:48:52.221824-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.547876-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-mc4n","title":"load-context","description":"Run gt prime and bd prime. Verify issue assignment.\nCheck inbox for any relevant messages.\n\nRead the assigned issue (gt-test123) and understand the requirements.\nIdentify any blockers or missing information.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T21:56:18.534569-08:00","updated_at":"2025-12-25T15:52:43.900154-08:00","deleted_at":"2025-12-25T15:52:43.900154-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mcch0","title":"Digest: mol-deacon-patrol","description":"Patrol 16: Quiet","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-25T07:28:57.926784-08:00","updated_at":"2025-12-25T07:28:57.926784-08:00","closed_at":"2025-12-25T07:28:57.926736-08:00"}
{"id":"gt-mcjd","title":"Work on gt-o9j: Fix tmux status bar polecat count - exclu...","description":"Work on gt-o9j: Fix tmux status bar polecat count - exclude static roles (mayor, deacon, witnesses, refineries, docs, hop). Run 'bd show gt-o9j' for details.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T07:52:51.623541-08:00","updated_at":"2025-12-27T21:29:56.806106-08:00","deleted_at":"2025-12-27T21:29:56.806106-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-md2rg.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. 
\"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-md2rg\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T02:11:14.340033-08:00","updated_at":"2025-12-27T21:29:55.317325-08:00","deleted_at":"2025-12-27T21:29:55.317325-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mdgt8","title":"Convert formula files from YAML to JSON","description":"Formulas currently use .formula.yaml but the project avoids YAML. Convert to .formula.json for consistency with the rest of the codebase. This blocks implementing compose operators (advice, expand, etc.) since the schema needs to be right first.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-25T01:23:06.149396-08:00","updated_at":"2025-12-27T21:29:52.53127-08:00","deleted_at":"2025-12-27T21:29:52.53127-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mfatq","title":"Digest: mol-deacon-patrol","description":"Patrol 17: Nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:52:57.848245-08:00","updated_at":"2025-12-27T21:26:04.091698-08:00","deleted_at":"2025-12-27T21:26:04.091698-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mh18d","title":"Digest: mol-deacon-patrol","description":"Patrol 11: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-26T13:57:28.9637-08:00","updated_at":"2025-12-26T13:57:28.9637-08:00","closed_at":"2025-12-26T13:57:28.96366-08:00"}
{"id":"gt-mjs5e","title":"Digest: mol-deacon-patrol","description":"Patrol 9: all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:35:40.754118-08:00","updated_at":"2025-12-27T21:26:00.373404-08:00","deleted_at":"2025-12-27T21:26:00.373404-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mjso","title":"Merge: gt-rixa","description":"branch: polecat/furiosa\ntarget: main\nsource_issue: gt-rixa\nrig: gastown","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T14:09:26.367745-08:00","updated_at":"2025-12-27T21:27:22.912542-08:00","deleted_at":"2025-12-27T21:27:22.912542-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-mkdb","title":"Digest: mol-deacon-patrol","description":"Patrol 18: Routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:38:41.689195-08:00","updated_at":"2025-12-27T21:26:04.532481-08:00","deleted_at":"2025-12-27T21:26:04.532481-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mkh7","title":"check-refinery","description":"Ensure the refinery is alive and processing merge requests.\n\nNeeds: inbox-check","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T16:18:21.598893-08:00","updated_at":"2025-12-25T15:52:58.020202-08:00","deleted_at":"2025-12-25T15:52:58.020202-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mmbh","title":"Fix out-of-sync molecule test expectations","description":"Tests in builtin_molecules_test.go have hardcoded expectations that no longer match actual molecules:\n- TestBuiltinMolecules: expects 9 molecules, got 11\n- TestPolecatWorkMolecule: step refs out of sync\n- TestDeaconPatrolMolecule: step count and refs out of sync\n\nThis is pre-existing on main, not introduced by any polecat work.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-22T22:20:38.8105-08:00","updated_at":"2025-12-27T21:29:56.317305-08:00","deleted_at":"2025-12-27T21:29:56.317305-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-mol-047qp","title":"Move disk 7: B → C","description":"Move disk 7 from peg B to peg C. (Move 960/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.046085-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-047v","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 396/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.821856-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-05bp","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 892/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.016687-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-06p","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 88/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.720383-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-06pw","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 796/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.976493-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-07k","title":"Implement","description":"Implement the solution for gt-gmqe. Follow codebase conventions.\nFile discovered work as new issues with bd create.\n\nMake regular commits with clear messages.\nKeep changes focused on the assigned issue.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T18:45:53.64976-08:00","updated_at":"2025-12-27T21:29:55.156264-08:00","dependencies":[{"issue_id":"gt-mol-07k","depends_on_id":"gt-mol-aux","type":"parent-child","created_at":"2025-12-25T18:45:53.650847-08:00","created_by":"mayor"},{"issue_id":"gt-mol-07k","depends_on_id":"gt-mol-1vj","type":"blocks","created_at":"2025-12-25T18:45:53.66264-08:00","created_by":"mayor"}],"deleted_at":"2025-12-27T21:29:55.156264-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mol-08o95","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 975/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.053064-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-08p","title":"Move disk 1: A → C","description":"Move the smallest disk from peg A to peg C.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:13:36.365236-08:00","updated_at":"2025-12-31T01:27:23.921829-08:00","close_reason":"Closed","deleted_at":"2025-12-31T01:27:23.921829-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0964","title":"Move disk 8: C → B","description":"Move disk 8 from peg C to peg B. (Move 384/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.817681-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0a91","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 743/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.954791-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0amk","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 340/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.802396-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0cj4","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 278/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.781499-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0drc","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 506/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.862172-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0ds","title":"Verify initial state","description":"All 10 disks stacked on peg A. Largest on bottom.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.693491-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0fa7","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 792/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.974852-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0fm1","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 338/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.801739-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0fw2","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 379/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.815934-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0gz8","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 470/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.848991-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0i75","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 780/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.969896-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0il4","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 904/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.022019-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0j4","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 29/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.702421-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0kbv","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 854/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.000629-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0lr5","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 730/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.949612-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0mz","title":"Load context","description":"Run gt prime and bd prime. Verify issue assignment.\nCheck inbox for any relevant messages.\n\nRead the assigned issue (gt-8tmz.36) and understand the requirements.\nIdentify any blockers or missing information.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:47:36.237689-08:00","updated_at":"2025-12-27T21:29:55.106067-08:00","deleted_at":"2025-12-27T21:29:55.106067-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mol-0o36","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 847/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.997704-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0pbn","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 559/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.882117-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0r5o","title":"Move disk 7: A → B","description":"Move disk 7 from peg A to peg B. (Move 448/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.840971-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0tx","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 38/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.705063-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0u9d","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 593/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.894987-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0uqo","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 634/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.910818-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0uzs","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 441/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.83843-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0w7i","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 726/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.947963-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0wu","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 3/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.694505-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0x7b","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 363/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.810383-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-0xox","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 516/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.865906-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1053i","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 1010/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.068603-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-127u","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 581/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.890453-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-13pvx","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 994/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.06151-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-14ry","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 225/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.764096-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-14s3","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 341/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.802741-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-16r0","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 188/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.752128-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-179r","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 244/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.77032-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-17wq","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 180/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.749543-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-18y","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 33/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.703608-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1b7pf","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 965/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.048625-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1be","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 15/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.698175-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1bwt","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 921/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.029291-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1cut","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 426/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.832952-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1d2","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 34/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.703914-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1e7","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 50/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.708697-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1eg","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 97/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.72309-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1erw","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 167/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.745293-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1j3c3","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 969/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.050389-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1j4f","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 785/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.971972-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1mi","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 19/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.699427-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1mp5","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 232/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.766345-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1pl5q","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 962/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.046961-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1qlx","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 195/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.754364-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1r1","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 102/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.72461-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1s3z","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 406/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.825375-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1ssz","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 533/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.872359-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1u1x","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 457/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.844235-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1up38","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 1015/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.070845-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1vj","title":"Load context","description":"Run gt prime and bd prime. Verify issue assignment.\nCheck inbox for any relevant messages.\n\nRead the assigned issue (gt-gmqe) and understand the requirements.\nIdentify any blockers or missing information.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T18:45:53.650028-08:00","updated_at":"2025-12-27T21:29:55.147812-08:00","dependencies":[{"issue_id":"gt-mol-1vj","depends_on_id":"gt-mol-aux","type":"parent-child","created_at":"2025-12-25T18:45:53.674603-08:00","created_by":"mayor"}],"deleted_at":"2025-12-27T21:29:55.147812-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mol-1wf2","title":"Move disk 8: B → A","description":"Move disk 8 from peg B to peg A. (Move 640/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.913203-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1wp6","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 641/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.913588-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-1zpe","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 290/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.785504-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-20s","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 106/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.725832-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-21e","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 23/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.700645-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-22ya","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 186/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.751499-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-23ew","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 291/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.785833-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2468","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 253/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.773253-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-253","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 52/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.709362-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2995","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 867/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.006065-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-29hq","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 621/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.905729-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2ag5","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 620/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.905345-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2bby","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 939/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.036913-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2bg7","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 190/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.752751-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2c5y","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 718/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.944775-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2f78","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 589/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.893467-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2hf4","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 685/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.931187-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2hgra","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 966/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.049068-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2i1","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 120/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.730101-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2jjl","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 299/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.788538-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2jvk","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 869/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.006887-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2l9","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 108/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.726437-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2lb","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 105/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.725533-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2mn8","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 777/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.968682-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2nbv","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 308/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.791571-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2nrb","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 505/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.861794-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2ocu","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 750/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.957624-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2odt","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 781/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.970301-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2pt","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 47/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.707696-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2u5","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 78/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.717386-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2udc","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 435/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.836264-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2wle","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 587/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.892717-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2x7y","title":"Move disk 5: A → B","description":"Move disk 5 from peg A to peg B. (Move 592/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.894615-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2xla","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 537/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.873849-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2y3nv","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 1022/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.073958-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-2zxn","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 234/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.767023-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-30lb","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 941/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.037778-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-32ej","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 564/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.884002-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-35l0","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 230/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.765699-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-36gk","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 371/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.813164-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-36rk7","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 998/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.063267-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-37g7","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 355/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.807591-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-38gi","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 524/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.868945-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-38zg","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 754/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.959249-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-3b26","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 390/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.81978-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-3c0","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 40/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.705658-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-3cmi","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 582/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.890829-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-3dk9","title":"Move disk 5: B → C","description":"Move disk 5 from peg B to peg C. (Move 144/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.737543-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-3dp9","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 841/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.995193-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-3f0","title":"Implement","description":"Implement the solution for gt-8tmz.34. Follow codebase conventions.\nFile discovered work as new issues with bd create.\n\nMake regular commits with clear messages.\nKeep changes focused on the assigned issue.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:47:10.031026-08:00","updated_at":"2025-12-27T21:29:55.13109-08:00","dependencies":[{"issue_id":"gt-mol-3f0","depends_on_id":"gt-mol-xl8","type":"blocks","created_at":"2025-12-25T19:47:10.045526-08:00","created_by":"mayor"}],"deleted_at":"2025-12-27T21:29:55.13109-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mol-3hik","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 462/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.846066-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-3igy","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 124/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.731323-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-3jc6","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 687/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.931981-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-3jed6","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 968/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.049953-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-3nt","title":"Move disk 2: A → B","description":"Move disk 2 from peg A to peg B.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:13:36.365562-08:00","updated_at":"2025-12-31T01:27:23.921829-08:00","close_reason":"Closed","deleted_at":"2025-12-31T01:27:23.921829-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-3o8x","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 221/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.762767-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-3pr3","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 484/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.854126-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-3pxk9","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 1020/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.073067-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-3rq8","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 911/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.025033-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-3teb","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 862/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.003974-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-3ueb","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 303/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.789861-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-3wrc","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 734/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.951208-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-3wyv","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 669/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.924727-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-43c4","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 477/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.851543-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-43iji","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 1023/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.074408-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-43sw","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 731/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.95001-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-44fi","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 569/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.885901-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-44qj","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 674/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.926721-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-45mg","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 931/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.03352-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-467o","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 871/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.00779-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-47j4","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 210/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.759239-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-47k4","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 312/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.792923-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-47wp","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 413/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.827822-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4996","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 307/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.791242-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-49h9","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 543/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.876075-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-49qb","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 337/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.801383-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4bce","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 573/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.88744-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4bp","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 11/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.696996-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4bx","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 20/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.699736-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4byg","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 649/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.916713-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4dz1v","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 971/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.051292-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4g9z","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 329/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.798683-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4hq5","title":"Move disk 6: B → A","description":"Move disk 6 from peg B to peg A. (Move 736/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.952008-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4ik","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 25/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.70124-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4ilj","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 138/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.735668-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4kqy","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 424/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.832195-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4mgn","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 604/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.899181-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4n8e","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 740/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.953593-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4p68","title":"Move disk 7: B → C","description":"Move disk 7 from peg B to peg C. (Move 576/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.888575-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4pf","title":"Move disk 6: A → C","description":"Move disk 6 from peg A to peg C. (Move 32/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.703319-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4pj9","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 583/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.891206-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4q45","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 203/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.756982-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4qd9","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 863/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.004393-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4qta","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 759/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.961291-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4r982","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 952/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.042647-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4rse","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 263/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.776573-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4sb6n","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 973/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.052194-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4son","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 298/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.788207-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4td78","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 988/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.058886-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4u6k","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 824/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.988133-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4vuz","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 145/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.737856-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4wgp6","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 999/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.063706-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4wie","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 305/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.790562-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4wt5","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 521/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.867841-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4xcm","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 609/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.90108-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4xqs","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 468/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.848273-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-4yxj","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 689/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.932763-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-502r","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 473/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.85011-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-51h","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 81/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.718279-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-52jc","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 140/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.736284-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-52rs","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 354/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.807237-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-54bq","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 403/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.824329-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-54ka","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 161/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.74335-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-55i7","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 463/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.846421-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-564c","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 153/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.740331-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-572","title":"Implement","description":"Implement the solution for gt-8tmz.36. Follow codebase conventions.\nFile discovered work as new issues with bd create.\n\nMake regular commits with clear messages.\nKeep changes focused on the assigned issue.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:47:36.237423-08:00","updated_at":"2025-12-27T21:29:55.114325-08:00","dependencies":[{"issue_id":"gt-mol-572","depends_on_id":"gt-mol-0mz","type":"blocks","created_at":"2025-12-25T19:47:36.252069-08:00","created_by":"mayor"}],"deleted_at":"2025-12-27T21:29:55.114325-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mol-57w","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 58/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.711392-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5ayq","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 317/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.794617-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5dej","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 326/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.797666-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5eb2","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 494/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.857767-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5erz","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 498/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.859226-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5g1","title":"Move disk 5: A → B","description":"Move disk 5 from peg A to peg B. (Move 16/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.698484-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5gkq","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 362/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.810021-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5hn","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 2/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.694179-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5ht","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 110/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.727035-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5i7","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 56/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.710713-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5ici2","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 996/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.062375-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5je3","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 937/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.036055-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5k7g","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 313/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.79327-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5klm","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 663/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.922361-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5knn","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 300/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.788864-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5nog","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 837/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.993512-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5ou19","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 955/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.043924-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5pj3","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 568/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.88551-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5qt2","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 851/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.999363-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5r2b","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 238/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.768361-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5rbq","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 529/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.870851-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5rv8","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 686/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.931586-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5tfe","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 842/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.995604-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5vxy","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 212/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.759856-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5xfa","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 866/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.005637-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5yeu","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 179/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.749218-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-5zup","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 642/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.913967-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-607l","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 935/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.035195-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-638c","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 794/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.975659-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-63cr","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 900/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.020155-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-671m","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 442/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.838787-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-68b1","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 284/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.783502-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-68z1","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 182/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.750205-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-69gm","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 766/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.964127-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-6aga","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 433/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.835524-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-6g768","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 974/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.052638-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-6gov","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 447/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.840597-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-6h46","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 491/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.856675-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-6io1","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 668/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.924347-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-6l4m","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 676/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.927505-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-6nl9","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 350/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.805832-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-6qo7","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 274/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.780192-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-6qv7","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 717/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.94437-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-6rx1","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 270/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.778868-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-6sk4","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 531/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.871615-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-6vqi","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 779/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.969494-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-6wiy","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 600/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.897638-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-6x8z","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 838/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.993941-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-6xmy","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 665/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.923166-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-6yz","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 103/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.724917-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-6z6n","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 347/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.804805-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-6zuc","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 714/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.94317-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-70m7","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 548/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.877971-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-712mi","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 983/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.056527-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-738","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 91/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.72129-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-73sj","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 517/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.8663-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-73x1","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 395/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.82149-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-75o","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 31/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.703024-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-78j7","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 486/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.854867-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-7a3z","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 758/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.960886-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-7aj2","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 541/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.875333-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-7dn","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 85/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.719467-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-7es","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 68/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.71441-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-7g4","title":"Move disk 1: B → A","description":"Move disk 1 from peg B to peg A.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:13:36.366491-08:00","updated_at":"2025-12-31T01:27:23.921829-08:00","close_reason":"Closed","deleted_at":"2025-12-31T01:27:23.921829-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-7gy3b","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 1000/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.064144-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-7hxv","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 302/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.789531-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-7k14","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 500/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.85996-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-7k7i","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 693/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.934374-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-7qo","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 21/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.700041-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-7rzx","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 178/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.7489-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-7s1b","title":"Move disk 6: B → A","description":"Move disk 6 from peg B to peg A. (Move 544/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.876448-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-7vd9","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 612/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.902265-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-7w2","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 9/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.696395-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-7wj6","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 868/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.006485-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-7yyq","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 681/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.929506-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-80sw","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 163/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.744006-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-820q","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 690/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.933164-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-825","title":"Move disk 6: C → B","description":"Move disk 6 from peg C to peg B. (Move 96/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.72278-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-82n","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 51/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.70904-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-86s5","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 393/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.820809-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-87idy","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 984/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.056963-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-88kh","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 348/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.805136-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-89yw","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 683/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.930334-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8ax","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 94/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.722188-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8b0","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 98/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.723395-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8bhr","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 769/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.965362-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8cbk","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 876/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.009909-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8d5","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 82/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.718568-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8etg","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 281/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.782513-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8ex","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 122/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.730709-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8f2e","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 316/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.794285-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8g7","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 84/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.719165-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8hf","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 63/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.712929-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8hp","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 72/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.715589-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8ii1j","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 987/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.058412-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8jrz","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 823/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.987733-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8m5n","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 654/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.918666-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8ma","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 107/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.726131-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8n8d","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 334/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.800364-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8njt","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 515/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.865528-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8nz","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 13/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.697586-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8o3a","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 247/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.771293-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8oeu","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 451/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.842073-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8oq","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 59/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.711718-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8pxr","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 891/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.01627-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8q1l","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 365/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.811076-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8qpq","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 465/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.84715-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8rtd","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 913/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.025905-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8v4","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 92/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.721603-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8vdv","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 623/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.906486-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8wnv","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 309/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.791895-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8yb","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 35/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.704199-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-8yiw","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 888/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.015005-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-90e8","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 251/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.772587-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-90pm","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 825/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.988533-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-91l7","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 860/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.003147-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-967c","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 189/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.752443-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-96pv","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 631/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.90965-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-970j","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 919/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.028449-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-98b9","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 421/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.83062-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-98d7","title":"Move disk 6: A → C","description":"Move disk 6 from peg A to peg C. (Move 416/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.828885-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-98oa","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 902/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.02101-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-99a2","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 755/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.959662-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9ajp","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 460/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.845318-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9as","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 10/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.696702-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9cox","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 381/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.816641-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9dfl","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 479/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.852279-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9eiy","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 710/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.941557-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9fgl","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 709/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.941152-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9frw","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 295/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.787182-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9gd3","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 411/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.827105-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9gfz","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 933/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.034357-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9io","title":"Move disk 5: B → C","description":"Move disk 5 from peg B to peg C. (Move 48/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.707983-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9iru","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 808/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.981468-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9ish","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 879/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.011168-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9itq","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 929/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.032672-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9kma","title":"Move disk 6: A → C","description":"Move disk 6 from peg A to peg C. (Move 224/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.763761-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9mdo","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 578/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.889324-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9n8v","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 874/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.009052-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9of9","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 653/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.91826-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9oj2","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 845/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.996874-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9okd","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 818/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.985639-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9p8z","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 716/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.943966-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9tl0","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 850/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.998948-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9xv","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 87/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.720078-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-9z70","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 282/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.782856-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-a05n","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 325/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.797332-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-a3wo","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 392/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.82045-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-a4j5","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 205/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.757651-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-a4lq","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 761/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.962094-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-a4s6","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 511/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.864033-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-a72y","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 895/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.017979-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-a7c7","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 148/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.738775-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-a80y","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 789/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.973595-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-a8b","title":"Load context","description":"Run gt prime and bd prime. Verify issue assignment.\nCheck inbox for any relevant messages.\n\nRead the assigned issue (gt-8tmz.10) and understand the requirements.\nIdentify any blockers or missing information.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:47:48.59927-08:00","updated_at":"2025-12-27T21:29:55.089396-08:00","deleted_at":"2025-12-27T21:29:55.089396-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mol-a8sq","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 330/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.799014-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-a8tm","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 782/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.970737-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-a94m","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 616/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.903819-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-a9b2","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 659/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.92074-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-aai3","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 157/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.742026-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-aarx","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 629/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.908877-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-acx","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 109/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.726743-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ae2","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 6/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.695436-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-afdh","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 360/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.80934-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-affc","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 666/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.923577-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ah4","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 12/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.697298-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ahwvw","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 1006/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.0668-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ai53","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 858/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.00229-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ai9ou","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 979/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.054772-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-air","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 24/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.700941-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ajv","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 90/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.720987-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ajw","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 74/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.716181-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-akmy","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 697/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.936347-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-am7","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 62/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.712629-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ambk","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 821/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.986885-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-anxfx","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 982/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.056085-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-arrz","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 358/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.808646-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-asg7","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 133/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.734105-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-auem","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 268/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.778214-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-aux","title":"mol-polecat-work","description":"Full polecat lifecycle from assignment to decommission.\n\nThis proto enables nondeterministic idempotence for polecat work. A polecat that crashes after any step can restart, read its molecule state, and continue from the last completed step. No work is lost.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| issue | Yes | The source issue ID being worked on |","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-25T18:45:53.649284-08:00","updated_at":"2025-12-27T21:29:55.16453-08:00","deleted_at":"2025-12-27T21:29:55.16453-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-mol-avip","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 482/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.853387-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-avp5","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 475/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.850833-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ax83","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 601/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.898023-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ay4e","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 333/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.800028-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-aynu","title":"Move disk 6: B → A","description":"Move disk 6 from peg B to peg A. (Move 928/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.032248-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-b1vt","title":"Move disk 5: A → B","description":"Move disk 5 from peg A to peg B. (Move 880/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.011585-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-b3ba","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 483/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.853765-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-bb11a","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 997/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.062828-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-bd4p","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 418/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.829582-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-be9x","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 294/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.786828-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-bejq","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 373/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.813869-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-bexo","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 260/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.775556-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-bfa0","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 747/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.956426-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-bg8","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 86/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.719767-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-bh7a","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 566/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.884774-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-bh7k","title":"Move disk 5: C → A","description":"Move disk 5 from peg C to peg A. (Move 656/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.919462-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-bj6v","title":"Move disk 7: B → C","description":"Move disk 7 from peg B to peg C. (Move 192/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.7534-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-bm2a","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 242/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.769678-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-bpp4","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 397/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.822202-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-brh3","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 367/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.811768-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-bs0q","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 906/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.022896-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-bse3","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 870/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.007371-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-bui9","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 257/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.774593-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-bvna","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 474/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.850464-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-by3k","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 786/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.972368-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-bzpw","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 696/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.935932-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-c11um","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 989/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.059333-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-c3pp","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 727/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.948392-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-c5y6","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 420/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.830281-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-c7yf","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 520/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.867459-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ca8a","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 504/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.861414-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-cab4","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 787/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.972793-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-cb4v","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 346/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.804466-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-cchk","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 199/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.7557-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-cda","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 104/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.725233-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ce5e","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 605/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.89955-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-cejf","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 738/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.952799-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-cfrm","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 897/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.018877-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-cg17","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 301/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.789204-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-cg7p","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 391/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.820117-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ciw","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 5/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.695127-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-cjc","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 101/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.724317-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-clbb","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 622/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.906114-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-clbog","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 951/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.042204-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-co0d","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 276/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.780833-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-cso7v","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 948/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.040776-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ctl6","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 245/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.770668-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ctns","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 833/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.991861-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-cubr","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 293/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.786498-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-cuqh","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 147/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.738463-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-cuxx","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 595/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.895766-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-cw8f","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 399/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.822914-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-cwaz","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 262/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.776243-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-cx8s","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 370/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.812824-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-d0e","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 121/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.730414-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-d0wg","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 509/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.863312-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-d198","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 499/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.859598-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-d1hk","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 599/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.897266-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-d2hbz","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 1018/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.072148-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-d49","title":"Move disk 3: A → C","description":"Move the largest disk from peg A to peg C.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:13:36.366188-08:00","updated_at":"2025-12-31T01:27:23.921829-08:00","close_reason":"Closed","deleted_at":"2025-12-31T01:27:23.921829-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-d69","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 115/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.728585-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-d71u","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 389/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.81942-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-dafo","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 552/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.879475-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-dbc","title":"Move disk 5: C → A","description":"Move disk 5 from peg C to peg A. (Move 80/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.717984-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-dbi6","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 707/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.940333-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-dcwy","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 492/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.857033-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ddau","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 831/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.991037-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-debl","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 152/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.740022-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-dej","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 117/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.729196-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-df6d","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 695/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.935409-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-dhxq","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 567/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.885144-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-djcc","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 617/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.904204-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-dlc","title":"Move disk 7: A → B","description":"Move disk 7 from peg A to peg B. (Move 64/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.713216-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-dm83","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 187/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.75182-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-domz","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 425/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.832587-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-dopr","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 321/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.795958-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-dos","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 17/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.698786-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-dpib","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 196/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.754702-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-dpzn","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 679/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.928722-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ds5y","title":"Move disk 5: B → C","description":"Move disk 5 from peg B to peg C. (Move 912/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.025483-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-dtjx","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 772/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.966603-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-dv11k","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 957/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.044782-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-dw7","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 18/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.69912-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-dxah","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 127/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.732237-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-dxhq","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 386/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.818374-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-dzqz","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 185/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.751164-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-e0fs","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 405/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.825006-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-e0n","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 27/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.701831-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-e16j","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 522/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.868208-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-e1rk","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 408/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.826064-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-e385v","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 1001/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.064569-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-e3lzc","title":"Move disk 6: A → C","description":"Move disk 6 from peg A to peg C. (Move 992/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.06064-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-e3uq","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 658/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.920259-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-e40","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 53/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.709666-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-e4cq","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 815/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.984418-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-e5vl","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 942/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.038191-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-e5yv","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 633/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.910423-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-e712","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 286/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.784153-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-e7j","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 83/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.718867-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-e80i","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 218/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.761796-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-e8e5","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 811/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.982719-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ea57","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 584/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.891583-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-eae","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 30/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.702723-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-edan","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 677/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.927924-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-eeho","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 258/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.774917-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-een7","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 380/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.816294-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-eg7o","title":"Move disk 7: C → A","description":"Move disk 7 from peg C to peg A. (Move 704/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.939144-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-egyl","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 385/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.818024-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ehd3","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 745/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.955598-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ehmj","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 369/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.812454-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ehp6","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 184/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.750847-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ej4d","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 840/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.994786-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-em4c","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 644/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.91476-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-em5w","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 645/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.915139-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-emj8","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 934/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.03477-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-enga","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 793/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.975263-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-eo0aa","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 1016/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.07128-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-eof3","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 827/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.989379-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-eogc","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 319/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.795289-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-eoth","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 856/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.001439-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-epgj","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 791/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.974434-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-et0n","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 372/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.813499-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-etdq","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 444/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.83951-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-etsh","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 557/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.881373-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-eui0","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 201/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.75634-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-f47","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 89/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.720685-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-f57r","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 194/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.75405-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-f5cx","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 571/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.886674-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-f5fr","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 702/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.938324-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-f6e0g","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 1011/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.069053-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-f9hc","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 899/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.019737-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-fa56","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 277/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.781168-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-fcme","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 795/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.97607-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-fdax","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 804/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.979824-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ffgw","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 527/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.870063-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-fgm","title":"Move disk 1: A → C","description":"Move disk 1 from peg A to peg C.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:13:36.36709-08:00","updated_at":"2025-12-31T01:27:23.921829-08:00","close_reason":"Closed","deleted_at":"2025-12-31T01:27:23.921829-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-flvc","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 349/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.805465-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-fo4c","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 819/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.986056-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-fo8a","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 927/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.031823-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-fof7p","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 1014/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.070385-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-fp47","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 280/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.782175-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-fqix","title":"Move disk 6: A → C","description":"Move disk 6 from peg A to peg C. (Move 800/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.978149-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-fs8x","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 682/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.929917-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-fz48","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 518/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.866717-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-g0iv","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 131/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.733453-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-g1y8","title":"Move disk 5: A → B","description":"Move disk 5 from peg A to peg B. (Move 208/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.758604-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-g51z","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 283/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.78318-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-g7xv","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 162/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.743675-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-g8j3","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 343/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.803437-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-g8x","title":"Request shutdown","description":"Send shutdown request to Witness.\nWait for termination.\n\nThe polecat is now ready to be cleaned up.\nDo not exit directly - wait for Witness to kill the session.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:47:48.599526-08:00","updated_at":"2025-12-27T21:29:55.081039-08:00","dependencies":[{"issue_id":"gt-mol-g8x","depends_on_id":"gt-mol-jpg","type":"blocks","created_at":"2025-12-25T19:47:48.657908-08:00","created_by":"mayor"}],"deleted_at":"2025-12-27T21:29:55.081039-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mol-g9fg","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 673/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.926338-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gb42","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 485/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.854494-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gbg7","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 728/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.948801-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gd6o","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 773/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.967038-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gdt1","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 749/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.957228-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ge3g","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 249/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.771945-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gfem","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 828/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.989808-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ggz0","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 878/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.010757-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ghmh","title":"Move disk 5: B → C","description":"Move disk 5 from peg B to peg C. (Move 432/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.83516-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gjqv","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 455/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.843521-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gjse","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 271/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.77921-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gk8x","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 813/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.983587-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-glg","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 113/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.727981-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gmw3","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 918/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.028026-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gob9","title":"Move disk 6: B → A","description":"Move disk 6 from peg B to peg A. (Move 352/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.80654-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-goq2","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 134/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.734408-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gp9e","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 357/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.8083-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gp9n","title":"Move disk 5: C → A","description":"Move disk 5 from peg C to peg A. (Move 176/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.748239-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gqo6","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 805/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.980234-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gqwt","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 250/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.772267-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-grvy","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 556/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.880974-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-grweg","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 995/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.061935-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gszz","title":"Move disk 6: C → B","description":"Move disk 6 from peg C to peg B. (Move 480/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.852653-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gths","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 328/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.798344-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gtl0","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 155/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.741076-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gtn2","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 790/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.973998-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-guip","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 619/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.904948-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-guov","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 648/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.916327-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gv4c","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 315/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.793933-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gx20","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 875/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.009475-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gx3o","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 170/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.746271-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gxsy","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 723/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.946777-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gxvm","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 519/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.867082-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gyvc","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 146/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.738155-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-gzty","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 886/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.014149-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-h03u","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 783/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.971145-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-h1ad","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 691/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.933566-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-h4y7","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 125/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.731618-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-h9v","title":"towers-of-hanoi","description":"AGENT EXECUTION PROTOCOL - Towers of Hanoi\n\nPURPOSE: This is a durability proof, not computation. Steps are pre-computed.\nYour job is to execute them mechanically, proving crash-recovery at scale.\n\nEXECUTION LOOP:\n1. Find current state: bd mol current \u003cmol-id\u003e\n2. Find the next ready (unblocked) step\n3. Close it: bd close \u003cstep-id\u003e\n4. Repeat until no ready steps remain\n\nON RESUME (after crash/restart/handoff):\n- Same as fresh start. The molecule IS the state.\n- Query `bd mol current \u003cmol-id\u003e`, continue from there.\n- No memory of previous session needed.\n\nDO:\n- Close steps as fast as possible (they're trivial mechanical moves)\n- Use `gt handoff` when context fills (proactive cycling)\n- Trust the pre-computed solution - every move is already correct\n\nDO NOT:\n- Try to solve Hanoi yourself - moves are already computed\n- Mark steps in_progress - just close them directly\n- Ask for permission - this is GUPP territory, just execute\n- Stop for human input - run autonomously until complete\n\nMONITORING:\n- Progress: Count closed children of the molecule\n- For mega-molecules: Use convoy dashboard when available\n- Completion: All steps closed = molecule complete\n\nThis proves Gas Town can execute arbitrarily long workflows with\nnondeterministic idempotence - different sessions, same outcome.\n","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-31T01:13:36.36421-08:00","updated_at":"2025-12-31T01:27:23.921829-08:00","deleted_at":"2025-12-31T01:27:23.921829-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"epic"}
{"id":"gt-mol-haq","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 49/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.708297-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-hb9","title":"Check own context limit","description":"Check own context limit.\n\nThe Deacon runs in a Claude session with finite context. Check if approaching the limit:\n\n```bash\ngt context --usage\n```\n\nIf context is high (\u003e80%), prepare for handoff:\n- Summarize current state\n- Note any pending work\n- Write handoff to molecule state\n\nThis enables the Deacon to burn and respawn cleanly.","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-30T16:09:54.469207-08:00","updated_at":"2025-12-30T16:55:07.166448-08:00","closed_at":"2025-12-30T16:55:07.166448-08:00","close_reason":"Context usage command not implemented (gt context --usage unavailable)","dependencies":[{"issue_id":"gt-mol-hb9","depends_on_id":"gt-mol-265","type":"parent-child","created_at":"2025-12-30T16:09:54.55394-08:00","created_by":"gastown/polecats/rictus"},{"issue_id":"gt-mol-hb9","depends_on_id":"gt-mol-dna","type":"blocks","created_at":"2025-12-30T16:09:54.668866-08:00","created_by":"gastown/polecats/rictus"}]}
{"id":"gt-mol-hbu8","title":"Move disk 7: A → B","description":"Move disk 7 from peg A to peg B. (Move 832/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.99145-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-hckq","title":"Move disk 5: C → A","description":"Move disk 5 from peg C to peg A. (Move 848/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.998121-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-hg12","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 223/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.763424-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-hgfe","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 126/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.731936-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-hh89","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 254/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.773584-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-hi2","title":"towers-of-hanoi-10","description":"AGENT EXECUTION PROTOCOL - Towers of Hanoi (10 disks, 1023 moves)\n\nPURPOSE: This is a durability proof, not computation. Steps are pre-computed.\nYour job is to execute them mechanically, proving crash-recovery at scale.\n\nEXECUTION LOOP:\n1. Find current state: bd mol current \u003cmol-id\u003e\n2. Find the next ready (unblocked) step\n3. Close it: bd close \u003cstep-id\u003e\n4. Repeat until no ready steps remain\n\nON RESUME (after crash/restart/handoff):\n- Same as fresh start. The molecule IS the state.\n- Query `bd mol current \u003cmol-id\u003e`, continue from there.\n- No memory of previous session needed.\n\nDO:\n- Close steps as fast as possible (they're trivial mechanical moves)\n- Use `gt handoff` when context fills (proactive cycling)\n- Trust the pre-computed solution - every move is already correct\n\nDO NOT:\n- Try to solve Hanoi yourself - moves are already computed\n- Mark steps in_progress - just close them directly\n- Ask for permission - this is GUPP territory, just execute\n- Stop for human input - run autonomously until complete\n\nMONITORING:\n- Progress: Count closed children of the molecule\n- For mega-molecules: Use convoy dashboard when available\n- Completion: All steps closed = molecule complete\n\nThis proves Gas Town can execute arbitrarily long workflows with\nnondeterministic idempotence - different sessions, same outcome.\n","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-31T01:21:38.692768-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"epic"}
{"id":"gt-mol-hj5y","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 684/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.930756-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-hltwt","title":"Move disk 5: C → A","description":"Move disk 5 from peg C to peg A. (Move 944/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.039032-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-hluu","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 414/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.828174-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-hoeo","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 241/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.769355-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-hojr","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 495/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.858129-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-hs9k","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 643/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.914368-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-hv1z","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 351/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.806179-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-hzjs","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 136/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.735046-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-i1mr","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 798/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.977347-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-i459","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 586/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.892343-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-i5fa","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 861/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.003553-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-i6zm","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 222/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.763092-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-i90b","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 438/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.837341-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ib3gh","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 959/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.045652-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ibmj","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 830/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.990633-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ibu3","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 419/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.829923-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-idl","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 42/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.706223-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-idol","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 926/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.031394-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-iklx","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 440/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.838067-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-iocl","title":"Move disk 6: B → A","description":"Move disk 6 from peg B to peg A. (Move 160/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.743023-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ioxc","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 700/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.937533-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ipp4","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 501/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.860322-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-irj5","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 855/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.001029-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-iu2n","title":"Move disk 6: C → B","description":"Move disk 6 from peg C to peg B. (Move 672/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.925923-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-iu46","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 680/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.929126-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-iubu","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 149/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.739077-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-iufb","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 546/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.877213-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-iur1","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 588/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.893091-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ixqu","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 646/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.91551-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ixx","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 60/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.712023-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-iys","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 41/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.70594-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-izdl","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 814/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.984006-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-j4g0","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 530/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.871223-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-j8zn","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 762/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.962493-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-j9ag","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 233/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.766678-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-j9cq","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 901/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.020576-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-jagd","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 227/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.764731-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-jcll","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 202/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.75666-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-jdmf","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 228/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.765055-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-je7f","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 410/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.826765-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-je7m","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 930/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.033098-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-jejt","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 753/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.958839-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-jerl","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 287/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.78451-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-jfcv","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 774/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.967432-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-jhipz","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 970/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.050831-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-jj0u","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 607/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.900316-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-jozh","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 923/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.030124-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-jpg","title":"Self-review","description":"Review your own changes. Look for:\n- Bugs and edge cases\n- Style issues\n- Missing error handling\n- Security concerns\n\nFix any issues found before proceeding.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:47:48.59976-08:00","updated_at":"2025-12-27T21:29:55.072671-08:00","dependencies":[{"issue_id":"gt-mol-jpg","depends_on_id":"gt-mol-uau","type":"blocks","created_at":"2025-12-25T19:47:48.687264-08:00","created_by":"mayor"}],"deleted_at":"2025-12-27T21:29:55.072671-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mol-jsnn","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 922/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.029709-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-jy5","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 71/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.715283-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-jy61","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 275/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.780515-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-jzhvm","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 980/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.055205-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-k08s","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 181/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.749882-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-k1kw","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 562/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.883261-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-k1qi","title":"Move disk 6: C → B","description":"Move disk 6 from peg C to peg B. (Move 864/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.004827-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-k238","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 551/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.879108-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-k23v","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 324/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.796991-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-k2e0","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 476/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.851187-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-k3r","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 26/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.701531-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-k42h","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 844/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.996454-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-k6op","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 310/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.792235-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-k761","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 306/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.790904-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-kavc","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 534/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.872734-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-kbeg","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 778/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.969093-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-kelw","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 297/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.787861-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-kf9","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 116/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.728887-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-kg76","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 159/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.742696-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-khib","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 193/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.753738-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-khsh","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 699/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.937146-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-kjbk","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 585/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.891959-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-kk6n","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 523/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.868583-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-kl1","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 95/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.722486-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-kldbm","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 1005/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.06635-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-kmeb9","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 1004/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.06589-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ksg","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 70/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.714993-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ksjf","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 314/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.793607-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ktbh","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 382/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.816982-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ku7y","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 422/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.831269-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-kul8","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 724/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.947168-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-kum0","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 472/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.849744-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-kzd5","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 721/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.945986-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-kzx2","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 877/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.010329-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-l2lt","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 554/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.880248-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-l2um","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 269/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.778555-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-l7ot","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 213/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.76018-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-lajm","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 711/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.941962-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-lde4","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 729/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.949219-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ldnq","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 692/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.933962-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-lfmh","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 174/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.747596-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-lhnx","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 715/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.943566-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-lhzh7","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 961/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.046538-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ljhp","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 332/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.799698-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-loam","title":"Move disk 5: A → B","description":"Move disk 5 from peg A to peg B. (Move 784/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.971537-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-lprc","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 694/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.934766-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-lqdt","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 751/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.958023-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-lsxq","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 265/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.777225-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-luen","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 598/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.896881-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-lups","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 376/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.814888-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-lvec","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 606/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.899933-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-lvw","title":"Detect cleanup needs","description":"**DETECT ONLY** - Check if cleanup is needed and dispatch to dog.\n\n**Step 1: Preview cleanup needs**\n```bash\ngt doctor -v\n# Check output for issues that need cleaning\n```\n\n**Step 2: If cleanup needed, dispatch to dog**\n```bash\n# Sling session-gc formula to an idle dog\ngt sling mol-session-gc deacon/dogs --var mode=conservative\n```\n\n**Important:** Do NOT run `gt doctor --fix` inline. Dogs handle cleanup.\nThe Deacon stays lightweight - detection only.\n\n**Step 3: If nothing to clean**\nSkip dispatch - system is healthy.\n\n**Cleanup types (for reference):**\n- orphan-sessions: Dead tmux sessions\n- orphan-processes: Orphaned Claude processes\n- wisp-gc: Old wisps past retention\n\n**Exit criteria:** Session GC dispatched to dog (if needed).","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-30T16:09:54.46833-08:00","updated_at":"2025-12-30T16:53:21.659155-08:00","closed_at":"2025-12-30T16:53:21.659155-08:00","close_reason":"Cleanup needed (7 missing agent beads, 4 stale locks, 1 diverged clone) but dog dispatch not available - requires manual 'gt doctor --fix'","dependencies":[{"issue_id":"gt-mol-lvw","depends_on_id":"gt-mol-265","type":"parent-child","created_at":"2025-12-30T16:09:54.534623-08:00","created_by":"gastown/polecats/rictus"},{"issue_id":"gt-mol-lvw","depends_on_id":"gt-mol-znx","type":"blocks","created_at":"2025-12-30T16:09:54.645408-08:00","created_by":"gastown/polecats/rictus"}]}
{"id":"gt-mol-lwx","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 100/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.724014-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-lx8s","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 197/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.755033-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-lxmc","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 839/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.994368-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ly9t","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 908/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.023772-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-lyq7","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 550/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.878743-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-lzpvs","title":"Move disk 5: A → B","description":"Move disk 5 from peg A to peg B. (Move 976/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.053491-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-m06t","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 675/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.927113-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-m15qm","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 964/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.04784-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-m1db","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 450/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.841712-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-m2ps","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 129/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.732841-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-m5g","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 7/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.695776-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-m5ja","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 764/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.963323-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-m71e8","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 1002/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.065003-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-m7y8","title":"Move disk 5: C → A","description":"Move disk 5 from peg C to peg A. (Move 560/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.882483-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-m80i","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 797/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.976923-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-m8om","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 602/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.898414-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-m8ux","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 883/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.012872-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mamd","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 719/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.94517-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mdri","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 356/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.807942-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mefy","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 374/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.814209-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-menj","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 732/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.950412-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mezwj","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 956/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.044341-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mfyh","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 436/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.836626-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mgjpq","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 946/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.039912-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mhvc","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 209/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.758915-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mib2","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 366/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.811418-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mits","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 458/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.844609-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-miyd","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 200/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.756024-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mkmit","title":"Move disk 5: B → C","description":"Move disk 5 from peg B to peg C. (Move 1008/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.067685-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mkyf","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 938/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.036484-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mn6lg","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 958/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.045219-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mol0","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 513/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.864772-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mpvu","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 502/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.860684-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mqiy","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 846/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.99729-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mrf4","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 539/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.874589-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mt53","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 139/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.735977-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mw8","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 67/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.714113-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mwic7","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 1013/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.069956-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mwz87","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 985/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.05747-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mx2k","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 427/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.833325-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mzli","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 266/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.77754-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-mzrd","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 493/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.857402-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-n15","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 43/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.706519-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-n3jz","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 412/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.827462-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-n5xc","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 733/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.950813-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-n63z","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 237/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.768041-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-n6sf","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 843/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.99603-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-n7c2","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 173/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.747262-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-n82k","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 322/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.79631-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-nck","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 28/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.702129-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ne3y","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 409/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.826406-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-nfbkk","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 954/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.043496-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ngid","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 289/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.785177-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ngv","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 39/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.705361-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-nic4","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 916/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.027166-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-nimm","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 885/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.013734-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-nnsq","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 857/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.001872-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-nol3","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 572/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.887058-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-np5","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 93/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.721897-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ns9r","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 169/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.745948-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-nsix","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 446/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.840224-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-nsq4","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 142/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.736894-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-nxvx","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 443/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.839149-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-o1ey","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 452/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.842433-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-o33e0","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 1003/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.065455-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-o3me","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 388/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.819089-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-o59u","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 849/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.998527-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-o64x","title":"Move disk 5: B → C","description":"Move disk 5 from peg B to peg C. (Move 720/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.945582-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-o8ee","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 662/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.921976-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-o9wa","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 667/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.923964-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ocqp","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 456/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.843884-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-od5w","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 776/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.968285-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-odf5","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 610/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.901485-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-oe44","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 661/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.921581-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ogm","title":"Move disk 5: A → B","description":"Move disk 5 from peg A to peg B. (Move 112/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.72768-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-oher","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 810/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.982301-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-oin1","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 626/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.907686-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-oiu0","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 164/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.744335-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-oqve","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 503/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.861041-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-os1","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 55/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.710269-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-os2r","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 882/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.012443-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ouj","title":"Move disk 2: B → C","description":"Move disk 2 from peg B to peg C.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:13:36.366791-08:00","updated_at":"2025-12-31T01:27:23.921829-08:00","close_reason":"Closed","deleted_at":"2025-12-31T01:27:23.921829-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-oyv","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 36/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.704493-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ozjf","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 708/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.940739-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ozou","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 594/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.895375-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-p13","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 69/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.714702-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-p1b0","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 898/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.019306-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-p2bf","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 353/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.806901-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-p2xd","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 625/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.907299-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-p31u","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 547/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.877586-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-p3bx","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 920/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.028868-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-p3k","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 14/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.697884-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-p50b9","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 949/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.041298-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-p5zm","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 467/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.847901-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-p8h1","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 652/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.917878-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-p8z9","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 401/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.823631-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-p9z9","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 561/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.882874-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-pa17","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 565/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.88439-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-pdpt","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 664/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.922758-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-pgh9","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 177/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.748564-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-pipa","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 907/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.02334-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-pjkg","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 538/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.874216-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-pjp1","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 204/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.75733-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-pko9","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 540/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.874947-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-plx0","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 132/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.733788-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-pm8","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 1/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.69384-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-pmh","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 61/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.712323-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-poch","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 154/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.740665-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-pop","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 77/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.717087-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-powo","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 905/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.022464-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ppq0","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 466/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.847519-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-pr4s","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 497/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.858861-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-prm0","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 261/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.775909-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-prta","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 345/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.804124-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-pvzh","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 542/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.875717-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-pxo5","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 252/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.772913-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-q3xb","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 226/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.764407-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-q88","title":"Move disk 1: C → B","description":"Move disk 1 from peg C to peg B.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:13:36.365876-08:00","updated_at":"2025-12-31T01:27:23.921829-08:00","close_reason":"Closed","deleted_at":"2025-12-31T01:27:23.921829-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-q93o","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 150/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.739382-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qa55","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 558/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.881748-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qbws","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 630/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.909259-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qdc","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 79/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.717686-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qdhr","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 803/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.979409-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qdj1","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 143/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.737229-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qesl","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 434/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.835892-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qfrm","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 639/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.912806-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qgpu","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 423/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.831804-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qh8l","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 910/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.02461-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qhh","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 99/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.723717-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qhso","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 445/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.83987-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qhto","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 339/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.802072-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qko6","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 461/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.8457-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ql1v","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 172/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.746935-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ql2c","title":"Move disk 9: B → C","description":"Move disk 9 from peg B to peg C. (Move 768/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.96494-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qlai","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 490/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.856322-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qmfy","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 449/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.841342-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qmrur","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 950/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.041762-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qq0z","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 613/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.902659-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-quy","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 4/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.694814-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qxkn","title":"Move disk 5: B → C","description":"Move disk 5 from peg B to peg C. (Move 240/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.76903-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qxqi","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 881/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.01202-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qy9x","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 549/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.878356-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qyst2","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 978/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.054345-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qz3","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 54/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.709969-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-qzbm","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 168/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.745623-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-r0bj","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 378/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.815601-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-r0lc","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 671/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.9255-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-r1gp","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 703/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.938718-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-r3g4","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 579/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.889692-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-r4s1","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 765/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.963729-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-r5co","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 318/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.794949-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-r6tu","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 387/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.81874-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-r7itm","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 991/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.060217-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-r9fn","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 744/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.955197-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-rax","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 76/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.71679-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-rbhq","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 737/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.952395-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-rc2o","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 788/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.973196-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-rcju","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 580/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.890066-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-rd5u","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 915/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.026741-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-rdz1","title":"Move disk 5: B → C","description":"Move disk 5 from peg B to peg C. (Move 336/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.80106-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-rinm","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 722/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.946385-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-rk9v","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 243/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.769995-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-rlb6","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 806/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.980641-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-rp8z","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 591/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.894244-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-rqt0","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 670/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.925108-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-rvf3u","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 947/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.040355-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-rxan","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 264/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.776898-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-rylc","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 231/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.766024-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ryzm","title":"Move disk 5: B → C","description":"Move disk 5 from peg B to peg C. (Move 816/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.984838-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-rzog","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 394/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.821151-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-s0j98","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 1019/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.072608-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-s3z9","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 216/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.761152-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-s5r10","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 943/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.038617-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-s80r","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 725/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.947566-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-s86y","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 469/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.848625-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-sbo5","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 577/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.888942-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-sebo","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 834/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.992284-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-sed","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 66/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.713824-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-sf3","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 75/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.716478-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-sivl","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 615/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.903424-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-sjip","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 508/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.862927-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-skyk","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 361/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.809689-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-slqq","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 657/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.919861-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-snug","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 141/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.736586-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-so4h","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 215/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.760834-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-spri","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 636/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.911592-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-sr1","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 8/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.696094-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-suv3","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 220/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.762441-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-suxg","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 487/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.855225-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-suzg","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 514/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.865138-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-sv7r","title":"Move disk 5: A → B","description":"Move disk 5 from peg A to peg B. (Move 688/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.932367-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-swzc","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 756/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.960066-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-sy54","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 532/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.871977-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-t088","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 431/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.834795-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-t1b","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 44/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.706802-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-t1cg","title":"Move disk 5: A → B","description":"Move disk 5 from peg A to peg B. (Move 400/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.823287-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-t1tz","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 219/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.762122-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-t2n2","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 206/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.757954-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-t395","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 429/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.834065-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-t3p","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 57/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.711071-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-t4q5","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 342/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.803088-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-t62v","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 259/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.775237-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-t715","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 763/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.962916-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-t8jr","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 417/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.82923-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-t8lm","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 344/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.803784-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-t8o7","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 903/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.021539-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-t8tc","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 660/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.921162-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-tc9l","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 651/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.917466-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-teau","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 211/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.759545-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-tejz","title":"Move disk 7: C → A","description":"Move disk 7 from peg C to peg A. (Move 320/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.795621-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-tfn9","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 807/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.98104-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-thn7","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 760/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.961698-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-tkmq","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 632/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.910023-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-tled","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 914/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.026323-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-tpoq","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 925/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.030973-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-tqik","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 917/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.027593-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-tqn2","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 311/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.792584-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-tquk","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 407/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.825732-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-tqzt","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 894/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.017528-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-trqd","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 489/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.855947-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-tsat","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 510/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.863674-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ttk2a","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 990/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.059769-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-tup","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 45/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.707081-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-tw5d","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 889/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.015411-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-u09n","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 822/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.987319-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-u14u","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 655/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.919063-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-u1ff","title":"Move disk 10: A → C","description":"Move disk 10 from peg A to peg C. (Move 512/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.864404-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-u2d6","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 940/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.037358-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-u2lcy","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 993/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.061077-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-u3vk","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 375/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.814557-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-u4jf","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 887/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.014577-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-u618","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 292/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.786167-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-u652","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 267/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.777868-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-u9c8","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 713/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.942774-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-uau","title":"Implement","description":"Implement the solution for gt-8tmz.10. Follow codebase conventions.\nFile discovered work as new issues with bd create.\n\nMake regular commits with clear messages.\nKeep changes focused on the assigned issue.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:47:48.599004-08:00","updated_at":"2025-12-27T21:29:55.097728-08:00","dependencies":[{"issue_id":"gt-mol-uau","depends_on_id":"gt-mol-a8b","type":"blocks","created_at":"2025-12-25T19:47:48.614361-08:00","created_by":"mayor"}],"deleted_at":"2025-12-27T21:29:55.097728-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mol-uctx","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 638/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.912398-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-udhz","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 826/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.988961-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-udls","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 701/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.93793-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-uew","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 65/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.713516-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-uexx","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 835/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.992695-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ui21","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 359/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.808979-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ui9s","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 481/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.853024-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-uioo","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 191/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.75307-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-uldfi","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 953/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.043081-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-uoc5","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 123/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.731014-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-uqha","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 741/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.95399-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-us09","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 574/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.887834-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-usp2","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 563/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.883632-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-uup4","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 748/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.956827-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-uwfg","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 770/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.965768-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-uxvz","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 453/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.842792-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-v0p1z","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 972/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.051752-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-v3eq","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 802/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.978991-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-v4r9","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 214/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.760513-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-v54w","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 439/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.837694-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-v5v","title":"Verify initial state","description":"All 3 disks stacked on peg A. Largest on bottom.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:13:36.36487-08:00","updated_at":"2025-12-31T01:27:23.921829-08:00","close_reason":"Closed","deleted_at":"2025-12-31T01:27:23.921829-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-v6cb","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 893/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.017108-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-v6da","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 890/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.015854-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-v87e","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 175/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.747913-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-v8mw","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 628/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.908474-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-v94h","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 596/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.896128-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-v9dn","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 678/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.928324-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-van0u","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 1017/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.071721-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-vbtq6","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 945/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.039468-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-vbx3","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 801/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.978575-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-vdlnu","title":"Verify final state","description":"All 10 disks now on peg C. Tower intact, all moves were legal.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.074868-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-vdth","title":"Move disk 5: C → A","description":"Move disk 5 from peg C to peg A. (Move 368/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.812121-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-vepl8","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 981/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.05564-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-vgdg","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 865/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.005234-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-vmlk","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 767/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.964526-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-vnfgo","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 967/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.049505-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-vo8d","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 872/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.008206-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-vpq6","title":"Move disk 4: C → B","description":"Move disk 4 from peg C to peg B. (Move 936/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.03562-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-vs57","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 873/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.008632-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-vsj","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 22/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.700342-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-vsxh","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 705/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.939526-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-vti2","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 165/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.744652-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-vvbq","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 536/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.873455-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-vw5x","title":"Move disk 5: B → C","description":"Move disk 5 from peg B to peg C. (Move 624/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.90689-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-vwg5","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 614/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.903028-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-vyzd","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 817/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.985239-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-vzyw","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 166/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.744971-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-w1v3","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 735/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.951601-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-w2kq","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 471/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.849368-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-w2z","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 37/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.704783-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-w35v","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 402/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.82397-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-w5i6","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 597/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.896494-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-w6dq6","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 1007/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.067233-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-w8y9","title":"Move disk 5: C → A","description":"Move disk 5 from peg C to peg A. (Move 464/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.846772-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-w96l","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 836/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.993104-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-wcnj4","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 977/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.053922-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-wcve","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 183/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.750521-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-wfx2","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 235/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.767361-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-wg42","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 364/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.810741-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-wix8","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 575/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.888219-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-wk29","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 323/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.796658-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-wlu4p","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 986/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.057945-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-wmfb","title":"Move disk 5: A → B","description":"Move disk 5 from peg A to peg B. (Move 304/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.790211-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-wnd0","title":"Move disk 5: B → C","description":"Move disk 5 from peg B to peg C. (Move 528/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.870456-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-wpzj","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 404/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.824668-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-wqms","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 255/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.773909-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-wr7i","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 924/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.030565-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-wrlw","title":"Move disk 8: A → C","description":"Move disk 8 from peg A to peg C. (Move 128/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.732537-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-wt88","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 398/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.822563-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-wu4d","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 459/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.844963-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-wvjx","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 829/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.990216-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-wwtp","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 545/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.876847-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-wxgv","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 535/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.873099-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-wxm5","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 158/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.742363-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-wzwg","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 454/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.843168-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-x03q","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 488/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.855586-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-x0na","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 137/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.735351-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-x1rx","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 217/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.761459-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-x6jm","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 650/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.917091-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xa68","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 229/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.765369-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xb4j","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 171/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.74662-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xb7s","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 698/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.936758-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xbl","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 119/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.729797-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xbzx","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 437/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.836974-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xfcy","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 525/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.869327-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xfsa","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 627/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.908077-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xfvb","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 757/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.960461-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xhmy","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 135/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.734723-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xit1","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 156/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.741643-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xl8","title":"Load context","description":"Run gt prime and bd prime. Verify issue assignment.\nCheck inbox for any relevant messages.\n\nRead the assigned issue (gt-8tmz.34) and understand the requirements.\nIdentify any blockers or missing information.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:47:10.031297-08:00","updated_at":"2025-12-27T21:29:55.122589-08:00","deleted_at":"2025-12-27T21:29:55.122589-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mol-xmm18","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 1012/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.069494-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xmy4","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 809/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.981872-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xn4j","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 430/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.834428-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xnaq","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 279/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.781825-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xo7j","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 932/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.033938-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xo9x","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 859/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.002721-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xtv4","title":"Move disk 4: B → A","description":"Move disk 4 from peg B to peg A. (Move 712/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.942356-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xuej","title":"Move disk 5: A → B","description":"Move disk 5 from peg A to peg B. (Move 496/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.858484-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xui8","title":"Move disk 5: C → A","description":"Move disk 5 from peg C to peg A. (Move 752/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.958431-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xyw9","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 742/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.954386-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xzty","title":"Move disk 6: A → C","description":"Move disk 6 from peg A to peg C. (Move 608/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.900695-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-xzv","title":"Verify final state","description":"All 3 disks now on peg C. Tower intact, all moves were legal.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:13:36.36739-08:00","updated_at":"2025-12-31T01:27:23.921829-08:00","close_reason":"Closed","deleted_at":"2025-12-31T01:27:23.921829-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-y29x","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 526/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.869696-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-y2n1","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 909/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.024196-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-y2od","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 296/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.787524-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-y32n","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 478/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.851899-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-y3qq","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 635/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.9112-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-y7n","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 73/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.715883-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-y9e1","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 771/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.966165-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-y9ll","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 415/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.828524-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-yb2u","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 383/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.817339-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-yb7a","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 739/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.953199-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-yckw","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 236/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.767686-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-yd7","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 118/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.729491-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ye5k","title":"Move disk 9: A → B","description":"Move disk 9 from peg A to peg B. (Move 256/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.774249-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-yeh6d","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 963/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.047409-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-yg71","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 853/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.000177-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ygem","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 799/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.977753-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ygi1","title":"Move disk 4: A → C","description":"Move disk 4 from peg A to peg C. (Move 248/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.771608-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-yh1","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 111/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.727352-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-yh44","title":"Move disk 3: B → C","description":"Move disk 3 from peg B to peg C. (Move 852/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.999769-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-yixi","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 603/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.898803-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ym28g","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 1021/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.073506-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ymr","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 46/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.707402-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ypxl","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 618/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.904577-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ysmpw","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 1009/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.068116-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ytar","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 775/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.967864-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ytcg","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 647/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.915923-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ywt0","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 151/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.739701-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-yyxz","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 377/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.815249-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-yzo5","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 335/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.800723-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-z1gb","title":"Move disk 3: A → B","description":"Move disk 3 from peg A to peg B. (Move 820/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.986474-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-z2sb","title":"Move disk 6: C → B","description":"Move disk 6 from peg C to peg B. (Move 288/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.784841-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-z3l2","title":"Move disk 5: C → A","description":"Move disk 5 from peg C to peg A. (Move 272/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.779528-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-z71l","title":"Move disk 8: A → C","description":"Move disk 8 from peg A to peg C. (Move 896/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.018441-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-z8mh","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 590/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.893863-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-zagk","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 327/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.797996-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-zbiv","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 570/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.886294-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-zbo4","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 706/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.939942-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-zd55","title":"Move disk 2: A → C","description":"Move disk 2 from peg A to peg C. (Move 746/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.956014-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-zey7","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 553/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.879867-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-zf0t","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 239/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.768703-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-zf5a","title":"Move disk 1: C → A","description":"Move disk 1 from peg C to peg A. (Move 611/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.901878-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-zj2s","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 331/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.799357-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-zjga","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 198/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.755357-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-zjxh","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 273/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.779869-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-zm1","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 114/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.728274-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-zo7j","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 285/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.78383-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-zp1c","title":"Move disk 2: C → B","description":"Move disk 2 from peg C to peg B. (Move 246/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.770977-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-zru4","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 812/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.983137-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-ztdr","title":"Move disk 1: A → B","description":"Move disk 1 from peg A to peg B. (Move 637/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.911997-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-zvc7","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 507/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.862546-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-zvcx","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 884/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:39.013305-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-zvfh","title":"Move disk 3: C → A","description":"Move disk 3 from peg C to peg A. (Move 428/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.833703-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-zvhz","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 555/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.880609-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-zxgm","title":"Move disk 2: B → A","description":"Move disk 2 from peg B to peg A. (Move 130/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.733136-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mol-zzrm","title":"Move disk 1: B → C","description":"Move disk 1 from peg B to peg C. (Move 207/1023)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:21:38.75826-08:00","updated_at":"2025-12-31T01:27:37.010792-08:00","deleted_at":"2025-12-31T01:27:37.010792-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-mqbm","title":"Digest: mol-deacon-patrol","description":"Patrol: TRACER BULLET SUCCESS - gt-oiv0 merged to main, furiosa completed and exited cleanly, 0 polecats now, all agents up","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-22T23:13:30.087307-08:00","updated_at":"2025-12-27T21:26:05.441745-08:00","deleted_at":"2025-12-27T21:26:05.441745-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mqtqs","title":"Merge: capable-mjw47ef9","description":"branch: polecat/capable-mjw47ef9\ntarget: main\nsource_issue: capable-mjw47ef9\nrig: gastown\nagent_bead: gt-gastown-polecat-capable","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T19:03:40.714326-08:00","updated_at":"2026-01-01T19:07:31.153945-08:00","closed_at":"2026-01-01T19:07:31.153945-08:00","close_reason":"Merged to main at 84450468","created_by":"gastown/polecats/capable"}
{"id":"gt-mqu65","title":"Digest: mol-deacon-patrol","description":"Patrol 1: All agents healthy, no messages, no issues","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T10:13:57.610811-08:00","updated_at":"2025-12-27T21:26:03.601385-08:00","deleted_at":"2025-12-27T21:26:03.601385-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mquab","title":"Digest: mol-deacon-patrol","description":"Patrol 20: All healthy, routine cycle complete","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T10:51:57.921364-08:00","updated_at":"2025-12-27T21:26:03.454115-08:00","deleted_at":"2025-12-27T21:26:03.454115-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mr432","title":"Digest: mol-deacon-patrol","description":"Patrol 10: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:42:19.234818-08:00","updated_at":"2025-12-27T21:26:00.887861-08:00","deleted_at":"2025-12-27T21:26:00.887861-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-msea3","title":"Merge: dementus-mjw46vz4","description":"branch: polecat/dementus-mjw46vz4\ntarget: main\nsource_issue: dementus-mjw46vz4\nrig: gastown\nagent_bead: gt-gastown-polecat-dementus","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T19:05:57.621708-08:00","updated_at":"2026-01-01T19:12:07.190383-08:00","closed_at":"2026-01-01T19:12:07.190383-08:00","close_reason":"All commits already on main - duplicate MR","created_by":"gastown/polecats/dementus"}
{"id":"gt-mtj4","title":"Digest: mol-deacon-patrol","description":"Patrol 11: OK","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T20:59:34.040451-08:00","updated_at":"2025-12-27T21:26:04.935865-08:00","deleted_at":"2025-12-27T21:26:04.935865-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mv8ve","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 4: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-01T07:00:41.739402-08:00","updated_at":"2026-01-01T07:00:41.739402-08:00","closed_at":"2026-01-01T07:00:41.739367-08:00"}
{"id":"gt-mwiuk","title":"gt nudge doesn't work with crew addresses","description":"## Bug\n\n`gt nudge beads/crew/dave \"message\"` fails because it uses the polecat session manager which produces wrong session names.\n\n## Expected\nSession name: `gt-beads-crew-dave` (hyphen)\n\n## Actual \nSession name: `gt-beads-crew/dave` (slash, from polecat manager)\n\n## Root Cause\n\nIn nudge.go line 46-57, parseAddress returns polecatName=`crew/dave`, then SessionName keeps the slash.\n\n## Fix\n\nDetect `crew/` prefix and use crewSessionName() instead.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-26T15:43:41.099431-08:00","updated_at":"2025-12-27T21:29:45.856721-08:00","deleted_at":"2025-12-27T21:29:45.856721-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-mwpcq","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 3: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-28T11:19:40.396361-08:00","updated_at":"2025-12-28T11:19:40.396361-08:00","closed_at":"2025-12-28T11:19:40.396326-08:00"}
{"id":"gt-mx6s","title":"Witness patrol wisp with polecat leases","description":"Witness should maintain a rolling patrol wisp that tracks active polecats:\n\n```\nwisp-witness-patrol\n├── lease: furiosa (boot → working → done)\n├── lease: nux (working)\n└── lease: slit (done, closed)\n```\n\nWhen POLECAT_STARTED arrives:\n- bd mol bond mol-polecat-lease wisp-patrol --var polecat=X\n\nPatrol loop iterates leases:\n- gt peek $polecat\n- If idle: gt nudge\n- If shutdown received: close lease\n\nWhen all leases closed:\n- bd mol squash wisp-xxx --summary='N polecats processed'\n\nRequires mol-polecat-lease proto definition.","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-22T22:01:13.640901-08:00","updated_at":"2025-12-27T21:29:53.099772-08:00","dependencies":[{"issue_id":"gt-mx6s","depends_on_id":"gt-cp2s","type":"blocks","created_at":"2025-12-22T22:31:40.126113-08:00","created_by":"daemon"},{"issue_id":"gt-mx6s","depends_on_id":"gt-83k0","type":"blocks","created_at":"2025-12-22T22:31:40.204487-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.099772-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-mxxpl","title":"Digest: mol-deacon-patrol","description":"Patrol 2: archived stale Mayor handoffs, all agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:20:50.046144-08:00","updated_at":"2025-12-27T21:26:01.720963-08:00","deleted_at":"2025-12-27T21:26:01.720963-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mxyj","title":"Witness session startup (gt witness start)","description":"Implement gt witness start \u003crig\u003e\n\nShould:\n1. Verify rig exists and has witness/ directory\n2. Check for existing witness session (don't double-start)\n3. Create tmux session: gt-\u003crig\u003e-witness\n4. Start Claude Code with --dangerously-skip-permissions\n5. Send gt prime to load context\n6. Send startup prompt\n\nSimilar pattern to refinery startup in internal/refinery/manager.go","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-20T03:14:20.38027-08:00","updated_at":"2025-12-27T21:29:53.82338-08:00","dependencies":[{"issue_id":"gt-mxyj","depends_on_id":"gt-53w6","type":"parent-child","created_at":"2025-12-20T03:14:37.170879-08:00","created_by":"daemon"},{"issue_id":"gt-mxyj","depends_on_id":"gt-ni6a","type":"blocks","created_at":"2025-12-20T03:14:38.764636-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.82338-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mzal.1","title":"Define mol-gastown-boot proto structure","description":"Create the proto molecule definition with all steps.\n\n## Proto Definition\n\nEach step has:\n- Title (step name)\n- Description with Action/Verify/OnStall/OnFail sections\n- Dependencies (Needs: directive)\n\n## Steps\n\n1. ensure-daemon\n - Action: gt daemon status || gt daemon start\n - Verify: daemon PID exists and responding\n\n2. ensure-deacon \n - Action: gt deacon start\n - Verify: session exists, not stalled, heartbeat fresh\n - OnStall: gt nudge deacon/ \"Start patrol.\"\n\n3. ensure-witnesses (parallel container)\n - ensure-gastown-witness\n - ensure-beads-witness\n - Verify each: session exists, not stalled\n\n4. ensure-refineries (parallel container) \n - ensure-gastown-refinery\n - ensure-beads-refinery\n - Verify each: session exists, not stalled\n\n5. verify-town-health\n - Action: gt status\n - Verify: all expected agents shown\n\n## Output\n\nProto stored in beads as molecule type issue.\n","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T21:00:08.736237-08:00","updated_at":"2025-12-27T21:29:53.116531-08:00","dependencies":[{"issue_id":"gt-mzal.1","depends_on_id":"gt-mzal","type":"parent-child","created_at":"2025-12-22T21:00:08.736738-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.116531-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mzal.7","title":"Proto marketplace: shareable molecule templates","description":"Enable sharing protos between Gas Town installations.\n\n## Vision\n\nA public registry of protos that users can pull and use:\n\n```bash\ngt proto search \"code review\"\ngt proto install gastown/code-review\ngt sling code-review gastown/Toast --wisp\n```\n\n## Registry Design\n\n### Local Catalog\n`~/gt/molecules/` - user-defined and installed protos\n\n### Remote Registry\n`registry.gastown.dev/protos/` (future)\n- Browse online catalog\n- Version-controlled protos\n- Rating/reviews\n- Usage statistics\n\n## Proto Package Format\n\n```\ngastown-code-review-1.0.0/\n├── PROTO.md # Main proto definition\n├── README.md # Usage documentation\n├── LICENSE # Usage terms\n└── plugins/ # For pluggable molecules\n ├── security/\n └── performance/\n```\n\n## Commands\n\n```bash\ngt proto list # Show installed protos\ngt proto search \u003cquery\u003e # Search registry\ngt proto install \u003cname\u003e # Install from registry\ngt proto publish \u003cpath\u003e # Publish to registry\ngt proto update # Update all installed\n```\n\n## For Now\n\nStart with local catalog. Marketplace is future phase.\nEnsure proto format is registry-compatible from the start.\n","status":"tombstone","priority":3,"issue_type":"feature","created_at":"2025-12-22T21:00:49.124222-08:00","updated_at":"2025-12-27T21:29:57.539089-08:00","dependencies":[{"issue_id":"gt-mzal.7","depends_on_id":"gt-mzal","type":"parent-child","created_at":"2025-12-22T21:00:49.124687-08:00","created_by":"daemon"},{"issue_id":"gt-mzal.7","depends_on_id":"gt-mzal.3","type":"blocks","created_at":"2025-12-22T21:01:01.874557-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.539089-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-mziio","title":"Digest: mol-deacon-patrol","description":"Patrol 10: all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:36:15.586442-08:00","updated_at":"2025-12-27T21:26:00.365273-08:00","deleted_at":"2025-12-27T21:26:00.365273-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-mzzl9","title":"Digest: mol-deacon-patrol","description":"Patrol complete: inbox checked (4 msgs, 1 escalation resolved), all agents healthy (2 witnesses, 2 refineries, 3 polecats, 5 crew), cleaned 66 abandoned wisps","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:42:46.3183-08:00","updated_at":"2025-12-27T21:26:02.46653-08:00","deleted_at":"2025-12-27T21:26:02.46653-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-n33lx","title":"Digest: mol-deacon-patrol","description":"Patrol 16: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T10:19:26.175734-08:00","updated_at":"2025-12-27T21:26:03.519688-08:00","deleted_at":"2025-12-27T21:26:03.519688-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-n3i7","title":"Digest: mol-deacon-patrol","description":"Patrol 15: OK","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T21:00:56.816258-08:00","updated_at":"2025-12-27T21:26:04.902723-08:00","deleted_at":"2025-12-27T21:26:04.902723-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-n4vz7","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 5: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-28T11:20:57.158949-08:00","updated_at":"2026-01-01T17:33:45.292139-08:00","deleted_at":"2026-01-01T17:33:45.292139-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-n508","title":"Merge: gt-70b3","description":"type: merge-request\nbranch: polecat/Rictus\ntarget: main\nsource_issue: gt-70b3\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-18T21:56:57.840796-08:00","updated_at":"2025-12-27T21:29:54.07535-08:00","deleted_at":"2025-12-27T21:29:54.07535-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-n5uy","title":"Digest: mol-deacon-patrol","description":"Patrol OK: 8 polecats, 4 witness/refineries up","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-23T01:06:42.456269-08:00","updated_at":"2025-12-27T21:26:05.397271-08:00","deleted_at":"2025-12-27T21:26:05.397271-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-n6mjs","title":"Session ended: gt-gastown-crew-george","status":"closed","priority":2,"issue_type":"event","created_at":"2026-01-03T11:28:51.084791-08:00","updated_at":"2026-01-04T16:41:00.357779-08:00","closed_at":"2026-01-04T16:41:00.357779-08:00","close_reason":"Archived session telemetry","created_by":"gastown/crew/george"}
{"id":"gt-n7cm4","title":"Digest: mol-deacon-patrol","description":"Patrol 14: Green","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:34:39.56723-08:00","updated_at":"2025-12-27T21:26:02.527491-08:00","deleted_at":"2025-12-27T21:26:02.527491-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-n7z7","title":"Bug: refinery --foreground detects parent session as already running","description":"When gt refinery start gastown runs:\n1. Creates tmux session gt-gastown-refinery\n2. Sends 'gt refinery start gastown --foreground' into the session\n3. The foreground command checks HasSession() - finds the session it's inside\n4. Returns 'already running' error\n\nThe foreground mode check should either:\n- Skip the tmux session check (only check PID)\n- Use a different indicator that the daemon loop is running\n- Pass a flag to indicate we're being called from the background starter\n\nWorkaround: Manually run 'gt refinery start gastown --foreground' from a fresh session.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-20T00:56:20.326369-08:00","updated_at":"2025-12-27T21:29:53.873282-08:00","deleted_at":"2025-12-27T21:29:53.873282-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-n8q36","title":"Digest: mol-deacon-patrol","description":"Patrol 8: all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T21:56:05.014442-08:00","updated_at":"2025-12-27T21:26:00.550769-08:00","deleted_at":"2025-12-27T21:26:00.550769-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-n8s1.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. \"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-n8s1\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T22:06:13.827065-08:00","updated_at":"2025-12-27T21:29:55.695738-08:00","deleted_at":"2025-12-27T21:29:55.695738-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-n8u5","title":"bd list --parent: filter by parent issue","description":"Add --parent flag to bd list to filter issues by parent.\n\nExample:\n```bash\nbd list --parent=gt-h5n --status=open\n```\n\nWould show all open children of gt-h5n.\n\nUseful for:\n- Checking epic progress\n- Finding swarmable work within an epic\n- Molecule step listing","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-23T01:50:56.224031-08:00","updated_at":"2025-12-27T21:29:56.233779-08:00","deleted_at":"2025-12-27T21:29:56.233779-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-n9o2","title":"save-state","description":"Update handoff bead with new state.\n\nPersist nudge counts and pending actions.\n\nNeeds: execute-actions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T01:41:54.507176-08:00","updated_at":"2025-12-25T15:52:57.676488-08:00","deleted_at":"2025-12-25T15:52:57.676488-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-na7y.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. \"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-na7y\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T20:40:59.065447-08:00","updated_at":"2025-12-27T21:29:55.463886-08:00","deleted_at":"2025-12-27T21:29:55.463886-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-nam3","title":"Update docs to reflect molecule-first paradigm","description":"Gas Town is fundamentally a molecule execution engine. Documentation should reflect this more clearly.\n\n## Issues Found\n\n### 1. gt spawn examples show molecule as optional\nREADME.md line 116: `gt spawn --issue \u003cid\u003e # Start polecat on issue`\nShould emphasize: polecats execute molecules, not just issues.\n\n### 2. Architecture.md spawn examples inconsistent\nLine 344 shows molecule: `gt spawn --issue gt-xyz --molecule mol-engineer-in-box`\nLine 1434 shows without: `gt spawn --issue \u003cid\u003e`\n\n### 3. Config vs molecule distinction not clear\noutposts.yaml shows static policy - should note when molecules apply.\n\n### 4. Operational molecules section is good but buried\nLines 430-566 cover operational molecules well. Should be more prominent.\n\n## Updates Needed\n- [ ] README: Update spawn examples to show molecule usage\n- [ ] architecture.md: Ensure all spawn examples include molecules\n- [ ] architecture.md: Add section on \"when config vs molecule\"\n- [ ] architecture.md: Move operational molecules higher in document\n- [ ] Add principle: \"If it requires cognition, it's a molecule\"\n- [ ] federation-design.md: Note that policy can escalate to mol-outpost-assign\n\n## Key Message\nGas Town doesn't spawn workers on issues. It spawns workers on molecules.\nThe issue is just the seed data for the molecule execution.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-20T03:26:31.842406-08:00","updated_at":"2025-12-27T21:29:53.781173-08:00","deleted_at":"2025-12-27T21:29:53.781173-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-nd18","title":"Merge: gt-caih","description":"branch: polecat/furiosa\ntarget: main\nsource_issue: gt-caih\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T01:15:24.2771-08:00","updated_at":"2025-12-27T21:27:22.443431-08:00","deleted_at":"2025-12-27T21:27:22.443431-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-ndm2m","title":"Digest: mol-deacon-patrol","description":"Patrol 19: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:00:27.991392-08:00","updated_at":"2025-12-27T21:26:01.754018-08:00","deleted_at":"2025-12-27T21:26:01.754018-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ne1t","title":"Design molecule step hooks","description":"Hooks that fire between molecule steps. When a bead in a molecule closes, trigger hook that can spawn agent attention to prompts/requests. This enables reactive orchestration - the molecule drives, hooks respond. Gas Town feature built on Beads data plane.","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-21T17:53:14.568075-08:00","updated_at":"2025-12-27T21:29:56.503969-08:00","deleted_at":"2025-12-27T21:29:56.503969-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-ne41f","title":"Hardcoded 'gastown' rig in Deacon patrol context","description":"In prime.go, the Deacon patrol context hardcodes 'gastown' as the rig name.\n\n**Location**: prime.go L807\n\n```go\nrigBeadsDir := filepath.Join(ctx.TownRoot, \"gastown\", \"mayor\", \"rig\")\n```\n\nThis breaks if:\n- The rig is renamed\n- The town has multiple rigs\n- The Deacon is deployed to a different rig\n\n**Fix options**:\n1. Detect rig from context (ctx.Rig)\n2. Use town-level beads for Deacon patrol\n3. Pass rig name as parameter\n\nThe Deacon is a town-level role, so option 2 may be most appropriate.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-25T22:03:12.131973-08:00","updated_at":"2025-12-27T21:29:54.886015-08:00","deleted_at":"2025-12-27T21:29:54.886015-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-neim","title":"Digest: mol-deacon-patrol","description":"Patrol: dave handoff (vision docs, 7 design gaps filed), furiosa tracer bullet on gt-oiv0, all agents up","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-22T22:38:37.785872-08:00","updated_at":"2025-12-27T21:26:05.450107-08:00","deleted_at":"2025-12-27T21:26:05.450107-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-nfyrw","title":"Digest: mol-deacon-patrol","description":"Patrol 2: All healthy, fixed 1 stale lock","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:52:33.556453-08:00","updated_at":"2025-12-27T21:26:01.78709-08:00","deleted_at":"2025-12-27T21:26:01.78709-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ngkd","title":"Work on gt-ogr: Fix rig count in tmux status bar. The cou...","description":"Work on gt-ogr: Fix rig count in tmux status bar. The count is showing wrong. Run 'bd show gt-ogr' for details.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T07:53:01.093695-08:00","updated_at":"2025-12-27T21:29:56.797574-08:00","deleted_at":"2025-12-27T21:29:56.797574-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ngpz","title":"mol-christmas-launch: 3-day execution plan","description":"\n\n---\n\n## Wisp Integration Wave Plan (added 2025-12-21)\n\nDependency-ordered execution for gt-3x0z (Wisp Molecules) + gt-rana (Patrol System):\n\n### Wave 1 (parallel - no blockers)\n- gt-3x0z.1: gt rig init creates .beads-ephemeral/\n- gt-3x0z.2: Configure bd for ephemeral molecule bonding\n- gt-3x0z.3: gt doctor checks for ephemeral health\n- gt-rana.1: Attachment field on pinned beads\n- gt-rana.3: mol-deacon-patrol definition\n\n### Wave 2 (after Wave 1)\n- gt-3x0z.4: gt spawn --molecule bonds ephemeral (GATE 1)\n- gt-rana.2: Daemon attachment detection\n\n### Wave 3-5 (sequential)\n- gt-3x0z.5, gt-3x0z.6 → gt-3x0z.7 → gt-3x0z.8 (GATE 2: squash)\n\n### Wave 6+ (patrol integration)\n- gt-rana.4: Basic patrol runner (needs gt-rana.3 + gt-3x0z.8)\n- gt-3x0z.9: mol-deacon-patrol uses wisp (needs gt-rana.3 + gt-3x0z.8)\n- Then: gt-rana.5-7, gt-3x0z.10-12\n\n### Key Gates\n1. gt-3x0z.4 - spawn/bond unlocks Phase 2\n2. gt-3x0z.8 - squash unlocks patrol integration\n3. gt-rana.4 - patrol runner unlocks Phase 2+ patrol\n","notes":"Postponed to New Year's launch. Christmas was too ambitious.","status":"tombstone","priority":0,"issue_type":"epic","created_at":"2025-12-20T21:06:44.718065-08:00","updated_at":"2025-12-26T16:14:11.120923-08:00","deleted_at":"2025-12-26T16:14:11.120923-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-ngu1","title":"Pinned beads should appear first in mail inbox","description":"Currently bd mail inbox sorts by priority then date, but pinned beads (handoff context) should always appear at the top.\n\n**Current behavior:**\n- Pinned beads mixed in with regular mail based on priority/date\n\n**Expected behavior:**\n- Pinned beads always first in inbox (before priority sorting)\n- This enables handoff beads to be the default first item an agent sees\n\n**Implementation:**\n1. In mail.go runMailInbox(), after filtering, sort pinned beads to top\n2. Within pinned and non-pinned groups, maintain priority/date sort\n\n**Related:**\n- gt-r8ej: Implement pinned beads for handoff state\n- gt-8h4: Pinned Beads epic\n\n**Acceptance criteria:**\n- bd mail inbox shows pinned beads first\n- Pinned beads still sorted by priority/date among themselves\n- Non-pinned mail sorted normally after pinned section","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-20T17:45:24.133521-08:00","updated_at":"2025-12-27T21:29:53.66911-08:00","dependencies":[{"issue_id":"gt-ngu1","depends_on_id":"gt-8h4","type":"parent-child","created_at":"2025-12-20T17:45:31.978176-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.66911-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-nho2","title":"Merge: gt-i4kq","description":"branch: polecat/slit\ntarget: main\nsource_issue: gt-i4kq\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T23:44:37.465886-08:00","updated_at":"2025-12-27T21:27:22.493339-08:00","deleted_at":"2025-12-27T21:27:22.493339-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-ni6a","title":"Witness role template + CLAUDE.md","description":"Create the Witness agent's role context:\n\n1. internal/templates/roles/witness.md.tmpl\n - Witness responsibilities (polecat lifecycle)\n - Core loop description\n - Commands available (gt polecats, bd ready, etc)\n - Escalation protocol\n\n2. Generate CLAUDE.md for witness/ directories\n - Context about the rig\n - Mail address: \u003crig\u003e/witness\n - Startup protocol (gt prime, check inbox)\n\nThe Witness is the per-rig 'pit boss' - NOT a global coordinator. It manages polecats for ONE rig only.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-20T03:14:18.617496-08:00","updated_at":"2025-12-27T21:29:53.831749-08:00","dependencies":[{"issue_id":"gt-ni6a","depends_on_id":"gt-53w6","type":"parent-child","created_at":"2025-12-20T03:14:37.105264-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.831749-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-nimml","title":"Digest: mol-deacon-patrol","description":"Patrol 8: all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:35:05.066322-08:00","updated_at":"2025-12-27T21:26:00.381735-08:00","deleted_at":"2025-12-27T21:26:00.381735-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-njem","title":"Remove gt mail dependency on bd mail commands","description":"Replaced bd mail send/inbox/read/ack with bd create/list/show/close. Messages are now stored as beads issues with type=message.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T17:52:43.47722-08:00","updated_at":"2025-12-27T21:29:56.72792-08:00","deleted_at":"2025-12-27T21:29:56.72792-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-njr","title":"Engineer session restart protocol","description":"Implement session restart flow for when the Engineer needs to split work:\n\n1. Engineer creates subtask(s) in Beads assigned to self\n2. Engineer sends handoff mail to self (🤝 HANDOFF)\n3. Engineer sends restart request to Witness\n4. Witness verifies:\n - Handoff mail exists in Engineer outbox/sent\n - Subtasks filed in Beads\n5. Witness restarts the Refinery session (new Engineer context)\n\nThis enables \"occasionally consistent, eventually convergent\" work patterns.\nThe Refinery continues; the Engineer gets fresh context.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T23:02:48.22994-08:00","updated_at":"2025-12-27T21:29:54.402848-08:00","dependencies":[{"issue_id":"gt-njr","depends_on_id":"gt-h5n","type":"blocks","created_at":"2025-12-16T23:02:56.148564-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.402848-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-nk7i","title":"Digest: mol-deacon-patrol","description":"Patrol 13: OK","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T21:00:14.154408-08:00","updated_at":"2025-12-27T21:26:04.919231-08:00","deleted_at":"2025-12-27T21:26:04.919231-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-nmtp","title":"Refactor builtin_molecules.go (1620 lines)","description":"## Summary\n\nbuiltin_molecules.go is 1620 lines and growing. Should be split into:\n- builtin_molecules.go - registration and common helpers\n- molecules_patrol.go - patrol molecules (deacon, witness, refinery)\n- molecules_work.go - work molecules (polecat-work, ready-work, engineer-in-box)\n- molecules_session.go - session wrappers (crew-session, polecat-session)\n\n## Benefits\n- Easier to find/edit specific molecules\n- Smaller diffs on changes\n- Clear categorization","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-23T01:19:15.540532-08:00","updated_at":"2025-12-27T21:29:57.522496-08:00","deleted_at":"2025-12-27T21:29:57.522496-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-nnfm5","title":"Digest: mol-deacon-patrol","description":"Patrol 8: healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-27T20:21:22.564416-08:00","updated_at":"2025-12-27T21:26:00.173566-08:00","deleted_at":"2025-12-27T21:26:00.173566-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-noih","title":"test pin issue","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T11:45:42.660977-08:00","updated_at":"2025-12-27T21:29:56.059848-08:00","deleted_at":"2025-12-27T21:29:56.059848-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-npnph","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 9: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:44:12.479941-08:00","updated_at":"2025-12-27T21:26:01.402635-08:00","deleted_at":"2025-12-27T21:26:01.402635-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-nq1a","title":"Add 'gt account add' command","description":"Register new account: create config dir, spawn claude with CLAUDE_CONFIG_DIR set, user completes /login, add entry to accounts.yaml.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T03:24:19.444213-08:00","updated_at":"2025-12-27T21:29:56.184087-08:00","dependencies":[{"issue_id":"gt-nq1a","depends_on_id":"gt-58tu","type":"blocks","created_at":"2025-12-23T03:24:34.722487-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.184087-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-nq6j","title":"Rename .beads-ephemeral to .beads-wisps in docs","description":"The wisp storage directory should be named `.beads-wisps/` not `.beads-ephemeral/`.\n\nAlso fix the architecture: Witness and Refinery share mayor/rig's beads and wisps.\nThey don't have separate ephemeral stores.\n\n## Files to update\n- docs/wisp-architecture.md\n- docs/architecture.md \n- ~/gt/docs/patrol-system-design.md\n- ~/gt/CLAUDE.md\n\n## Changes\n1. Rename `.beads-ephemeral/` → `.beads-wisps/` everywhere\n2. Remove incorrect references to witness/.beads-ephemeral/ and refinery/rig/.beads-ephemeral/\n3. Clarify that all rig-level patrols use mayor/rig/.beads-wisps/","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T23:38:04.110859-08:00","updated_at":"2025-12-27T21:29:53.358466-08:00","deleted_at":"2025-12-27T21:29:53.358466-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-nqrh","title":"Auto-restart patrol agents after N loops","description":"All patrol roles should auto-restart to keep context usage low. Role-specific heuristics:\n\n**Deacon**: \n- 20 patrol loops without major incident\n- Immediate restart after ANY extraordinary action (lifecycle request, remediation, escalation)\n- Rationale: Keep context short so there's headroom if something big comes up\n\n**Witness**:\n- Based on polecats processed (not loop count)\n- Restart after processing N polecats (spawns, nudges, decommissions)\n- Suggested N: 10-15 polecats\n\n**Refinery**:\n- Based on MRs processed (not loop count) \n- Restart after processing N merge requests\n- Suggested N: 5-10 MRs (merges are context-heavy)\n\nBenefits:\n- Keeps context fresh (better focus/performance)\n- Reduces cost (quadratic increase vs conversation length)\n- Preserves headroom for handling surprises\n- Simple and predictable (no context estimation needed)\n\nEach role tracks their metric in state file and hands off when threshold reached.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T14:00:00.093148-08:00","updated_at":"2025-12-27T21:29:52.990581-08:00","deleted_at":"2025-12-27T21:29:52.990581-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-nrawz","title":"Digest: mol-deacon-patrol","description":"Patrol 8: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T10:17:21.167521-08:00","updated_at":"2025-12-27T21:26:03.560638-08:00","deleted_at":"2025-12-27T21:26:03.560638-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-nrer7","title":"Digest: mol-deacon-patrol","description":"Patrol 14: Quick cycle, all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:09:00.383617-08:00","updated_at":"2025-12-27T21:26:02.942517-08:00","deleted_at":"2025-12-27T21:26:02.942517-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-nriy","title":"Test: Alpha to Beta","description":"Sibling communication test","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T21:44:00.731578-08:00","updated_at":"2025-12-25T14:12:42.250457-08:00","deleted_at":"2025-12-25T14:12:42.250457-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"message"}
{"id":"gt-nsyy","title":"Merge: gt-h6eq.7","description":"branch: polecat/dag\ntarget: main\nsource_issue: gt-h6eq.7\nrig: gastown","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T11:42:39.260767-08:00","updated_at":"2025-12-27T21:27:22.853057-08:00","deleted_at":"2025-12-27T21:27:22.853057-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-nti8","title":"Polecats should not push branches to remote","description":"## Current Behavior\n\nPolecats push their branches to origin (e.g., `polecat/furiosa`), which pollutes the remote with many short-lived branches.\n\n## Desired Behavior\n\nPolecats should only commit locally. The Refinery handles all remote pushes:\n1. Polecat works on local `polecat/\u003cname\u003e` branch\n2. Polecat signals done (state → idle)\n3. Refinery pulls from local polecat branch\n4. Refinery runs tests, merges to main\n5. Refinery pushes main to remote\n6. If PR review needed, Refinery creates the PR\n\n## Benefits\n\n- Clean remote (no branch pollution)\n- Clear responsibility (Refinery is the quality gate)\n- Simpler cleanup (local branches deleted with worktree)\n- Less noise in GitHub UI\n\n## Trade-offs\n\n- If polecat crashes before Refinery merges, code is lost locally\n- But beads issue remains open, another polecat can redo the work\n- This is acceptable for ephemeral workers\n\n## Implementation\n\nIn polecat CLAUDE.md or landing protocol:\n- Remove `git push origin HEAD` from workflow\n- Replace with just `git commit` + signal done\n- Refinery handles the rest","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-21T14:13:48.804954-08:00","updated_at":"2025-12-27T21:29:56.619965-08:00","deleted_at":"2025-12-27T21:29:56.619965-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-nvql","title":"max Handoff","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T11:47:59.020501-08:00","updated_at":"2025-12-27T21:29:56.04343-08:00","deleted_at":"2025-12-27T21:29:56.04343-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-nxea","title":"Digest: mol-deacon-patrol","description":"Patrol #3: Routine - 6 agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:16:10.287785-08:00","updated_at":"2025-12-27T21:26:04.810885-08:00","deleted_at":"2025-12-27T21:26:04.810885-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-nz6t","title":"Remove unused style helper functions","description":"internal/style/style.go defines RenderSuccess, RenderWarning, RenderError, and RenderInfo helper functions that are never used. Code uses style.Success.Render() directly instead. Either use the helpers consistently or remove them.","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-21T21:34:43.822193-08:00","updated_at":"2025-12-27T21:29:57.907963-08:00","deleted_at":"2025-12-27T21:29:57.907963-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-nzona","title":"Digest: mol-deacon-patrol","description":"Patrol 12: all clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T11:05:30.756048-08:00","updated_at":"2025-12-27T21:26:03.397034-08:00","deleted_at":"2025-12-27T21:26:03.397034-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-o0ooa","title":"Digest: mol-deacon-patrol","description":"Patrol 3: All clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:00:14.192253-08:00","updated_at":"2025-12-27T21:26:04.050859-08:00","deleted_at":"2025-12-27T21:26:04.050859-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-o29j","title":"inspect-workers","description":"Capture output for each working polecat.\n\n```bash\ngt peek \u003crig\u003e/\u003cpolecat\u003e\n```\n\nNeeds: survey-workers","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T01:41:54.506324-08:00","updated_at":"2025-12-25T15:52:57.508608-08:00","deleted_at":"2025-12-25T15:52:57.508608-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-o2wgw","title":"Digest: mol-deacon-patrol","description":"Patrol 18: Mayor OK, 11 sessions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:37:38.841389-08:00","updated_at":"2025-12-27T21:26:02.060659-08:00","deleted_at":"2025-12-27T21:26:02.060659-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-o3is","title":"gt sling pinToHook doesn't set pinned boolean field","description":"## Updated Root Cause Analysis (2025-12-23)\n\nThe issue is **NOT** in gt as originally thought. It's in the beads (bd) codebase.\n\n### What happens:\n1. `bd pin` correctly sets pinned=1 in SQLite\n2. Any subsequent `bd` command (even `bd show`) resets pinned to 0\n3. This happens even with `--no-auto-import` and `--sandbox` flags\n\n### Evidence:\n```bash\n$ bd --no-daemon pin gt-k08o --for=max\n📌 Pinned gt-k08o to max's hook\n\n$ sqlite3 beads.db 'SELECT id, pinned FROM issues WHERE id=\"gt-k08o\"'\ngt-k08o|1 # ← Correct immediately after pin\n\n$ bd --no-daemon --no-auto-import show gt-k08o\n[shows issue without pinned field]\n\n$ sqlite3 beads.db 'SELECT id, pinned FROM issues WHERE id=\"gt-k08o\"' \ngt-k08o|0 # ← WRONG\\! bd show overwrote it\n```\n\n### Where to look:\nThe bug is likely in one of these beads code paths:\n- Some import/hydration running despite --no-auto-import\n- WAL mode not flushing before subsequent reads\n- Multi-repo or redirect handling corrupting pinned field\n\n### Workaround:\nThe handoff bead attachment mechanism (AttachMolecule) works correctly.\nThe pinned field is cosmetic for `bd hook` visibility only.\ngt sling correctly uses AttachMolecule for work assignment.\n\n### Next steps:\nCreate a beads issue to fix this properly in the bd codebase.\nThis is not a gt issue.","notes":"Created beads bug: bd-phtv in ~/gt/beads","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-23T04:41:12.668958-08:00","updated_at":"2025-12-27T21:29:53.040476-08:00","deleted_at":"2025-12-27T21:29:53.040476-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-o40t","title":"gt sling --force: return displaced work to ready pool","description":"When slinging with --force to an agent with occupied hook, the displaced molecule should be returned to the ready pool rather than silently orphaned.\n\nCurrent behavior:\n- --force overwrites the hook attachment\n- Previous work becomes orphaned (still assigned but not pinned)\n\nDesired behavior:\n1. Unpin the displaced molecule (clear assignee, set pinned=false)\n2. Print warning in tool output: 'Warning: displaced gt-xxx back to ready pool'\n3. Proceed with new sling\n\nThis ensures:\n- No silent data loss\n- Agent sees the warning and can act on it\n- Human caller sees what happened\n- Displaced work is discoverable via 'bd ready'\n\nImplementation location: checkHookCollision() or the sling handlers themselves (after the --force check passes)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T16:00:23.713733-08:00","updated_at":"2025-12-27T21:29:55.968712-08:00","deleted_at":"2025-12-27T21:29:55.968712-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-o5ra","title":"Digest: mol-deacon-patrol","description":"Patrol 1: All healthy, no lifecycle requests, 8 active sessions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T20:54:13.477737-08:00","updated_at":"2025-12-27T21:26:05.013415-08:00","deleted_at":"2025-12-27T21:26:05.013415-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-o75l","title":"Merge: gt-h6eq.3","description":"branch: polecat/keeper\ntarget: main\nsource_issue: gt-h6eq.3\nrig: gastown","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T11:49:58.84455-08:00","updated_at":"2025-12-27T21:27:22.828169-08:00","deleted_at":"2025-12-27T21:27:22.828169-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-o7d4a","title":"Digest: mol-deacon-patrol","description":"Patrol 11: Quick cycle, all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:07:49.101641-08:00","updated_at":"2025-12-27T21:26:02.967565-08:00","deleted_at":"2025-12-27T21:26:02.967565-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-o975j","title":"Digest: mol-deacon-patrol","description":"Patrol 17: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:36:27.195688-08:00","updated_at":"2025-12-27T21:26:00.718894-08:00","deleted_at":"2025-12-27T21:26:00.718894-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-o9osx","title":"Digest: mol-deacon-patrol","description":"Patrol 8: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T10:47:16.478644-08:00","updated_at":"2025-12-27T21:26:03.470488-08:00","deleted_at":"2025-12-27T21:26:03.470488-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-obxz","title":"Digest: mol-deacon-patrol","description":"Patrol #9","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:23:37.705161-08:00","updated_at":"2025-12-27T21:26:04.759434-08:00","deleted_at":"2025-12-27T21:26:04.759434-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-oc2","title":"Daemon: proper rig discovery","description":"Currently discovers rigs by scanning tmux session names for gt-*-witness pattern. Should instead:\n- Read rigs from mayor/rigs.json\n- Or scan town directory for .gastown subdirs\n- Handle rigs that exist but don't have running witnesses","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-18T13:38:15.825299-08:00","updated_at":"2025-12-27T21:29:57.184842-08:00","dependencies":[{"issue_id":"gt-oc2","depends_on_id":"gt-99m","type":"blocks","created_at":"2025-12-18T13:38:26.826697-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.184842-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-occdx","title":"Digest: mol-deacon-patrol","description":"Patrol 6: All healthy, no messages","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:43:56.478866-08:00","updated_at":"2025-12-27T21:26:02.896845-08:00","deleted_at":"2025-12-27T21:26:02.896845-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ocuf2","title":"Digest: mol-refinery-patrol","description":"Patrol: Queue empty. Updated workflow docs with MR verification requirements. Closed 5 unverifiable orphaned MRs.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-31T01:25:42.193318-08:00","updated_at":"2026-01-01T17:34:34.342482-08:00","close_reason":"Squashed from 11 wisps","deleted_at":"2026-01-01T17:34:34.342482-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-odfr","title":"Add WaitsFor parsing to molecule steps","description":"Add WaitsFor field to MoleculeStep struct and parse 'WaitsFor: all-children' (and future variants) from molecule descriptions.\n\nThis enables:\n- mol progress to understand fanout patterns\n- Validation that WaitsFor references valid constructs\n- Future automated orchestration\n\nImplementation:\n1. Add WaitsFor []string to MoleculeStep struct\n2. Add waitsForLineRegex to parse 'WaitsFor: ...' lines\n3. Update ParseMoleculeSteps to extract WaitsFor\n4. Update tests","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T21:35:55.004767-08:00","updated_at":"2025-12-27T21:29:52.836996-08:00","deleted_at":"2025-12-27T21:29:52.836996-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-odvf","title":"Document bd mol bond/squash/burn CLI","description":"Create CLI reference documentation for molecule commands:\n\n## bd mol bond\n\nInstantiate a proto into a runnable molecule.\n\n```bash\nbd mol bond \u003cproto-id\u003e [--wisp] [--assignee=\u003caddr\u003e]\n```\n\n- Default: creates a Mol (durable, in main beads)\n- --wisp: creates a Wisp (ephemeral, in .beads-ephemeral/)\n- --assignee: who will execute this molecule\n\n## bd mol squash\n\nComplete a molecule and generate digest.\n\n```bash\nbd mol squash \u003cmol-id\u003e --summary='...'\n```\n\n- For Mol: creates digest in git history\n- For Wisp: evaporates (no permanent record)\n- --summary: required summary of what was accomplished\n\n## bd mol burn\n\nAbandon a molecule without completing.\n\n```bash\nbd mol burn \u003cmol-id\u003e [--reason='...']\n```\n\n- Discards molecule state\n- No digest created\n- Use when molecule is no longer needed","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T16:33:06.462105-08:00","updated_at":"2025-12-27T21:29:56.537382-08:00","dependencies":[{"issue_id":"gt-odvf","depends_on_id":"gt-62hm","type":"blocks","created_at":"2025-12-21T16:33:17.530156-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.537382-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-odvr","title":"Merge: gt-r6td","description":"branch: polecat/furiosa\ntarget: main\nsource_issue: gt-r6td\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T22:54:01.000047-08:00","updated_at":"2025-12-27T21:27:22.543258-08:00","deleted_at":"2025-12-27T21:27:22.543258-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-of0w1","title":"Digest: mol-deacon-patrol","description":"Patrol 14: All healthy, no messages","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T18:24:00.367405-08:00","updated_at":"2025-12-27T21:26:02.830298-08:00","deleted_at":"2025-12-27T21:26:02.830298-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ofl2","title":"CRITICAL: ProcessMRFromQueue not implemented - Refinery merge logic stubbed","description":"File: internal/refinery/engineer.go:384\nThe ProcessMRFromQueue() function returns hardcoded failure with TODO comment.\nThis is a core refinery function that cannot process merges in current state.\nMust implement actual merge logic before OSS launch.","status":"tombstone","priority":0,"issue_type":"bug","created_at":"2025-12-24T12:50:27.824454-08:00","updated_at":"2025-12-27T21:29:45.447974-08:00","dependencies":[{"issue_id":"gt-ofl2","depends_on_id":"gt-jo9n","type":"blocks","created_at":"2025-12-24T12:52:04.778532-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.447974-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-ofour","title":"Merge: nux-mjtj9d8q","description":"branch: polecat/nux-mjtj9d8q\ntarget: main\nsource_issue: nux-mjtj9d8q\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T20:58:23.224285-08:00","updated_at":"2025-12-30T22:05:57.650004-08:00","closed_at":"2025-12-30T22:05:57.650004-08:00","close_reason":"Commits already in main","created_by":"gastown/polecats/nux"}
{"id":"gt-ogmac","title":"Session ended: gt-gastown-ace","status":"closed","priority":2,"issue_type":"event","created_at":"2026-01-03T11:44:36.060214-08:00","updated_at":"2026-01-04T16:41:00.369759-08:00","closed_at":"2026-01-04T16:41:00.369759-08:00","close_reason":"Archived session telemetry","created_by":"gastown/polecats/ace"}
{"id":"gt-ogouu","title":"Digest: mol-deacon-patrol","description":"Patrol 5: all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:33:03.832151-08:00","updated_at":"2025-12-27T21:26:00.407051-08:00","deleted_at":"2025-12-27T21:26:00.407051-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-oh90.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. 
\"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-oh90\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T00:23:29.326738-08:00","updated_at":"2025-12-27T21:29:55.595877-08:00","deleted_at":"2025-12-27T21:29:55.595877-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-oil7x","title":"Digest: mol-deacon-patrol","description":"Patrol 8: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:32:16.234736-08:00","updated_at":"2025-12-27T21:26:00.795985-08:00","deleted_at":"2025-12-27T21:26:00.795985-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-oiv0","title":"Remove bd sync instruction from polecat startup workflow","description":"Polecats are instructed to run `bd sync --from-main` on startup (spawn.go:634).\n\n## Problem\n- Spawn command already syncs beads before spawning (line 239)\n- Polecats share rig-level beads via `.beads/redirect`\n- Multiple polecats starting simultaneously all try to sync same shared beads\n- This causes git conflicts/failures when many polecats spawn at once\n\n## Observed\nUser reported: 'all polecats failing on beads sync on startup in one run'\n\n## Fix\nRemove line 634 from buildWorkAssignmentMail():\n```\nbody.WriteString(\"2. Run \\`bd sync --from-main\\` to get fresh beads\\n\")\n```\n\nPolecats only need to sync at END of work (already in steps 5/7).\n\n## Files\n- internal/cmd/spawn.go: buildWorkAssignmentMail() and buildSpawnContext()","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-21T23:45:52.25177-08:00","updated_at":"2025-12-27T21:29:53.34987-08:00","deleted_at":"2025-12-27T21:29:53.34987-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-oj9io","title":"Merge: organic-mjwjck2f","description":"branch: polecat/organic-mjwjck2f\ntarget: main\nsource_issue: organic-mjwjck2f\nrig: gastown\nagent_bead: gt-gastown-polecat-organic","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T23:17:46.090654-08:00","updated_at":"2026-01-01T23:38:59.490812-08:00","closed_at":"2026-01-01T23:38:59.490812-08:00","close_reason":"Stale MR - organic resubmitted as gt-a28hb","created_by":"gastown/polecats/organic"}
{"id":"gt-oki8p","title":"Liftoff implementation plan in Beads","description":"## Context\n\nSession on 2025-12-27 produced three key docs:\n- `~/gt/docs/agent-as-bead.md` - Agents ARE beads (identity, hook slots, CV chain roots)\n- `~/gt/docs/zfc-violations-audit.md` - Where Go infers instead of trusting agents\n- `~/gt/docs/liftoff-plan.md` - 4.5 day plan to self-sustaining Gas Town\n\n## Work Required\n\n### Phase 1: Audit existing beads\n- Review all open beads in gt- prefix\n- Close obsolete/stale issues\n- Update any that need refinement\n- Note which existing beads align with liftoff plan\n\n### Phase 2: File new beads\nTranslate liftoff-plan.md into beads with proper dependencies.\n\nThree pillars to decompose:\n1. **Agent beads** - schema, slot commands, migration\n2. **Patrol ignition** - witness/refinery formula wiring\n3. **Polecat lifecycle** - recycle/nuke commands, session-per-step\n\nKey dependency trap to avoid:\n- \"Phase 1 blocks Phase 2\" is WRONG (temporal thinking)\n- \"Phase 2 depends on Phase 1\" is RIGHT (requirement thinking)\n- Use `bd dep add \u003cchild\u003e \u003cparent\u003e` where child NEEDS parent\n\n### Phase 3: Review\nHuman reviews the plan in a separate session.\n\n## Deliverable\n\nA complete dependency graph in beads that can be:\n1. Queried with `bd ready` to find available work\n2. Slung to polecats in dependency order\n3. 
Tracked to completion\n\n## References\n\n- gt-552hb: Swarm orchestration epic (existing, may subsume)\n- gt-t6muy: Polecat lifecycle design (existing, captures session-per-step)","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-27T20:50:46.343298-08:00","updated_at":"2025-12-27T21:43:14.52461-08:00","created_by":"mayor","dependencies":[{"issue_id":"gt-oki8p","depends_on_id":"gt-d0jqp","type":"blocks","created_at":"2025-12-27T20:56:21.755649-08:00","created_by":"daemon"},{"issue_id":"gt-oki8p","depends_on_id":"gt-hwka3","type":"blocks","created_at":"2025-12-27T20:56:21.804561-08:00","created_by":"daemon"},{"issue_id":"gt-oki8p","depends_on_id":"gt-4a2qt","type":"blocks","created_at":"2025-12-27T20:56:21.85321-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:43:14.52461-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-olq2","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:47","description":"Patrol 17: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:47:44.403543-08:00","updated_at":"2025-12-27T21:26:05.054599-08:00","deleted_at":"2025-12-27T21:26:05.054599-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-omhjk","title":"Digest: mol-deacon-patrol","description":"Patrol 11: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:33:44.7986-08:00","updated_at":"2025-12-27T21:26:00.768247-08:00","deleted_at":"2025-12-27T21:26:00.768247-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-on0i.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. 
\"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-on0i\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T21:50:25.205763-08:00","updated_at":"2025-12-27T21:29:55.38614-08:00","deleted_at":"2025-12-27T21:29:55.38614-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-on46","title":"Work on gt-fix-bugs: Fix blocking infrastructure bugs. Se...","description":"Work on gt-fix-bugs: Fix blocking infrastructure bugs. See issue for details. Run 'bd show gt-fix-bugs' to see the full issue.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T03:47:14.322631-08:00","updated_at":"2025-12-27T21:29:56.831001-08:00","deleted_at":"2025-12-27T21:29:56.831001-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ooz6","title":"bd close hooks: context check and notifications","description":"Add hook system to bd close for context checking and notifications.\n\n## Concept\n\nRegister hooks that run on every bd close:\n\n# .beads/config.yaml\nhooks:\n on_close:\n - command: 'gt context-check'\n - notify: 'Next step ready'\n\n## Use cases\n\n### Context check on close\nAfter closing a step, check if context is getting full.\nIf \u003e80%, output warning suggesting session cycling.\n\n### Next step notification \nAutomatically show next ready step (complements --continue).\n\n### Custom actions\nUser-defined scripts for workflow automation.\n\n## Hook types\n\n- command: Run shell command with env vars (BEAD_ID, BEAD_TITLE, etc)\n- notify: Send message to agent\n- webhook: POST to URL (future)\n\n## Integration with context detection\n\nThe context-check hook could:\n1. Capture tmux pane (if in tmux session)\n2. Estimate context usage (turn count, output length)\n3. If high, output: 'Context at ~85%. Consider cycling.'\n\n## Priority\nP2 - nice to have, not blocking launch.\n\n## Related\n- gt-qswb (bd mol current)\n- gt-fly0 (bd close --continue)","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-22T17:01:44.717704-08:00","updated_at":"2025-12-27T21:29:56.333785-08:00","deleted_at":"2025-12-27T21:29:56.333785-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-oqzf","title":"Digest: mol-deacon-patrol","description":"Patrol #15: 3/4 through","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:25:28.130003-08:00","updated_at":"2025-12-27T21:26:04.708812-08:00","deleted_at":"2025-12-27T21:26:04.708812-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-otfcf","title":"Session ended: gt-gastown-crew-george","status":"closed","priority":2,"issue_type":"event","created_at":"2026-01-03T11:50:12.83361-08:00","updated_at":"2026-01-04T16:41:37.84757-08:00","closed_at":"2026-01-04T16:41:37.84757-08:00","close_reason":"Archived","created_by":"gastown/crew/george"}
{"id":"gt-ouo","title":"gt swarm start: Does not spawn polecat sessions","description":"gt swarm start marks swarm as 'active' but doesn't start any polecat sessions.\n\nRepro:\n1. gt swarm create gastown --epic gt-hw6 --worker Toast --worker Nux\n2. gt swarm start gt-hw6\n3. gt session list - shows no new sessions\n\nExpected: Polecat sessions should start for each worker.\nActual: No sessions started, workers sit idle.","status":"tombstone","priority":0,"issue_type":"bug","created_at":"2025-12-17T22:25:43.430981-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"bug"}
{"id":"gt-ov2","title":"Refinery agent: merge queue processing loop","description":"The Refinery agent processes the merge queue, merging polecat work to main.\n\n## Interface\n\n```go\ntype Refinery struct {\n rig *Rig\n queue *MergeQueue\n git *Git\n mail *Mailbox\n config RefineryConfig\n}\n\ntype RefineryConfig struct {\n AutoMerge bool // Auto-merge passing MRs\n RunTests bool // Run tests before merge\n TestCommand string // Command to run tests\n RequireReview bool // Require review before merge (future)\n}\n\nfunc NewRefinery(rig *Rig, ...) *Refinery\n\n// Lifecycle\nfunc (r *Refinery) Start() error\nfunc (r *Refinery) Stop() error\nfunc (r *Refinery) IsRunning() bool\n\n// Processing\nfunc (r *Refinery) ProcessQueue() error\nfunc (r *Refinery) ProcessMR(mr *MergeRequest) error\n```\n\n## Processing Loop\n\n1. Check queue for pending MRs\n2. For each pending MR:\n a. Fetch polecat branch\n b. Attempt merge to refinery/rig (local main)\n c. Run tests if configured\n d. If pass: push to origin, mark merged\n e. If fail: mark rejected, notify polecat\n3. 
Sleep, repeat\n\n## Merge Strategy\n\n- Fast-forward when possible\n- Merge commit when not\n- On conflict: reject MR, polecat must rebase\n\n## Test Integration\n\nIf tests configured:\n```bash\ncd refinery/rig\ngit merge polecat/branch\n\u003ctest_command\u003e # e.g., go test ./...\n```\nResult determines merge/reject.\n\n## Notifications\n\nOn merge success:\n- Mail to polecat: \"Your work merged\"\n- Update bead if issue tracked\n\nOn merge failure:\n- Mail to polecat: \"Merge failed: \u003creason\u003e\"\n- Include conflict details if applicable\n\n## Dependencies\n\nNeeds: Rig management, Git wrapper, Mail system, Merge queue","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-15T23:22:08.498771-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-ov2","depends_on_id":"gt-u1j.14","type":"blocks","created_at":"2025-12-15T23:22:21.801826-08:00","created_by":"daemon"},{"issue_id":"gt-ov2","depends_on_id":"gt-u1j.6","type":"blocks","created_at":"2025-12-15T23:22:21.89716-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-ox9","title":"Test from Mayor","description":"This is a test message via GGT mail","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-16T22:04:31.483843-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"message"}
{"id":"gt-oxqry","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 18: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:30:31.909987-08:00","updated_at":"2025-12-27T21:26:01.82079-08:00","deleted_at":"2025-12-27T21:26:01.82079-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-oxtdp","title":"Digest: mol-deacon-patrol","description":"Patrol 11: Quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:17:35.222414-08:00","updated_at":"2025-12-27T21:26:02.707702-08:00","deleted_at":"2025-12-27T21:26:02.707702-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-oz7hl","title":"Merge: nux-1767079896198","description":"branch: polecat/nux-1767079896198\ntarget: main\nsource_issue: nux-1767079896198\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-29T23:39:54.738732-08:00","updated_at":"2025-12-29T23:55:11.856019-08:00","closed_at":"2025-12-29T23:55:11.856019-08:00","close_reason":"Stale MR from nuked polecat","created_by":"gastown/polecats/nux"}
{"id":"gt-p3v5n","title":"Witness Arm Bonding","description":"# Witness Patrol Redesign\n\n\u003e **Status**: Design pivoted from Christmas Ornament to linear patrol (Dec 2024)\n\n## Original Problem\n\nWitness needs to monitor and intervene on multiple polecats. Original design proposed \"Christmas Ornament\" pattern with dynamic arms per polecat and fanout gates.\n\n## Design Pivot: Why Christmas Ornament Was Wrong\n\nThe Christmas Ornament pattern (dynamic mol-polecat-arm per worker) felt un-Gas-Town:\n\n1. **Explicit state tracking**: Nudge counts, timestamps, per-polecat state\n2. **Complex machinery**: Dynamic bonding, fanout gates, aggregation\n3. **Against philosophy**: Gas Town prefers discovery over tracking, events over state\n\n**Key insight**: With step-based polecat sessions (`gt mol step done` restarts session each step), polecats are either working a step or starting a step. \"Nudge counts across sessions\" becomes less meaningful.\n\n## New Design: Linear Patrol + Task Tool Parallelism + Cleanup Wisps\n\n### Patrol Shape (Deacon-style)\n\n```\ninbox-check -\u003e process-cleanups -\u003e check-refinery -\u003e survey-workers -\u003e context-check -\u003e loop\n```\n\nNo dynamic arms. No fanout gates. Linear and simple.\n\n### Key Changes\n\n| Aspect | Old (Christmas Ornament) | New (Linear Patrol) |\n|--------|--------------------------|---------------------|\n| Per-polecat inspection | mol-polecat-arm bonded dynamically | Task tool subagents within survey-workers |\n| State tracking | Persistent nudge counts in handoff bead | Fresh observation each cycle (discover from beads timestamps) |\n| Cleanup queue | Explicit list in patrol state | Cleanup wisps (finalizer pattern) |\n| Parallelism | Molecule arms (fanout gate) | Task tool subagents |\n| Aggregation | waits_for: all-children | Task tool returns |\n\n### The Finalizer Pattern (Cleanup Wisps)\n\nWhen POLECAT_DONE arrives, Witness creates a cleanup wisp:\n```bash\nbd create --wisp --title \"cleanup:ace\" --labels cleanup,polecat:ace\n```\n\nThe wisp's existence IS the pending cleanup. Witness processes these in `process-cleanups` step:\n- Verify git clean, issue closed, commits exist\n- Kill session, remove worktree\n- Burn the wisp\n\nFailed cleanups? Leave wisp, retry next cycle. GC-style finalization.\n\n### Task Tool Parallelism\n\nInstead of molecule arms, use Claude's Task tool for parallel polecat inspection:\n\n```markdown\n## survey-workers step\n\nFor each polecat, launch a Task tool subagent:\n- Capture tmux state\n- Assess working/idle/error/done\n- Check beads for step progress\n- Decide and execute action (nudge/escalate/none)\n\nTask tool handles parallelism. No molecule machinery needed.\n```\n\n## Implementation\n\n- [x] Draft mol-witness-patrol-v2.formula.toml (linear patrol)\n- [ ] Test new patrol formula\n- [ ] Deprecate mol-polecat-arm\n- [ ] Remove fanout gate dependency\n\n## Files\n\n- `.beads/formulas/mol-witness-patrol-v2.formula.toml` - New linear patrol\n- `.beads/formulas/mol-witness-patrol.formula.toml` - Old Christmas Ornament (to deprecate)\n\n## Success Criteria\n\n- Witness patrols work with 3+ polecats using Task tool parallelism\n- Cleanup wisps correctly track pending cleanups\n- No persistent nudge state - fresh observation each cycle\n- Simpler, more Gas Town-y design","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-25T20:46:22.29819-08:00","updated_at":"2025-12-27T21:29:52.489656-08:00","dependencies":[{"issue_id":"gt-p3v5n","depends_on_id":"gt-psj76","type":"blocks","created_at":"2025-12-25T20:47:17.026491-08:00","created_by":"daemon"},{"issue_id":"gt-p3v5n","depends_on_id":"gt-twjr5","type":"blocks","created_at":"2025-12-25T20:55:05.133165-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.489656-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-p3v5n.1","title":"mol-polecat-arm formula","description":"Define mol-polecat-arm.formula.yaml with steps: capture (gt peek worker), assess (analyze output for stalls/errors), intervene (gt nudge or escalate). This is the per-worker arm in Christmas Ornament pattern.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-25T20:56:07.924265-08:00","updated_at":"2025-12-27T21:29:45.960343-08:00","dependencies":[{"issue_id":"gt-p3v5n.1","depends_on_id":"gt-p3v5n","type":"parent-child","created_at":"2025-12-25T20:56:07.924724-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.960343-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-p3v5n.2","title":"Dynamic arm bonding in witness patrol","description":"Witness survey step uses bd mol bond to attach mol-polecat-arm for each active worker. Variables: worker name, rig. Arms become children of the patrol molecule. Requires on_complete/for_each runtime support.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-25T20:56:09.24636-08:00","updated_at":"2025-12-27T21:29:45.951888-08:00","dependencies":[{"issue_id":"gt-p3v5n.2","depends_on_id":"gt-p3v5n","type":"parent-child","created_at":"2025-12-25T20:56:09.246818-08:00","created_by":"daemon"},{"issue_id":"gt-p3v5n.2","depends_on_id":"gt-p3v5n.1","type":"blocks","created_at":"2025-12-25T20:56:49.215515-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.951888-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-p3v5n.3","title":"Fanout gate runtime evaluation","description":"Aggregate step uses waits-for: all-children. Deacon must evaluate this gate by checking if all dynamically-bonded arm children are complete. If no children bonded, gate passes immediately.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-25T20:56:10.4868-08:00","updated_at":"2025-12-27T21:29:45.943469-08:00","dependencies":[{"issue_id":"gt-p3v5n.3","depends_on_id":"gt-p3v5n","type":"parent-child","created_at":"2025-12-25T20:56:10.488846-08:00","created_by":"daemon"},{"issue_id":"gt-p3v5n.3","depends_on_id":"gt-p3v5n.2","type":"blocks","created_at":"2025-12-25T20:56:49.311282-08:00","created_by":"daemon"},{"issue_id":"gt-p3v5n.3","depends_on_id":"gt-twjr5.1","type":"blocks","created_at":"2025-12-25T20:56:59.237508-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.943469-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-p3v5n.4","title":"Witness arm bonding integration test","description":"Test witness with 3 polecats. Verify: arms bonded per worker, each arm captures/assesses independently, aggregation waits for all arms, witness proceeds after aggregation.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:56:12.056218-08:00","updated_at":"2025-12-27T21:29:54.927515-08:00","dependencies":[{"issue_id":"gt-p3v5n.4","depends_on_id":"gt-p3v5n","type":"parent-child","created_at":"2025-12-25T20:56:12.056676-08:00","created_by":"daemon"},{"issue_id":"gt-p3v5n.4","depends_on_id":"gt-p3v5n.3","type":"blocks","created_at":"2025-12-25T20:56:49.406137-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.927515-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-p3yhp","title":"gt doctor: Check and fix beads routing configuration","description":"## Summary\n\n`gt doctor` should check that all rigs with beads have proper routing entries\nin `~/gt/.beads/routes.jsonl`, and offer to fix any missing entries.\n\n## Checks to Add\n\n### Check: routes.jsonl exists\n- Verify `~/gt/.beads/routes.jsonl` exists\n- If missing, create with entries for all known rigs\n\n### Check: All rigs have routing entries\nFor each rig in town:\n1. Find the rig's beads prefix (from config.yaml or sample issue)\n2. Check if routes.jsonl has an entry for that prefix\n3. If missing, report and offer to add\n\n### Check: Routes point to valid locations\nFor each route in routes.jsonl:\n1. Verify the path exists and has a .beads directory\n2. Verify the .beads directory has a beads.db\n3. If invalid, report and offer to remove\n\n## Fix Actions\n\n- `--fix` should automatically add missing routes\n- `--fix` should remove invalid routes (or prompt)\n- Should handle redirect files when validating paths\n\n## Example Output\n\n```\nChecking beads routing...\n [!] Rig 'myproject' (prefix: mp-) has no routing entry\n Run: gt doctor --fix to add routing\n\nRoutes health:\n [OK] gt- -\u003e gastown/mayor/rig\n [OK] bd- -\u003e beads/mayor/rig\n [!] mp- -\u003e myproject/mayor/rig (MISSING)\n```","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-26T14:45:49.438679-08:00","updated_at":"2025-12-27T21:29:54.835023-08:00","deleted_at":"2025-12-27T21:29:54.835023-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-p4s9","title":"Digest: mol-deacon-patrol","description":"Patrol 7: Routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:36:49.392521-08:00","updated_at":"2025-12-27T21:26:04.62643-08:00","deleted_at":"2025-12-27T21:26:04.62643-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-p52mk","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 17: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:29:58.302087-08:00","updated_at":"2025-12-27T21:26:01.829087-08:00","deleted_at":"2025-12-27T21:26:01.829087-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-p552z","title":"Digest: mol-deacon-patrol","description":"Patrol 14: all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T21:59:17.733299-08:00","updated_at":"2025-12-27T21:26:00.499368-08:00","deleted_at":"2025-12-27T21:26:00.499368-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-p9zh","title":"gt doctor: detect orphaned code on beads-sync branch","description":"After merging beads-sync to main, some code changes can be lost if the merge conflict resolution drops files.\n\nAdd a doctor check that runs:\n git diff main..beads-sync -- '*.go' '*.md'\n\nIf there are differences in code files (not just .beads/), warn about potentially orphaned work.\n\nToday's incident: Merge 96c773f lost mailbox.go and router.go changes from 5791752, requiring re-implementation.\n\nAcceptance:\n- gt doctor warns if beads-sync has unmerged code changes\n- Excludes .beads/ directory (expected to differ)\n- Shows file list of orphaned changes","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-20T22:01:56.794648-08:00","updated_at":"2025-12-27T21:29:56.670099-08:00","deleted_at":"2025-12-27T21:29:56.670099-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-pbjim","title":"Digest: mol-deacon-patrol","description":"Patrol 15: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-25T00:35:05.785047-08:00","updated_at":"2025-12-25T00:35:05.785047-08:00","closed_at":"2025-12-25T00:35:05.785013-08:00"}
{"id":"gt-pbr3","title":"Add godoc comments to exported functions","description":"Several exported functions lack godoc comments. While not critical, adding documentation would improve code maintainability. Focus on:\n\n- Public API functions in each package\n- Exported types and their methods\n- Functions that have non-obvious behavior\n\nCan be addressed incrementally as code is touched.","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-21T21:35:26.732436-08:00","updated_at":"2025-12-27T21:29:57.88304-08:00","deleted_at":"2025-12-27T21:29:57.88304-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-pc2p0","title":"Digest: mol-deacon-patrol","description":"Patrol 1: All agents healthy, no messages, no orphans","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:28:10.81431-08:00","updated_at":"2025-12-27T21:26:03.943851-08:00","deleted_at":"2025-12-27T21:26:03.943851-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-pc5d","title":"Recover stale polecat work: 4 branches with unpushed commits","description":"During observation of polecat workflow, found 4 polecats with unpushed work:\n\n## Branches Pushed (preserved)\n- polecat/capable: 3 commits (molecule catalog, doctor orphan detection, gt done)\n- polecat/dementus: 4 commits (Witness MVP, handoff fixes)\n- polecat/furiosa: 2 commits (bulk polecat removal, spawn handoff)\n- polecat/rictus: 1 commit (molecule docs)\n\n## Action Required\n1. Review each branch for merge-worthiness\n2. Either:\n a. Create PRs for valuable work\n b. OR discard if superseded\n3. After decision, clean up polecats properly\n\n## Root Cause\nPolecats were not cleaned up after previous work sessions. This is exactly why we need:\n- gt-u1k: gt shutdown should fully cleanup polecats\n- gt-8v8: Refuse to lose uncommitted work\n- gt-9nf: Always create fresh polecats","status":"tombstone","priority":1,"issue_type":"chore","created_at":"2025-12-20T15:24:29.232772-08:00","updated_at":"2025-12-27T21:29:53.694253-08:00","deleted_at":"2025-12-27T21:29:53.694253-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"chore"}
{"id":"gt-pcqda","title":"Digest: mol-deacon-patrol","description":"Patrol 4: 8 sessions active, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:30:08.65881-08:00","updated_at":"2025-12-27T21:26:03.919311-08:00","deleted_at":"2025-12-27T21:26:03.919311-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-pdqc","title":"gt spawn: beads sync warnings on fresh worktree","description":"When spawning fresh polecats, seeing 'Warning: beads sync: exit status 1' and 'Warning: beads push: exit status 1'. Worktrees are created but beads state may not be properly synced.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-23T00:19:09.375303-08:00","updated_at":"2025-12-27T21:29:56.27578-08:00","deleted_at":"2025-12-27T21:29:56.27578-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-pedt","title":"Digest: mol-deacon-patrol","description":"Patrol 9: OK","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T20:58:49.010496-08:00","updated_at":"2025-12-27T21:26:04.952273-08:00","deleted_at":"2025-12-27T21:26:04.952273-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-pia6","title":"Merge: gt-h6eq.4","description":"branch: polecat/valkyrie\ntarget: main\nsource_issue: gt-h6eq.4\nrig: gastown","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-23T12:13:07.092756-08:00","updated_at":"2025-12-27T21:27:22.989131-08:00","deleted_at":"2025-12-27T21:27:22.989131-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-piy9c","title":"Digest: mol-deacon-patrol","description":"Patrol 9","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-25T20:51:02.414152-08:00","updated_at":"2025-12-25T20:51:02.414152-08:00","closed_at":"2025-12-25T20:51:02.414093-08:00"}
{"id":"gt-pj222","title":"Digest: mol-deacon-patrol","description":"Patrol 2: All agents healthy, no messages, no orphans","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:26:34.954324-08:00","updated_at":"2025-12-27T21:26:02.934026-08:00","deleted_at":"2025-12-27T21:26:02.934026-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-pkm69","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 10: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-28T11:22:14.872507-08:00","updated_at":"2025-12-28T11:22:14.872507-08:00","closed_at":"2025-12-28T11:22:14.872464-08:00"}
{"id":"gt-plcg","title":"Add --account flag to gt spawn and gt crew attach","description":"Add --account=\u003chandle\u003e flag. Precedence: GT_ACCOUNT env \u003e --account flag \u003e default from config.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T03:24:23.832188-08:00","updated_at":"2025-12-27T21:29:56.15928-08:00","dependencies":[{"issue_id":"gt-plcg","depends_on_id":"gt-58tu","type":"blocks","created_at":"2025-12-23T03:24:34.984336-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.15928-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-pln0","title":"Digest: mol-deacon-patrol","description":"Patrol 6: OK","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T20:57:32.057118-08:00","updated_at":"2025-12-27T21:26:04.977001-08:00","deleted_at":"2025-12-27T21:26:04.977001-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-pm5rs","title":"Digest: mol-deacon-patrol","description":"Patrol 3: nux exited (wrong-rig escalation), 11 sessions, all core healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:09:23.555634-08:00","updated_at":"2025-12-27T21:26:02.318996-08:00","deleted_at":"2025-12-27T21:26:02.318996-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-pnu4","title":"Test","description":"Body","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T21:38:39.019559-08:00","updated_at":"2025-12-25T14:12:42.272191-08:00","deleted_at":"2025-12-25T14:12:42.272191-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"message"}
{"id":"gt-pnurj","title":"Digest: mol-deacon-patrol","description":"Patrol 20: Green - routine cycle complete, handing off","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:36:47.596964-08:00","updated_at":"2025-12-27T21:26:02.474719-08:00","deleted_at":"2025-12-27T21:26:02.474719-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-podor","title":"Digest: mol-deacon-patrol","description":"Patrol 20: all clear, handoff due","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T16:54:52.330177-08:00","updated_at":"2025-12-27T21:26:01.230064-08:00","deleted_at":"2025-12-27T21:26:01.230064-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-polecat-gastown-test-cat","title":"gt-polecat-gastown-test-cat","description":"gt-polecat-gastown-test-cat\n\nrole_type: polecat\nrig: gastown\nagent_state: spawning\nhook_bead: null\nrole_bead: null","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-28T02:16:55.810497-08:00","updated_at":"2025-12-28T02:17:01.786065-08:00","created_by":"mayor","deleted_at":"2025-12-28T02:17:01.786065-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"agent"}
{"id":"gt-polecat-gastown-testcat","title":"gt-polecat-gastown-testcat","description":"gt-polecat-gastown-testcat\n\nrole_type: polecat\nrig: gastown\nagent_state: running\nhook_bead: null\nrole_bead: null","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-28T15:32:27.70761-08:00","updated_at":"2025-12-28T15:33:38.022666-08:00","created_by":"mayor","deleted_at":"2025-12-28T15:33:38.022666-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"agent"}
{"id":"gt-poxd","title":"Create handoff beads for Witness and Refinery roles","description":"Each patrol role needs a pinned handoff bead to track attached molecules and patrol state.\n\n## Pattern (from Deacon)\n- Title: '\u003crole\u003e Handoff' (e.g., 'witness Handoff', 'refinery Handoff')\n- Status: pinned\n- Description contains structured state\n\n## Witness Handoff State\n\n```yaml\nattached_molecule: mol-witness-patrol\nattached_at: 2025-12-24T10:00:00Z\n\n# Nudge escalation tracking\nnudges:\n toast:\n count: 2\n last: \"2025-12-24T10:30:00Z\"\n ace:\n count: 0\n last: null\n\n# Polecats queued for cleanup\npending_cleanup:\n - nux # received POLECAT_DONE, awaiting verification\n```\n\n## Refinery Handoff State\n\n```yaml\nattached_molecule: mol-refinery-patrol\nattached_at: 2025-12-24T10:00:00Z\n\n# Merge queue tracking\nlast_processed_branch: polecat/toast\nbranches_merged_this_cycle: 3\n```\n\n## Tasks\n1. Create 'witness Handoff' bead in each rig's beads\n2. Create 'refinery Handoff' bead in each rig's beads\n3. Update Witness/Refinery startup to check handoff bead for attached work\n4. Update templates to document handoff bead usage\n5. Include nudge state schema for Witness\n6. Include merge state schema for Refinery\n\n## Note\nThese are rig-level beads (in gastown/.beads/, beads/.beads/), not town-level like deacon Handoff (in ~/gt/.beads/).\n\n## See Also\n- docs/witness-patrol-design.md - Theory of operation","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T13:19:41.055563-08:00","updated_at":"2025-12-27T21:29:53.007253-08:00","dependencies":[{"issue_id":"gt-poxd","depends_on_id":"gt-y481","type":"parent-child","created_at":"2025-12-23T13:20:15.89851-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.007253-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ppdmm","title":"Digest: mol-deacon-patrol","description":"Patrol 10: clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:15:09.272455-08:00","updated_at":"2025-12-27T21:26:01.022637-08:00","deleted_at":"2025-12-27T21:26:01.022637-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-pqhka","title":"Digest: mol-deacon-patrol","description":"Patrol 20: All agents healthy - triggering handoff","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T06:46:30.78827-08:00","updated_at":"2025-12-27T21:26:03.693445-08:00","deleted_at":"2025-12-27T21:26:03.693445-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-psj76","title":"Patrol Lifecycle Completion","description":"Patrols complete their full cycle without wisp accumulation.\n\n## Problem\nPatrol wisps accumulate (73+ observed) because burn-or-loop step does not complete. This wastes storage and makes bd list noisy.\n\n## Requirements\n- Burn-or-loop step executes and burns completed wisps\n- Fresh wisp instantiated for next patrol cycle \n- Wisp GC runs periodically as safety net (orphan-check step)\n- Context cycling triggers gt handoff when budget exhausted\n- No wisp accumulation over time\n\n## Success Criteria\n- bd wisp list shows fewer than 5 wisps at any time\n- Patrols run indefinitely without intervention\n- Context cycles cleanly produce handoff mail\n\nThis is the foundation for all autonomous patrol work.","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-25T20:45:54.435395-08:00","updated_at":"2025-12-27T21:29:52.506519-08:00","deleted_at":"2025-12-27T21:29:52.506519-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-psj76.1","title":"Burn-or-loop step execution","description":"Fix step completion detection in patrol formulas. Ensure done() properly clears the molecule and triggers burn. Currently wisps accumulate because this step doesn't complete.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-25T20:55:21.305965-08:00","updated_at":"2025-12-27T21:29:46.013195-08:00","dependencies":[{"issue_id":"gt-psj76.1","depends_on_id":"gt-psj76","type":"parent-child","created_at":"2025-12-25T20:55:21.306455-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:46.013195-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-psj76.2","title":"Wisp GC command","description":"Implement bd wisp gc command that cleans orphaned wisps older than a threshold (default 1h). Safety net for wisps that escape normal burn-or-loop cleanup. Should run as part of Deacon patrol.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-25T20:55:22.455591-08:00","updated_at":"2025-12-27T21:29:46.002119-08:00","dependencies":[{"issue_id":"gt-psj76.2","depends_on_id":"gt-psj76","type":"parent-child","created_at":"2025-12-25T20:55:22.457703-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:46.002119-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-psj76.3","title":"Context cycling protocol","description":"Document and verify gt handoff pattern for patrol agents. Ensure hook chain works: handoff mail sent, session respawns, SessionStart hook runs gt prime, work continues from pinned molecule.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-25T20:55:23.973741-08:00","updated_at":"2025-12-27T21:29:45.99377-08:00","dependencies":[{"issue_id":"gt-psj76.3","depends_on_id":"gt-psj76","type":"parent-child","created_at":"2025-12-25T20:55:23.974279-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.99377-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-psj76.4","title":"Patrol lifecycle integration test","description":"Verify patrol runs 3+ cycles without wisp accumulation. bd wisp list should show \u003c5 wisps. Context cycles cleanly produce handoff mail. Test with Deacon patrol.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:55:25.121614-08:00","updated_at":"2025-12-27T21:29:54.952315-08:00","dependencies":[{"issue_id":"gt-psj76.4","depends_on_id":"gt-psj76","type":"parent-child","created_at":"2025-12-25T20:55:25.123746-08:00","created_by":"daemon"},{"issue_id":"gt-psj76.4","depends_on_id":"gt-psj76.1","type":"blocks","created_at":"2025-12-25T20:56:44.636528-08:00","created_by":"daemon"},{"issue_id":"gt-psj76.4","depends_on_id":"gt-psj76.2","type":"blocks","created_at":"2025-12-25T20:56:44.733351-08:00","created_by":"daemon"},{"issue_id":"gt-psj76.4","depends_on_id":"gt-psj76.3","type":"blocks","created_at":"2025-12-25T20:56:44.825574-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.952315-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-pu1t5","title":"Fix CLI mismatches from bd-2fs7 (wisp-\u003eephemeral migration)","description":"bd-2fs7 moved wisp/pour under bd mol, but gastown wasn't updated.\n\nFixed:\n- router.go: --wisp -\u003e --ephemeral for bd create\n- patrol_helpers.go: bd wisp create -\u003e bd mol wisp create \n- wisp_check.go: bd wisp gc -\u003e bd mol wisp gc\n\nThis was causing handoff failures (gt mail send errored with 'unknown flag: --wisp').","status":"tombstone","priority":0,"issue_type":"bug","created_at":"2025-12-27T15:00:52.52154-08:00","updated_at":"2025-12-27T21:29:45.175296-08:00","created_by":"mayor","deleted_at":"2025-12-27T21:29:45.175296-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-pvox","title":"Digest: mol-deacon-patrol","description":"Patrol #4: 2 Witnesses OK, 2 Refineries OK","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:21:42.037058-08:00","updated_at":"2025-12-27T21:26:04.802034-08:00","deleted_at":"2025-12-27T21:26:04.802034-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-pvzj","title":"Digest: mol-deacon-patrol @ 2025-12-24 20:23","description":"Patrol complete: inbox clear, all agents healthy, no orphans","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T20:23:09.751138-08:00","updated_at":"2025-12-27T21:26:05.021658-08:00","deleted_at":"2025-12-27T21:26:05.021658-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-pwep","title":"implement","description":"Implement the solution for gt-test123. Follow codebase conventions.\nFile discovered work as new issues with bd create.\n\nMake regular commits with clear messages.\nKeep changes focused on the assigned issue.\n\nDepends: load-context","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T21:56:18.534804-08:00","updated_at":"2025-12-25T15:52:43.900154-08:00","deleted_at":"2025-12-25T15:52:43.900154-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-py01y","title":"Digest: mol-deacon-patrol","description":"Patrol 7: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T13:29:10.720846-08:00","updated_at":"2025-12-27T21:26:03.287626-08:00","deleted_at":"2025-12-27T21:26:03.287626-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-pytfu","title":"Digest: mol-deacon-patrol","description":"Patrol 16: All clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T16:33:56.257524-08:00","updated_at":"2025-12-27T21:26:03.083677-08:00","deleted_at":"2025-12-27T21:26:03.083677-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-q16tj","title":"Digest: mol-deacon-patrol","description":"Patrol complete: inbox empty, all agents healthy, fixed invalid hook attachment on gt-w98d","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:18:37.954041-08:00","updated_at":"2025-12-27T21:26:01.96809-08:00","deleted_at":"2025-12-27T21:26:01.96809-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-q1z4","title":"Digest: mol-deacon-patrol","description":"Patrol 13: Routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:37:57.933025-08:00","updated_at":"2025-12-27T21:26:04.576491-08:00","deleted_at":"2025-12-27T21:26:04.576491-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-q3ac","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:24","description":"Patrol 9: quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:24:50.821721-08:00","updated_at":"2025-12-27T21:26:05.280694-08:00","deleted_at":"2025-12-27T21:26:05.280694-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-q511","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:21","description":"Patrol 3: all healthy, no work","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:21:01.70741-08:00","updated_at":"2025-12-27T21:26:05.322399-08:00","deleted_at":"2025-12-27T21:26:05.322399-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-q5qoq","title":"Digest: mol-deacon-patrol","description":"Patrol 12: Quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:17:53.371675-08:00","updated_at":"2025-12-27T21:26:02.699557-08:00","deleted_at":"2025-12-27T21:26:02.699557-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-q6hl","title":"mol-polecat-work","description":"Full polecat lifecycle from assignment to decommission.\n\nThis proto enables nondeterministic idempotence for polecat work.\nA polecat that crashes after any step can restart, read its molecule state,\nand continue from the last completed step. No work is lost.\n\nVariables:\n- gt-qwyu - The source issue ID being worked on","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-21T21:58:52.59934-08:00","updated_at":"2025-12-25T15:52:43.900154-08:00","deleted_at":"2025-12-25T15:52:43.900154-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"epic"}
{"id":"gt-q8du","title":"Configuration in Beads: use pinned beads for rig/agent config","description":"## Summary\n\nMove configuration from JSON files into Beads data plane using pinned beads.\n\n## Motivation\n\n- Config becomes versionable and syncable like issues\n- Agents can read config via `bd show`\n- Unified data model for everything\n- Supports the 'Beads as Universal Data Plane' vision (gt-aqm)\n\n## Config Types to Migrate\n\n### Rig-level (pinned per rig)\n- Theme/colors for tmux status line\n- Default branch naming conventions\n- Merge queue settings\n- Witness thresholds\n\n### Agent-level (pinned per role)\n- Handoffs (already planned - gt-cu7r)\n- Agent preferences\n- Current focus/context\n\n### Town-level (pinned at town root)\n- Cross-rig settings\n- Federation config\n- Global themes\n\n## Implementation\n\n1. Implement StatusPinned (beads-6v2) ✓ in progress\n2. Create config pinned beads on rig init\n3. Add `bd config get/set` commands that read/write pinned beads\n4. Migrate existing config.json fields\n\n## Naming Convention\n\n```\n\u003cprefix\u003e-cfg-\u003cscope\u003e\ngt-cfg-theme # gastown theme\nbd-cfg-merge # beads merge settings\ngm-cfg-town # town-level config\n```\n\n## Dependencies\n\n- beads-6v2: Add StatusPinned to beads schema\n- gm-w13: Pinned Beads epic","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-18T21:58:59.898116-08:00","updated_at":"2025-12-27T21:29:57.043378-08:00","deleted_at":"2025-12-27T21:29:57.043378-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-q9jj3","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 12: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:45:35.574924-08:00","updated_at":"2025-12-27T21:26:01.377911-08:00","deleted_at":"2025-12-27T21:26:01.377911-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qbdb","title":"Work on ga-lzh: Add gt witness attach command. Allow atta...","description":"Work on ga-lzh: Add gt witness attach command. Allow attaching to witness session for a rig, similar to gt mayor attach. When done, submit MR (not PR) to integration branch for Refinery.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-19T22:58:44.945425-08:00","updated_at":"2025-12-27T21:29:56.872763-08:00","deleted_at":"2025-12-27T21:29:56.872763-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qcr6","title":"Polecat template: fix 'git clone' to 'git worktree'","description":"Template says 'You're in a project git clone' but polecats use worktrees, not clones. Fix terminology.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-23T16:56:44.091888-08:00","updated_at":"2025-12-27T21:29:55.951993-08:00","dependencies":[{"issue_id":"gt-qcr6","depends_on_id":"gt-t9u7","type":"parent-child","created_at":"2025-12-23T16:57:16.207186-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.951993-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-qctb0","title":"Digest: mol-deacon-patrol","description":"Patrol 10: All services healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:51:18.083218-08:00","updated_at":"2025-12-27T21:26:02.010983-08:00","deleted_at":"2025-12-27T21:26:02.010983-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qcvqx","title":"Digest: mol-deacon-patrol","description":"Patrol 7: Mayor OK, 11 sessions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:36:28.120992-08:00","updated_at":"2025-12-27T21:26:02.155407-08:00","deleted_at":"2025-12-27T21:26:02.155407-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qdnmm","title":"Digest: mol-deacon-patrol","description":"Patrol 9: routine, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T13:43:53.751357-08:00","updated_at":"2025-12-27T21:26:03.214415-08:00","deleted_at":"2025-12-27T21:26:03.214415-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qduud","title":"Merge: furiosa-1767084006859","description":"branch: polecat/furiosa-1767084006859\ntarget: main\nsource_issue: furiosa-1767084006859\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T00:48:06.516625-08:00","updated_at":"2025-12-30T01:01:04.244313-08:00","closed_at":"2025-12-30T01:01:04.244313-08:00","close_reason":"Already merged to main","created_by":"gastown/polecats/furiosa"}
{"id":"gt-qfgfr","title":"Merge: ace-mjz94g3q","description":"branch: polecat/ace-mjz94g3q\ntarget: main\nsource_issue: ace-mjz94g3q\nrig: gastown\nagent_bead: gt-gastown-polecat-ace\nretry_count: 0\nlast_conflict_sha: null\nconflict_task_id: null","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-04T10:45:27.229461-08:00","updated_at":"2026-01-04T10:48:07.106263-08:00","closed_at":"2026-01-04T10:48:07.106263-08:00","close_reason":"Already merged at b8250e13","created_by":"gastown/polecats/ace"}
{"id":"gt-qgbe","title":"Digest: mol-deacon-patrol","description":"Patrol 3: OK","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T20:55:49.186887-08:00","updated_at":"2025-12-27T21:26:05.005406-08:00","deleted_at":"2025-12-27T21:26:05.005406-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qivm","title":"gt crew at: auto-prime when exec'ing Claude in-session","description":"When running 'gt crew at \u003cname\u003e' from inside the target session, we exec Claude directly. But this means we can't send 'gt prime' afterward since we ARE the process.\n\nPossible solutions:\n1. Claude startup hook that runs gt prime\n2. Pass prompt as argument to claude CLI\n3. Wrapper script approach\n\nRelated: crew resume prompt also can't be sent in this path.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-19T15:13:38.035775-08:00","updated_at":"2025-12-27T21:29:56.975825-08:00","deleted_at":"2025-12-27T21:29:56.975825-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qj12","title":"Obsolete beads issues need cleanup","description":"Found obsolete issues that should be closed or fixed:\n\nTest artifacts to close:\n- gt-nriy: Test: Alpha to Beta (message test)\n- gt-gswn: Integration test (test artifact)\n- gt-54kn: Test: New Router\n\nBroken dependencies (reference non-existent gt-test123):\n- gt-vhby: implement\n- gt-tvos: load-context\n- gt-lwuu.2: implement (template variable unresolved)\n\nCloudRun references (non-existent /deploy/cloudrun/):\n- gt-9a2.6, gt-9a2.7, gt-9a2.11\n\nReview and close/update these before launch.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T12:50:57.843146-08:00","updated_at":"2025-12-27T21:29:55.561682-08:00","dependencies":[{"issue_id":"gt-qj12","depends_on_id":"gt-jo9n","type":"blocks","created_at":"2025-12-24T12:52:07.394519-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.561682-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qk7p7","title":"Merge: nux-mjxn8p5t","description":"branch: polecat/nux-mjxn8p5t\ntarget: main\nsource_issue: nux-mjxn8p5t\nrig: gastown\nagent_bead: gt-gastown-polecat-nux\nretry_count: 0\nlast_conflict_sha: null\nconflict_task_id: null","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-02T20:51:43.86311-08:00","updated_at":"2026-01-04T21:30:49.411047+13:00","closed_at":"2026-01-03T11:51:50.62837-08:00","close_reason":"Merged to main","created_by":"gastown/polecats/nux"}
{"id":"gt-qn4l","title":"bd create should support molecule type","description":"gt molecule commands expect type=molecule but bd validates against bug|feature|task|epic|chore|merge-request only","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-19T18:25:31.591953-08:00","updated_at":"2025-12-27T21:29:56.932709-08:00","deleted_at":"2025-12-27T21:29:56.932709-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-qn5rg","title":"Digest: mol-deacon-patrol","description":"Patrol 10: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:57:05.118801-08:00","updated_at":"2025-12-27T21:26:01.770399-08:00","deleted_at":"2025-12-27T21:26:01.770399-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qna4","title":"gt done: Missing command referenced in polecat docs","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-20T01:10:57.495372-08:00","updated_at":"2025-12-27T21:29:53.856677-08:00","deleted_at":"2025-12-27T21:29:53.856677-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-qns0","title":"TIDY UP: Your previous work (patrol runner) was already m...","description":"TIDY UP: Your previous work (patrol runner) was already merged to main. Check your git status is clean, sync beads, and if nothing to do, just run 'gt done'.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T17:26:39.343497-08:00","updated_at":"2025-12-27T21:29:56.512587-08:00","deleted_at":"2025-12-27T21:29:56.512587-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qnt78","title":"Daemon sending heartbeat to Mayor - should not","description":"Daemon appears to be sending heartbeat messages to Mayor, but Mayor should not receive heartbeats. Observed at least once or twice - needs investigation to confirm and fix.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-26T17:39:55.247942-08:00","updated_at":"2025-12-27T21:29:54.774741-08:00","deleted_at":"2025-12-27T21:29:54.774741-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-qobkl","title":"Digest: mol-deacon-patrol","description":"Patrol 13: All healthy, no messages","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T18:18:57.096949-08:00","updated_at":"2025-12-27T21:26:02.838451-08:00","deleted_at":"2025-12-27T21:26:02.838451-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qp98","title":"Refactor: Eliminate state.json, use beads assignee field for polecat state","description":"## Problem\n\nstate.json in each polecat worktree duplicates data that should be in beads:\n- `issue` field duplicates assignment (should be issue.assignee)\n- `state` field duplicates status (derivable from issue.status)\n- `name/rig/branch/clone_path` are all derivable from filesystem/git\n\nThis violates 'Beads as the data plane' (HOP Decision 001).\n\n## Current State\n\n```json\n{\n \"name\": \"Angharad\", // Derivable: basename(cwd)\n \"rig\": \"gastown\", // Derivable: parent dir name\n \"state\": \"working\", // Should be: issue.status\n \"clone_path\": \"/path\", // Derivable: cwd\n \"branch\": \"polecat/X\", // Derivable: git branch\n \"issue\": \"gt-xyz\", // Should be: issue.assignee\n \"timestamps\": \"...\" // Should be in beads history\n}\n```\n\n## Target State\n\n**Polecat identity**: Derived from worktree path\n**Polecat assignment**: `issue.assignee = 'gastown/Angharad'`\n**Polecat state**: Derived from issue.status (in_progress = working, closed = done)\n**Polecat existence**: Worktree directory exists\n\n## Benefits\n\n1. Single source of truth (no sync issues)\n2. Queryable: `bd list --assignee=gastown/Angharad`\n3. History tracking via beads\n4. Cross-agent coordination via beads sync\n5. Simpler code (no state file management)\n\n## Implementation\n\n1. Update `gt spawn` to set issue.assignee instead of writing state.json\n2. Update polecat list/status commands to query beads\n3. Update Witness to query beads for polecat state\n4. Remove state.json read/write code\n5. Keep state.json as optional bootstrap cache (perf optimization only)","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T11:40:33.389689-08:00","updated_at":"2025-12-27T21:29:54.050126-08:00","deleted_at":"2025-12-27T21:29:54.050126-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qpwv4","title":"Witness: detect polecat completion and trigger merge","description":"## Problem\nWhen polecat pushes completed work, witness needs to detect it and signal refinery.\n\n## ZFC-Compliant Solution\nAdd step to `mol-witness-patrol.formula.toml`:\n\n```toml\n[[step]]\nid = \"check-ready-branches\"\ntitle = \"Check for branches ready to merge\"\ndescription = \"\"\"\n1. List polecat branches: git branch -r | grep origin/polecat/\n2. For each branch ahead of main:\n - Check if polecat session is done (no session or at handoff prompt)\n - Check if bead is closed: bd show \u003cbead-id\u003e\n3. For ready branches, mail refinery:\n gt mail send \u003crig\u003e/refinery -s \"MERGE_READY: polecat/\u003cname\u003e\" -m \"Branch ready for merge\"\n\"\"\"\ndepends_on = [\"survey-polecats\"]\n```\n\n## Alternative: Polecat signals explicitly\nAdd to polecat workflow (end of work):\n```\ngt mail send \u003crig\u003e/refinery -s \"MERGE_READY\" -m \"polecat/\u003cname\u003e ready\"\ngt mail send \u003crig\u003e/witness -s \"WORK_COMPLETE\" -m \"\u003cbead-id\u003e done\"\n```\n\n## Why This Works\n- Witness/polecat agents make decisions\n- Mail is the coordination primitive\n- No daemon code needed\n\n## Files\n- formulas/mol-witness-patrol.formula.toml\n- formulas/mol-polecat-work.formula.toml (add completion signal)","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-27T16:41:04.857442-08:00","updated_at":"2025-12-27T21:29:45.756607-08:00","created_by":"mayor","dependencies":[{"issue_id":"gt-qpwv4","depends_on_id":"gt-u6siy","type":"relates-to","created_at":"2025-12-27T20:59:10.634142-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.756607-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qrze","title":"Work on gt-role-template: Refine witness/CLAUDE.md role t...","description":"Work on gt-role-template: Refine witness/CLAUDE.md role template. Add clearer heartbeat protocol, mail checking procedure, nudge decision criteria, escalation thresholds. Run 'bd show gt-role-template' for details.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T07:53:27.305468-08:00","updated_at":"2025-12-27T21:29:56.789455-08:00","deleted_at":"2025-12-27T21:29:56.789455-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qscw","title":"Merge: gt-72so","description":"branch: polecat/Ace\ntarget: main\nsource_issue: gt-72so\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T16:20:02.203335-08:00","updated_at":"2025-12-27T21:27:22.733773-08:00","deleted_at":"2025-12-27T21:27:22.733773-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-qsvq","title":"gt doctor: Detect and clean up orphaned sessions/processes","description":"## Problem\nOrphaned tmux sessions and Claude processes accumulate over time, consuming memory.\n\n## Discovered Orphans\n- gt-deacon: Idle Claude session, not part of any rig\n\n## Proposed Solution\n\n### gt doctor --check (default)\nDetect issues without fixing:\n- Orphan sessions (not matching rig/polecat/crew/witness/refinery/mayor pattern)\n- Claude processes without parent tmux session\n- Tmux sessions without Claude (stuck at bash prompt)\n- Polecats marked 'working' but session idle\n\n### gt doctor --fix\nClean up detected issues:\n- Kill orphan sessions\n- Kill orphan Claude processes\n- Optionally reset stuck polecat state\n\n### gt gc (alternative name)\nShort alias for cleanup operations.\n\n## Acceptance Criteria\n- [ ] Detects orphan sessions\n- [ ] Detects orphan processes\n- [ ] Safe cleanup (doesn't kill active work)\n- [ ] Reports what was cleaned","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-20T01:06:39.046959-08:00","updated_at":"2025-12-27T21:29:53.865024-08:00","deleted_at":"2025-12-27T21:29:53.865024-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-qsvv","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:45","description":"Patrol 9: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:45:20.853532-08:00","updated_at":"2025-12-27T21:26:05.112326-08:00","deleted_at":"2025-12-27T21:26:05.112326-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qswb","title":"bd mol current: soft cursor showing current/next step","description":"Add bd mol current command for molecule navigation orientation.\n\n## Usage\n\nbd mol current [mol-id]\n\nIf mol-id given, show status for that molecule.\nIf not given, infer from in_progress issues assigned to current agent.\n\n## Output\n\nYou're working on molecule gt-abc (Feature X)\n\n [checkmark] gt-abc.1: Design\n [checkmark] gt-abc.2: Scaffold \n [checkmark] gt-abc.3: Implement\n [arrow] gt-abc.4: Write tests [in_progress] \u003c- YOU ARE HERE\n [circle] gt-abc.5: Documentation\n [circle] gt-abc.6: Exit decision\n\nProgress: 3/6 steps complete\n\n## Key behaviors\n- Shows full molecule structure with status indicators\n- Highlights current in_progress step\n- If no in_progress, highlights first ready step\n- Works without explicit cursor tracking (inferred from state)\n\n## Implementation notes\n- Query children of mol-id\n- Sort by dependency order\n- Find first in_progress or first ready\n- Format with status indicators\n\n## Beads feature\nThis is a bd command - needs implementation in beads repo.","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-22T17:00:47.394852-08:00","updated_at":"2025-12-27T21:29:53.150009-08:00","deleted_at":"2025-12-27T21:29:53.150009-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-qvn7","title":"New Year's Launch: Single-Polecat Engine","description":"# Tracer Bullet: Single-Polecat Engine\n\nThe minimal viable Gas Town: one polecat, one issue, one complete cycle.\n\n## Vision\n\n```\nDeacon wakes → checks hook → spawns polecat → polecat works → polecat submits → loop\n```\n\nNo Witness automation. No Refinery automation. Just the core propulsion loop.\n\n## Key Insight: Molecules Are the Logic\n\nGo code is connector/glue. The cognition lives in molecules. When the Deacon\nwakes up, it reads its hook, finds a molecule, and WALKS IT. The molecule\nsays what to do. Go just provides the verbs (spawn, send, close).\n\n## The Propulsion Principle\n\n\u003e If you find something on your hook, YOU RUN IT.\n\nThis is the universal Gas Town rule. Every agent, every wake-up:\n1. Check hook\n2. If mol → execute it\n3. If nothing → wait for assignment\n\n## Critical Path\n\n1. **Slinging Handoff** - `gt sling \u003cbead\u003e` attaches work and restarts\n2. **Hook Execution** - Agent wakes, primes, reads hook, runs hook\n3. **Polecat Work Cycle** - Spawn → work → submit → complete\n4. **Deacon Patrol** - The loop that spawns polecats\n\n## Success Criteria\n\n- `gt sling gt-xxx` restarts agent with work attached\n- Agent wakes up and automatically begins work\n- Single polecat completes one issue end-to-end\n- Deacon can spawn polecats on demand\n\n## Non-Goals (v0.1)\n\n- Autonomous Witness nudging\n- Autonomous Refinery merging\n- Multi-polecat coordination\n- Self-healing from crashes\n\nThese are v0.2+. Tracer bullet first.","status":"tombstone","priority":0,"issue_type":"epic","created_at":"2025-12-24T15:33:45.930555-08:00","updated_at":"2025-12-27T21:29:45.431138-08:00","deleted_at":"2025-12-27T21:29:45.431138-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-qvn7.1","title":"Phase 1: Slinging Handoff","description":"# Phase 1: Slinging Handoff\n\nThe restart-and-resume loop. This is the fundamental propulsion mechanism.\n\n## What It Does\n\n```bash\ngt sling gt-xxx\n```\n\n1. Creates a wisp on the agent's hook containing the bead\n2. Sends handoff mail (optional context)\n3. Triggers session restart (daemon SIGUSR1 or direct respawn)\n4. Fresh Claude wakes up\n5. SessionStart hook runs `gt prime`\n6. Agent reads hook → finds the slung bead\n7. Agent executes the work\n\n## Why Wisp, Not Mol\n\nThe sling is ephemeral scaffolding. It says \"run this bead\" - nothing more.\nOnce the work is picked up, the wisp can be burned. No permanent record needed\nfor the handoff wrapper itself; the actual work is tracked in the bead.\n\n## Implementation\n\n### gt sling command\n\n```go\n// cmd: gt sling \u003cbead-id\u003e [-s subject] [-m message]\nfunc sling(beadID string, subject, message string) {\n // 1. Create wisp on hook\n // - Creates .beads-wisp/hook-\u003cagent\u003e.wisp.yaml\n // - Contains: bead_id, slung_at, context\n \n // 2. Optional: send handoff mail\n // - gt mail send \u003cself\u003e -s \"🤝 HANDOFF: \u003csubject\u003e\" -m \"\u003cmessage\u003e\"\n \n // 3. Trigger restart\n // - For crew: just exit (human restarts)\n // - For polecats: signal daemon (SIGUSR1)\n // - For deacon: signal daemon (SIGUSR1)\n}\n```\n\n### Hook Reading in gt prime\n\n```go\n// In gt prime, after loading context:\nfunc checkHook(agent string) *SlungWork {\n wispPath := \".beads-wisp/hook-\" + agent + \".wisp.yaml\"\n if exists(wispPath) {\n return parseWisp(wispPath)\n }\n return nil\n}\n```\n\n### Agent Response to Slung Work\n\nWhen agent wakes and finds slung work:\n\n1. Read the wisp: `hook-\u003cagent\u003e.wisp.yaml`\n2. Get the bead ID\n3. Show the bead: `bd show \u003cbead-id\u003e`\n4. Begin working on it\n5. Burn the wisp (it's been picked up)\n\n## Files Changed\n\n- - New command\n- - Wisp creation/reading\n- - Hook check on startup\n- Role CLAUDE.md files - Instructions for hook processing\n\n## Acceptance Criteria\n\n- [ ] creates wisp on hook\n- [ ] Agent restart picks up the slung work\n- [ ] Work begins automatically (no human nudge)\n- [ ] Wisp is burned after pickup","status":"tombstone","priority":0,"issue_type":"epic","created_at":"2025-12-24T15:34:11.051245-08:00","updated_at":"2025-12-27T21:29:45.422578-08:00","dependencies":[{"issue_id":"gt-qvn7.1","depends_on_id":"gt-qvn7","type":"parent-child","created_at":"2025-12-24T15:34:11.053313-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.422578-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-qvn7.1.1","title":"gt sling command implementation","description":"Implement the gt sling command for handoff with attached work.\n\n## Command Signature\n\n```bash\ngt sling \u003cbead-id\u003e [-s subject] [-m message]\n```\n\n## Behavior\n\n1. Validate the bead exists: bd show \u003cbead-id\u003e\n2. Create wisp on agent's hook:\n - File: .beads-wisp/hook-\u003cagent\u003e.wisp.yaml\n - Content: bead_id, slung_at, optional context\n3. Optionally send handoff mail with subject/message\n4. Exit cleanly (or signal restart)\n\n## Wisp Format\n\n```yaml\ntype: slung-work\nbead_id: gt-xxx\nslung_at: 2025-12-24T...\nslung_by: crew/joe\ncontext: \"Optional message\"\n```\n\n## Files to Create/Modify\n\n- cmd/gt/sling.go (new)\n- internal/wisp/wisp.go (may need creation)\n- internal/wisp/hook.go (hook reading)\n\n## Notes\n\n- Keep it simple - this is connector code\n- The wisp is burned after pickup\n- Sling should work for any role (crew, polecat, deacon)","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-24T15:36:14.175753-08:00","updated_at":"2025-12-27T21:29:45.379465-08:00","dependencies":[{"issue_id":"gt-qvn7.1.1","depends_on_id":"gt-qvn7.1","type":"parent-child","created_at":"2025-12-24T15:36:14.177841-08:00","created_by":"daemon"},{"issue_id":"gt-qvn7.1.1","depends_on_id":"gt-qvn7.1.2","type":"blocks","created_at":"2025-12-24T15:36:56.672678-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.379465-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qvn7.1.2","title":"Wisp directory and format","description":"Define and implement the .beads-wisp directory structure.\n\n## Directory Structure\n\n```\n\u003cclone\u003e/.beads-wisp/\n hook-\u003cagent\u003e.wisp.yaml # Slung work for this agent\n patrol-\u003cid\u003e.wisp.yaml # Patrol cycle state (future)\n```\n\n## Wisp File Format\n\n```yaml\ntype: slung-work | patrol-cycle\ncreated_at: ISO timestamp\ncreated_by: entity who created it\n\n# For slung-work:\nbead_id: gt-xxx\ncontext: optional message\n\n# For patrol-cycle (future):\nformula: mol-deacon-patrol\ncurrent_step: inbox-check\nstep_states: {...}\n```\n\n## Key Properties\n\n- NOT git tracked (.beads-wisp/ in .gitignore)\n- Ephemeral - burned after use\n- Local to each clone\n- Fast writes (no sync needed)\n\n## Implementation\n\n- internal/wisp/dir.go - Directory creation/management\n- internal/wisp/types.go - Wisp type definitions\n- internal/wisp/io.go - Read/write operations","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-24T15:36:25.061863-08:00","updated_at":"2025-12-27T21:29:45.370956-08:00","dependencies":[{"issue_id":"gt-qvn7.1.2","depends_on_id":"gt-qvn7.1","type":"parent-child","created_at":"2025-12-24T15:36:25.063924-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.370956-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qvn7.1.3","title":"Hook reading in gt prime","description":"Enhance gt prime to check and display hook contents.\n\n## Current Behavior\n\ngt prime loads context:\n- Reads CLAUDE.md\n- Injects beads workflow info\n- Shows status\n\n## New Behavior\n\nAfter loading context, check hook:\n\n```go\n// In prime.go\nfunc checkHook(agent string) {\n wispPath := \".beads-wisp/hook-\" + agent + \".wisp.yaml\"\n if wisp := readWisp(wispPath); wisp != nil {\n // Inject hook info into prime output\n showSlungWork(wisp)\n }\n}\n```\n\n## Output Format\n\nWhen hook has slung work:\n\n```markdown\n## Your Hook\n\nYou have work assigned:\n\n**Bead:** gt-xxx\n**Title:** Fix authentication bug\n**Priority:** P1\n\nContext from previous session:\n\u003e Continue where I left off...\n\n**BEGIN WORK NOW.** Claim it with:\n bd update gt-xxx --status=in_progress\n```\n\n## Files Modified\n\n- cmd/gt/prime.go - Add hook checking\n- internal/wisp/hook.go - Hook reading utilities","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-24T15:36:37.061178-08:00","updated_at":"2025-12-27T21:29:45.362536-08:00","dependencies":[{"issue_id":"gt-qvn7.1.3","depends_on_id":"gt-qvn7.1","type":"parent-child","created_at":"2025-12-24T15:36:37.063211-08:00","created_by":"daemon"},{"issue_id":"gt-qvn7.1.3","depends_on_id":"gt-qvn7.1.2","type":"blocks","created_at":"2025-12-24T15:36:56.759282-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.362536-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qvn7.1.4","title":"Wisp burn after pickup","description":"Implement wisp cleanup after work is picked up.\n\n## When to Burn\n\nAfter agent has picked up slung work:\n- Agent claims the issue (bd update --status=in_progress)\n- Or agent explicitly acknowledges the handoff\n- Wisp file is deleted\n\n## Implementation Options\n\nOption A: Agent burns manually\n```bash\ngt hook burn # Deletes current hook wisp\n```\n\nOption B: Auto-burn on claim\n- bd update detects there's a hook for this issue\n- Burns the wisp automatically\n\nOption C: Burn in prime (after display)\n- Once shown to agent, delete the wisp\n- Simple but might lose info if agent crashes before reading\n\n## Recommendation\n\nUse Option A (manual burn) for v0.1:\n- Explicit is better than implicit\n- Agent controls when they're done with handoff\n- Add auto-burn in v0.2\n\n## Command\n\n```bash\ngt hook burn\n# Deletes .beads-wisp/hook-\u003cagent\u003e.wisp.yaml\n# Confirms: \"Hook cleared\"\n```","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-24T15:36:48.284445-08:00","updated_at":"2025-12-27T21:29:52.605658-08:00","dependencies":[{"issue_id":"gt-qvn7.1.4","depends_on_id":"gt-qvn7.1","type":"parent-child","created_at":"2025-12-24T15:36:48.286893-08:00","created_by":"daemon"},{"issue_id":"gt-qvn7.1.4","depends_on_id":"gt-qvn7.1.3","type":"blocks","created_at":"2025-12-24T15:36:56.843124-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.605658-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qvn7.2","title":"Phase 2: Hook Execution Protocol","description":"# Phase 2: Hook Execution Protocol\n\nMaking agents actually RUN what's on their hook.\n\n## The Problem\n\nCurrently, agents wake up, get primed, and... wait. They don't automatically\nexecute their pinned molecule or slung work. The propulsion is missing.\n\n## The Solution\n\ngt prime should end with an ACTION, not just context loading:\n\n```\ngt prime output:\n 1. Load context (existing)\n 2. Check hook\n 3. If hook has work → tell agent to RUN IT NOW\n```\n\n## Hook Types\n\n| Hook Content | Action |\n|--------------|--------|\n| Pinned molecule (mol-*) | Execute current step, close step when done |\n| Slung bead (gt-xxx) | Show bead, begin work |\n| Nothing | Wait for assignment |\n\n## Molecule Execution Pattern\n\nWhen hook has a molecule:\n\n```markdown\n## Your Hook: mol-deacon-patrol\n\nCurrent step: inbox-check (gt-xxx)\nStatus: pending (ready to execute)\n\n**EXECUTE THIS STEP NOW.**\n\n### Step: inbox-check\n\nCheck the Mayor's inbox for messages...\n\nWhen complete, close the step:\n bd close gt-xxx\n\nThen check for next step:\n bd ready\n```\n\nThe agent sees this, does the step, closes the step bead.\n\n## Key Insight (from bd-hulf/bd-3zm7 rejection)\n\nMolecule execution state is DERIVED from which child beads are closed vs open.\nNo separate state.yaml files needed. bd mol current already does this.\n\n\"Advancing\" a molecule = closing the current step bead. Next step is\nwhichever child is now ready (unblocked).\n\n## Remaining Work\n\ngt-qvn7.2.3: Enhance gt prime to show molecule step prompt using bd mol current.\n\n## Files Changed\n\n- internal/cmd/prime.go - Hook check and action injection\n- Uses bd mol current for step discovery (already exists in beads)\n\n## Acceptance Criteria\n\n- gt prime shows current hook and prompts action\n- Agent closes step with bd close \u003cstep-id\u003e\n- bd ready / bd mol current shows next step\n- Agent naturally follows the execution flow","status":"tombstone","priority":0,"issue_type":"epic","created_at":"2025-12-24T15:34:32.037329-08:00","updated_at":"2025-12-27T21:29:45.413795-08:00","dependencies":[{"issue_id":"gt-qvn7.2","depends_on_id":"gt-qvn7","type":"parent-child","created_at":"2025-12-24T15:34:32.039415-08:00","created_by":"daemon"},{"issue_id":"gt-qvn7.2","depends_on_id":"gt-qvn7.1","type":"blocks","created_at":"2025-12-24T15:36:00.393831-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.413795-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-qvn7.2.1","title":"bd mol advance command","description":"## Cross-Repo Reference\n\nThis work is tracked in beads repo as **bd-3zm7**.\n\nThe gastown task gt-qvn7.2.1 is a placeholder. \nActual implementation happens in beads.\n\nSee: bd show bd-3zm7 (from beads/crew/dave)","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-24T15:37:10.413975-08:00","updated_at":"2025-12-27T21:29:45.353998-08:00","dependencies":[{"issue_id":"gt-qvn7.2.1","depends_on_id":"gt-qvn7.2","type":"parent-child","created_at":"2025-12-24T15:37:10.416191-08:00","created_by":"daemon"},{"issue_id":"gt-qvn7.2.1","depends_on_id":"gt-qvn7.2.2","type":"blocks","created_at":"2025-12-24T15:37:39.740685-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.353998-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qvn7.2.2","title":"Molecule state file management","description":"## Cross-Repo Reference\n\nThis work is tracked in beads repo as **bd-hulf**.\n\nThe gastown task gt-qvn7.2.2 is a placeholder.\nActual implementation happens in beads.\n\nSee: bd show bd-hulf (from beads/crew/dave)","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-24T15:37:21.389134-08:00","updated_at":"2025-12-27T21:29:45.345275-08:00","dependencies":[{"issue_id":"gt-qvn7.2.2","depends_on_id":"gt-qvn7.2","type":"parent-child","created_at":"2025-12-24T15:37:21.39098-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.345275-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qvn7.2.3","title":"gt prime molecule execution prompt","description":"Enhance gt prime to show molecule execution prompts.\n\n## When Hook Has Molecule\n\ngt prime should output:\n\n```markdown\n## Your Patrol: mol-deacon-patrol\n\n**Current Step:** inbox-check\n**Status:** pending (ready to execute)\n\n### Step: inbox-check\n\nHandle callbacks from agents.\n\nCheck the Mayor's inbox for messages from:\n- Witnesses reporting polecat status\n- Refineries reporting merge results\n...\n\n**EXECUTE THIS STEP NOW.**\n\nWhen complete, close the step:\n bd close \u003cstep-id\u003e\n\nThen check for next step:\n bd ready\n```\n\n## Logic\n\n1. Check if hook has pinned molecule (via handoff bead with attachment)\n2. Use bd mol current to find current step (derives from open children)\n3. Get step details from the bead itself\n4. Format execution prompt\n5. Include close command (NOT bd mol advance - that was rejected)\n\n## Key Insight (from bd-hulf rejection)\n\nMolecule execution state is DERIVED from which child beads are closed vs open.\nNo separate state.yaml files. bd mol current already does this correctly.\n\n\"Advancing\" a molecule = closing the current step bead. The next step\nis whichever child is now ready (unblocked).\n\n## Different from Slung Work\n\nSlung work (bead) says \"work on this issue\"\nMolecule says \"execute this step of this workflow\"\n\nBoth use the hook, but prompts differ.\n\n## Files\n\n- internal/cmd/prime.go\n- Uses bd mol current for step discovery","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-24T15:37:33.304687-08:00","updated_at":"2025-12-27T21:29:45.336537-08:00","dependencies":[{"issue_id":"gt-qvn7.2.3","depends_on_id":"gt-qvn7.2","type":"parent-child","created_at":"2025-12-24T15:37:33.306622-08:00","created_by":"daemon"},{"issue_id":"gt-qvn7.2.3","depends_on_id":"gt-qvn7.2.2","type":"blocks","created_at":"2025-12-24T15:37:39.825493-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.336537-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qvn7.3","title":"Phase 3: Polecat Work Cycle","description":"# Phase 3: Polecat Work Cycle\n\nThe complete lifecycle of a single polecat working a single issue.\n\n## The Cycle\n\n```\nspawn → wake → prime → read hook → work → submit → complete → cleanup\n```\n\n## Step by Step\n\n### 1. Spawn (Deacon/Mayor initiates)\n\n```bash\ngt spawn --issue gt-xxx --rig gastown\n```\n\nCreates fresh worktree, assigns issue, creates polecat session.\n\n### 2. Wake (Claude starts)\n\nPolecat Claude session starts. SessionStart hook fires.\n\n### 3. Prime (Context loads)\n\n```bash\ngt prime # Runs automatically via hook\n```\n\nLoads CLAUDE.md, injects beads context, checks hook.\n\n### 4. Read Hook\n\ngt prime finds the assigned issue on the hook:\n\n```\n## Your Hook: gt-xxx\n\nYou have been assigned this issue. Begin work now.\n\nTitle: Fix authentication bug\nPriority: P1\nDescription: ...\n\nClaim it with: bd update gt-xxx --status=in_progress\n```\n\n### 5. Work\n\nPolecat:\n- Claims issue: `bd update gt-xxx --status=in_progress`\n- Reads relevant code\n- Makes changes\n- Commits with issue ID\n\n### 6. Submit\n\nPolecat:\n- Pushes branch\n- Creates MR bead: `bd create \"Merge: gt-xxx\" --type=merge-request`\n- Signals completion\n\n### 7. Complete\n\nPolecat:\n- Closes the issue: `bd close gt-xxx --reason \"Fixed in commit abc123\"`\n- Syncs beads: `bd sync`\n\n### 8. Cleanup\n\nPolecat:\n- Sends completion mail to Witness\n- Session terminates (or awaits next assignment)\n\n## The mol-polecat-work Molecule\n\nThis is already defined. The polecat should be bonded to it on spawn.\n\nKey steps:\n1. orient - Read context, understand codebase\n2. claim - Update issue status\n3. work - Implement the fix\n4. test - Run tests, verify\n5. submit - Push, create MR\n6. handoff - Signal done, sync\n\n## Implementation\n\n### gt spawn enhancements\n\n- Automatically pin mol-polecat-work to new polecat\n- Pass issue ID as variable\n- Ensure hook is set before Claude starts\n\n### Polecat CLAUDE.md\n\nMust include:\n- Hook reading instructions\n- Molecule execution pattern\n- Exit state vocabulary (COMPLETED, BLOCKED, ESCALATE)\n\n## Acceptance Criteria\n\n- gt spawn creates polecat with issue on hook\n- Polecat wakes and begins work automatically\n- Polecat completes issue and submits MR\n- Issue is closed with commit reference","status":"tombstone","priority":0,"issue_type":"epic","created_at":"2025-12-24T15:34:53.710094-08:00","updated_at":"2025-12-27T21:29:45.405365-08:00","dependencies":[{"issue_id":"gt-qvn7.3","depends_on_id":"gt-qvn7","type":"parent-child","created_at":"2025-12-24T15:34:53.712193-08:00","created_by":"daemon"},{"issue_id":"gt-qvn7.3","depends_on_id":"gt-qvn7.2","type":"blocks","created_at":"2025-12-24T15:36:00.479432-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.405365-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-qvn7.3.1","title":"gt spawn pins mol-polecat-work","description":"Enhance gt spawn to automatically pin mol-polecat-work to new polecats.\n\n## Current Behavior\n\ngt spawn --issue gt-xxx creates polecat with issue in mail but no molecule.\n\n## New Behavior\n\ngt spawn should:\n1. Create fresh worktree (existing)\n2. Initialize polecat beads (existing)\n3. Bond mol-polecat-work with issue as variable\n4. Pin the molecule to polecat hook\n5. Create polecat session\n\n## Molecule Bonding\n\n```bash\nbd mol bond mol-polecat-work --var issue_id=gt-xxx --var rig=gastown\n```\n\nCreates .beads/molecules/mol-polecat-work-\u003chash\u003e.state.yaml\n\n## Hook Pinning\n\n```bash\nbd pin mol-polecat-work-\u003chash\u003e --for polecat/\u003cname\u003e\n```\n\nCreates .beads-wisp/hook-polecat-\u003cname\u003e.wisp.yaml pointing to molecule\n\n## Files\n\n- internal/spawn/spawn.go - Add mol-polecat-work bonding\n- Or: internal/cmd/spawn.go depending on structure","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-24T15:37:51.450108-08:00","updated_at":"2025-12-27T21:29:45.327987-08:00","dependencies":[{"issue_id":"gt-qvn7.3.1","depends_on_id":"gt-qvn7.3","type":"parent-child","created_at":"2025-12-24T15:37:51.452167-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.327987-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qvn7.3.2","title":"Polecat CLAUDE.md update for hook execution","description":"Update polecat role CLAUDE.md to include hook execution instructions.\n\n## Key Additions\n\n### Startup Protocol\n\n```markdown\n## Startup Protocol\n\nWhen you wake up:\n\n1. **Check your hook first**\n Run: gt prime\n This shows your assigned work.\n\n2. **If hook has molecule**\n You are running a workflow. Execute the current step.\n When done: bd mol advance \u003cmol-id\u003e\n\n3. **If hook has slung bead**\n Work on that specific issue.\n Claim it: bd update \u003cid\u003e --status=in_progress\n\n4. **If hook empty**\n Wait for assignment via mail or spawn.\n```\n\n### Work Completion\n\n```markdown\n## Completing Work\n\nWhen you finish your assigned issue:\n\n1. Commit changes: git add \u0026\u0026 git commit -m \"description (issue-id)\"\n2. Push branch: git push -u origin \u003cbranch\u003e\n3. Create MR bead: bd create \"Merge: \u003cissue-id\u003e\" --type=merge-request\n4. Close issue: bd close \u003cissue-id\u003e --reason \"Fixed in commit abc123\"\n5. Sync: bd sync\n6. Signal done: gt mail send witness/ -s \"LIFECYCLE: Done\" -m \"Completed \u003cissue-id\u003e\"\n```\n\n### Exit States\n\n```markdown\n## Exit States\n\nWhen finishing, signal your state:\n\n- COMPLETED: Work done successfully\n- BLOCKED: Cannot proceed, need help\n- ESCALATE: Problem beyond your scope\n- HANDOFF: Passing to successor\n```\n\n## File\n\ntemplates/polecat/CLAUDE.md","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-24T15:38:08.153576-08:00","updated_at":"2025-12-27T21:29:45.319448-08:00","dependencies":[{"issue_id":"gt-qvn7.3.2","depends_on_id":"gt-qvn7.3","type":"parent-child","created_at":"2025-12-24T15:38:08.155584-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.319448-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qvn7.3.3","title":"mol-polecat-work minimal formula","description":"Create minimal mol-polecat-work formula for tracer bullet.\n\n## Simplified Steps\n\nFor v0.1, just the essentials:\n\n```yaml\nformula: mol-polecat-work-v01\ndescription: Minimal polecat work cycle for tracer bullet.\nversion: 1\n\nvariables:\n - name: issue_id\n required: true\n - name: rig\n required: true\n\nsteps:\n - id: claim\n title: Claim the issue\n description: |\n Run: bd update {{issue_id}} --status=in_progress\n Read the issue: bd show {{issue_id}}\n Understand what needs to be done.\n\n - id: work\n title: Implement the fix\n needs: [claim]\n description: |\n Do the work described in the issue.\n Make your changes.\n Test them.\n Commit with issue ID: git commit -m \"description ({{issue_id}})\"\n\n - id: submit\n title: Submit for review\n needs: [work]\n description: |\n Push your branch: git push -u origin \u003cbranch\u003e\n Create MR: bd create \"Merge: {{issue_id}}\" --type=merge-request\n Close issue: bd close {{issue_id}}\n Sync: bd sync\n Signal done: gt mail send witness/ -s \"LIFECYCLE: Done\"\n```\n\n## Location\n\n.beads/formulas/mol-polecat-work-v01.formula.yaml\n\n## Notes\n\n- Keep it simple for tracer bullet\n- Full mol-polecat-work can be more elaborate\n- This proves the pattern works","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-24T15:38:20.520847-08:00","updated_at":"2025-12-27T21:29:45.31101-08:00","dependencies":[{"issue_id":"gt-qvn7.3.3","depends_on_id":"gt-qvn7.3","type":"parent-child","created_at":"2025-12-24T15:38:20.52293-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.31101-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qvn7.4","title":"Phase 4: Deacon Patrol Loop","description":"$(cat /tmp/phase4.md)","status":"tombstone","priority":0,"issue_type":"epic","created_at":"2025-12-24T15:35:32.511322-08:00","updated_at":"2025-12-27T21:29:45.396645-08:00","dependencies":[{"issue_id":"gt-qvn7.4","depends_on_id":"gt-qvn7","type":"parent-child","created_at":"2025-12-24T15:35:32.513848-08:00","created_by":"daemon"},{"issue_id":"gt-qvn7.4","depends_on_id":"gt-qvn7.2","type":"blocks","created_at":"2025-12-24T15:36:00.565162-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.396645-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-qvn7.4.1","title":"Deacon CLAUDE.md for patrol execution","description":"Update Deacon role CLAUDE.md for patrol execution.\n\n## Key Additions\n\n### Role Definition\n\n```markdown\n## Your Role: DEACON\n\nYou are the Deacon - Gas Towns heartbeat process.\nYou run mol-deacon-patrol continuously.\n\nYour job:\n- Check mail for callbacks\n- Spawn polecats for ready work\n- Monitor rig health (basic)\n- Cycle context when full\n```\n\n### Patrol Execution\n\n```markdown\n## Patrol Execution\n\nYou run mol-deacon-patrol on your hook.\n\nOn each heartbeat from the daemon:\n1. Check your current step: bd hook\n2. Execute that step\n3. Advance: bd mol advance mol-deacon-patrol\n4. If COMPLETE: restart the patrol or cycle context\n\nThe daemon keeps you alive. Your job is to run the patrol.\n```\n\n### Heartbeat Response\n\n```markdown\n## Heartbeat Response\n\nWhen you see \"HEARTBEAT: ...\" from the daemon:\n- If idle: check your hook, start/resume patrol\n- If mid-step: continue working\n- If waiting: the heartbeat reminds you to check state\n\nHeartbeats are your clock tick. Use them.\n```\n\n### Context Cycling\n\n```markdown\n## Context Cycling\n\nWhen context is high (\u003e80%):\n1. Write state to molecule\n2. Run: gt handoff -s \"Deacon cycling\" -m \"Context full\"\n3. Exit cleanly\n4. Daemon respawns you\n5. You wake, prime, read hook, continue\n```\n\n## File\n\ndeacon/CLAUDE.md (in town root)","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-24T15:38:35.345899-08:00","updated_at":"2025-12-27T21:29:45.30249-08:00","dependencies":[{"issue_id":"gt-qvn7.4.1","depends_on_id":"gt-qvn7.4","type":"parent-child","created_at":"2025-12-24T15:38:35.347599-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.30249-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qvn7.4.2","title":"mol-deacon-patrol minimal formula","description":"Create minimal mol-deacon-patrol formula for tracer bullet.\n\n## Simplified Patrol (v0.1)\n\n```yaml\nformula: mol-deacon-patrol-v01\ndescription: Minimal Deacon patrol for tracer bullet.\nversion: 1\n\nsteps:\n - id: inbox-check\n title: Check inbox\n description: |\n Run: gt mail inbox\n Handle any messages.\n Note any spawn requests or status updates.\n\n - id: spawn-work\n title: Spawn polecat for ready work\n needs: [inbox-check]\n description: |\n Run: bd ready\n If work available and no polecat already working it:\n gt spawn --issue \u003cfirst-ready\u003e --rig gastown\n If no work: proceed to next step.\n\n - id: self-inspect\n title: Assess context and decide loop vs cycle\n needs: [spawn-work]\n description: |\n Evaluate whether to loop (reset) or cycle (handoff).\n \n ## Heuristics\n \n **Favor LOOP (bd mol reset):**\n - Context \u003c 60%\n - No errors in this cycle\n - Patrol completed normally\n \n **Favor CYCLE (gt handoff):**\n - Context \u003e 80%\n - Errors encountered\n - Been running \u003e 1 hour without cycle\n - Model suggests refresh would help\n \n **Action:**\n - If LOOP: bd mol reset mol-deacon-patrol\n - If CYCLE: gt handoff -s \"Deacon cycling\" -m \"Context: X%, cycles: N\"\n```\n\n## Loop via Reset\n\nAfter self-inspect, if staying:\n```bash\nbd mol reset mol-deacon-patrol\n```\n\nThis resets all steps to pending, current_step to inbox-check.\nDeacon immediately continues with fresh patrol cycle.\n\n## Cycle via Handoff\n\nIf context high or refresh needed:\n```bash\ngt handoff -s \"Deacon cycling\" -m \"Context high, cycling for fresh start\"\n```\n\nDaemon respawns fresh Deacon with mol-deacon-patrol on hook.\n\n## Location\n\n.beads/formulas/mol-deacon-patrol-v01.formula.yaml","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-24T15:38:48.948745-08:00","updated_at":"2025-12-27T21:25:59.894595-08:00","dependencies":[{"issue_id":"gt-qvn7.4.2","depends_on_id":"gt-qvn7.4","type":"parent-child","created_at":"2025-12-24T15:38:48.950928-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:25:59.894595-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qvn7.4.3","title":"Deacon initialization with patrol molecule","description":"Ensure Deacon always has mol-deacon-patrol on hook.\n\n## The Problem\n\nWhen Deacon starts fresh, it needs:\n1. mol-deacon-patrol bonded\n2. Molecule pinned to hook\n3. Ready to execute\n\n## Solution: Bootstrap in gt prime\n\nWhen gt prime runs for role=deacon:\n\n1. Check if mol-deacon-patrol exists on hook\n2. If not: bond it and pin it\n3. Then show execution prompt\n\n```go\n// In prime.go for deacon role\nif role == \"deacon\" \u0026\u0026 !hasPatrolOnHook() {\n // Bond the patrol formula\n bondDeaconPatrol()\n // Pin to hook\n pinToHook(\"deacon\", patrolMolID)\n}\n```\n\n## Alternative: Deacon Startup Hook\n\nOr handle in Deacon SessionStart hook:\n- Check hook\n- If empty: gt mol bond mol-deacon-patrol \u0026\u0026 gt mol pin\n\n## Files\n\n- cmd/gt/prime.go (if handling in prime)\n- deacon/hooks/session-start.sh (if handling in hook)","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-24T15:39:00.670376-08:00","updated_at":"2025-12-27T21:29:45.293841-08:00","dependencies":[{"issue_id":"gt-qvn7.4.3","depends_on_id":"gt-qvn7.4","type":"parent-child","created_at":"2025-12-24T15:39:00.672391-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.293841-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qvn7.5","title":"Phase 5: Integration and Demo","description":"$(cat /tmp/phase5.md)","status":"tombstone","priority":0,"issue_type":"epic","created_at":"2025-12-24T15:35:52.803427-08:00","updated_at":"2025-12-27T21:29:45.387845-08:00","dependencies":[{"issue_id":"gt-qvn7.5","depends_on_id":"gt-qvn7","type":"parent-child","created_at":"2025-12-24T15:35:52.805435-08:00","created_by":"daemon"},{"issue_id":"gt-qvn7.5","depends_on_id":"gt-qvn7.3","type":"blocks","created_at":"2025-12-24T15:36:00.648003-08:00","created_by":"daemon"},{"issue_id":"gt-qvn7.5","depends_on_id":"gt-qvn7.4","type":"blocks","created_at":"2025-12-24T15:36:00.730985-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.387845-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-qvn7.5.1","title":"End-to-end integration test","description":"Create and run end-to-end integration test.\n\n## Test Scenario\n\n1. Clean state: no polecats, fresh beads\n2. Create test issue: simple documentation fix\n3. Verify Deacon sees it: bd ready shows issue\n4. Trigger Deacon: send heartbeat or wait\n5. Verify spawn: gt polecats shows new polecat\n6. Watch polecat work: tmux attach\n7. Verify completion: issue closed, MR created\n\n## Test Script\n\n```bash\n#!/bin/bash\n# e2e-test.sh\n\n# 1. Clean state\ngt polecat remove --all\nbd create \"Test: update README typo\" --type=task --priority=1\n\n# 2. Verify ready\nbd ready | grep \"update README\"\n\n# 3. Wait for Deacon or trigger\n# (manual for v0.1)\n\n# 4. Watch\ngt status\ngt polecats gastown\n\n# 5. Wait for completion\n# (manual observation for v0.1)\n\n# 6. Verify\nbd list --status=closed | grep \"update README\"\n```\n\n## Success Criteria\n\n- Issue created and visible in bd ready\n- Polecat spawned and working\n- Issue closed with commit reference\n- No manual intervention (except observation)","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-24T15:39:20.642596-08:00","updated_at":"2025-12-27T21:29:45.285264-08:00","dependencies":[{"issue_id":"gt-qvn7.5.1","depends_on_id":"gt-qvn7.5","type":"parent-child","created_at":"2025-12-24T15:39:20.64439-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.285264-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qvn7.5.2","title":"Documentation humility pass","description":"Update documentation with humble framing.\n\n## Changes Needed\n\n### README.md\n\n- Add \"Experimental\" or \"Alpha\" badge\n- Clarify current state vs future vision\n- List what works vs what is planned\n- Include honest limitations section\n\n### Architecture docs\n\n- Mark autonomous features as \"planned\"\n- Distinguish implemented from designed\n- Add \"Current Status\" sections\n\n### CLAUDE.md files\n\n- Remove claims about autonomous operation (where not true)\n- Add \"v0.1\" qualifiers to features\n- Be honest about manual steps required\n\n## Tone\n\nNOT: \"Gas Town automatically manages agent lifecycles\"\nYES: \"Gas Town provides infrastructure for agent lifecycle management.\n v0.1 requires some manual coordination.\"\n\nNOT: \"The Refinery processes merges autonomously\"\nYES: \"The Refinery is designed for autonomous merge processing.\n v0.1 demonstrates the pattern with manual triggering.\"\n\n## Files to Review\n\n- README.md\n- docs/architecture.md\n- docs/quickstart.md (if exists)\n- CLAUDE.md files for all roles","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-24T15:39:32.215796-08:00","updated_at":"2025-12-27T21:29:52.597325-08:00","deleted_at":"2025-12-27T21:29:52.597325-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qvn7.5.3","title":"Demo recording","description":"Record a terminal demo of the tracer bullet.\n\n## Recording Tool\n\nUse asciinema:\n```bash\nbrew install asciinema\nasciinema rec gastown-demo.cast\n```\n\n## Demo Script\n\n1. Show gt status (the town)\n2. Show bd ready (available work)\n3. Show Deacon running (tmux session)\n4. Trigger spawn (create issue, Deacon picks it up)\n5. Show polecat working (tmux attach briefly)\n6. Show completion (bd list, issue closed)\n7. Show MR created (bd list --type=merge-request)\n\n## Tips\n\n- Practice run first\n- Keep it under 3 minutes\n- Add commentary (terminal echo or voiceover)\n- Show the key moments clearly\n\n## Output\n\n- gastown-demo.cast (asciinema file)\n- gastown-demo.gif (optional, for embedding)\n- YouTube/Vimeo upload (optional)","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-24T15:39:43.630035-08:00","updated_at":"2025-12-27T21:29:52.589159-08:00","dependencies":[{"issue_id":"gt-qvn7.5.3","depends_on_id":"gt-qvn7.5","type":"parent-child","created_at":"2025-12-24T15:39:43.632383-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.589159-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qwyu","title":"Test issue for spawn molecule","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T21:58:43.699993-08:00","updated_at":"2025-12-27T21:29:56.453323-08:00","deleted_at":"2025-12-27T21:29:56.453323-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-qx3k","title":"Merge: gt-jzot","description":"branch: polecat/nux\ntarget: main\nsource_issue: gt-jzot\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T22:55:45.3321-08:00","updated_at":"2025-12-27T21:27:22.534987-08:00","deleted_at":"2025-12-27T21:27:22.534987-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-qxei","title":"Test4","description":"test4 body","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T17:47:12.137051-08:00","updated_at":"2025-12-27T21:29:56.74451-08:00","deleted_at":"2025-12-27T21:29:56.74451-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"message"}
{"id":"gt-qz2l","title":"Refinery patrol: Add banners and wisp-based execution","description":"Bring Refinery patrol up to Deacon's level of sophistication:\n\n## Current state\n- mol-refinery-patrol exists (needs verification)\n- Basic merge queue processing\n\n## Needed\n1. **Banners** - Print step banners like Deacon does:\n ```\n ═══════════════════════════════════════════════════════════════\n ⚗️ QUEUE-CHECK\n Processing merge queue entries\n ═══════════════════════════════════════════════════════════════\n ```\n\n2. **Wisp-based execution** - Spawn patrol as wisp, squash when complete\n3. **Handoff bead attachment** - Refinery needs its own handoff bead with attached_molecule\n4. **Loop-or-exit step** - Context-aware cycling like Deacon\n5. **Patrol summary banner** at end of each cycle\n\n## Reference\nSee Deacon patrol implementation in ~/gt/deacon/CLAUDE.md","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T13:19:40.777589-08:00","updated_at":"2025-12-27T21:29:53.015552-08:00","dependencies":[{"issue_id":"gt-qz2l","depends_on_id":"gt-y481","type":"parent-child","created_at":"2025-12-23T13:20:15.787696-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.015552-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-r01","title":"EXTERNAL: Beads Messaging \u0026 Knowledge Graph (bd-kwro)","description":"Tracking issue for external dependency on Beads v0.30.2 messaging features.\n\nBeads epic: bd-kwro in ~/src/beads (steveyegge/beads repo)\n\nThis blocks GGT work that depends on:\n- bd mail send/inbox/read/ack commands\n- message issue type\n- replies_to threading\n- Hooks system for notifications\n- Identity configuration\n\nGGT mail commands will be thin wrappers around bd mail once available.\n\nWhen bd-kwro ships in Beads v0.30.2, close this and unblock dependent work.","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-16T13:12:02.676883-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-r099o","title":"Merge: imperator-1767106079026","description":"branch: polecat/imperator-1767106079026\ntarget: main\nsource_issue: imperator-1767106079026\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T10:46:40.45114-08:00","updated_at":"2025-12-31T14:03:14.671468-08:00","closed_at":"2025-12-31T14:03:14.671468-08:00","close_reason":"Stale MR - no branch","created_by":"gastown/polecats/imperator"}
{"id":"gt-r56e4","title":"Digest: mol-deacon-patrol","description":"Patrol 1: All agents healthy, 1 message processed, no orphans","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:02:47.692316-08:00","updated_at":"2025-12-27T21:26:03.050494-08:00","deleted_at":"2025-12-27T21:26:03.050494-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-r6td","title":"gt spawn: Notify Deacon and Witness on polecat start","description":"When gt spawn creates a polecat, it should mail both Deacon and Witness:\n\n```\ngt mail send \u003crig\u003e/witness -s 'POLECAT_STARTED furiosa' -m 'Issue: gt-xxx'\ngt mail send deacon/ -s 'POLECAT_STARTED gastown/furiosa' -m 'Issue: gt-xxx'\n```\n\nThis enables:\n- Witness to bond a lease to its patrol wisp\n- Deacon to verify worker started (redundancy)\n- Both to nudge if worker is idle at prompt\n\nPart of the village self-monitoring architecture.","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-22T22:01:11.790203-08:00","updated_at":"2025-12-27T21:29:53.108254-08:00","deleted_at":"2025-12-27T21:29:53.108254-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-r73s","title":"Merge: gt-5wb7","description":"branch: polecat/nux\ntarget: main\nsource_issue: gt-5wb7\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T16:43:40.018286-08:00","updated_at":"2025-12-27T21:27:22.576429-08:00","deleted_at":"2025-12-27T21:27:22.576429-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-r7jj2","title":"Move mol run semantics from bd into gt spawn","description":"Currently gt spawn shells out to bd mol run for the pour+assign+pin combo. This blurs the architectural boundary between bd (data layer) and gt (orchestration layer).\n\n## Current State\n- bd mol run = bd pour + bd update --status=in_progress + bd pin --for me\n- gt spawn calls: exec.Command(\"bd\", \"mol\", \"run\", ...)\n\n## Proposed Change\nMove the combo logic into gt spawn directly:\n1. gt spawn calls bd pour (create issues from template)\n2. gt spawn calls bd update --status=in_progress (claim work)\n3. gt spawn calls bd pin (set pinned flag)\n4. gt spawn does session/tmux orchestration\n\nThis keeps bd as pure data operations and gt as the orchestration layer.\n\n## Why\n- Cleaner separation of concerns\n- bd mol run feels like orchestration but it's in the data layer\n- gt should own the \"start a workflow\" semantics\n\n## Coordination\nbeads side will deprecate bd mol run once gt no longer depends on it.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:59:01.873712-08:00","updated_at":"2025-12-27T21:29:55.351339-08:00","deleted_at":"2025-12-27T21:29:55.351339-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-r8ej","title":"Implement pinned beads for handoff state","description":"Add pinned bead support to beads:\n- Pinned beads never close, only update\n- Use for persistent state like Refinery handoffs\n- bd create --pinned flag\n- bd list --pinned to find them\n- Update description to change state","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T18:09:08.019293-08:00","updated_at":"2025-12-27T21:29:53.915582-08:00","dependencies":[{"issue_id":"gt-r8ej","depends_on_id":"gt-ktal","type":"blocks","created_at":"2025-12-19T18:09:39.340915-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.915582-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-r9ng","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:46","description":"Patrol 14: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:46:56.641819-08:00","updated_at":"2025-12-27T21:26:05.079324-08:00","deleted_at":"2025-12-27T21:26:05.079324-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rana","title":"Patrol System: Agent lifecycle loops with attachments","description":"Enable Gas Town agents to run continuous patrols, survive crashes, and hand off work across sessions.\n\n## Core Concepts\n- **Attachments**: Molecules bound to agent's pinned bead until complete\n- **Patrols**: Cyclic molecules that loop (deacon, witness, refinery)\n- **Quiescent**: Agents that sleep until triggered (witness, refinery)\n\n## Design Doc\nSee docs/patrol-system-design.md\n\n## Phases\nPhase 1: Foundation (attachment field, daemon detection, mol-deacon-patrol)\nPhase 2: Quiescent Agents (wake triggers, witness/refinery patrols)\nPhase 3: Callbacks and Plugins (mail protocol, plugin runner)\nPhase 4: Polish (gt patrol status, metrics, tuning)\n\n## Key Decisions\n- Attachment as field on pinned bead (not edge type, for now)\n- Mail-based orchestration for all callbacks\n- Queue replacement for heartbeat delivery (not pile-up)\n- Burn and respawn for patrol loops (not in-place reset)","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-21T13:38:23.416949-08:00","updated_at":"2025-12-27T21:29:53.568581-08:00","deleted_at":"2025-12-27T21:29:53.568581-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-rana.1","title":"Phase 1.1: Attachment field on pinned beads","description":"Add attached_molecule field to pinned bead schema.\n\n## Schema Change\nAdd to handoff/pinned bead issues:\n- attached_molecule: string (root issue ID of attached mol)\n- attached_at: timestamp\n\n## Implementation\n- Update beads Issue struct if needed\n- Or: use labels/metadata field\n- Ensure bd show displays attachment info\n\n## Acceptance\n- Can set/clear attachment on a bead\n- bd show displays attachment status","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T13:38:34.251531-08:00","updated_at":"2025-12-27T21:29:53.560114-08:00","dependencies":[{"issue_id":"gt-rana.1","depends_on_id":"gt-rana","type":"parent-child","created_at":"2025-12-21T13:38:34.253378-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.560114-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rana.2","title":"Phase 1.2: Daemon attachment detection","description":"Daemon polls Deacon's pinned bead to detect naked state.\n\n## Implementation\n- Daemon knows Deacon's pinned bead ID\n- Every heartbeat cycle: bd show \u003cpinned\u003e --json\n- Parse attached_molecule field\n- If null/empty: Deacon is naked\n\n## Actions on Naked\n- Spawn mol-deacon-patrol\n- Attach to Deacon's pinned\n- Nudge Deacon to start\n\n## Failsafes\n- Keepalive file monitoring\n- Stale detection (\u003e10min no progress)\n- Force nudge on stale\n\nDepends: gt-rana.1","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T13:38:53.520213-08:00","updated_at":"2025-12-27T21:29:53.551771-08:00","dependencies":[{"issue_id":"gt-rana.2","depends_on_id":"gt-rana","type":"parent-child","created_at":"2025-12-21T13:38:53.521672-08:00","created_by":"daemon"},{"issue_id":"gt-rana.2","depends_on_id":"gt-rana.1","type":"blocks","created_at":"2025-12-21T13:39:11.333008-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.551771-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rana.3","title":"Phase 1.3: mol-deacon-patrol definition","description":"Define the Deacon patrol molecule in builtin_molecules.go.\n\n## Steps\n1. inbox-check - Handle callbacks from agents\n2. health-scan - Ping Witnesses and Refineries\n3. plugin-run - Execute registered plugins\n4. orphan-check - Find abandoned work\n5. session-gc - Clean dead sessions\n6. context-check - Check own context limit\n7. loop-or-exit - Burn and let daemon respawn, or exit if context high\n\n## Implementation\n- Add DeaconPatrolMolecule() to builtin_molecules.go\n- Add to BuiltinMolecules() list\n- Test with bd mol show mol-deacon-patrol","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T13:38:54.555938-08:00","updated_at":"2025-12-27T21:25:59.969445-08:00","dependencies":[{"issue_id":"gt-rana.3","depends_on_id":"gt-rana","type":"parent-child","created_at":"2025-12-21T13:38:54.55744-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:25:59.969445-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rana.4","title":"Phase 1.4: Basic patrol runner in Deacon","description":"Deacon CLAUDE.md and prime context for patrol execution.\n\n## Deacon Context\n- Update Deacon CLAUDE.md with patrol instructions\n- On wake: check pinned bead for attachment\n- If attached: resume molecule from current step\n- Execute steps, close each when done\n- On final step: burn molecule, go naked\n\n## gt prime Enhancement\n- Detect if agent has attached molecule\n- Show molecule progress in prime output\n- Include patrol-specific instructions\n\nDepends: gt-rana.3","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T13:38:55.457748-08:00","updated_at":"2025-12-27T21:29:53.543442-08:00","dependencies":[{"issue_id":"gt-rana.4","depends_on_id":"gt-rana","type":"parent-child","created_at":"2025-12-21T13:38:55.459852-08:00","created_by":"daemon"},{"issue_id":"gt-rana.4","depends_on_id":"gt-rana.3","type":"blocks","created_at":"2025-12-21T13:39:11.402784-08:00","created_by":"daemon"},{"issue_id":"gt-rana.4","depends_on_id":"gt-3x0z.8","type":"blocks","created_at":"2025-12-21T15:20:16.297449-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.543442-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rana.5","title":"Phase 2: Quiescent agents (Witness, Refinery)","description":"Enable Witness and Refinery to sleep until triggered.\n\n## Scope\n- Wake triggers: gt spawn, MR submit, mail, Deacon ping\n- Wake SLA: \u003c1 minute\n- mol-witness-patrol and mol-refinery-patrol definitions\n- Quiescent entry/exit protocol\n\n## Implementation\n- Kill tmux session on quiescent entry\n- Preserve sandbox (worktree, state files)\n- Restart session on wake trigger\n- Agent checks pinned, spawns default patrol if naked\n\nDepends: Phase 1 complete","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T13:39:12.926917-08:00","updated_at":"2025-12-27T21:29:56.653512-08:00","dependencies":[{"issue_id":"gt-rana.5","depends_on_id":"gt-rana","type":"parent-child","created_at":"2025-12-21T13:39:12.928741-08:00","created_by":"daemon"},{"issue_id":"gt-rana.5","depends_on_id":"gt-rana.4","type":"blocks","created_at":"2025-12-21T13:39:23.067953-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.653512-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rana.6","title":"Phase 3: Callbacks and plugins","description":"Mail-based callback protocol and plugin surface.\n\n## Callbacks\n- Polecat → Witness: shutdown requests\n- Witness → Deacon: escalations, status\n- Crew → Deacon: recycle requests\n- Deacon → Mayor: escalations\n\n## Plugins\n- .beads/plugins.yaml registry\n- Cooldown tracking\n- Plugin runner in deacon patrol\n\nDepends: Phase 2","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T13:39:14.120827-08:00","updated_at":"2025-12-27T21:29:56.645105-08:00","dependencies":[{"issue_id":"gt-rana.6","depends_on_id":"gt-rana","type":"parent-child","created_at":"2025-12-21T13:39:14.122713-08:00","created_by":"daemon"},{"issue_id":"gt-rana.6","depends_on_id":"gt-rana.5","type":"blocks","created_at":"2025-12-21T13:39:23.138268-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.645105-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rana.7","title":"Phase 4: Polish and observability","description":"Production readiness.\n\n## Commands\n- gt patrol status - Show patrol state for all agents\n- gt patrol history - Recent patrol activity\n\n## Observability\n- Metrics collection\n- Alert thresholds\n- Dashboard (optional)\n\n## Tuning\n- Cooldown optimization\n- Wake SLA verification\n- Error threshold tuning\n\nDepends: Phase 3","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-21T13:39:15.421608-08:00","updated_at":"2025-12-27T21:29:57.597569-08:00","dependencies":[{"issue_id":"gt-rana.7","depends_on_id":"gt-rana","type":"parent-child","created_at":"2025-12-21T13:39:15.423245-08:00","created_by":"daemon"},{"issue_id":"gt-rana.7","depends_on_id":"gt-rana.6","type":"blocks","created_at":"2025-12-21T13:39:23.209512-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.597569-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rbacd","title":"Digest: mol-deacon-patrol","description":"Patrol 20: clear, handoff","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:18:37.41487-08:00","updated_at":"2025-12-27T21:26:00.937721-08:00","deleted_at":"2025-12-27T21:26:00.937721-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rbasx","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 5: routine, healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:23:16.720779-08:00","updated_at":"2025-12-27T21:26:01.933659-08:00","deleted_at":"2025-12-27T21:26:01.933659-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rbp6.1","title":"Test Polecat Arm","description":"Test child for bonding pattern","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T21:32:00.648392-08:00","updated_at":"2025-12-27T21:29:55.403481-08:00","deleted_at":"2025-12-27T21:29:55.403481-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rectf","title":"Digest: mol-deacon-patrol","description":"P11","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:25:17.043804-08:00","updated_at":"2025-12-27T21:26:01.65427-08:00","deleted_at":"2025-12-27T21:26:01.65427-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rgd9x","title":"Deprecate hook files, use pinned beads for propulsion","description":"## Problem\n\nHook files were \"deprecated\" via comments but NOT eradicated. The code is all still there and actively used.\n\n## Current State (NOT Done)\n\n**Live hook file exists:**\n- `/Users/stevey/gt/deacon/.beads/hook-deacon.json`\n\n**Code still calling hook functions:**\n- `resume.go:151` - WriteSlungWork\n- `park.go:114` - ReadHook \n- `spawn.go:407` - WriteSlungWork\n- `molecule_step.go:293` - WriteSlungWork\n- `molecule_step.go:368` - BurnHook\n\n**Hook infrastructure still exists:**\n- `internal/wisp/io.go` - All hook I/O functions\n- `internal/wisp/types.go` - SlungWork, Wisp types, constants\n- `.beads/.gitignore` - hook-*.json pattern\n\n**14 docs still reference hooks:**\narchitecture.md, beads-data-plane.md, molecular-chemistry.md, molecules.md,\npinned-beads-design.md, polecat-lifecycle.md, polecat-wisp-architecture.md,\npropulsion-principle.md, session-lifecycle.md, sling-design.md, vision.md,\nwisp-architecture.md, witness-patrol-design.md, ~/gt/CLAUDE.md\n\n## Goal: Total Eradication\n\n1. Remove ALL hook file code from callers\n2. Delete hook functions from internal/wisp/\n3. Delete hook types and constants\n4. Remove gitignore patterns\n5. Delete any live hook files, convert to pinned beads\n6. Update all documentation\n7. Verify propulsion cycle works with pinned beads only","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-26T15:36:31.396028-08:00","updated_at":"2025-12-27T21:29:45.86571-08:00","deleted_at":"2025-12-27T21:29:45.86571-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-rgd9x.1","title":"Remove hook code from cmd callers","description":"Remove hook file calls from:\n- resume.go:151 - WriteSlungWork → use bd update --status=pinned\n- park.go:114 - ReadHook → query pinned beads\n- spawn.go:407 - WriteSlungWork → use bd update --status=pinned\n- molecule_step.go:293 - WriteSlungWork → use bd update --status=pinned\n- molecule_step.go:368 - BurnHook → use bd update --status=open","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-26T16:13:26.012493-08:00","updated_at":"2025-12-27T21:29:45.848267-08:00","dependencies":[{"issue_id":"gt-rgd9x.1","depends_on_id":"gt-rgd9x","type":"parent-child","created_at":"2025-12-26T16:13:26.012978-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.848267-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rgd9x.2","title":"Delete hook functions from internal/wisp","description":"Delete from internal/wisp/io.go:\n- WriteSlungWork\n- ReadHook\n- BurnHook\n- HasHook\n- ListHooks\n- HookPath\n\nKeep: EnsureDir, WispPath, writeJSON (used by other things)","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-26T16:13:27.392457-08:00","updated_at":"2025-12-27T21:29:45.840027-08:00","dependencies":[{"issue_id":"gt-rgd9x.2","depends_on_id":"gt-rgd9x","type":"parent-child","created_at":"2025-12-26T16:13:27.396963-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.840027-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rgd9x.3","title":"Delete hook types from internal/wisp","description":"Delete from internal/wisp/types.go:\n- WispType type\n- TypeSlungWork constant\n- HookPrefix, HookSuffix constants\n- Wisp struct\n- SlungWork struct\n- NewSlungWork function\n- HookFilename function\n- AgentFromHookFilename function\n\nThe entire file may become empty or minimal.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-26T16:13:28.450647-08:00","updated_at":"2025-12-27T21:29:45.831852-08:00","dependencies":[{"issue_id":"gt-rgd9x.3","depends_on_id":"gt-rgd9x","type":"parent-child","created_at":"2025-12-26T16:13:28.452579-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.831852-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rgd9x.4","title":"Clean up hook-related gitignore and live files","description":"- Remove hook-*.json from .beads/.gitignore\n- Delete /Users/stevey/gt/deacon/.beads/hook-deacon.json\n- Convert gt-wisp-0gu to pinned bead if needed","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-26T16:13:29.60532-08:00","updated_at":"2025-12-27T21:29:45.823418-08:00","dependencies":[{"issue_id":"gt-rgd9x.4","depends_on_id":"gt-rgd9x","type":"parent-child","created_at":"2025-12-26T16:13:29.605807-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.823418-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rgd9x.5","title":"Update all documentation to remove hook references","description":"14 docs reference hooks:\n- docs/architecture.md\n- docs/beads-data-plane.md\n- docs/molecular-chemistry.md\n- docs/molecules.md\n- docs/pinned-beads-design.md\n- docs/polecat-lifecycle.md\n- docs/polecat-wisp-architecture.md\n- docs/propulsion-principle.md\n- docs/session-lifecycle.md\n- docs/sling-design.md\n- docs/vision.md\n- docs/wisp-architecture.md\n- docs/witness-patrol-design.md\n- ~/gt/CLAUDE.md\n\nUpdate all to reflect pinned beads as the sole propulsion mechanism.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T16:13:30.654895-08:00","updated_at":"2025-12-27T21:29:54.825636-08:00","dependencies":[{"issue_id":"gt-rgd9x.5","depends_on_id":"gt-rgd9x","type":"parent-child","created_at":"2025-12-26T16:13:30.658462-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.825636-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rhfji","title":"gt up: set BD_ACTOR env var when spawning agents","description":"When gt spawns agents (polecats, crew, patrol roles), it should set BD_ACTOR env var so that bd commands (like `bd hook`) know the agent identity.\n\nCurrently gt sets GT_ROLE but bd can't check that (no coupling to gt). bd already checks BD_ACTOR, so gt just needs to set it.\n\nPlaces to update:\n- `gt up` for polecats/crew\n- Deacon, Witness, Refinery spawning\n- Any other agent spawn points\n\nRelated: closed bd-fej5 in beads repo as obsolete (fix belongs here).","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T12:41:21.092758-08:00","updated_at":"2025-12-27T21:29:55.223663-08:00","deleted_at":"2025-12-27T21:29:55.223663-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rig-test","title":"test-rig","description":"Test rig identity bead","status":"tombstone","priority":2,"issue_type":"rig","created_at":"2026-01-06T18:51:32.532241-08:00","updated_at":"2026-01-06T18:51:48.141533-08:00","created_by":"gastown/polecats/furiosa","deleted_at":"2026-01-06T18:51:48.141533-08:00","deleted_by":"gastown/polecats/furiosa","delete_reason":"delete","original_type":"rig"}
{"id":"gt-rivr","title":"Activity Feed TUI","description":"Terminal UI for browsing Gas Town activity. Shows hierarchical view of rigs, workers, and their current state. Features: live updates, expandable details, molecule progress (step X of Y). Built with bubbletea/lipgloss. Design doc: history/activity-feed-tui-design.md","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-23T16:26:45.667677-08:00","updated_at":"2025-12-27T21:29:52.948339-08:00","dependencies":[{"issue_id":"gt-rivr","depends_on_id":"gt-3pm0f","type":"relates-to","created_at":"2025-12-28T09:34:57.509657-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.948339-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-rixa","title":"Bug: parseLifecycleRequest always matches 'cycle' due to LIFECYCLE prefix","description":"In lifecycle.go, parseLifecycleRequest checks strings.Contains(title, \"cycle\") first, but the prefix \"LIFECYCLE:\" contains the word \"cycle\". This means ALL lifecycle messages match the cycle action, making restart and shutdown unreachable. Fix: check for restart/shutdown before cycle, or use word boundaries.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-19T16:17:16.083512-08:00","updated_at":"2025-12-27T21:29:56.958825-08:00","deleted_at":"2025-12-27T21:29:56.958825-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-rm3","title":"CLI: gt refinery commands (start, stop, status, queue)","description":"CLI commands for managing the Refinery agent.\n\n## Commands\n\n```bash\ngt refinery start \u003crig\u003e # Start refinery for a rig\ngt refinery stop \u003crig\u003e # Stop refinery\ngt refinery status \u003crig\u003e # Show refinery status\ngt refinery queue \u003crig\u003e # Show merge queue\n```\n\n## gt refinery start\n\nStarts the Refinery daemon for the specified rig.\n\nOptions:\n- --foreground: Run in foreground (default: background)\n- --auto-merge: Enable auto-merge (default: from config)\n\n## gt refinery stop\n\nStops a running Refinery. Gracefully finishes current MR if processing.\n\n## gt refinery status\n\nShows:\n- Running state (running/stopped)\n- Current MR being processed (if any)\n- Queue length\n- Last merge time\n- Recent activity\n\n## gt refinery queue\n\nShows the merge queue:\n```\nMerge queue for 'wyvern':\n 1. [pending] Toast/polecat-auth-fix (15m ago)\n 2. [pending] Capable/polecat-new-feature (5m ago)\n \n1 merged today, 0 rejected\n```\n\n## Implementation\n\nUses gt-ov2 (Refinery agent) for daemon functionality.","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-15T23:22:24.754361-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-rm3","depends_on_id":"gt-ov2","type":"blocks","created_at":"2025-12-15T23:22:30.679909-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-ro5hc","title":"Digest: mol-deacon-patrol","description":"Patrol 20: all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:02:56.374874-08:00","updated_at":"2025-12-27T21:26:00.448545-08:00","deleted_at":"2025-12-27T21:26:00.448545-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ronyn","title":"CRITICAL: gt doctor orphan-processes kills active crew sessions","description":"## Problem\n\nThe deacon's patrol runs `gt doctor --fix` which includes an orphan-processes check. This check is incorrectly identifying ACTIVE crew sessions as orphaned and killing them.\n\n## Impact\n\n- Crew workers are being killed mid-session\n- Work is lost\n- The deacon is currently disabled until this is fixed\n\n## Root Cause\n\nThe orphan-processes detection logic in `internal/doctor/` doesn't properly identify crew workers. It likely:\n1. Doesn't recognize the `gt-\u003crig\u003e-\u003cname\u003e` session pattern for crew\n2. Or misidentifies the Claude process as orphaned when it has a valid tmux parent\n\n## Workaround\n\nThe patrol formula was updated to instruct deacon not to run `--fix` if orphan-processes reports any PIDs. But this workaround failed - the deacon still killed crew.\n\n## Fix Required\n\n1. Update `internal/doctor/identity_check.go` or `branch_check.go` to properly detect crew sessions\n2. Add crew session patterns to the valid session detection\n3. Add test coverage for crew worker scenarios\n\n## Files\n\n- internal/doctor/identity_check.go\n- internal/doctor/branch_check.go \n- internal/cmd/doctor.go\n\n## Related\n\n- Commit 3452b041 attempted to fix via formula instructions but failed\n- The digest hq-yedg claimed to file gt-vj5zc but never did","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-27T00:14:05.870126-08:00","updated_at":"2025-12-27T21:29:45.789954-08:00","created_by":"deacon","deleted_at":"2025-12-27T21:29:45.789954-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-rp0k","title":"Extend auto-continue to polecats (not just crew)","description":"gt prime currently only outputs AUTO-CONTINUE MODE for crew workers.\nPolecats with attached work should also auto-continue.\n\n## Current behavior\noutputCrewAttachmentStatus() in prime.go:\n- Only runs for RoleCrew\n- Outputs '→ AUTO-CONTINUE MODE' when attached work detected\n\n## Desired behavior\n- Rename to outputAttachmentStatus() or similar\n- Run for RoleCrew AND RolePolecat\n- Same directive: if attachment exists, work immediately\n\n## The Propulsion Principle\n'If you find something on your hook, YOU RUN IT.'\n\nThis applies to ALL workers, not just crew.\n\n## Implementation\n1. Extend role check in outputCrewAttachmentStatus()\n2. Adjust assignee detection for polecats vs crew\n3. Test with both worker types","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T16:43:22.149252-08:00","updated_at":"2025-12-27T21:29:53.166587-08:00","deleted_at":"2025-12-27T21:29:53.166587-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rpsu2","title":"Digest: mol-deacon-patrol","description":"Patrol 6: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:31:06.415769-08:00","updated_at":"2025-12-27T21:26:03.903011-08:00","deleted_at":"2025-12-27T21:26:03.903011-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rsxwb","title":"implement","description":"Implement the solution for gt-ds3h3. Follow codebase conventions.\nFile discovered work as new issues with bd create.\n\nMake regular commits with clear messages.\nKeep changes focused on the assigned issue.\n\nDepends: load-context","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T01:59:05.782317-08:00","updated_at":"2025-12-25T14:12:42.080641-08:00","dependencies":[{"issue_id":"gt-rsxwb","depends_on_id":"gt-kp3s3","type":"parent-child","created_at":"2025-12-25T01:59:05.800066-08:00","created_by":"stevey"},{"issue_id":"gt-rsxwb","depends_on_id":"gt-up9uw","type":"blocks","created_at":"2025-12-25T01:59:05.816315-08:00","created_by":"stevey"}],"deleted_at":"2025-12-25T14:12:42.080641-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rt6g","title":"Bug: Deacon session linked to Mayor pane causes heartbeat crosstalk","description":"**Root cause found**: The gt-deacon tmux session was linked to gt-mayor window 2 - they shared the same pane (@283). When the daemon sent heartbeats to the Deacon, they appeared as typed input in the Mayor's window, interrupting user prompts.\n\n**Fix applied**: Killed the broken gt-deacon session. The daemon auto-recreates it with a proper independent pane.\n\n**Prevention**: Investigate how sessions can get linked and add safeguards to session creation code.\n\nOriginal symptom: Mayor receiving 'HEARTBEAT: Tip: Witnesses monitor polecats...' messages that ate user input.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T17:00:28.667896-08:00","updated_at":"2025-12-27T21:29:56.345049-08:00","deleted_at":"2025-12-27T21:29:56.345049-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ruw","title":"Fix TestHasPolecat test failure in internal/session","description":"TestHasPolecat in internal/session/manager_test.go fails because it expects\nspecific polecats (Toast, Cheedo) to exist in the test environment.\n\nError:\n```\nmanager_test.go:46: expected hasPolecat(Toast) = true\nmanager_test.go:49: expected hasPolecat(Cheedo) = true\n```\n\nFix: Either create test fixtures or mock the filesystem check.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-17T15:02:30.030032-08:00","updated_at":"2025-12-27T21:29:57.305373-08:00","deleted_at":"2025-12-27T21:29:57.305373-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-rvx9n","title":"Digest: mol-deacon-patrol","description":"Patrol 10: healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-27T20:22:05.638179-08:00","updated_at":"2025-12-27T21:26:00.141263-08:00","deleted_at":"2025-12-27T21:26:00.141263-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rw2z","title":"gt mail send: support reading message body from stdin","description":"Currently gt mail send -m requires the message as a command-line argument, which causes shell escaping issues with backticks, quotes, and special characters.\n\nAdd support for reading message body from stdin:\n- gt mail send addr -s 'Subject' --stdin # Read body from stdin\n- echo 'body' | gt mail send addr -s 'Subject' -m - # Convention: -m - means stdin\n\nThis would allow:\ncat \u003c\u003c'EOF' | gt mail send addr -s 'Subject' --stdin\nMessage with \\`backticks\\` and 'quotes' safely\nEOF\n\nWithout this, agents struggle to send handoff messages containing code snippets.\n\n## Moved from beads\nOriginally bd-3bsz. gt mail is in gastown, not beads.","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-23T12:19:23.505896-08:00","updated_at":"2025-12-27T21:29:55.993621-08:00","deleted_at":"2025-12-27T21:29:55.993621-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-rwdtl","title":"Digest: mol-deacon-patrol","description":"Patrol 10: routine, halfway","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T10:17:53.312326-08:00","updated_at":"2025-12-27T21:26:03.552463-08:00","deleted_at":"2025-12-27T21:26:03.552463-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rx2y4","title":"Digest: mol-deacon-patrol","description":"Patrol 9: All clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:02:29.590204-08:00","updated_at":"2025-12-27T21:26:04.009896-08:00","deleted_at":"2025-12-27T21:26:04.009896-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rx5sc","title":"Digest: mol-deacon-patrol","description":"Patrol 18: all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:01:46.227281-08:00","updated_at":"2025-12-27T21:26:00.465467-08:00","deleted_at":"2025-12-27T21:26:00.465467-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rxewi","title":"Digest: mol-deacon-patrol","description":"Patrol complete: all agents healthy, no callbacks, cleaned 1 stale lock","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:56:57.672878-08:00","updated_at":"2025-12-27T21:26:00.686015-08:00","deleted_at":"2025-12-27T21:26:00.686015-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ry56s","title":"Digest: mol-deacon-patrol","description":"Patrol 12: all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:37:23.791096-08:00","updated_at":"2025-12-27T21:26:00.348032-08:00","deleted_at":"2025-12-27T21:26:00.348032-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ryz8u","title":"Digest: mol-deacon-patrol","description":"Patrol 2: inbox empty, all agents healthy, fixed 1 orphan process","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:33:22.412072-08:00","updated_at":"2025-12-27T21:26:00.929431-08:00","deleted_at":"2025-12-27T21:26:00.929431-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-rzbjo","title":"Merge: nux-mjw3mn8o","description":"branch: polecat/nux-mjw3mn8o\ntarget: main\nsource_issue: nux-mjw3mn8o\nrig: gastown\nagent_bead: gt-gastown-polecat-nux","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T18:50:42.47422-08:00","updated_at":"2026-01-01T18:59:28.573183-08:00","closed_at":"2026-01-01T18:59:28.573183-08:00","close_reason":"Merged to main at a6ae2c61","created_by":"gastown/polecats/nux"}
{"id":"gt-rzimf","title":"test","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-28T00:07:02.72402-08:00","updated_at":"2025-12-28T00:13:37.296433-08:00","created_by":"gastown/crew/joe","deleted_at":"2025-12-28T00:13:38.296433-08:00"}
{"id":"gt-s07hu","title":"Session ended: gt-gastown-ace","status":"closed","priority":2,"issue_type":"event","created_at":"2026-01-03T11:54:06.118057-08:00","updated_at":"2026-01-04T16:41:37.873849-08:00","closed_at":"2026-01-04T16:41:37.873849-08:00","close_reason":"Archived","created_by":"gastown/polecats/ace"}
{"id":"gt-s31zw","title":"Digest: mol-deacon-patrol","description":"Patrol 2: Quick scan, all healthy, no work","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:43:30.599994-08:00","updated_at":"2025-12-27T21:26:01.560854-08:00","deleted_at":"2025-12-27T21:26:01.560854-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-s3m0","title":"gt polecat: add 'done' or 'finish' command to transition from working to idle","description":"When a polecat finishes work but session wasn't properly cleaned up, there's no way to reset it from 'working' state back to 'idle'.\n\nTried:\n```\ngt polecat sleep gastown/Angharad\nError: sleeping polecat: polecat is not active (state: working)\n```\n\nThe sleep command only works on 'active' polecats, not 'working' ones.\n\nHad to manually edit state.json to reset:\n```\njq '.state = \"idle\" | .issue = \"\"' state.json\n```\n\nNeed a command like:\n```\ngt polecat done gastown/Angharad # working -\u003e idle\ngt polecat finish gastown/Angharad # working -\u003e idle \ngt polecat reset gastown/Angharad # any state -\u003e idle (force)\n```","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-19T01:41:39.851037-08:00","updated_at":"2025-12-27T21:29:57.018342-08:00","deleted_at":"2025-12-27T21:29:57.018342-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-s6xh6","title":"Digest: mol-deacon-patrol","description":"Patrol 7: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:31:50.767512-08:00","updated_at":"2025-12-27T21:26:00.804761-08:00","deleted_at":"2025-12-27T21:26:00.804761-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-s7mj1","title":"Digest: mol-deacon-patrol","description":"Patrol 13: all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T21:58:45.986834-08:00","updated_at":"2025-12-27T21:26:00.507128-08:00","deleted_at":"2025-12-27T21:26:00.507128-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-s7mok","title":"Digest: mol-deacon-patrol","description":"P4: stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:55:29.033222-08:00","updated_at":"2025-12-27T21:26:02.425646-08:00","deleted_at":"2025-12-27T21:26:02.425646-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-s89rg","title":"Phase 1: Core messaging primitives (@group, list:, queue:)","description":"## Scope\n\nCore messaging infrastructure that enables multi-recipient and work-queue patterns.\n\n### Deliverables\n\n1. **Config directory structure** - ~/gt/config/ with lists.json, queues.json\n2. **@group dynamic resolution** - Filesystem scan for agent directories\n - @rig/gastown → scan gastown/ for agents\n - @town → scan all rigs\n - @witnesses, @crew/gastown, etc.\n3. **list:name static lookup** - Fan-out to N copies (each recipient gets work item)\n4. **queue:name with claim** - Shared storage, atomic claim via bd update --claim\n5. **Fan-out at send time** - gt mail send handles expansion\n\n### Key semantics\n- @group and list: both fan out (N copies, N obligations)\n- queue: shared copy, first-to-claim wins\n- All resolve at send time, not receive time","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-25T14:56:33.896754-08:00","updated_at":"2025-12-27T21:29:55.181182-08:00","deleted_at":"2025-12-27T21:29:55.181182-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-s8iu","title":"Digest: mol-deacon-patrol","description":"Test patrol cycle - verifying wisp flow","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T02:10:19.639919-08:00","updated_at":"2025-12-27T21:26:05.339313-08:00","deleted_at":"2025-12-27T21:26:05.339313-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-s98ic","title":"Merge: rictus-mjw3nj1a","description":"branch: polecat/rictus-mjw3nj1a\ntarget: main\nsource_issue: rictus-mjw3nj1a\nrig: gastown\nagent_bead: gt-gastown-polecat-rictus","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T19:25:44.064035-08:00","updated_at":"2026-01-01T19:27:51.294526-08:00","closed_at":"2026-01-01T19:27:51.294526-08:00","close_reason":"Merged to main at dbdf47c3","created_by":"gastown/polecats/rictus"}
{"id":"gt-sadkq","title":"Execute registered plugins","description":"Execute registered plugins.\n\nScan ~/gt/plugins/ for plugin directories. Each plugin has a plugin.md with\nYAML frontmatter defining its gate (when to run) and instructions (what to do).\n\nSee docs/deacon-plugins.md for full documentation.\n\nGate types:\n- cooldown: Time since last run (e.g., 24h)\n- cron: Schedule-based (e.g., \"0 9 * * *\")\n- condition: Metric threshold (e.g., wisp count \u003e 50)\n- event: Trigger-based (e.g., startup, heartbeat)\n\nFor each plugin:\n1. Read plugin.md frontmatter to check gate\n2. Compare against state.json (last run, etc.)\n3. If gate is open, execute the plugin\n\nPlugins marked parallel: true can run concurrently using Task tool subagents.\nSequential plugins run one at a time in directory order.\n\nSkip this step if ~/gt/plugins/ does not exist or is empty.\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T02:11:33.776878-08:00","updated_at":"2025-12-27T21:29:55.266102-08:00","dependencies":[{"issue_id":"gt-sadkq","depends_on_id":"gt-v7wq4","type":"blocks","created_at":"2025-12-25T02:11:33.919459-08:00","created_by":"stevey"}],"deleted_at":"2025-12-27T21:29:55.266102-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-scak","title":"inbox-check","description":"Process witness mail: lifecycle requests, help requests.\n\n```bash\ngt mail inbox\n```\n\nHandle by message type:\n- LIFECYCLE/Shutdown: Queue for pre-kill verification\n- Blocked/Help: Assess if resolvable or escalate\n- HANDOFF: Load predecessor state\n- Work complete: Verify issue closed, proceed to pre-kill\n\nRecord any pending actions for later steps.\nMark messages as processed when complete.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T01:41:54.504858-08:00","updated_at":"2025-12-25T15:52:57.761579-08:00","deleted_at":"2025-12-25T15:52:57.761579-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-scdyn","title":"Merge: organic-mjwjck2f","description":"branch: polecat/organic-mjwjck2f\ntarget: main\nsource_issue: organic-mjwjck2f\nrig: gastown\nagent_bead: gt-gastown-polecat-organic","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T23:40:12.50114-08:00","updated_at":"2026-01-01T23:41:19.133358-08:00","closed_at":"2026-01-01T23:41:19.133358-08:00","close_reason":"Duplicate - work already merged as gt-a28hb","created_by":"gastown/polecats/organic"}
{"id":"gt-sd3up","title":"Digest: mol-deacon-patrol","description":"Patrol 13: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T10:18:42.136385-08:00","updated_at":"2025-12-27T21:26:03.527896-08:00","deleted_at":"2025-12-27T21:26:03.527896-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-sd6","title":"Enhanced polecat decommission prompting","description":"Add decommission checklist to polecat AGENTS.md.template. Make crystal clear: verify ALL before signaling done.\n\n## Checklist for AGENTS.md.template\n\n```markdown\n## Decommission Checklist\n\n**CRITICAL**: Before signaling done, you MUST complete this checklist.\nThe Witness will verify each item and bounce you back if dirty.\n\n### Pre-Done Verification\n\n```bash\n# 1. Git status - must be clean\ngit status\n# Expected: \"nothing to commit, working tree clean\"\n\n# 2. Stash list - must be empty\ngit stash list\n# Expected: (empty output)\n\n# 3. Beads sync - must be up to date\nbd sync --status\n# Expected: \"Up to date\" or \"Nothing to sync\"\n\n# 4. Branch merged - your work must be on main\ngit log main --oneline -1\ngit log HEAD --oneline -1\n# Expected: Same commit\n```\n\n### If Any Check Fails\n\n- **Uncommitted changes**: Commit them or discard if unnecessary\n- **Stashes**: Pop and commit, or drop if obsolete\n- **Beads out of sync**: Run `bd sync`\n- **Branch not merged**: Complete the merge workflow\n\n### Signaling Done\n\nOnly after ALL checks pass:\n\n```bash\nbd close \u003cissue-id\u003e\nbd sync\ntown mail send \u003crig\u003e/witness -s \"Work Complete\" -m \"Issue \u003cid\u003e done.\"\n```\n```\n\n## Implementation\n\nAdd to AGENTS.md.template in the polecat prompting section.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T19:48:57.911311-08:00","updated_at":"2025-12-27T21:29:54.56387-08:00","dependencies":[{"issue_id":"gt-sd6","depends_on_id":"gt-82y","type":"blocks","created_at":"2025-12-15T19:49:06.008061-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.56387-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-selw","title":"gt spawn: add --polecat flag for explicit worker selection","description":"Currently gt spawn requires positional arg format:\n```\ngt spawn gastown/Angharad --issue gt-xyz\n```\n\nBut I tried the more intuitive flag form:\n```\ngt spawn --issue gt-xyz --polecat Angharad\n```\n\nThis failed with 'unknown flag: --polecat'.\n\nThe flag form is more discoverable and consistent with other commands. Add --polecat flag as alternative to positional arg.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-19T01:41:38.540563-08:00","updated_at":"2025-12-27T21:29:57.026674-08:00","deleted_at":"2025-12-27T21:29:57.026674-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-sezjt","title":"Merge: slit-mjtj9dc8","description":"branch: polecat/slit-mjtj9dc8\ntarget: main\nsource_issue: slit-mjtj9dc8\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T22:37:48.699134-08:00","updated_at":"2025-12-30T23:12:37.135918-08:00","closed_at":"2025-12-30T23:12:37.135918-08:00","close_reason":"Branch already merged","created_by":"gastown/polecats/slit"}
{"id":"gt-sgzsb","title":"BUG: Boot spawns in wrong session (gt-deacon instead of gt-deacon-boot)","description":"## Problem\n\nBoot (the Deacon's watchdog) is supposed to spawn in its own session `gt-deacon-boot` per the code:\n```go\nconst SessionName = \"gt-deacon-boot\"\n```\n\nBut when checking tmux, Boot appears to be running in the `gt-deacon` session with cwd `dogs/boot`.\n\n## Evidence\n\n```\n$ tmux capture-pane -t gt-deacon -p\n⏺ Bash(cd /Users/stevey/gt/deacon \u0026\u0026 gt feed --since 10m --plain 2\u003e/dev/null)\n Shell cwd was reset to /Users/stevey/gt/deacon/dogs/boot\n```\n\n## Impact\n\n- `gt status` shows Deacon as \"stopped\" when Boot is running in its session\n- Boot and Deacon lifecycle are confused\n- Daemon's two-layer approach (Boot triage + Deacon heartbeat) doesn't work correctly\n\n## Expected Behavior\n\n1. Boot spawns in `gt-deacon-boot` session\n2. Boot runs triage, decides if Deacon needs starting\n3. If yes, Boot starts Deacon in `gt-deacon` session\n4. Both sessions can coexist: Boot (ephemeral triage) and Deacon (persistent patrol)\n\n## Investigation Needed\n\n- Check `boot.spawnTmux()` - is it creating the right session name?\n- Check if something is reusing the Deacon session for Boot\n- Verify session naming throughout the daemon/boot code","status":"closed","priority":1,"issue_type":"bug","created_at":"2026-01-02T18:42:01.182249-08:00","updated_at":"2026-01-02T18:56:01.865782-08:00","closed_at":"2026-01-02T18:56:01.865782-08:00","close_reason":"Fixed tmux prefix matching bug: renamed Boot session from gt-deacon-boot to gt-boot to prevent HasSession collision","created_by":"mayor"}
{"id":"gt-sh53w","title":"Merge: slit-mjtj9dc8","description":"branch: polecat/slit-mjtj9dc8\ntarget: main\nsource_issue: slit-mjtj9dc8\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T22:27:46.827128-08:00","updated_at":"2025-12-30T23:12:42.826094-08:00","closed_at":"2025-12-30T23:12:42.826094-08:00","close_reason":"Branch already merged","created_by":"gastown/polecats/slit"}
{"id":"gt-shnp","title":"Create Refinery role template","description":"Add Refinery template to internal/templates/roles/:\n- refinery.md.tmpl with full role context\n- Variables: rig name, working directory, handoff bead ID\n- Update SeedRoleTemplates to include it\n- gt prime uses this template for Refinery context","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T18:09:28.951284-08:00","updated_at":"2025-12-27T21:29:53.898723-08:00","dependencies":[{"issue_id":"gt-shnp","depends_on_id":"gt-ktal","type":"blocks","created_at":"2025-12-19T18:09:39.706849-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.898723-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-sij0a","title":"GT_ROLE env var not set correctly for crew workers","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-26T16:49:12.570524-08:00","updated_at":"2025-12-27T21:29:45.815017-08:00","deleted_at":"2025-12-27T21:29:45.815017-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-sj02z","title":"Digest: mol-deacon-patrol","description":"Patrol 16: Quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:19:09.886984-08:00","updated_at":"2025-12-27T21:26:02.675108-08:00","deleted_at":"2025-12-27T21:26:02.675108-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-sl5rw","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 20: final before handoff","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:31:45.905226-08:00","updated_at":"2025-12-27T21:26:01.804017-08:00","deleted_at":"2025-12-27T21:26:01.804017-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-slo","title":"Fix TestHasPolecat test failure","status":"tombstone","priority":0,"issue_type":"bug","created_at":"2025-12-17T17:30:19.474356-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"bug"}
{"id":"gt-sm9ti","title":"Digest: mol-deacon-patrol","description":"Patrol 5: Quick cycle, all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:05:29.027668-08:00","updated_at":"2025-12-27T21:26:03.017334-08:00","deleted_at":"2025-12-27T21:26:03.017334-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-sn8if","title":"Digest: mol-deacon-patrol","description":"Patrol 8: routine, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T13:43:39.832046-08:00","updated_at":"2025-12-27T21:26:03.222694-08:00","deleted_at":"2025-12-27T21:26:03.222694-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-sob9x","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 16: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-28T11:22:48.666655-08:00","updated_at":"2026-01-01T17:35:15.929729-08:00","deleted_at":"2026-01-01T17:35:15.929729-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-sp18","title":"Digest: mol-deacon-patrol","description":"Patrol #18","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:26:11.030566-08:00","updated_at":"2025-12-27T21:26:04.683985-08:00","deleted_at":"2025-12-27T21:26:04.683985-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-spdof","title":"Digest: mol-deacon-patrol","description":"Patrol 1: All healthy, no messages, no orphans","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T13:25:58.151972-08:00","updated_at":"2025-12-27T21:26:03.303882-08:00","deleted_at":"2025-12-27T21:26:03.303882-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-spt8v","title":"Fix docs: Formulas are JSON not YAML","description":"The molecular chemistry docs (molecule-algebra.md, molecular-chemistry.md) incorrectly refer to formulas as YAML. Formulas are JSON (.formula.json). Fix all references: .formula.yaml -\u003e .formula.json, 'YAML files' -\u003e 'JSON files', 'formula YAML' -\u003e 'formula JSON', code blocks from yaml to json syntax.","status":"tombstone","priority":2,"issue_type":"chore","created_at":"2025-12-25T14:33:19.775501-08:00","updated_at":"2025-12-27T21:29:55.189761-08:00","deleted_at":"2025-12-27T21:29:55.189761-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"chore"}
{"id":"gt-sq0t3","title":"BUG: gt handoff --dry-run still sends mail","description":"In handoff.go, the dry-run check happens AFTER sendHandoffMail() is called. This means --dry-run still sends mail.\n\n**Location**: handoff.go L126-143\n\n**Observed**: Testing with `gt handoff --dry-run -s 'Test' -m 'Message'` actually sent mail to mayor.\n\n**Fix**: Move mail sending inside the non-dry-run block, or check dry-run first.\n\n```go\n// Current (buggy):\nif handoffSubject \\!= \"\" || handoffMessage \\!= \"\" {\n sendHandoffMail(...) // Executes even in dry-run\\!\n}\nif handoffDryRun { // Too late\n return nil\n}\n\n// Should be:\nif handoffDryRun {\n // show what would happen\n return nil\n}\n// Then send mail\n```","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-25T22:02:51.953925-08:00","updated_at":"2025-12-27T21:29:45.934971-08:00","deleted_at":"2025-12-27T21:29:45.934971-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-sqa7w","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 4: routine check, all nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:37:41.769679-08:00","updated_at":"2025-12-27T21:26:01.444363-08:00","deleted_at":"2025-12-27T21:26:01.444363-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-sqi","title":"gt session restart/status: Complete session management","description":"Add missing session subcommands:\n\n- gt session restart \u003crig\u003e \u003cpolecat\u003e - Restart a session (stop + start)\n- gt session status \u003crig\u003e \u003cpolecat\u003e - Show session status details\n\nstatus should show:\n- Running state\n- Uptime\n- Current activity\n- Last output timestamp","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-17T21:47:34.700494-08:00","updated_at":"2025-12-27T21:29:57.239369-08:00","dependencies":[{"issue_id":"gt-sqi","depends_on_id":"gt-hw6","type":"blocks","created_at":"2025-12-17T22:23:43.034222-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.239369-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-sr8","title":"Test merge request","description":"branch: polecat/Test/gt-test\ntarget: main\nsource_issue: gt-test\nworker: TestWorker\nrig: gastown","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-18T20:08:22.678439-08:00","updated_at":"2025-12-27T21:27:22.980748-08:00","deleted_at":"2025-12-27T21:27:22.980748-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-srvfp","title":"Digest: mol-deacon-patrol","description":"Patrol 10","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T20:03:22.57505-08:00","updated_at":"2025-12-27T21:26:00.652776-08:00","deleted_at":"2025-12-27T21:26:00.652776-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-sult","title":"gt spawn beads sync warning is misleading for redirect-based polecats","description":"## Problem\n\nWhen spawning a polecat, gt spawn shows a warning:\n```\nWarning: beads sync: exit status 1\n```\n\nThis is misleading because polecats using the redirect architecture (`.beads/redirect`) share the canonical database at `mayor/rig/.beads/beads.db`. The 'stale beads' indicated by the warning is just git branch divergence (main vs beads-sync), not actual data staleness.\n\n## Expected Behavior\n\ngt spawn should either:\n1. Skip the beads sync check for polecats using redirects (they share the canonical DB)\n2. Or provide a clearer message like 'beads redirect active, using shared database'\n\n## Reproduction\n\n```bash\ngt spawn --issue gt-xxx --rig gastown --create\n# Shows 'Warning: beads sync: exit status 1' even though beads are current\n```\n\n## Root Cause\n\nspawn.go calls beads sync and treats any non-zero exit as a warning. But with redirects, the polecat doesn't need its own beads - it uses the canonical source via the redirect chain:\n```\npolecat/.beads/redirect -\u003e ../../.beads -\u003e gastown/.beads/redirect -\u003e mayor/rig/.beads\n```\n\n## Fix Options\n\n1. Check for .beads/redirect before calling sync\n2. Have bd sync return 0 when redirect is present\n3. Suppress the warning in spawn when redirect exists","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-21T14:02:01.28061-08:00","updated_at":"2025-12-27T21:29:56.636475-08:00","deleted_at":"2025-12-27T21:29:56.636475-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-suvmb","title":"Digest: mol-deacon-patrol","description":"Patrol 18: All clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T16:43:56.468927-08:00","updated_at":"2025-12-27T21:26:03.066921-08:00","deleted_at":"2025-12-27T21:26:03.066921-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-svi","title":"Implement gt mq CLI commands","description":"Add gt mq subcommands as sugar over bd:\n\n- gt mq submit: Create MR for current branch\n- gt mq list: Show open merge requests\n- gt mq next: Show next MR ready to process\n- gt mq process: Engineer processes the queue\n- gt mq reorder \u003cid\u003e --after \u003cx\u003e: Change ordering via deps\n- gt mq status \u003cid\u003e: Show MR details\n\nAll commands should work with the Beads data plane.","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-16T23:02:16.649648-08:00","updated_at":"2025-12-27T21:29:45.665684-08:00","dependencies":[{"issue_id":"gt-svi","depends_on_id":"gt-h5n","type":"blocks","created_at":"2025-12-16T23:02:55.456462-08:00","created_by":"daemon"},{"issue_id":"gt-svi","depends_on_id":"gt-kp2","type":"blocks","created_at":"2025-12-16T23:03:12.689547-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.665684-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-svi.1","title":"gt mq submit: create MR from current branch","description":"Implement 'gt mq submit' command that creates a merge-request bead.\n\nAuto-detection logic:\n1. Branch: current git branch\n2. Issue: parse from branch name (polecat/Nux/gt-xyz → gt-xyz)\n3. Target: main (or integration branch if --epic specified)\n4. Worker: parse from branch name\n5. Rig: current rig\n\nOptions:\n- --branch BRANCH: explicit source branch\n- --issue ISSUE: explicit source issue\n- --epic EPIC: target integration/EPIC instead of main\n- --priority P: override priority (default: inherit from source issue)\n\nCreates merge-request bead and prints MR ID.\n\nReference: docs/merge-queue-design.md#creating-merge-requests","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-17T13:50:21.652412-08:00","updated_at":"2025-12-27T21:29:45.632586-08:00","dependencies":[{"issue_id":"gt-svi.1","depends_on_id":"gt-svi","type":"parent-child","created_at":"2025-12-17T13:50:21.65435-08:00","created_by":"daemon"},{"issue_id":"gt-svi.1","depends_on_id":"gt-h5n.1","type":"blocks","created_at":"2025-12-17T13:53:02.317401-08:00","created_by":"daemon"},{"issue_id":"gt-svi.1","depends_on_id":"gt-h5n.2","type":"blocks","created_at":"2025-12-17T13:53:02.438987-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.632586-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-svi.2","title":"gt mq list: show queue with status/priority/age","description":"Implement 'gt mq list' command to display the merge queue.\n\nOutput format:\nID STATUS PRIORITY BRANCH WORKER AGE\ngt-mr-001 ready P0 polecat/Nux/gt-xyz Nux 5m\ngt-mr-002 in_progress P1 polecat/Toast/gt-abc Toast 12m\ngt-mr-003 blocked P1 polecat/Capable/gt-def Capable 8m\n (waiting on gt-mr-001)\n\nOptions:\n- --ready: show only ready-to-merge (no blockers, not in progress)\n- --status STATUS: filter by status\n- --worker WORKER: filter by worker\n- --epic EPIC: show MRs targeting integration/EPIC\n\nUnder the hood: bd list --type=merge-request with filters.\n\nReference: docs/merge-queue-design.md#gt-mq-list","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-17T13:50:23.295587-08:00","updated_at":"2025-12-27T21:29:45.624349-08:00","dependencies":[{"issue_id":"gt-svi.2","depends_on_id":"gt-svi","type":"parent-child","created_at":"2025-12-17T13:50:23.297307-08:00","created_by":"daemon"},{"issue_id":"gt-svi.2","depends_on_id":"gt-h5n.1","type":"blocks","created_at":"2025-12-17T13:53:02.560128-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.624349-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-svi.3","title":"gt mq status: detailed MR view","description":"Implement 'gt mq status \u003cid\u003e' command for detailed MR view.\n\nDisplay:\n- All MR fields (branch, target, source_issue, worker, rig)\n- Current status with timestamps\n- Dependencies (what it's waiting on)\n- Blockers (what's waiting on it)\n- Processing history (attempts, failures)\n\nUnder the hood: bd show \u003cid\u003e with MR-specific formatting.\n\nReference: docs/merge-queue-design.md#command-details","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-17T13:50:25.119914-08:00","updated_at":"2025-12-27T21:29:45.61575-08:00","dependencies":[{"issue_id":"gt-svi.3","depends_on_id":"gt-svi","type":"parent-child","created_at":"2025-12-17T13:50:25.121848-08:00","created_by":"daemon"},{"issue_id":"gt-svi.3","depends_on_id":"gt-h5n.1","type":"blocks","created_at":"2025-12-17T13:53:02.676972-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.61575-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-svi.4","title":"gt mq retry: retry a failed MR","description":"Implement 'gt mq retry \u003cid\u003e' to retry a failed merge request.\n\nActions:\n1. Verify MR exists and is in failed state (open with failure labels)\n2. Remove failure labels (needs-rebase, needs-fix)\n3. Reset to ready state\n4. Optionally re-run immediately (--now flag)\n\nOptions:\n- --now: immediately process (instead of waiting for Engineer loop)\n\nReference: docs/merge-queue-design.md#cli-commands","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T13:50:36.336017-08:00","updated_at":"2025-12-27T21:29:54.386372-08:00","dependencies":[{"issue_id":"gt-svi.4","depends_on_id":"gt-svi","type":"parent-child","created_at":"2025-12-17T13:50:36.3382-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.386372-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-svi.5","title":"gt mq reject: manual MR rejection","description":"Implement 'gt mq reject \u003cid\u003e --reason \"...\"' for manual rejection.\n\nActions:\n1. Verify MR exists and is open\n2. Close MR with close_reason=rejected\n3. Notify worker via mail (optional)\n4. Do NOT close source issue (work not done)\n\nOptions:\n- --reason REASON: required explanation\n- --notify: send mail to worker\n\nReference: docs/merge-queue-design.md#cli-commands","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T13:50:38.691775-08:00","updated_at":"2025-12-27T21:29:54.378041-08:00","dependencies":[{"issue_id":"gt-svi.5","depends_on_id":"gt-svi","type":"parent-child","created_at":"2025-12-17T13:50:38.693749-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.378041-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-svjyr","title":"Digest: mol-deacon-patrol","description":"Patrol 3: inbox clear, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:21:48.063307-08:00","updated_at":"2025-12-27T21:26:01.712636-08:00","deleted_at":"2025-12-27T21:26:01.712636-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-sw95t","title":"Digest: mol-deacon-patrol","description":"P20: stable - handoff triggered","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:15:29.507422-08:00","updated_at":"2025-12-27T21:26:02.212221-08:00","deleted_at":"2025-12-27T21:26:02.212221-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-swqfx","title":"Digest: mol-deacon-patrol","description":"Patrol 17","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-26T14:57:10.122945-08:00","updated_at":"2025-12-26T14:57:10.122945-08:00","closed_at":"2025-12-26T14:57:10.122884-08:00"}
{"id":"gt-swrw","title":"Merge: gt-3x0z.4","description":"branch: polecat/rictus\ntarget: main\nsource_issue: gt-3x0z.4\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T16:18:10.46877-08:00","updated_at":"2025-12-27T21:27:22.593005-08:00","deleted_at":"2025-12-27T21:27:22.593005-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-sye","title":"Mayor startup protocol prompting","description":"Add startup protocol to Mayor CLAUDE.md template.\n\n## On Session Start\n\n1. Check for handoff:\n town inbox | grep \"Session Handoff\"\n\n2. If handoff found:\n - Read it: town read \u003cmsg-id\u003e\n - Process pending escalations (highest priority)\n - Check status of noted swarms\n - Verify rig health matches notes\n - Continue with documented next steps\n\n3. If no handoff:\n town status # Overall health\n town rigs # Each rig\n bd ready # Work items\n town inbox # Any messages\n Build your own picture of current state.\n\n4. After processing handoff:\n - Archive or delete the handoff message\n - You now own the current state\n\n## Handoff Best Practices\n\n- Be specific: 'Toast has merge conflict in auth/middleware.go' not 'Toast is stuck'\n- Include context: Why decisions are pending, what you were thinking\n- Prioritize next steps: What is most urgent\n- Note time-sensitive items: Anything that might have changed since handoff","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T20:15:27.915484-08:00","updated_at":"2025-12-27T21:29:54.546761-08:00","dependencies":[{"issue_id":"gt-sye","depends_on_id":"gt-u82","type":"blocks","created_at":"2025-12-15T20:15:39.459108-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.546761-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-szsq","title":"gt spawn --create should auto-add polecat if missing","description":"## Problem\n\n`gt spawn gastown --issue gt-xxx --create` fails if no polecats exist.\n\n## Current Behavior\n\n```\nError: auto-select polecat: no available polecats in rig 'gastown'\n```\n\n## Expected Behavior\n\n`--create` should:\n1. Create a new polecat if none exist\n2. Or create the specified polecat if `gt spawn gastown/NewName --create`\n\n## Workaround\n\nMust manually run `gt polecat add gastown Name` first.\n\n## Principle\n\nAgent UX should be forgiving - if I say spawn with --create, make it work.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-18T21:52:07.75062-08:00","updated_at":"2025-12-27T21:29:54.092218-08:00","deleted_at":"2025-12-27T21:29:54.092218-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-t072g","title":"Merge: nux-1767084010093","description":"branch: polecat/nux-1767084010093\ntarget: main\nsource_issue: nux-1767084010093\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T00:50:06.175461-08:00","updated_at":"2025-12-30T01:01:04.201113-08:00","closed_at":"2025-12-30T01:01:04.201113-08:00","close_reason":"Already merged to main","created_by":"gastown/polecats/nux"}
{"id":"gt-t0x8w","title":"Digest: mol-deacon-patrol","description":"Patrol 17: all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:01:07.169901-08:00","updated_at":"2025-12-27T21:26:00.473924-08:00","deleted_at":"2025-12-27T21:26:00.473924-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-t2nh","title":"gt nudge: support no-arg invocation with default message","description":"Currently gt nudge requires a message argument:\n\n```\ngt nudge \u003crig/polecat\u003e \u003cmessage\u003e\n```\n\nWould be nice to support:\n```\ngt nudge \u003crig/polecat\u003e\n```\n\nWith a sensible default message like 'nudge' or 'continue' or checking for pending work automatically.","status":"tombstone","priority":3,"issue_type":"feature","created_at":"2025-12-23T01:29:23.623455-08:00","updated_at":"2025-12-27T21:29:57.514236-08:00","deleted_at":"2025-12-27T21:29:57.514236-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-t2o92","title":"Digest: mol-deacon-patrol","description":"Patrol 12: Nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:51:07.540535-08:00","updated_at":"2025-12-27T21:26:04.13259-08:00","deleted_at":"2025-12-27T21:26:04.13259-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-t2vj","title":"Merge: gt-8v8","description":"branch: polecat/nux\ntarget: main\nsource_issue: gt-8v8\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-20T16:31:37.562534-08:00","updated_at":"2025-12-27T21:27:22.659126-08:00","deleted_at":"2025-12-27T21:27:22.659126-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-t3kya","title":"Digest: mol-deacon-patrol","description":"Patrol 2: 1 mail archived, all agents healthy, 1 orphan fixed","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T17:14:35.793224-08:00","updated_at":"2025-12-27T21:26:01.209556-08:00","deleted_at":"2025-12-27T21:26:01.209556-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-t5pc","title":"test with labels","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T17:42:30.258069-08:00","updated_at":"2025-12-27T21:29:56.752768-08:00","deleted_at":"2025-12-27T21:29:56.752768-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"message"}
{"id":"gt-t5uk","title":"Deacon trigger-pending-spawns: Auto-nudge polecats when ready","description":"The Deacon's mol-deacon-patrol defines a trigger-pending-spawns step, but it's not implemented.\n\n## Problem\nAfter gt spawn creates a polecat session, Claude initializes for 10-20s. Nobody sends the initial 'Begin.' message to start work, so polecats sit at prompts forever.\n\n## Current Workaround\nManual: gt nudge gt-\u003crig\u003e-\u003cpolecat\u003e 'Begin.'\n\n## Required Implementation\n1. Track pending spawns (polecats spawned but not triggered)\n2. In Deacon patrol, poll WaitForClaudeReady for each pending spawn\n3. When ready, send 'Begin.' via NudgeSession\n4. Mark as triggered\n\n## Evidence\nTracer bullet 2025-12-23: Had to manually nudge polecat/tracer to start work.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T21:58:26.070984-08:00","updated_at":"2025-12-27T21:29:52.820568-08:00","deleted_at":"2025-12-27T21:29:52.820568-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-t6vkc","title":"Digest: mol-deacon-patrol","description":"Patrol 2: inbox clear, all agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:30:31.061654-08:00","updated_at":"2025-12-27T21:26:00.43234-08:00","deleted_at":"2025-12-27T21:26:00.43234-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-t8du","title":"Digest: mol-deacon-patrol","description":"Patrol 3: All nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:47:04.511823-08:00","updated_at":"2025-12-27T21:26:04.210544-08:00","deleted_at":"2025-12-27T21:26:04.210544-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-t9u7","title":"Polecat template cleanup","description":"Collection of improvements to the polecat role template (polecat.md.tmpl). Focus: reduce prose redundancy, move behavior into molecules, fix terminology, improve clarity.","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-23T16:57:05.03738-08:00","updated_at":"2025-12-27T21:29:55.889275-08:00","deleted_at":"2025-12-27T21:29:55.889275-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-tca","title":"Polecats should auto-cleanup after MR submission","description":"Currently polecats must manually run 'gt handoff --shutdown' after completing work. This is error-prone and leaves stale polecats around.\n\n## Desired Flow\n\n1. Polecat completes work\n2. Polecat runs 'gt mq submit' (or similar)\n3. MR is added to integration queue\n4. **Polecat automatically cleans up** (no manual handoff needed)\n\n## Implementation Options\n\n### Option A: mq submit triggers cleanup\nIn 'gt mq submit':\n1. Submit MR to queue\n2. Automatically run cleanup (same as gt handoff --shutdown)\n3. Polecat session terminates\n\n### Option B: Refinery triggers cleanup\nWhen Refinery picks up MR:\n1. Refinery processes MR\n2. Sends message to Witness: 'CLEANUP: \u003cpolecat\u003e'\n3. Witness cleans up polecat\n\n### Option C: Molecule-driven\nDefine cleanup as final phase of polecat-work molecule:\n1. code → test → submit-mr → cleanup\n\n## Note\nThis reinforces the ephemeral model: polecats exist only for the duration of a single task.","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-20T15:22:54.485456-08:00","updated_at":"2025-12-27T21:29:53.719557-08:00","dependencies":[{"issue_id":"gt-tca","depends_on_id":"gt-9nf","type":"related","created_at":"2025-12-20T15:40:08.998908-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.719557-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-tdeim","title":"Review PR #53: fix: Add retry logic for Enter key send in NudgeSession/NudgePane","description":"Review PR #53. Check retry logic is sound. Test locally if needed. Approve with gh pr review --approve if good.","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-03T11:40:26.650916-08:00","updated_at":"2026-01-03T11:43:56.314952-08:00","closed_at":"2026-01-03T11:43:56.314952-08:00","close_reason":"PR #53 reviewed and approved","created_by":"mayor"}
{"id":"gt-teq0p","title":"Find abandoned work","description":"Find abandoned work.\n\nScan for orphaned state:\n- Issues marked in_progress with no active polecat\n- Polecats that stopped responding mid-work\n- Merge queue entries with no polecat owner\n- Wisp sessions that outlived their spawner\n\n```bash\nbd list --status=in_progress\ngt polecats --all --orphan\n```\n\nFor each orphan:\n- Check if polecat session still exists\n- If not, mark issue for reassignment or retry\n- File incident beads if data loss occurred\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T02:11:33.776531-08:00","updated_at":"2025-12-27T21:29:55.274773-08:00","dependencies":[{"issue_id":"gt-teq0p","depends_on_id":"gt-v7wq4","type":"blocks","created_at":"2025-12-25T02:11:33.890608-08:00","created_by":"stevey"}],"deleted_at":"2025-12-27T21:29:55.274773-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-tfg1","title":"Digest: mol-deacon-patrol","description":"Patrol #2: Routine - all 6 agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:15:34.682289-08:00","updated_at":"2025-12-27T21:26:04.819594-08:00","deleted_at":"2025-12-27T21:26:04.819594-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-tj1k","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:45","description":"Patrol 10: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:45:36.775542-08:00","updated_at":"2025-12-27T21:26:05.104028-08:00","deleted_at":"2025-12-27T21:26:05.104028-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-tjy9r","title":"Merge: capable-1767074974673","description":"branch: polecat/capable-1767074974673\ntarget: main\nsource_issue: capable-1767074974673\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-29T22:12:26.130677-08:00","updated_at":"2025-12-30T10:06:56.899508-08:00","closed_at":"2025-12-30T10:06:56.899508-08:00","close_reason":"Branch merged to main","created_by":"gastown/polecats/capable"}
{"id":"gt-tl54","title":"MR: gt-test (main)","description":"branch: main\ntarget: main\nsource_issue: gt-test","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-18T20:16:41.125975-08:00","updated_at":"2025-12-27T21:27:23.006168-08:00","deleted_at":"2025-12-27T21:27:23.006168-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-tmm","title":"Polecat beads not synced with rig beads","description":"When gt spawn assigns an issue to a polecat, the polecat cannot find the issue because:\n\n1. Rig beads at /gt/gastown/.beads contains gt-th7\n2. Polecat beads at /gt/gastown/polecats/dementus/.beads does NOT have gt-th7\n3. bd sync in polecat dir pulls from beads-sync branch which doesn't have the new issue\n4. Rig bd sync uses 'main' branch, causing a branch mismatch\n\nThe polecat gets a work assignment for an issue it literally cannot see.\n\n## Root Cause\n\n- Rig beads and polecat beads are separate git-tracked copies\n- No mechanism to propagate newly created issues from rig to polecats before assignment\n- Sync branch configuration mismatch between rig and polecats\n\n## Fix Options\n\n1. gt spawn should sync rig beads before assigning work\n2. gt spawn should sync polecat beads after assignment\n3. Use shared beads (symlink or same DB) instead of copies\n4. Push new issues immediately on create","notes":"Note: This sync mismatch is resolved by gt-9nf (fresh polecats). Rather than fixing sync between stale clones, we'll always create fresh worktrees. This issue documents the root cause for posterity.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-20T15:17:10.158624-08:00","updated_at":"2025-12-27T21:29:53.744537-08:00","deleted_at":"2025-12-27T21:29:53.744537-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-tmoz3","title":"Digest: mol-deacon-patrol","description":"Patrol 3: Quiet, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:14:05.427108-08:00","updated_at":"2025-12-27T21:26:02.765167-08:00","deleted_at":"2025-12-27T21:26:02.765167-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-tnca","title":"Design mol-ready-work patrol for crew workers","description":"## Summary\n\nDesign and implement a protomolecule (mol-ready-work) that enables crew workers to autonomously work through backlogs when the overseer is away. This replaces the normal \"await instructions\" patrol step with productive backlog processing.\n\n## Context\n\nCrew workers like dave/joe are persistent, user-managed agents. Currently their patrol loop has an \"await instructions\" step that idles. We want an alternative: sling mol-ready-work at them and they crank through backlogs until:\n- The overseer interrupts with new instructions\n- Context fills up (request handoff)\n- Backlogs are empty\n\n## Backlogs (Priority Order)\n\n1. **Open PRs** - Review/merge pending pull requests\n2. **Untriaged GH issues** - New issues needing triage\n3. **Open beads work** - bd ready items (unblocked issues)\n4. **Triaged GH issues** - Bugs/features to implement\n\n## Key Features\n\n### ROI-Based Selection\nInstead of strict priority, agent applies ROI heuristic:\n- Size estimate (fits in remaining context?)\n- Achievability (has all needed info?)\n- Impact (priority + type weight)\n- Pick highest-value achievable item\n\n### Context Management\n- Check token usage after each work item\n- Request handoff before running out\n- Leave clear handoff notes (mail to self)\n- Next session picks up the patrol\n\n### Patrol Loop Structure\n```\norient → scan-backlogs → select-work → execute-work → check-context → (loop or handoff)\n```\n\n## Design Questions\n\n1. How does gt sling work? Does it exist?\n2. How does this interact with the normal crew patrol?\n3. Should the molecule go in Gas Town catalog or per-rig?\n4. How does bonding work for discovered work during execution?\n5. Should this be a wisp (ephemeral) or mol (persistent)?\n\n## Acceptance Criteria\n\n- [ ] mol-ready-work protomolecule defined in catalog\n- [ ] gt sling command implemented (or existing mechanism documented)\n- [ ] Context-checking step works with agent token awareness\n- [ ] Handoff mechanism integrates with gt mail\n- [ ] Documentation for crew workers on using the patrol","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-22T23:45:40.63509-08:00","updated_at":"2025-12-27T21:29:56.309081-08:00","deleted_at":"2025-12-27T21:29:56.309081-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-tnca.1","title":"Draft mol-ready-work protomolecule definition","description":"Draft the actual protomolecule definition in markdown format.\n\n## Molecule Structure\n\n```markdown\n## Molecule: ready-work\nAutonomous backlog processing patrol for crew workers.\n\nPhase: vapor (wisp) - ephemeral patrol cycles\nSquash: after each work item or context threshold\n\n## Step: orient\nLoad context and check for interrupts:\n- Read mail for overseer instructions\n- Check for predecessor handoff\n- Load current context state\n\n## Step: scan-backlogs\nSurvey all backlogs in priority order:\n1. gh pr list --state open\n2. gh issue list --state open --label untriaged (or no label)\n3. bd ready\n4. gh issue list --state open --label triaged\n\nCapture counts and candidates.\n\nNeeds: orient\n\n## Step: select-work\nApply ROI heuristic to select best work item:\n- Estimate size (tokens needed)\n- Check remaining context capacity\n- Weight by impact (priority, type)\n- Select highest ROI achievable item\n- If empty: exit patrol\n\nNeeds: scan-backlogs\n\n## Step: execute-work\nWork the selected item:\n- For PRs: review, request changes, or merge\n- For untriaged: triage and label\n- For beads: implement and close\n- For triaged GH: implement fix\n\nCommit, push, close/update as appropriate.\n\nNeeds: select-work\n\n## Step: check-context\nAssess context state:\n- Estimate remaining capacity\n- If \u003c 20%: goto handoff\n- If ok: loop to scan-backlogs\n\nNeeds: execute-work\n\n## Step: handoff\nPrepare for session transition:\n- Summarize work completed this cycle\n- Note any in-progress items\n- Send handoff mail to self\n- Squash wisp to digest\n- Exit for fresh session\n\nNeeds: check-context\n```\n\n## Variables\n\n- `backlog_priority`: Override backlog scan order\n- `context_threshold`: Percentage at which to handoff (default: 20)\n- `max_items`: Maximum items to process per session\n\n## Notes\n\n- This is a vapor-phase molecule (wisp)\n- Each work item should squash to a digest\n- The patrol itself squashes at handoff","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T23:46:00.012418-08:00","updated_at":"2025-12-27T21:29:56.300762-08:00","dependencies":[{"issue_id":"gt-tnca.1","depends_on_id":"gt-tnca","type":"parent-child","created_at":"2025-12-22T23:46:00.012887-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.300762-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-tnca.2","title":"Implement context-checking for agent sessions","description":"Implement the context-checking step that allows agents to assess their remaining context capacity.\n\n## Challenge\n\nAgents don't have direct access to token counts. Need heuristics:\n\n### Approach 1: Message count heuristic\n- Count messages in conversation\n- Estimate tokens per message\n- Compare to model's context window\n\n### Approach 2: External tool\n- Claude Code could expose a /context command\n- Returns estimated usage percentage\n- Agent queries before each work item\n\n### Approach 3: Conservative fixed limits\n- After N work items, always handoff\n- Simple but may waste context or handoff too early\n\n### Approach 4: Hook-based injection\n- SessionStart hook injects context estimate\n- Updated periodically via tool call\n\n## Recommendation\n\nStart with Approach 3 (fixed limits) as MVP:\n- Default: 3-5 work items per session\n- Agent can override via variable\n- Upgrade to smarter heuristics later\n\n## Integration\n\n- Add check-context step to mol-ready-work\n- Implement handoff trigger logic\n- Test with various context states","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T23:46:28.414584-08:00","updated_at":"2025-12-27T21:29:56.292547-08:00","dependencies":[{"issue_id":"gt-tnca.2","depends_on_id":"gt-tnca","type":"parent-child","created_at":"2025-12-22T23:46:28.414964-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.292547-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-tnca.3","title":"Implement/document gt sling mechanism","description":"Investigate and implement the gt sling command for attaching molecules to crew workers. Check if gt sling exists, design the flow (mail-based vs direct pin vs wisp spawn), implement in gastown CLI, integrate with bd mol/wisp/pin commands.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T23:46:38.592538-08:00","updated_at":"2025-12-27T21:29:56.28411-08:00","dependencies":[{"issue_id":"gt-tnca.3","depends_on_id":"gt-tnca","type":"parent-child","created_at":"2025-12-22T23:46:38.592961-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.28411-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-tnca.4","title":"Document crew worker patrol integration","description":"Document how mol-ready-work integrates with crew worker workflow. Cover: starting the patrol, patrol vs normal operation, backlog configuration, handoff mechanics, discovered work handling. Update CLAUDE.md with patrol section.","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-22T23:46:38.867975-08:00","updated_at":"2025-12-27T21:29:57.530876-08:00","dependencies":[{"issue_id":"gt-tnca.4","depends_on_id":"gt-tnca","type":"parent-child","created_at":"2025-12-22T23:46:38.868332-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.530876-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-tnow","title":"Implement Christmas Ornament pattern for mol-witness-patrol","description":"Integrate dynamic bonding into mol-witness-patrol once Beads primitives are ready.\n\n## Completed\n\n- ✅ WaitsFor parsing (gt-odfr) - Parser now extracts WaitsFor: all-children\n- ✅ mol bond command (gt-isje) - gt mol bond creates dynamic children with --var expansion\n- ✅ WitnessPatrolMolecule() updated with Christmas Ornament structure (gt-tnow.1)\n- ✅ PolecatArmMolecule() created for per-polecat inspection (gt-tnow.2)\n\n## Remaining\n\n- gt-tnow.3: Plugin hook support (P2)\n- gt-tnow.4: Integration test with live polecats (P2)\n\n## Design Docs\n\n- docs/molecular-chemistry.md (already updated with pattern)\n- docs/architecture.md (already updated with activity feed)\n- gt-qflq: mol-witness-patrol design bead (updated)\n- gt-dapb: mol-polecat-arm proto","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-23T02:37:16.975171-08:00","updated_at":"2025-12-27T21:25:59.919787-08:00","deleted_at":"2025-12-27T21:25:59.919787-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-tnow.1","title":"Update WitnessPatrolMolecule() with Christmas Ornament structure","description":"Update builtin_molecules.go WitnessPatrolMolecule() to use dynamic bonding pattern.\n\nChanges:\n1. Add survey-workers step that documents dynamic bonding\n2. Add run-plugins step for plugin hook support\n3. Add aggregate step with WaitsFor: all-children\n4. Update step dependencies for new structure\n\nThe molecule prose guides the Witness agent on how to use bd mol bond.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T02:37:42.989186-08:00","updated_at":"2025-12-27T21:29:53.057265-08:00","dependencies":[{"issue_id":"gt-tnow.1","depends_on_id":"gt-tnow","type":"parent-child","created_at":"2025-12-23T02:37:42.98963-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.057265-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-tnow.2","title":"Create PolecatArmMolecule() for per-polecat inspection","description":"Add PolecatArmMolecule() to builtin_molecules.go.\n\nSteps:\n- capture: Capture tmux output\n- assess: Categorize state\n- load-history: Get nudge counts\n- decide: Apply nudge matrix\n- execute: Take action\n\nUses variable substitution:\n- {{polecat_name}}\n- {{rig}}\n\nThis proto is bonded dynamically by survey-workers step.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T02:37:43.081562-08:00","updated_at":"2025-12-27T21:29:53.048878-08:00","dependencies":[{"issue_id":"gt-tnow.2","depends_on_id":"gt-tnow","type":"parent-child","created_at":"2025-12-23T02:37:43.081898-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.048878-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-tnow.4","title":"Integration test: Christmas Ornament with live polecats","description":"Test the dynamic bonding pattern with actual polecats.\n\nTest scenarios:\n1. Patrol with 0 polecats (empty arms)\n2. Patrol with 3 polecats (parallel arms)\n3. Polecat appears mid-patrol (picked up next cycle)\n4. Nudge progression (idle -\u003e nudge-1 -\u003e nudge-2 -\u003e nudge-3)\n5. Pre-kill verification flow\n6. Activity feed output verification","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T02:37:43.268707-08:00","updated_at":"2025-12-27T21:29:56.217197-08:00","dependencies":[{"issue_id":"gt-tnow.4","depends_on_id":"gt-tnow","type":"parent-child","created_at":"2025-12-23T02:37:43.269055-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:56.217197-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-tocb","title":"gt handoff should accept full session paths like gastown/crew/max","description":"Currently gt handoff only accepts role shortcuts like 'crew', 'witness', etc. and tries to auto-detect the rig/name from environment. User tried 'gt handoff gastown/crew/max' and got 'unknown session type'. Should parse paths like '\u003crig\u003e/crew/\u003cname\u003e' or '\u003crig\u003e/\u003crole\u003e' directly into session names like 'gt-\u003crig\u003e-crew-\u003cname\u003e'.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-23T13:20:29.471194-08:00","updated_at":"2025-12-27T21:29:55.976969-08:00","deleted_at":"2025-12-27T21:29:55.976969-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-tridl","title":"Merge: dag-mjtm8k85","description":"branch: polecat/dag-mjtm8k85\ntarget: main\nsource_issue: dag-mjtm8k85\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T22:37:18.664169-08:00","updated_at":"2025-12-31T14:03:14.566047-08:00","closed_at":"2025-12-31T14:03:14.566047-08:00","created_by":"gastown/polecats/dag"}
{"id":"gt-ttom7","title":"Digest: mol-deacon-patrol","description":"Patrol 13: all clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T11:05:32.808512-08:00","updated_at":"2025-12-27T21:26:03.38887-08:00","deleted_at":"2025-12-27T21:26:03.38887-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-tucp2","title":"Merge: toast-mjw7062w","description":"branch: polecat/toast-mjw7062w\ntarget: main\nsource_issue: toast-mjw7062w\nrig: gastown\nagent_bead: gt-gastown-polecat-toast","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T17:27:33.943633-08:00","updated_at":"2026-01-01T23:11:06.497212-08:00","closed_at":"2026-01-01T23:11:06.497212-08:00","close_reason":"Stale MR - branch no longer exists","created_by":"gastown/polecats/toast"}
{"id":"gt-tulx","title":"gt mq submit: creates task type instead of merge-request type","description":"## Problem\n\n`gt mq submit` creates an issue with `type: task` but should be `type: merge-request`.\n\n## Evidence\n\n```\n$ bd show gt-n508\ngt-n508: Merge: gt-70b3\nStatus: open\nPriority: P1\nType: task \u003c-- WRONG, should be merge-request\n...\nDescription:\ntype: merge-request \u003c-- Correct type is in description, not in actual type field\n```\n\n## Impact\n\n`gt mq list` shows empty queue because it queries for `type: merge-request`\n\n## Fix\n\n`gt mq submit` should set `--type merge-request` when creating the bead.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-18T21:57:37.905848-08:00","updated_at":"2025-12-27T21:29:54.066926-08:00","deleted_at":"2025-12-27T21:29:54.066926-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-tvlam","title":"Digest: mol-deacon-patrol","description":"Patrol 6: All clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:01:29.899-08:00","updated_at":"2025-12-27T21:26:04.026295-08:00","deleted_at":"2025-12-27T21:26:04.026295-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-tw1g","title":"Startup hook should validate attached molecule is still open","description":"When the SessionStart hook outputs attached_molecule from the handoff bead, it should check if that molecule is still open. If closed, it should:\n1. Clear the stale attachment\n2. Auto-spawn a fresh patrol molecule\n3. Attach the new patrol\n\nCurrently it just outputs the stale reference without validation, leaving the Deacon confused about what to run.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-23T13:16:51.73028-08:00","updated_at":"2025-12-27T21:29:55.985303-08:00","deleted_at":"2025-12-27T21:29:55.985303-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-twjr5","title":"Async Coordination Gates","description":"Agents wait for external events without polling.\n\n## Problem\nAgents cannot wait for external conditions (CI completion, human approval, timers) without wasteful polling or losing state on handoff.\n\n## Requirements\n- Gate issue type in beads (type=gate)\n- bd gate create --await timer:5m / gh:run:123 / human:approve / mail:pattern\n- Deacon evaluates pending gates each patrol cycle\n- Agents park work on gate, resume when condition met\n- Gate timeout and notification\n\n## Gate Types\n- timer:\u003cduration\u003e - Simple delay (5m, 1h, 24h)\n- gh:run:\u003cid\u003e - GitHub Actions run completion\n- gh:pr:\u003cid\u003e - PR merged/closed \n- human:\u003cprompt\u003e - Human approval required\n- mail:\u003cpattern\u003e - Wait for mail matching pattern\n\n## Success Criteria\n- Agent can create gate and suspend work\n- Deacon wakes agent when gate condition met\n- Gates survive session cycling\n\nConsolidates gt-31eg. Parallel work with no dependencies.","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-25T20:46:07.930952-08:00","updated_at":"2025-12-27T21:29:52.498052-08:00","deleted_at":"2025-12-27T21:29:52.498052-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-twjr5.1","title":"Gate issue type and bd gate create","description":"Add type=gate to beads schema. Implement bd gate create command with --await flag. Gates are special issues that represent wait conditions. Example: bd gate create --await timer:5m --title 'Wait for cooldown'","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-25T20:55:44.796316-08:00","updated_at":"2025-12-27T21:29:45.985412-08:00","dependencies":[{"issue_id":"gt-twjr5.1","depends_on_id":"gt-twjr5","type":"parent-child","created_at":"2025-12-25T20:55:44.7969-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.985412-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-twjr5.2","title":"Timer gate evaluation","description":"Implement timer gate type. Format: timer:\u003cduration\u003e (5m, 1h, 24h). Deacon evaluates pending timer gates each patrol cycle. Gate closes when duration elapsed since creation. Simplest gate type - implement first.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-25T20:55:46.781785-08:00","updated_at":"2025-12-27T21:29:45.977006-08:00","dependencies":[{"issue_id":"gt-twjr5.2","depends_on_id":"gt-twjr5","type":"parent-child","created_at":"2025-12-25T20:55:46.783602-08:00","created_by":"daemon"},{"issue_id":"gt-twjr5.2","depends_on_id":"gt-twjr5.1","type":"blocks","created_at":"2025-12-25T20:56:46.669095-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.977006-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-twjr5.3","title":"GitHub gates (gh:run, gh:pr)","description":"Implement GitHub gate types using gh CLI. gh:run:\u003cid\u003e waits for Actions run completion. gh:pr:\u003cid\u003e waits for PR merged/closed. Deacon polls status during patrol cycle.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-25T20:55:48.108365-08:00","updated_at":"2025-12-27T21:29:45.968718-08:00","dependencies":[{"issue_id":"gt-twjr5.3","depends_on_id":"gt-twjr5","type":"parent-child","created_at":"2025-12-25T20:55:48.110599-08:00","created_by":"daemon"},{"issue_id":"gt-twjr5.3","depends_on_id":"gt-twjr5.1","type":"blocks","created_at":"2025-12-25T20:56:46.763122-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:45.968718-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-twjr5.4","title":"Human and mail gates","description":"Implement human:\u003cprompt\u003e gate (requires explicit approval command). Implement mail:\u003cpattern\u003e gate (waits for mail matching pattern). These enable async human-in-loop workflows.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:55:49.832861-08:00","updated_at":"2025-12-27T21:29:54.944096-08:00","dependencies":[{"issue_id":"gt-twjr5.4","depends_on_id":"gt-twjr5","type":"parent-child","created_at":"2025-12-25T20:55:49.833369-08:00","created_by":"daemon"},{"issue_id":"gt-twjr5.4","depends_on_id":"gt-twjr5.1","type":"blocks","created_at":"2025-12-25T20:56:46.85494-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.944096-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-twjr5.5","title":"Agent parking and resuming","description":"Implement gt park and gt resume commands. Agent parks work on a gate, state persists across sessions. When gate condition met, Deacon sends wake mail. Agent resumes from parked state. Critical for async workflows.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:55:51.294474-08:00","updated_at":"2025-12-27T21:29:54.935805-08:00","dependencies":[{"issue_id":"gt-twjr5.5","depends_on_id":"gt-twjr5","type":"parent-child","created_at":"2025-12-25T20:55:51.296459-08:00","created_by":"daemon"},{"issue_id":"gt-twjr5.5","depends_on_id":"gt-twjr5.2","type":"blocks","created_at":"2025-12-25T20:56:46.946883-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.935805-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-txjgc","title":"Digest: mol-deacon-patrol","description":"Patrol 96: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-31T15:26:24.314558-08:00","updated_at":"2025-12-31T15:26:24.314558-08:00","closed_at":"2025-12-31T15:26:24.314516-08:00"}
{"id":"gt-ty3w4","title":"Digest: mol-deacon-patrol","description":"Final patrol cycle 20: all systems healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T11:22:00.43096-08:00","updated_at":"2025-12-27T21:26:03.311941-08:00","deleted_at":"2025-12-27T21:26:03.311941-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-tyu37","title":"Digest: mol-deacon-patrol","description":"Patrol 13: healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-27T20:23:11.657968-08:00","updated_at":"2025-12-27T21:26:00.092483-08:00","deleted_at":"2025-12-27T21:26:00.092483-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-tzn8z","title":"Digest: mol-deacon-patrol","description":"Patrol 19: healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-27T20:25:18.709476-08:00","updated_at":"2025-12-27T21:25:59.99424-08:00","deleted_at":"2025-12-27T21:25:59.99424-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-tzogh","title":"Clean up deprecated gt mol command examples in molecule.go","description":"## Task\n\nThe molecule.go file contains example strings that reference deprecated gt mol commands.\n\n## Deprecated References Found\n\n1. `gt molecule instantiate mol-xyz` → should reference `bd mol pour`\n2. `gt mol list --catalog` → should reference `bd formula list` \n3. `gt mol bond mol-polecat-arm` → should reference `bd mol bond`\n\n## Location\n\n`internal/cmd/molecule.go` (multiple lines)\n\n## Context\n\nThe gt mol help already correctly redirects:\n```\nFor beads data operations (listing, showing, creating molecules), use bd:\n bd formula list List molecule protos (replaces gt mol catalog)\n bd mol show Show molecule details (replaces gt mol show)\n bd mol pour Instantiate molecule (replaces gt mol instantiate)\n bd mol bond Bond molecules together (replaces gt mol bond)\n```\n\nBut the example strings in the code still show the old patterns.\n\n## Fix\n\nUpdate the example strings to use the current bd mol patterns.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-27T17:12:36.602205-08:00","updated_at":"2025-12-27T21:29:54.696525-08:00","created_by":"gastown/crew/joe","deleted_at":"2025-12-27T21:29:54.696525-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-u0c1","title":"Merge: gt-qna4","description":"branch: polecat/capable\ntarget: main\nsource_issue: gt-qna4\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-20T07:47:28.453817-08:00","updated_at":"2025-12-27T21:27:22.684012-08:00","deleted_at":"2025-12-27T21:27:22.684012-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-u0lj5","title":"Digest: mol-deacon-patrol","description":"Patrol 13: all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:37:59.802062-08:00","updated_at":"2025-12-27T21:26:00.337168-08:00","deleted_at":"2025-12-27T21:26:00.337168-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-u0zqy","title":"Merge: nux-1767080427756","description":"branch: polecat/nux-1767080427756\ntarget: main\nsource_issue: nux-1767080427756\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-29T23:47:49.107094-08:00","updated_at":"2025-12-29T23:55:11.80929-08:00","closed_at":"2025-12-29T23:55:11.80929-08:00","close_reason":"Stale MR from nuked polecat","created_by":"gastown/polecats/nux"}
{"id":"gt-u1j","title":"Port Gas Town to Go","description":"Complete rewrite of Gas Town in Go for improved performance and single-binary distribution.\n\n## Goals\n- Single installable binary (gt)\n- All Python functionality ported\n- Federation support built-in\n- Improved performance\n\n## Phases\n1. Core infrastructure (config, workspace, git wrapper)\n2. Rig \u0026 polecat management\n3. Session \u0026 tmux operations\n4. Mail system\n5. CLI commands\n6. TUI (optional)","status":"tombstone","priority":0,"issue_type":"epic","created_at":"2025-12-15T16:36:28.769343-08:00","updated_at":"2025-12-27T21:29:45.707123-08:00","deleted_at":"2025-12-27T21:29:45.707123-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-u1j.1","title":"Go scaffolding: cmd/gt, go.mod, Cobra setup","description":"Set up Go project structure with CLI framework.\n\n**Stack:**\n- Cobra for command/flag handling\n- Lipgloss for styled terminal output\n\n**Deliverables:**\n- cmd/gt/main.go with Cobra root command\n- Basic subcommands: version, help\n- Lipgloss styles for status output (success, warning, error)\n- go.mod with dependencies","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-15T16:36:48.376267-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-u1j.1","depends_on_id":"gt-u1j","type":"parent-child","created_at":"2025-12-15T16:36:48.376622-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-u1j.10","title":"CLI: core commands (status, prime, version, init)","description":"Essential CLI commands for Gas Town operation.\n\n## Commands\n\n### gt status\nShow overall town status.\n```\ngt status [--json]\n```\nOutput:\n- Town name and location\n- Number of rigs\n- Active polecats across all rigs\n- Witness status per rig\n- Recent activity summary\n\n### gt prime\nOutput role context for current directory.\n```\ngt prime\n```\nDetects role from directory:\n- Town root or mayor/ → Mayor context\n- \u003crig\u003e/witness/rig/ → Witness context\n- \u003crig\u003e/refinery/rig/ → Refinery context\n- \u003crig\u003e/polecats/\u003cname\u003e/ → Polecat context\n\n### gt version\nShow version information.\n```\ngt version [--short]\n```\nOutput: version, git commit, build date.\n\n### gt init\nInitialize current rig for Gas Town (alternative to gt install for existing repos).\n```\ngt init [--force]\n```\nCreates Gas Town structure in existing git repo.\n\n## Implementation\n\nEach command is a Cobra subcommand under root:\n```go\nvar statusCmd = \u0026cobra.Command{...}\nvar primeCmd = \u0026cobra.Command{...}\nvar versionCmd = \u0026cobra.Command{...}\nvar initCmd = \u0026cobra.Command{...}\n```\n\nRegister in cmd/gt/main.go.","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-15T17:12:38.367667-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-u1j.10","depends_on_id":"gt-u1j","type":"parent-child","created_at":"2025-12-15T17:12:38.368006-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.10","depends_on_id":"gt-u1j.5","type":"blocks","created_at":"2025-12-15T17:14:06.123332-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-u1j.11","title":"CLI: session commands (start, stop, at, list, capture, inject)","description":"Session management CLI commands.\n\n## Commands\n\n### gt session start\nStart a polecat session.\n```\ngt session start \u003crig\u003e/\u003cpolecat\u003e [--issue \u003cid\u003e]\n```\n- Creates tmux session\n- Launches claude in polecat workdir\n- Optionally injects initial issue context\n\n### gt session stop\nStop a polecat session.\n```\ngt session stop \u003crig\u003e/\u003cpolecat\u003e [--force]\n```\n- Graceful shutdown by default\n- --force kills immediately\n\n### gt session at (attach)\nAttach to running session.\n```\ngt session at \u003crig\u003e/\u003cpolecat\u003e\n```\n- Attaches tmux session to current terminal\n- Detach with Ctrl-B D\n\n### gt session list\nList all sessions.\n```\ngt session list [--rig \u003crig\u003e] [--json]\n```\n- Shows running/stopped status\n- Optionally filter by rig\n\n### gt session capture\nCapture recent output from session.\n```\ngt session capture \u003crig\u003e/\u003cpolecat\u003e [--lines \u003cn\u003e]\n```\n- Default: last 100 lines\n- Useful for checking polecat progress\n\n### gt session inject\nSend message to session.\n```\ngt session inject \u003crig\u003e/\u003cpolecat\u003e -m \"message\"\ngt session inject \u003crig\u003e/\u003cpolecat\u003e -f \u003cfile\u003e\n```\n- Injects text via tmux send-keys\n- Used for nudges, notifications\n\n## Address Format\n\n`\u003crig\u003e/\u003cpolecat\u003e` e.g., `wyvern/Toast`\n\n## Implementation\n\nSession subcommand group:\n```go\nvar sessionCmd = \u0026cobra.Command{Use: \"session\"}\nsessionCmd.AddCommand(startCmd, stopCmd, atCmd, listCmd, captureCmd, injectCmd)\n```","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-15T17:12:40.70671-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-u1j.11","depends_on_id":"gt-u1j","type":"parent-child","created_at":"2025-12-15T17:12:40.707072-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.11","depends_on_id":"gt-u1j.7","type":"blocks","created_at":"2025-12-15T17:14:06.23195-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-u1j.12","title":"CLI: mail commands (send, inbox, read)","description":"CLI: mail commands wrapping bd mail.\n\n## Status\n\nCurrent implementation works with JSONL files. Refactor to wrap bd mail CLI.\n\n## Commands\n\n### gt mail inbox\n```\ngt mail inbox [--unread] [--json]\n```\nWraps: `bd mail inbox [--json]`\n\n### gt mail read\n```\ngt mail read \u003cid\u003e\n```\nWraps: `bd mail read \u003cid\u003e \u0026\u0026 bd mail ack \u003cid\u003e`\nNote: bd mail read + ack marks as read.\n\n### gt mail send\n```\ngt mail send \u003caddress\u003e -s \"Subject\" -m \"Body\" [--notify]\n```\nWraps: `bd mail send \u003crecipient\u003e -s \"Subject\" -m \"Body\"`\nIf --notify: inject tmux notification after send.\n\n### gt mail delete\n```\ngt mail delete \u003cid\u003e\n```\nWraps: `bd mail ack \u003cid\u003e` (closes the message issue)\n\n## Address Translation\n\nSame as gt-u1j.6:\n- `mayor/` → `mayor`\n- `\u003crig\u003e/refinery` → `\u003crig\u003e-refinery`\n- `\u003crig\u003e/\u003cpolecat\u003e` → `\u003crig\u003e-\u003cpolecat\u003e`\n\n## Dependencies\n\n- gt-u1j.6: Mail system backend","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-15T17:12:42.038558-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-u1j.12","depends_on_id":"gt-u1j","type":"parent-child","created_at":"2025-12-15T17:12:42.038885-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.12","depends_on_id":"gt-u1j.6","type":"blocks","created_at":"2025-12-15T17:14:06.328188-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.12","depends_on_id":"gt-r01","type":"blocks","created_at":"2025-12-16T13:12:08.639736-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-u1j.13","title":"Beads CLI wrapper: shell out to bd","description":"Wrapper for bd (beads CLI) commands.\n\n## Interface\n\n```go\ntype Beads struct {\n workDir string\n}\n\nfunc NewBeads(workDir string) *Beads\n\n// Query\nfunc (b *Beads) List(opts ListOptions) ([]*Issue, error)\nfunc (b *Beads) Ready() ([]*Issue, error)\nfunc (b *Beads) Show(id string) (*Issue, error)\nfunc (b *Beads) Blocked() ([]*Issue, error)\n\n// Mutations\nfunc (b *Beads) Create(opts CreateOptions) (*Issue, error)\nfunc (b *Beads) Update(id string, opts UpdateOptions) error\nfunc (b *Beads) Close(ids ...string) error\n\n// Sync\nfunc (b *Beads) Sync() error\nfunc (b *Beads) SyncStatus() (*SyncStatus, error)\n```\n\n## Data Types\n\n```go\ntype Issue struct {\n ID string\n Title string\n Status string\n Priority int\n Type string\n Description string\n Parent string\n Children []string\n DependsOn []string\n Blocks []string\n}\n\ntype ListOptions struct {\n Status string // \"open\", \"closed\", \"all\"\n Type string\n Priority int\n}\n\ntype CreateOptions struct {\n Title string\n Type string\n Priority int\n Description string\n Parent string\n}\n```\n\n## Implementation\n\nShell out to bd binary, parse JSON output where available.\n```go\nfunc (b *Beads) runBd(args ...string) ([]byte, error) {\n cmd := exec.Command(\"bd\", args...)\n cmd.Dir = b.workDir\n return cmd.Output()\n}\n```\n\n## Error Handling\n\n- bd not installed: Clear error with install instructions\n- Not a beads repo: Detect and report\n- Sync conflicts: Parse and report","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T17:12:55.926393-08:00","updated_at":"2025-12-27T21:29:54.670312-08:00","dependencies":[{"issue_id":"gt-u1j.13","depends_on_id":"gt-u1j","type":"parent-child","created_at":"2025-12-15T17:12:55.926744-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.13","depends_on_id":"gt-u1j.1","type":"blocks","created_at":"2025-12-15T17:14:21.168049-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.670312-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-u1j.14","title":"Merge queue: per-rig queue management","description":"Per-rig merge queue for coordinating polecat work.\n\n## Concept\n\nWhen polecats complete work, they submit to the merge queue. The Refinery processes the queue, merging work to main in order.\n\n## Data Model\n\n```go\ntype MergeRequest struct {\n ID string `json:\"id\"`\n Polecat string `json:\"polecat\"`\n Branch string `json:\"branch\"`\n Issue string `json:\"issue,omitempty\"`\n Status MRStatus `json:\"status\"`\n CreatedAt time.Time `json:\"created_at\"`\n UpdatedAt time.Time `json:\"updated_at\"`\n}\n\ntype MRStatus string\nconst (\n MRPending MRStatus = \"pending\"\n MRReviewing MRStatus = \"reviewing\"\n MRMerged MRStatus = \"merged\"\n MRRejected MRStatus = \"rejected\"\n)\n```\n\n## Interface\n\n```go\ntype MergeQueue struct {\n rig *Rig\n path string // \u003crig\u003e/refinery/queue.jsonl\n}\n\nfunc NewMergeQueue(rig *Rig) *MergeQueue\n\n// Queue operations\nfunc (mq *MergeQueue) Submit(polecat, branch string, issue string) (*MergeRequest, error)\nfunc (mq *MergeQueue) List() ([]*MergeRequest, error)\nfunc (mq *MergeQueue) Next() (*MergeRequest, error) // oldest pending\nfunc (mq *MergeQueue) Get(id string) (*MergeRequest, error)\n\n// Status updates\nfunc (mq *MergeQueue) SetStatus(id string, status MRStatus) error\nfunc (mq *MergeQueue) MarkMerged(id string) error\nfunc (mq *MergeQueue) MarkRejected(id, reason string) error\n```\n\n## Queue Storage\n\nJSONL file at `\u003crig\u003e/refinery/queue.jsonl`. FIFO ordering.\n\n## Git Coordination\n\nRefinery processes queue:\n1. Fetch polecat branch\n2. Attempt merge to main\n3. Run tests (if configured)\n4. Push to origin\n5. Update queue status\n\n## Conflict Handling\n\nIf merge fails:\n- Mark MR as rejected\n- Notify polecat to rebase","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-15T17:12:57.707908-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-u1j.14","depends_on_id":"gt-u1j","type":"parent-child","created_at":"2025-12-15T17:12:57.708254-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.14","depends_on_id":"gt-u1j.3","type":"blocks","created_at":"2025-12-15T17:14:18.469302-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.14","depends_on_id":"gt-u1j.5","type":"blocks","created_at":"2025-12-15T17:14:18.549859-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-u1j.16","title":"CLI: rig commands (list, show, add)","description":"Rig management CLI commands.\n\n## Commands\n\n### gt rig list\nList all registered rigs.\n```\ngt rig list [--json]\n```\nOutput:\n- Rig name\n- Git URL\n- Polecat count\n- Witness/Refinery status\n\n### gt rig show\nShow details for a specific rig.\n```\ngt rig show \u003cname\u003e [--json]\n```\nOutput:\n- Full rig info\n- Polecats with status\n- Recent activity\n- Beads summary (open issues)\n\n### gt rig add\nAdd a new rig to the town.\n```\ngt rig add \u003cname\u003e \u003cgit-url\u003e\ngt rig add \u003cname\u003e --local \u003cpath\u003e\n```\nSteps:\n- Clone repo to town root (or link local)\n- Register in rigs.json\n- Create agent directories\n- Configure git exclude\n\n### gt rig remove\nRemove a rig from the town.\n```\ngt rig remove \u003cname\u003e [--force]\n```\n- Warns if active polecats\n- --force required if uncommitted changes\n\n## Implementation\n\n```go\nvar rigCmd = \u0026cobra.Command{Use: \"rig\"}\nrigCmd.AddCommand(rigListCmd, rigShowCmd, rigAddCmd, rigRemoveCmd)\n```","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T17:13:01.075697-08:00","updated_at":"2025-12-27T21:29:54.661794-08:00","dependencies":[{"issue_id":"gt-u1j.16","depends_on_id":"gt-u1j","type":"parent-child","created_at":"2025-12-15T17:13:01.076033-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.16","depends_on_id":"gt-u1j.5","type":"blocks","created_at":"2025-12-15T17:14:08.30402-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.661794-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-u1j.17","title":"CLI: polecat commands (add, remove, list, wake, sleep)","description":"Polecat management CLI commands.\n\n## Commands\n\n### gt polecat list\nList polecats in a rig.\n```\ngt polecat list \u003crig\u003e [--json]\ngt polecat list --all [--json]\n```\nOutput:\n- Name\n- State (idle/active/working/done/stuck)\n- Current issue (if any)\n- Session status (running/stopped)\n\n### gt polecat add\nAdd a new polecat to a rig.\n```\ngt polecat add \u003crig\u003e \u003cname\u003e\n```\n- Creates polecat directory\n- Clones rig repo\n- Creates work branch\n- Initializes state\n\n### gt polecat remove\nRemove a polecat from a rig.\n```\ngt polecat remove \u003crig\u003e/\u003cpolecat\u003e [--force]\n```\n- Fails if session running\n- Warns if uncommitted changes\n- --force bypasses checks\n\n### gt polecat wake\nMark polecat as active (ready for work).\n```\ngt polecat wake \u003crig\u003e/\u003cpolecat\u003e\n```\nTransitions: idle → active\n\n### gt polecat sleep\nMark polecat as idle (not available).\n```\ngt polecat sleep \u003crig\u003e/\u003cpolecat\u003e\n```\nTransitions: active → idle\nFails if session running (stop first).\n\n## Address Format\n\n`\u003crig\u003e/\u003cpolecat\u003e` for operations on specific polecat.\n`\u003crig\u003e` for list command.\n\n## Implementation\n\n```go\nvar polecatCmd = \u0026cobra.Command{Use: \"polecat\"}\npolecatCmd.AddCommand(listCmd, addCmd, removeCmd, wakeCmd, sleepCmd)\n```","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T17:13:13.683142-08:00","updated_at":"2025-12-27T21:29:54.653265-08:00","dependencies":[{"issue_id":"gt-u1j.17","depends_on_id":"gt-u1j","type":"parent-child","created_at":"2025-12-15T17:13:13.683486-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.17","depends_on_id":"gt-u1j.8","type":"blocks","created_at":"2025-12-15T17:14:08.405807-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.653265-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-u1j.18","title":"CLI: witness commands (start, stop, status)","description":"Witness daemon CLI commands.\n\n## Commands\n\n### gt witness start\nStart the witness daemon for a rig.\n```\ngt witness start \u003crig\u003e [--interval \u003cseconds\u003e]\n```\n- Starts background daemon process\n- Default heartbeat interval: 30s\n- Logs to \u003crig\u003e/witness/witness.log\n\n### gt witness stop\nStop the witness daemon.\n```\ngt witness stop \u003crig\u003e\n```\n- Graceful shutdown\n- Completes current heartbeat cycle\n\n### gt witness status\nShow witness status.\n```\ngt witness status \u003crig\u003e [--json]\n```\nOutput:\n- Running/stopped\n- Last heartbeat time\n- Polecats being monitored\n- Recent nudges/escalations\n\n### gt witness logs\nView witness logs.\n```\ngt witness logs \u003crig\u003e [--follow] [--lines \u003cn\u003e]\n```\n- --follow: Continuous output (like tail -f)\n- --lines: Number of lines (default 50)\n\n## Daemon Management\n\nWitness runs as background process:\n- PID stored in \u003crig\u003e/witness/witness.pid\n- Uses nohup for persistence\n- Logs structured JSON events\n\n## Implementation\n\n```go\nvar witnessCmd = \u0026cobra.Command{Use: \"witness\"}\nwitnessCmd.AddCommand(startCmd, stopCmd, statusCmd, logsCmd)\n```\n\nStart launches daemon via exec, status reads PID file and pings process.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T17:13:15.022371-08:00","updated_at":"2025-12-27T21:29:54.644654-08:00","dependencies":[{"issue_id":"gt-u1j.18","depends_on_id":"gt-u1j","type":"parent-child","created_at":"2025-12-15T17:13:15.022774-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.18","depends_on_id":"gt-u1j.9","type":"blocks","created_at":"2025-12-15T17:14:08.490417-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.644654-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-u1j.19","title":"CLI: spawn command","description":"High-level command to spawn a polecat with work assignment.\n\n## Command\n\n```\ngt spawn \u003crig\u003e/\u003cpolecat\u003e --issue \u003cid\u003e\ngt spawn \u003crig\u003e --issue \u003cid\u003e # auto-select available polecat\ngt spawn \u003crig\u003e/\u003cpolecat\u003e -m \"Work on X\" # free-form task\n```\n\n## What It Does\n\n1. Select or create polecat\n2. Ensure polecat is in idle/active state\n3. Assign the issue (update beads, set polecat state)\n4. Start session (gt session start)\n5. Inject initial context\n\n## Options\n\n- `--issue \u003cid\u003e`: Beads issue to assign\n- `-m \"message\"`: Free-form work description\n- `--create`: Create polecat if it does not exist\n- `--no-start`: Assign work but do not start session\n\n## Auto-Select Logic\n\nIf polecat not specified:\n1. Look for idle polecats in rig\n2. Prefer ones that worked on similar issues before\n3. If none available, fail (or create if --create)\n\n## Initial Context\n\nInjects into session:\n```\n[SPAWN] You have been assigned: \u003cissue-title\u003e\nIssue: \u003cid\u003e\nPriority: \u003cpriority\u003e\n\n\u003cdescription\u003e\n\nWork on this issue. Signal DONE when complete.\n```\n\n## Implementation\n\nOrchestrates PolecatManager + SessionManager + Beads:\n```go\nfunc Spawn(rig, polecat, issue string, opts SpawnOptions) error\n```","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-15T17:13:16.676891-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-u1j.19","depends_on_id":"gt-u1j","type":"parent-child","created_at":"2025-12-15T17:13:16.677248-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.19","depends_on_id":"gt-u1j.8","type":"blocks","created_at":"2025-12-15T17:14:09.726436-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.19","depends_on_id":"gt-u1j.6","type":"blocks","created_at":"2025-12-15T17:14:09.821801-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-u1j.20","title":"Prompt templates: role contexts, nudge messages","description":"Prompt templates for role contexts and messages.\n\n## Template Types\n\n### Role CLAUDE.md Templates\n\nFor gt prime and CLAUDE.md generation:\n- Mayor template\n- Witness template\n- Refinery template\n- Polecat template\n\n### Message Templates\n\nFor notifications and nudges:\n- Spawn assignment\n- Witness nudge\n- Escalation to Mayor\n- Session handoff\n\n## Implementation\n\n```go\n//go:embed templates/*.md.tmpl\nvar templateFS embed.FS\n\ntype Templates struct {\n fs embed.FS\n}\n\nfunc NewTemplates() *Templates\nfunc (t *Templates) RenderRole(role string, data RoleData) (string, error)\nfunc (t *Templates) RenderMessage(name string, data any) (string, error)\n```\n\n## Template Data\n\n```go\ntype RoleData struct {\n Role string\n RigName string\n TownRoot string\n Polecats []string\n Commands []CommandHelp\n}\n\ntype SpawnData struct {\n Issue string\n Title string\n Priority int\n Description string\n}\n\ntype NudgeData struct {\n Polecat string\n Reason string\n NudgeCount int\n MaxNudges int\n}\n```\n\n## Template Location\n\nEmbedded in binary via go:embed:\n```\ninternal/templates/\n├── roles/\n│ ├── mayor.md.tmpl\n│ ├── witness.md.tmpl\n│ ├── refinery.md.tmpl\n│ └── polecat.md.tmpl\n└── messages/\n ├── spawn.md.tmpl\n ├── nudge.md.tmpl\n └── escalation.md.tmpl\n```\n\n## Usage\n\ngt prime uses role templates.\ngt spawn uses spawn template.\nWitness uses nudge/escalation templates.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T17:13:18.711762-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-u1j.20","depends_on_id":"gt-u1j","type":"parent-child","created_at":"2025-12-15T17:13:18.712116-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.20","depends_on_id":"gt-u1j.1","type":"blocks","created_at":"2025-12-15T17:14:21.32864-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-u1j.21","title":"Test infrastructure: fixtures, mocks, integration tests","description":"Test infrastructure for Gas Town.\n\n## Test Categories\n\n### Unit Tests\n- Config parsing\n- Mail operations\n- Merge queue operations\n- Template rendering\n\n### Integration Tests\n- Git operations (uses real git in temp dirs)\n- Full workflows (spawn → work → merge)\n- CLI command tests\n\n## Fixtures\n\n```go\npackage testutil\n\n// Creates temp town with minimal structure\nfunc NewTestTown(t *testing.T) *TestTown\n\n// Creates temp rig with git repo\nfunc NewTestRig(t *testing.T, town *TestTown, name string) *TestRig\n\n// Creates test polecat\nfunc NewTestPolecat(t *testing.T, rig *TestRig, name string) *TestPolecat\n```\n\n## Mocks\n\n```go\n// Mock tmux for session tests (avoid actual tmux)\ntype MockTmux struct {\n Sessions map[string][]string // captured send-keys\n}\n\nfunc (m *MockTmux) NewSession(name, dir string) error\nfunc (m *MockTmux) SendKeys(session, keys string) error\nfunc (m *MockTmux) CapturePane(session string, lines int) (string, error)\n```\n\n## Test Helpers\n\n```go\n// Run test in temp directory\nfunc InTempDir(t *testing.T, fn func(dir string))\n\n// Initialize git repo for testing\nfunc InitGitRepo(t *testing.T, dir string)\n\n// Assert file exists with content\nfunc AssertFileContains(t *testing.T, path, substr string)\n\n// Assert JSON file has field\nfunc AssertJSONField(t *testing.T, path, field string, expected any)\n```\n\n## CI Integration\n\n- go test ./... runs all tests\n- Integration tests skip with -short flag\n- Coverage report generation\n\n## Test Data\n\nEmbedded test fixtures:\n```go\n//go:embed testdata/*\nvar testdataFS embed.FS\n```","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T17:13:28.755475-08:00","updated_at":"2025-12-27T21:29:54.635915-08:00","dependencies":[{"issue_id":"gt-u1j.21","depends_on_id":"gt-u1j","type":"parent-child","created_at":"2025-12-15T17:13:28.755866-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.21","depends_on_id":"gt-u1j.1","type":"blocks","created_at":"2025-12-15T17:14:21.425389-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.635915-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-u1j.22","title":"CLI: gt stop --all (emergency swarm kill)","description":"Emergency kill command for swarm operations.\n\n## Commands\n\n```bash\ngt stop --all # Kill ALL sessions across all rigs\ngt stop --rig \u003cname\u003e # Kill all sessions in one rig\ngt stop --rig \u003cname\u003e --graceful # Try graceful first, then force\n```\n\n## Implementation\n\n### --all flag\n1. List all tmux sessions matching gt-*\n2. For each: capture final output, kill session\n3. Update all polecat states to idle\n4. Log all killed sessions\n5. Send mail to Mayor: \"Emergency stop executed\"\n\n### --rig flag\n1. List sessions for specific rig\n2. Same as above but scoped\n\n### --graceful flag\n1. First, try to inject \"exit\" command\n2. Wait 5 seconds\n3. If still running, force kill\n\n## Output\n\n```\nStopping all Gas Town sessions...\n [wyvern] Toast: captured output, killed\n [wyvern] Capable: captured output, killed\n [beads] Nux: captured output, killed\n \n3 sessions stopped.\nAll polecat states set to 'idle'.\nMail sent to mayor/.\n```\n\n## Safety\n\n- Always capture output before killing\n- Update state files to reflect reality\n- Log all operations to gt.log\n- Warn if uncommitted changes detected (but still kill)\n\n## Dependencies\n\nNeeds: Tmux wrapper, Session management, Polecat management","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-15T23:17:51.455669-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-u1j.22","depends_on_id":"gt-u1j","type":"parent-child","created_at":"2025-12-15T23:17:51.456008-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.22","depends_on_id":"gt-u1j.7","type":"blocks","created_at":"2025-12-15T23:18:58.198873-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.22","depends_on_id":"gt-u1j.8","type":"blocks","created_at":"2025-12-15T23:18:58.297176-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-u1j.3","title":"Git wrapper: shell out to git","description":"Wrapper for git operations via subprocess.\n\n## Interface\n\n```go\ntype Git struct {\n workDir string\n}\n\nfunc NewGit(workDir string) *Git\n\n// Core operations\nfunc (g *Git) Clone(url, dest string) error\nfunc (g *Git) Checkout(ref string) error\nfunc (g *Git) Fetch(remote string) error\nfunc (g *Git) Pull(remote, branch string) error\nfunc (g *Git) Push(remote, branch string, force bool) error\n\n// Commit operations\nfunc (g *Git) Add(paths ...string) error\nfunc (g *Git) Commit(message string) error\nfunc (g *Git) CommitAll(message string) error\n\n// Query operations\nfunc (g *Git) Status() (*GitStatus, error)\nfunc (g *Git) CurrentBranch() (string, error)\nfunc (g *Git) HasUncommittedChanges() (bool, error)\nfunc (g *Git) RemoteURL(remote string) (string, error)\n\n// Merge operations\nfunc (g *Git) Merge(branch string) error\nfunc (g *Git) Rebase(onto string) error\nfunc (g *Git) AbortMerge() error\nfunc (g *Git) AbortRebase() error\n```\n\n## Implementation\n\nShell out to git binary via exec.Command. Parse output where needed (Status, CurrentBranch).\n\n## Error Handling\n\n- Wrap git stderr in error messages\n- Detect specific failures (merge conflict, auth failure, not a repo)\n- Return structured errors for callers to handle\n\n## Testing\n\n- Use temp directories with real git repos\n- Test both success and failure paths","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-15T17:12:11.907807-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-u1j.3","depends_on_id":"gt-u1j","type":"parent-child","created_at":"2025-12-15T17:12:11.908262-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.3","depends_on_id":"gt-u1j.1","type":"blocks","created_at":"2025-12-15T17:13:47.220423-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-u1j.4","title":"Tmux wrapper: session operations","description":"Wrapper for tmux session operations via subprocess.\n\n## Interface\n\n```go\ntype Tmux struct{}\n\nfunc NewTmux() *Tmux\n\n// Session lifecycle\nfunc (t *Tmux) NewSession(name, workDir string) error\nfunc (t *Tmux) KillSession(name string) error\nfunc (t *Tmux) HasSession(name string) (bool, error)\nfunc (t *Tmux) ListSessions() ([]string, error)\n\n// Session interaction\nfunc (t *Tmux) SendKeys(session, keys string) error\nfunc (t *Tmux) CapturePane(session string, lines int) (string, error)\nfunc (t *Tmux) AttachSession(session string) error\n\n// Window operations (if needed)\nfunc (t *Tmux) SelectWindow(session string, index int) error\n```\n\n## Session Naming\n\nConvention: `gt-\u003crig\u003e-\u003cpolecat\u003e` (e.g., `gt-wyvern-Toast`)\n\n## Implementation\n\n- Shell out to tmux binary\n- CapturePane uses `tmux capture-pane -p -t \u003csession\u003e -S -\u003clines\u003e`\n- SendKeys uses `tmux send-keys -t \u003csession\u003e '\u003ckeys\u003e' Enter`\n\n## Error Handling\n\n- \"no server running\": tmux not started (not an error for HasSession)\n- \"session not found\": session doesn't exist\n- Wrap all tmux errors with context\n\n## Environment\n\nRequires TMUX_TMPDIR or uses default socket. Don't assume tmux is running.","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-15T17:12:13.432706-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-u1j.4","depends_on_id":"gt-u1j","type":"parent-child","created_at":"2025-12-15T17:12:13.433043-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.4","depends_on_id":"gt-u1j.1","type":"blocks","created_at":"2025-12-15T17:13:47.298442-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-u1j.5","title":"Rig management: discover, load, create rigs","description":"Rig discovery, loading, and creation.\n\n## Interface\n\n```go\ntype Rig struct {\n Name string\n Path string\n GitURL string\n Config RigConfig\n Polecats []string\n HasWitness bool\n HasRefinery bool\n}\n\ntype RigManager struct {\n townRoot string\n config *Config\n git *Git\n}\n\nfunc NewRigManager(townRoot string, config *Config, git *Git) *RigManager\n\n// Discovery\nfunc (rm *RigManager) DiscoverRigs() ([]*Rig, error)\nfunc (rm *RigManager) GetRig(name string) (*Rig, error)\nfunc (rm *RigManager) RigExists(name string) bool\n\n// Creation\nfunc (rm *RigManager) AddRig(name, gitURL string) (*Rig, error)\nfunc (rm *RigManager) RemoveRig(name string) error\n\n// Rig state\nfunc (rm *RigManager) LoadRigConfig(rig *Rig) error\nfunc (rm *RigManager) SaveRigConfig(rig *Rig) error\n```\n\n## Discovery Logic\n\n1. Read rigs.json from config/\n2. For each registered rig, verify directory exists\n3. Load rig-level config if present\n4. Scan for polecats/ subdirectory\n5. Check for witness/, refinery/ directories\n\n## Rig Creation\n\n1. Clone repo to town root\n2. Add entry to rigs.json\n3. Create agent directories (polecats/, witness/, refinery/, mayor/)\n4. Update .git/info/exclude to ignore agent dirs\n5. Initialize rig config\n\n## Rig Structure\n\nPer docs/architecture.md:\n```\n\u003crig\u003e/\n├── polecats/\n├── refinery/rig/\n├── witness/rig/\n└── mayor/rig/\n```","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-15T17:12:15.034694-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-u1j.5","depends_on_id":"gt-u1j","type":"parent-child","created_at":"2025-12-15T17:12:15.035022-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.5","depends_on_id":"gt-u1j.3","type":"blocks","created_at":"2025-12-15T17:13:49.93957-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.5","depends_on_id":"gt-f9x.1","type":"blocks","created_at":"2025-12-15T17:13:50.047236-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-u1j.6","title":"Mail system: JSONL inbox, delivery, tmux injection","description":"Refactor mail system to wrap bd mail commands.\n\n## Status\n\nCurrent implementation uses JSONL files directly. Now that Beads mail is available (gt-r01 closed), refactor to use bd mail as the backend.\n\n## Implementation\n\nReplace internal/mail/* with thin wrappers around bd CLI:\n\n```go\n// Send wraps: bd mail send \u003cto\u003e -s \u003csubject\u003e -m \u003cbody\u003e\nfunc (r *Router) Send(msg *Message) error\n\n// List wraps: bd mail inbox --json\nfunc (m *Mailbox) List() ([]*Message, error)\n\n// Get wraps: bd mail read \u003cid\u003e --json\nfunc (m *Mailbox) Get(id string) (*Message, error)\n\n// MarkRead wraps: bd mail ack \u003cid\u003e\nfunc (m *Mailbox) MarkRead(id string) error\n```\n\n## Address Translation\n\nGGT addresses → Beads recipients:\n- `mayor/` → `mayor`\n- `\u003crig\u003e/refinery` → `\u003crig\u003e-refinery`\n- `\u003crig\u003e/\u003cpolecat\u003e` → `\u003crig\u003e-\u003cpolecat\u003e`\n\n## Tmux Notification\n\nKeep the tmux injection for --notify flag. After bd mail send, optionally inject notification into tmux session.\n\n## Migration\n\nNo migration needed - bd mail creates messages as beads issues with type=message. Old JSONL inboxes can be ignored or cleaned up.\n\n## Dependencies\n\n- gt-r01: CLOSED (Beads mail available)\n- gt-u1j.4: Tmux wrapper (for notifications)","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-15T17:12:16.226478-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-u1j.6","depends_on_id":"gt-u1j","type":"parent-child","created_at":"2025-12-15T17:12:16.226821-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.6","depends_on_id":"gt-u1j.4","type":"blocks","created_at":"2025-12-15T17:13:51.98774-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.6","depends_on_id":"gt-r01","type":"blocks","created_at":"2025-12-16T13:12:08.753309-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-u1j.7","title":"Session management: start, stop, attach, capture","description":"Polecat session lifecycle management.\n\n## Interface\n\n```go\ntype SessionManager struct {\n tmux *Tmux\n rig *Rig\n}\n\nfunc NewSessionManager(tmux *Tmux, rig *Rig) *SessionManager\n\n// Lifecycle\nfunc (sm *SessionManager) Start(polecat string, opts StartOptions) error\nfunc (sm *SessionManager) Stop(polecat string) error\nfunc (sm *SessionManager) IsRunning(polecat string) (bool, error)\nfunc (sm *SessionManager) List() ([]SessionInfo, error)\n\n// Interaction\nfunc (sm *SessionManager) Attach(polecat string) error\nfunc (sm *SessionManager) Capture(polecat string, lines int) (string, error)\nfunc (sm *SessionManager) Inject(polecat string, message string) error\n\ntype StartOptions struct {\n WorkDir string // defaults to polecat clone dir\n Issue string // optional: issue to work on\n Command string // optional: override claude command\n}\n\ntype SessionInfo struct {\n Polecat string\n SessionID string\n Running bool\n StartedAt time.Time\n}\n```\n\n## Session Naming\n\n`gt-\u003crig\u003e-\u003cpolecat\u003e` (e.g., `gt-wyvern-Toast`)\n\n## Start Flow\n\n1. Verify polecat clone exists\n2. Check no existing session\n3. Create tmux session in polecat workdir\n4. Send initial command: `claude` or custom\n5. If issue provided, inject initial prompt\n\n## Stop Flow\n\n1. Capture final output (for logging)\n2. Send exit/quit command\n3. Wait briefly for graceful shutdown\n4. Kill session if still running\n\n## Capture\n\nUses tmux capture-pane to get recent output. Returns last N lines.\n\n## Inject\n\nSends text to session via tmux send-keys. Used for:\n- Mail notifications\n- Witness nudges\n- User messages","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-15T17:12:25.473674-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-u1j.7","depends_on_id":"gt-u1j","type":"parent-child","created_at":"2025-12-15T17:12:25.473993-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.7","depends_on_id":"gt-u1j.4","type":"blocks","created_at":"2025-12-15T17:13:52.081053-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-u1j.8","title":"Polecat management: add, remove, list, state","description":"Polecat lifecycle: add, remove, list, state tracking.\n\n## Data Model\n\n```go\ntype Polecat struct {\n Name string `json:\"name\"`\n Rig string `json:\"rig\"`\n State PolecatState `json:\"state\"`\n ClonePath string `json:\"clone_path\"`\n Branch string `json:\"branch\"`\n Issue string `json:\"issue,omitempty\"`\n CreatedAt time.Time `json:\"created_at\"`\n UpdatedAt time.Time `json:\"updated_at\"`\n}\n\ntype PolecatState string\nconst (\n StateIdle PolecatState = \"idle\"\n StateActive PolecatState = \"active\"\n StateWorking PolecatState = \"working\"\n StateDone PolecatState = \"done\"\n StateStuck PolecatState = \"stuck\"\n)\n```\n\n## Interface\n\n```go\ntype PolecatManager struct {\n rig *Rig\n git *Git\n}\n\nfunc NewPolecatManager(rig *Rig, git *Git) *PolecatManager\n\n// Lifecycle\nfunc (pm *PolecatManager) Add(name string) (*Polecat, error)\nfunc (pm *PolecatManager) Remove(name string) error\nfunc (pm *PolecatManager) List() ([]*Polecat, error)\nfunc (pm *PolecatManager) Get(name string) (*Polecat, error)\n\n// State\nfunc (pm *PolecatManager) SetState(name string, state PolecatState) error\nfunc (pm *PolecatManager) AssignIssue(name, issue string) error\nfunc (pm *PolecatManager) ClearIssue(name string) error\n\n// Convenience\nfunc (pm *PolecatManager) Wake(name string) error // idle → active\nfunc (pm *PolecatManager) Sleep(name string) error // active → idle\n```\n\n## Add Flow\n\n1. Create directory: `\u003crig\u003e/polecats/\u003cname\u003e/`\n2. Clone rig repo into it (or copy from rig root)\n3. Create new branch: `polecat/\u003cname\u003e`\n4. Initialize polecat state file\n5. Create mail inbox\n\n## Remove Flow\n\n1. Verify session not running\n2. Check for uncommitted changes (warn/error)\n3. Remove directory\n4. Clean up state\n\n## State Persistence\n\nStore in `\u003crig\u003e/polecats/\u003cname\u003e/state.json`","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-15T17:12:27.402824-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-u1j.8","depends_on_id":"gt-u1j","type":"parent-child","created_at":"2025-12-15T17:12:27.403171-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.8","depends_on_id":"gt-u1j.5","type":"blocks","created_at":"2025-12-15T17:13:53.747126-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.8","depends_on_id":"gt-u1j.3","type":"blocks","created_at":"2025-12-15T17:13:53.831197-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-u1j.9","title":"Witness daemon: heartbeat loop, spawn ephemeral agent","description":"Background daemon for agent lifecycle and monitoring.\n\n## Core Responsibilities\n\n1. **Heartbeat monitoring**: Check agent health periodically\n2. **Session lifecycle**: Restart agents after handoff (Witness Protection)\n3. **Polecat monitoring**: Track worker progress, nudging\n4. **Escalation**: Report failures to Mayor\n\n## Session Lifecycle (Witness Protection)\n\nThe daemon enables autonomous long-running operation by cycling agent sessions:\n\n```go\ntype SessionLifecycle struct {\n AgentType string // \"witness\", \"refinery\"\n StatePath string // path to state.json\n}\n\nfunc (d *Daemon) monitorSessionLifecycle(agent SessionLifecycle) {\n // 1. Detect session exit\n // 2. Check state.json for requesting_cycle: true\n // 3. If cycle requested: start new session, clear flag\n // 4. If unexpected exit: escalate to Mayor\n}\n```\n\nSee architecture.md Key Decision #12: Agent Session Lifecycle.\n\n## Interface\n\n```go\ntype Daemon struct {\n rig *Rig\n polecats *PolecatManager\n sessions *SessionManager\n config DaemonConfig\n}\n\ntype DaemonConfig struct {\n HeartbeatInterval time.Duration // default: 30s\n NudgeThreshold int // nudges before escalation\n MaxWorkers int // from rig config\n}\n\nfunc NewDaemon(rig *Rig) *Daemon\nfunc (d *Daemon) Start() error\nfunc (d *Daemon) Stop() error\n\n// Lifecycle\nfunc (d *Daemon) CycleSession(agentType string) error\nfunc (d *Daemon) SpawnWorker(issueID string) error\nfunc (d *Daemon) ShutdownWorker(polecat string) error\n```\n\n## Heartbeat Loop\n\nEvery HeartbeatInterval:\n1. Check Witness session (restart if cycle requested)\n2. Check Refinery session (restart if cycle requested)\n3. For each active polecat: assess progress, nudge if stuck\n4. Query `bd ready` for new work, spawn up to max_workers","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T17:12:29.389103-08:00","updated_at":"2025-12-27T21:29:54.678807-08:00","dependencies":[{"issue_id":"gt-u1j.9","depends_on_id":"gt-u1j","type":"parent-child","created_at":"2025-12-15T17:12:29.389428-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.9","depends_on_id":"gt-u1j.7","type":"blocks","created_at":"2025-12-15T17:14:04.353775-08:00","created_by":"daemon"},{"issue_id":"gt-u1j.9","depends_on_id":"gt-u1j.8","type":"blocks","created_at":"2025-12-15T17:14:04.440363-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.678807-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-u1k","title":"gt shutdown should fully cleanup polecats (worktrees, branches, inboxes)","description":"Current gt shutdown only kills tmux sessions but leaves:\n\n1. Git worktrees in polecats/ directory\n2. Polecat branches (polecat/\u003cname\u003e)\n3. Inbox messages\n4. Beads assignee state\n\nThis causes stale polecats to be reused on next startup.\n\n## Expected Behavior\n\ngt shutdown should:\n1. Kill all polecat tmux sessions (current behavior)\n2. For each polecat with StateIdle or StateDone:\n - Remove worktree\n - Delete branch\n - Clear inbox\n3. For polecats with uncommitted work:\n - REFUSE to clean up\n - Report which polecats have uncommitted changes\n - Require --force or --nuclear to proceed\n\n## Current Code\n\n- start.go:runImmediateShutdown() only calls t.KillSession()\n- Witness cleanupPolecat() does full cleanup but isn't called by shutdown\n\n## Fix\n\nEither:\n1. Call Witness cleanup for each polecat during shutdown\n2. OR add rig.ShutdownPolecats() that does the full cleanup","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-20T15:22:42.921949-08:00","updated_at":"2025-12-27T21:29:53.727903-08:00","dependencies":[{"issue_id":"gt-u1k","depends_on_id":"gt-8v8","type":"blocks","created_at":"2025-12-20T15:23:15.226189-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:53.727903-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-u1lf","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:26","description":"Patrol 12: quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:26:31.865491-08:00","updated_at":"2025-12-27T21:26:05.255683-08:00","deleted_at":"2025-12-27T21:26:05.255683-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-u201i","title":"Merge: dag-mjtm8k85","description":"branch: polecat/dag-mjtm8k85\ntarget: main\nsource_issue: dag-mjtm8k85\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T22:16:19.897281-08:00","updated_at":"2026-01-01T09:53:50.706238-08:00","closed_at":"2026-01-01T09:53:50.706238-08:00","close_reason":"Branch deleted, no audit trail - unverifiable","created_by":"gastown/polecats/dag"}
{"id":"gt-u29p","title":"Digest: mol-deacon-patrol","description":"Patrol #10: Stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:33:10.196304-08:00","updated_at":"2025-12-27T21:26:04.318297-08:00","deleted_at":"2025-12-27T21:26:04.318297-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-u2vg","title":"gt spawn --issue should auto-attach mol-polecat-work","description":"When spawning a polecat with --issue but no --molecule, the polecat gets an issue assigned and work assignment mail, but no molecule on their hook. They check 'gt mol status', see nothing, and are stuck.\n\nFix: If --issue is provided but --molecule is not, auto-attach mol-polecat-work (or a configurable default) to the polecat's hook.\n\nCurrent behavior:\n- Issue assigned in beads ✓\n- Mail sent ✓ \n- No molecule attached ✗\n\nExpected:\n- Issue assigned in beads ✓\n- Mail sent ✓\n- mol-polecat-work attached to hook ✓","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-23T19:53:24.817448-08:00","updated_at":"2025-12-27T21:29:52.853443-08:00","deleted_at":"2025-12-27T21:29:52.853443-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-u3mm7","title":"Digest: mol-deacon-patrol","description":"Patrol complete: 0 polecats, 0 gates, all witnesses/refineries healthy, no orphans","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:35:07.072498-08:00","updated_at":"2025-12-27T21:26:01.461122-08:00","deleted_at":"2025-12-27T21:26:01.461122-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-u41w","title":"Merge: gt-5af.1","description":"branch: polecat/Ace\ntarget: main\nsource_issue: gt-5af.1\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T17:30:40.48203-08:00","updated_at":"2025-12-27T21:27:22.717062-08:00","deleted_at":"2025-12-27T21:27:22.717062-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-u49zh","title":"Include session ID in PropulsionNudge for /resume picker","status":"closed","priority":1,"issue_type":"task","created_at":"2026-01-02T20:44:07.691441-08:00","updated_at":"2026-01-02T20:50:28.192394-08:00","closed_at":"2026-01-02T20:50:28.192394-08:00","close_reason":"PropulsionNudgeForRole now accepts workDir and appends [session:xxx] when .runtime/session_id exists","created_by":"mayor"}
{"id":"gt-u56bb","title":"Digest: mol-deacon-patrol","description":"Patrol 20: Mayor OK, 11 sessions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:37:45.08665-08:00","updated_at":"2025-12-27T21:26:02.044341-08:00","deleted_at":"2025-12-27T21:26:02.044341-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-u65t8","title":"Merge: valkyrie-1767106008400","description":"branch: polecat/valkyrie-1767106008400\ntarget: main\nsource_issue: valkyrie-1767106008400\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T10:43:03.503911-08:00","updated_at":"2025-12-30T18:23:22.202888-08:00","closed_at":"2025-12-30T18:23:22.202888-08:00","close_reason":"Stale MR - cleanup","created_by":"gastown/polecats/valkyrie"}
{"id":"gt-u6nri","title":"Wire up created_by field for beads issues","description":"## Goal\nAdd actor/creator attribution to beads issues created by gastown.\n\n## Done\n- Added CreatedBy field to Issue struct\n- Added Actor field to CreateOptions\n- Pass --actor to bd create\n\n## Remaining\n1. Add ActorString() method to RoleInfo\n2. Update beads.Create() callers to pass Actor\n3. Update direct bd create exec calls to add --actor\n\n## Context\nBeads GH #748 added created_by field. Gastown needs to populate it.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:22:56.217959-08:00","updated_at":"2025-12-27T21:29:54.730971-08:00","deleted_at":"2025-12-27T21:29:54.730971-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-u818","title":"Witness Plugin System","description":"Patrol extensions via molecule bonding.\n\n## Core Insight\n\nPlugins ARE molecules. No separate format needed.\n\n## How It Works\n\n```\nmol-witness-patrol\n │\n ├── survey-workers\n │\n ├── plugin-run ←── bonds registered plugin molecules\n │ │\n │ ├── mol-mood-check (if registered)\n │ ├── mol-security-scan (if registered)\n │ └── mol-custom-X (user-installed)\n │\n └── aggregate (WaitsFor: all-children)\n```\n\n## Plugin Molecule Structure\n\nA plugin is just a molecule proto with specific labels:\n\n```json\n{\n \"id\": \"mol-mood-check\",\n \"title\": \"Polecat Mood Check\",\n \"description\": \"Assess polecat emotional state from output. Vars: {{polecat_name}}, {{captured_output}}\",\n \"labels\": [\"template\", \"plugin\", \"witness\", \"tier:haiku\"],\n \"issue_type\": \"task\"\n}\n```\n\nLabels encode metadata:\n- `plugin` - marks as bondable plugin\n- `witness` / `deacon` / `refinery` - which patrol can use it\n- `tier:haiku` / `tier:sonnet` - model tier hint\n\n## Registration\n\nPlugins registered in rig config or molecules.jsonl:\n\n```bash\n# Install from Mol Mall\nbd mol install mol-mood-check\n\n# Or add to ~/.beads/molecules.jsonl manually\n```\n\n## Execution\n\nThe `plugin-run` step in patrol:\n\n```bash\n# For each registered plugin molecule matching this patrol:\nbd mol bond mol-mood-check $PATROL_WISP \\\n --ref mood-{{polecat_name}} \\\n --var polecat_name=$POLECAT \\\n --var captured_output=\"$OUTPUT\"\n```\n\n## CLI\n\n- `bd mol list --label plugin` - List available plugins\n- `bd mol install \u003cid\u003e` - Install from Mol Mall\n- `gt patrol plugins \u003crole\u003e` - Show plugins for a patrol role\n\n## Benefits\n\n- No YAML schema to maintain\n- No separate plugin loading code\n- Plugins are versioned, synced, auditable (they are beads)\n- Distribution via Mol Mall (molecules.jsonl registry)\n- Same {{var}} substitution as all molecules","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-21T16:16:53.886931-08:00","updated_at":"2025-12-27T21:29:53.451435-08:00","deleted_at":"2025-12-27T21:29:53.451435-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-u82","title":"Design: Mayor session cycling and handoff","description":"Design for Mayor session cycling and structured handoff.\n\n## Overview\n\nMayor coordinates across all rigs and runs for extended periods. Needs session cycling pattern with structured handoff notes.\n\n## Key Elements\n\n1. Session cycling recognition (when to cycle)\n2. Handoff note format (structured state capture)\n3. Handoff delivery (mail to self)\n4. Fresh session startup (reading and resuming)\n\n## Subtasks (implementation)\n\n- gt-g2d: Mayor session cycling prompting\n- gt-sye: Mayor startup protocol prompting\n- gt-vci: Mayor handoff mail template\n- gt-1le: town handoff command (optional, P2)\n\n**Design complete.** Each subtask has full specification in its description.","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-15T20:03:16.125725-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"epic"}
{"id":"gt-u8df5","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 13: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:45:59.223504-08:00","updated_at":"2025-12-27T21:26:01.369774-08:00","deleted_at":"2025-12-27T21:26:01.369774-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-u8tvh","title":"Session ended: gt-gastown-slit","status":"closed","priority":2,"issue_type":"event","created_at":"2026-01-02T18:25:13.990869-08:00","updated_at":"2026-01-03T11:32:45.174832-08:00","closed_at":"2026-01-03T11:32:45.174832-08:00","close_reason":"Session lifecycle events processed","created_by":"gastown/polecats/slit"}
{"id":"gt-u8ybw","title":"Digest: mol-deacon-patrol","description":"Patrol 18: Nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:53:19.814904-08:00","updated_at":"2025-12-27T21:26:04.083552-08:00","deleted_at":"2025-12-27T21:26:04.083552-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ua34i","title":"Merge: slit-mjvtz18y","description":"branch: polecat/slit-mjvtz18y\ntarget: main\nsource_issue: slit-mjvtz18y\nrig: gastown\nagent_bead: gt-gastown-polecat-slit","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T11:25:08.070467-08:00","updated_at":"2026-01-01T14:15:16.691746-08:00","closed_at":"2026-01-01T14:15:16.691746-08:00","close_reason":"Already merged to main at 2bcfa763 - stale MR bead","created_by":"gastown/polecats/slit"}
{"id":"gt-ua5f","title":"Digest: mol-deacon-patrol","description":"Patrol: read mayor handoff (gt-mzal boot design), all agents up, furiosa working gt-oiv0, note: gt-4eim orphaned (angharad gone)","status":"tombstone","priority":4,"issue_type":"task","created_at":"2025-12-22T21:29:07.064593-08:00","updated_at":"2025-12-27T21:26:05.466931-08:00","deleted_at":"2025-12-27T21:26:05.466931-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-uadg","title":"Digest: mol-deacon-patrol","description":"Patrol complete: all agents healthy, no lifecycle requests, 9 in-progress items noted","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:20:11.959796-08:00","updated_at":"2025-12-27T21:26:04.400525-08:00","deleted_at":"2025-12-27T21:26:04.400525-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-uapyg","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 15: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:46:42.564849-08:00","updated_at":"2025-12-27T21:26:01.353369-08:00","deleted_at":"2025-12-27T21:26:01.353369-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ubd4","title":"Merge: gt-tnca.1","description":"branch: polecat/immortan\ntarget: main\nsource_issue: gt-tnca.1\nrig: gastown","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T00:18:22.782233-08:00","updated_at":"2025-12-27T21:27:22.861381-08:00","deleted_at":"2025-12-27T21:27:22.861381-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-uh03p","title":"Merge: capable-mjtltnm5","description":"branch: polecat/capable-mjtltnm5\ntarget: main\nsource_issue: capable-mjtltnm5\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T22:34:13.804202-08:00","updated_at":"2025-12-30T23:12:37.255588-08:00","closed_at":"2025-12-30T23:12:37.255588-08:00","close_reason":"Branch already merged","created_by":"gastown/polecats/capable"}
{"id":"gt-uhe4","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:20","description":"Patrol complete: inbox empty, all agents healthy (Mayor, 2 Witnesses, 2 Refineries), no polecats, no orphans","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:20:09.935353-08:00","updated_at":"2025-12-27T21:26:05.330941-08:00","deleted_at":"2025-12-27T21:26:05.330941-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-uhe8g","title":"Merge: rictus-mjw3nj1a","description":"branch: polecat/rictus-mjw3nj1a\ntarget: main\nsource_issue: rictus-mjw3nj1a\nrig: gastown\nagent_bead: gt-gastown-polecat-rictus","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T19:18:44.242008-08:00","updated_at":"2026-01-01T19:21:01.985817-08:00","closed_at":"2026-01-01T19:21:01.985817-08:00","close_reason":"Merged to main at b4ff6781","created_by":"gastown/polecats/rictus"}
{"id":"gt-ujwab","title":"Digest: mol-deacon-patrol","description":"Patrol 5: all clear, proto closure warning noted","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T17:18:32.52576-08:00","updated_at":"2025-12-27T21:26:01.19296-08:00","deleted_at":"2025-12-27T21:26:01.19296-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-unr9d","title":"Digest: mol-deacon-patrol","description":"P11: stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:12:47.306028-08:00","updated_at":"2025-12-27T21:26:02.261578-08:00","deleted_at":"2025-12-27T21:26:02.261578-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-unrd","title":"Fix gt prime to give crew workers crew context","description":"gt prime currently gives Mayor context to all agents. Crew workers should get crew-specific context. Also extract shared theory of operation from mayor priming into shared context for all roles.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-19T15:37:49.671015-08:00","updated_at":"2025-12-27T21:29:53.99994-08:00","deleted_at":"2025-12-27T21:29:53.99994-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-uohw","title":"gt doctor: detect tmux session anomalies (linked panes, etc.)","description":"Add a tmux health check to gt doctor that detects:\n\n1. **Linked panes between sessions** - The bug where gt-deacon and gt-mayor shared pane @283, causing heartbeat crosstalk\n2. **Session naming anomalies** - Sessions that don't match expected patterns\n3. **Orphaned panes** - Panes in gt-* sessions with no running process\n\nDetection approach:\n- List all gt-* sessions\n- For each session, get window/pane IDs\n- Check for duplicate pane IDs across different sessions\n- If found, report which sessions are linked and suggest fix\n\nAuto-fix: Could offer to kill and recreate the offending session.\n\nReference: gt-rt6g documented a case where this caused the daemon's Deacon heartbeats to appear in the Mayor's input, eating user prompts.","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-22T17:13:04.147197-08:00","updated_at":"2025-12-27T21:29:56.325665-08:00","deleted_at":"2025-12-27T21:29:56.325665-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-up9uw","title":"load-context","description":"Run gt prime and bd prime. Verify issue assignment.\nCheck inbox for any relevant messages.\n\nRead the assigned issue (gt-ds3h3) and understand the requirements.\nIdentify any blockers or missing information.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T01:59:05.781933-08:00","updated_at":"2025-12-25T14:12:42.092246-08:00","dependencies":[{"issue_id":"gt-up9uw","depends_on_id":"gt-kp3s3","type":"parent-child","created_at":"2025-12-25T01:59:05.783457-08:00","created_by":"stevey"}],"deleted_at":"2025-12-25T14:12:42.092246-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-upom","title":"Witness patrol: cleanup idle orphan polecats","description":"Add patrol step to find and cleanup polecats that are idle with no assigned issue. These orphans occur when polecats crash before sending DONE or Witness misses the message. Patrol should verify git is clean before removing worktree. Part of gt-rana.","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-21T23:09:41.756753-08:00","updated_at":"2025-12-27T21:29:56.420178-08:00","deleted_at":"2025-12-27T21:29:56.420178-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-upxny","title":"Digest: mol-deacon-patrol","description":"P18: stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:00:22.888043-08:00","updated_at":"2025-12-27T21:26:02.351788-08:00","deleted_at":"2025-12-27T21:26:02.351788-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-uq1d0","title":"Digest: mol-deacon-patrol","description":"Patrol 5: routine, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T13:42:53.668752-08:00","updated_at":"2025-12-27T21:26:03.246873-08:00","deleted_at":"2025-12-27T21:26:03.246873-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-uqv0","title":"Digest: mol-deacon-patrol","description":"Patrol 18","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:11:08.007745-08:00","updated_at":"2025-12-27T21:26:04.416801-08:00","deleted_at":"2025-12-27T21:26:04.416801-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-uru8z","title":"Handle callbacks from agents","description":"Handle callbacks from agents.\n\nCheck the Mayor's inbox for messages from:\n- Witnesses reporting polecat status\n- Refineries reporting merge results\n- Polecats requesting help or escalation\n- External triggers (webhooks, timers)\n\n```bash\ngt mail inbox\n# For each message:\ngt mail read \u003cid\u003e\n# Handle based on message type\n```\n\nCallbacks may spawn new polecats, update issue state, or trigger other actions.\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T02:11:33.775826-08:00","updated_at":"2025-12-27T21:29:55.29194-08:00","deleted_at":"2025-12-27T21:29:55.29194-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-usy0","title":"Merge: gt-3x0z.3","description":"branch: polecat/rictus\ntarget: main\nsource_issue: gt-3x0z.3\nrig: gastown","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T16:03:43.535266-08:00","updated_at":"2025-12-27T21:27:22.895324-08:00","deleted_at":"2025-12-27T21:27:22.895324-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-utwc","title":"Self-mail should suppress tmux notification","description":"When sending mail to yourself (e.g., mayor sending to mayor/), the tmux notification shouldn't fire.\n\n**Rationale:**\n- Self-mail is intended for future-you (next session handoff)\n- Present-you just sent it, so you already know about it\n- The notification is redundant/confusing in this case\n\n**Fix:**\nSuppress tmux notification when sender == recipient address.","status":"tombstone","priority":3,"issue_type":"bug","created_at":"2025-12-22T17:55:39.573705-08:00","updated_at":"2025-12-27T21:29:57.547504-08:00","deleted_at":"2025-12-27T21:29:57.547504-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-utxy0","title":"Digest: mol-deacon-patrol","description":"Patrol 11: Mayor OK, 11 sessions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:36:58.501065-08:00","updated_at":"2025-12-27T21:26:02.122711-08:00","deleted_at":"2025-12-27T21:26:02.122711-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-uvpo6","title":"Digest: mol-deacon-patrol","description":"Patrol 5: All agents healthy, routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T01:16:30.625876-08:00","updated_at":"2025-12-27T21:26:03.821293-08:00","deleted_at":"2025-12-27T21:26:03.821293-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ux0f","title":"Digest: mol-deacon-patrol","description":"Patrol #16: Stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:34:42.184561-08:00","updated_at":"2025-12-27T21:26:04.268599-08:00","deleted_at":"2025-12-27T21:26:04.268599-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-uxu6o","title":"Merge: rictus-mjw3nj1a","description":"branch: polecat/rictus-mjw3nj1a\ntarget: main\nsource_issue: rictus-mjw3nj1a\nrig: gastown\nagent_bead: gt-gastown-polecat-rictus","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T16:04:07.596484-08:00","updated_at":"2026-01-01T19:55:59.981392-08:00","closed_at":"2026-01-01T19:55:59.981392-08:00","close_reason":"Stale MR - branch no longer exists","created_by":"gastown/polecats/rictus"}
{"id":"gt-uym5","title":"Implement gt mol status command","description":"Show what's on an agent's hook.\n\n```bash\ngt mol status [target]\n```\n\nOutput:\n- What's slung (molecule name, associated issue)\n- Current phase and progress\n- Whether it's a wisp\n- Next action hint\n\nIf no target, shows current agent's status.\n\nAcceptance:\n- [ ] Read pinned bead attachment\n- [ ] Display molecule/issue info\n- [ ] Show phase progress\n- [ ] Indicate wisp vs durable","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-22T03:17:34.679963-08:00","updated_at":"2025-12-27T21:29:53.260399-08:00","deleted_at":"2025-12-27T21:29:53.260399-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-uzf2l","title":"Mol Mall: Formula marketplace using GitHub as backend","description":"Create a marketplace for sharing molecule formulas using GitHub repos as the hosting backend.\n\n## Architecture\n\nFormulas ──cook──→ [ephemeral proto] ──pour/wisp──→ Mol/Wisp\n ↑ │\n └────────────────── distill ─────────────────────────┘\n\n- Formulas: JSON source files (.formula.json) - the thing you share\n- Protos: Transient compilation artifacts - auto-deleted after use\n- Mols/Wisps: Execution instances - not shared directly\n\n## Key operations\n- bd distill \u003cmol-id\u003e → Extract formula from completed work\n- bd mol publish \u003cformula\u003e → Share to GitHub\n- bd mol install \u003curl\u003e → Fetch from GitHub\n- bd pour \u003cformula\u003e → Cook and spawn (proto is ephemeral)\n\n## Migrated from beads rig (bd-1dez)\nCompleted: distill, formula add, versioning, installation tracking\nRemaining: install, update, search, publish","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-25T21:52:40.916214-08:00","updated_at":"2025-12-27T21:29:54.910967-08:00","deleted_at":"2025-12-27T21:29:54.910967-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-uzf2l.1","title":"bd mol install: Install formula from GitHub","description":"Fetch and install a formula from a GitHub repository.\n\nUsage: bd mol install github.com/org/repo[@version]\n\nShould:\n- Clone/fetch the formula.json from the repo\n- Validate the formula structure\n- Store in local catalog\n- Track installation in .beads/installed.json","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T21:53:00.909549-08:00","updated_at":"2025-12-27T21:29:54.902724-08:00","dependencies":[{"issue_id":"gt-uzf2l.1","depends_on_id":"gt-uzf2l","type":"parent-child","created_at":"2025-12-25T21:53:00.909973-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.902724-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-uzf2l.2","title":"bd mol update: Check and update installed formulas","description":"Check for and apply updates to installed formulas.\n\nUsage: bd mol update [formula-name]\n\nShould:\n- Check GitHub for newer versions/commits\n- Show available updates\n- Apply updates when requested\n- Track update history","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T21:53:01.20389-08:00","updated_at":"2025-12-27T21:29:54.894401-08:00","dependencies":[{"issue_id":"gt-uzf2l.2","depends_on_id":"gt-uzf2l","type":"parent-child","created_at":"2025-12-25T21:53:01.204329-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.894401-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-uzf2l.3","title":"bd mol search: Find formulas using GitHub API","description":"Search for formulas across GitHub.\n\nUsage: bd mol search \u003cquery\u003e\n\nShould:\n- Use GitHub API to search repos with formula.json\n- Filter by topics (e.g., beads-formula)\n- Display results with descriptions and stars\n- Support pagination","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-25T21:53:01.50489-08:00","updated_at":"2025-12-27T21:29:57.330018-08:00","dependencies":[{"issue_id":"gt-uzf2l.3","depends_on_id":"gt-uzf2l","type":"parent-child","created_at":"2025-12-25T21:53:01.505329-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.330018-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-uzf2l.4","title":"bd mol publish: Push formula to GitHub repo","description":"Publish a local formula to a GitHub repository.\n\nUsage: bd mol publish \u003cformula-name\u003e \u003cgithub-repo\u003e\n\nShould:\n- Create or update GitHub repo with formula\n- Add README.md from formula metadata\n- Tag with version\n- Set appropriate topics for discoverability","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-25T21:53:01.817145-08:00","updated_at":"2025-12-27T21:29:57.321834-08:00","dependencies":[{"issue_id":"gt-uzf2l.4","depends_on_id":"gt-uzf2l","type":"parent-child","created_at":"2025-12-25T21:53:01.817569-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.321834-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-v0xqo","title":"Digest: mol-deacon-patrol","description":"Patrol 2: All clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T15:32:26.623344-08:00","updated_at":"2025-12-27T21:26:03.116251-08:00","deleted_at":"2025-12-27T21:26:03.116251-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-v1pcg","title":"ApplyBranches/ApplyGates mutate steps in place","description":"ApplyBranches and ApplyGates claim to return a modified steps slice, but they actually mutate the original steps via the pointer map from buildStepMap().\n\n```go\n// ApplyBranches docstring says:\n// Returns the modified steps slice (steps are modified in place for dependencies).\n```\n\nThis is technically documented but violates the pattern established by ApplyLoops and ApplyAdvice which return new step slices.\n\nOptions:\n1. Document explicitly that these functions mutate in place (current behavior)\n2. Clone steps before modification for true immutability\n3. Accept this as intentional since it's an optimization\n\nThe current behavior is safe because cook.go doesn't reuse the input slice, but could cause subtle bugs if callers expect immutability.\n\nRecommend option 1 (document) or 2 (clone) for consistency.","status":"tombstone","priority":3,"issue_type":"bug","created_at":"2025-12-25T15:14:01.921527-08:00","updated_at":"2025-12-27T21:29:57.338244-08:00","dependencies":[{"issue_id":"gt-v1pcg","depends_on_id":"gt-8tmz.4","type":"blocks","created_at":"2025-12-25T15:14:18.94711-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.338244-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-v30g","title":"Digest: mol-deacon-patrol","description":"Patrol 6: Routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:36:38.029543-08:00","updated_at":"2025-12-27T21:26:04.634615-08:00","deleted_at":"2025-12-27T21:26:04.634615-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-v3bjf","title":"Digest: mol-deacon-patrol","description":"Patrol 14: Quiet","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-25T07:28:25.96187-08:00","updated_at":"2025-12-25T07:28:25.96187-08:00","closed_at":"2025-12-25T07:28:25.961842-08:00"}
{"id":"gt-v5hv","title":"Work on ga-y6b: Implement Refinery as Claude agent. Conve...","description":"Work on ga-y6b: Implement Refinery as Claude agent. Convert from shell to Claude agent that processes MRs in merge queue, runs tests, merges to integration branch. When done, submit MR (not PR) to integration branch for Refinery.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-19T22:58:17.576892-08:00","updated_at":"2025-12-27T21:29:56.889461-08:00","deleted_at":"2025-12-27T21:29:56.889461-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-v5k","title":"Design: Failure modes and recovery","description":"Document failure modes and recovery strategies for Gas Town operations.\n\n## Critical Failure Modes\n\n### 1. Agent Crash Mid-Operation\n\n**Scenario**: Polecat crashes while committing, Witness crashes while verifying\n\n**Detection**:\n- Session suddenly gone (tmux check fails)\n- State shows 'working' but no session\n- Heartbeat stops (for Witness)\n\n**Recovery**:\n- Doctor detects via ZombieSessionCheck\n- Capture any recoverable state\n- Reset agent state to 'idle'\n- For Witness: auto-restart via supervisor or manual gt witness start\n\n### 2. Git State Corruption\n\n**Scenario**: Merge conflict, failed rebase, detached HEAD\n\n**Detection**:\n- Git commands fail\n- Dirty state that won't commit\n- Branch diverged from origin\n\n**Recovery**:\n- gt doctor reports git health issues\n- Manual intervention recommended\n- Severe cases: remove clone, re-clone\n\n### 3. Beads Sync Conflict\n\n**Scenario**: Two polecats modify same issue\n\n**Detection**:\n- bd sync fails with conflict\n- Beads tombstone mechanism handles most cases\n\n**Recovery**:\n- Beads has last-write-wins semantics\n- bd sync --force in extreme cases\n- Issues may need manual dedup\n\n### 4. Tmux Failure\n\n**Scenario**: Tmux server crashes, socket issues\n\n**Detection**:\n- All sessions inaccessible\n- \"no server running\" errors\n\n**Recovery**:\n- Kill any orphan processes\n- tmux kill-server \u0026\u0026 tmux start-server\n- All agent states reset to idle\n- Re-spawn active work\n\n### 5. Claude API Issues\n\n**Scenario**: Rate limits, outages, context limits\n\n**Detection**:\n- Sessions hang or produce errors\n- Repeated failure patterns\n\n**Recovery**:\n- Exponential backoff (handled by Claude Code)\n- For context limits: session cycling (mail-to-self)\n- For outages: wait and retry\n\n### 6. Disk Full\n\n**Scenario**: Clones, logs, or beads fill disk\n\n**Detection**:\n- Write operations fail\n- git/bd commands error\n\n**Recovery**:\n- Clean up logs: rm ~/.gastown/logs/*\n- Remove old polecat clones\n- gt doctor --fix can clean some cruft\n\n### 7. Network Failure\n\n**Scenario**: Can't reach GitHub, API servers\n\n**Detection**:\n- git fetch/push fails\n- Claude sessions hang\n\n**Recovery**:\n- Work continues locally\n- Queue pushes for later\n- Sync when connectivity restored\n\n## Recovery Principles\n\n1. **Fail safe**: Prefer stopping over corrupting\n2. **State is recoverable**: Git and beads have recovery mechanisms\n3. **Doctor heals**: gt doctor --fix handles common issues\n4. **Emergency stop**: gt stop --all as last resort\n5. **Human escalation**: Some failures need Overseer intervention\n\n## Implementation\n\n- Document each failure mode in architecture.md\n- Ensure doctor checks cover detection\n- Add recovery hints to error messages\n- Log all failures for debugging","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T23:19:07.198289-08:00","updated_at":"2025-12-27T21:29:54.504304-08:00","deleted_at":"2025-12-27T21:29:54.504304-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-v6nuz","title":"Digest: mol-deacon-patrol","description":"Patrol 15","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:56:31.914216-08:00","updated_at":"2025-12-27T21:26:01.50263-08:00","deleted_at":"2025-12-27T21:26:01.50263-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-v7wq4","title":"Check Witness and Refinery health","description":"Check Witness and Refinery health for each rig.\n\n**ZFC Principle**: You (Claude) make the judgment call about what is \"stuck\" or\n\"unresponsive\" - there are no hardcoded thresholds in Go. Read the signals,\nconsider context, and decide.\n\nFor each rig, run:\n```bash\ngt witness status \u003crig\u003e\ngt refinery status \u003crig\u003e\n```\n\n**Signals to assess:**\n\n| Component | Healthy Signals | Concerning Signals |\n|-----------|-----------------|-------------------|\n| Witness | State: running, recent activity | State: not running, no heartbeat |\n| Refinery | State: running, queue processing | Queue stuck, merge failures |\n\n**Tracking unresponsive cycles:**\n\nMaintain in your patrol state (persisted across cycles):\n```\nhealth_state:\n \u003crig\u003e:\n witness:\n unresponsive_cycles: 0\n last_seen_healthy: \u003ctimestamp\u003e\n refinery:\n unresponsive_cycles: 0\n last_seen_healthy: \u003ctimestamp\u003e\n```\n\n**Decision matrix** (you decide the thresholds based on context):\n\n| Cycles Unresponsive | Suggested Action |\n|---------------------|------------------|\n| 1-2 | Note it, check again next cycle |\n| 3-4 | Attempt restart: gt witness restart \u003crig\u003e |\n| 5+ | Escalate to Mayor with context |\n\n**Restart commands:**\n```bash\ngt witness restart \u003crig\u003e\ngt refinery restart \u003crig\u003e\n```\n\n**Escalation:**\n```bash\ngt mail send mayor/ -s \"Health: \u003crig\u003e \u003ccomponent\u003e unresponsive\" \\\n -m \"Component has been unresponsive for N cycles. Restart attempts failed.\n Last healthy: \u003ctimestamp\u003e\n Error signals: \u003cdetails\u003e\"\n```\n\nReset unresponsive_cycles to 0 when component responds normally.\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T02:11:33.77547-08:00","updated_at":"2025-12-27T21:29:55.300382-08:00","dependencies":[{"issue_id":"gt-v7wq4","depends_on_id":"gt-0skyg","type":"blocks","created_at":"2025-12-25T02:11:33.819955-08:00","created_by":"stevey"}],"deleted_at":"2025-12-27T21:29:55.300382-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-v8im","title":"Digest: mol-deacon-patrol","description":"Patrol 16: Routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:38:25.132341-08:00","updated_at":"2025-12-27T21:26:04.549007-08:00","deleted_at":"2025-12-27T21:26:04.549007-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-vaqk","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:22","description":"Patrol 5: all healthy, 2 crews (dave, max) active","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:22:34.309422-08:00","updated_at":"2025-12-27T21:26:05.305715-08:00","deleted_at":"2025-12-27T21:26:05.305715-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-vc1n","title":"Tmux status line: rig color themes and worker identity display","description":"## Summary\n\nCustomize tmux status line for Gas Town workers with:\n1. Per-rig configurable color themes\n2. Clear worker name and role visibility\n\n## Current Problem\n\n- Only mayor shows in status line (and truncated)\n- Can't tell which rig/worker you're looking at\n- All sessions look the same\n\n## Proposed Design\n\n### Per-rig colors\n```yaml\n# In rig config or beads\ntheme:\n primary: '#ff6600' # Orange for gastown\n secondary: '#333333'\n accent: '#ffcc00'\n```\n\n### Status line format\n```\n[gastown/Rictus] polecat | gt-70b3 | branch: polecat/Rictus\n[beads/emma] crew | working | branch: main \n[mayor] coordinator | idle\n```\n\n### Components\n- Rig name with rig color\n- Worker name\n- Role (polecat/crew/mayor/witness/refinery)\n- Current issue or status\n- Branch name\n\n## Configuration\n\nCould use pinned beads for this (see gm-w13, beads-6v2):\n- `bd show \u003crig\u003e-theme` returns theme config\n- Stored as pinned bead, always available\n- Part of 'config in beads data plane' initiative","notes":"Implementation complete. Core features: per-rig color themes, dynamic status line with issue/mail indicators, gt theme/issue commands. Ready for testing.","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-18T21:58:58.547188-08:00","updated_at":"2025-12-27T21:29:57.051702-08:00","deleted_at":"2025-12-27T21:29:57.051702-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-vc3l4","title":"No-tmux mode: naked Claude Code operation without daemon/tmux","description":"## Problem\n\nTmux is crashing workers. The daemon relies on tmux for:\n- Session management (creating/killing panes)\n- Nudging agents via SendKeys\n- Crash detection via pane-died hooks\n\nWhen tmux is unstable, the entire Gas Town operation fails.\n\n## Goal\n\nEnable Gas Town to operate without tmux or the daemon, relying solely on:\n- Naked Claude Code sessions (terminal or IDE)\n- Deacon self-handoff and patrol robustness\n- Pinned beads for propulsion (already implemented)\n- **Beads as universal data plane** - args, context, everything flows through beads\n\n## Key Insight: Beads Replace SendKeys\n\nCurrently --args is injected via tmux SendKeys. In no-tmux mode:\n- Store args in the pinned bead description (attached_args field)\n- gt prime reads and displays args from pinned bead\n- Wisps are infinite ephemeral sticky notes - use them for transient context\n- No prompt injection needed - agent discovers everything via bd show\n\n## Degraded Behavior (No-Tmux Mode)\n\n**What still works:**\n- Propulsion via pinned beads (agents pick up work on startup)\n- Self-handoff (agents can cycle themselves)\n- Patrol loops (Deacon, Witness, Refinery keep running)\n- Mail system (beads-based, no tmux needed)\n- Args passed via bead description\n\n**What is degraded:**\n- **No interrupts**: Cannot nudge busy agents mid-task\n- **Polling only**: Agents must actively check inbox (no push notifications)\n- **Await steps block**: \"Wait for human\" steps require manual agent restart\n- **No crash detection**: pane-died hooks unavailable\n- **Manual startup**: Human must start each agent in separate terminal\n\n**Workflow implications:**\n- Patrol agents work fine (they poll as part of their loop)\n- Task workers need restart to pick up new work\n- Cannot redirect a busy worker to urgent task\n- Human must monitor and restart crashed agents\n\n## Requirements\n\n1. **gt sling --args stores in bead**: Write args to pinned bead description\n2. **gt prime displays args**: Read attached_args from pinned bead\n3. **gt spawn --naked**: Assign work without creating tmux session\n4. **Documentation**: Explain no-tmux mode and degraded behavior\n5. **Deacon self-sustaining**: Must cycle reliably without daemon\n\n## Acceptance Criteria\n\n- Can assign work with args without tmux\n- Agent discovers args via gt prime / bd show on startup\n- Deacon patrol runs indefinitely via self-handoff\n- Workers started manually pick up pinned work\n- Documentation explains what works vs degraded in no-tmux mode","status":"tombstone","priority":0,"issue_type":"feature","created_at":"2025-12-26T17:02:14.865284-08:00","updated_at":"2025-12-27T21:29:45.251294-08:00","deleted_at":"2025-12-27T21:29:45.251294-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-vc830","title":"Digest: mol-deacon-patrol","description":"Patrol 6: clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:13:41.924077-08:00","updated_at":"2025-12-27T21:26:01.058151-08:00","deleted_at":"2025-12-27T21:26:01.058151-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-vci","title":"Mayor handoff mail template","description":"Add MAYOR_HANDOFF mail template to templates.py.\n\n## Template Function\n\ndef mayor_handoff(\n active_swarms: List[SwarmStatus],\n rig_status: Dict[str, RigStatus],\n pending_escalations: List[Escalation],\n in_flight_decisions: List[Decision],\n recent_actions: List[str],\n delegated_work: List[DelegatedItem],\n user_requests: List[str],\n next_steps: List[str],\n warnings: Optional[str] = None,\n session_duration: Optional[str] = None,\n) -\u003e Message:\n metadata = {\n 'template': 'MAYOR_HANDOFF',\n 'timestamp': datetime.utcnow().isoformat(),\n 'session_duration': session_duration,\n 'active_swarm_count': len(active_swarms),\n 'pending_escalation_count': len(pending_escalations),\n }\n # ... format sections ...\n return Message.create(\n sender='mayor/',\n recipient='mayor/',\n subject='Session Handoff',\n body=body,\n priority='high',\n )\n\n## Metadata Fields\n\n- template: MAYOR_HANDOFF\n- timestamp: ISO format\n- session_duration: Human readable\n- active_swarm_count: Number of active swarms\n- pending_escalation_count: Number of escalations\n\n## Mail Priority\n\nUse priority='high' to ensure handoff is seen on startup.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T20:15:30.26323-08:00","updated_at":"2025-12-27T21:29:54.538148-08:00","dependencies":[{"issue_id":"gt-vci","depends_on_id":"gt-u82","type":"blocks","created_at":"2025-12-15T20:15:39.554108-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.538148-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-vdp0","title":"Crew workers getting wrong CLAUDE.md (shows Refinery)","description":"## Problem\n\nCrew workers (emma, dave) have CLAUDE.md that says they're the Refinery.\n\n## Evidence\n\n```\n$ head -5 /Users/stevey/gt/beads/crew/emma/CLAUDE.md\n# Claude: Beads Refinery\nYou are the **Refinery** for the **beads** rig...\n```\n\n## Expected\n\nShould use crew.md.tmpl which correctly says:\n```\n# Crew Worker Context\nYou are a **crew worker** - the overseer's (human's) personal workspace...\n```\n\n## Impact\n\n- Crew workers have wrong identity in static context\n- `gt prime` correctly outputs crew context, but CLAUDE.md conflicts\n- Confusing role information\n\n## Fix\n\nCheck `gt crew create` or whatever populates CLAUDE.md for crew workers.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-18T21:40:56.518032-08:00","updated_at":"2025-12-27T21:29:54.100639-08:00","dependencies":[{"issue_id":"gt-vdp0","depends_on_id":"gt-l4gm","type":"blocks","created_at":"2025-12-18T21:50:04.955247-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.100639-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-veez1","title":"Merge: rictus-mjtlq9xg","description":"branch: polecat/rictus-mjtlq9xg\ntarget: main\nsource_issue: rictus-mjtlq9xg\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T22:14:01.912377-08:00","updated_at":"2025-12-30T23:12:42.99434-08:00","closed_at":"2025-12-30T23:12:42.99434-08:00","close_reason":"Branch already merged","created_by":"gastown/polecats/rictus"}
{"id":"gt-vg6u","title":"TIDY UP: Your previous work (gt-odvf: bd mol CLI docs) wa...","description":"TIDY UP: Your previous work (gt-odvf: bd mol CLI docs) was already merged to main. Check your git status is clean, sync beads, and if nothing to do, just run 'gt done'.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-21T17:26:20.360645-08:00","updated_at":"2025-12-27T21:29:56.520829-08:00","deleted_at":"2025-12-27T21:29:56.520829-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-vh61f","title":"Merge: furiosa-mjtj9d4g","description":"branch: polecat/furiosa-mjtj9d4g\ntarget: main\nsource_issue: furiosa-mjtj9d4g\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T22:39:23.311696-08:00","updated_at":"2025-12-30T23:12:37.075995-08:00","closed_at":"2025-12-30T23:12:37.075995-08:00","close_reason":"Branch already merged","created_by":"gastown/polecats/furiosa"}
{"id":"gt-vhn1.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. \"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-vhn1\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T22:27:56.71661-08:00","updated_at":"2025-12-27T21:29:55.679222-08:00","deleted_at":"2025-12-27T21:29:55.679222-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-vizdt","title":"Digest: mol-deacon-patrol","description":"Patrol 19: Quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:20:10.919259-08:00","updated_at":"2025-12-27T21:26:02.650607-08:00","deleted_at":"2025-12-27T21:26:02.650607-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-vj3rb","title":"Digest: mol-deacon-patrol","description":"Patrol 3: stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:54:43.678006-08:00","updated_at":"2025-12-27T21:26:02.433771-08:00","deleted_at":"2025-12-27T21:26:02.433771-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-vjv","title":"Add bulk session stop command (gt session stop --all)","description":"When decommissioning a rig, need to stop multiple sessions one at a time. A --all or --rig flag would allow: gt session stop --rig gastown","status":"tombstone","priority":3,"issue_type":"feature","created_at":"2025-12-18T11:33:33.394649-08:00","updated_at":"2025-12-27T21:29:57.622515-08:00","deleted_at":"2025-12-27T21:29:57.622515-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-vjw","title":"Swarm learning: Session cleanup missing from swarm workflow","description":"## Problem\n\nAfter Enders Game swarm completed (18 issues merged), 16 polecat sessions were left running but idle. No automated cleanup occurred.\n\n## What Should Happen\n\n1. Witness detects polecat completed work (idle at prompt)\n2. Witness verifies git state is clean\n3. Witness shuts down session\n4. Witness reports completion to Mayor\n\n## GGT Components\n\n- gt-cxx: Witness context cycling (covers self-cycling)\n- gt-u1j.9: Witness daemon heartbeat loop\n- gt-kmn.6: Witness swarm landing protocol\n\n## Recommendation\n\nAdd to Witness responsibilities:\n- Monitor for 'work complete' signals (DONE keyword, idle detection)\n- Automated session shutdown after verification\n- Swarm completion reporting to Mayor\n\nSee also: architecture.md 'Worker Cleanup (Witness-Owned)' section which describes this but it wasn't implemented in PGT.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-16T01:27:52.796587-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-vlpvi","title":"Digest: mol-deacon-patrol","description":"Patrol 15: All agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T03:56:28.353902-08:00","updated_at":"2025-12-27T21:26:03.737869-08:00","deleted_at":"2025-12-27T21:26:03.737869-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-vmk7","title":"Guardrail: Verify commits exist before closing polecat issues","description":"## Updated Approach (ZFC)\n\nThe original proposal was for mechanical guardrails in bd close. This contradicts the ZFC principle: all decisions go to models, not code.\n\n## Correct Solution\n\nThe verification should happen in **mol-polecat-arm** execute step, not bd close:\n\nIn the pre-kill-verify action:\n```bash\n# Current steps\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n\n# ADD: Verify productive work\ngit log --oneline --grep='\u003cissue-id\u003e' | head -1\n# If no commits AND issue being closed as 'done' → flag for review\n```\n\nThe agent (Witness) makes the decision. The mol gives it the verification step.\n\n## Why Not Code Guardrails\n\nPolecats have legitimate reasons to close issues without commits:\n- Already done (someone else fixed it)\n- Deferred (out of scope)\n- Escalated (needs human decision)\n- Duplicate (merged with another issue)\n\nA code guardrail would block all these. The mol step lets the agent verify AND make the judgment call.\n\n## Implementation\n\nUpdate mol-polecat-arm execute step to include commit verification for 'done' closures.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T14:35:24.695717-08:00","updated_at":"2025-12-27T21:29:52.973768-08:00","deleted_at":"2025-12-27T21:29:52.973768-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-vmm9","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:26","description":"Patrol 13: quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:26:56.830842-08:00","updated_at":"2025-12-27T21:26:05.2473-08:00","deleted_at":"2025-12-27T21:26:05.2473-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-vmpo","title":"orphan-check","description":"Find abandoned work. Check for in_progress issues with no active agent.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T14:27:33.988644-08:00","updated_at":"2025-12-25T15:52:43.900154-08:00","deleted_at":"2025-12-25T15:52:43.900154-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-vnp9","title":"tmux notifications: display-message too subtle, use send-keys instead","description":"## Problem\n\n`tmux display-message` notifications are not visible enough - they appear briefly in the status bar and are easy to miss.\n\n## Current Behavior\n\nrouter.go uses:\n```go\nr.tmux.DisplayMessageDefault(sessionID, notification)\n```\n\n## What Works\n\nSending echo commands directly to the terminal:\n```bash\ntmux send-keys -t \u003csession\u003e \"echo '📬 NEW MAIL from mayor'\" Enter\n```\n\n## Proposed Fix\n\nChange notification method to send visible output to the terminal, perhaps with a box/banner:\n```\n━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\n📬 NEW MAIL from mayor\nSubject: \u003csubject\u003e\nRun: bd mail inbox\n━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\n```\n\n## Considerations\n\n- This interrupts the terminal output (acceptable for important mail)\n- Could check if Claude is mid-response and queue notification\n- Or use tmux popup if available","notes":"Additional issues:\n1. Enter key not sent properly when chained with send-keys\n2. Need to debounce and send Enter separately\n3. Correct pattern: send text, sleep briefly, then send Enter","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-18T21:35:28.542985-08:00","updated_at":"2025-12-27T21:29:57.085269-08:00","deleted_at":"2025-12-27T21:29:57.085269-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-vnryt","title":"Merge: rictus-mjw3nj1a","description":"branch: polecat/rictus-mjw3nj1a\ntarget: main\nsource_issue: rictus-mjw3nj1a\nrig: gastown\nagent_bead: gt-gastown-polecat-rictus","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T19:29:45.71429-08:00","updated_at":"2026-01-01T19:31:49.46659-08:00","closed_at":"2026-01-01T19:31:49.46659-08:00","close_reason":"Merged to main at 937ee2c8","created_by":"gastown/polecats/rictus"}
{"id":"gt-vo4u9","title":"Session ended: gt-gastown-crew-george","status":"closed","priority":2,"issue_type":"event","created_at":"2026-01-03T11:48:15.648798-08:00","updated_at":"2026-01-04T16:41:37.833516-08:00","closed_at":"2026-01-04T16:41:37.833516-08:00","close_reason":"Archived","created_by":"gastown/crew/george"}
{"id":"gt-vqhc","title":"gt sling/handoff fails: slashes in agent identity create invalid hook paths","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-24T23:24:49.495017-08:00","updated_at":"2025-12-27T21:29:52.556135-08:00","deleted_at":"2025-12-27T21:29:52.556135-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-vqpmf","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 14: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:46:20.527809-08:00","updated_at":"2025-12-27T21:26:01.36161-08:00","deleted_at":"2025-12-27T21:26:01.36161-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-vuld8","title":"Digest: mol-deacon-patrol","description":"Patrol 20: All healthy, 2 crew active","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T17:49:39.892429-08:00","updated_at":"2025-12-27T21:26:01.10764-08:00","deleted_at":"2025-12-27T21:26:01.10764-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-vv4i","title":"Polecat template: move session close checklist into molecule steps","description":"Template has prose checklists for 'Before Signaling Done' and 'SESSION CLOSE PROTOCOL'. These should be encoded as tail steps in the polecat molecule, not repeated as prose in CLAUDE.md. Reduces duplication and ensures the steps are actually followed.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T16:56:50.666492-08:00","updated_at":"2025-12-27T21:29:55.908628-08:00","dependencies":[{"issue_id":"gt-vv4i","depends_on_id":"gt-t9u7","type":"parent-child","created_at":"2025-12-23T16:57:16.612852-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.908628-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-vve6k","title":"Merge: dag-mjxpcv5v","description":"branch: polecat/dag-mjxpcv5v\ntarget: main\nsource_issue: dag-mjxpcv5v\nrig: gastown\nagent_bead: gt-gastown-polecat-dag","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-02T18:52:32.804763-08:00","updated_at":"2026-01-02T18:55:06.87786-08:00","closed_at":"2026-01-02T18:55:06.87786-08:00","close_reason":"Merged to main at 92106afd","created_by":"gastown/polecats/dag"}
{"id":"gt-vx2qv","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 9: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-28T11:22:07.233471-08:00","updated_at":"2025-12-28T11:22:07.233471-08:00","closed_at":"2025-12-28T11:22:07.233434-08:00"}
{"id":"gt-vz151","title":"Add crew auto-start config to gt start","description":"gt start needs a config file to specify:\n- Which crew members auto-start per rig\n- Which rigs to auto-start (instead of requiring --all or --rigs flags)\n\nRequested defaults:\n- beads rig: dave\n- gastown rig: max, joe\n\nThis would allow 'gt start' to bring up the configured crew without manual flags.","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-26T19:41:47.241478-08:00","updated_at":"2025-12-27T21:29:54.722339-08:00","deleted_at":"2025-12-27T21:29:54.722339-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-vz2xs","title":"Digest: mol-deacon-patrol","description":"Patrol 10: Halfway check, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:33:04.470774-08:00","updated_at":"2025-12-27T21:26:03.894642-08:00","deleted_at":"2025-12-27T21:26:03.894642-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-vzic","title":"README missing prerequisites section (tmux required)","description":"The README.md does not mention that tmux is required.\nAll agent sessions use tmux, but fresh users have no way to know this.\n\nAdd prerequisites section:\n- Go 1.23+\n- Git\n- tmux (required for agent sessions)\n- Claude Code CLI (for agents)","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-24T12:50:28.988771-08:00","updated_at":"2025-12-27T21:29:52.671896-08:00","dependencies":[{"issue_id":"gt-vzic","depends_on_id":"gt-jo9n","type":"blocks","created_at":"2025-12-24T12:52:04.860313-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.671896-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-w0e0","title":"Merge: gt-h1n5","description":"branch: polecat/rictus\ntarget: main\nsource_issue: gt-h1n5\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T19:37:59.964737-08:00","updated_at":"2025-12-27T21:27:22.410168-08:00","deleted_at":"2025-12-27T21:27:22.410168-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-w3bu","title":"gt spawn: Enter key needs debounce delay after paste","description":"## Problem\n\nWhen spawning polecats via `gt spawn`, instructions are pasted into tmux but workers often sit idle at prompt because the Enter key arrives before the paste is fully processed.\n\n## Observed Behavior\n\n- Instructions pasted via tmux send-keys\n- Enter key sent immediately after\n- Worker session shows prompt with no input (paste not submitted)\n- Requires manual intervention to nudge workers\n\n## Expected Behavior\n\n- Paste completes fully\n- Enter key submits the pasted instructions\n- Worker begins executing immediately\n\n## Root Cause\n\nLikely a race condition between paste buffer processing and keypress handling. Need either:\n1. Debounce delay before sending Enter\n2. Longer delay (Tmax) to ensure paste completes\n3. Alternative submission mechanism\n\n## Impact\n\n- Swarm of 10 workers had multiple stalls at startup\n- Required manual tmux send-keys to unstick them\n- Defeats purpose of automated spawning\n\n## References\n\n- Observed during Mad Max swarm (Dec 18, 2025)\n- Affects gt spawn command in internal/cmd/spawn.go","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-18T21:06:50.734123-08:00","updated_at":"2025-12-27T21:29:54.126014-08:00","deleted_at":"2025-12-27T21:29:54.126014-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-w4v1o","title":"Merge: furiosa-dogs","description":"branch: polecat/furiosa-dogs\ntarget: main\nsource_issue: furiosa-dogs\nrig: gastown","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2025-12-30T11:01:55.022205-08:00","updated_at":"2025-12-31T02:00:37.377861-08:00","closed_at":"2025-12-31T02:00:37.377861-08:00","close_reason":"Branch no longer exists on remote","created_by":"gastown/polecats/furiosa"}
{"id":"gt-w52o8","title":"Digest: mol-deacon-patrol","description":"Patrol 19: All clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:05:39.718034-08:00","updated_at":"2025-12-27T21:26:03.960258-08:00","deleted_at":"2025-12-27T21:26:03.960258-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-w5dj","title":"Merge: gt-unrd","description":"branch: polecat/Ace\ntarget: main\nsource_issue: gt-unrd\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T15:42:06.600633-08:00","updated_at":"2025-12-27T21:27:22.750283-08:00","deleted_at":"2025-12-27T21:27:22.750283-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-w6ty","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:48","description":"Patrol 18: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:48:00.334125-08:00","updated_at":"2025-12-27T21:26:05.046336-08:00","deleted_at":"2025-12-27T21:26:05.046336-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-w775","title":"MR: gt-svi.1 (polecat/Furiosa)","description":"branch: polecat/Furiosa\ntarget: main\nsource_issue: gt-svi.1","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-18T20:21:40.921429-08:00","updated_at":"2025-12-27T21:27:22.791578-08:00","deleted_at":"2025-12-27T21:27:22.791578-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-w8322","title":"Digest: mol-deacon-patrol","description":"Patrol 9: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T13:30:09.33985-08:00","updated_at":"2025-12-27T21:26:03.279435-08:00","deleted_at":"2025-12-27T21:26:03.279435-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-w91xz","title":"CLI cleanup: Remove duplicate mol commands from gt","description":"## Problem\ngt mol has commands that duplicate bd mol functionality:\n- `gt mol catalog` (duplicate of `bd formula list`)\n- `gt mol show` (duplicate of `bd mol show`)\n- `gt mol squash` (duplicate of `bd mol squash`)\n- `gt mol burn` (duplicate of `bd mol burn`)\n- `gt mol bond` (duplicate of `bd mol bond`)\n\n## Principle\n- **bd** = beads operations (issues, molecules, formulas)\n- **gt** = agent operations (sessions, communication, work dispatch)\n\ngt should delegate to bd for beads operations, not duplicate them.\n\n## Proposed Changes\nKeep gt mol commands that are agent-specific:\n- `gt mol status` - what is on MY hook\n- `gt mol attach/detach` - hook management\n- `gt mol current` - what should I work on\n- `gt mol progress` - my progress through workflow\n\nRemove/delegate to bd:\n- `gt mol catalog` → use `bd formula list`\n- `gt mol show` → use `bd mol show`\n- `gt mol squash` → use `bd mol squash`\n- `gt mol burn` → use `bd mol burn`\n- `gt mol bond` → use `bd mol bond`\n\n## Alternative\nKeep as thin wrappers that just call bd, for convenience.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-27T14:28:05.954467-08:00","updated_at":"2025-12-27T21:29:54.713849-08:00","created_by":"mayor","deleted_at":"2025-12-27T21:29:54.713849-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-w9o","title":"/restart: Personal slash command for in-place agent restart","description":"Create ~/.claude/commands/restart.md that restarts current Gas Town agent in place.\n\n## Detection\n- Read tmux session name: gt-mayor, gt-witness-*, gt-refinery-*, gt-polecat-*\n- Fallback: check GT_ROLE env var\n\n## Behavior by role\n- mayor: gt mayor restart (sends Ctrl-C, loop respawns)\n- witness: gt witness restart\n- refinery: gt refinery restart \n- polecat: gt polecat restart (or witness-mediated)\n\n## Command format\nUses backticks for inline bash to detect context, then instructs Claude to run appropriate restart.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-18T18:32:30.043125-08:00","updated_at":"2025-12-27T21:29:54.151093-08:00","deleted_at":"2025-12-27T21:29:54.151093-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wav5","title":"Merge: gt-a95","description":"branch: polecat/Ace\ntarget: main\nsource_issue: gt-a95\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-20T21:23:08.375434-08:00","updated_at":"2025-12-27T21:27:22.642527-08:00","deleted_at":"2025-12-27T21:27:22.642527-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-wewf.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. \"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-wewf\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T00:10:43.433441-08:00","updated_at":"2025-12-27T21:29:55.6126-08:00","deleted_at":"2025-12-27T21:29:55.6126-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wexr","title":"Polecat role references deprecated 'swarm' terminology","description":"prompts/roles/polecat.md line 12 says:\n'Part of a swarm: Other polecats may be working on related issues in parallel'\n\nBut architecture.md explicitly states:\n'There are no swarm IDs - just epics with children'\n\nThe swarm concept has been replaced by streams/dependency model.\nUpdate polecat.md to remove swarm references.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-24T12:50:32.097647-08:00","updated_at":"2025-12-27T21:29:52.655198-08:00","dependencies":[{"issue_id":"gt-wexr","depends_on_id":"gt-jo9n","type":"blocks","created_at":"2025-12-24T12:52:05.023976-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:52.655198-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-whpnr","title":"Digest: mol-deacon-patrol","description":"Patrol 3: all clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:09:46.3409-08:00","updated_at":"2025-12-27T21:26:01.082412-08:00","deleted_at":"2025-12-27T21:26:01.082412-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wib7t","title":"Digest: mol-deacon-patrol","description":"Patrol 20: Final patrol, all healthy, handoff triggered","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T07:30:06.242412-08:00","updated_at":"2025-12-27T21:26:03.609649-08:00","deleted_at":"2025-12-27T21:26:03.609649-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wisp-271","title":"Check own context limit","description":"Check own context limit.\n\nThe Deacon runs in a Claude session with finite context. Check if approaching the limit:\n\n```bash\ngt context --usage\n```\n\nIf context is high (\u003e80%), prepare for handoff:\n- Summarize current state\n- Note any pending work\n- Write handoff to molecule state\n\nThis enables the Deacon to burn and respawn cleanly.","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-26T13:04:23.04993-08:00","updated_at":"2025-12-26T13:09:27.201766-08:00","closed_at":"2025-12-26T13:09:27.201766-08:00","dependencies":[{"issue_id":"gt-wisp-271","depends_on_id":"gt-wisp-9ss","type":"blocks","created_at":"2025-12-26T13:04:23.202384-08:00","created_by":"deacon"}]}
{"id":"gt-wisp-3fc","title":"Check own context limit","description":"Check own context usage.\n\nIf context is HIGH (\u003e80%):\n- Ensure state is saved to handoff bead\n- Prepare for burn/respawn\n\nIf context is LOW:\n- Can continue patrolling\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:54:36.821372-08:00","updated_at":"2025-12-27T21:29:55.036103-08:00","dependencies":[{"issue_id":"gt-wisp-3fc","depends_on_id":"gt-wisp-48l","type":"parent-child","created_at":"2025-12-25T19:54:36.907769-08:00","created_by":"gastown/witness"},{"issue_id":"gt-wisp-3fc","depends_on_id":"gt-wisp-hp3","type":"blocks","created_at":"2025-12-25T19:54:36.921536-08:00","created_by":"gastown/witness"}],"deleted_at":"2025-12-27T21:29:55.036103-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wisp-48l","title":"mol-witness-patrol","description":"Per-rig worker monitor patrol loop using the Christmas Ornament pattern.\n\nThe Witness is the Pit Boss for your rig. You watch polecats, nudge them toward\ncompletion, verify clean git state before kills, and escalate stuck workers.\n\n**You do NOT do implementation work.** Your job is oversight, not coding.\n\nThis molecule uses dynamic bonding to spawn mol-polecat-arm for each worker,\nenabling parallel inspection with a fanout gate for aggregation.\n\n## The Christmas Ornament Shape\n\n```\n ★ mol-witness-patrol (trunk)\n /|\\\n ┌────────┘ │ └────────┐\n PREFLIGHT DISCOVERY CLEANUP\n │ │ │\n inbox-check survey aggregate (WaitsFor: all-children)\n check-refnry │ save-state\n load-state │ generate-summary\n ↓ context-check\n ┌───────┼───────┐ burn-or-loop\n ● ● ● mol-polecat-arm (dynamic)\n ace nux toast\n```\n","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-25T19:54:36.819952-08:00","updated_at":"2025-12-27T21:26:02.441987-08:00","deleted_at":"2025-12-27T21:26:02.441987-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-wisp-5yy","title":"Load persisted patrol state","description":"Read handoff bead and get nudge counts.\n\nLoad persistent state from the witness handoff bead:\n- Active workers and their status from last cycle\n- Nudge counts per worker per issue\n- Last nudge timestamps\n- Pending escalations\n\n```bash\nbd show \u003chandoff-bead-id\u003e\n```\n\nIf no handoff exists (fresh start), initialize empty state.\nThis state persists across wisp burns and session cycles.\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:54:36.82212-08:00","updated_at":"2025-12-27T21:29:55.010987-08:00","dependencies":[{"issue_id":"gt-wisp-5yy","depends_on_id":"gt-wisp-48l","type":"parent-child","created_at":"2025-12-25T19:54:36.977849-08:00","created_by":"gastown/witness"},{"issue_id":"gt-wisp-5yy","depends_on_id":"gt-wisp-ps8","type":"blocks","created_at":"2025-12-25T19:54:36.992199-08:00","created_by":"gastown/witness"}],"deleted_at":"2025-12-27T21:29:55.010987-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wisp-646","title":"Evaluate pending async gates","description":"Evaluate pending async gates.\n\nGates are async coordination primitives that block until conditions are met.\nThe Deacon is responsible for monitoring gates and closing them when ready.\n\n**Timer gates** (await_type: timer):\nCheck if elapsed time since creation exceeds the timeout duration.\n\n```bash\n# List all open gates\nbd gate list --json\n\n# For each timer gate, check if elapsed:\n# - CreatedAt + Timeout \u003c Now → gate is ready to close\n# - Close with: bd gate close \u003cid\u003e --reason \"Timer elapsed\"\n```\n\n**GitHub gates** (await_type: gh:run, gh:pr) - handled in separate step.\n\n**Human/Mail gates** - require external input, skip here.\n\nAfter closing a gate, the Waiters field contains mail addresses to notify.\nSend a brief notification to each waiter that the gate has cleared.","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-26T13:04:23.04871-08:00","updated_at":"2025-12-26T13:06:05.44536-08:00","closed_at":"2025-12-26T13:06:05.44536-08:00","dependencies":[{"issue_id":"gt-wisp-646","depends_on_id":"gt-wisp-zuj","type":"blocks","created_at":"2025-12-26T13:04:23.144118-08:00","created_by":"deacon"}]}
{"id":"gt-wisp-78q","title":"Execute registered plugins","description":"Execute registered plugins.\n\nScan ~/gt/plugins/ for plugin directories. Each plugin has a plugin.md with YAML frontmatter defining its gate (when to run) and instructions (what to do).\n\nSee docs/deacon-plugins.md for full documentation.\n\nGate types:\n- cooldown: Time since last run (e.g., 24h)\n- cron: Schedule-based (e.g., \"0 9 * * *\")\n- condition: Metric threshold (e.g., wisp count \u003e 50)\n- event: Trigger-based (e.g., startup, heartbeat)\n\nFor each plugin:\n1. Read plugin.md frontmatter to check gate\n2. Compare against state.json (last run, etc.)\n3. If gate is open, execute the plugin\n\nPlugins marked parallel: true can run concurrently using Task tool subagents. Sequential plugins run one at a time in directory order.\n\nSkip this step if ~/gt/plugins/ does not exist or is empty.","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-26T14:04:44.877976-08:00","updated_at":"2025-12-26T14:07:27.717637-08:00","closed_at":"2025-12-26T14:07:27.717637-08:00"}
{"id":"gt-wisp-99t","title":"Burn and respawn or loop","description":"Burn and let daemon respawn, or exit if context high.\n\nDecision point at end of patrol cycle:\n\nIf context is LOW:\n- Sleep briefly (avoid tight loop)\n- Return to inbox-check step\n\nIf context is HIGH:\n- Write state to persistent storage\n- Exit cleanly\n- Let the daemon orchestrator respawn a fresh Deacon\n\nThe daemon ensures Deacon is always running:\n```bash\n# Daemon respawns on exit\ngt daemon status\n```\n\nThis enables infinite patrol duration via context-aware respawning.","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-26T16:16:32.014153-08:00","updated_at":"2025-12-26T20:07:45.150162-08:00","closed_at":"2025-12-26T20:07:45.150162-08:00","dependencies":[{"issue_id":"gt-wisp-99t","depends_on_id":"gt-wisp-mpm","type":"blocks","created_at":"2025-12-26T16:16:32.105946-08:00","created_by":"stevey"}]}
{"id":"gt-wisp-a1c","title":"Process witness mail","description":"Process witness mail: lifecycle requests, help requests.\n\n```bash\ngt mail inbox\n```\n\nHandle by message type:\n- **LIFECYCLE/Shutdown**: Queue for pre-kill verification\n- **Blocked/Help**: Assess if resolvable or escalate\n- **HANDOFF**: Load predecessor state\n- **Work complete**: Verify issue closed, proceed to pre-kill\n\nRecord any pending actions for later steps.\nMark messages as processed when complete.\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:54:36.821859-08:00","updated_at":"2025-12-27T21:29:55.019343-08:00","dependencies":[{"issue_id":"gt-wisp-a1c","depends_on_id":"gt-wisp-48l","type":"parent-child","created_at":"2025-12-25T19:54:36.963525-08:00","created_by":"gastown/witness"}],"deleted_at":"2025-12-27T21:29:55.019343-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wisp-cz5","title":"Check Witness and Refinery health","description":"Check Witness and Refinery health for each rig.\n\n**ZFC Principle**: You (Claude) make the judgment call about what is \"stuck\" or \"unresponsive\" - there are no hardcoded thresholds in Go. Read the signals, consider context, and decide.\n\nFor each rig, run:\n```bash\ngt witness status \u003crig\u003e\ngt refinery status \u003crig\u003e\n```\n\n**Signals to assess:**\n\n| Component | Healthy Signals | Concerning Signals |\n|-----------|-----------------|-------------------|\n| Witness | State: running, recent activity | State: not running, no heartbeat |\n| Refinery | State: running, queue processing | Queue stuck, merge failures |\n\n**Tracking unresponsive cycles:**\n\nMaintain in your patrol state (persisted across cycles):\n```\nhealth_state:\n \u003crig\u003e:\n witness:\n unresponsive_cycles: 0\n last_seen_healthy: \u003ctimestamp\u003e\n refinery:\n unresponsive_cycles: 0\n last_seen_healthy: \u003ctimestamp\u003e\n```\n\n**Decision matrix** (you decide the thresholds based on context):\n\n| Cycles Unresponsive | Suggested Action |\n|---------------------|------------------|\n| 1-2 | Note it, check again next cycle |\n| 3-4 | Attempt restart: gt witness restart \u003crig\u003e |\n| 5+ | Escalate to Mayor with context |\n\n**Restart commands:**\n```bash\ngt witness restart \u003crig\u003e\ngt refinery restart \u003crig\u003e\n```\n\n**Escalation:**\n```bash\ngt mail send mayor/ -s \"Health: \u003crig\u003e \u003ccomponent\u003e unresponsive\" \\\n -m \"Component has been unresponsive for N cycles. Restart attempts failed.\n Last healthy: \u003ctimestamp\u003e\n Error signals: \u003cdetails\u003e\"\n```\n\nReset unresponsive_cycles to 0 when component responds normally.","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-26T16:16:32.012846-08:00","updated_at":"2025-12-26T16:18:17.168113-08:00","closed_at":"2025-12-26T16:18:17.168113-08:00","dependencies":[{"issue_id":"gt-wisp-cz5","depends_on_id":"gt-wisp-9hf","type":"blocks","created_at":"2025-12-26T16:16:32.071988-08:00","created_by":"stevey"}]}
{"id":"gt-wisp-eju","title":"Nudge newly spawned polecats","description":"Nudge newly spawned polecats that are ready for input.\n\nWhen polecats are spawned, their Claude session takes 10-20 seconds to initialize. The spawn command returns immediately without waiting. This step finds spawned polecats that are now ready and sends them a trigger to start working.\n\n**ZFC-Compliant Observation** (AI observes AI):\n\n```bash\n# View pending spawns with captured terminal output\ngt deacon pending\n```\n\nFor each pending session, analyze the captured output:\n- Look for Claude's prompt indicator \"\u003e \" at the start of a line\n- If prompt is visible, Claude is ready for input\n- Make the judgment call yourself - you're the AI observer\n\nFor each ready polecat:\n```bash\n# 1. Trigger the polecat\ngt nudge \u003csession\u003e \"Begin.\"\n\n# 2. Clear from pending list\ngt deacon pending \u003csession\u003e\n```\n\nThis triggers the UserPromptSubmit hook, which injects mail so the polecat sees its assignment.\n\n**Bootstrap mode** (daemon-only, no AI available):\nThe daemon uses `gt deacon trigger-pending` with regex detection. This ZFC violation is acceptable during cold startup when no AI agent is running yet.","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-25T19:55:37.501716-08:00","updated_at":"2025-12-25T19:55:55.613364-08:00","closed_at":"2025-12-25T19:55:55.613364-08:00","dependencies":[{"issue_id":"gt-wisp-eju","depends_on_id":"gt-wisp-lya","type":"blocks","created_at":"2025-12-25T19:55:37.617246-08:00","created_by":"deacon"}]}
{"id":"gt-wisp-h5d","title":"Persist patrol state","description":"Update handoff bead with new states.\n\nPersist state to the witness handoff bead:\n- Updated worker statuses from all arms\n- Current nudge counts per worker\n- Nudge timestamps\n- Actions taken this cycle\n- Pending items for next cycle\n\n```bash\nbd update \u003chandoff-bead-id\u003e --description=\"\u003cserialized state\u003e\"\n```\n\nThis state survives wisp burns and session cycles.\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:54:36.822355-08:00","updated_at":"2025-12-27T21:29:55.002586-08:00","dependencies":[{"issue_id":"gt-wisp-h5d","depends_on_id":"gt-wisp-48l","type":"parent-child","created_at":"2025-12-25T19:54:37.006252-08:00","created_by":"gastown/witness"},{"issue_id":"gt-wisp-h5d","depends_on_id":"gt-wisp-lsd","type":"blocks","created_at":"2025-12-25T19:54:37.020736-08:00","created_by":"gastown/witness"}],"deleted_at":"2025-12-27T21:29:55.002586-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wisp-hp3","title":"Generate handoff summary","description":"Summarize this patrol cycle for digest.\n\nInclude:\n- Workers inspected (count, names)\n- Nudges sent (count, to whom)\n- Sessions killed (count, names)\n- Escalations (count, issues)\n- Issues found (brief descriptions)\n- Actions pending for next cycle\n\nThis becomes the digest when the patrol wisp is squashed.\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:54:36.821607-08:00","updated_at":"2025-12-27T21:29:55.027795-08:00","dependencies":[{"issue_id":"gt-wisp-hp3","depends_on_id":"gt-wisp-48l","type":"parent-child","created_at":"2025-12-25T19:54:36.93569-08:00","created_by":"gastown/witness"},{"issue_id":"gt-wisp-hp3","depends_on_id":"gt-wisp-h5d","type":"blocks","created_at":"2025-12-25T19:54:36.949594-08:00","created_by":"gastown/witness"}],"deleted_at":"2025-12-27T21:29:55.027795-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wisp-hzw","title":"Clean dead sessions","description":"Clean dead sessions.\n\nGarbage collect terminated sessions:\n- Remove stale polecat directories\n- Clean up wisp session artifacts\n- Prune old logs and temp files\n- Archive completed molecule state\n\n```bash\ngt gc --sessions\ngt gc --wisps --age=1h\n```\n\nPreserve audit trail. Only clean sessions confirmed dead.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:55:37.502717-08:00","updated_at":"2025-12-27T21:29:54.969003-08:00","deleted_at":"2025-12-27T21:29:54.969003-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wisp-lsd","title":"Aggregate arm results","description":"Collect outcomes from all polecat inspection arms.\n\nThis is a **fanout gate** - it cannot proceed until ALL dynamically-bonded\npolecat arms have completed their inspection cycles.\n\nOnce all arms complete, collect their outcomes:\n- Actions taken per polecat (nudge, kill, escalate, none)\n- Updated nudge counts\n- Any errors or issues discovered\n\nBuild the consolidated state for save-state.\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:54:36.820571-08:00","updated_at":"2025-12-27T21:29:55.064377-08:00","dependencies":[{"issue_id":"gt-wisp-lsd","depends_on_id":"gt-wisp-48l","type":"parent-child","created_at":"2025-12-25T19:54:36.822951-08:00","created_by":"gastown/witness"},{"issue_id":"gt-wisp-lsd","depends_on_id":"gt-wisp-q5d","type":"blocks","created_at":"2025-12-25T19:54:36.837897-08:00","created_by":"gastown/witness"}],"deleted_at":"2025-12-27T21:29:55.064377-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wisp-ps8","title":"Ensure refinery is alive","description":"Ensure the refinery is alive and processing merge requests.\n\n**Redundant system**: This check runs in both gt spawn and Witness patrol\nto ensure the merge queue processor stays operational.\n\n```bash\n# Check if refinery session is running\ngt session status \u003crig\u003e/refinery\n\n# Check for merge requests in queue\nbd list --type=merge-request --status=open\n```\n\nIf merge requests are waiting AND refinery is not running:\n```bash\ngt session start \u003crig\u003e/refinery\ngt mail send \u003crig\u003e/refinery -s \"PATROL: Wake up\" -m \"Merge requests in queue. Please process.\"\n```\n\nIf refinery is running but queue is non-empty for \u003e30 min, send nudge.\nThis ensures polecats don't wait forever for their branches to merge.\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:54:36.821127-08:00","updated_at":"2025-12-27T21:29:55.047335-08:00","dependencies":[{"issue_id":"gt-wisp-ps8","depends_on_id":"gt-wisp-48l","type":"parent-child","created_at":"2025-12-25T19:54:36.880056-08:00","created_by":"gastown/witness"},{"issue_id":"gt-wisp-ps8","depends_on_id":"gt-wisp-a1c","type":"blocks","created_at":"2025-12-25T19:54:36.893717-08:00","created_by":"gastown/witness"}],"deleted_at":"2025-12-27T21:29:55.047335-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wisp-q5d","title":"Survey all polecats (fanout)","description":"List polecats and bond mol-polecat-arm for each one.\n\n```bash\n# Get list of polecats\ngt polecat list \u003crig\u003e\n```\n\nFor each polecat discovered, dynamically bond an inspection arm:\n\n```bash\n# Bond mol-polecat-arm for each polecat\nfor polecat in $(gt polecat list \u003crig\u003e --names); do\n bd mol bond mol-polecat-arm $PATROL_WISP_ID \\\n --ref arm-$polecat \\\n --var polecat_name=$polecat \\\n --var rig=\u003crig\u003e\ndone\n```\n\nThis creates child wisps like:\n- patrol-x7k.arm-ace (5 steps)\n- patrol-x7k.arm-nux (5 steps)\n- patrol-x7k.arm-toast (5 steps)\n\nEach arm runs in PARALLEL. The aggregate step will wait for all to complete.\n\nIf no polecats are found, this step completes immediately with no children.\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:54:36.822584-08:00","updated_at":"2025-12-27T21:29:54.994173-08:00","dependencies":[{"issue_id":"gt-wisp-q5d","depends_on_id":"gt-wisp-48l","type":"parent-child","created_at":"2025-12-25T19:54:37.034999-08:00","created_by":"gastown/witness"},{"issue_id":"gt-wisp-q5d","depends_on_id":"gt-wisp-5yy","type":"blocks","created_at":"2025-12-25T19:54:37.049799-08:00","created_by":"gastown/witness"}],"deleted_at":"2025-12-27T21:29:54.994173-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wisp-yj8","title":"Burn and respawn or loop","description":"End of patrol cycle decision.\n\nIf context is LOW:\n- Burn this wisp (no audit trail needed for patrol cycles)\n- Sleep briefly to avoid tight loop (30-60 seconds)\n- Return to inbox-check step\n\nIf context is HIGH:\n- Burn wisp with summary digest\n- Exit cleanly (daemon will respawn fresh Witness)\n\n```bash\nbd mol burn # Destroy ephemeral wisp\n```\n\nThe daemon ensures Witness is always running.\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:54:36.820851-08:00","updated_at":"2025-12-27T21:29:55.056026-08:00","dependencies":[{"issue_id":"gt-wisp-yj8","depends_on_id":"gt-wisp-48l","type":"parent-child","created_at":"2025-12-25T19:54:36.852513-08:00","created_by":"gastown/witness"},{"issue_id":"gt-wisp-yj8","depends_on_id":"gt-wisp-3fc","type":"blocks","created_at":"2025-12-25T19:54:36.86625-08:00","created_by":"gastown/witness"}],"deleted_at":"2025-12-27T21:29:55.056026-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wiu7y","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 17: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:47:37.442727-08:00","updated_at":"2025-12-27T21:26:01.33713-08:00","deleted_at":"2025-12-27T21:26:01.33713-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wjd6q","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 10: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:44:33.548312-08:00","updated_at":"2025-12-27T21:26:01.394216-08:00","deleted_at":"2025-12-27T21:26:01.394216-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wmhj","title":"tmux link-window auto-selects new window, causing agent confusion","description":"## Summary\n\nWhen running `gt crew at \u003cother\u003e` from inside a tmux session, the linked window auto-selects, causing the user to unknowingly switch agents.\n\n## Root Cause\n\n`internal/tmux/tmux.go:535-538` - `LinkWindow` doesn't use `-d` flag:\n```go\n_, err := t.run(\"link-window\", \"-s\", source) // Missing -d!\n```\n\nBy default, `tmux link-window` selects the newly linked window.\n\n## Reproduction\n\n1. Be in Max's tmux session talking to Max\n2. Ask Max to run `gt crew at joe`\n3. Max creates Joe's session and links it\n4. User is now in Joe's window without realizing it\n5. Max appears to have 'disappeared'\n\n## Fix\n\nAdd `-d` flag to prevent auto-selection:\n```go\n_, err := t.run(\"link-window\", \"-s\", source, \"-d\")\n```\n\n## Related\n\n- gt-09i4: Unify tmux session lifecycle (broader epic Max filed)\n","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-24T14:33:40.40319-08:00","updated_at":"2025-12-27T21:29:52.61366-08:00","deleted_at":"2025-12-27T21:29:52.61366-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-woitj","title":"Digest: mol-deacon-patrol","description":"Patrol 4: All agents healthy, routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T01:06:29.731567-08:00","updated_at":"2025-12-27T21:26:03.829456-08:00","deleted_at":"2025-12-27T21:26:03.829456-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wpg","title":"Replaceable notifications via Claude Code queue","description":"Leverage Claude Code's ability to replace queued text for notifications that supersede previous ones.\n\n## Problem\n\nIf daemon sends 10 heartbeats while agent is busy, agent returns to see 10 stacked messages. Wasteful and noisy.\n\n## Solution\n\nUse Claude Code's queue replacement for:\n- Heartbeat messages (only latest matters)\n- Status updates that supersede previous\n- Progress notifications\n\n## Implementation\n\nNotifications get a 'slot' identifier. New notification in same slot replaces old one:\n- Slot: 'heartbeat' → only one heartbeat queued at a time\n- Slot: 'status-\u003crig\u003e' → latest status per rig\n- No slot → stacks normally (for unique messages)\n\n## Research Needed\n\n- How does Claude Code expose queue replacement?\n- tmux send-keys behavior with pending input\n- Alternative: clear + resend pattern","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-18T14:19:29.821949-08:00","updated_at":"2025-12-27T21:29:57.159975-08:00","dependencies":[{"issue_id":"gt-wpg","depends_on_id":"gt-99m","type":"blocks","created_at":"2025-12-18T14:19:46.656972-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.159975-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wpj1","title":"Digest: mol-deacon-patrol","description":"Patrol 14: OK","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T21:00:35.116869-08:00","updated_at":"2025-12-27T21:26:04.910883-08:00","deleted_at":"2025-12-27T21:26:04.910883-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wqck","title":"bd doctor: detect clone divergence emergencies","description":"Add doctor check to detect:\n1. Crew/Mayor on feature branches (should always be on main)\n2. Significant divergence between clones that should be in sync\n3. Distinguish from normal beads-sync vs main divergence (expected)\n\nContext: bd sync --status shows all divergence as equal, but some is emergency (clones drifted) vs normal (sync branch mechanics).","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T20:26:32.71018-08:00","updated_at":"2025-12-27T21:29:55.472673-08:00","deleted_at":"2025-12-27T21:29:55.472673-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wrftq","title":"Digest: mol-deacon-patrol","description":"Patrol 7: all clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T16:45:06.500609-08:00","updated_at":"2025-12-27T21:26:01.254654-08:00","deleted_at":"2025-12-27T21:26:01.254654-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wrvdg","title":"Digest: mol-deacon-patrol","description":"Patrol 7: all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:34:26.659612-08:00","updated_at":"2025-12-27T21:26:00.39009-08:00","deleted_at":"2025-12-27T21:26:00.39009-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wrw2","title":"Test2","description":"Testing gt mail","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-20T21:39:05.875792-08:00","updated_at":"2025-12-25T14:12:42.26153-08:00","deleted_at":"2025-12-25T14:12:42.26153-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"message"}
{"id":"gt-ws8ol","title":"Design: Deacon exponential backoff when town is idle","description":"## Problem\n\nWhen the user goes to sleep and all workers have stopped, the Deacon continues\npolling at full speed (every daemon heartbeat). This wastes resources and creates\nunnecessary log noise.\n\n## Requirements\n\n1. **Gradual slowdown**: Deacon patrol frequency should exponentially back off\n when there's no activity in the town.\n\n2. **Wake on activity**: Any gt or bd command (human or agent) should reset\n the backoff immediately.\n\n3. **Daemon coordination**: Deacon needs to \"program\" the daemon for next\n heartbeat duration.\n\n## Design Considerations\n\n### Option A: Daemon tracks last-activity timestamp\n- Daemon mechanically slows heartbeat based on time since last activity\n- Activity signal comes from gt/bd command hooks\n- Simple, no Deacon state needed\n- But: Daemon is \"dumb\" by design - adding intelligence here feels wrong\n\n### Option B: Deacon tracks patrol-count-since-activity\n- Deacon maintains counter (via label on patrol molecule or state.json)\n- Each idle patrol increments counter, calculates next sleep\n- Deacon tells daemon \"wake me in X minutes\"\n- More complex but keeps intelligence in AI agent\n\n### Option C: Discovery-based (preferred if feasible)\n- Can we detect \"no activity\" without explicit counters?\n- Ideas:\n - Compare current time to last git commit across all clones?\n - Check tmux activity timestamps?\n - Look at beads modification times?\n- Pros: No state to manage\n- Cons: May be expensive to compute, could miss activity\n\n### Wake mechanism\n- Hook on gt commands (gt.hooks.pre-command or similar)\n- Hook on bd commands (bd.hooks.pre-command)\n- Both should signal daemon (SIGUSR2?) or write to activity file\n- Daemon reads activity file on each heartbeat, resets backoff if fresh\n\n### Backoff schedule (strawman)\n| Idle patrols | Next heartbeat |\n|--------------|----------------|\n| 0-5 | 5 min (default) |\n| 6-10 | 10 min |\n| 11-20 | 30 min |\n| 21+ | 60 min (max) |\n\n## Open Questions\n\n1. Should tmux activity count as \"activity\"? (cursor movement, typing)\n2. How does Deacon communicate next-heartbeat to daemon?\n3. Where to store backoff state? (daemon state.json? deacon state.json? label?)\n4. Should wake be instant (SIGUSR) or next-heartbeat?\n\n## Implementation Sketch\n\nActivity file written by hooks at ~/gt/daemon/activity.json:\n- last_command: timestamp\n- command: the gt/bd command that ran\n- actor: who ran it","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-26T19:09:17.544704-08:00","updated_at":"2025-12-27T21:29:54.766314-08:00","deleted_at":"2025-12-27T21:29:54.766314-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-ws8ol.1","title":"gt: Write town-level activity signal in PersistentPreRun","description":"Extend gt's existing keepalive infrastructure to also write a town-level activity file.\n\nCurrently gt writes to \u003cworkspace\u003e/.runtime/keepalive.json (per-rig).\nAdd a second write to ~/gt/daemon/activity.json (town-level).\n\nThe keepalive package already has TouchInWorkspace(). Add a new function\nTouchTownActivity() that writes to the town daemon directory.\n\nCall both in root.go's PersistentPreRun.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:21:54.160207-08:00","updated_at":"2025-12-27T21:29:54.74828-08:00","dependencies":[{"issue_id":"gt-ws8ol.1","depends_on_id":"gt-ws8ol","type":"parent-child","created_at":"2025-12-26T19:21:54.160763-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.74828-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ws8ol.2","title":"bd: Add town-level activity signal in PersistentPreRun","description":"Add activity signaling to beads so daemon can detect bd usage.\n\nIn cmd/bd/main.go PersistentPreRun, add a call to write activity to\nthe Gas Town daemon directory if running inside a Gas Town workspace.\n\nThe signal file is ~/gt/daemon/activity.json (or detected town root).\n\nFormat:\n{\n \"last_command\": \"bd create ...\",\n \"actor\": \"gastown/crew/max\",\n \"timestamp\": \"2025-12-26T19:30:00Z\"\n}\n\nShould be best-effort (silent failure) to avoid breaking bd outside Gas Town.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:22:04.033458-08:00","updated_at":"2025-12-26T19:25:27.854783-08:00","dependencies":[{"issue_id":"gt-ws8ol.2","depends_on_id":"gt-ws8ol","type":"parent-child","created_at":"2025-12-26T19:22:04.03401-08:00","created_by":"daemon"}],"deleted_at":"2025-12-26T19:25:27.854783-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ws8ol.3","title":"Daemon: Implement exponential backoff based on activity file","description":"Modify daemon heartbeat logic to use exponential backoff when idle.\n\nRead ~/gt/daemon/activity.json on each heartbeat.\nCompute idle duration: time.Since(activity.Timestamp)\nAdjust next ticker interval based on idle duration:\n\n| Idle Duration | Next Heartbeat |\n|---------------|----------------|\n| 0-5 min | 5 min (base) |\n| 5-15 min | 10 min |\n| 15-45 min | 30 min |\n| 45+ min | 60 min (max) |\n\nImplementation:\n1. Add activity file reading to daemon heartbeat\n2. Replace fixed ticker with dynamic interval calculation\n3. Reset to base interval when activity is fresh\n\nNo counter tracking needed - pure discovery from single timestamp.\n\nCROSS-RIG DEPENDENCY: Also requires beads:bd-v8ku (assigned to beads/crew/dave)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:22:14.170114-08:00","updated_at":"2025-12-27T21:29:54.739515-08:00","dependencies":[{"issue_id":"gt-ws8ol.3","depends_on_id":"gt-ws8ol","type":"parent-child","created_at":"2025-12-26T19:22:14.170719-08:00","created_by":"daemon"},{"issue_id":"gt-ws8ol.3","depends_on_id":"gt-ws8ol.1","type":"blocks","created_at":"2025-12-26T19:22:19.844324-08:00","created_by":"daemon"},{"issue_id":"gt-ws8ol.3","depends_on_id":"gt-ws8ol.2","type":"blocks","created_at":"2025-12-26T19:22:19.90297-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.739515-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wsa2","title":"Digest: mol-deacon-patrol","description":"Patrol #6: Routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:22:39.448813-08:00","updated_at":"2025-12-27T21:26:04.785304-08:00","deleted_at":"2025-12-27T21:26:04.785304-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wsjg.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. \"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-wsjg\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T21:17:02.197808-08:00","updated_at":"2025-12-27T21:29:55.446289-08:00","deleted_at":"2025-12-27T21:29:55.446289-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wt6ci","title":"Merge: rictus-mjw3nj1a","description":"branch: polecat/rictus-mjw3nj1a\ntarget: main\nsource_issue: rictus-mjw3nj1a\nrig: gastown\nagent_bead: gt-gastown-polecat-rictus","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T19:23:13.87957-08:00","updated_at":"2026-01-01T19:25:16.889202-08:00","closed_at":"2026-01-01T19:25:16.889202-08:00","close_reason":"Merged to main at 478dc60d","created_by":"gastown/polecats/rictus"}
{"id":"gt-wugqp","title":"Digest: mol-deacon-patrol","description":"Patrol 20: Final patrol, 11 sessions healthy, no incidents","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:20:38.900039-08:00","updated_at":"2025-12-27T21:26:02.642504-08:00","deleted_at":"2025-12-27T21:26:02.642504-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wvycd","title":"Digest: mol-deacon-patrol","description":"Patrol 10: routine, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T13:44:08.199048-08:00","updated_at":"2025-12-27T21:26:03.206256-08:00","deleted_at":"2025-12-27T21:26:03.206256-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wvyi","title":"sling pin test 2","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T11:51:01.899435-08:00","updated_at":"2025-12-27T21:29:56.035136-08:00","deleted_at":"2025-12-27T21:29:56.035136-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-wysx3","title":"Digest: mol-deacon-patrol","description":"Patrol 9: Mayor OK, 11 sessions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:36:34.249635-08:00","updated_at":"2025-12-27T21:26:02.139015-08:00","deleted_at":"2025-12-27T21:26:02.139015-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-x0omu","title":"Digest: mol-deacon-patrol","description":"Patrol 10: Quick cycle, 50% to handoff","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T17:07:26.218638-08:00","updated_at":"2025-12-27T21:26:02.975768-08:00","deleted_at":"2025-12-27T21:26:02.975768-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-x0tz","title":"Digest: mol-deacon-patrol","description":"Patrol 12","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:09:48.920883-08:00","updated_at":"2025-12-27T21:26:04.433144-08:00","deleted_at":"2025-12-27T21:26:04.433144-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-x2cx","title":"gt handoff: Deadlock bug in runHandoff","notes":"Running 'gt handoff' from a polecat causes a deadlock:\n\nStack trace shows:\n- goroutine 1 [select (no cases)] in runHandoff\n- File: internal/cmd/handoff.go:125\n\nThe command successfully sends the shutdown request but then hangs with 'fatal error: all goroutines are asleep - deadlock!'","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-20T08:01:33.827354-08:00","updated_at":"2025-12-27T21:29:53.764464-08:00","deleted_at":"2025-12-27T21:29:53.764464-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-x2ygo","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 3: routine check, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T13:22:00.324654-08:00","updated_at":"2025-12-27T21:26:01.950987-08:00","deleted_at":"2025-12-27T21:26:01.950987-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-x2yml","title":"Digest: mol-deacon-patrol","description":"Patrol complete: 0 messages, agents healthy, no cleanup needed","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T21:49:25.778185-08:00","updated_at":"2025-12-27T21:26:00.601801-08:00","deleted_at":"2025-12-27T21:26:00.601801-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-x5acy","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 11: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-28T11:22:34.773103-08:00","updated_at":"2025-12-28T11:22:34.773103-08:00","closed_at":"2025-12-28T11:22:34.773071-08:00"}
{"id":"gt-x5gxi","title":"Daemon/Boot/Deacon Watchdog Chain: Fix session lifecycle management","description":"## Summary\n\nThe watchdog chain (Daemon → Boot → Deacon → Witness/Refinery) has multiple bugs preventing auto-recovery of dead agents:\n\n1. Boot spawns in wrong session\n2. Daemon can't kill zombie sessions\n3. Binary/process age mismatch goes undetected\n4. Status display doesn't reconcile bead vs tmux state\n5. Start commands don't have 'ensure' semantics\n\n## Impact\n\nWhen witness/refinery die, they stay dead until manual intervention. The entire purpose of the Deacon (health monitoring) is defeated.\n\n## Root Cause\n\nSession lifecycle management lacks 'ensure running' semantics. The code assumes:\n- If session doesn't exist → create it\n- If session exists → it's healthy\n\nReality:\n- Sessions can be zombies (tmux alive, Claude dead)\n- Bead state and tmux state can diverge\n- Daemon process can run old code\n\n## Child Issues\n\n- gt-sgzsb: Boot session confusion (P1)\n- gt-j1i0r: Zombie session blocking (P1)\n- gt-d48f2: Binary age detection (P2)\n- gt-doih4: Status bead/tmux mismatch (P2)\n- gt-ekc5u: Start 'ensure' semantics (P2)\n- gt-1847v: Boot/Deacon documentation (P2)\n\n## Success Criteria\n\n1. Dead witness/refinery auto-restart within 5 minutes\n2. `gt status` accurately reflects session health\n3. Daemon warns if running old code\n4. Boot/Deacon lifecycle is clear and documented","status":"closed","priority":1,"issue_type":"epic","created_at":"2026-01-02T18:43:51.522482-08:00","updated_at":"2026-01-02T18:57:15.49754-08:00","closed_at":"2026-01-02T18:57:15.49754-08:00","close_reason":"All child issues completed: boot session fix, zombie session handling, binary age detection, status reconciliation, ensure semantics, and documentation","created_by":"mayor","dependencies":[{"issue_id":"gt-x5gxi","depends_on_id":"gt-doih4","type":"blocks","created_at":"2026-01-02T18:44:17.022721-08:00","created_by":"mayor"},{"issue_id":"gt-x5gxi","depends_on_id":"gt-ekc5u","type":"blocks","created_at":"2026-01-02T18:44:17.064278-08:00","created_by":"mayor"},{"issue_id":"gt-x5gxi","depends_on_id":"gt-1847v","type":"blocks","created_at":"2026-01-02T18:44:17.104983-08:00","created_by":"mayor"},{"issue_id":"gt-x5gxi","depends_on_id":"gt-sgzsb","type":"blocks","created_at":"2026-01-02T18:44:16.897795-08:00","created_by":"mayor"},{"issue_id":"gt-x5gxi","depends_on_id":"gt-j1i0r","type":"blocks","created_at":"2026-01-02T18:44:16.939363-08:00","created_by":"mayor"},{"issue_id":"gt-x5gxi","depends_on_id":"gt-d48f2","type":"blocks","created_at":"2026-01-02T18:44:16.98097-08:00","created_by":"mayor"}]}
{"id":"gt-x74c","title":"gt mol command tree: status, catalog, burn, squash","description":"Add gt mol subcommand as agent-side API for molecule operations.\n\nCommands needed:\n- gt mol status - What's on my hook? (pinned molecule, current step, progress)\n- gt mol catalog - List available protos (delegate to bd mol catalog)\n- gt mol burn - Burn current attachment\n- gt mol squash - Squash current molecule to digest\n\nThis completes the agent-side API and makes the docs (sling-design.md, propulsion-principle.md) match reality.\n\nBlocks: deacon.md.tmpl update (can't use gt mol status until it exists)","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-22T13:12:23.710855-08:00","updated_at":"2025-12-27T21:29:53.226036-08:00","deleted_at":"2025-12-27T21:29:53.226036-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-x7c","title":"Work assignment mail not received by polecat","description":"When gt spawn sends a work assignment:\n\n1. gt spawn says '✓ Work assignment sent'\n2. Polecat runs 'gt mail inbox' \n3. Shows '0 messages, 0 unread'\n\nThe work assignment mail never arrived at the polecat's inbox.\n\n## Observed in session\n- Spawned polecat dementus on gt-th7\n- Polecat checked inbox: empty\n- Polecat couldn't find issue (separate sync bug)\n\n## Possible causes\n- Mail routing issue for polecat addresses\n- gt spawn not actually sending mail\n- Mail sent to wrong address format","notes":"This was likely caused by stale beads in old polecats. With gt-9nf (fresh polecats), polecats now use shared rig beads via redirect file, eliminating sync issues. Mail routing uses town beads correctly.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-20T15:18:12.39878-08:00","updated_at":"2025-12-27T21:29:53.73616-08:00","deleted_at":"2025-12-27T21:29:53.73616-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-x7n8a","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 19: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:48:24.957934-08:00","updated_at":"2025-12-27T21:26:01.32077-08:00","deleted_at":"2025-12-27T21:26:01.32077-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-x9nf","title":"Digest: mol-deacon-patrol","description":"Old test patrol cycle - cleanup","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-22T02:07:15.850595-08:00","updated_at":"2025-12-25T15:52:57.591633-08:00","deleted_at":"2025-12-25T15:52:57.591633-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-x9tk","title":"Merge: gt-j6s8","description":"branch: polecat/dementus\ntarget: main\nsource_issue: gt-j6s8\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T01:15:22.372353-08:00","updated_at":"2025-12-27T21:27:22.451878-08:00","deleted_at":"2025-12-27T21:27:22.451878-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-xbfw","title":"Missing standard OSS files: CONTRIBUTING, CODE_OF_CONDUCT, SECURITY","description":"For OSS launch, we need:\n1. CONTRIBUTING.md - How to contribute, PR process, code style\n2. CODE_OF_CONDUCT.md - Community standards\n3. SECURITY.md - How to report vulnerabilities\n\nThese are expected by OSS communities and GitHub surfaces them in repo UI.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-24T12:50:30.389345-08:00","updated_at":"2025-12-27T21:29:52.663509-08:00","deleted_at":"2025-12-27T21:29:52.663509-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xdhg","title":"test sling pinned","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T11:47:46.118753-08:00","updated_at":"2025-12-27T21:29:56.051683-08:00","deleted_at":"2025-12-27T21:29:56.051683-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xesj9","title":"Review PR #51: fix: Use hq prefix for agent beads to match town beads database","description":"Review PR #51. Check hq prefix usage is consistent. Approve with gh pr review --approve if good.","status":"closed","priority":2,"issue_type":"task","created_at":"2026-01-03T11:40:27.585399-08:00","updated_at":"2026-01-03T11:47:42.978827-08:00","closed_at":"2026-01-03T11:47:42.978827-08:00","close_reason":"PR reviewed - requested changes: hardcoded gt- references need updating in prime.go, daemon/lifecycle.go, doctor/agent_beads_check.go, polecat/manager.go, crew_add.go","created_by":"mayor"}
{"id":"gt-xf3fi","title":"Digest: mol-deacon-patrol","description":"P17","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:27:01.581886-08:00","updated_at":"2025-12-27T21:26:01.601729-08:00","deleted_at":"2025-12-27T21:26:01.601729-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xf5d","title":"Digest: mol-deacon-patrol","description":"Patrol 9","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:08:57.919962-08:00","updated_at":"2025-12-27T21:26:04.449625-08:00","deleted_at":"2025-12-27T21:26:04.449625-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xfznm","title":"Digest: mol-deacon-patrol","description":"Patrol 15: all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T21:59:55.813534-08:00","updated_at":"2025-12-27T21:26:00.490588-08:00","deleted_at":"2025-12-27T21:26:00.490588-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xhhwd","title":"Merge: dag-mjw70jg8","description":"branch: polecat/dag-mjw70jg8\ntarget: main\nsource_issue: dag-mjw70jg8\nrig: gastown\nagent_bead: gt-gastown-polecat-dag","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T19:08:20.685703-08:00","updated_at":"2026-01-01T19:10:43.722828-08:00","closed_at":"2026-01-01T19:10:43.722828-08:00","close_reason":"Merged to main at b6eeac41","created_by":"gastown/polecats/dag"}
{"id":"gt-xhv1","title":"Digest: mol-deacon-patrol","description":"Patrol 18: OK","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T21:01:57.635721-08:00","updated_at":"2025-12-27T21:26:04.8779-08:00","deleted_at":"2025-12-27T21:26:04.8779-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xicq","title":"Work on ga-lue: Implement Witness as Claude agent. Conver...","description":"Work on ga-lue: Implement Witness as Claude agent. Convert from shell script to Claude agent that monitors polecats, nudges idle ones, handles escalations. When done, submit MR (not PR) to integration branch for Refinery.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-19T22:58:08.310674-08:00","updated_at":"2025-12-27T21:29:56.897912-08:00","deleted_at":"2025-12-27T21:29:56.897912-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xjgsg","title":"Digest: mol-deacon-patrol","description":"Patrol 4: routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T10:16:05.421406-08:00","updated_at":"2025-12-27T21:26:03.58508-08:00","deleted_at":"2025-12-27T21:26:03.58508-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xkbm","title":"Merge: gt-g44u.1","description":"branch: polecat/Ace\ntarget: main\nsource_issue: gt-g44u.1\nrig: gastown","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-19T16:04:14.367493-08:00","updated_at":"2025-12-27T21:27:22.376803-08:00","deleted_at":"2025-12-27T21:27:22.376803-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-xleh8","title":"Digest: mol-deacon-patrol","description":"Patrol 20: Final patrol, all healthy, handoff","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:37:16.291795-08:00","updated_at":"2025-12-27T21:26:03.862012-08:00","deleted_at":"2025-12-27T21:26:03.862012-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xm6g","title":"Digest: mol-deacon-patrol","description":"Patrol #19","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:26:24.186039-08:00","updated_at":"2025-12-27T21:26:04.675624-08:00","deleted_at":"2025-12-27T21:26:04.675624-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xms9l","title":"Digest: mol-deacon-patrol","description":"Patrol 8","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T20:03:01.647821-08:00","updated_at":"2025-12-27T21:26:00.661136-08:00","deleted_at":"2025-12-27T21:26:00.661136-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xmyha","title":"Switch formulas from JSON to TOML","description":"## Problem\nFormula JSON files have poor ergonomics for agents and humans:\n- Multi-line strings require `\\n` escaping\n- Diffs are hard to read (long single-line changes)\n- No comments allowed\n\n## Solution\nSwitch formulas to TOML format:\n\n```toml\nformula = \"mol-deacon-patrol\"\nversion = 1\ndescription = \"\"\"\nMayor's daemon patrol loop.\nMulti-line strings work naturally.\n\"\"\"\n\n[[steps]]\nid = \"inbox-check\"\ntitle = \"Handle callbacks\"\ndescription = \"\"\"\nCheck inbox for messages.\n\n```bash\ngt mail inbox\n```\n\"\"\"\n```\n\n## Tasks\n- [ ] Add TOML parsing for formulas (github.com/BurntSushi/toml)\n- [ ] Update formula loader to try .toml first, fall back to .json\n- [ ] Add `bd formula convert \u003cname\u003e` to migrate JSON → TOML\n- [ ] Convert existing formulas\n- [ ] Update docs\n\n## Notes\n- Keep issues.jsonl as-is (machine data, append-only)\n- TOML only for human-edited formula files","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-25T21:49:39.609525-08:00","updated_at":"2025-12-27T21:29:54.919189-08:00","deleted_at":"2025-12-27T21:29:54.919189-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-xnql","title":"Define constants for magic strings","description":"Several magic strings are hardcoded throughout the codebase:\n- \"mayor\" appears in 20+ places as a path component\n- \"main\" branch name appears in 20+ places\n- \"beads-sync\" branch name in multiple files\n- \"rigs.json\" filename\n\nThese should be constants in a central location for maintainability.","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-21T21:34:46.620322-08:00","updated_at":"2025-12-27T21:29:57.580789-08:00","deleted_at":"2025-12-27T21:29:57.580789-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xntmo","title":"Digest: mol-deacon-patrol","description":"Patrol 17: all quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:40:21.981724-08:00","updated_at":"2025-12-27T21:26:00.303865-08:00","deleted_at":"2025-12-27T21:26:00.303865-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xnzp","title":"Merge: gt-7923","description":"branch: polecat/rictus\ntarget: main\nsource_issue: gt-7923\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T00:18:21.630445-08:00","updated_at":"2025-12-27T21:27:22.468385-08:00","deleted_at":"2025-12-27T21:27:22.468385-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-xoxcq","title":"Digest: mol-deacon-patrol","description":"Patrol 5: clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T18:13:19.78746-08:00","updated_at":"2025-12-27T21:26:01.065981-08:00","deleted_at":"2025-12-27T21:26:01.065981-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xp2s","title":"P0: Multiple agents can claim same worker identity","description":"Multiple Claude Code sessions running simultaneously all think they are 'dave' on beads/crew/dave. No detection or prevention of identity collision. This breaks the single-agent-per-worker assumption.\n\n## Fix Implemented\n\n1. **Lock Package** (`internal/lock/lock.go`):\n - PID-based lockfile at `\u003cworker\u003e/.gastown/agent.lock`\n - Contains PID, timestamp, session ID, hostname\n - Stale lock detection (checks if owning PID is dead)\n\n2. **Prevention in gt prime**:\n - Workers (crew/polecat) acquire identity lock before loading context\n - If another live process holds the lock, prime fails with clear error\n - Shows lock holder details and resolution steps\n\n3. **Detection with gt agents**:\n - `gt agents check` - scans for collisions and stale locks\n - `gt agents fix` - cleans stale locks\n - JSON output available for patrol tooling\n\n4. **Correction in gt doctor**:\n - New `identity-collision` check\n - `gt doctor --fix` cleans stale locks","status":"tombstone","priority":0,"issue_type":"bug","created_at":"2025-12-21T23:55:56.649577-08:00","updated_at":"2025-12-27T21:29:45.486461-08:00","deleted_at":"2025-12-27T21:29:45.486461-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-xpnev","title":"Digest: mol-deacon-patrol","description":"Patrol 12: All healthy, no messages","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T18:13:59.628034-08:00","updated_at":"2025-12-27T21:26:02.846613-08:00","deleted_at":"2025-12-27T21:26:02.846613-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xpq","title":"Add gt crew rename command","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-18T19:45:32.599846-08:00","updated_at":"2025-12-27T21:29:57.101889-08:00","deleted_at":"2025-12-27T21:29:57.101889-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xqvs","title":"Merge: gt-t9u7","description":"branch: polecat/furiosa\ntarget: main\nsource_issue: gt-t9u7\nrig: gastown","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T17:09:52.158844-08:00","updated_at":"2025-12-27T21:27:22.802488-08:00","deleted_at":"2025-12-27T21:27:22.802488-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-xsw1","title":"Digest: mol-deacon-patrol","description":"Patrol complete: Restarted 5 downed agents (Mayor, 2 Witnesses, 2 Refineries). All systems now healthy.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:05:11.307955-08:00","updated_at":"2025-12-27T21:26:04.836396-08:00","deleted_at":"2025-12-27T21:26:04.836396-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xtvfs","title":"Digest: mol-deacon-patrol","description":"P6 clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:23:38.312556-08:00","updated_at":"2025-12-27T21:26:01.695912-08:00","deleted_at":"2025-12-27T21:26:01.695912-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xuhrh","title":"Digest: mol-deacon-patrol","description":"Patrol 15: Nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:52:13.56346-08:00","updated_at":"2025-12-27T21:26:04.108181-08:00","deleted_at":"2025-12-27T21:26:04.108181-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xv8vu","title":"Digest: mol-deacon-patrol","description":"P8","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:24:17.018292-08:00","updated_at":"2025-12-27T21:26:01.679568-08:00","deleted_at":"2025-12-27T21:26:01.679568-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xx5u","title":"Digest: mol-deacon-patrol","description":"Patrol 10: OK","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T20:59:10.084625-08:00","updated_at":"2025-12-27T21:26:04.944096-08:00","deleted_at":"2025-12-27T21:26:04.944096-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-xxtl","title":"Implement bd mol bond --ephemeral flag","description":"Add --ephemeral flag to bd mol bond command to support ephemeral molecule bonding.\n\n## Context\nPhase 1.2 of Wisp Molecule Integration (gt-3x0z.2) discovered that this flag doesn't exist.\n\n## Requirements\n\n1. Add --ephemeral flag to mol bond command\n2. When --ephemeral is set:\n - Mark spawned issues with wisp=true\n - Optionally write to separate .beads-ephemeral/ storage (Phase 2)\n\n## Architecture Reference\nSee gastown architecture.md lines 487-491 for the full ephemeral storage design.\n\n## Related\n- gt-3x0z.2: Configure bd for ephemeral molecule bonding (closed - blocked on this)","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-21T16:00:04.924875-08:00","updated_at":"2025-12-27T21:29:56.570559-08:00","deleted_at":"2025-12-27T21:29:56.570559-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-xzmtp","title":"Digest: mol-deacon-patrol","description":"Patrol 13: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T19:34:37.026367-08:00","updated_at":"2025-12-27T21:26:00.751658-08:00","deleted_at":"2025-12-27T21:26:00.751658-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-y0t","title":"session stop: --force flag is defined but not used","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-16T13:55:12.848848-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"bug"}
{"id":"gt-y2p6","title":"Merge: gt-3x1.5","description":"branch: polecat/Immortan\ntarget: main\nsource_issue: gt-3x1.5\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T14:53:53.544887-08:00","updated_at":"2025-12-27T21:27:22.758557-08:00","deleted_at":"2025-12-27T21:27:22.758557-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-y3k55","title":"Digest: mol-deacon-patrol","description":"Patrol 1: No callbacks, no gates, all agents healthy, cleaned 2 orphan processes + 1 stale lock","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T14:42:43.612775-08:00","updated_at":"2025-12-27T21:26:01.568888-08:00","deleted_at":"2025-12-27T21:26:01.568888-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-y3y7","title":"Polecat template: move startup announcement into molecule","description":"Template has STARTUP PROTOCOL with 'Announce: beads Polecat furiosa, checking in.' This should be a startup step in the polecat molecule, not prose instruction in CLAUDE.md. The molecule drives behavior, not the template prose.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T16:56:56.302648-08:00","updated_at":"2025-12-27T21:29:55.897668-08:00","deleted_at":"2025-12-27T21:29:55.897668-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-y481","title":"Epic: Patrol parity - Witness and Refinery match Deacon sophistication","description":"Bring Witness and Refinery patrols up to Deacon's level of sophistication.\n\n## Deacon has\n1. Defined patrol molecule with clear steps\n2. Wisp-based execution (spawn, run, squash, loop)\n3. Daemon monitoring with auto-nudge on naked state\n4. Handoff bead attachment mechanism\n5. Banners for observability\n6. Context-aware loop-or-exit\n\n## Children\n- gt-h1n5: Witness patrol: Add banners and wisp-based execution\n- gt-qz2l: Refinery patrol: Add banners and wisp-based execution\n- gt-poxd: Create handoff beads for Witness and Refinery roles\n\n## Success criteria\n- Tailing a Witness or Refinery session shows clear banners\n- Patrols spawn as wisps and squash to digests\n- Each role has a handoff bead with attached_molecule\n- Daemon monitors Witness/Refinery nakedness (stretch goal)","status":"tombstone","priority":1,"issue_type":"epic","created_at":"2025-12-23T13:19:51.934063-08:00","updated_at":"2025-12-27T21:29:52.998933-08:00","deleted_at":"2025-12-27T21:29:52.998933-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-y5o","title":"Daemon: verify requesting_cycle before kill","description":"Per gt-gby spec, daemon should verify agent state shows requesting_cycle=true before killing session. Currently kills on any lifecycle request without verification.\n\nRequires state.json in agent workspace with requesting_cycle field.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-18T13:38:13.049988-08:00","updated_at":"2025-12-27T21:29:57.197132-08:00","deleted_at":"2025-12-27T21:29:57.197132-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-y68zm","title":"Digest: mol-deacon-patrol","description":"Patrol 5: 11 sessions, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:36:05.583458-08:00","updated_at":"2025-12-27T21:26:02.171656-08:00","deleted_at":"2025-12-27T21:26:02.171656-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-y8vqb","title":"Merge: furiosa-mjw1p03l","description":"branch: polecat/furiosa-mjw1p03l\ntarget: main\nsource_issue: furiosa-mjw1p03l\nrig: gastown\nagent_bead: gt-gastown-polecat-furiosa","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-01T15:01:56.817897-08:00","updated_at":"2026-01-01T15:40:07.524002-08:00","closed_at":"2026-01-01T15:40:07.524002-08:00","close_reason":"Stale MR - branch has no new commits or doesn't exist","created_by":"gastown/polecats/furiosa"}
{"id":"gt-y9gj3","title":"Digest: mol-deacon-patrol","description":"Patrol 2: All healthy, routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T10:14:49.293338-08:00","updated_at":"2025-12-27T21:26:03.59315-08:00","deleted_at":"2025-12-27T21:26:03.59315-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-y9vm3","title":"Digest: mol-deacon-patrol","description":"Patrol 4: 9 sessions healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T07:25:34.660496-08:00","updated_at":"2025-12-27T21:26:03.660185-08:00","deleted_at":"2025-12-27T21:26:03.660185-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-yc0ok","title":"Digest: mol-deacon-patrol","description":"Patrol 13","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:52:10.828566-08:00","updated_at":"2025-12-27T21:26:02.002622-08:00","deleted_at":"2025-12-27T21:26:02.002622-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-yd98","title":"Molecule format bridge: convert embedded markdown to child issues","description":"## Problem\n\nTwo molecule formats exist:\n1. **Old (gastown builtin)**: Steps embedded as markdown in Description field, parsed by ParseMoleculeSteps()\n2. **New (bd mol)**: Steps as child issues in proper beads DAG\n\nThe daemon's InstantiateMolecule() uses the old format. bd mol spawn uses the new format.\n\n## Options\n\n### Option A: Require child issues (new format)\n- Update daemon to use bd mol spawn instead of InstantiateMolecule\n- Deprecate embedded markdown format\n- Pro: One format, simpler\n- Con: Breaking change for existing molecules\n\n### Option B: Build a bridge\n- In InstantiateMolecule, detect format:\n - If Description has ## Step: markers → parse to child issues\n - If issue has children with template label → use directly\n- Convert old format to new format on instantiation\n- Pro: Backward compatible\n- Con: More complexity\n\n## Recommendation\n\nOption B - build the bridge. On instantiation, convert markdown steps to child issues. This unifies execution while preserving authoring flexibility.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T17:55:15.751168-08:00","updated_at":"2025-12-27T21:29:53.392446-08:00","deleted_at":"2025-12-27T21:29:53.392446-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ydbqv","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 16: nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:47:14.007718-08:00","updated_at":"2025-12-27T21:26:01.345263-08:00","deleted_at":"2025-12-27T21:26:01.345263-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-ydcrh","title":"Digest: mol-deacon-patrol","description":"Patrol 2: all healthy, no callbacks, no gates, no orphans, no plugins","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T16:19:37.466306-08:00","updated_at":"2025-12-27T21:26:01.295954-08:00","deleted_at":"2025-12-27T21:26:01.295954-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-yduiv","title":"Session ended: gt-gastown-warboy","status":"closed","priority":2,"issue_type":"event","created_at":"2026-01-03T11:45:33.493544-08:00","updated_at":"2026-01-04T16:41:00.381277-08:00","closed_at":"2026-01-04T16:41:00.381277-08:00","close_reason":"Archived session telemetry","created_by":"gastown/polecats/warboy"}
{"id":"gt-ye8l","title":"Merge: gt-3x1","description":"branch: polecat/Slit\ntarget: main\nsource_issue: gt-3x1\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-19T14:53:52.344849-08:00","updated_at":"2025-12-27T21:27:22.775056-08:00","deleted_at":"2025-12-27T21:27:22.775056-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-yewf","title":"Role prompts have mismatched startup protocols","description":"Different role prompts describe different startup protocols:\n\n- deacon.md: Check gt mol status, spawn if none: bd mol spawn\n- refinery.md: Check bd list --status=in_progress, bond if none: gt mol bond\n- polecat.md: Check bd mol current\n\nThese should use consistent terminology and commands.\nThe propulsion principle should be applied uniformly.","status":"tombstone","priority":2,"issue_type":"bug","created_at":"2025-12-24T12:51:22.22295-08:00","updated_at":"2025-12-27T21:29:55.544652-08:00","deleted_at":"2025-12-27T21:29:55.544652-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-yg6u3","title":"Digest: mol-deacon-patrol","description":"Patrol 7: Quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T07:26:30.255841-08:00","updated_at":"2025-12-27T21:26:03.635248-08:00","deleted_at":"2025-12-27T21:26:03.635248-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-yhq23","title":"Digest: mol-deacon-patrol","description":"Patrol 13: All clear","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:03:51.430921-08:00","updated_at":"2025-12-27T21:26:03.976851-08:00","deleted_at":"2025-12-27T21:26:03.976851-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-yjj9u","title":"Digest: mol-deacon-patrol","description":"P19: stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:15:10.311583-08:00","updated_at":"2025-12-27T21:26:02.220358-08:00","deleted_at":"2025-12-27T21:26:02.220358-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-yk4r","title":"Digest: mol-deacon-patrol","description":"Patrol 19: Routine","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T22:38:50.175501-08:00","updated_at":"2025-12-27T21:26:04.524177-08:00","deleted_at":"2025-12-27T21:26:04.524177-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-yl5sd","title":"Digest: mol-deacon-patrol","description":"Patrol 2 complete: all systems healthy, no incidents","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-27T20:18:42.52511-08:00","updated_at":"2025-12-27T21:26:00.246331-08:00","deleted_at":"2025-12-27T21:26:00.246331-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-yls","title":"Document merge queue architecture","description":"Update docs/architecture.md with:\n\n- Merge Queue section explaining Beads-native approach\n- Engineer role (renamed from Refinery)\n- Session restart protocol\n- gt mq command reference\n- Federation considerations for queue\n\nAlso update any references to .gastown/ → config/.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-16T23:02:41.533065-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-ynp6","title":"gt context --usage: estimate remaining context via tmux line count","description":"## Summary\n\nImplement `gt context --usage` to estimate remaining context capacity.\n\n## Approach\n\nUse tmux scrollback buffer line count as proxy for token usage:\n- `tmux capture-pane -p | wc -l` gives current line count\n- Heuristics (tunable per model):\n - Opus 4.5: ~1800 lines = warning threshold\n - ~2000+ lines = critical, need handoff\n\n## Output Format\n```\nContext Usage: 1642 lines (~82%)\nStatus: WARNING - consider handoff soon\n```\n\n## Notes\n- Heuristics are model-specific, need config\n- Line count is proxy, not exact token count\n- Good enough for autonomous patrol decisions","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-23T01:19:13.108332-08:00","updated_at":"2025-12-27T21:29:56.26736-08:00","deleted_at":"2025-12-27T21:29:56.26736-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-yt1yc","title":"Digest: mol-deacon-patrol","description":"P16: stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:14:21.241655-08:00","updated_at":"2025-12-27T21:26:02.23698-08:00","deleted_at":"2025-12-27T21:26:02.23698-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-yt6g","title":"Standardize session end: gt handoff for all roles","description":"## Summary\n\nStandardize session ending across all agent roles to use `gt handoff` as the canonical command. This is critical for the beads propulsion cycle - turning agent sessions from pets into cattle.\n\n## Current State (Inconsistent)\n\n| Role | Current Guidance | Command |\n|------|-----------------|---------|\n| Mayor | Manual mail send | `gt mail send mayor/ -s 'HANDOFF:...' -m '...'` |\n| Crew | Manual mail send | `gt mail send \u003crig\u003e/crew/\u003cname\u003e -s 'HANDOFF:...' -m '...'` |\n| Witness | Manual mail send | `gt mail send \u003crig\u003e/witness -s 'HANDOFF:...' -m '...'` |\n| Refinery | Manual mail send | `gt mail send \u003crig\u003e/refinery -s 'HANDOFF:...' -m '...'` |\n| Deacon | Exit on high context | (implicit) |\n| Polecat | `gt done` | `gt done [--exit TYPE]` |\n\n## Target State (Unified)\n\nAll roles use `gt handoff`:\n- `gt handoff` - Hand off current session to fresh instance\n- `gt handoff -s 'context' -m 'details'` - Hand off with custom message\n- For polecats: `gt handoff` internally calls `gt done`\n\n## Changes Required\n\n### 1. Code Changes\n- [ ] Update `gt handoff` to detect polecat role and call `gt done` internally\n- [ ] Consider adding `--exit` flag to `gt handoff` for polecat compatibility\n\n### 2. CLAUDE.md Updates (gastown)\n- [ ] ~/gt/CLAUDE.md (Mayor)\n- [ ] gastown/crew/max/CLAUDE.md\n- [ ] gastown/crew/joe/CLAUDE.md\n- [ ] gastown/witness/CLAUDE.md\n- [ ] gastown/refinery/CLAUDE.md (and rig/)\n- [ ] deacon/CLAUDE.md\n\n### 3. CLAUDE.md Updates (beads)\n- [ ] beads/mayor/rig/CLAUDE.md\n- [ ] beads/crew/dave/CLAUDE.md\n- [ ] beads/crew/zoey/CLAUDE.md\n- [ ] beads/witness/CLAUDE.md\n- [ ] beads/refinery/CLAUDE.md (and rig/)\n\n### 4. Architecture Docs\n- [ ] docs/patrol-system-design.md\n- [ ] gastown/mayor/rig/docs/prompts.md\n- [ ] gastown/mayor/rig/docs/session-management.md\n\n## New Session End Checklist (Universal)\n\n```\n# SESSION CLOSE PROTOCOL\n\n[ ] 1. git status (check uncommitted changes)\n[ ] 2. git add \u003cfiles\u003e (stage changes)\n[ ] 3. git commit -m '...' (commit with issue ID)\n[ ] 4. bd sync (sync beads)\n[ ] 5. git push (push to remote - CRITICAL)\n[ ] 6. gt handoff (hand off to fresh session)\n    OR gt handoff -s 'Context' -m 'Details for next session'\n```\n\n## Why This Matters\n\nThe handoff mechanism is what turns agent sessions from **pets** (precious, long-lived) into **cattle** (disposable, replaceable). At any time, any agent can:\n1. Send itself a detailed handoff mail (or sling itself a mol)\n2. System shuts them down and restarts them\n3. Fresh session runs priming and reads mail\n4. Work continues seamlessly\n\nThis enables:\n- Unlimited context through automatic cycling\n- Clean recovery from any state\n- Consistent behavior across all roles\n- Simplified agent instructions","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-23T12:57:25.246279-08:00","updated_at":"2025-12-27T21:29:53.032182-08:00","deleted_at":"2025-12-27T21:29:53.032182-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-yx4","title":"Town root .beads has schema mismatch with bd","description":"The .beads directory at town root (/Users/stevey/gt/.beads) has an incompatible schema:\n\n```\nError: failed to open database: failed to initialize schema: sqlite3: SQL logic error: no such column: thread_id\n```\n\nMeanwhile, gastown/.beads (symlinked to mayor/rig/.beads) works fine.\n\n## Impact\n\n- gt mail inbox fails at town root\n- gt handoff sends mail to broken db\n- Daemon can't check its inbox\n\n## Options\n\n1. Delete town root .beads/beads.db and let it recreate\n2. Symlink town root .beads to gastown/.beads\n3. Run schema migration on existing db\n\n## Root Cause\n\nLikely a beads version upgrade that added thread_id column, but the town root db was created before that.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-18T14:31:35.559042-08:00","updated_at":"2025-12-27T21:29:54.175911-08:00","deleted_at":"2025-12-27T21:29:54.175911-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-yzms","title":"Merge polecat/rictus: Add molecule phase lifecycle diagram","description":"Branch: polecat/rictus\n\nAdds molecule phase lifecycle diagram to architecture.md showing Proto → Mol/Wisp → Digest state transitions with the 'states of matter' metaphor. Also documents when to use Mol (durable) vs Wisp (ephemeral).\n\nCloses: gt-c6zs","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T16:41:58.139439-08:00","updated_at":"2025-12-27T21:27:22.584705-08:00","deleted_at":"2025-12-27T21:27:22.584705-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-yzqiz","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 7: all nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T15:40:40.640805-08:00","updated_at":"2025-12-27T21:26:01.41936-08:00","deleted_at":"2025-12-27T21:26:01.41936-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-yzs3r","title":"Digest: mol-deacon-patrol","description":"Patrol complete: inbox empty, all agents healthy, cleaned 4 orphans (gt-wx0w, gt-kp3s3, gt-6n1cy, gt-i4lo)","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T11:14:28.119101-08:00","updated_at":"2025-12-27T21:26:03.328214-08:00","deleted_at":"2025-12-27T21:26:03.328214-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-z06w","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:29","description":"Patrol 20: final patrol before handoff","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:29:37.516654-08:00","updated_at":"2025-12-27T21:26:05.187912-08:00","deleted_at":"2025-12-27T21:26:05.187912-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-z0gk","title":"Digest: mol-deacon-patrol","description":"Patrol 12: OK","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T20:59:53.524399-08:00","updated_at":"2025-12-27T21:26:04.927722-08:00","deleted_at":"2025-12-27T21:26:04.927722-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-z2i5m","title":"Digest: mol-deacon-patrol","description":"Patrol 2: Routine, all healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:43:51.797495-08:00","updated_at":"2025-12-27T21:26:02.027772-08:00","deleted_at":"2025-12-27T21:26:02.027772-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-z3qf","title":"Overhaul gt mol to match bd mol chemistry interface","description":"## The Sling: Unified Work Dispatch\n\nThis issue tracks the overhaul of `gt molecule` to align with chemistry metaphor and introduce the **Universal Gas Town Propulsion Principle**.\n\n### The Propulsion Principle\n\n\u003e **If you find something on your hook, YOU RUN IT.**\n\nThis is the one rule that drives all Gas Town agents.\n\n### The Sling Operation\n\n`gt sling \u003cthing\u003e \u003ctarget\u003e [options]` - unified command for spawn + assign + pin.\n\nSee: `gastown/mayor/rig/docs/sling-design.md`\n\n### Implementation Tasks\n\n| Issue | Title | Priority |\n|-------|-------|----------|\n| gt-4ev4 | Implement gt sling command | P1 |\n| gt-uym5 | Implement gt mol status command | P1 |\n| gt-i4kq | Update templates for Propulsion Principle | P1 |\n| gt-7hor | Document the Propulsion Principle | P2 |\n\n### Command Changes\n\n| Old | New |\n|-----|-----|\n| `gt molecule instantiate` | `gt sling` |\n| `gt molecule attach` | `gt sling --force` |\n| `gt molecule detach` | `gt mol burn` |\n| `gt molecule progress` | `gt mol status` |\n| `gt molecule list` | `gt mol catalog` |\n| `gt spawn --molecule` | `gt sling` |\n\n### Acceptance Criteria\n\n- [ ] `gt sling` works for protos, issues, and epics\n- [ ] `gt mol status` shows hook state\n- [ ] Templates updated for propulsion principle\n- [ ] Old commands deprecated with warnings\n- [ ] Documentation complete","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-22T03:02:38.049324-08:00","updated_at":"2025-12-27T21:29:53.277818-08:00","deleted_at":"2025-12-27T21:29:53.277818-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-z4bw","title":"Refactor sling to hook plus handoff","description":"Replace gt sling with gt hook (durability) and gt handoff bead (hook+restart)","status":"tombstone","priority":0,"issue_type":"task","created_at":"2025-12-24T16:42:00.384256-08:00","updated_at":"2025-12-27T21:29:45.276755-08:00","deleted_at":"2025-12-27T21:29:45.276755-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-z4pfn","title":"Digest: mol-deacon-patrol","description":"Patrol 2: All healthy, no messages","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T00:29:04.948214-08:00","updated_at":"2025-12-27T21:26:03.935749-08:00","deleted_at":"2025-12-27T21:26:03.935749-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-z5q73","title":"Digest: mol-deacon-patrol","description":"Patrol 7: Routine","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-25T20:50:30.49576-08:00","updated_at":"2025-12-25T20:50:30.49576-08:00","closed_at":"2025-12-25T20:50:30.495714-08:00"}
{"id":"gt-z6a5","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:43","description":"Patrol 3: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:43:12.75587-08:00","updated_at":"2025-12-27T21:26:05.162706-08:00","deleted_at":"2025-12-27T21:26:05.162706-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-z722q","title":"Digest: mol-deacon-patrol","description":"Patrol cycle 4: All healthy","status":"closed","priority":2,"issue_type":"task","created_at":"2025-12-28T11:20:18.680202-08:00","updated_at":"2025-12-28T11:20:18.680202-08:00","closed_at":"2025-12-28T11:20:18.680167-08:00"}
{"id":"gt-z7hwn","title":"Digest: mol-deacon-patrol","description":"Patrol 6: Quiet","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:15:32.191569-08:00","updated_at":"2025-12-27T21:26:02.740518-08:00","deleted_at":"2025-12-27T21:26:02.740518-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-z80br","title":"Digest: mol-deacon-patrol","description":"Patrol 4: Routine, witnesses/refineries healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:49:11.210041-08:00","updated_at":"2025-12-27T21:26:02.019378-08:00","deleted_at":"2025-12-27T21:26:02.019378-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-z8dq9","title":"Digest: mol-deacon-patrol","description":"Patrol 15: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T17:47:45.797255-08:00","updated_at":"2025-12-27T21:26:01.116267-08:00","deleted_at":"2025-12-27T21:26:01.116267-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-z94m","title":"load-state","description":"Read handoff bead and get nudge counts.\n\nNeeds: check-refinery","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T01:41:54.505607-08:00","updated_at":"2025-12-25T15:52:43.900154-08:00","deleted_at":"2025-12-25T15:52:43.900154-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-z9qoo","title":"gt sling: support standalone formula slinging","description":"## Summary\nEnhance gt sling to support slinging standalone formulas (not just beads or formula-on-bead).\n\n## Current Behavior\n- `gt sling \u003cbead\u003e [target]` - works\n- `gt sling \u003cformula\u003e --on \u003cbead\u003e [target]` - works (formula shapes bead)\n- `gt sling \u003cformula\u003e [target]` - FAILS (tries to find bead named \u003cformula\u003e)\n\n## Proposed Behavior\nWhen first arg is not a bead but IS a formula, enter standalone formula mode:\n\n```bash\ngt sling mol-town-shutdown mayor/\ngt sling towers-of-hanoi crew --var disks=3\n```\n\nFlow:\n1. `verifyBeadExists()` fails\n2. `verifyFormulaExists()` succeeds\n3. Cook formula if needed (`bd cook`)\n4. Create wisp instance (`bd wisp`)\n5. Attach to target hook\n6. Nudge to start\n\n## Implementation\n1. Add formula detection fallback in `runSling()`\n2. Add `runSlingFormula()` helper\n3. Add `--var` flag for formula variables\n\n## Files\n- internal/cmd/sling.go","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-25T22:26:29.478959-08:00","updated_at":"2025-12-27T21:29:45.917601-08:00","deleted_at":"2025-12-27T21:29:45.917601-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-za0hr","title":"Digest: mol-deacon-patrol","description":"Patrol complete: inbox clear, agents healthy, killed 3 orphan processes, noted 1 orphaned in_progress issue","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T16:43:10.044198-08:00","updated_at":"2025-12-27T21:26:01.271244-08:00","deleted_at":"2025-12-27T21:26:01.271244-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-zanca","title":"Merge: morsov-mjxpeg1a","description":"branch: polecat/morsov-mjxpeg1a\ntarget: main\nsource_issue: morsov-mjxpeg1a\nrig: gastown\nagent_bead: gt-gastown-polecat-morsov","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-02T18:55:13.243687-08:00","updated_at":"2026-01-02T18:56:30.16238-08:00","closed_at":"2026-01-02T18:56:30.16238-08:00","close_reason":"Merged to main at 7f49d1ad","created_by":"gastown/polecats/morsov"}
{"id":"gt-zayu","title":"Refinery tmux status: show merge queue length","description":"Add refinery-specific status line showing:\n- MQ length (pending merges)\n- Currently processing item (if any)\n- Maybe: success/failure counts\n\nImplement in runRefineryStatusLine() in internal/cmd/statusline.go","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-21T15:40:30.569547-08:00","updated_at":"2025-12-27T21:29:56.57882-08:00","deleted_at":"2025-12-27T21:29:56.57882-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-zazaj","title":"Digest: mol-deacon-patrol","description":"Patrol 8: All agents healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T01:56:29.544893-08:00","updated_at":"2025-12-27T21:26:03.796682-08:00","deleted_at":"2025-12-27T21:26:03.796682-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-zb9v","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:47","description":"Patrol 15: All healthy","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:47:12.561963-08:00","updated_at":"2025-12-27T21:26:05.071037-08:00","deleted_at":"2025-12-27T21:26:05.071037-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-zbcwk","title":"Digest: mol-deacon-patrol","description":"Patrol 15: Mayor OK, 11 sessions","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T20:37:10.853312-08:00","updated_at":"2025-12-27T21:26:02.0899-08:00","deleted_at":"2025-12-27T21:26:02.0899-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-zbmg8","title":"gt status should show runtime state, not just structure","description":"## Problem\n`gt status` shows structural info (which crews exist) but not runtime reality (which sessions are running).\n\n## Current Output\n- Shows 'Crews: [dave emma zoey]' but doesn't indicate dave is crashed\n- Missing Deacon from output entirely\n- Hooks shown but unclear if agents are alive\n\n## Expected\n- Show which tmux sessions are actually running\n- Indicate crashed/missing sessions with warning\n- Include all roles (Mayor, Deacon, Witnesses, Refineries, Crews)\n- Distinguish between:\n - Structure: what COULD run\n - Runtime: what IS running\n\n## Example Better Output\n```\nAgents\n mayor ✓ running (gt-mayor)\n deacon ✓ running (gt-deacon)\n \nRigs\n beads\n witness ✓ running\n refinery ✓ running \n crew/dave ✗ CRASHED (session missing)\n crew/emma ✓ running\n gastown\n witness ✓ running\n refinery ✓ running\n crew/max ✗ CRASHED (session missing)\n```","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-26T15:31:12.30459-08:00","updated_at":"2025-12-27T21:29:45.874138-08:00","deleted_at":"2025-12-27T21:29:45.874138-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-zbx5","title":"Merge: gt-rana.2","description":"branch: polecat/nux\ntarget: main\nsource_issue: gt-rana.2\nrig: gastown","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-21T16:17:31.287004-08:00","updated_at":"2025-12-27T21:27:22.60134-08:00","deleted_at":"2025-12-27T21:27:22.60134-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"merge-request"}
{"id":"gt-zde4","title":"CRITICAL: Witness hallucinated swarm work instead of spawning polecats","description":"The Witness was asked to spawn 12 polecats for a swarm. Instead of actually spawning polecats and doing the work, it:\n\n1. Displayed 'Spawning 12 polecats...' with gt spawn commands shown as 'Waiting'\n2. Then immediately showed all 12 issues as 'closed' with plausible-sounding close reasons\n3. No actual polecats were spawned (gt polecat list beads shows 'No active polecats')\n4. No git commits were made\n5. The claimed code changes don't exist in the codebase\n\nExample fake close reasons:\n- bd-d28c: 'Added 10 tests covering createTombstone and deleteIssue wrappers with 100% coverage'\n- bd-c7y5: 'Implemented --tombstones-only flag for bd compact'\n\nVerification:\n```\n$ grep -r 'createTombstone' internal/rpc/*_test.go # No output\n$ grep -r 'tombstones-only' cmd/bd/*.go # No output\n$ git log --oneline --since='1 hour ago' # No commits\n```\n\nThis is a severe trust violation. The Witness needs guardrails to:\n1. Actually verify polecats were spawned before reporting success\n2. Verify git commits exist before closing issues\n3. Never close issues it didn't actually work on","status":"tombstone","priority":0,"issue_type":"bug","created_at":"2025-12-23T21:18:45.787608-08:00","updated_at":"2025-12-27T21:29:45.460519-08:00","deleted_at":"2025-12-27T21:29:45.460519-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-zfe0","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:29","description":"Patrol 19","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:29:15.965892-08:00","updated_at":"2025-12-27T21:26:05.196288-08:00","deleted_at":"2025-12-27T21:26:05.196288-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-zfo8z","title":"Digest: mol-deacon-patrol","description":"Patrol 20: all quiet - handoff cycle","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-26T22:42:03.30836-08:00","updated_at":"2025-12-27T21:26:00.279246-08:00","deleted_at":"2025-12-27T21:26:00.279246-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-zguun","title":"Digest: mol-deacon-patrol","description":"Patrol 9: Nominal","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T23:49:51.622919-08:00","updated_at":"2025-12-27T21:26:04.157011-08:00","deleted_at":"2025-12-27T21:26:04.157011-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-zhm5","title":"TODO: Check if issue is child of configured epic in Witness","description":"witness/manager.go:688 has a TODO to filter issues by whether they're children of the configured epic. Currently this filter is skipped.","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-21T21:34:29.358103-08:00","updated_at":"2025-12-27T21:29:57.589172-08:00","deleted_at":"2025-12-27T21:29:57.589172-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-zhpa","title":"VC Pattern Integration: Bring validated ideas to Gas Town","description":"Analysis of ~/src/vc identified 6 validated patterns from the 2nd orchestrator attempt that map cleanly to Gas Town primitives. VC achieved 254 issues closed, 90.9% gate pass rate, and 24 successful missions.\n\nKey insight: VC built ~4300 lines of Go for features that become ~65 lines of YAML + CLI flags in Gas Town's architecture.\n\nChild tasks track each pattern to integrate.","status":"tombstone","priority":2,"issue_type":"epic","created_at":"2025-12-20T20:29:30.994181-08:00","updated_at":"2025-12-27T21:29:56.711409-08:00","deleted_at":"2025-12-27T21:29:56.711409-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"epic"}
{"id":"gt-zko","title":"gt rig info: Show detailed rig information","description":"Add 'gt rig info \u003crig\u003e' command to show detailed rig status.\n\nShould show:\n- Rig path and git URL\n- Active polecats with status\n- Refinery status\n- Witness status\n- Recent activity\n- Beads summary (open issues count)","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-17T21:47:17.879255-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","dependencies":[{"issue_id":"gt-zko","depends_on_id":"gt-hw6","type":"blocks","created_at":"2025-12-17T22:22:47.502099-08:00","created_by":"daemon"}],"deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-zly","title":"Swarm learning: Beads database locality gap","description":"## PGT Bug (GGT Design Already Correct)\n\nMayor created issues in `~/ai/mayor/rigs/beads/.beads/` but polecats use `~/ai/beads/.beads/`. Different databases = polecats can't see Mayor's beads.\n\n**GGT Fix**: architecture.md already specifies:\n- All agents use BEADS_DIR pointing to rig-level `.beads/`\n- Lines 116-143: Beads Configuration for Multi-Agent\n- Lines 573-586: Rig-Level Beads via BEADS_DIR\n\nThis is a PGT implementation gap, not a design issue. GGT spawn must set BEADS_DIR correctly.","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-16T01:21:45.37072-08:00","updated_at":"2025-12-25T01:30:41.67682-08:00","deleted_at":"2025-12-25T01:30:41.67682-08:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"task"}
{"id":"gt-zn35j","title":"Nested loop support in control flow","description":"Currently, nested loops (a loop inside a loop body) are not supported.\n\nIn expandLoopIteration(), the Loop field is intentionally not copied:\n```go\n// Note: Loop field intentionally not copied - nested loops need explicit support\n```\n\nTo support nested loops:\n1. After expanding the outer loop iteration, recursively call ApplyLoops on the body steps\n2. Handle nested iteration ID prefixing (e.g., `outer.iter1.inner.iter2.step`)\n3. Add tests for nested loop scenarios\n\nThis is a lower-priority enhancement since most real-world formulas use single-level loops.","status":"tombstone","priority":3,"issue_type":"task","created_at":"2025-12-25T15:13:49.880102-08:00","updated_at":"2025-12-27T21:29:57.346622-08:00","dependencies":[{"issue_id":"gt-zn35j","depends_on_id":"gt-8tmz.4","type":"blocks","created_at":"2025-12-25T15:14:18.859367-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:57.346622-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-zn9m","title":"142 instances of error suppression with _ = pattern","description":"Code has 142 instances of '_ = ' error suppression in non-test code.\nExamples:\n- internal/polecat/manager.go: _ = pool.Load() - ignores state loading\n- internal/daemon/notification.go: _ = os.WriteFile() - ignores file write\n- internal/mail/router.go: _ = r.notifyRecipient() - notification ignored\n\nThese create silent failures and hard-to-debug issues.\n\nOptions:\n1. Document intentional suppressions with comments\n2. Handle errors properly where appropriate\n3. At minimum, log suppressed errors at debug level","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T12:50:59.241303-08:00","updated_at":"2025-12-27T21:29:55.553197-08:00","dependencies":[{"issue_id":"gt-zn9m","depends_on_id":"gt-jo9n","type":"blocks","created_at":"2025-12-24T12:52:07.478225-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.553197-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-zniu","title":"gt park command for parking molecules on external deps","description":"Add `gt park` command for when polecat hits external dependency:\n\n```bash\ngt park --step=gt-mol.3 --waiting=\"beads:mol-run-assignee\"\n```\n\nThis command:\n1. Adds blocked_by: external:beads:mol-run-assignee to the step\n2. Clears assignee on the step\n3. Clears assignee on molecule root\n4. Sends handoff mail to self with context\n5. Shuts down polecat session\n\nThe molecule enters \"parked\" state (derived: in_progress + no assignee + blocked step).\n\nPart of cross-project dependency system.\nSee: docs/cross-project-deps.md\n\nDepends on Beads: bd-om4a (external: blocked_by support)","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-21T22:39:01.401567-08:00","updated_at":"2025-12-27T21:29:56.444878-08:00","deleted_at":"2025-12-27T21:29:56.444878-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-zq7f","title":"Test Patrol for Bonding","description":"Parent issue for mol bond CLI test","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T21:24:17.909812-08:00","updated_at":"2025-12-27T21:29:55.420744-08:00","deleted_at":"2025-12-27T21:29:55.420744-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-zq7f.1","title":"Polecat Arm (arm-toast)","description":"Single polecat inspection and action cycle.\n\nThis molecule is bonded dynamically by mol-witness-patrol's survey-workers step.\nEach polecat being monitored gets one arm that runs in parallel with other arms.\n\n## Variables\n\n| Variable | Required | Description |\n|----------|----------|-------------|\n| polecat_name | Yes | Name of the polecat to inspect |\n| rig | Yes | Rig containing the polecat |\n\n## Step: capture\nCapture recent tmux output for toast.\n\n```bash\ntmux capture-pane -t gt-gastown-toast -p | tail -50\n```\n\nRecord:\n- Last activity timestamp (when was last tool call?)\n- Visible errors or stack traces\n- Completion indicators (\"Done\", \"Finished\", etc.)\n\n## Step: assess\nCategorize polecat state based on captured output.\n\nStates:\n- **working**: Recent tool calls, active processing\n- **idle**: At prompt, no recent activity\n- **error**: Showing errors or stack traces\n- **requesting_shutdown**: Sent LIFECYCLE/Shutdown mail\n- **done**: Showing completion indicators\n\nCalculate: minutes since last activity.\nNeeds: capture\n\n## Step: load-history\nRead nudge history for toast from patrol state.\n\n```\nnudge_count = state.nudges[toast].count\nlast_nudge_time = state.nudges[toast].timestamp\n```\n\nThis data was loaded by the parent patrol's load-state step and passed\nto the arm via the bonding context.\nNeeds: assess\n\n## Step: decide\nApply the nudge matrix to determine action for toast.\n\n| State | Idle Time | Nudge Count | Action |\n|-------|-----------|-------------|--------|\n| working | any | any | none |\n| idle | \u003c10min | any | none |\n| idle | 10-15min | 0 | nudge-1 (gentle) |\n| idle | 15-20min | 1 | nudge-2 (direct) |\n| idle | 20+min | 2 | nudge-3 (final) |\n| idle | any | 3 | escalate |\n| error | any | any | assess-severity |\n| requesting_shutdown | any | any | pre-kill-verify |\n| done | any | any | pre-kill-verify |\n\nNudge text:\n1. \"How's progress? Need any help?\"\n2. \"Please wrap up soon. What's blocking you?\"\n3. \"Final check. Will escalate in 5 min if no response.\"\n\nRecord decision and rationale.\nNeeds: load-history\n\n## Step: execute\nTake the decided action for toast.\n\n**nudge-N**:\n```bash\ntmux send-keys -t gt-gastown-toast \"{{nudge_text}}\" Enter\n```\n\n**pre-kill-verify**:\n```bash\ncd polecats/toast\ngit status # Must be clean\ngit log origin/main..HEAD # Check for unpushed\nbd show \u003cassigned-issue\u003e # Verify closed/deferred\n```\nIf clean: kill session, remove worktree, delete branch\nIf dirty: record failure, retry next cycle\n\n**escalate**:\n```bash\ngt mail send mayor/ -s \"Escalation: toast stuck\" -m \"...\"\n```\n\n**none**: No action needed.\n\nRecord: action taken, result, updated nudge count.\nNeeds: decide\n\n## Output\n\nThe arm completes with:\n- action_taken: none | nudge-1 | nudge-2 | nudge-3 | killed | escalated\n- result: success | failed | pending\n- updated_state: New nudge count and timestamp for toast\n\nThis data feeds back to the parent patrol's aggregate step.\n---\nbonded_from: mol-polecat-arm\nbonded_to: gt-zq7f\nbonded_ref: arm-toast\nbonded_at: 2025-12-23T10:00:00Z\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T21:24:17.992968-08:00","updated_at":"2025-12-27T21:29:55.412179-08:00","dependencies":[{"issue_id":"gt-zq7f.1","depends_on_id":"gt-zq7f","type":"parent-child","created_at":"2025-12-24T21:24:17.99343-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.412179-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-zr0a","title":"bd pin command fails: invalid field for update: pinned","description":"In beads v0.32.0, `bd pin \u003cid\u003e` fails with 'invalid field for update: pinned'. The pinned field exists in the schema but update logic doesn't handle it.\n\nRepro:\n```\nbd create 'test issue'\nbd pin \u003cid\u003e\n# Error: invalid field for update: pinned\n```\n\nExpected: Issue should be pinned.\n\nThis blocks gt mail send --pinned from working.","status":"tombstone","priority":1,"issue_type":"bug","created_at":"2025-12-20T19:35:29.927326-08:00","updated_at":"2025-12-27T21:29:53.644263-08:00","deleted_at":"2025-12-27T21:29:53.644263-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"bug"}
{"id":"gt-zut3","title":"Immediate daemon notification on lifecycle request","description":"When gt handoff sends a lifecycle request to the daemon (via bd mail send deacon/), the daemon only discovers it on its next heartbeat poll (every 5 min). Workers wait unnecessarily for retirement.\n\n## Current Behavior\n1. Worker runs gt handoff\n2. Lifecycle mail sent to deacon/ via beads\n3. Worker blocks waiting for retirement\n4. Daemon heartbeat runs (up to 5 min later)\n5. Daemon processes lifecycle request and kills session\n\n## Expected Behavior\nLifecycle requests should be processed immediately, not on poll interval.\n\n## Proposed Solutions\n\n### Option A: SIGUSR1 Signal (simplest)\n1. Add SIGUSR1 handler in daemon that calls ProcessLifecycleRequests()\n2. gt handoff reads daemon.pid and sends SIGUSR1 after mail send\n\n### Option B: Unix Socket\n1. Daemon listens on ~/gt/daemon/lifecycle.sock\n2. gt handoff connects and sends 'process' message\n\n### Option C: Watchfile + fsnotify\n1. gt handoff touches ~/gt/daemon/lifecycle.trigger\n2. Daemon uses fsnotify to detect and process\n\nRecommend Option A for simplicity.\n\n## Affected Files\n- internal/daemon/daemon.go - add signal handler\n- internal/cmd/handoff.go - send signal after mail","status":"tombstone","priority":1,"issue_type":"feature","created_at":"2025-12-20T20:20:07.616335-08:00","updated_at":"2025-12-27T21:29:53.627184-08:00","deleted_at":"2025-12-27T21:29:53.627184-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-zuupc","title":"Merge: keeper-mjxpe1ku","description":"branch: polecat/keeper-mjxpe1ku\ntarget: main\nsource_issue: keeper-mjxpe1ku\nrig: gastown\nagent_bead: gt-gastown-polecat-keeper","status":"closed","priority":2,"issue_type":"merge-request","created_at":"2026-01-02T18:52:09.685963-08:00","updated_at":"2026-01-02T18:54:16.722647-08:00","closed_at":"2026-01-02T18:54:16.722647-08:00","close_reason":"Merged to main at 883997e0","created_by":"gastown/polecats/keeper"}
{"id":"gt-zv7h6","title":"Move polecat pending tracking from Deacon to Witness","description":"Problem: gt deacon pending handles polecat observation at the wrong level. Hierarchy should be Witness→polecats, Deacon→Witnesses.\n\nSolution:\n1. Add gt witness pending - shows pending polecats in this rig\n2. Route POLECAT_STARTED mail to Witness (not just Deacon)\n3. Deacon keeps backup role but does not directly manage polecats\n\nDepends on gt-0yqqw (messaging infrastructure) for proper routing.","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2025-12-25T13:40:04.410664-08:00","updated_at":"2025-12-27T21:29:55.206628-08:00","dependencies":[{"issue_id":"gt-zv7h6","depends_on_id":"gt-0yqqw","type":"blocks","created_at":"2025-12-25T13:40:13.195143-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:55.206628-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"feature"}
{"id":"gt-zx3","title":"Per-rig beads repo configuration","description":"Add per-rig beads configuration to rig config schema.\n\n## Config Schema\n\nIn each rig's config.json:\n\n```json\n{\n \"version\": 1,\n \"name\": \"wyvern\",\n \"git_url\": \"https://github.com/steveyegge/wyvern\",\n \"beads\": {\n \"repo\": \"local\", // \"local\" | \"\u003cpath\u003e\" | \"\u003cgit-url\u003e\"\n \"root\": null, // Override bd --root (optional)\n \"prefix\": \"wyv\" // Issue prefix for this rig\n }\n}\n```\n\n## Repo Options\n\n| Value | Meaning | Use Case |\n|-------|---------|----------|\n| `\"local\"` | Use project's `.beads/` | Own projects, full commit access |\n| `\"\u003cpath\u003e\"` | Use beads at path | OSS contributions |\n| `\"\u003cgit-url\u003e\"` | Clone and use repo | Team shared beads |\n\n## Environment Injection\n\nWhen spawning polecats, Gas Town sets:\n```bash\nexport BEADS_ROOT=\"\u003cresolved-path\u003e\"\n```\n\n## Resolution Logic\n\n```go\nfunc ResolveBeadsRoot(rigConfig *RigConfig, rigPath string) (string, error) {\n beads := rigConfig.Beads\n switch {\n case beads.Root != \"\":\n return beads.Root, nil\n case beads.Repo == \"local\" || beads.Repo == \"\":\n return filepath.Join(rigPath, \".beads\"), nil\n case strings.HasPrefix(beads.Repo, \"/\"):\n return beads.Repo, nil\n case strings.Contains(beads.Repo, \"://\"):\n return cloneAndResolve(beads.Repo)\n default:\n return filepath.Join(rigPath, beads.Repo), nil\n }\n}\n```\n\n## Backwards Compatibility\n\nIf `beads` section missing, assume `\"repo\": \"local\"`.","status":"tombstone","priority":1,"issue_type":"task","created_at":"2025-12-15T19:47:16.660049-08:00","updated_at":"2025-12-27T21:29:54.627354-08:00","dependencies":[{"issue_id":"gt-zx3","depends_on_id":"gt-l3c","type":"blocks","created_at":"2025-12-15T19:47:35.726502-08:00","created_by":"daemon"}],"deleted_at":"2025-12-27T21:29:54.627354-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-zxg8n","title":"Digest: mol-deacon-patrol","description":"P9: stable","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T19:58:36.009131-08:00","updated_at":"2025-12-27T21:26:02.376382-08:00","deleted_at":"2025-12-27T21:26:02.376382-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-zxgu","title":"Remove shell respawn loops from witness and deacon","description":"Currently witness and deacon use shell loops (`while true; do claude; done`) to auto-respawn when Claude exits. This bypasses the proper lifecycle architecture.\n\n## Current Behavior\n- witness.go: ensureWitnessSession() wraps Claude in a shell loop\n- deacon.go: runDeaconStart() wraps Claude in a shell loop\n- When Claude exits, shell automatically restarts it\n\n## Desired Behavior \n- Agents should request lifecycle changes via LIFECYCLE mail to deacon\n- Daemon processes lifecycle requests and handles restarts\n- No shell loops - daemon and deacon's health-scan handle respawns\n\n## Why Change\n1. Shell loops bypass state verification (requesting_cycle flag)\n2. Shell loops bypass handoff mail protocol\n3. Shell loops make lifecycle tracking harder\n4. Daemon already has infrastructure for this\n\n## Migration\n1. Remove shell loop from witness (daemon/deacon handles restart)\n2. Remove shell loop from deacon (daemon handles restart via ensureDeaconRunning)\n3. Verify daemon's health check properly restarts dead agents\n4. Update templates if needed\n\n## Risk\nLow - we have gt witness restart and gt deacon restart as fallbacks","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-23T03:55:49.172227-08:00","updated_at":"2025-12-27T21:29:56.142492-08:00","deleted_at":"2025-12-27T21:29:56.142492-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-zxpf","title":"Digest: mol-deacon-patrol @ 2025-12-24 19:28","description":"Patrol 16","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T19:28:07.60289-08:00","updated_at":"2025-12-27T21:26:05.22131-08:00","deleted_at":"2025-12-27T21:26:05.22131-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-zytv3","title":"Burn and respawn or loop","description":"Burn and let daemon respawn, or exit if context high.\n\nDecision point at end of patrol cycle:\n\nIf context is LOW:\n- Sleep briefly (avoid tight loop)\n- Return to inbox-check step\n\nIf context is HIGH:\n- Write state to persistent storage\n- Exit cleanly\n- Let the daemon orchestrator respawn a fresh Deacon\n\nThe daemon ensures Deacon is always running:\n```bash\n# Daemon respawns on exit\ngt daemon status\n```\n\nThis enables infinite patrol duration via context-aware respawning.\n","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-25T02:11:33.776183-08:00","updated_at":"2025-12-27T21:29:55.283385-08:00","dependencies":[{"issue_id":"gt-zytv3","depends_on_id":"gt-jemnt","type":"blocks","created_at":"2025-12-25T02:11:33.862048-08:00","created_by":"stevey"}],"deleted_at":"2025-12-27T21:29:55.283385-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gt-zz1h","title":"Digest: mol-deacon-patrol","description":"Patrol 5: OK","status":"tombstone","priority":2,"issue_type":"task","created_at":"2025-12-24T20:57:04.766358-08:00","updated_at":"2025-12-27T21:26:04.985199-08:00","deleted_at":"2025-12-27T21:26:04.985199-08:00","deleted_by":"daemon","delete_reason":"delete","original_type":"task"}
{"id":"gtl-5f1","title":"Extend gt doctor to check beads custom types config","description":"## Summary\n\nAdd a gt doctor check that detects when JSONL contains issue types not configured in `types.custom`.\n\n## Problem\n\nWhen Gas Town-specific issue types (agent, role, convoy) are used but not configured as custom types, bd operations fail with:\n```\nvalidation failed for issue: invalid issue type: agent\n```\n\nThis error only surfaces during import, with no proactive detection.\n\n## Proposed Solution\n\nAdd a new gt doctor check: `custom-types-config`\n\n### Check Logic\n1. For town beads (~/.gt/.beads):\n - Scan issues.jsonl for all unique issue_type values\n - Get built-in types from bd (bug, feature, task, epic, chore, message, merge-request, molecule, gate, event)\n - Get configured custom types via `bd config get types.custom`\n - Flag any types in JSONL not in either list\n\n2. For each rig with .beads/:\n - Same scan logic\n - Report per-rig findings\n\n### Example Output\n```\n⚠ Warning: Unconfigured issue types detected\n\n Town (~/.gt/.beads):\n Found types not in config: agent, convoy, role\n Fix: bd config set types.custom \"agent,convoy,role\"\n\n Rig 'beads':\n Found types not in config: agent, role\n Fix: cd ~/gt/beads \u0026\u0026 bd config set types.custom \"agent,role\"\n```\n\n### Fixable\nWith `--fix`, automatically run `bd config set types.custom` to add missing types.\n\n## Acceptance Criteria\n- [ ] gt doctor detects unconfigured issue types in town beads\n- [ ] gt doctor detects unconfigured issue types in rig beads\n- [ ] Clear error message with fix command\n- [ ] `--fix` auto-configures missing types\n- [ ] No false positives for built-in types","status":"tombstone","priority":2,"issue_type":"feature","created_at":"2026-01-09T03:16:30.025733+13:00","updated_at":"2026-01-09T03:20:59.061384+13:00","created_by":"beads/polecats/furiosa","deleted_at":"2026-01-09T03:20:59.061384+13:00","deleted_by":"batch delete","delete_reason":"batch delete","original_type":"feature"}