Fix bd-my64: Pre-push hook blocks instead of exports

The original pre-push hook tried to export DB → JSONL during the push,
then run 'git add', but this doesn't work because:

1. The commit is already created when pre-push runs
2. git add in pre-push stages files for a FUTURE commit
3. The current push sends the old commit with stale JSONL
4. Result: dirty git status after push

Fix:
- Pre-push now CHECKS for uncommitted JSONL changes
- If found, it FAILS the push with clear instructions
- User must commit JSONL before pushing
- This prevents stale JSONL from reaching remote

The pre-commit hook already flushes pending changes correctly; this
catch-all catches changes made BETWEEN commit and push before they
reach the remote.
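The failure sequence above can be reproduced in a scratch repo. A minimal sketch (the repo layout, file name, and commit message here are illustrative, not the real hook environment):

```shell
#!/bin/sh
# Reproduce the failure mode: by the time pre-push runs, the commit already
# exists, so 'git add' cannot change what the push sends.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
echo one > beads.jsonl
git add beads.jsonl
git -c user.email=demo@example.com -c user.name=demo commit -qm "stale JSONL"

echo two > beads.jsonl   # change made after the commit (what the old hook's export did)
git add beads.jsonl      # staging it, as the old hook did...
# ...does not alter the existing commit: HEAD still holds the old content,
# and the working tree is left dirty after the push.
git show HEAD:beads.jsonl   # prints "one", not "two"
git status --porcelain      # shows beads.jsonl still modified
```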

Amp-Thread-ID: https://ampcode.com/threads/T-39a89553-c301-4d4f-b39f-6df9c403d22b
Co-authored-by: Amp <amp@ampcode.com>
Steve Yegge
2025-11-06 18:57:34 -08:00
parent 2cb97e4d22
commit 3ba245e6c0
5 changed files with 76 additions and 40 deletions


@@ -243,7 +243,7 @@
{"id":"bd-lwnt","content_hash":"ddfa247870eb3734ffa7a4d0da6fcd4a359d2b48e02d70aad8560ec4bc13afdc","title":"Test P1 priority","description":"","status":"closed","priority":1,"issue_type":"task","created_at":"2025-11-05T12:58:38.074112-08:00","updated_at":"2025-11-05T12:58:44.711763-08:00","closed_at":"2025-11-05T12:58:44.711763-08:00","source_repo":"."}
{"id":"bd-mlcz","content_hash":"35434fed7e20e8b67f7b598c46ce5dc65c111de237ea38b36a88a671b023b58b","title":"Implement bd migrate command","description":"Add bd migrate command to move issues between repos with filtering. Should support: filtering by status/priority/labels, dry-run mode, preserving dependencies, handling source_repo field updates.","status":"closed","priority":1,"issue_type":"task","created_at":"2025-11-05T18:04:29.902151-08:00","updated_at":"2025-11-05T18:42:52.536951-08:00","closed_at":"2025-11-05T18:42:52.536951-08:00","source_repo":".","dependencies":[{"issue_id":"bd-mlcz","depends_on_id":"bd-8rd","type":"parent-child","created_at":"2025-11-05T18:04:39.072312-08:00","created_by":"daemon"}]}
{"id":"bd-mn9p","content_hash":"89752e08d5a278ed301c655753e07bc2b564e710ce6d9e7e535bbd9ec48af182","title":"bd-hv01: Brittle string comparison breaks with JSON field reordering","description":"## Problem\ndeletion_tracking.go:125 uses string comparison to detect unchanged issues:\n\n```go\nif leftLine, existsInLeft := leftIndex[id]; existsInLeft \u0026\u0026 leftLine == baseLine {\n deletions = append(deletions, id)\n}\n```\n\nThis breaks if:\n- JSON field order changes (legal in JSON)\n- Timestamps updated by import/export\n- Whitespace/formatting changes\n- Floating point precision varies\n\n## Example Failure\n```json\n// baseLine\n{\"id\":\"bd-1\",\"priority\":1,\"status\":\"open\"}\n// leftLine (same data, different order)\n{\"id\":\"bd-1\",\"status\":\"open\",\"priority\":1}\n```\nThese are semantically identical but string comparison fails.\n\n## Fix\nParse and compare JSON semantically:\n```go\nfunc jsonEquals(a, b string) bool {\n var objA, objB map[string]interface{}\n json.Unmarshal([]byte(a), \u0026objA)\n json.Unmarshal([]byte(b), \u0026objB)\n return reflect.DeepEqual(objA, objB)\n}\n```\n\n## Files Affected\n- cmd/bd/deletion_tracking.go:125\n- cmd/bd/deletion_tracking.go:134-170 (buildIDToLineMap)","status":"closed","priority":1,"issue_type":"bug","created_at":"2025-11-06T18:15:35.090716-08:00","updated_at":"2025-11-06T18:46:55.889888-08:00","closed_at":"2025-11-06T18:46:55.889888-08:00","source_repo":".","dependencies":[{"issue_id":"bd-mn9p","depends_on_id":"bd-rbxi","type":"parent-child","created_at":"2025-11-06T18:19:14.790898-08:00","created_by":"daemon"}]}
{"id":"bd-my64","content_hash":"953fe689901a31818f40afb6e158bb32816186de76a3364e647ca4ecf7372a98","title":"Pre-push hook and daemon export produce different JSONL","description":"After committing and pushing, git status shows .beads/beads.jsonl as dirty. Investigation shows:\n\n1. Pre-push hook ran successfully and exported DB → JSONL\n2. Push completed\n3. Shortly after, daemon exported DB → JSONL again with different content\n4. Diff shows comments added to old issues (bd-23a8, bd-6049, bd-87a0)\n\nTimeline:\n- Commit c731c45 \"Update beads JSONL\"\n- Pre-push hook exported JSONL\n- Push succeeded\n- Daemon PID 33314 exported again with different content\n\nQuestions:\n1. Did someone run a command between commit and daemon export?\n2. Is there a timing issue where pre-push hook doesn't capture all DB changes?\n3. Should pre-commit hook flush daemon changes before committing?\n\nThe comments appear to be from Nov 5 (created_at: 2025-11-05T08:38:46Z) but are only appearing in JSONL now. This suggests the DB had these comments but they weren't exported during pre-push.\n\nPossible causes:\n- Pre-push hook uses BEADS_NO_DAEMON=1 which might skip pending writes\n- Daemon has unflushed changes in memory\n- Race condition between pre-push export and daemon's periodic export","status":"open","priority":1,"issue_type":"bug","created_at":"2025-11-06T18:49:54.570993-08:00","updated_at":"2025-11-06T18:49:54.570993-08:00","source_repo":"."}
{"id":"bd-my64","content_hash":"5ce1e4594cd5eede3f8eb0dbc1c903d90fd46ae7805179688d9425472b711311","title":"Pre-push hook and daemon export produce different JSONL","description":"After committing and pushing, git status shows .beads/beads.jsonl as dirty. Investigation shows:\n\n1. Pre-push hook ran successfully and exported DB → JSONL\n2. Push completed\n3. Shortly after, daemon exported DB → JSONL again with different content\n4. Diff shows comments added to old issues (bd-23a8, bd-6049, bd-87a0)\n\nTimeline:\n- Commit c731c45 \"Update beads JSONL\"\n- Pre-push hook exported JSONL\n- Push succeeded\n- Daemon PID 33314 exported again with different content\n\nQuestions:\n1. Did someone run a command between commit and daemon export?\n2. Is there a timing issue where pre-push hook doesn't capture all DB changes?\n3. Should pre-commit hook flush daemon changes before committing?\n\nThe comments appear to be from Nov 5 (created_at: 2025-11-05T08:38:46Z) but are only appearing in JSONL now. This suggests the DB had these comments but they weren't exported during pre-push.\n\nPossible causes:\n- Pre-push hook uses BEADS_NO_DAEMON=1 which might skip pending writes\n- Daemon has unflushed changes in memory\n- Race condition between pre-push export and daemon's periodic export","status":"in_progress","priority":1,"issue_type":"bug","created_at":"2025-11-06T18:49:54.570993-08:00","updated_at":"2025-11-06T18:54:12.543848-08:00","source_repo":"."}
{"id":"bd-ng56","content_hash":"82c7d4d6f8a2fa73193de02e52de172f7325f38c02e4b9d397aec4833234ff5f","title":"bd-hv01: Three full JSONL reads on every sync (performance)","description":"Problem: computeAcceptedDeletions reads three JSONL files completely into memory (base, left, merged). For 1000 issues at 1KB each, this is 3MB read and 3000 JSON parse operations.\n\nImpact: Acceptable now (~20-35ms overhead) but will be slow for large repos (10k+ issues).\n\nPossible optimizations: single-pass streaming, memory-mapped files, binary format, incremental snapshots.\n\nFiles: cmd/bd/deletion_tracking.go:101-208","status":"open","priority":3,"issue_type":"task","created_at":"2025-11-06T18:16:25.653076-08:00","updated_at":"2025-11-06T18:16:25.653076-08:00","source_repo":".","dependencies":[{"issue_id":"bd-ng56","depends_on_id":"bd-rbxi","type":"parent-child","created_at":"2025-11-06T18:19:15.148149-08:00","created_by":"daemon"}]}
{"id":"bd-nqes","content_hash":"d7ca10b54b92c464a0c4c1c932f2918a6e0e952a5fd7337acb9160cd524dc59a","title":"bd-hv01: Non-atomic snapshot operations can cause data loss","description":"## Problem\nIn sync.go:146-155 and daemon_sync.go:502-505, snapshot capture failures are logged as warnings but sync continues:\n\n```go\nif err := exportToJSONL(ctx, jsonlPath); err != nil { ... }\nif err := captureLeftSnapshot(jsonlPath); err != nil {\n fmt.Fprintf(os.Stderr, \"Warning: failed to capture snapshot...\")\n}\n```\n\nIf export succeeds but snapshot capture fails, the merge uses stale snapshot data, potentially deleting wrong issues.\n\n## Impact\n- Critical data integrity issue\n- Could delete issues incorrectly during multi-workspace sync\n\n## Fix\nMake snapshot capture mandatory:\n```go\nif err := captureLeftSnapshot(jsonlPath); err != nil {\n return fmt.Errorf(\"failed to capture snapshot (required for deletion tracking): %w\", err)\n}\n```\n\n## Files Affected\n- cmd/bd/sync.go:146-155\n- cmd/bd/daemon_sync.go:502-505","status":"closed","priority":1,"issue_type":"bug","created_at":"2025-11-06T18:15:33.574158-08:00","updated_at":"2025-11-06T18:46:55.874814-08:00","closed_at":"2025-11-06T18:46:55.874814-08:00","source_repo":".","dependencies":[{"issue_id":"bd-nqes","depends_on_id":"bd-rbxi","type":"parent-child","created_at":"2025-11-06T18:19:14.749153-08:00","created_by":"daemon"}]}
{"id":"bd-o43","content_hash":"4caa0f14a58127378a533362ec0292833b6d59195e503fab7505180c9c5c0438","title":"Add richer query capabilities to bd list","description":"Current bd list filters are limited to basic field matching (status, priority, type, assignee, label). This forces users to resort to piping through jq for common queries.\n\nMissing query capabilities:\n- Pattern matching: --title-contains, --desc-contains\n- Date ranges: --created-after, --updated-before, --closed-after\n- Empty/null checks: --empty-description, --no-assignee, --no-labels\n- Numeric ranges: --priority-min, --priority-max\n- Complex boolean logic: --and, --or operators\n- Full-text search: --search across all text fields\n- Negation: --not-status, --exclude-label\n\nExample use cases:\n- Find issues with empty descriptions\n- Find stale issues not updated in 30 days\n- Find high-priority bugs with no assignee\n- Search for keyword across title/description/notes\n\nImplementation approach:\n- Add query builder pattern to storage layer\n- Support --query DSL for complex queries\n- Keep simple flags for common cases\n- Add --json output for programmatic use","notes":"## Progress Update\n\n**Completed:**\n- ✅ Extended IssueFilter struct with new fields (pattern matching, date ranges, empty/null checks, priority ranges)\n- ✅ Updated SQLite SearchIssues implementation \n- ✅ Added CLI flags to list.go\n- ✅ Added parseTimeFlag helper\n- ✅ Comprehensive tests added - all passing\n\n**Remaining:**\n- ⚠️ RPC layer needs updating (internal/rpc/protocol.go ListArgs)\n- ⚠️ Daemon handler needs to forward new filters\n- ⚠️ End-to-end testing with daemon mode\n- 📝 Documentation updates\n\n**Files Modified:**\n- internal/types/types.go\n- internal/storage/sqlite/sqlite.go \n- cmd/bd/list.go\n- cmd/bd/list_test.go\n\n**Next Steps:**\n1. Update RPC protocol\n2. Update daemon handler \n3. Test with daemon mode\n4. Update docs","status":"closed","priority":2,"issue_type":"feature","created_at":"2025-11-05T00:17:48.677493-08:00","updated_at":"2025-11-05T00:33:38.998433-08:00","closed_at":"2025-11-05T00:33:38.998433-08:00","source_repo":"."}


@@ -1,21 +1,29 @@
 # bd Git Hooks
-This directory contains git hooks that integrate bd (beads) with your git workflow, solving the race condition between daemon auto-flush and git commits.
+This directory contains git hooks that integrate bd (beads) with your git workflow, preventing stale JSONL from being pushed to remote.
 ## The Problem
-When using bd in daemon mode, operations trigger a 5-second debounced auto-flush to JSONL. This creates a race condition:
+Two race conditions can occur:
-1. User closes issue via MCP → daemon schedules flush (5 sec delay)
-2. User commits code changes → JSONL appears clean
-3. Daemon flush fires → JSONL modified after commit
-4. Result: dirty working tree showing JSONL changes
+1. **Between operations and commits**: Daemon auto-flush (5s debounce) may fire after commit
+   - User closes issue via MCP → daemon schedules flush (5 sec delay)
+   - User commits code changes → JSONL appears clean
+   - Daemon flush fires → JSONL modified after commit
+   - Result: dirty working tree showing JSONL changes
+2. **Between commits and pushes**: Changes made after commit but before push (bd-my64)
+   - User commits → pre-commit hook flushes JSONL
+   - User adds comments or updates issues
+   - User pushes → outdated JSONL is pushed
+   - Result: remote has stale JSONL
 ## The Solution
-These git hooks ensure bd changes are always synchronized with your commits:
+These git hooks ensure bd changes are always synchronized with your commits and pushes:
-- **pre-commit** - Flushes pending bd changes to JSONL before commit
+- **pre-commit** - Flushes pending bd changes to JSONL before commit and stages it
+- **pre-push** - Blocks push if JSONL has uncommitted changes (bd-my64)
 - **post-merge** - Imports updated JSONL after git pull/merge
 ## Installation
@@ -37,8 +45,9 @@ This will:
 ```bash
 cp examples/git-hooks/pre-commit .git/hooks/pre-commit
+cp examples/git-hooks/pre-push .git/hooks/pre-push
 cp examples/git-hooks/post-merge .git/hooks/post-merge
-chmod +x .git/hooks/pre-commit .git/hooks/post-merge
+chmod +x .git/hooks/pre-commit .git/hooks/pre-push .git/hooks/post-merge
 ```
## How It Works
@@ -58,16 +67,31 @@ This:
 The hook is silent on success, fast (no git operations), and safe (fails commit if flush fails).
+### pre-push
+Before each push, the hook checks:
+```bash
+git diff --quiet .beads/beads.jsonl
+```
+This prevents pushing stale JSONL by:
+1. Checking if JSONL has uncommitted changes (working tree or staging)
+2. Failing the push with clear error message if changes exist
+3. Instructing user to commit JSONL before pushing again
+This solves bd-my64: changes made between commit and push are caught before reaching remote.
 ### post-merge
 After a git pull or merge, the hook runs:
 ```bash
-bd import -i .beads/issues.jsonl
+bd import -i .beads/beads.jsonl
 ```
 This ensures your local database reflects the merged state. The hook:
-- Only runs if `.beads/issues.jsonl` exists
+- Only runs if `.beads/beads.jsonl` exists (also checks `issues.jsonl` for backward compat)
 - Imports any new issues or updates from the merge
 - Warns on failure but doesn't block the merge
@@ -92,7 +116,7 @@ This ensures your local database reflects the merged state. The hook:
 Remove the hooks:
 ```bash
-rm .git/hooks/pre-commit .git/hooks/post-merge
+rm .git/hooks/pre-commit .git/hooks/pre-push .git/hooks/post-merge
 ```
 Your backed-up hooks (if any) are in `.git/hooks/*.backup-*`.


@@ -59,7 +59,7 @@ echo "✓ Git hooks installed successfully"
 echo ""
 echo "Hooks installed:"
 echo "  pre-commit - Flushes pending bd changes to JSONL before commit"
+echo "  pre-push   - Blocks push if JSONL has uncommitted changes (bd-my64)"
 echo "  post-merge - Imports updated JSONL after git pull/merge"
-echo "  pre-push   - Exports database to JSONL before push (prevents stale JSONL)"
 echo ""
 echo "To uninstall, remove the hooks from .git/hooks/"


@@ -4,7 +4,7 @@
 # bd (beads) pre-commit hook
 #
 # This hook ensures that any pending bd issue changes are flushed to
-# .beads/issues.jsonl before the commit is created, preventing the
+# .beads/beads.jsonl before the commit is created, preventing the
 # race condition where daemon auto-flush fires after the commit.
 #
 # Installation:
@@ -35,9 +35,17 @@ if ! bd sync --flush-only >/dev/null 2>&1; then
     exit 1
 fi
+# Find the JSONL file (could be issues.jsonl or beads.jsonl)
+JSONL_FILE=""
+if [ -f .beads/beads.jsonl ]; then
+    JSONL_FILE=".beads/beads.jsonl"
+elif [ -f .beads/issues.jsonl ]; then
+    JSONL_FILE=".beads/issues.jsonl"
+fi
 # If the JSONL file was modified, stage it
-if [ -f .beads/issues.jsonl ]; then
-    git add .beads/issues.jsonl 2>/dev/null || true
+if [ -n "$JSONL_FILE" ] && [ -f "$JSONL_FILE" ]; then
+    git add "$JSONL_FILE" 2>/dev/null || true
 fi
 exit 0
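The beads.jsonl/issues.jsonl fallback used by the updated hooks can be checked in isolation. A small sketch in a scratch directory (the paths created here are illustrative):

```shell
#!/bin/sh
# Sketch of the JSONL filename fallback: prefer .beads/beads.jsonl,
# fall back to the legacy .beads/issues.jsonl. Scratch dir is illustrative.
tmp=$(mktemp -d)
cd "$tmp"
mkdir .beads
touch .beads/issues.jsonl           # only the legacy name exists at first

JSONL_FILE=""
if [ -f .beads/beads.jsonl ]; then
    JSONL_FILE=".beads/beads.jsonl"
elif [ -f .beads/issues.jsonl ]; then
    JSONL_FILE=".beads/issues.jsonl"
fi
echo "selected: $JSONL_FILE"        # → selected: .beads/issues.jsonl

touch .beads/beads.jsonl            # once the new name exists, it wins
if [ -f .beads/beads.jsonl ]; then
    JSONL_FILE=".beads/beads.jsonl"
elif [ -f .beads/issues.jsonl ]; then
    JSONL_FILE=".beads/issues.jsonl"
fi
echo "selected: $JSONL_FILE"        # → selected: .beads/beads.jsonl
```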


@@ -3,9 +3,11 @@
 #
 # bd (beads) pre-push hook
 #
-# This hook ensures that the database is exported to JSONL before pushing,
-# preventing the problem where database changes are committed without
-# corresponding JSONL updates.
+# This hook checks if the database has uncommitted changes and prevents
+# pushing stale JSONL by failing with a clear error message.
+#
+# The pre-commit hook should have already exported changes, but if new
+# changes were made between commit and push, this hook catches that.
 #
 # Installation:
 #   cp examples/git-hooks/pre-push .git/hooks/pre-push
@@ -16,7 +18,7 @@
 # Check if bd is available
 if ! command -v bd >/dev/null 2>&1; then
     echo "Warning: bd command not found, skipping pre-push export check" >&2
+    # If bd is not available, we can't check - allow push
     exit 0
 fi
@@ -26,26 +28,28 @@ if [ ! -d .beads ]; then
     exit 0
 fi
-# Check if database is newer than JSONL
-DB_FILE=".beads/beads.db"
-JSONL_FILE=".beads/beads.jsonl"
+# Find the JSONL file (could be issues.jsonl or beads.jsonl)
+JSONL_FILE=""
+if [ -f .beads/beads.jsonl ]; then
+    JSONL_FILE=".beads/beads.jsonl"
+elif [ -f .beads/issues.jsonl ]; then
+    JSONL_FILE=".beads/issues.jsonl"
+fi
-if [ -f "$DB_FILE" ] && [ -f "$JSONL_FILE" ]; then
-    # Get modification times
-    if [ "$DB_FILE" -nt "$JSONL_FILE" ]; then
-        echo "⚠️  Database is newer than JSONL - exporting before push..." >&2
-        # Force export to ensure JSONL is up to date
-        if ! BEADS_NO_DAEMON=1 bd export --output "$JSONL_FILE" >/dev/null 2>&1; then
-            echo "Error: Failed to export database to JSONL" >&2
-            echo "Run 'bd export' manually to diagnose" >&2
-            exit 1
-        fi
-        # Stage the updated JSONL
-        git add "$JSONL_FILE" 2>/dev/null || true
-        echo "✓ Exported database to JSONL" >&2
+# Check if JSONL file has uncommitted changes
+if [ -n "$JSONL_FILE" ] && [ -f "$JSONL_FILE" ]; then
+    # Check git status for the JSONL file
+    if ! git diff --quiet "$JSONL_FILE" 2>/dev/null || ! git diff --cached --quiet "$JSONL_FILE" 2>/dev/null; then
+        echo "❌ Error: $JSONL_FILE has uncommitted changes" >&2
+        echo "" >&2
+        echo "You made changes to bd issues between your last commit and this push." >&2
+        echo "Please commit the updated JSONL file before pushing:" >&2
+        echo "" >&2
+        echo "  git add $JSONL_FILE" >&2
+        echo "  git commit -m \"Update beads JSONL\"" >&2
+        echo "  git push" >&2
+        echo "" >&2
+        exit 1
     fi
 fi
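The two `git diff` calls above are the whole check; their exit-code semantics can be seen in a throwaway repo (a minimal sketch; the paths and identity settings are illustrative):

```shell
#!/bin/sh
# Demonstrate the exit codes the pre-push check relies on:
# 'git diff --quiet' exits 0 when the file matches the index,
# non-zero when it has unstaged changes; '--cached' covers staged ones.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q
mkdir .beads
echo '{"id":"bd-demo"}' > .beads/beads.jsonl
git add .beads/beads.jsonl
git -c user.email=demo@example.com -c user.name=demo commit -qm "add JSONL"

# Clean tree: both checks pass, so the hook would allow the push.
git diff --quiet .beads/beads.jsonl && git diff --cached --quiet .beads/beads.jsonl
echo "clean: push allowed"

# Dirty tree: --quiet now exits non-zero, so the hook would block the push.
echo '{"id":"bd-demo2"}' >> .beads/beads.jsonl
if ! git diff --quiet .beads/beads.jsonl; then
    echo "dirty: push blocked"
fi
```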