Removed some old cruft
NEXT_SESSION.md (239 lines)
@@ -1,239 +0,0 @@
# Next Session: Complete bd-d19a (Fix Import Failure on Missing Parents)

## Current Status

**Branch**: `fix/import-missing-parents`
**Epic**: bd-d19a (P0 - Critical)
**Progress**: Phase 1 & 2 Complete ✅

### Completed Work

#### Phase 1: Topological Sorting ✅
- **Commit**: `f2cb91d`
- **What**: Added depth-based sorting to `importer.go` to ensure parents are created before children
- **Files**: `internal/importer/importer.go`
- **Result**: Fixes a latent ordering bug where parent-child pairs in the same batch could fail

#### Phase 2: Parent Resurrection ✅
- **Commit**: `b41d65d`
- **Implemented Issues**:
  - bd-cc4f: `TryResurrectParent` function
  - bd-d76d: Modified `EnsureIDs` to call resurrection
  - bd-02a4: Modified `CreateIssue` to call resurrection
- **Files Created**: `internal/storage/sqlite/resurrection.go`
- **Files Modified**:
  - `internal/storage/sqlite/ids.go`
  - `internal/storage/sqlite/sqlite.go`
  - `internal/storage/sqlite/batch_ops.go`
  - `internal/storage/sqlite/batch_ops_test.go`

**How Resurrection Works**:
1. When a child issue has a missing parent, search `.beads/issues.jsonl` for the parent in git history
2. If found, create a tombstone issue (status=closed, priority=4)
3. The tombstone preserves the original title, type, and created_at
4. The description is marked with a `[RESURRECTED]` prefix plus the original description
5. Dependencies are copied only if their targets exist
6. Recursively handles entire parent chains (e.g., `bd-abc.1.2` resurrects both `bd-abc` and `bd-abc.1`)
---

## Next Steps: Phase 3 - Testing & Documentation

### 1. Add Comprehensive Tests

**Create**: `internal/storage/sqlite/resurrection_test.go`

**Test Cases Needed**:
- Parent exists → no resurrection needed
- Parent found in JSONL → successful resurrection
- Parent not in JSONL → proper error message
- Multi-level chain (`bd-abc.1.2`) → resurrects entire chain
- JSONL file missing → graceful failure
- Malformed JSONL lines → skip with warning
- Dependencies preserved → only if targets exist
- Tombstone properties → correct status, priority, description format
- Concurrent resurrection → idempotent behavior
**Integration Test**:
Add to `beads_integration_test.go`:

```go
// TestImportWithDeletedParent outline:
//  - Create parent and child
//  - Delete parent
//  - Export to JSONL (preserves parent in git)
//  - Clear DB
//  - Import from JSONL
//  - Verify: parent resurrected as tombstone, child imported successfully
```
### 2. Update Documentation

**Files to Update**:
1. `README.md` - Add resurrection behavior to import section
2. `QUICKSTART.md` - Mention parent resurrection for multi-repo workflows
3. `docs/import-bug-analysis-bd-3xq.md` - Add "Implementation Complete" section
4. `AGENTS.md` - Document resurrection for AI agents

**Example Addition to README.md**:

```markdown
## Parent Resurrection

When importing issues with hierarchical IDs (e.g., `bd-abc.1`), bd automatically
resurrects deleted parent issues from git history to maintain referential integrity.

Resurrected parents are created as tombstones:
- Status: `closed`
- Priority: 4 (lowest)
- Description: `[RESURRECTED]` prefix + original description

This enables multi-repo workflows where different clones may delete different issues.
```
### 3. Manual Testing Workflow

```bash
# Terminal 1: Create test scenario
cd /tmp/bd-test
git init
bd init --prefix test --quiet
bd create "Parent epic" -t epic -p 1 --json   # Returns test-abc123
bd create "Child task" -p 1 --json            # Auto-creates test-abc123.1

# Verify hierarchy
bd dep tree test-abc123

# Delete parent (simulating normal database hygiene)
bd delete test-abc123 --force

# Export state (child exists, parent deleted)
bd export -o backup.jsonl

# Simulate fresh clone
rm -rf .beads/beads.db
bd init --prefix test --quiet

# Import - should resurrect parent as tombstone
bd import -i backup.jsonl

# Verify resurrection
bd show test-abc123 --json | grep -i resurrected
bd show test-abc123.1 --json   # Should exist
bd dep tree test-abc123        # Should show full tree
```
### 4. Edge Cases to Handle

**Potential Issues**:
1. **JSONL path detection**: Currently assumes `.beads/issues.jsonl` - verify it works with symlinks and worktrees
2. **Performance**: Large JSONL files (10k+ issues) may need optimization (indexing?)
3. **Memory**: The scanner buffer is 1MB - test with very large issue descriptions
4. **Concurrent access**: Multiple processes resurrecting the same parent simultaneously
**Optimizations to Consider** (future work):
- Build an in-memory index of the JSONL on the first resurrection call (cache for the session)
- Use `grep` or `ripgrep` for a fast ID lookup before JSON parsing
- Add resurrection stats to the import summary (`Resurrected: 3 parents`)
### 5. Create Pull Request

Once testing is complete:

```bash
# Update CHANGELOG.md: add an entry under "Unreleased"

# Create PR
gh pr create \
  --title "Fix import failure on missing parent issues (bd-d19a)" \
  --body "Implements topological sorting + parent resurrection.

Fixes #XXX (if there's a GitHub issue)

## Changes
- Phase 1: Topological sorting for import ordering
- Phase 2: Parent resurrection from JSONL history
- Creates tombstones for deleted parents to preserve hierarchical structure

## Testing
- [x] Unit tests for resurrection logic
- [x] Integration test for deleted parent scenario
- [x] Manual testing with multi-level hierarchies

See docs/import-bug-analysis-bd-3xq.md for full design rationale."
```
---

## Commands for Next Session

```bash
# Resume work
cd /Users/stevey/src/dave/beads
git checkout fix/import-missing-parents

# Run existing tests
go test ./internal/storage/sqlite -v -run Resurrection

# Create new test file (see test template above)

# Run integration tests
go test -v -run TestImport

# Manual testing (see workflow above)

# When ready to merge
git checkout main
git merge fix/import-missing-parents
git push origin main
```
---

## Issues Tracking

**Epic**: bd-d19a (Fix import failure on missing parent issues) - **OPEN**

**Subtasks**:
- bd-cc4f: Implement TryResurrectParent - **DONE** ✅
- bd-d76d: Modify EnsureIDs - **DONE** ✅
- bd-02a4: Modify CreateIssue - **DONE** ✅
- **TODO**: Create test issue for Phase 3
- **TODO**: Create docs issue for Phase 3

**Files Modified**:
- ✅ `internal/importer/importer.go` (topological sorting)
- ✅ `internal/storage/sqlite/resurrection.go` (new file)
- ✅ `internal/storage/sqlite/ids.go`
- ✅ `internal/storage/sqlite/sqlite.go`
- ✅ `internal/storage/sqlite/batch_ops.go`
- ✅ `internal/storage/sqlite/batch_ops_test.go`
- ⏳ `internal/storage/sqlite/resurrection_test.go` (TODO)
- ⏳ `beads_integration_test.go` (TODO - add import test)
- ⏳ `README.md` (TODO - document resurrection)
- ⏳ `AGENTS.md` (TODO - document for AI agents)
---

## Key Design Decisions

1. **Tombstone status**: Use `closed` (not a new "deleted" status) to avoid schema changes
2. **Search strategy**: Linear scan of the JSONL (acceptable for <10k issues; can optimize later)
3. **Idempotency**: `TryResurrectParent` checks existence first, so it is safe to call multiple times
4. **Recursion**: `TryResurrectParentChain` handles multi-level hierarchies automatically
5. **Dependencies**: Best-effort resurrection (logs warnings, doesn't fail if targets are missing)
---

## Reference Documents

- **Design Doc**: `docs/import-bug-analysis-bd-3xq.md` (comprehensive analysis)
- **Current Branch**: `fix/import-missing-parents`
- **GitHub PR URL**: (to be created)
- **Related Issues**: bd-4ms (multi-repo support), bd-a101 (separate branch workflow)

---

**Status**: Ready for Phase 3 (Testing & Documentation)
**Estimate**: 2-3 hours for comprehensive tests + 1 hour for docs
**Risk**: Low - core logic implemented and builds successfully
@@ -1,79 +0,0 @@
#!/bin/bash
# Test daemon auto-import after git pull (bd-09b5f2f5)
# This verifies the critical data corruption fix

set -e

TMPDIR=$(mktemp -d)
trap 'rm -rf "$TMPDIR"' EXIT

echo "=== Setting up test environment ==="
cd "$TMPDIR"

# Create origin repo
mkdir origin && cd origin
git init --bare

cd "$TMPDIR"

# Clone repo A
git clone origin repoA
cd repoA
git config user.name "Test User A"
git config user.email "test-a@example.com"

# Initialize bd in repo A
echo "=== Initializing bd in repo A ==="
bd init --prefix test --quiet
bd create "Initial issue" -p 1 -d "Created in repo A" --json

# Commit and push (use master as default branch)
git add .
git commit -m "Initial commit with bd"
git push origin master

cd "$TMPDIR"

# Clone repo B
echo "=== Cloning repo B ==="
git clone origin repoB
cd repoB
git config user.name "Test User B"
git config user.email "test-b@example.com"

# Initialize bd in repo B (import from JSONL)
echo "=== Initializing bd in repo B ==="
bd init --prefix test --quiet

# Verify repo B can read the issue
echo "=== Verifying repo B sees initial issue ==="
ISSUE_ID=$(bd list --json | jq -r '.[0].id')
echo "Found issue: $ISSUE_ID"

# In repo A: Update the issue
cd "$TMPDIR/repoA"
echo "=== Repo A: Updating issue status ==="
bd update "$ISSUE_ID" --status in_progress --json
bd sync   # Force immediate export/commit/push

# Wait for export to flush
sleep 2

# In repo B: Pull and verify daemon auto-imports
cd "$TMPDIR/repoB"
echo "=== Repo B: Pulling changes ==="
git pull

# Check if daemon auto-imports (should see updated status)
echo "=== Repo B: Checking if daemon auto-imported ==="
STATUS=$(bd show "$ISSUE_ID" --json | jq -r '.[0].status')

if [ "$STATUS" == "in_progress" ]; then
  echo "✅ SUCCESS: Daemon auto-imported! Status is 'in_progress' as expected"
  exit 0
else
  echo "❌ FAIL: Daemon did NOT auto-import. Status is '$STATUS', expected 'in_progress'"
  echo ""
  echo "This indicates bd-09b5f2f5 regression - daemon serving stale data"
  exit 1
fi
BIN wasm/bd.wasm (binary file not shown)
@@ -1,10 +0,0 @@
#!/bin/bash
# Build bd for WebAssembly

set -e

echo "Building bd for WASM..."
GOOS=js GOARCH=wasm go build -o wasm/bd.wasm ./cmd/bd

echo "WASM build complete: wasm/bd.wasm"
ls -lh wasm/bd.wasm
wasm/run.js (28 lines)
@@ -1,28 +0,0 @@
#!/usr/bin/env node
// Node.js wrapper for bd.wasm

const fs = require('fs');
const path = require('path');

// Load wasm_exec.js from the Go distribution (defines globalThis.Go)
require('./wasm_exec.js');

// Load the WASM binary
const wasmPath = path.join(__dirname, 'bd.wasm');
const wasmBuffer = fs.readFileSync(wasmPath);

// Create Go runtime instance
const go = new Go();

// Pass command-line arguments to Go.
// process.argv[0] is 'node', process.argv[1] is this script, so the
// user's arguments start at process.argv[2]; prepend 'bd' as argv[0].
go.argv = ['bd'].concat(process.argv.slice(2));

// Instantiate and run the WASM module
WebAssembly.instantiate(wasmBuffer, go.importObject).then((result) => {
  go.run(result.instance);
}).catch((err) => {
  console.error('Failed to run WASM:', err);
  process.exit(1);
});
@@ -1,575 +0,0 @@
// Copyright 2018 The Go Authors. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.

"use strict";

(() => {
	const enosys = () => {
		const err = new Error("not implemented");
		err.code = "ENOSYS";
		return err;
	};

	if (!globalThis.fs) {
		let outputBuf = "";
		globalThis.fs = {
			constants: { O_WRONLY: -1, O_RDWR: -1, O_CREAT: -1, O_TRUNC: -1, O_APPEND: -1, O_EXCL: -1, O_DIRECTORY: -1 }, // unused
			writeSync(fd, buf) {
				outputBuf += decoder.decode(buf);
				const nl = outputBuf.lastIndexOf("\n");
				if (nl != -1) {
					console.log(outputBuf.substring(0, nl));
					outputBuf = outputBuf.substring(nl + 1);
				}
				return buf.length;
			},
			write(fd, buf, offset, length, position, callback) {
				if (offset !== 0 || length !== buf.length || position !== null) {
					callback(enosys());
					return;
				}
				const n = this.writeSync(fd, buf);
				callback(null, n);
			},
			chmod(path, mode, callback) { callback(enosys()); },
			chown(path, uid, gid, callback) { callback(enosys()); },
			close(fd, callback) { callback(enosys()); },
			fchmod(fd, mode, callback) { callback(enosys()); },
			fchown(fd, uid, gid, callback) { callback(enosys()); },
			fstat(fd, callback) { callback(enosys()); },
			fsync(fd, callback) { callback(null); },
			ftruncate(fd, length, callback) { callback(enosys()); },
			lchown(path, uid, gid, callback) { callback(enosys()); },
			link(path, link, callback) { callback(enosys()); },
			lstat(path, callback) { callback(enosys()); },
			mkdir(path, perm, callback) { callback(enosys()); },
			open(path, flags, mode, callback) { callback(enosys()); },
			read(fd, buffer, offset, length, position, callback) { callback(enosys()); },
			readdir(path, callback) { callback(enosys()); },
			readlink(path, callback) { callback(enosys()); },
			rename(from, to, callback) { callback(enosys()); },
			rmdir(path, callback) { callback(enosys()); },
			stat(path, callback) { callback(enosys()); },
			symlink(path, link, callback) { callback(enosys()); },
			truncate(path, length, callback) { callback(enosys()); },
			unlink(path, callback) { callback(enosys()); },
			utimes(path, atime, mtime, callback) { callback(enosys()); },
		};
	}

	if (!globalThis.process) {
		globalThis.process = {
			getuid() { return -1; },
			getgid() { return -1; },
			geteuid() { return -1; },
			getegid() { return -1; },
			getgroups() { throw enosys(); },
			pid: -1,
			ppid: -1,
			umask() { throw enosys(); },
			cwd() { throw enosys(); },
			chdir() { throw enosys(); },
		}
	}

	if (!globalThis.path) {
		globalThis.path = {
			resolve(...pathSegments) {
				return pathSegments.join("/");
			}
		}
	}

	if (!globalThis.crypto) {
		throw new Error("globalThis.crypto is not available, polyfill required (crypto.getRandomValues only)");
	}

	if (!globalThis.performance) {
		throw new Error("globalThis.performance is not available, polyfill required (performance.now only)");
	}

	if (!globalThis.TextEncoder) {
		throw new Error("globalThis.TextEncoder is not available, polyfill required");
	}

	if (!globalThis.TextDecoder) {
		throw new Error("globalThis.TextDecoder is not available, polyfill required");
	}

	const encoder = new TextEncoder("utf-8");
	const decoder = new TextDecoder("utf-8");
	globalThis.Go = class {
		constructor() {
			this.argv = ["js"];
			this.env = {};
			this.exit = (code) => {
				if (code !== 0) {
					console.warn("exit code:", code);
				}
			};
			this._exitPromise = new Promise((resolve) => {
				this._resolveExitPromise = resolve;
			});
			this._pendingEvent = null;
			this._scheduledTimeouts = new Map();
			this._nextCallbackTimeoutID = 1;

			const setInt64 = (addr, v) => {
				this.mem.setUint32(addr + 0, v, true);
				this.mem.setUint32(addr + 4, Math.floor(v / 4294967296), true);
			}

			const setInt32 = (addr, v) => {
				this.mem.setUint32(addr + 0, v, true);
			}

			const getInt64 = (addr) => {
				const low = this.mem.getUint32(addr + 0, true);
				const high = this.mem.getInt32(addr + 4, true);
				return low + high * 4294967296;
			}

			const loadValue = (addr) => {
				const f = this.mem.getFloat64(addr, true);
				if (f === 0) {
					return undefined;
				}
				if (!isNaN(f)) {
					return f;
				}

				const id = this.mem.getUint32(addr, true);
				return this._values[id];
			}

			const storeValue = (addr, v) => {
				const nanHead = 0x7FF80000;

				if (typeof v === "number" && v !== 0) {
					if (isNaN(v)) {
						this.mem.setUint32(addr + 4, nanHead, true);
						this.mem.setUint32(addr, 0, true);
						return;
					}
					this.mem.setFloat64(addr, v, true);
					return;
				}

				if (v === undefined) {
					this.mem.setFloat64(addr, 0, true);
					return;
				}

				let id = this._ids.get(v);
				if (id === undefined) {
					id = this._idPool.pop();
					if (id === undefined) {
						id = this._values.length;
					}
					this._values[id] = v;
					this._goRefCounts[id] = 0;
					this._ids.set(v, id);
				}
				this._goRefCounts[id]++;
				let typeFlag = 0;
				switch (typeof v) {
					case "object":
						if (v !== null) {
							typeFlag = 1;
						}
						break;
					case "string":
						typeFlag = 2;
						break;
					case "symbol":
						typeFlag = 3;
						break;
					case "function":
						typeFlag = 4;
						break;
				}
				this.mem.setUint32(addr + 4, nanHead | typeFlag, true);
				this.mem.setUint32(addr, id, true);
			}

			const loadSlice = (addr) => {
				const array = getInt64(addr + 0);
				const len = getInt64(addr + 8);
				return new Uint8Array(this._inst.exports.mem.buffer, array, len);
			}

			const loadSliceOfValues = (addr) => {
				const array = getInt64(addr + 0);
				const len = getInt64(addr + 8);
				const a = new Array(len);
				for (let i = 0; i < len; i++) {
					a[i] = loadValue(array + i * 8);
				}
				return a;
			}

			const loadString = (addr) => {
				const saddr = getInt64(addr + 0);
				const len = getInt64(addr + 8);
				return decoder.decode(new DataView(this._inst.exports.mem.buffer, saddr, len));
			}

			const testCallExport = (a, b) => {
				this._inst.exports.testExport0();
				return this._inst.exports.testExport(a, b);
			}

			const timeOrigin = Date.now() - performance.now();
			this.importObject = {
				_gotest: {
					add: (a, b) => a + b,
					callExport: testCallExport,
				},
				gojs: {
					// Go's SP does not change as long as no Go code is running. Some operations (e.g. calls, getters and setters)
					// may synchronously trigger a Go event handler. This makes Go code get executed in the middle of the imported
					// function. A goroutine can switch to a new stack if the current stack is too small (see morestack function).
					// This changes the SP, thus we have to update the SP used by the imported function.

					// func wasmExit(code int32)
					"runtime.wasmExit": (sp) => {
						sp >>>= 0;
						const code = this.mem.getInt32(sp + 8, true);
						this.exited = true;
						delete this._inst;
						delete this._values;
						delete this._goRefCounts;
						delete this._ids;
						delete this._idPool;
						this.exit(code);
					},

					// func wasmWrite(fd uintptr, p unsafe.Pointer, n int32)
					"runtime.wasmWrite": (sp) => {
						sp >>>= 0;
						const fd = getInt64(sp + 8);
						const p = getInt64(sp + 16);
						const n = this.mem.getInt32(sp + 24, true);
						fs.writeSync(fd, new Uint8Array(this._inst.exports.mem.buffer, p, n));
					},

					// func resetMemoryDataView()
					"runtime.resetMemoryDataView": (sp) => {
						sp >>>= 0;
						this.mem = new DataView(this._inst.exports.mem.buffer);
					},

					// func nanotime1() int64
					"runtime.nanotime1": (sp) => {
						sp >>>= 0;
						setInt64(sp + 8, (timeOrigin + performance.now()) * 1000000);
					},

					// func walltime() (sec int64, nsec int32)
					"runtime.walltime": (sp) => {
						sp >>>= 0;
						const msec = (new Date).getTime();
						setInt64(sp + 8, msec / 1000);
						this.mem.setInt32(sp + 16, (msec % 1000) * 1000000, true);
					},

					// func scheduleTimeoutEvent(delay int64) int32
					"runtime.scheduleTimeoutEvent": (sp) => {
						sp >>>= 0;
						const id = this._nextCallbackTimeoutID;
						this._nextCallbackTimeoutID++;
						this._scheduledTimeouts.set(id, setTimeout(
							() => {
								this._resume();
								while (this._scheduledTimeouts.has(id)) {
									// for some reason Go failed to register the timeout event, log and try again
									// (temporary workaround for https://github.com/golang/go/issues/28975)
									console.warn("scheduleTimeoutEvent: missed timeout event");
									this._resume();
								}
							},
							getInt64(sp + 8),
						));
						this.mem.setInt32(sp + 16, id, true);
					},

					// func clearTimeoutEvent(id int32)
					"runtime.clearTimeoutEvent": (sp) => {
						sp >>>= 0;
						const id = this.mem.getInt32(sp + 8, true);
						clearTimeout(this._scheduledTimeouts.get(id));
						this._scheduledTimeouts.delete(id);
					},

					// func getRandomData(r []byte)
					"runtime.getRandomData": (sp) => {
						sp >>>= 0;
						crypto.getRandomValues(loadSlice(sp + 8));
					},

					// func finalizeRef(v ref)
					"syscall/js.finalizeRef": (sp) => {
						sp >>>= 0;
						const id = this.mem.getUint32(sp + 8, true);
						this._goRefCounts[id]--;
						if (this._goRefCounts[id] === 0) {
							const v = this._values[id];
							this._values[id] = null;
							this._ids.delete(v);
							this._idPool.push(id);
						}
					},

					// func stringVal(value string) ref
					"syscall/js.stringVal": (sp) => {
						sp >>>= 0;
						storeValue(sp + 24, loadString(sp + 8));
					},

					// func valueGet(v ref, p string) ref
					"syscall/js.valueGet": (sp) => {
						sp >>>= 0;
						const result = Reflect.get(loadValue(sp + 8), loadString(sp + 16));
						sp = this._inst.exports.getsp() >>> 0; // see comment above
						storeValue(sp + 32, result);
					},

					// func valueSet(v ref, p string, x ref)
					"syscall/js.valueSet": (sp) => {
						sp >>>= 0;
						Reflect.set(loadValue(sp + 8), loadString(sp + 16), loadValue(sp + 32));
					},

					// func valueDelete(v ref, p string)
					"syscall/js.valueDelete": (sp) => {
						sp >>>= 0;
						Reflect.deleteProperty(loadValue(sp + 8), loadString(sp + 16));
					},

					// func valueIndex(v ref, i int) ref
					"syscall/js.valueIndex": (sp) => {
						sp >>>= 0;
						storeValue(sp + 24, Reflect.get(loadValue(sp + 8), getInt64(sp + 16)));
					},

					// valueSetIndex(v ref, i int, x ref)
					"syscall/js.valueSetIndex": (sp) => {
						sp >>>= 0;
						Reflect.set(loadValue(sp + 8), getInt64(sp + 16), loadValue(sp + 24));
					},

					// func valueCall(v ref, m string, args []ref) (ref, bool)
					"syscall/js.valueCall": (sp) => {
						sp >>>= 0;
						try {
							const v = loadValue(sp + 8);
							const m = Reflect.get(v, loadString(sp + 16));
							const args = loadSliceOfValues(sp + 32);
							const result = Reflect.apply(m, v, args);
							sp = this._inst.exports.getsp() >>> 0; // see comment above
							storeValue(sp + 56, result);
							this.mem.setUint8(sp + 64, 1);
						} catch (err) {
							sp = this._inst.exports.getsp() >>> 0; // see comment above
							storeValue(sp + 56, err);
							this.mem.setUint8(sp + 64, 0);
						}
					},
					// func valueInvoke(v ref, args []ref) (ref, bool)
					"syscall/js.valueInvoke": (sp) => {
						sp >>>= 0;
						try {
							const v = loadValue(sp + 8);
							const args = loadSliceOfValues(sp + 16);
							const result = Reflect.apply(v, undefined, args);
							sp = this._inst.exports.getsp() >>> 0; // see comment above
							storeValue(sp + 40, result);
							this.mem.setUint8(sp + 48, 1);
						} catch (err) {
							sp = this._inst.exports.getsp() >>> 0; // see comment above
							storeValue(sp + 40, err);
							this.mem.setUint8(sp + 48, 0);
						}
					},

					// func valueNew(v ref, args []ref) (ref, bool)
					"syscall/js.valueNew": (sp) => {
						sp >>>= 0;
						try {
							const v = loadValue(sp + 8);
							const args = loadSliceOfValues(sp + 16);
							const result = Reflect.construct(v, args);
							sp = this._inst.exports.getsp() >>> 0; // see comment above
							storeValue(sp + 40, result);
							this.mem.setUint8(sp + 48, 1);
						} catch (err) {
							sp = this._inst.exports.getsp() >>> 0; // see comment above
							storeValue(sp + 40, err);
							this.mem.setUint8(sp + 48, 0);
						}
					},

					// func valueLength(v ref) int
					"syscall/js.valueLength": (sp) => {
						sp >>>= 0;
						setInt64(sp + 16, parseInt(loadValue(sp + 8).length));
					},

					// valuePrepareString(v ref) (ref, int)
					"syscall/js.valuePrepareString": (sp) => {
						sp >>>= 0;
						const str = encoder.encode(String(loadValue(sp + 8)));
						storeValue(sp + 16, str);
						setInt64(sp + 24, str.length);
					},

					// valueLoadString(v ref, b []byte)
					"syscall/js.valueLoadString": (sp) => {
						sp >>>= 0;
						const str = loadValue(sp + 8);
						loadSlice(sp + 16).set(str);
					},

					// func valueInstanceOf(v ref, t ref) bool
					"syscall/js.valueInstanceOf": (sp) => {
						sp >>>= 0;
						this.mem.setUint8(sp + 24, (loadValue(sp + 8) instanceof loadValue(sp + 16)) ? 1 : 0);
					},

					// func copyBytesToGo(dst []byte, src ref) (int, bool)
					"syscall/js.copyBytesToGo": (sp) => {
						sp >>>= 0;
						const dst = loadSlice(sp + 8);
						const src = loadValue(sp + 32);
						if (!(src instanceof Uint8Array || src instanceof Uint8ClampedArray)) {
							this.mem.setUint8(sp + 48, 0);
							return;
						}
						const toCopy = src.subarray(0, dst.length);
						dst.set(toCopy);
						setInt64(sp + 40, toCopy.length);
						this.mem.setUint8(sp + 48, 1);
					},

					// func copyBytesToJS(dst ref, src []byte) (int, bool)
					"syscall/js.copyBytesToJS": (sp) => {
						sp >>>= 0;
						const dst = loadValue(sp + 8);
						const src = loadSlice(sp + 16);
						if (!(dst instanceof Uint8Array || dst instanceof Uint8ClampedArray)) {
							this.mem.setUint8(sp + 48, 0);
							return;
						}
						const toCopy = src.subarray(0, dst.length);
						dst.set(toCopy);
						setInt64(sp + 40, toCopy.length);
						this.mem.setUint8(sp + 48, 1);
					},

					"debug": (value) => {
						console.log(value);
					},
				}
			};
		}
		async run(instance) {
			if (!(instance instanceof WebAssembly.Instance)) {
				throw new Error("Go.run: WebAssembly.Instance expected");
			}
			this._inst = instance;
			this.mem = new DataView(this._inst.exports.mem.buffer);
			this._values = [ // JS values that Go currently has references to, indexed by reference id
				NaN,
				0,
				null,
				true,
				false,
				globalThis,
				this,
			];
			this._goRefCounts = new Array(this._values.length).fill(Infinity); // number of references that Go has to a JS value, indexed by reference id
			this._ids = new Map([ // mapping from JS values to reference ids
				[0, 1],
				[null, 2],
				[true, 3],
				[false, 4],
				[globalThis, 5],
				[this, 6],
			]);
			this._idPool = []; // unused ids that have been garbage collected
			this.exited = false; // whether the Go program has exited

			// Pass command line arguments and environment variables to WebAssembly by writing them to the linear memory.
			let offset = 4096;

			const strPtr = (str) => {
				const ptr = offset;
				const bytes = encoder.encode(str + "\0");
				new Uint8Array(this.mem.buffer, offset, bytes.length).set(bytes);
				offset += bytes.length;
				if (offset % 8 !== 0) {
					offset += 8 - (offset % 8);
				}
				return ptr;
			};

			const argc = this.argv.length;

			const argvPtrs = [];
			this.argv.forEach((arg) => {
				argvPtrs.push(strPtr(arg));
			});
			argvPtrs.push(0);

			const keys = Object.keys(this.env).sort();
			keys.forEach((key) => {
				argvPtrs.push(strPtr(`${key}=${this.env[key]}`));
			});
			argvPtrs.push(0);

			const argv = offset;
			argvPtrs.forEach((ptr) => {
				this.mem.setUint32(offset, ptr, true);
				this.mem.setUint32(offset + 4, 0, true);
				offset += 8;
			});

			// The linker guarantees global data starts from at least wasmMinDataAddr.
			// Keep in sync with cmd/link/internal/ld/data.go:wasmMinDataAddr.
			const wasmMinDataAddr = 4096 + 8192;
			if (offset >= wasmMinDataAddr) {
				throw new Error("total length of command line and environment variables exceeds limit");
			}

			this._inst.exports.run(argc, argv);
			if (this.exited) {
				this._resolveExitPromise();
			}
			await this._exitPromise;
		}

		_resume() {
			if (this.exited) {
				throw new Error("Go program has already exited");
			}
			this._inst.exports.resume();
			if (this.exited) {
				this._resolveExitPromise();
			}
		}

		_makeFuncWrapper(id) {
			const go = this;
			return function () {
				const event = { id: id, this: this, args: arguments };
				go._pendingEvent = event;
				go._resume();
				return event.result;
			};
		}
	}
})();