Compare commits

..

14 Commits

Author SHA1 Message Date
9267d1a42b feat: Add Renovate configuration for dependency tracking
All checks were successful
CI / check (pull_request) Successful in 3m13s
2026-01-14 15:05:28 -08:00
2d03714934 feat(skills): Add contribution guidelines check to beads skills
All checks were successful
CI / check (push) Successful in 2m57s
2026-01-14 14:26:17 -08:00
3f0e381de2 fix(ci): Add access token for private flake inputs
Some checks failed
CI / check (push) Has been cancelled
2026-01-14 14:25:47 -08:00
1d9fd0aee9 feat(skills): Add batch research+plan skill for multiple beads
Some checks failed
CI / check (push) Has been cancelled
2026-01-14 14:24:53 -08:00
16f6dfcec7 feat(skills): Enforce worktree/branch workflow in parallel_beads
Some checks failed
CI / check (push) Has been cancelled
2026-01-14 14:24:16 -08:00
90ef70eb2e Add mcrcon-rbw wrapper for Minecraft RCON
Some checks failed
CI / check (push) Has been cancelled
Wrapper that auto-authenticates via rbw (Bitwarden) for RCON access.
- Uses minecraft-rcon entry from Bitwarden
- Defaults to 10.0.0.165:25575 (LoadBalancer IP)
- Supports MCRCON_HOST/PORT overrides
- Interactive terminal mode when no args provided

Part of k3s-cluster-config bead k3s-cluster-config-byg
2026-01-14 14:09:45 -08:00
667f5b28dc feat(skills): Close Gitea issues when beads are reconciled
Some checks failed
CI / check (push) Has been cancelled
2026-01-14 13:59:30 -08:00
4bb71d0b7e Remove wixos (WSL) configuration
All checks were successful
CI / check (push) Successful in 3m0s
WSL is no longer used. This removes:
- machines/wixos/ directory and configuration.nix
- nixos-wsl input from flake.nix
- nixosConfigurations.wixos output
- References to wixos in AGENTS.md and .goosehints

Implements bead: nixos-configs-2mk
2026-01-13 18:02:36 -08:00
0bc134f557 fix(mu4e): Configure msmtp to preserve email body content
All checks were successful
CI / check (push) Successful in 6m0s
The mu4e msmtp configuration was causing email bodies to be stripped,
especially for multipart messages from org-msg. This was due to missing
critical msmtp settings.

Changes:
- Add message-sendmail-f-is-evil to prevent -f flag issues
- Add --read-envelope-from to msmtp arguments
- Set both send-mail-function and message-send-mail-function

Fixes: nixos-configs-9l8
2026-01-13 17:48:36 -08:00
1b9df3926e Fix conflicting audio role config: remove pulseaudio, keep pipewire
Some checks failed
CI / check (push) Has been cancelled
Remove services.pulseaudio configuration that conflicted with
services.pipewire. PipeWire replaces PulseAudio and provides
compatibility through pulse.enable.

Also added alsa.enable and alsa.support32Bit for better ALSA support.
2026-01-13 17:48:00 -08:00
bd98793528 feat(roles): Parameterize hardcoded values in printing, nfs-mounts, and virtualisation roles
Some checks failed
CI / check (push) Has been cancelled
- printing role: Add configurable printerName, printerUri, and printerModel options
  to replace hardcoded Brother printer values
- nfs-mounts role: Add configurable server, remotePath, and mountPoint options
  to replace hardcoded NFS server IP (10.0.0.43)
- virtualisation role: Add configurable dockerUsers option as list type
  to replace hardcoded 'johno' docker group membership

All options have sensible defaults matching the original hardcoded values,
ensuring backward compatibility while allowing per-host customization.

Implements bead: nixos-configs-fkt
2026-01-13 17:20:59 -08:00
d78637cf13 feat(home-manager): Add platform compatibility guards to cross-platform roles
Some checks failed
CI / check (push) Has been cancelled
Add lib.optionals pkgs.stdenv.isLinux guards to roles that contain
Linux-only packages or services to prevent build failures on Darwin:

- communication: Guard Electron apps (element-desktop, fluffychat,
  nextcloud-talk-desktop) that don't build on Darwin due to electron
  build-from-source limitations
- kdeconnect: Guard entire config block since services.kdeconnect
  requires D-Bus and systemd (Linux-only)
- sync: Guard syncthingtray package (requires Linux system tray)
- email: Guard systemd.user.services/timers (Darwin uses launchd)
- desktop: Guard Linux-only packages, services, and KDE-specific
  configurations including gnome-keyring, systemd services, and
  XDG mime associations

Implements bead: nixos-configs-tcu
2026-01-13 17:20:01 -08:00
08d16bd2c9 feat(scripts): Add --help flags to all flake apps
Some checks failed
CI / check (push) Has been cancelled
Add consistent --help/-h argument handling to update-doomemacs.sh,
rotate-wallpaper.sh, and upgrade.sh scripts. Each script now displays
usage information and a description of what it does.

update-claude-code already had --help support.
2026-01-13 17:18:46 -08:00
a14ff9be4d fix(flake): Remove duplicate home-manager imports from wixos and zix790prors
Some checks failed
CI / check (pull_request) Successful in 5m36s
CI / check (push) Has been cancelled
The nixosModules list already includes inputs.home-manager.nixosModules.home-manager,
so these individual configuration imports were redundant.
2026-01-13 16:37:41 -08:00
27 changed files with 866 additions and 278 deletions

@@ -16,3 +16,5 @@ jobs:
 - name: Check flake
   run: nix flake check
+  env:
+    NIX_CONFIG: "access-tokens = git.johnogle.info=${{ secrets.GITEA_ACCESS_TOKEN }}"

@@ -9,7 +9,7 @@ Directory Structure:
 ----------------------
 • packages/ - Custom Nix packages leveraged across various configurations.
 • roles/ - Role-based configurations (e.g., kodi, bluetooth) each with its own module (default.nix) for inclusion in machine setups.
-• machines/ - Machine-specific configurations (e.g., nix-book, z790prors, boxy, wixos) including configuration.nix and hardware-configuration.nix tailored for each hardware.
+• machines/ - Machine-specific configurations (e.g., nix-book, zix790prors, boxy) including configuration.nix and hardware-configuration.nix tailored for each hardware.
 • home/ - Home-manager configurations for personal environments and application settings (e.g., home-nix-book.nix, home-z790prors.nix).
 Design Principles:

@@ -14,7 +14,7 @@ This repository uses `beads` for issue tracking and management. Run `bd quicksta
 ### Flake Structure
 - **flake.nix**: Main entry point defining inputs (nixpkgs, home-manager, plasma-manager, etc.) and outputs for multiple NixOS configurations
-- **Machines**: `nix-book`, `boxy`, `wixos` (WSL configuration), `zix790prors`, `live-usb`, `johno-macbookpro` (Darwin/macOS)
+- **Machines**: `nix-book`, `boxy`, `zix790prors`, `live-usb`, `johno-macbookpro` (Darwin/macOS)
 - **Home configurations**: Standalone home-manager configuration for user `johno`
 ### Directory Structure
@@ -78,7 +78,6 @@ The repository also uses a modular home-manager role system for user-space confi
 - **nix-book**: Compact laptop → excludes office/media roles due to SSD space constraints
 - **boxy**: Living room media center → optimized for media consumption, excludes sync/office (shared machine)
 - **zix790prors**: All-purpose workstation → full desktop experience with all roles enabled
-- **wixos**: WSL2 development → full desktop experience, inherits from zix790prors Windows host
 - **live-usb**: Temporary environment → only base + desktop roles, no persistent services
 - **johno-macbookpro**: macOS work laptop → Darwin-specific configuration with development tools
@@ -111,7 +110,6 @@ darwin-rebuild build --flake .#johno-macbookpro
 - `nix-book`: Compact laptop with storage constraints, uses `home/home-laptop-compact.nix`
 - `boxy`: Shared living room media center/gaming desktop with AMD GPU, uses `home/home-media-center.nix`
 - `zix790prors`: Powerful all-purpose workstation (gaming, 3D modeling, development), dual-boots Windows 11 with shared btrfs /games partition, uses `home/home-desktop.nix`
-- `wixos`: WSL2 development environment running in Windows partition of zix790prors, uses `home/home-desktop.nix`
 - `live-usb`: Bootable ISO configuration, uses `home/home-live-usb.nix`
 - `johno-macbookpro`: macOS work laptop, uses `home/home-darwin-work.nix`

flake.lock (generated)

@@ -60,22 +60,6 @@
 "type": "github"
 }
 },
-"flake-compat": {
-"flake": false,
-"locked": {
-"lastModified": 1765121682,
-"narHash": "sha256-4VBOP18BFeiPkyhy9o4ssBNQEvfvv1kXkasAYd0+rrA=",
-"owner": "edolstra",
-"repo": "flake-compat",
-"rev": "65f23138d8d09a92e30f1e5c87611b23ef451bf3",
-"type": "github"
-},
-"original": {
-"owner": "edolstra",
-"repo": "flake-compat",
-"type": "github"
-}
-},
 "flake-utils": {
 "inputs": {
 "systems": "systems"
@@ -241,38 +225,18 @@
 "type": "github"
 }
 },
-"nixos-wsl": {
-"inputs": {
-"flake-compat": "flake-compat",
-"nixpkgs": "nixpkgs"
-},
-"locked": {
-"lastModified": 1765841014,
-"narHash": "sha256-55V0AJ36V5Egh4kMhWtDh117eE3GOjwq5LhwxDn9eHg=",
-"owner": "nix-community",
-"repo": "NixOS-WSL",
-"rev": "be4af8042e7a61fa12fda58fe9a3b3babdefe17b",
-"type": "github"
-},
-"original": {
-"owner": "nix-community",
-"ref": "main",
-"repo": "NixOS-WSL",
-"type": "github"
-}
-},
 "nixpkgs": {
 "locked": {
-"lastModified": 1765472234,
-"narHash": "sha256-9VvC20PJPsleGMewwcWYKGzDIyjckEz8uWmT0vCDYK0=",
-"owner": "NixOS",
+"lastModified": 1767480499,
+"narHash": "sha256-8IQQUorUGiSmFaPnLSo2+T+rjHtiNWc+OAzeHck7N48=",
+"owner": "nixos",
 "repo": "nixpkgs",
-"rev": "2fbfb1d73d239d2402a8fe03963e37aab15abe8b",
+"rev": "30a3c519afcf3f99e2c6df3b359aec5692054d92",
 "type": "github"
 },
 "original": {
-"owner": "NixOS",
-"ref": "nixos-unstable",
+"owner": "nixos",
+"ref": "nixos-25.11",
 "repo": "nixpkgs",
 "type": "github"
 }
@@ -293,22 +257,6 @@
 "type": "github"
 }
 },
-"nixpkgs_2": {
-"locked": {
-"lastModified": 1767480499,
-"narHash": "sha256-8IQQUorUGiSmFaPnLSo2+T+rjHtiNWc+OAzeHck7N48=",
-"owner": "nixos",
-"repo": "nixpkgs",
-"rev": "30a3c519afcf3f99e2c6df3b359aec5692054d92",
-"type": "github"
-},
-"original": {
-"owner": "nixos",
-"ref": "nixos-25.11",
-"repo": "nixpkgs",
-"type": "github"
-}
-},
 "plasma-manager": {
 "inputs": {
 "home-manager": [
@@ -364,8 +312,7 @@
 "jovian": "jovian",
 "nix-darwin": "nix-darwin",
 "nix-doom-emacs-unstraightened": "nix-doom-emacs-unstraightened",
-"nixos-wsl": "nixos-wsl",
-"nixpkgs": "nixpkgs_2",
+"nixpkgs": "nixpkgs",
 "nixpkgs-unstable": "nixpkgs-unstable",
 "plasma-manager": "plasma-manager",
 "plasma-manager-unstable": "plasma-manager-unstable"

@@ -4,7 +4,6 @@
 inputs = {
 nixpkgs.url = "github:nixos/nixpkgs/nixos-25.11";
 nixpkgs-unstable.url = "github:nixos/nixpkgs/nixos-unstable";
-nixos-wsl.url = "github:nix-community/NixOS-WSL/main";
 nix-darwin = {
 url = "github:nix-darwin/nix-darwin/nix-darwin-25.11";
@@ -55,7 +54,7 @@
 };
 };
-outputs = { self, nixpkgs, nixpkgs-unstable, nixos-wsl, ... } @ inputs: let
+outputs = { self, nixpkgs, nixpkgs-unstable, ... } @ inputs: let
 # Shared overlay function to reduce duplication across module sets
 # Parameters:
 # unstableOverlays: Additional overlays to apply when importing nixpkgs-unstable
@@ -84,6 +83,7 @@
 };
 };
 nixosModules = [
 ./roles
 inputs.home-manager.nixosModules.home-manager
@@ -157,24 +157,10 @@
 ];
 };
-nixosConfigurations.wixos = nixpkgs.lib.nixosSystem rec {
-system = "x86_64-linux";
-modules = nixosModules ++ [
-nixos-wsl.nixosModules.default
-./machines/wixos/configuration.nix
-inputs.home-manager.nixosModules.home-manager
-{
-home-manager.users.johno = import ./home/home-desktop.nix;
-home-manager.extraSpecialArgs = { inherit system; };
-}
-];
-};
 nixosConfigurations.zix790prors = nixpkgs.lib.nixosSystem rec {
 system = "x86_64-linux";
 modules = nixosModules ++ [
 ./machines/zix790prors/configuration.nix
-inputs.home-manager.nixosModules.home-manager
 {
 home-manager.users.johno = import ./home/home-desktop.nix;
 home-manager.extraSpecialArgs = { inherit system; };

@@ -4,6 +4,7 @@ with lib;
 let
 cfg = config.home.roles.communication;
+isLinux = pkgs.stdenv.isLinux;
 in
 {
 options.home.roles.communication = {
@@ -12,14 +13,14 @@ in
 config = mkIf cfg.enable {
 home.packages = [
-# Communication apps
+# For logging back into google chat (cross-platform)
+globalInputs.google-cookie-retrieval.packages.${system}.default
+] ++ optionals isLinux [
+# Linux-only communication apps (Electron apps don't build on Darwin)
 pkgs.element-desktop
 # Re-enabled in 25.11 after security issues were resolved
 pkgs.fluffychat
 pkgs.nextcloud-talk-desktop
-# For logging back into google chat
-globalInputs.google-cookie-retrieval.packages.${system}.default
 ];
 };
 }

@@ -4,6 +4,7 @@ with lib;
 let
 cfg = config.home.roles.desktop;
+isLinux = pkgs.stdenv.isLinux;
 in
 {
 options.home.roles.desktop = {
@@ -12,27 +13,29 @@ in
 config = mkIf cfg.enable {
 home.packages = with pkgs; [
-# Desktop applications
+# Cross-platform desktop applications
 bitwarden-desktop
-dunst
 keepassxc
+xdg-utils # XDG utilities for opening files/URLs with default applications
+] ++ optionals isLinux [
+# Linux-only desktop applications
+dunst
 unstable.ghostty
-# Desktop utilities
+# Linux-only desktop utilities
 feh # Image viewer and wallpaper setter for X11
 rofi # Application launcher for X11
 solaar # Logitech management software
 waybar
 wofi # Application launcher for Wayland
-xdg-utils # XDG utilities for opening files/URLs with default applications
-# System utilities with GUI components
+# Linux-only system utilities with GUI components
 (snapcast.override { pulseaudioSupport = true; })
-# KDE tiling window management
+# KDE tiling window management (Linux-only)
 kdePackages.krohnkite # Dynamic tiling extension for KWin 6
-# KDE PIM applications for email, calendar, and contacts
+# KDE PIM applications for email, calendar, and contacts (Linux-only)
 kdePackages.kmail
 kdePackages.kmail-account-wizard
 kdePackages.kmailtransport
@@ -40,33 +43,33 @@ in
 kdePackages.kaddressbook
 kdePackages.kontact
-# KDE System components needed for proper integration
+# KDE System components needed for proper integration (Linux-only)
 kdePackages.kded
 kdePackages.systemsettings
 kdePackages.kmenuedit
-# Desktop menu support
+# Desktop menu support (Linux-only)
 kdePackages.plasma-desktop # Contains applications.menu
-# KDE Online Accounts support
+# KDE Online Accounts support (Linux-only)
 kdePackages.kaccounts-integration
 kdePackages.kaccounts-providers
 kdePackages.signond
-# KDE Mapping
+# KDE Mapping (Linux-only)
 kdePackages.marble # Virtual globe and world atlas
-# KDE Productivity
+# KDE Productivity (Linux-only)
 kdePackages.kate # Advanced text editor with syntax highlighting
 kdePackages.okular # Universal document viewer (PDF, ePub, etc.)
 kdePackages.spectacle # Screenshot capture utility
 kdePackages.filelight # Visual disk usage analyzer
-# KDE Multimedia
+# KDE Multimedia (Linux-only)
 kdePackages.gwenview # Image viewer and basic editor
 kdePackages.elisa # Music player
-# KDE System Utilities
+# KDE System Utilities (Linux-only)
 kdePackages.ark # Archive manager (zip, tar, 7z, etc.)
 kdePackages.yakuake # Drop-down terminal emulator
 ];
@@ -77,51 +80,56 @@ in
 programs.spotify-player.enable = true;
-services.gnome-keyring = {
+# Linux-only: GNOME keyring service
+services.gnome-keyring = mkIf isLinux {
 enable = true;
 };
-# rbw vault unlock on login and resume from suspend
-systemd.user.services.rbw-unlock-on-login = {
-Unit = {
-Description = "Unlock rbw vault at login";
-After = [ "graphical-session.target" ];
-};
-Service = {
-Type = "oneshot";
-ExecStart = "${pkgs.rbw}/bin/rbw unlock";
-Environment = "RBW_AGENT=${pkgs.rbw}/bin/rbw-agent";
-# KillMode = "process" prevents systemd from killing the rbw-agent daemon
-# when this oneshot service completes. The agent is spawned by rbw unlock
-# and needs to persist after the service exits.
-KillMode = "process";
-};
-Install = {
-WantedBy = [ "graphical-session.target" ];
-};
-};
-systemd.user.services.rbw-unlock-on-resume = {
-Unit = {
-Description = "Unlock rbw vault after resume from suspend";
-After = [ "suspend.target" ];
-};
-Service = {
-Type = "oneshot";
-ExecStart = "${pkgs.rbw}/bin/rbw unlock";
-Environment = "RBW_AGENT=${pkgs.rbw}/bin/rbw-agent";
-# KillMode = "process" prevents systemd from killing the rbw-agent daemon
-# when this oneshot service completes. The agent is spawned by rbw unlock
-# and needs to persist after the service exits.
-KillMode = "process";
-};
-Install = {
-WantedBy = [ "suspend.target" ];
-};
-};
-# KDE environment variables for proper integration
-home.sessionVariables = {
+# Linux-only: systemd user services for rbw vault unlock
+systemd.user.services = mkIf isLinux {
+# rbw vault unlock on login
+rbw-unlock-on-login = {
+Unit = {
+Description = "Unlock rbw vault at login";
+After = [ "graphical-session.target" ];
+};
+Service = {
+Type = "oneshot";
+ExecStart = "${pkgs.rbw}/bin/rbw unlock";
+Environment = "RBW_AGENT=${pkgs.rbw}/bin/rbw-agent";
+# KillMode = "process" prevents systemd from killing the rbw-agent daemon
+# when this oneshot service completes. The agent is spawned by rbw unlock
+# and needs to persist after the service exits.
+KillMode = "process";
+};
+Install = {
+WantedBy = [ "graphical-session.target" ];
+};
+};
+# rbw vault unlock on resume from suspend
+rbw-unlock-on-resume = {
+Unit = {
+Description = "Unlock rbw vault after resume from suspend";
+After = [ "suspend.target" ];
+};
+Service = {
+Type = "oneshot";
+ExecStart = "${pkgs.rbw}/bin/rbw unlock";
+Environment = "RBW_AGENT=${pkgs.rbw}/bin/rbw-agent";
+# KillMode = "process" prevents systemd from killing the rbw-agent daemon
+# when this oneshot service completes. The agent is spawned by rbw unlock
+# and needs to persist after the service exits.
+KillMode = "process";
+};
+Install = {
+WantedBy = [ "suspend.target" ];
+};
+};
+};
+# Linux-only: KDE environment variables for proper integration
+home.sessionVariables = mkIf isLinux {
 QT_QPA_PLATFORMTHEME = "kde";
 KDE_SESSION_VERSION = "6";
 };
@@ -141,13 +149,14 @@ in
 "x-scheme-handler/https" = "firefox.desktop";
 };
 defaultApplications = {
-# Web browsers
+# Web browsers (cross-platform)
 "text/html" = "firefox.desktop";
 "x-scheme-handler/http" = "firefox.desktop";
 "x-scheme-handler/https" = "firefox.desktop";
 "x-scheme-handler/about" = "firefox.desktop";
 "x-scheme-handler/unknown" = "firefox.desktop";
+} // optionalAttrs isLinux {
+# Linux-only: KDE application associations
 # Documents
 "application/pdf" = "okular.desktop";
 "text/plain" = "kate.desktop";
@@ -190,9 +199,11 @@ in
 };
 };
-# Fix for KDE applications.menu file issue on Plasma 6
+# Linux-only: Fix for KDE applications.menu file issue on Plasma 6
 # KDE still looks for applications.menu but Plasma 6 renamed it to plasma-applications.menu
-xdg.configFile."menus/applications.menu".source = "${pkgs.kdePackages.plasma-workspace}/etc/xdg/menus/plasma-applications.menu";
+xdg.configFile."menus/applications.menu" = mkIf isLinux {
+source = "${pkgs.kdePackages.plasma-workspace}/etc/xdg/menus/plasma-applications.menu";
+};
 # Note: modules must be imported at top-level home config
 };

@@ -0,0 +1,317 @@
---
description: Batch research and planning for multiple beads with interactive question review
model: opus
---
# Beads Batch Research+Plan
This skill automates the common workflow of:
1. Running /beads_research in parallel for multiple beads
2. Presenting open questions interactively for user input (bead-by-bead)
3. Running /beads_plan for all researched beads (plus any spawned from splits)
## When to Use
- You have multiple beads ready for work
- You want to research and plan them efficiently before implementation
- You prefer to batch your question-answering rather than context-switching between skills
## Phase 1: Selection
1. **Get ready beads**: Run `bd ready --limit=20` to list beads with no blockers
2. **Filter already-researched beads**:
For each ready bead, check if it already has research:
```bash
ls thoughts/beads-{bead-id}/research.md 2>/dev/null
```
Categorize beads:
- **Needs research**: No `research.md` exists
- **Has research, needs plan**: `research.md` exists but no `plan.md`
- **Already planned**: Both `research.md` and `plan.md` exist
3. **Present selection**:
```
Ready beads available for batch research+plan:
NEEDS RESEARCH:
- {bead-id}: {title} (type: {type})
- ...
HAS RESEARCH (plan only):
- {bead-id}: {title} (type: {type})
- ...
ALREADY PLANNED (skip):
- {bead-id}: {title}
Which beads would you like to process?
```
4. **Use AskUserQuestion** with `multiSelect: true`:
- Include bead ID and title for each option
- Separate options by category
- Allow selection across categories
## Phase 2: Parallel Research
For each selected bead that NEEDS RESEARCH, launch a research subagent.
### Subagent Instructions Template
```
Research bead [BEAD_ID]: [BEAD_TITLE]
1. **Load bead context**:
```bash
bd show [BEAD_ID]
```
2. **Create artifact directory**:
```bash
mkdir -p thoughts/beads-[BEAD_ID]
```
3. **Conduct research** following beads_research.md patterns:
- Analyze and decompose the research question
- Spawn parallel sub-agent tasks (codebase-locator, codebase-analyzer, etc.)
- Synthesize findings
4. **Write research document** to `thoughts/beads-[BEAD_ID]/research.md`:
- Include frontmatter with metadata
- Document findings with file:line references
- **CRITICAL**: Include "## Open Questions" section listing any unresolved items
5. **Return summary**:
- Research status (complete/partial)
- Number of open questions
- Key findings summary (2-3 bullet points)
- List of open questions verbatim
```
### Launching Subagents
Use `subagent_type: "opus"` for research subagents (matches beads_research model setting).
Launch ALL research subagents in a single message for parallel execution:
```
<Task calls for each selected bead needing research - all in one message>
```
### Collecting Results
Wait for ALL research subagents to complete. Collect:
- Bead ID
- Research status
- Open questions list
- Any errors encountered
## Phase 3: Interactive Question Review
Present each bead's open questions sequentially for user input.
### For Each Bead (in order):
1. **Present research summary**:
```
## Bead {N}/{total}: {bead-id} - {title}
Research complete. Key findings:
- {finding 1}
- {finding 2}
Open questions requiring your input:
1. {question 1}
2. {question 2}
Additionally:
- Should this bead be split into multiple beads? (y/n)
- If split, describe the split:
```
2. **Collect user responses**:
- Answers to open questions
- Split decision (yes/no)
- If split: new bead titles and how to divide the work
3. **Handle splits**:
If user indicates a split:
```bash
# Create new beads for split work
bd create --title="{split title 1}" --type={type} --priority={priority} \
--description="{description based on user input}"
# Update original bead if scope narrowed
bd update {original-bead-id} --description="{updated description}"
```
Track new bead IDs for inclusion in planning phase.
4. **Update research document**:
Append user answers to `thoughts/beads-{id}/research.md`:
```markdown
## User Clarifications [{timestamp}]
Q: {question 1}
A: {user answer 1}
Q: {question 2}
A: {user answer 2}
## Bead Splits
{If split: description of split and new bead IDs}
```
### Progress Tracking
After each bead's questions are answered, confirm before moving to next:
```
Questions answered for {bead-id}. {N-1} beads remaining.
Continue to next bead? (y/n)
```
### Beads with No Questions
If a bead's research had no open questions:
```
## Bead {N}/{total}: {bead-id} - {title}
Research complete with no open questions.
Key findings:
- {finding 1}
- {finding 2}
Should this bead be split? (y/n)
```
## Phase 4: Parallel Planning
After all questions answered, launch planning subagents for all beads.
### Beads to Plan
Include:
- Original beads that were researched
- Beads that had existing research (from selection phase)
- New beads spawned from splits
### Subagent Instructions Template
```
Create implementation plan for bead [BEAD_ID]: [BEAD_TITLE]
1. **Load context**:
```bash
bd show [BEAD_ID]
```
2. **Read research** (it exists and has user clarifications):
Read `thoughts/beads-[BEAD_ID]/research.md` FULLY
3. **Create plan** following beads_plan.md patterns:
- Context gathering via sub-agents
- Design approach based on research findings and user clarifications
- **Skip interactive questions** - they were already answered in research review
4. **Write plan** to `thoughts/beads-[BEAD_ID]/plan.md`:
- Full plan structure with phases
- Success criteria (automated and manual)
- References to research document
5. **Update bead**:
```bash
bd update [BEAD_ID] --notes="Plan created: thoughts/beads-[BEAD_ID]/plan.md"
```
6. **Return summary**:
- Plan status (complete/failed)
- Number of phases
- Estimated complexity (small/medium/large)
- Any issues encountered
```
### Launching Subagents
Use `subagent_type: "opus"` for planning subagents (matches beads_plan model setting).
Launch ALL planning subagents in a single message:
```
<Task calls for each bead to plan - all in one message>
```
### Handling Beads Without Research
For beads that had existing research but user didn't review questions:
- Planning subagent reads existing research
- If research has unresolved open questions, subagent should flag this in its return
## Phase 5: Summary
After all planning completes, present final summary.
### Summary Format
```
## Batch Research+Plan Complete
### Successfully Processed:
| Bead | Title | Research | Plan | Phases | Complexity |
|------|-------|----------|------|--------|------------|
| {id} | {title} | Complete | Complete | 3 | medium |
| {id} | {title} | Complete | Complete | 2 | small |
### New Beads (from splits):
| Bead | Title | Parent | Status |
|------|-------|--------|--------|
| {new-id} | {title} | {parent-id} | Planned |
### Failed:
| Bead | Title | Phase Failed | Error |
|------|-------|--------------|-------|
| {id} | {title} | Research | Timeout |
### Next Steps:
1. Review plans at `thoughts/beads-{id}/plan.md`
2. Run `/parallel_beads` to implement all planned beads
3. Or run `/beads_implement {id}` for individual implementation
### Artifacts Created:
- Research: thoughts/beads-{id}/research.md (x{N} files)
- Plans: thoughts/beads-{id}/plan.md (x{N} files)
```
## Error Handling
### Research Subagent Failure
- Log the failure with bead ID and error
- Continue with other beads
- Exclude failed beads from question review and planning
- Report in final summary
### Planning Subagent Failure
- Log the failure with bead ID and error
- Research still valid - can retry planning manually
- Report in final summary
### User Cancellation During Question Review
- Save progress to bead notes
- Report which beads were completed
- User can resume with remaining beads in new session
### Split Bead Creation Failure
- Report error but continue with original bead
- User can manually create split beads later
## Resource Limits
- Maximum concurrent research subagents: 5
- Maximum concurrent planning subagents: 5
- If more beads selected, process in batches
## Notes
- This skill is designed for the "research+plan before implementation" workflow
- Pairs well with `/parallel_beads` for subsequent implementation
- Run `/reconcile_beads` after implementation PRs merge

@@ -54,6 +54,8 @@ When this command is invoked:
 - Read `thoughts/beads-{bead-id}/plan.md` FULLY
 - Check for any existing checkmarks (- [x]) indicating partial progress
 - Read any research at `thoughts/beads-{bead-id}/research.md`
+- If plan's Success Criteria references contribution guidelines (e.g., "Per CONTRIBUTING.md:"),
+  verify the original CONTRIBUTING.md still exists and requirements are current
 5. **Mark bead in progress** (if not already):
 ```bash
@@ -127,6 +129,10 @@ All phases completed and automated verification passed:
 - {List manual verification items from plan}
 Let me know when manual testing is complete so I can close the bead.
+**Contribution guidelines compliance:**
+- {List any contribution guideline requirements that were part of Success Criteria}
+- {Note if any requirements could not be automated and need manual review}
 ```
 **STOP HERE and wait for user confirmation.**

@@ -51,13 +51,32 @@ When this command is invoked:
 - Any linked tickets or docs
 - Use Read tool WITHOUT limit/offset
-2. **Spawn initial research tasks**:
+2. **Check for contribution guidelines**:
+```bash
+# Check standard locations for contribution guidelines
+for f in CONTRIBUTING.md .github/CONTRIBUTING.md docs/CONTRIBUTING.md; do
+  if [ -f "$f" ]; then
+    echo "Found: $f"
+    break
+  fi
+done
+```
+If found:
+- Read the file fully
+- Extract actionable requirements (testing, code style, documentation, PR conventions)
+- These requirements MUST be incorporated into the plan's Success Criteria
+If not found, note "No contribution guidelines found" and proceed.
+3. **Spawn initial research tasks**:
 - **codebase-locator**: Find all files related to the task
 - **codebase-analyzer**: Understand current implementation
 - **codebase-pattern-finder**: Find similar features to model after
 - **thoughts-locator**: Find any existing plans or decisions
-3. **Read all files identified by research**:
+4. **Read all files identified by research**:
 - Read them FULLY into main context
 - Cross-reference with requirements
@@ -273,6 +292,12 @@ Always separate into two categories:
 - Performance under real conditions
 - Edge cases hard to automate
+**From Contribution Guidelines** (if CONTRIBUTING.md exists):
+- Include any testing requirements specified in guidelines
+- Include any code style/linting requirements
+- Include any documentation requirements
+- Reference the guideline: "Per CONTRIBUTING.md: {requirement}"
 ## Example Invocation
 ```

@@ -51,6 +51,18 @@ When this command is invoked:
 - Use the Read tool WITHOUT limit/offset parameters
 - Read these files yourself in the main context before spawning sub-tasks
+### Step 1.5: Check for contribution guidelines
+Before spawning sub-agents, check if the repository has contribution guidelines:
+```bash
+for f in CONTRIBUTING.md .github/CONTRIBUTING.md docs/CONTRIBUTING.md; do
+  if [ -f "$f" ]; then echo "Found: $f"; break; fi
+done
+```
+If found, read the file and note key requirements. These should be included in the research document under a "## Contribution Guidelines" section if relevant to the research question.
 ### Step 2: Analyze and decompose the research question
 - Break down the query into composable research areas
 - Identify specific components, patterns, or concepts to investigate
@@ -143,6 +155,10 @@ status: complete
 ## Architecture Documentation
 {Current patterns, conventions found in codebase}
+## Contribution Guidelines
+{If CONTRIBUTING.md exists, summarize key requirements relevant to the research topic}
+{If no guidelines found, omit this section}
 ## Historical Context (from thoughts/)
 {Relevant insights from thoughts/ with references}

@@ -42,7 +42,46 @@ AskUserQuestion with:
 - options from filtered bd ready output
 ```
-## Phase 2: Parallel Implementation
+## Phase 2: Worktree Setup
+Before launching implementation subagents, create worktrees for all selected beads:
+1. **Get repository name**:
+```bash
+REPO_NAME=$(git remote get-url origin | sed 's|.*/||' | sed 's/\.git$//')
+```
+2. **For each selected bead**, create its worktree:
+```bash
+BEAD_ID="[bead-id]"
+# Check if worktree already exists
+if [ -d "$HOME/wt/${REPO_NAME}/${BEAD_ID}" ]; then
+  echo "Worktree already exists: ~/wt/${REPO_NAME}/${BEAD_ID}"
+  # Ask user: remove and recreate, or skip this bead?
+else
+  git worktree add -b "bead/${BEAD_ID}" "$HOME/wt/${REPO_NAME}/${BEAD_ID}"
+fi
+```
+3. **Track created worktrees**:
+Maintain a list of (bead_id, worktree_path) pairs for use in subagent instructions.
+4. **Report status**:
+```
+Created worktrees:
+- nixos-configs-abc → ~/wt/nixos-configs/nixos-configs-abc (branch: bead/nixos-configs-abc)
+- nixos-configs-xyz → ~/wt/nixos-configs/nixos-configs-xyz (branch: bead/nixos-configs-xyz)
+Skipped (existing worktree):
+- nixos-configs-123 → Ask user for resolution
+```
+**Note**: If a worktree or branch already exists, ask the user before proceeding:
+- Remove existing worktree and branch, then recreate
+- Skip this bead
+- Use existing worktree as-is (risky - branch may have diverged)
+## Phase 3: Parallel Implementation
 For each selected bead, launch a subagent using the Task tool. All subagents should be launched in parallel (single message with multiple Task tool calls).
@@ -53,33 +92,62 @@ Each implementation subagent should receive these instructions:
 ```
 Work on bead [BEAD_ID]: [BEAD_TITLE]
-1. **Create worktree**:
-  - Branch name: `bead/[BEAD_ID]`
-  - Worktree path: `~/wt/[REPO_NAME]/[BEAD_ID]`
-  - Command: `git worktree add -b bead/[BEAD_ID] ~/wt/[REPO_NAME]/[BEAD_ID]`
-2. **Review the bead requirements**:
+Worktree path: [WORKTREE_PATH]
+## CRITICAL: Branch Verification (MUST DO FIRST)
+1. **Navigate to worktree**:
+```bash
+cd [WORKTREE_PATH]
+```
+2. **Verify branch** (MANDATORY before ANY modifications):
+```bash
+CURRENT_BRANCH=$(git branch --show-current)
+echo "Current branch: $CURRENT_BRANCH"
+pwd
+```
+**ABORT CONDITIONS** - If ANY of these are true, STOP IMMEDIATELY:
+- Branch is `main` or `master`
+- Branch does not match `bead/[BEAD_ID]`
+If you detect any abort condition:
+```
+ABORTING: Branch verification failed.
+Expected branch: bead/[BEAD_ID]
+Actual branch: [CURRENT_BRANCH]
+Working directory: [pwd output]
+DO NOT PROCEED. Report this error to the orchestrator.
+```
+## After Verification Passes
+3. **Review the bead requirements**:
 - Run `bd show [BEAD_ID]` to understand the acceptance criteria
 - Note any external issue references (GitHub issues, Linear tickets, etc.)
-3. **Extract validation criteria**:
+4. **Extract validation criteria**:
 - Check for a plan: `thoughts/beads-[BEAD_ID]/plan.md`
 - If plan exists:
 - Read the plan and find the "Automated Verification" section
 - Extract each verification command (lines starting with `- [ ]` followed by a command)
 - Example: `- [ ] Tests pass: \`make test\`` → extract `make test`
+- Note any "Per CONTRIBUTING.md:" requirements for additional validation
 - If no plan exists, use best-effort validation:
 - Check if `Makefile` exists → try `make test` and `make lint`
 - Check if `flake.nix` exists → try `nix flake check`
 - Check if `package.json` exists → try `npm test`
+- **Check for CONTRIBUTING.md** → read and extract testing/linting requirements
 - If none found, note "No validation criteria found"
-4. **Implement the changes**:
+5. **Implement the changes**:
 - Work in the worktree directory
 - Complete all acceptance criteria listed in the bead
 After implementation, run validation:
-- Execute each validation command from step 3
+- Execute each validation command from step 4
 - Track results in this format:
 ```
 VALIDATION_RESULTS:
@@ -91,12 +159,12 @@ Work on bead [BEAD_ID]: [BEAD_TITLE]
 - Continue with PR creation (don't block)
 - Document failures in bead notes: `bd update [BEAD_ID] --notes="Validation failures: [list]"`
-5. **Commit and push**:
+6. **Commit and push**:
 - Stage all changes: `git add -A`
 - Create a descriptive commit message
 - Push the branch: `git push -u origin bead/[BEAD_ID]`
-6. **Create a PR**:
+7. **Create a PR**:
 - Detect hosting provider from origin URL: `git remote get-url origin`
 - If URL contains `github.com`, use `gh`; otherwise use `tea` (Gitea/Forgejo)
 - PR title: "[BEAD_ID] [BEAD_TITLE]"
@@ -156,13 +224,13 @@ Work on bead [BEAD_ID]: [BEAD_TITLE]
 | nix flake check | SKIP | command not found |"
 ```
-7. **Update bead status**:
+8. **Update bead status**:
 - Mark the bead as "in_review": `bd update [BEAD_ID] --status=in_review`
 - Add the PR URL to the bead notes: `bd update [BEAD_ID] --notes="$(bd show [BEAD_ID] --json | jq -r '.notes')
 PR: [PR_URL]"`
-8. **Report results**:
+9. **Report results**:
 - Return:
 - PR URL
 - Bead ID
@@ -175,15 +243,24 @@ PR: [PR_URL]"`
 ### Launching Subagents
+For each bead, substitute into the template:
+- `[BEAD_ID]` - the bead ID
+- `[BEAD_TITLE]` - the bead title
+- `[WORKTREE_PATH]` - the worktree path created in Phase 2
 Use `subagent_type: "general-purpose"` for implementation subagents. Launch all selected beads' subagents in a single message for parallel execution:
 ```
 <Task calls for each selected bead - all in one message>
 ```
+**Important**: The worktree paths were created in Phase 2. Use the exact paths that were created, e.g.:
+- `~/wt/nixos-configs/nixos-configs-abc`
+- `~/wt/nixos-configs/nixos-configs-xyz`
 Collect results from all subagents before proceeding.
-## Phase 3: Parallel Review
+## Phase 4: Parallel Review
 After all implementation subagents complete, launch review subagents for each PR.
@@ -218,7 +295,7 @@ Review PR for bead [BEAD_ID]
 Launch all review subagents in parallel.
-## Phase 4: Cleanup and Summary
+## Phase 5: Cleanup and Summary
 After reviews complete:
@@ -264,9 +341,21 @@ Example output:
 ## Error Handling
+- **Worktree creation failures** (Phase 2):
+  - If `git worktree add` fails (branch exists, path exists), prompt user:
+    - Remove existing and retry
+    - Skip this bead
+    - Use existing (with warning about potential divergence)
+  - Do NOT proceed to subagent launch until worktree is confirmed
+- **Branch verification failures** (subagent reports):
+  - If subagent reports it's on `main` or `master`, do NOT retry
+  - Mark bead as failed with reason "Branch verification failed"
+  - Continue with other beads but flag this as a critical issue
+  - Investigation required: the worktree may have been corrupted or not created properly
 - **Subagent failures**: If a subagent fails or times out, note it in the summary but continue with other beads
 - **PR creation failures**: Report the error but continue with reviews of successful PRs
-- **Worktree conflicts**: If a worktree already exists, ask the user if they want to remove it or skip that bead
 ## Resource Limits

@@ -4,12 +4,13 @@ description: Reconcile beads with merged PRs and close completed beads
 # Reconcile Beads Workflow
-This skill reconciles beads that are in `in_review` status with their corresponding PRs. If a PR has been merged, the bead is closed.
+This skill reconciles beads that are in `in_review` status with their corresponding PRs. If a PR has been merged, the bead is closed and any linked Gitea issue is also closed.
 ## Prerequisites
 - Custom status `in_review` must be configured: `bd config set status.custom "in_review"`
 - Beads in `in_review` status should have a PR URL in their notes
+- `tea` CLI must be configured for closing Gitea issues
 ## Workflow
@@ -52,6 +53,34 @@ If the PR is merged:
 bd close [BEAD_ID] --reason="PR merged: [PR_URL]"
 ```
+### Step 3.1: Close corresponding Gitea issue (if any)
+After closing a bead, check if it has a linked Gitea issue:
+1. **Check for Gitea issue URL in bead notes**:
+Look for the pattern `Gitea issue: <URL>` in the notes. Extract the URL.
+2. **Extract issue number from URL**:
+```bash
+# Example: https://git.johnogle.info/johno/nixos-configs/issues/16 -> 16
+echo "$GITEA_URL" | grep -oP '/issues/\K\d+'
+```
+3. **Close the Gitea issue**:
+```bash
+tea issues close [ISSUE_NUMBER]
+```
+4. **Handle errors gracefully**:
+- If issue is already closed: Log warning, continue
+- If issue not found: Log warning, continue
+- If `tea` fails: Log error, continue with other beads
+Example warning output:
+```
+Warning: Could not close Gitea issue #16: issue already closed
+```
 ### Step 4: Report summary
 Present results:
@@ -60,10 +89,17 @@ Present results:
 ## Beads Reconciliation Summary
 ### Closed (PR Merged)
-| Bead | PR | Title |
-|------|-----|-------|
-| beads-abc | #123 | Feature X |
-| beads-xyz | #456 | Bug fix Y |
+| Bead | PR | Gitea Issue | Title |
+|------|-----|-------------|-------|
+| beads-abc | #123 | #16 closed | Feature X |
+| beads-xyz | #456 | (none) | Bug fix Y |
+### Gitea Issues Closed
+| Issue | Bead | Status |
+|-------|------|--------|
+| #16 | beads-abc | Closed successfully |
+| #17 | beads-def | Already closed (skipped) |
+| #99 | beads-ghi | Error: issue not found |
 ### Still in Review
 | Bead | PR | Status | Title |
@@ -80,9 +116,14 @@ Present results:
 - **Missing PR URL**: Skip the bead and report it
 - **PR not found**: Report the error but continue with other beads
 - **API errors**: Report and continue
+- **Gitea issue already closed**: Log warning, continue (not an error)
+- **Gitea issue not found**: Log warning, continue (issue may have been deleted)
+- **No Gitea issue linked**: Normal case, no action needed
+- **tea command fails**: Log error with output, continue with other beads
 ## Notes
 - This skill complements `/parallel_beads` which sets beads to `in_review` status
 - Run this skill periodically or after merging PRs to keep beads in sync
 - Beads with closed (but not merged) PRs are not automatically closed - they may need rework
+- Gitea issues are only closed for beads that have a `Gitea issue: <URL>` in their notes

@@ -225,11 +225,16 @@
 mu4e-headers-time-format "%H:%M")
 ;; Sending mail via msmtp
-(setq message-send-mail-function 'message-send-mail-with-sendmail
-      sendmail-program (executable-find "msmtp")
-      message-sendmail-envelope-from 'header
-      mail-envelope-from 'header
-      mail-specify-envelope-from t))
+;; NOTE: message-sendmail-f-is-evil and --read-envelope-from are required
+;; to prevent msmtp from stripping the email body when processing headers.
+;; Without these, multipart messages (especially from org-msg) may arrive
+;; with empty bodies.
+(setq sendmail-program (executable-find "msmtp")
+      send-mail-function #'message-send-mail-with-sendmail
+      message-send-mail-function #'message-send-mail-with-sendmail
+      message-sendmail-f-is-evil t
+      message-sendmail-extra-arguments '("--read-envelope-from")
+      message-sendmail-envelope-from 'header))
 ;; Whenever you reconfigure a package, make sure to wrap your config in an
 ;; `after!' block, otherwise Doom's defaults may override your settings. E.g.

@@ -4,6 +4,7 @@ with lib;
 let
 cfg = config.home.roles.email;
+isLinux = pkgs.stdenv.isLinux;
 in
 {
 options.home.roles.email = {
@@ -89,34 +90,38 @@ in
 account default : proton
 '';
-# Systemd service for mail sync
-systemd.user.services.mbsync = {
-Unit = {
-Description = "Mailbox synchronization service";
-After = [ "network-online.target" ];
-Wants = [ "network-online.target" ];
-};
-Service = {
-Type = "oneshot";
-ExecStart = "${pkgs.bash}/bin/bash -c 'mkdir -p ~/Mail && ${pkgs.isync}/bin/mbsync -a && (${pkgs.mu}/bin/mu info >/dev/null 2>&1 || ${pkgs.mu}/bin/mu init --maildir ~/Mail --personal-address=john@ogle.fyi) && ${pkgs.mu}/bin/mu index'";
-Environment = "PATH=${pkgs.rbw}/bin:${pkgs.coreutils}/bin";
-StandardOutput = "journal";
-StandardError = "journal";
+# Linux-only: Systemd service for mail sync (Darwin uses launchd instead)
+systemd.user.services = mkIf isLinux {
+mbsync = {
+Unit = {
+Description = "Mailbox synchronization service";
+After = [ "network-online.target" ];
+Wants = [ "network-online.target" ];
+};
+Service = {
+Type = "oneshot";
+ExecStart = "${pkgs.bash}/bin/bash -c 'mkdir -p ~/Mail && ${pkgs.isync}/bin/mbsync -a && (${pkgs.mu}/bin/mu info >/dev/null 2>&1 || ${pkgs.mu}/bin/mu init --maildir ~/Mail --personal-address=john@ogle.fyi) && ${pkgs.mu}/bin/mu index'";
+Environment = "PATH=${pkgs.rbw}/bin:${pkgs.coreutils}/bin";
+StandardOutput = "journal";
+StandardError = "journal";
+};
 };
 };
-# Systemd timer for automatic sync
-systemd.user.timers.mbsync = {
-Unit = {
-Description = "Mailbox synchronization timer";
-};
-Timer = {
-OnBootSec = "2min";
-OnUnitActiveSec = "5min";
-Unit = "mbsync.service";
-};
-Install = {
-WantedBy = [ "timers.target" ];
+# Linux-only: Systemd timer for automatic sync
+systemd.user.timers = mkIf isLinux {
+mbsync = {
+Unit = {
+Description = "Mailbox synchronization timer";
+};
+Timer = {
+OnBootSec = "2min";
+OnUnitActiveSec = "5min";
+Unit = "mbsync.service";
+};
+Install = {
+WantedBy = [ "timers.target" ];
+};
 };
 };
 };

@@ -4,13 +4,15 @@ with lib;
 let
 cfg = config.home.roles.kdeconnect;
+isLinux = pkgs.stdenv.isLinux;
 in
 {
 options.home.roles.kdeconnect = {
 enable = mkEnableOption "Enable KDE Connect for device integration";
 };
-config = mkIf cfg.enable {
+# KDE Connect services are Linux-only (requires D-Bus and systemd)
+config = mkIf (cfg.enable && isLinux) {
 services.kdeconnect = {
 enable = true;
 indicator = true;

@@ -4,6 +4,7 @@ with lib;
 let
 cfg = config.home.roles.sync;
+isLinux = pkgs.stdenv.isLinux;
 in
 {
 options.home.roles.sync = {
@@ -11,9 +12,10 @@ in
 };
 config = mkIf cfg.enable {
-home.packages = with pkgs; [
+# Linux-only: syncthingtray requires system tray support
+home.packages = optionals isLinux (with pkgs; [
 syncthingtray
-];
+]);
 services.syncthing = {
 enable = true;

@@ -1,56 +0,0 @@
# Edit this configuration file to define what should be installed on
# your system. Help is available in the configuration.nix(5) man page, on
# https://search.nixos.org/options and in the NixOS manual (`nixos-help`).
# NixOS-WSL specific options are documented on the NixOS-WSL repository:
# https://github.com/nix-community/NixOS-WSL
{ config, lib, pkgs, ... }:
{
imports = [
];
roles = {
audio.enable = true;
desktop = {
enable = true;
wayland = true;
};
nvidia = {
enable = true;
package = "latest";
graphics.extraPackages = with pkgs; [
mesa
libvdpau-va-gl
libva-vdpau-driver
];
};
users.enable = true;
};
networking.hostName = "wixos";
wsl.enable = true;
wsl.defaultUser = "johno";
wsl.startMenuLaunchers = true;
wsl.useWindowsDriver = true;
wsl.wslConf.network.hostname = "wixos";
wsl.wslConf.user.default = "johno";
# WSL-specific environment variables for graphics
environment.sessionVariables = {
LD_LIBRARY_PATH = [
"/usr/lib/wsl/lib"
"/run/opengl-driver/lib"
];
};
# This value determines the NixOS release from which the default
# settings for stateful data, like file locations and database versions
# on your system were taken. It's perfectly fine and recommended to leave
# this value at the release version of the first install of this system.
# Before changing this value read the documentation for this option
# (e.g. man configuration.nix or on https://nixos.org/nixos/options.html).
system.stateVersion = "24.05"; # Did you read the comment?
}

@@ -3,4 +3,5 @@
 tea-rbw = pkgs.callPackage ./tea-rbw {};
 app-launcher-server = pkgs.callPackage ./app-launcher-server {};
 claude-code = pkgs.callPackage ./claude-code {};
+mcrcon-rbw = pkgs.callPackage ./mcrcon-rbw {};
 }

@@ -0,0 +1,40 @@
{ pkgs, ... }:

pkgs.writeShellScriptBin "mcrcon" ''
  set -euo pipefail

  # Configuration - can be overridden with environment variables
  MINECRAFT_RCON_HOST="''${MCRCON_HOST:-10.0.0.165}"
  MINECRAFT_RCON_PORT="''${MCRCON_PORT:-25575}"
  RBW_ENTRY="minecraft-rcon"

  # Check if rbw is available
  if ! command -v rbw &> /dev/null; then
    echo "Error: rbw is not available. Please ensure rbw is installed and configured."
    exit 1
  fi

  # Retrieve password from Bitwarden
  if ! MCRCON_PASS=$(rbw get "$RBW_ENTRY" 2>/dev/null); then
    echo "Error: Failed to retrieve RCON password from rbw entry '$RBW_ENTRY'"
    echo "Please ensure the entry exists in Bitwarden and rbw is synced."
    echo ""
    echo "To create the entry:"
    echo " 1. Add 'minecraft-rcon' to Bitwarden with the RCON password"
    echo " 2. Run 'rbw sync' to refresh the local cache"
    exit 1
  fi

  # Export for mcrcon
  export MCRCON_HOST="$MINECRAFT_RCON_HOST"
  export MCRCON_PORT="$MINECRAFT_RCON_PORT"
  export MCRCON_PASS

  # If no arguments provided, start interactive terminal mode
  if [[ $# -eq 0 ]]; then
    exec ${pkgs.mcrcon}/bin/mcrcon -t
  fi

  # Execute mcrcon with all provided arguments
  exec ${pkgs.mcrcon}/bin/mcrcon "$@"
''

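A host or user profile could then expose the wrapper alongside rbw. The wiring below is a hedged sketch: how the package is brought into scope depends on the flake's overlay, so the bare mcrcon-rbw reference is illustrative.

# Illustrative wiring: assumes the mcrcon-rbw package defined above is in
# scope; rbw must also be installed, configured, and logged in.
environment.systemPackages = [
  mcrcon-rbw   # provides the `mcrcon` wrapper shown above
  pkgs.rbw     # Bitwarden CLI used for the password lookup
];
# Interactive session:   mcrcon
# One-off command:       mcrcon "list"
# Different server:      MCRCON_HOST=10.0.0.200 MCRCON_PORT=25576 mcrcon "list"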
renovate.json Normal file
View File

@@ -0,0 +1,38 @@
{
  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
  "timezone": "America/Los_Angeles",
  "nix": {
    "enabled": true
  },
  "lockFileMaintenance": {
    "enabled": true,
    "schedule": ["before 5am on monday"]
  },
  "dependencyDashboard": true,
  "dependencyDashboardTitle": "NixOS Configs Dependency Dashboard",
  "packageRules": [
    {
      "description": "Group all GitHub Actions updates",
      "matchManagers": ["github-actions"],
      "groupName": "github-actions"
    },
    {
      "description": "Group stable NixOS ecosystem inputs",
      "matchManagers": ["nix"],
      "matchPackagePatterns": ["^nixpkgs$", "^home-manager$", "^nix-darwin$"],
      "groupName": "nix-stable-ecosystem"
    },
    {
      "description": "Group unstable NixOS ecosystem inputs",
      "matchManagers": ["nix"],
      "matchPackagePatterns": ["nixpkgs-unstable", "home-manager-unstable"],
      "groupName": "nix-unstable-ecosystem"
    },
    {
      "description": "Ignore private Gitea inputs (handle separately)",
      "matchManagers": ["nix"],
      "matchPackagePatterns": ["google-cookie-retrieval"],
      "enabled": false
    }
  ]
}

View File

@@ -8,6 +8,21 @@ in
 {
   options.roles.nfs-mounts = {
     enable = mkEnableOption "Enable default NFS mounts";
+    server = mkOption {
+      type = types.str;
+      default = "10.0.0.43";
+      description = "IP address or hostname of the NFS server";
+    };
+    remotePath = mkOption {
+      type = types.str;
+      default = "/media";
+      description = "Remote path to mount from the NFS server";
+    };
+    mountPoint = mkOption {
+      type = types.str;
+      default = "/media";
+      description = "Local mount point for the NFS share";
+    };
     # TODO: implement requireMount
     requireMount = mkOption {
       type = types.bool;
@@ -18,8 +33,8 @@ in
   config = mkIf cfg.enable
     {
-      fileSystems."/media" = {
-        device = "10.0.0.43:/media";
+      fileSystems.${cfg.mountPoint} = {
+        device = "${cfg.server}:${cfg.remotePath}";
         fsType = "nfs";
         options = [
           "defaults"

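A host that mounts a different export can now override the defaults instead of editing the role. The values below are illustrative, not taken from any machine in this repo:

# Hypothetical per-host settings using the options added above:
roles.nfs-mounts = {
  enable = true;
  server = "192.168.1.20";
  remotePath = "/export/media";
  mountPoint = "/mnt/media";
};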
View File

@@ -8,6 +8,21 @@ in
 {
   options.roles.printing = {
     enable = mkEnableOption "Enable default printing setup";
+    printerName = mkOption {
+      type = types.str;
+      default = "MFC-L8900CDW_series";
+      description = "Name for the default printer";
+    };
+    printerUri = mkOption {
+      type = types.str;
+      default = "ipp://brother.oglehome/ipp/print";
+      description = "Device URI for the default printer (e.g., ipp://hostname/ipp/print)";
+    };
+    printerModel = mkOption {
+      type = types.str;
+      default = "everywhere";
+      description = "PPD model for the printer (use 'everywhere' for driverless IPP)";
+    };
   };

   config = mkIf cfg.enable
@@ -21,11 +36,11 @@ in
     };

     hardware.printers.ensurePrinters = [{
-      name = "MFC-L8900CDW_series";
-      deviceUri = "ipp://brother.oglehome/ipp/print";
-      model = "everywhere";
+      name = cfg.printerName;
+      deviceUri = cfg.printerUri;
+      model = cfg.printerModel;
     }];
-    hardware.printers.ensureDefaultPrinter = "MFC-L8900CDW_series";
+    hardware.printers.ensureDefaultPrinter = cfg.printerName;

     # Fix ensure-printers service to wait for network availability
     systemd.services.ensure-printers = {

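As with the NFS role, a host with different hardware can override the printer without touching the role. The printer name and URI below are examples only:

# Hypothetical host overrides for a non-Brother printer:
roles.printing = {
  enable = true;
  printerName = "HP_LaserJet";
  printerUri = "ipp://printer.lan/ipp/print";
  printerModel = "everywhere";  # driverless IPP
};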
View File

@@ -8,6 +8,11 @@ in
 {
   options.roles.virtualisation = {
     enable = mkEnableOption "Enable virtualisation";
+    dockerUsers = mkOption {
+      type = types.listOf types.str;
+      default = [ "johno" ];
+      description = "List of users to add to the docker group";
+    };
   };

   config = mkIf cfg.enable
@@ -15,6 +20,6 @@ in
     virtualisation.libvirtd.enable = true;
     programs.virt-manager.enable = true;
     virtualisation.docker.enable = true;
-    users.extraGroups.docker.members = [ "johno" ];
+    users.extraGroups.docker.members = cfg.dockerUsers;
   };
 }

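Docker group membership is now per-host as well. A hedged sketch; "alice" is an example user, not one defined in this repo:

# Hypothetical host override granting docker access to an extra user:
roles.virtualisation = {
  enable = true;
  dockerUsers = [ "johno" "alice" ];
};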
View File

@@ -1,6 +1,30 @@
 #!/usr/bin/env bash
 set -euo pipefail

+# Parse arguments
+while [[ $# -gt 0 ]]; do
+  case $1 in
+    --help|-h)
+      echo "Usage: $0 [OPTIONS]"
+      echo ""
+      echo "Rotate to the next wallpaper in the configured list."
+      echo ""
+      echo "This script increments the currentIndex in home/wallpapers/default.nix,"
+      echo "cycling through available wallpapers. Rebuild your system to apply"
+      echo "the new wallpaper."
+      echo ""
+      echo "Options:"
+      echo "  --help, -h    Show this help message"
+      exit 0
+      ;;
+    *)
+      echo "Unknown option: $1"
+      echo "Use --help for usage information"
+      exit 1
+      ;;
+  esac
+done
+
 # Colors for output
 RED='\033[0;31m'
 GREEN='\033[0;32m'

View File

@@ -1,6 +1,30 @@
 #!/usr/bin/env bash
 set -euo pipefail

+# Parse arguments
+while [[ $# -gt 0 ]]; do
+  case $1 in
+    --help|-h)
+      echo "Usage: $0 [OPTIONS]"
+      echo ""
+      echo "Update Doom Emacs to the latest commit from the doomemacs repository."
+      echo ""
+      echo "This script fetches the latest commit SHA from the default branch,"
+      echo "updates the rev and sha256 in home/roles/emacs/default.nix, and"
+      echo "prepares the configuration for a system rebuild."
+      echo ""
+      echo "Options:"
+      echo "  --help, -h    Show this help message"
+      exit 0
+      ;;
+    *)
+      echo "Unknown option: $1"
+      echo "Use --help for usage information"
+      exit 1
+      ;;
+  esac
+done
+
 # Colors for output
 RED='\033[0;31m'
 GREEN='\033[0;32m'

View File

@@ -1,6 +1,35 @@
 #!/usr/bin/env bash
 set -euo pipefail

+# Parse arguments
+while [[ $# -gt 0 ]]; do
+  case $1 in
+    --help|-h)
+      echo "Usage: $0 [OPTIONS]"
+      echo ""
+      echo "Perform a major upgrade of the NixOS configuration."
+      echo ""
+      echo "This script runs the following steps:"
+      echo "  1. Update all flake inputs (nix flake update)"
+      echo "  2. Update Doom Emacs to the latest commit"
+      echo "  3. Update Claude Code to the latest version"
+      echo "  4. Rotate to the next wallpaper"
+      echo ""
+      echo "After completion, review changes with 'git diff' and rebuild"
+      echo "your system with 'sudo nixos-rebuild switch --flake .'"
+      echo ""
+      echo "Options:"
+      echo "  --help, -h    Show this help message"
+      exit 0
+      ;;
+    *)
+      echo "Unknown option: $1"
+      echo "Use --help for usage information"
+      exit 1
+      ;;
+  esac
+done
+
 # Colors for output
 RED='\033[0;31m'
 GREEN='\033[0;32m'