---
description: How to create effective implementation plans with phased delivery and clear success criteria
---

# Planning

Create implementation plans that enable incremental, verifiable progress.

## Core Principles

1. **Incremental delivery**: Each phase should produce working, testable changes
2. **Clear checkpoints**: Success criteria that can be verified without ambiguity
3. **Buy-in before detail**: Confirm understanding and approach before writing specifics
4. **Explicit scope**: State what we're NOT doing to prevent scope creep

## Plan Document Structure

```markdown
# {Feature} Implementation Plan

## Overview
{1-2 sentences: what we're building and why}

## Current State Analysis
{What exists now, key constraints, file:line references}

## Desired End State
{Specification of outcome and how to verify it}

## What We're NOT Doing
{Explicit out-of-scope items}

## Phase 1: {Descriptive Name}
### Overview
{What this phase accomplishes - should be independently valuable}

### Changes Required
{Specific files and modifications with code snippets}

### Success Criteria
#### Automated Verification
- [ ] Tests pass: `{test command}`
- [ ] Lint passes: `{lint command}`

#### Manual Verification
- [ ] {Human-observable outcome}

## Testing Strategy
{Unit tests, integration tests, manual testing steps}

## References
{Links to research, related files, similar implementations}
```

## Phase Design

Good phases are:

- **Self-contained**: Completable in one session
- **Testable**: Has clear pass/fail criteria
- **Reversible**: Can be rolled back if needed
- **Incremental**: Builds on previous phases without depending on later ones

Bad phases are:

- "Refactor everything" (too broad)
- "Add helper function" (too granular)
- Phases that only work if ALL phases complete

## Success Criteria Guidelines

**Automated Verification** (agent-runnable):

- Test commands: `make test`, `npm test`, `nix flake check`
- Lint/format: `make lint`, `cargo fmt --check`
- Type checking: `make typecheck`, `tsc --noEmit`
- Build verification: `make build`, `nix build`
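Automated checks like these can be chained so a phase fails fast at the first broken gate. A minimal POSIX-sh sketch, not part of any plan template; the `true` placeholders stand in for your project's real commands (e.g. `make test`, `cargo fmt --check`):

```shell
#!/bin/sh
# Run each automated verification step in order; stop at the first failure.
set -e

run_check() {
  # $1 = label for reporting, remaining args = the command to run
  label=$1
  shift
  if "$@"; then
    echo "PASS: $label"
  else
    echo "FAIL: $label"
    exit 1
  fi
}

# Placeholders: substitute your project's actual commands.
run_check "tests" true
run_check "lint"  true
run_check "build" true
```

Because each check is a single command with a clear exit code, the same script works as a local pre-push hook or a CI step.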

**Manual Verification** (requires human):

- UI/UX functionality and appearance
- Performance under realistic conditions
- Edge cases hard to automate
- Integration with external systems

**From Contribution Guidelines** (if CONTRIBUTING.md exists):

- Include any testing requirements specified
- Reference the guideline: "Per CONTRIBUTING.md: {requirement}"

## Presenting Understanding

Before writing the plan, confirm alignment:

```
Based on the requirements and my research, I understand we need to [summary].

I've found that:
- [Current implementation detail with file:line]
- [Relevant pattern or constraint]
- [Potential complexity identified]

Questions my research couldn't answer:
- [Specific technical question requiring judgment]
```

Only ask questions you genuinely cannot answer through code investigation.

## Design Options Pattern

When multiple approaches exist:

```
**Design Options:**
1. [Option A] - [1-sentence description]
   - Pro: [benefit]
   - Con: [drawback]

2. [Option B] - [1-sentence description]
   - Pro: [benefit]
   - Con: [drawback]

Which approach aligns best with [relevant consideration]?
```

Get buy-in on approach before detailing phases.