diff --git a/.context/AGENT_PLAYBOOK.md b/.context/AGENT_PLAYBOOK.md index c2c27e5f..6f517f6b 100644 --- a/.context/AGENT_PLAYBOOK.md +++ b/.context/AGENT_PLAYBOOK.md @@ -115,7 +115,7 @@ Users rarely invoke skills explicitly. Recognize natural language: | "How's our context looking?" | `/ctx-status` | | "What should we work on?" | `/ctx-next` | | "Commit this" / "Ship it" | `/ctx-commit` | -| "The rate limiter is done" / "We finished that" | `ctx complete` (match to TASKS.md) | +| "The rate limiter is done" / "We finished that" | `ctx tasks complete` (match to TASKS.md) | | "What did we learn?" | `/ctx-reflect` | | "Save that as a decision" | `/ctx-add-decision` | | "That's worth remembering" / "Any gotchas?" | `/ctx-add-learning` | diff --git a/.context/CONVENTIONS.md b/.context/CONVENTIONS.md index eac8bd62..df029ce5 100644 --- a/.context/CONVENTIONS.md +++ b/.context/CONVENTIONS.md @@ -155,3 +155,11 @@ - Zero //nolint:errcheck policy — handle errors, don't suppress them. In test code: use t.Fatal(err) for setup errors, _ = os.Chdir(orig) for cleanup. In production code: use defer func() { _ = f.Close() }() for best-effort close. For gosec false positives: prefer config-level exclusions in .golangci.yml. - Error constructors belong in internal/err, never in per-package err.go files — eliminates the broken-window pattern where agents add local errors when they see a local err.go exists. + +- CLI package taxonomy: every package under internal/cli/ follows the same structure — parent.go (Cmd wiring), doc.go, cmd/root/ or cmd/<sub>/ (implementation), core/ (shared helpers). 
cmd/ directories contain only cmd.go + run.go; all other helpers belong in core/ + +- All structs in a core/ package are consolidated into a single types.go file + +- All user-facing text is routed through internal/assets with YAML-backed TextDescKeys — no inline strings in core/ or cmd/ packages + +- Every package under internal/config/ must have a doc.go with the project header and a one-line package comment diff --git a/.context/DECISIONS.md b/.context/DECISIONS.md index f87e2b5f..967d9454 100644 --- a/.context/DECISIONS.md +++ b/.context/DECISIONS.md @@ -3,6 +3,16 @@ | Date | Decision | |------|--------| +| 2026-03-13 | build target depends on sync-why to prevent embedded doc drift | +| 2026-03-13 | Templates and user-facing text live in assets, structural constants stay in config | +| 2026-03-12 | Recommend companion RAGs as peer MCP servers not bridge through ctx | +| 2026-03-12 | Split commands.yaml into 4 domain files | +| 2026-03-12 | Rename ctx-map skill to ctx-architecture | +| 2026-03-07 | Use composite directory path constants for multi-segment paths | +| 2026-03-06 | Drop fatih/color dependency — Unicode symbols are sufficient for terminal output, color was redundant | +| 2026-03-06 | Externalize all command descriptions to embedded YAML for i18n readiness — commands.yaml holds Short/Long for 105 commands plus flag descriptions, loaded via assets.CommandDesc() and assets.FlagDesc() | +| 2026-03-06 | cmd/root + core taxonomy for all CLI packages — single-command packages use cmd/root/{cmd.go,run.go}, multi-subcommand packages use cmd/<sub>/{cmd.go,run.go}, shared helpers in core/ | +| 2026-03-06 | Shared entry types and API live in internal/entry, not in CLI packages — domain types that multiple packages consume (mcp, watch, memory) belong in a domain package, not a CLI subpackage | | 2026-03-06 | PR #27 (MCP server) meets v0.1 spec requirements — merge-ready pending 3 compliance fixes | | 2026-03-06 | Skills stay CLI-based; MCP Prompts are the protocol 
equivalent | | 2026-03-06 | Peer MCP model for external tool integration | @@ -31,6 +41,146 @@ | 2026-02-27 | Webhook and notification design (consolidated) | +## [2026-03-13-151955] build target depends on sync-why to prevent embedded doc drift + +**Status**: Accepted + +**Context**: assets/why/ files had silently drifted from their docs/ sources + +**Decision**: build target depends on sync-why to prevent embedded doc drift + +**Rationale**: Derived assets that are not in the build dependency chain will drift — the only reliable enforcement is making the build fail without sync + +**Consequences**: Every make build now copies docs into assets before compiling + +--- + +## [2026-03-13-151954] Templates and user-facing text live in assets, structural constants stay in config + +**Status**: Accepted + +**Context**: Ongoing refactoring session moving Tpl* constants out of config/ + +**Decision**: Templates and user-facing text live in assets, structural constants stay in config + +**Rationale**: config/ is for structural constants (paths, limits, regexes); assets/ is for templates, labels, and text that would need i18n. Clean separation of concerns + +**Consequences**: All tpl_entry.go, tpl_journal.go, tpl_loop.go, tpl_recall.go moved to assets/ + +--- + +## [2026-03-12-133007] Recommend companion RAGs as peer MCP servers not bridge through ctx + +**Status**: Accepted + +**Context**: Explored whether ctx should proxy RAG queries or integrate a RAG directly + +**Decision**: Recommend companion RAGs as peer MCP servers not bridge through ctx + +**Rationale**: MCP is the composition layer — agents already compose multiple servers. ctx is context, RAGs are intelligence. 
No bridging, no plugin system, no schema abstraction + +**Consequences**: Spec created at ideas/spec-companion-intelligence.md; future work is documentation and UX only + +--- + +## [2026-03-12-133007] Split commands.yaml into 4 domain files + +**Status**: Accepted + +**Context**: Single 2373-line YAML mixed commands, flags, text, and examples with inconsistent quoting + +**Decision**: Split commands.yaml into 4 domain files + +**Rationale**: Context is for humans — localization files should be human-readable block scalars. Separate files eliminate the underscore prefix namespace hack + +**Consequences**: 4 files (commands.yaml, flags.yaml, text.yaml, examples.yaml) with dedicated loaders in embed.go + +--- + +## [2026-03-12-133007] Rename ctx-map skill to ctx-architecture + +**Status**: Accepted + +**Context**: The name 'map' didn't convey the iterative, architectural nature of the ritual + +**Decision**: Rename ctx-map skill to ctx-architecture + +**Rationale**: 'architecture' better describes surveying and evolving project structure across sessions + +**Consequences**: All cross-references updated across skills, docs, .context files, and settings + +--- + +## [2026-03-07-221155] Use composite directory path constants for multi-segment paths + +**Status**: Accepted + +**Context**: Needed a constant for hooks/messages path used in message.go and message_cmd.go + +**Decision**: Use composite directory path constants for multi-segment paths + +**Rationale**: Matches existing pattern of DirClaudeHooks = '.claude/hooks' — keeps filepath.Join calls cleaner and avoids scattering path segments + +**Consequences**: New multi-segment directory paths should be single constants (e.g. 
DirHooksMessages, DirMemoryArchive) rather than joined from individual segment constants + +--- + +## [2026-03-06-200306] Drop fatih/color dependency — Unicode symbols are sufficient for terminal output, color was redundant + +**Status**: Accepted + +**Context**: fatih/color was used in 32 files for green checkmarks, yellow warnings, cyan headings, dim text + +**Decision**: Drop fatih/color dependency — Unicode symbols are sufficient for terminal output, color was redundant + +**Rationale**: Every colored output already had a semantic symbol (✓, ⚠, ○) that conveyed the same meaning; color added visual noise in non-terminal contexts (logs, pipes) + +**Consequences**: Removed --no-color flag (only existed for color.NoColor); one fewer external dependency; FlagNoColor retained in config for CLI compatibility + +--- + +## [2026-03-06-200257] Externalize all command descriptions to embedded YAML for i18n readiness — commands.yaml holds Short/Long for 105 commands plus flag descriptions, loaded via assets.CommandDesc() and assets.FlagDesc() + +**Status**: Accepted + +**Context**: Command descriptions were inline strings scattered across 105 cobra.Command definitions + +**Decision**: Externalize all command descriptions to embedded YAML for i18n readiness — commands.yaml holds Short/Long for 105 commands plus flag descriptions, loaded via assets.CommandDesc() and assets.FlagDesc() + +**Rationale**: Centralizing user-facing text in a single translatable file prepares for i18n without runtime cost (embedded at compile time) + +**Consequences**: System's 30 hidden hook subcommands excluded (not user-facing); flag descriptions use _flags.scope.name convention + +--- + +## [2026-03-06-200247] cmd/root + core taxonomy for all CLI packages — single-command packages use cmd/root/{cmd.go,run.go}, multi-subcommand packages use cmd/<sub>/{cmd.go,run.go}, shared helpers in core/ + +**Status**: Accepted + +**Context**: 35 CLI packages had inconsistent flat structures mixing Cmd(), run 
logic, helpers, and types in the same directory + +**Decision**: cmd/root + core taxonomy for all CLI packages — single-command packages use cmd/root/{cmd.go,run.go}, multi-subcommand packages use cmd/<sub>/{cmd.go,run.go}, shared helpers in core/ + +**Rationale**: Taxonomical symmetry: every package has the same predictable shape, making navigation instant and agent-friendly + +**Consequences**: cmd/ contains only cmd.go + run.go; helpers go to core/; 474 files changed in initial restructuring + +--- + +## [2026-03-06-200227] Shared entry types and API live in internal/entry, not in CLI packages — domain types that multiple packages consume (mcp, watch, memory) belong in a domain package, not a CLI subpackage + +**Status**: Accepted + +**Context**: External consumers were importing cli/add for EntryParams/ValidateEntry/WriteEntry, creating a leaky abstraction + +**Decision**: Shared entry types and API live in internal/entry, not in CLI packages — domain types that multiple packages consume (mcp, watch, memory) belong in a domain package, not a CLI subpackage + +**Rationale**: Domain types in CLI packages force consumers to depend on CLI internals; internal/entry provides a clean boundary + +**Consequences**: entry aliases Params from add/core to avoid import cycle (entry imports add/core for insert logic); future work may move insert logic to entry to eliminate the cycle + +--- + ## [2026-03-06-141507] PR #27 (MCP server) meets v0.1 spec requirements — merge-ready pending 3 compliance fixes **Status**: Accepted @@ -153,7 +303,7 @@ **Rationale**: The output pipeline (map[string][]string to Mermaid/table/JSON) was already language-agnostic. Each ecosystem builder is ~40 lines — this is finishing what was started, not bloat. Static manifest parsing (no external tools for Node/Python) keeps dependencies minimal. -**Consequences**: ctx deps now auto-detects Go, Node.js, Python, Rust. --type flag overrides detection. ctx-map skill works across ecosystems without changes. 
+**Consequences**: ctx deps now auto-detects Go, Node.js, Python, Rust. --type flag overrides detection. ctx-architecture skill works across ecosystems without changes. --- diff --git a/.context/DETAILED_DESIGN.md b/.context/DETAILED_DESIGN.md index d75442d5..5a069e40 100644 --- a/.context/DETAILED_DESIGN.md +++ b/.context/DETAILED_DESIGN.md @@ -856,7 +856,7 @@ Consult specific sections when working on a module. | `check-version` | UserPromptSubmit | (all) | Compare binary version (ldflags) vs plugin.json major.minor; skip "dev" builds. Piggyback: check encryption key age vs `rc.KeyRotationDays()` | Daily | | `check-resources` | UserPromptSubmit | (all) | `sysinfo.Collect()` + `Evaluate()`; output ONLY at DANGER severity (mem≥90%, swap≥75%, disk≥95%, load≥1.5x CPUs) | None | | `check-knowledge` | UserPromptSubmit | (all) | DECISIONS entry count vs `rc.EntryCountDecisions()` (default 20), LEARNINGS vs `rc.EntryCountLearnings()` (default 30), CONVENTIONS lines vs `rc.ConventionLineCount()` (default 200). Suggest /ctx-consolidate | Daily | -| `check-map-staleness` | UserPromptSubmit | (all) | Two conditions (both required): map-tracking.json `last_run` >30 days AND `git log --since=<date> -- internal/` has commits. Suggest /ctx-map | Daily | +| `check-map-staleness` | UserPromptSubmit | (all) | Two conditions (both required): map-tracking.json `last_run` >30 days AND `git log --since=<date> -- internal/` has commits. Suggest /ctx-architecture | Daily | | `check-backup-age` | UserPromptSubmit | (all) | Check SMB mount (via GVFS path from `CTX_BACKUP_SMB_URL` env) + backup marker mtime (>2 days). Suggest `ctx system backup` | Daily | | `mark-journal` | (plumbing) | — | `ctx system mark-journal [--check]`. Valid stages: exported, enriched, normalized, fences_verified, locked | N/A | | `cleanup-tmp` | SessionEnd | (all) | Remove files >15 days old from `secureTempDir()`. 
Silent side-effect, no output | N/A | diff --git a/.context/LEARNINGS.md b/.context/LEARNINGS.md index 2a7c9a1c..956a2e9a 100644 --- a/.context/LEARNINGS.md +++ b/.context/LEARNINGS.md @@ -3,6 +3,14 @@ | Date | Learning | |------|--------| +| 2026-03-13 | sync-why mechanism existed but was not wired to build | +| 2026-03-13 | Linter reverts import-only edits when references still use old package | +| 2026-03-12 | Project-root files vs context files are distinct categories | +| 2026-03-12 | Constants belong in their domain package not in god objects | +| 2026-03-07 | Always search for existing constants before adding new ones | +| 2026-03-07 | SafeReadFile requires split base+filename paths | +| 2026-03-06 | Spawned agents reliably create new files but consistently fail to delete old ones — always audit for stale files, duplicate function definitions, and orphaned imports after agent-driven refactoring | +| 2026-03-06 | Import cycle avoidance: when package A imports package B for logic, B must own shared types — A aliases them. entry imports add/core for insert logic, so add/core owns EntryParams and entry aliases it as entry.Params | | 2026-03-06 | Stale directory inodes cause invisible files over SSH | | 2026-03-06 | Stats sort uses string comparison on RFC3339 timestamps with mixed timezones | | 2026-03-06 | Claude Code supports PreCompact and SessionStart hooks that ctx does not use | @@ -58,6 +66,86 @@ --- +## [2026-03-13-151952] sync-why mechanism existed but was not wired to build + +**Context**: assets/why/ had drifted from docs/ — the sync targets existed in the Makefile but build did not depend on sync-why + +**Lesson**: Freshness checks that are not in the critical path will be forgotten. 
Wire them as build prerequisites, not optional audit steps + +**Application**: Any derived or copied asset should be a prerequisite of build, not just audit + +--- + +## [2026-03-13-151951] Linter reverts import-only edits when references still use old package + +**Context**: Moving tpl_entry.go from config/entry to assets — linter kept reverting the import change + +**Lesson**: When moving constants between packages, change imports and all references in a single atomic write (use Write not incremental Edit), so the linter never sees an inconsistent state + +**Application**: For future package migrations, use full file rewrites when a linter is active + +--- + +## [2026-03-12-133008] Project-root files vs context files are distinct categories + +**Context**: Tried moving ImplementationPlan constant to config/ctx assuming it was a context file + +**Lesson**: Files created by ctx init in the project root (Makefile, IMPLEMENTATION_PLAN.md) are scaffolding, not context files loaded via ReadOrder. They belong in config/file, not config/ctx + +**Application**: Before moving a file constant, check whether it is in ReadOrder (context) or created by init (project-root) + +--- + +## [2026-03-12-133007] Constants belong in their domain package not in god objects + +**Context**: file.go held agent scoring constants, budget percentages, cooldown durations — none related to file config + +**Lesson**: When a constant is only used by one domain (e.g. agent scoring), it should live in that domain's config package + +**Application**: Check callers before placing constants; if all callers are in one domain, the constant belongs there + +--- + +## [2026-03-07-221151] Always search for existing constants before adding new ones + +**Context**: Added ExtJsonl constant to config/file.go but ExtJSONL already existed with the same value, causing a duplicate + +**Lesson**: Grep for the value (e.g. 
'.jsonl') across config/ before creating a new constant — naming variations (camelCase vs ALLCAPS) make duplicates easy to miss + +**Application**: Before adding any new constant to internal/config, search by value not just by name + +--- + +## [2026-03-07-221148] SafeReadFile requires split base+filename paths + +**Context**: During system/core cleanup, persistence.go passed a full path to validation.SafeReadFile which expects (baseDir, filename) separately + +**Lesson**: Use filepath.Dir(path) and filepath.Base(path) to split full paths when adapting os.ReadFile calls to SafeReadFile + +**Application**: When converting os.ReadFile to SafeReadFile, always check whether the existing code has a full path or separate components + +--- + +## [2026-03-06-200319] Spawned agents reliably create new files but consistently fail to delete old ones — always audit for stale files, duplicate function definitions, and orphaned imports after agent-driven refactoring + +**Context**: Multiple agent batches across cmd/ restructuring, color removal, and flag externalization left stale files, duplicate run.go, and unupdated parent imports + +**Lesson**: Agent cleanup is a known gap — budget 5-10 minutes for post-agent audit per batch + +**Application**: After every agent batch: grep for stale package declarations, check parent imports point to cmd/root not cmd/, verify old files are deleted + +--- + +## [2026-03-06-200237] Import cycle avoidance: when package A imports package B for logic, B must own shared types — A aliases them. 
entry imports add/core for insert logic, so add/core owns EntryParams and entry aliases it as entry.Params + +**Context**: Extracting entry.Params as a standalone struct in internal/entry created a cycle because entry/write.go imports add/core for AppendEntry + +**Lesson**: The package that provides implementation logic must own the types; the facade package aliases them + +**Application**: When extracting shared types from implementation packages, check the import direction first — the type lives where the logic lives + +--- + ## [2026-03-06-141506] Stale directory inodes cause invisible files over SSH **Context**: Files created by Claude Code hooks were visible inside the VM but not from the SSH terminal @@ -415,7 +503,7 @@ - CLI reference docs can outpace implementation: ctx remind had no CLI, ctx recall sync had no Cobra wiring, key file naming diverged between docs and code. Always verify with `ctx --help` before releasing docs. - Structural doc sections (project layouts, command tables, skill counts) drift silently. Add `` markers above any section that mirrors codebase structure. - Agent sweeps for style violations are unreliable (8 found vs 48+ actual). Always follow agent results with targeted grep and manual classification. -- ARCHITECTURE.md missed 4 core packages and 4 CLI commands. The /ctx-drift skill catches stale paths but not missing entries — run /ctx-map after adding new packages or commands. +- ARCHITECTURE.md missed 4 core packages and 4 CLI commands. The /ctx-drift skill catches stale paths but not missing entries — run /ctx-architecture after adding new packages or commands. - Documentation audits must compare against known-good examples and pattern-match for the COMPLETE standard, not just presence of any comment. - Dead link checking belongs in /consolidate's check list (check 12), not as a standalone concern. When a new audit concern emerges, check if it fits an existing audit skill first. 
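The alias direction captured in the import-cycle learning above can be shown in miniature. A minimal sketch, with both packages collapsed into one runnable file purely for illustration; in the real tree the concrete type lives in add/core and the alias in internal/entry:

```go
package main

import "fmt"

// Stand-in for add/core: the package that owns the insert logic
// also owns the concrete type.
type EntryParams struct{ Title string }

// Stand-in for internal/entry: the facade re-exports the type as an
// alias, so external consumers never import CLI internals.
type Params = EntryParams

func main() {
	p := Params{Title: "demo"}
	var q EntryParams = p // alias and original are identical types; no conversion needed
	fmt.Println(q.Title)
}
```

Because `Params` is an alias rather than a distinct defined type, no conversion boilerplate appears at call sites, which is what keeps the facade cheap while the logic-owning package keeps type ownership.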
diff --git a/.context/TASKS.md b/.context/TASKS.md index 7a972923..e0aad290 100644 --- a/.context/TASKS.md +++ b/.context/TASKS.md @@ -455,11 +455,11 @@ similarity and merges them with user approval. Originals archived, not deleted. Spec: `specs/context-consolidation.md` Ref: https://github.com/ActiveMemory/ctx/issues/19 (Phase 3) -### Phase 10: Architecture Mapping Skill (`/ctx-map`) +### Phase 10: Architecture Mapping Skill (`/ctx-architecture`) **Context**: Skill that incrementally builds and maintains ARCHITECTURE.md and DETAILED_DESIGN.md. Coverage tracked in map-tracking.json. -Spec: `specs/ctx-map.md` +Spec: `specs/ctx-architecture.md` ### Docs: Knowledge Health @@ -479,6 +479,82 @@ output package. All CLI commands should route printed output through this packag - [x] WC.1: Add godoc docstrings to all functions in `internal/write/`, add `doc.go` #added:2026-03-06 #done:2026-03-06 - [x] Move add command example strings from core/example.go to assets — user-facing text for i18n #added:2026-03-06-191651 +- [ ] SEC.1: Security-sensitive file change hook — PostToolUse on Edit/Write matching security-critical paths (.claude/settings.local.json, .claude/settings.json, CLAUDE.md, .claude/CLAUDE.md, .context/CONSTITUTION.md). Three actions: (1) nudge user in-session, (2) relay to webhook for out-of-band alerting (autonomous loops), (3) append to dedicated security log (.context/state/security-events.jsonl) for forensics. Separate from general event log. Spec needed. #priority:high #added:2026-03-13 + +- [ ] O.5: Session timeline view — add --sessions flag to ctx system events. Per-session breakdown of eval/fired counts with hook list. See ideas/spec-hook-observability.md Phase 5 #added:2026-03-12-145401 + +- [ ] O.4: Doctor hook health check — surface hook activity in ctx doctor output (active/evaluated-never-fired/never-evaluated). 
See ideas/spec-hook-observability.md Phase 4 #added:2026-03-12-145401 + +- [ ] O.3: Skip reason logging — add eventlog.Skip() with standard reason constants (paused, throttled, condition-not-met). Instrument 19 hook early-exit paths. See ideas/spec-hook-observability.md Phase 3 #added:2026-03-12-145401 + +- [ ] O.2: Event summary view — add --summary flag to ctx system events. Aggregates eval/fired counts per hook, shows last-eval/last-fired timestamps, lists never-evaluated hooks. See ideas/spec-hook-observability.md Phase 2 #added:2026-03-12-145401 + +- [ ] O.1: Hook eval logging — wrap hook cobra commands to log 'eval' events on every invocation. Refactor Run() signatures from os.Stdin to io.Reader (peek+replay pattern). Adds eventlog.Eval(), EventTypeEval constant. See ideas/spec-hook-observability.md Phase 1 #added:2026-03-12-145401 + +- [ ] Companion intelligence recommendation: implement spec from ideas/spec-companion-intelligence.md — ctx doctor companion detection, ctx init recommendation tip, ctx agent awareness in packets #added:2026-03-12-133008 + +- [ ] Add configurable assets layer: allow users to plug their own YAML files for localization (language selection, custom text overrides). Currently all user-facing text is hardcoded in commands.yaml; need a mechanism to load user-provided YAML that overlays or replaces built-in text. This enables i18n without forking. 
#priority:low #added:2026-03-07-233756 + +- [ ] Cleanup internal/cli/system/core/persistence.go: move 10 (base for ParseInt) to config constant #priority:low #added:2026-03-07-220825 + +- [ ] Cleanup internal/cli/system/core/session_tokens.go: move SessionStats from state.go to types.go #priority:low #added:2026-03-07-220825 + +- [ ] Cleanup internal/cli/system/core/wrapup.go: line 18 constant should go to config; make WrappedUpExpiry configurable via ctxrc #priority:low #added:2026-03-07-220825 + +- [ ] Cleanup internal/cli/system/core/version.go: line 81 newline should come from config #priority:low #added:2026-03-07-220819 + +- [ ] Add taxonomy to internal/cli/system/core/ — currently an unstructured bag of files; group by domain (backup, hooks, session, knowledge, etc.) #priority:medium #added:2026-03-07-220819 + +- [ ] Cleanup internal/cli/system/core/version_drift.go: line 53 string formatting should use assets #priority:medium #added:2026-03-07-220819 + +- [ ] Cleanup internal/cli/system/core/state.go: magic permissions (0o750), magic strings ('Context: ' prefix, etc.) #priority:medium #added:2026-03-07-220819 + +- [ ] Cleanup internal/cli/system/core/smb.go: errors should come from internal/err; lines 101, 116, 111 need assets text #priority:medium #added:2026-03-07-220819 + +- [ ] Make AutoPruneStaleDays configurable via ctxrc. Currently hardcoded to 7 days in config.AutoPruneStaleDays; add a ctxrc key (e.g., auto_prune_days) and fallback to the default. #priority:low #added:2026-03-07-220512 + +- [ ] Refactor check_backup_age/run.go: move consts (lines 23-24) to config, magic directories (line 59) to config, symbolic constants for strings (line 72), messages to assets (lines 79, 90-91), extract non-Run functions to system/core, fix docstrings #priority:medium #added:2026-03-07-180020 + +- [ ] Add ctxrc support for recall.list.limit to make the default --limit for recall list configurable. Currently hardcoded as config.DefaultRecallListLimit (20). 
#priority:low #added:2026-03-07-164342 + +- [ ] Extract journal/core into a standalone journal parser package — functionally isolated enough for its own package rather than remaining as core/ #added:2026-03-07-093815 + +- [ ] Move PluginInstalled/PluginEnabledGlobally/PluginEnabledLocally from initialize to internal/claude — these are Claude Code plugin detection functions, not init-specific #added:2026-03-07-091656 + +- [ ] Move guide/cmd/root/run.go text to assets, listCommands to separate file + internal/write #added:2026-03-07-090322 + +- [ ] Move drift/core/sanitize.go strings to assets #added:2026-03-07-090322 + +- [ ] Move drift/core/out.go output functions to internal/write per convention #added:2026-03-07-090322 + +- [ ] Move drift/core/fix.go fmt.Sprintf strings to assets — user-facing output text for i18n #added:2026-03-07-090322 + +- [ ] Move drift/cmd/root/run.go cmd.Print* output strings to internal/write per convention #added:2026-03-07-084152 + +- [ ] Extract doctor/core/checks.go strings — 105 inline Name/Category/Message values to assets (i18n) and config (Name/Category constants) #added:2026-03-07-083428 + +- [ ] Split deps/core builders into per-ecosystem packages — go.go, node.go, python.go, rust.go are specific enough for their own packages under deps/core/ or deps/builders/ #added:2026-03-07-082827 + +- [ ] Audit git graceful degradation — verify all exec.Command(git) call sites degrade gracefully when git is absent, per project guide recommendation #added:2026-03-07-081625 + +- [ ] Fix 19 doc.go quality issues: system (13 missing subcmds), agent (phantom refs), load/loop (header typo), claude (stale migration note), 13 minimal descriptions (pause, resume, task, notify, decision, learnings, remind, context, eventlog, index, rc, recall/parser, task/core) #added:2026-03-07-075741 + +- [ ] Move cmd.Print* output strings in compact/cmd/root/run.go to internal/write per convention #added:2026-03-07-074737 + +- [ ] Extract changes format.go 
rendering templates to assets — headings, labels, and format strings are user-facing text for i18n #added:2026-03-07-074719 + +- [ ] Lift HumanAgo and Pluralize to a common package — reusable time formatting, used by changes and potentially status/recall #added:2026-03-07-074649 + +- [ ] Extract isAlnum predicate for localization — currently ASCII-only in agent keyword extraction (score.go:141) #added:2026-03-07-073900 + +- [ ] Make stopwords configurable via .ctxrc — currently embedded in assets, domain users need custom terms #added:2026-03-07-073900 + +- [ ] Make recency scoring thresholds and relevance match cap configurable via .ctxrc — currently hardcoded in config (7/30/90 days, cap 3) #added:2026-03-07-073900 + +- [ ] Make DefaultAgentCooldown configurable via .ctxrc — currently hardcoded at 10 minutes in config #added:2026-03-07-073106 + +- [ ] Make TaskBudgetPct and ConventionBudgetPct configurable via .ctxrc — currently hardcoded at 0.40 and 0.20 in config #added:2026-03-07-072714 + - [ ] Localization inventory: audit config constants, write package templates, and assets YAML for i18n mapping — low priority, most users are English-first developers #added:2026-03-06-192419 - [ ] Consider indexing tasks and conventions in TASKS.md and CONVENTIONS.md (currently only decisions and learnings have index tables) #added:2026-03-06-190225 diff --git a/.context/decisions-reference.md b/.context/decisions-reference.md index fe3f4693..58220505 100644 --- a/.context/decisions-reference.md +++ b/.context/decisions-reference.md @@ -107,7 +107,7 @@ preserved verbatim. 
**Status**: Accepted -**Context**: Designing the /ctx-map skill output documents — needed to decide where DETAILED_DESIGN.md fits in the context loading pipeline +**Context**: Designing the /ctx-architecture skill output documents — needed to decide where DETAILED_DESIGN.md fits in the context loading pipeline **Decision**: DETAILED_DESIGN.md lives outside FileReadOrder diff --git a/.ctxrc.base b/.ctxrc.base index df21a5b0..c2200e35 100644 --- a/.ctxrc.base +++ b/.ctxrc.base @@ -4,6 +4,8 @@ # All settings use defaults. Copy to .ctxrc and uncomment to customize. # See .ctxrc.dev for a verbose profile with logging enabled. # +profile: base + # context_dir: .context # token_budget: 8000 # event_log: false diff --git a/.ctxrc.dev b/.ctxrc.dev index 7fcedf5d..b18caa59 100644 --- a/.ctxrc.dev +++ b/.ctxrc.dev @@ -4,6 +4,8 @@ # All settings are optional. Missing values use defaults. # Priority: CLI flags > environment variables > .ctxrc > defaults +profile: dev + # context_dir: .context # token_budget: 8000 # auto_archive: true diff --git a/.golangci.yml b/.golangci.yml index d0cca7e1..09228fe9 100644 --- a/.golangci.yml +++ b/.golangci.yml @@ -34,6 +34,10 @@ linters: - linters: [gosec] text: "G30[16]" path: "_test\\.go" + # TextDescKey constants are i18n keys, not credentials + - linters: [gosec] + text: "G101" + path: "internal/assets/embed\\.go" run: timeout: 5m diff --git a/Makefile b/Makefile index dcf0d0b8..44d669bc 100644 --- a/Makefile +++ b/Makefile @@ -14,8 +14,8 @@ OUTPUT := $(BINARY) # Default target all: build -## build: Build for current platform -build: +## build: Build for current platform (syncs embedded docs first) +build: sync-why CGO_ENABLED=0 go build -ldflags="-X github.com/ActiveMemory/ctx/internal/bootstrap.version=$$(cat VERSION | tr -d '[:space:]')" -o $(OUTPUT) ./cmd/ctx ## test: Run tests with coverage summary diff --git a/cmd/ctx/doc.go b/cmd/ctx/doc.go new file mode 100644 index 00000000..dcbb4f20 --- /dev/null +++ b/cmd/ctx/doc.go @@ -0,0 
+1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package main is the entry point for the ctx CLI. +// +// It initializes the root command tree and delegates execution to the +// bootstrap package. +package main diff --git a/docs/cli/context.md b/docs/cli/context.md index 8dfd856a..f36db527 100644 --- a/docs/cli/context.md +++ b/docs/cli/context.md @@ -64,30 +64,6 @@ ctx add convention "Use kebab-case for filenames" --section "Naming" --- -### `ctx complete` - -Mark a task as completed. - -```bash -ctx complete -``` - -**Arguments**: - -* `task-id-or-text`: Task number or partial text match - -**Examples**: - -```bash -# By text (partial match) -ctx complete "user auth" - -# By task number -ctx complete 3 -``` - ---- - ### `ctx drift` Detect stale or invalid context. @@ -110,7 +86,7 @@ ctx drift [flags] * Constitution rules aren't violated (*heuristic*) * Staleness indicators (*old files, many completed tasks*) * Missing packages: warns when `internal/` directories exist on disk but are - not referenced in `ARCHITECTURE.md` (*suggests running `/ctx-map`*) + not referenced in `ARCHITECTURE.md` (*suggests running `/ctx-architecture`*) * Entry count: warns when `LEARNINGS.md` or `DECISIONS.md` exceed configurable thresholds (*default: 30 learnings, 20 decisions*), or when `CONVENTIONS.md` exceeds a line count threshold (default: 200). Configure via `.ctxrc`: @@ -196,12 +172,34 @@ ctx compact --archive ### `ctx tasks` -Manage task archival and snapshots. +Manage task completion, archival, and snapshots. ```bash ctx tasks ``` +#### `ctx tasks complete` + +Mark a task as completed. 
+
+```bash
+ctx tasks complete <task-id-or-text>
+```
+
+**Arguments**:
+
+* `task-id-or-text`: Task number or partial text match
+
+**Examples**:
+
+```bash
+# By text (partial match)
+ctx tasks complete "user auth"
+
+# By task number
+ctx tasks complete 3
+```
+
 #### `ctx tasks archive`
 
 Move completed tasks from `TASKS.md` to a timestamped archive file.
diff --git a/docs/cli/index.md b/docs/cli/index.md
index fa7d8177..4dbbaa03 100644
--- a/docs/cli/index.md
+++ b/docs/cli/index.md
@@ -51,11 +51,10 @@ own guards and no-op gracefully.
 | [`ctx agent`](init-status.md#ctx-agent) | Print token-budgeted context packet for AI consumption |
 | [`ctx load`](init-status.md#ctx-load) | Output assembled context in read order |
 | [`ctx add`](context.md#ctx-add) | Add a task, decision, learning, or convention |
-| [`ctx complete`](context.md#ctx-complete) | Mark a task as done |
 | [`ctx drift`](context.md#ctx-drift) | Detect stale paths, secrets, missing files |
 | [`ctx sync`](context.md#ctx-sync) | Reconcile context with codebase state |
 | [`ctx compact`](context.md#ctx-compact) | Archive completed tasks, clean up files |
-| [`ctx tasks`](context.md#ctx-tasks) | Task archival and snapshots |
+| [`ctx tasks`](context.md#ctx-tasks) | Task completion, archival, and snapshots |
 | [`ctx permissions`](context.md#ctx-permissions) | Permission snapshots (golden image) |
 | [`ctx reindex`](context.md#ctx-reindex) | Regenerate indices for `DECISIONS.md` and `LEARNINGS.md` |
 | [`ctx decisions`](context.md#ctx-decisions) | Manage `DECISIONS.md` (reindex) |
@@ -74,6 +73,7 @@ own guards and no-op gracefully.
| [`ctx prompt`](tools.md#ctx-prompt) | Manage reusable prompt templates | | [`ctx remind`](tools.md#ctx-remind) | Session-scoped reminders that surface at session start | | [`ctx completion`](tools.md#ctx-completion) | Generate shell autocompletion scripts | +| [`ctx guide`](tools.md#ctx-guide) | Quick-reference cheat sheet | | [`ctx why`](tools.md#ctx-why) | Read the philosophy behind ctx | | [`ctx site`](tools.md#ctx-site) | Site management (feed generation) | | [`ctx doctor`](doctor.md#ctx-doctor) | Structural health check (hooks, drift, config) | diff --git a/docs/cli/tools.md b/docs/cli/tools.md index fdfa2c98..0f0a0ca8 100644 --- a/docs/cli/tools.md +++ b/docs/cli/tools.md @@ -399,6 +399,69 @@ ctx notify test --- +### `ctx changes` + +Show what changed in context files and code since your last session. + +Automatically detects the previous session boundary from state markers +or event log. Useful at session start to quickly see what moved while +you were away. + +```bash +ctx changes [flags] +``` + +**Flags**: + +| Flag | Description | +|-----------|-------------------------------------------------------| +| `--since` | Time reference: duration (`24h`) or date (`2026-03-01`) | + +**Reference time detection** (priority order): + +1. `--since` flag (duration, date, or RFC3339 timestamp) +2. `ctx-loaded-*` marker files in `.context/state/` (second most recent) +3. Last `context-load-gate` event from `.context/state/events.jsonl` +4. 
Fallback: 24 hours ago
+
+**Examples**:
+
+```bash
+# Auto-detect last session, show what changed
+ctx changes
+
+# Changes in the last 48 hours
+ctx changes --since 48h
+
+# Changes since a specific date
+ctx changes --since 2026-03-10
+```
+
+**Output**:
+
+```
+## Changes Since Last Session
+
+**Reference point**: 6 hours ago
+
+### Context File Changes
+- `TASKS.md` — modified 2026-03-12 14:30
+- `DECISIONS.md` — modified 2026-03-12 09:15
+
+### Code Changes
+- **12 commits** since reference point
+- **Latest**: Fix journal enrichment ordering
+- **Directories touched**: internal, docs, specs
+- **Authors**: jose, claude
+```
+
+Context file changes are detected by filesystem mtime (works without
+git). Code changes use `git log --since` (empty when not in a git repo).
+
+**See also**: [Reviewing Session Changes](../recipes/session-changes.md)
+
+---
+
 ### `ctx deps`
 
 Generate a dependency graph from source code.
@@ -684,6 +747,84 @@ ctx pad merge --dry-run pad-a.enc pad-b.md
 
 ---
 
+### `ctx prompt`
+
+Manage reusable prompt templates stored in `.context/prompts/`.
+
+Templates are Markdown files that can be applied via the `/ctx-prompt`
+skill or listed and shown from the CLI.
+
+```bash
+ctx prompt
+```
+
+#### `ctx prompt list`
+
+List all available prompt templates.
+
+```bash
+ctx prompt list
+```
+
+**Aliases**: `ls`
+
+#### `ctx prompt show`
+
+Print a prompt template to stdout.
+
+```bash
+ctx prompt show <name>
+```
+
+**Arguments**:
+
+- `name`: Template name (without `.md` extension)
+
+#### `ctx prompt add`
+
+Create a new prompt template. Reads from stdin when `--stdin` is set,
+otherwise creates from the embedded starter template.
+
+```bash
+ctx prompt add <name> [flags]
+```
+
+**Arguments**:
+
+- `name`: Template name (without `.md` extension)
+
+**Flags**:
+
+| Flag      | Description                      |
+|-----------|----------------------------------|
+| `--stdin` | Read template content from stdin |
+
+**Examples**:
+
+```bash
+# Create from starter template
+ctx prompt add code-review
+
+# Create from stdin
+echo "Review this PR for security issues" | ctx prompt add security-review --stdin
+```
+
+#### `ctx prompt rm`
+
+Delete a prompt template.
+
+```bash
+ctx prompt rm <name>
+```
+
+**Arguments**:
+
+- `name`: Template name (without `.md` extension)
+
+**See also**: [Prompt Templates](../recipes/prompts.md)
+
+---
+
 ### `ctx remind`
 
 Session-scoped reminders that surface at session start. Reminders are
@@ -926,6 +1067,38 @@ make site   # Builds site + feed
 
 ---
 
+### `ctx guide`
+
+Quick-reference cheat sheet for common ctx commands and skills.
+
+```bash
+ctx guide [flags]
+```
+
+**Flags**:
+
+| Flag         | Description                 |
+|--------------|-----------------------------|
+| `--skills`   | Show available skills       |
+| `--commands` | Show available CLI commands |
+
+**Examples**:
+
+```bash
+# Show the full cheat sheet
+ctx guide
+
+# Skills only
+ctx guide --skills
+
+# Commands only
+ctx guide --commands
+```
+
+Works without initialization (no `.context/` required).
+
+---
+
 ### `ctx why`
 
 Read `ctx`'s philosophy documents directly in the terminal.
diff --git a/docs/home/common-workflows.md b/docs/home/common-workflows.md
index 31545b26..824791da 100644
--- a/docs/home/common-workflows.md
+++ b/docs/home/common-workflows.md
@@ -41,7 +41,7 @@ ctx add learning "Mock functions must be hoisted in Jest" \
   --application "Place jest.mock() before imports"
 
 # Mark task complete
-ctx complete "user auth"
+ctx tasks complete "user auth"
 ```
 
 ## Leave a Reminder for Next Session
@@ -320,7 +320,7 @@ These have no CLI equivalent. They require the agent's reasoning.
| `/ctx-journal-enrich-all` | Full journal pipeline: export if needed, then batch-enrich | | `/ctx-blog` | Generate a blog post ([zensical](https://pypi.org/project/zensical/)-flavored Markdown) | | `/ctx-blog-changelog` | Generate themed blog post from commits between releases | -| `/ctx-map` | Build and maintain architecture maps (ARCHITECTURE.md, DETAILED_DESIGN.md) | +| `/ctx-architecture` | Build and maintain architecture maps (ARCHITECTURE.md, DETAILED_DESIGN.md) | ### CLI-Only Commands @@ -330,7 +330,7 @@ These are infrastructure: used in scripts, CI, or one-time setup. |----------------------------|-------------------------------------------------| | `ctx init` | Initialize `.context/` directory | | `ctx load` | Output assembled context for piping | -| `ctx complete` | Mark a task done by substring match | +| `ctx tasks complete` | Mark a task done by substring match | | `ctx sync` | Reconcile context with codebase state | | `ctx compact` | Consolidate and clean up context files | | `ctx hook` | Generate AI tool integration config | diff --git a/docs/home/configuration.md b/docs/home/configuration.md index 81db05ed..9cb3b537 100644 --- a/docs/home/configuration.md +++ b/docs/home/configuration.md @@ -87,6 +87,7 @@ A commented `.ctxrc` showing all options and their defaults: # billing_token_warn: 0 # one-shot warning at this token count (0 = disabled) # # key_rotation_days: 90 +# task_nudge_interval: 5 # Edit/Write calls between task completion nudges # # notify: # requires: ctx notify setup # events: # required: no events sent unless listed @@ -123,6 +124,7 @@ A commented `.ctxrc` showing all options and their defaults: | `context_window` | `int` | `200000` | Context window size in tokens. Auto-detected for Claude Code (200k/1M); override for other AI tools | | `billing_token_warn` | `int` | `0` *(off)* | One-shot warning when session tokens exceed this threshold (0 = disabled). 
For plans where tokens beyond an included allowance cost extra | | `key_rotation_days` | `int` | `90` | Days before encryption key rotation nudge | +| `task_nudge_interval` | `int` | `5` | Edit/Write calls between task completion nudges | | `notify.events` | `[]string` | *(all)* | Event filter for webhook notifications (empty = all) | | `priority_order` | `[]string` | *(see below)* | Custom file loading priority for context assembly | diff --git a/docs/operations/autonomous-loop.md b/docs/operations/autonomous-loop.md index 38228aa4..3e59f6a4 100644 --- a/docs/operations/autonomous-loop.md +++ b/docs/operations/autonomous-loop.md @@ -258,7 +258,7 @@ During the loop, the AI should update context files: **Mark task complete:** ```bash -ctx complete "implement user auth" +ctx tasks complete "implement user auth" ``` Or emit an update command (parsed by `ctx watch`): @@ -379,7 +379,7 @@ End EVERY response with one of: **Fix**: Add explicit instructions to PROMPT.md: ```markdown After completing a task, you MUST: -1. Run: ctx complete "" +1. Run: ctx tasks complete "" 2. Add learnings: ctx add learning "..." ``` @@ -392,7 +392,7 @@ After completing a task, you MUST: ```markdown Order of operations: 1. Complete coding work -2. Update context files (*`ctx complete`, `ctx add`*) +2. Update context files (*`ctx tasks complete`, `ctx add`*) 3. Commit **ALL** changes including `.context/` 4. Then signal status ``` diff --git a/docs/operations/integrations.md b/docs/operations/integrations.md index b27973f2..0c147b13 100644 --- a/docs/operations/integrations.md +++ b/docs/operations/integrations.md @@ -320,7 +320,7 @@ These are invoked in Claude Code with `/skill-name`. 
| `/ctx-implement` | Execute a plan step-by-step with checks | | `/ctx-import-plans` | Import Claude Code plan files into project specs | | `/ctx-worktree` | Manage git worktrees for parallel agents | -| `/ctx-map` | Build and maintain architecture maps | +| `/ctx-architecture` | Build and maintain architecture maps | #### Usage Examples diff --git a/docs/recipes/context-health.md b/docs/recipes/context-health.md index 635e2d71..deb9736e 100644 --- a/docs/recipes/context-health.md +++ b/docs/recipes/context-health.md @@ -47,7 +47,7 @@ Or just ask your agent: *"Is our context clean?"* | `ctx compact` | Command | Archive completed tasks, clean up empty sections | | `ctx status` | Command | Quick health overview | | `/ctx-drift` | Skill | Structural plus semantic drift detection | -| `/ctx-map` | Skill | Refresh `ARCHITECTURE.md` from actual codebase | +| `/ctx-architecture` | Skill | Refresh `ARCHITECTURE.md` from actual codebase | | `/ctx-alignment-audit` | Skill | Audit doc claims against agent instructions | | `/ctx-status` | Skill | In-session context summary | | `/ctx-prompt-audit` | Skill | Audit prompt quality and token efficiency | diff --git a/docs/recipes/dependency-graph.md b/docs/recipes/dependency-graph.md new file mode 100644 index 00000000..2e9c256f --- /dev/null +++ b/docs/recipes/dependency-graph.md @@ -0,0 +1,166 @@ +--- +# / ctx: https://ctx.ist +# ,'`./ do you remember? +# `.,'\ +# \ Copyright 2026-present Context contributors. +# SPDX-License-Identifier: Apache-2.0 + +title: Generating Dependency Graphs +--- + +## Why Dependency Graphs? + +Understanding how packages relate to each other is the first step in +onboarding, refactoring, and architecture review. `ctx deps` generates +dependency graphs from source code so you can see the structure at a +glance instead of tracing imports by hand. 
+ +## Quick Start + +```bash +# Auto-detect ecosystem and output Mermaid (default) +ctx deps + +# Table format for a quick terminal overview +ctx deps --format table + +# JSON for programmatic consumption +ctx deps --format json +``` + +## Ecosystem Detection + +`ctx deps` looks for manifest files in this order: + +1. **Go** — `go.mod` +2. **Node.js** — `package.json` +3. **Python** — `pyproject.toml`, `setup.py`, `requirements.txt` +4. **Rust** — `Cargo.toml` + +First match wins. To override detection, use `--type`: + +```bash +# Force Python even if go.mod exists +ctx deps --type python +``` + +## Output Formats + +### Mermaid (default) + +Produces a Mermaid graph definition you can paste into GitHub PRs, +Obsidian notes, or any Mermaid-compatible renderer. + +```bash +ctx deps --format mermaid +``` + +```mermaid +graph TD + internal/cli --> internal/config + internal/cli --> internal/memory + internal/config --> internal/entry +``` + +### Table + +Flat two-column view for quick terminal scanning. + +```bash +ctx deps --format table +``` + +``` +PACKAGE DEPENDS ON +internal/cli internal/config, internal/memory +internal/config internal/entry +internal/memory internal/index +``` + +### JSON + +Machine-readable output for scripts and pipelines. + +```bash +ctx deps --format json | jq '.nodes | length' +``` + +## Including External Dependencies + +By default, only internal (first-party) dependencies are shown. Add +`--external` to include third-party packages: + +```bash +ctx deps --external +ctx deps --external --format table +``` + +This is useful when auditing transitive dependencies or checking which +packages pull in heavy external libraries. + +## When to Use It + +- **Onboarding.** Generate a Mermaid graph and drop it into the project + wiki. New contributors see the architecture before reading code. +- **Refactoring.** Before moving packages, check what depends on them. + Combine with `ctx drift` to find stale references after the move. 
+- **Architecture review.** Table format gives a quick overview; Mermaid + format goes into design docs and PRs. +- **Pre-commit.** Run in CI to detect unexpected new dependencies + between packages. + +## Combining with Other Commands + +### Refactoring with ctx drift + +```bash +# See the dependency structure before refactoring +ctx deps --format table + +# After moving packages, check for broken references +ctx drift +``` + +### Feeding architecture maps + +Use JSON output as input for context files or architecture documentation: + +```bash +# Generate a dependency snapshot for the context directory +ctx deps --format json > .context/deps.json + +# Or pipe into other tools +ctx deps --format mermaid >> docs/architecture.md +``` + +## Monorepos and Multi-Ecosystem Projects + +In a monorepo with multiple ecosystems, `ctx deps` picks the first +manifest it finds (Go beats Node.js beats Python beats Rust). Use +`--type` to target a specific ecosystem: + +```bash +# In a repo with both go.mod and package.json +ctx deps --type node +ctx deps --type go +``` + +For separate subdirectories, run from each root: + +```bash +cd services/api && ctx deps --format table +cd frontend && ctx deps --type node --format mermaid +``` + +## Tips + +- **Start with table format.** It is the fastest way to get a mental + model of the dependency structure. Switch to Mermaid when you need + a visual for documentation or a PR. +- **Pipe JSON to jq.** Filter for specific packages, count edges, or + extract subgraphs programmatically. +- **Skip `--external` unless you need it.** Internal-only graphs are + cleaner and load faster. Add external deps when you are specifically + auditing third-party usage. +- **Force `--type` in CI.** Auto-detection is convenient locally, but + explicit types prevent surprises when the repo structure changes. 
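The pre-commit bullet above can be made concrete. Below is a minimal sketch that compares a committed dependency snapshot against the current graph and lists newly introduced internal edges. It assumes `ctx deps --format json` emits an `edges` list of `{from, to}` objects — the exact schema is an assumption, so inspect `ctx deps --format json | jq .` first:

```python
import json

def new_edges(before: dict, after: dict) -> list[tuple[str, str]]:
    # Edges present in the current graph but absent from the snapshot
    old = {(e["from"], e["to"]) for e in before.get("edges", [])}
    return sorted(
        (e["from"], e["to"])
        for e in after.get("edges", [])
        if (e["from"], e["to"]) not in old
    )

# Hypothetical snapshots in the assumed schema
snapshot = json.loads(
    '{"edges": [{"from": "internal/cli", "to": "internal/config"},'
    ' {"from": "internal/config", "to": "internal/entry"}]}'
)
current = json.loads(
    '{"edges": [{"from": "internal/cli", "to": "internal/config"},'
    ' {"from": "internal/cli", "to": "internal/entry"},'
    ' {"from": "internal/config", "to": "internal/entry"}]}'
)

added = new_edges(snapshot, current)
for src, dst in added:
    print(f"new dependency: {src} -> {dst}")
```

In CI this would amount to `ctx deps --format json > current.json` plus a comparison against the committed `.context/deps.json`, failing the build when the added-edge list is non-empty.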
diff --git a/docs/recipes/hook-sequence-diagrams.md b/docs/recipes/hook-sequence-diagrams.md new file mode 100644 index 00000000..e5dd9ad8 --- /dev/null +++ b/docs/recipes/hook-sequence-diagrams.md @@ -0,0 +1,744 @@ +--- +# / ctx: https://ctx.ist +# ,'`./ do you remember? +# `.,'\ +# \ Copyright 2026-present Context contributors. +# SPDX-License-Identifier: Apache-2.0 + +title: Hook Sequence Diagrams +--- + +## Hook Lifecycle + +Every ctx hook is a Go binary invoked by Claude Code at either +`PreToolUse` (before a tool runs) or `PostToolUse` (after it +completes). Hooks receive JSON on stdin and emit JSON or plain +text on stdout. + +This page documents the execution flow of every hook as a +sequence diagram. + +--- + +## PreToolUse Hooks + +These fire **before** a tool executes. They can block, gate, or +inject context. + +### block-dangerous-commands + +Blocks dangerous shell patterns (sudo, git push, cp to bin). +No initialization or pause checks — always active. + +```mermaid +sequenceDiagram + participant CC as Claude Code + participant Hook as block-dangerous-commands + participant Tpl as Message Template + + CC->>Hook: stdin {command, session_id} + Hook->>Hook: Extract command + alt command empty + Hook-->>CC: (silent exit) + end + Hook->>Hook: Test regex: sudo, git push, cp-to-bin, install-to-bin + alt no match + Hook-->>CC: (silent exit) + end + Hook->>Tpl: LoadMessage(hook, variant, fallback) + Hook-->>CC: JSON {decision: BLOCK, reason} + Hook->>Hook: NudgeAndRelay(message) +``` + +### block-non-path-ctx + +Blocks `./ctx`, `go run ./cmd/ctx`, or absolute-path ctx +invocations. Constitutionally enforced. 
+
+```mermaid
+sequenceDiagram
+    participant CC as Claude Code
+    participant Hook as block-non-path-ctx
+    participant Tpl as Message Template
+
+    CC->>Hook: stdin {command, session_id}
+    Hook->>Hook: Extract command
+    alt command empty
+        Hook-->>CC: (silent exit)
+    end
+    Hook->>Hook: Test regex: relative-path, go-run, absolute-path
+    alt no match
+        Hook-->>CC: (silent exit)
+    end
+    alt absolute-path + test exception
+        Hook-->>CC: (silent exit)
+    end
+    Hook->>Tpl: LoadMessage(hook, variant, fallback)
+    Hook-->>CC: JSON {decision: BLOCK, reason + constitution suffix}
+    Hook->>Hook: NudgeAndRelay(message)
+```
+
+### context-load-gate
+
+Injects the full context packet on first tool use of a session.
+One-shot per session.
+
+```mermaid
+sequenceDiagram
+    participant CC as Claude Code
+    participant Hook as context-load-gate
+    participant State as .context/state/
+    participant Ctx as .context/ files
+    participant Git as git log
+
+    CC->>Hook: stdin {command, session_id}
+    Hook->>Hook: Check initialized
+    alt not initialized
+        Hook-->>CC: (silent exit)
+    end
+    Hook->>Hook: Check paused
+    alt paused
+        Hook-->>CC: (silent exit)
+    end
+    Hook->>State: Check ctx-loaded-{session} marker
+    alt marker exists
+        Hook-->>CC: (silent exit, already fired)
+    end
+    Hook->>State: Create marker (one-shot guard)
+    Hook->>State: Prune stale session files
+    Hook->>Ctx: Read files in priority order
+    Note over Hook,Ctx: DECISION/LEARNING: index only<br/>Others: full content
+    Hook->>Git: Detect changes since last session
+    Hook->>Hook: Build injection (files + changes + token counts)
+    Hook-->>CC: JSON {additionalContext: injection}
+    Hook->>Hook: Send webhook (metadata only)
+    Hook->>State: Write oversize flag if tokens > threshold
+```
+
+### check-resources
+
+Checks system resources (memory, swap, disk, load). Fires on
+every tool call. No initialization required.
+
+```mermaid
+sequenceDiagram
+    participant CC as Claude Code
+    participant Hook as check-resources
+    participant Sys as sysinfo
+    participant Tpl as Message Template
+
+    CC->>Hook: stdin {command, session_id}
+    Hook->>Hook: HookPreamble (parse input, check pause)
+    alt paused
+        Hook-->>CC: (silent exit)
+    end
+    Hook->>Sys: Collect snapshot (memory, swap, disk, load)
+    Hook->>Sys: Evaluate thresholds
+    alt max severity < Danger
+        Hook-->>CC: (silent exit)
+    end
+    Hook->>Tpl: LoadMessage(hook, alert, vars, fallback)
+    Hook-->>CC: Nudge box (danger alerts)
+    Hook->>Hook: NudgeAndRelay(message)
+```
+
+### qa-reminder
+
+Gate nudge before any git command. Reminds agent to lint/test.
+
+```mermaid
+sequenceDiagram
+    participant CC as Claude Code
+    participant Hook as qa-reminder
+    participant Tpl as Message Template
+
+    CC->>Hook: stdin {command, session_id}
+    Hook->>Hook: Check initialized + HookPreamble
+    alt not initialized or paused
+        Hook-->>CC: (silent exit)
+    end
+    Hook->>Hook: Check command contains "git"
+    alt no git command
+        Hook-->>CC: (silent exit)
+    end
+    Hook->>Tpl: LoadMessage(hook, gate, fallback)
+    Hook->>Hook: AppendDir(message)
+    Hook-->>CC: JSON {additionalContext: QA gate}
+    Hook->>Hook: Relay(message)
+```
+
+### specs-nudge
+
+Nudges agent to save plans/specs when new implementation detected.
+ +```mermaid +sequenceDiagram + participant CC as Claude Code + participant Hook as specs-nudge + participant Tpl as Message Template + + CC->>Hook: stdin {command, session_id} + Hook->>Hook: Check initialized + HookPreamble + alt not initialized or paused + Hook-->>CC: (silent exit) + end + Hook->>Tpl: LoadMessage(hook, nudge, fallback) + Hook->>Hook: AppendDir(message) + Hook-->>CC: JSON {additionalContext: specs nudge} + Hook->>Hook: Relay(message) +``` + +--- + +## PostToolUse Hooks + +These fire **after** a tool completes. They observe, nudge, and +track state. + +### heartbeat + +Silent per-prompt pulse. Tracks prompt count, context modification, +and token usage. The agent never sees this hook's output. + +```mermaid +sequenceDiagram + participant CC as Claude Code + participant Hook as heartbeat + participant State as .context/state/ + participant Ctx as .context/ files + participant Notify as Webhook + EventLog + + CC->>Hook: stdin {session_id} + Hook->>Hook: Check initialized + HookPreamble + alt not initialized or paused + Hook-->>CC: (silent exit) + end + Hook->>State: Increment heartbeat counter + Hook->>Ctx: Get latest context file mtime + Hook->>State: Compare with last recorded mtime + Hook->>State: Update mtime record + Hook->>State: Read session token info + Hook->>Notify: Send heartbeat notification + Hook->>Notify: Append to event log + Hook->>State: Write heartbeat log entry + Note over Hook: No stdout — agent never sees this +``` + +### check-context-size + +Adaptive context window monitoring. Fires checkpoints, window +warnings, and billing alerts based on prompt count and token usage. 
+ +```mermaid +sequenceDiagram + participant CC as Claude Code + participant Hook as check-context-size + participant State as .context/state/ + participant Session as Session JSONL + participant Tpl as Message Template + + CC->>Hook: stdin {session_id} + Hook->>Hook: Check initialized + Hook->>Hook: Read input, resolve session ID + Hook->>Hook: Check paused + alt paused + Hook-->>CC: Pause acknowledgment message + end + Hook->>State: Increment session prompt counter + Hook->>Session: Read token info (tokens, model, window) + + rect rgb(255, 240, 240) + Note over Hook: Billing check (independent, never suppressed) + alt tokens >= billing threshold (one-shot) + Hook->>Tpl: LoadMessage(hook, billing, vars) + Hook-->>CC: Billing warning nudge box + Hook->>Hook: NudgeAndRelay(billing message) + end + end + + Hook->>State: Check wrap-up marker + alt wrapped up recently (< 2h) + Hook->>State: Write stats (event: suppressed) + Hook-->>CC: (silent exit) + end + + rect rgb(240, 248, 255) + Note over Hook: Adaptive frequency check + alt count > 30 and count % 3 == 0 + Note over Hook: High frequency trigger + else count > 15 and count % 5 == 0 + Note over Hook: Medium frequency trigger + else + Hook->>State: Write stats (event: silent) + Hook-->>CC: (silent exit) + end + end + + alt context window >= 80% + Hook->>Tpl: LoadMessage(hook, window, vars) + Hook-->>CC: Window warning nudge box + Hook->>Hook: NudgeAndRelay(window message) + else checkpoint trigger + Hook->>Tpl: LoadMessage(hook, checkpoint) + Hook-->>CC: Checkpoint nudge box + Hook->>Hook: NudgeAndRelay(checkpoint message) + end + Hook->>State: Write session stats +``` + +### check-persistence + +Tracks context file modification and nudges when edits happen +without persisting context. Adaptive threshold based on prompt count. 
+ +```mermaid +sequenceDiagram + participant CC as Claude Code + participant Hook as check-persistence + participant State as .context/state/ + participant Ctx as .context/ files + participant Tpl as Message Template + + CC->>Hook: stdin {session_id} + Hook->>Hook: Check initialized + HookPreamble + alt not initialized or paused + Hook-->>CC: (silent exit) + end + Hook->>State: Read persistence state {Count, LastNudge, LastMtime} + alt first prompt (no state) + Hook->>State: Initialize state {Count:1, LastNudge:0, LastMtime:now} + Hook-->>CC: (silent exit) + end + Hook->>Hook: Increment Count + Hook->>Ctx: Get current context mtime + alt context modified since LastMtime + Hook->>State: Reset LastNudge = Count, update LastMtime + Hook-->>CC: (silent exit) + end + Hook->>Hook: sinceNudge = Count - LastNudge + Hook->>Hook: PersistenceNudgeNeeded(Count, sinceNudge)? + alt threshold not reached + Hook->>State: Write state + Hook-->>CC: (silent exit) + end + Hook->>Tpl: LoadMessage(hook, nudge, vars) + Hook-->>CC: Nudge box (prompt count, time since last persist) + Hook->>Hook: NudgeAndRelay(message) + Hook->>State: Update LastNudge = Count, write state +``` + +### check-backup-age + +Daily check for SMB mount and backup freshness. 
+ +```mermaid +sequenceDiagram + participant CC as Claude Code + participant Hook as check-backup-age + participant State as .context/state/ + participant FS as Filesystem + participant Tpl as Message Template + + CC->>Hook: stdin {session_id} + Hook->>Hook: Check initialized + HookPreamble + alt not initialized or paused + Hook-->>CC: (silent exit) + end + Hook->>State: Check daily throttle marker + alt throttled + Hook-->>CC: (silent exit) + end + Hook->>FS: Check SMB mount (if env var set) + Hook->>FS: Check backup marker file age + alt no warnings + Hook-->>CC: (silent exit) + end + Hook->>Tpl: LoadMessage(hook, warning, {Warnings}) + Hook-->>CC: Nudge box (warnings) + Hook->>Hook: NudgeAndRelay(message) + Hook->>State: Touch throttle marker +``` + +### check-ceremonies + +Daily check for `/ctx-remember` and `/ctx-wrap-up` usage in +recent journal entries. + +```mermaid +sequenceDiagram + participant CC as Claude Code + participant Hook as check-ceremonies + participant State as .context/state/ + participant Journal as Journal files + participant Tpl as Message Template + + CC->>Hook: stdin {session_id} + Hook->>Hook: Check initialized + HookPreamble + alt not initialized or paused + Hook-->>CC: (silent exit) + end + Hook->>State: Check daily throttle marker + alt throttled + Hook-->>CC: (silent exit) + end + Hook->>Journal: Read recent files (lookback window) + alt no journal files + Hook-->>CC: (silent exit) + end + Hook->>Journal: Scan for /ctx-remember and /ctx-wrap-up + alt both ceremonies present + Hook-->>CC: (silent exit) + end + Hook->>Tpl: LoadMessage(hook, variant, fallback) + Note over Hook: variant: both | remember | wrapup + Hook-->>CC: Nudge box (missing ceremonies) + Hook->>Hook: NudgeAndRelay(message) + Hook->>State: Touch throttle marker +``` + +### check-journal + +Daily check for unexported sessions and unenriched journal entries. 
+ +```mermaid +sequenceDiagram + participant CC as Claude Code + participant Hook as check-journal + participant State as .context/state/ + participant Journal as Journal dir + participant Claude as Claude projects dir + participant Tpl as Message Template + + CC->>Hook: stdin {session_id} + Hook->>Hook: Check initialized + HookPreamble + alt not initialized or paused + Hook-->>CC: (silent exit) + end + Hook->>State: Check daily throttle marker + alt throttled + Hook-->>CC: (silent exit) + end + Hook->>Journal: Check dir exists + Hook->>Claude: Check dir exists + alt either dir missing + Hook-->>CC: (silent exit) + end + Hook->>Journal: Get newest entry mtime + Hook->>Claude: Count .jsonl files newer than journal + Hook->>Journal: Count unenriched entries + alt unexported == 0 and unenriched == 0 + Hook-->>CC: (silent exit) + end + Hook->>Tpl: LoadMessage(hook, variant, {counts}) + Note over Hook: variant: both | unexported | unenriched + Hook-->>CC: Nudge box (counts) + Hook->>Hook: NudgeAndRelay(message) + Hook->>State: Touch throttle marker +``` + +### check-knowledge + +Daily check for knowledge file entry/line counts exceeding +configured thresholds. 
+ +```mermaid +sequenceDiagram + participant CC as Claude Code + participant Hook as check-knowledge + participant State as .context/state/ + participant Ctx as .context/ files + participant RC as .ctxrc + participant Tpl as Message Template + + CC->>Hook: stdin {session_id} + Hook->>Hook: Check initialized + HookPreamble + alt not initialized or paused + Hook-->>CC: (silent exit) + end + Hook->>State: Check daily throttle marker + alt throttled + Hook-->>CC: (silent exit) + end + Hook->>RC: Read thresholds (decisions, learnings, conventions) + alt all thresholds disabled (0) + Hook-->>CC: (silent exit) + end + Hook->>Ctx: Parse DECISIONS.md entry count + Hook->>Ctx: Parse LEARNINGS.md entry count + Hook->>Ctx: Count CONVENTIONS.md lines + Hook->>Hook: Compare against thresholds + alt all within limits + Hook-->>CC: (silent exit) + end + Hook->>Tpl: LoadMessage(hook, warning, {FileWarnings}) + Hook-->>CC: Nudge box (file warnings) + Hook->>Hook: NudgeAndRelay(message) + Hook->>State: Touch throttle marker +``` + +### check-map-staleness + +Daily check for architecture map age and relevant code changes. 
+ +```mermaid +sequenceDiagram + participant CC as Claude Code + participant Hook as check-map-staleness + participant State as .context/state/ + participant Tracking as map-tracking.json + participant Git as git log + participant Tpl as Message Template + + CC->>Hook: stdin {session_id} + Hook->>Hook: Check initialized + HookPreamble + alt not initialized or paused + Hook-->>CC: (silent exit) + end + Hook->>State: Check daily throttle marker + alt throttled + Hook-->>CC: (silent exit) + end + Hook->>Tracking: Read map-tracking.json + alt missing, invalid, or opted out + Hook-->>CC: (silent exit) + end + Hook->>Hook: Parse LastRun date + alt map not stale (< N days) + Hook-->>CC: (silent exit) + end + Hook->>Git: Count commits touching internal/ since LastRun + alt no relevant commits + Hook-->>CC: (silent exit) + end + Hook->>Tpl: LoadMessage(hook, stale, {date, count}) + Hook-->>CC: Nudge box (last refresh + commit count) + Hook->>Hook: NudgeAndRelay(message) + Hook->>State: Touch throttle marker +``` + +### check-memory-drift + +Per-session check for MEMORY.md changes since last sync. 
+ +```mermaid +sequenceDiagram + participant CC as Claude Code + participant Hook as check-memory-drift + participant State as .context/state/ + participant Mem as memory.Discover + participant Tpl as Message Template + + CC->>Hook: stdin {session_id} + Hook->>Hook: Check initialized + HookPreamble + alt not initialized or paused + Hook-->>CC: (silent exit) + end + Hook->>State: Check session tombstone + alt already nudged this session + Hook-->>CC: (silent exit) + end + Hook->>Mem: DiscoverMemoryPath(projectRoot) + alt auto memory not active + Hook-->>CC: (silent exit) + end + Hook->>Mem: HasDrift(contextDir, sourcePath) + alt no drift + Hook-->>CC: (silent exit) + end + Hook->>Tpl: LoadMessage(hook, nudge, fallback) + Hook-->>CC: Nudge box (drift reminder) + Hook->>Hook: NudgeAndRelay(message) + Hook->>State: Touch session tombstone +``` + +### check-persistence + +See [check-persistence above](#check-persistence) — listed under +PostToolUse hooks. + +### check-reminders + +Per-prompt check for due reminders. No throttle. + +```mermaid +sequenceDiagram + participant CC as Claude Code + participant Hook as check-reminders + participant Store as Reminders store + participant Tpl as Message Template + + CC->>Hook: stdin {session_id} + Hook->>Hook: Check initialized + HookPreamble + alt not initialized or paused + Hook-->>CC: (silent exit) + end + Hook->>Store: ReadReminders() + alt load error + Hook-->>CC: (silent exit) + end + Hook->>Hook: Filter by due date (After <= today) + alt no due reminders + Hook-->>CC: (silent exit) + end + Hook->>Tpl: LoadMessage(hook, reminders, {list}) + Hook-->>CC: Nudge box (reminder list + dismiss hints) + Hook->>Hook: NudgeAndRelay(message) +``` + +### check-task-completion + +Configurable-interval nudge after edits. Per-session counter resets +after firing. 
+ +```mermaid +sequenceDiagram + participant CC as Claude Code + participant Hook as check-task-completion + participant State as .context/state/ + participant RC as .ctxrc + participant Tpl as Message Template + + CC->>Hook: stdin {session_id} + Hook->>Hook: Check initialized + HookPreamble + alt not initialized or paused + Hook-->>CC: (silent exit) + end + Hook->>RC: Read task nudge interval + alt interval <= 0 (disabled) + Hook-->>CC: (silent exit) + end + Hook->>State: Read per-session counter + Hook->>Hook: Increment counter + alt counter < interval + Hook->>State: Write counter + Hook-->>CC: (silent exit) + end + Hook->>State: Reset counter to 0 + Hook->>Tpl: LoadMessage(hook, nudge, fallback) + Hook-->>CC: JSON {additionalContext: task nudge} + Hook->>Hook: Relay(message) +``` + +### check-version + +Daily binary-vs-plugin version comparison with piggybacked key +rotation check. + +```mermaid +sequenceDiagram + participant CC as Claude Code + participant Hook as check-version + participant State as .context/state/ + participant Config as Binary + Plugin version + participant Tpl as Message Template + + CC->>Hook: stdin {session_id} + Hook->>Hook: Check initialized + HookPreamble + alt not initialized or paused + Hook-->>CC: (silent exit) + end + Hook->>State: Check daily throttle marker + alt throttled + Hook-->>CC: (silent exit) + end + Hook->>Config: Read binary version + alt dev build + Hook->>State: Touch throttle + Hook-->>CC: (silent exit) + end + Hook->>Config: Read plugin version + alt plugin version not found or parse error + Hook->>State: Touch throttle + Hook-->>CC: (silent exit) + end + Hook->>Hook: Compare major.minor + alt versions match + Hook->>State: Touch throttle + Hook-->>CC: (silent exit) + end + Hook->>Tpl: LoadMessage(hook, mismatch, {versions}) + Hook-->>CC: Nudge box (version mismatch) + Hook->>Hook: NudgeAndRelay(message) + Hook->>State: Touch throttle + Hook->>Hook: CheckKeyAge() (piggybacked) +``` + +### post-commit + +Fires after 
`git commit` (not amend). Nudges for context capture +and checks version drift. + +```mermaid +sequenceDiagram + participant CC as Claude Code + participant Hook as post-commit + participant Tpl as Message Template + + CC->>Hook: stdin {command, session_id} + Hook->>Hook: Check initialized + HookPreamble + alt not initialized or paused + Hook-->>CC: (silent exit) + end + Hook->>Hook: Regex: command contains "git commit"? + alt not a git commit + Hook-->>CC: (silent exit) + end + Hook->>Hook: Regex: command contains "--amend"? + alt is amend + Hook-->>CC: (silent exit) + end + Hook->>Tpl: LoadMessage(hook, nudge, fallback) + Hook->>Hook: AppendDir(message) + Hook-->>CC: JSON {additionalContext: post-commit nudge} + Hook->>Hook: Relay(message) + Hook->>Hook: CheckVersionDrift() +``` + +--- + +## Throttling Summary + +| Hook | Throttle Type | Scope | +|------|--------------|-------| +| block-dangerous-commands | None | Every match | +| block-non-path-ctx | None | Every match | +| context-load-gate | One-shot marker | Per session | +| check-resources | None | Every tool call | +| qa-reminder | None | Every git command | +| specs-nudge | None | Every prompt | +| heartbeat | None | Every prompt | +| check-context-size | Adaptive counter | Per session | +| check-persistence | Adaptive counter | Per session | +| check-task-completion | Configurable interval | Per session | +| check-memory-drift | Session tombstone | Once per session | +| check-reminders | None | Every prompt | +| check-backup-age | Daily marker | Once per day | +| check-ceremonies | Daily marker | Once per day | +| check-journal | Daily marker | Once per day | +| check-knowledge | Daily marker | Once per day | +| check-map-staleness | Daily marker | Once per day | +| check-version | Daily marker | Once per day | +| post-commit | None | Every git commit | + +## State File Reference + +All state files live in `.context/state/`. 
+ +| File Pattern | Hook | Purpose | +|-------------|------|---------| +| `ctx-loaded-{session}` | context-load-gate | One-shot injection marker | +| `ctx-paused-{session}` | (all) | Session pause marker | +| `ctx-wrapped-up` | check-context-size | Suppress nudges after wrap-up (2h expiry) | +| `backup-reminded` | check-backup-age | Daily throttle | +| `ceremony-reminded` | check-ceremonies | Daily throttle | +| `journal-reminded` | check-journal | Daily throttle | +| `knowledge-reminded` | check-knowledge | Daily throttle | +| `map-staleness-reminded` | check-map-staleness | Daily throttle | +| `version-checked` | check-version | Daily throttle | +| `memory-drift-nudged-{session}` | check-memory-drift | Per-session tombstone | +| `ctx-context-count-{session}` | check-context-size | Prompt counter | +| `stats-{session}.jsonl` | check-context-size | Session stats log | +| `persist-{session}` | check-persistence | Counter + mtime state | +| `ctx-task-count-{session}` | check-task-completion | Prompt counter | +| `heartbeat-count-{session}` | heartbeat | Prompt counter | +| `heartbeat-mtime-{session}` | heartbeat | Last context mtime | diff --git a/docs/recipes/index.md b/docs/recipes/index.md index c7d69b66..b0ab8e90 100644 --- a/docs/recipes/index.md +++ b/docs/recipes/index.md @@ -91,6 +91,16 @@ Date-gate reminders to surface only after a specific date. --- +### [Reviewing Session Changes](session-changes.md) + +See what moved since your last session: context file edits, code +commits, directories touched. Auto-detects session boundaries from +state markers. + +**Uses**: `ctx changes`, `ctx agent`, `ctx status` + +--- + ### [Pausing Context Hooks](session-pause.md) Silence all nudge hooks for a **quick task** that doesn't need ceremony @@ -122,7 +132,7 @@ survive across sessions and team members. `TASKS.md` focused as your project evolves across dozens of sessions. 
-**Uses**: `ctx add task`, `ctx complete`, `ctx tasks archive`, +**Uses**: `ctx add task`, `ctx tasks complete`, `ctx tasks archive`, `ctx tasks snapshot`, `/ctx-add-task`, `/ctx-archive`, `/ctx-next` --- @@ -181,6 +191,16 @@ ceremony nudges, or tailor post-commit instructions for your stack. --- +### [Hook Sequence Diagrams](hook-sequence-diagrams.md) + +**Mermaid sequence diagrams** for every system hook: entry conditions, +state reads, output, throttling, and exit points. Includes throttling +summary table and state file reference. + +**Uses**: All `ctx system` hooks + +--- + ### [Auditing System Hooks](system-hooks-audit.md) The 12 system hooks that run **invisibly** during every session: what each @@ -346,6 +366,16 @@ file overlap, work in parallel, merge back. --- +### [Generating Dependency Graphs](dependency-graph.md) + +Map your project's internal and external **dependency structure**. +Auto-detects Go, Node.js, Python, and Rust. Output as Mermaid, +table, or JSON. + +**Uses**: `ctx deps`, `ctx drift` + +--- + ### [Reusable Prompt Templates](prompt-templates.md) Store and reuse **prompt templates** in `.context/prompts/` for diff --git a/docs/recipes/session-changes.md b/docs/recipes/session-changes.md new file mode 100644 index 00000000..d4839ee0 --- /dev/null +++ b/docs/recipes/session-changes.md @@ -0,0 +1,116 @@ +--- +# / ctx: https://ctx.ist +# ,'`./ do you remember? +# `.,'\ +# \ Copyright 2026-present Context contributors. +# SPDX-License-Identifier: Apache-2.0 + +title: Reviewing Session Changes +--- + +## What Changed While You Were Away? + +Between sessions, teammates commit code, context files get updated, +and decisions pile up. `ctx changes` gives you a single-command +summary of everything that moved since your last session. 
+ +## Quick Start + +```bash +# Auto-detects your last session and shows what changed +ctx changes + +# Check what changed in the last 48 hours +ctx changes --since 48h + +# Check since a specific date +ctx changes --since 2026-03-10 +``` + +## How Reference Time Works + +`ctx changes` needs a reference point to compare against. It tries +these sources in order: + +1. **`--since` flag** — explicit duration (`24h`, `72h`) or date + (`2026-03-10`, RFC3339 timestamp) +2. **Session markers** — `ctx-loaded-*` files in `.context/state/`; + picks the second-most-recent (your *previous* session start) +3. **Event log** — last `context-load-gate` event from + `.context/state/events.jsonl` +4. **Fallback** — 24 hours ago + +The marker-based detection means `ctx changes` usually just works +without any flags: it knows when you last loaded context and shows +everything after that. + +## What It Reports + +### Context file changes + +Any `.md` file in `.context/` modified after the reference time: + +``` +### Context File Changes +- `TASKS.md` — modified 2026-03-11 14:30 +- `DECISIONS.md` — modified 2026-03-11 09:15 +``` + +### Code changes + +Git activity since the reference time: + +``` +### Code Changes +- **12 commits** since reference point +- **Latest**: Fix journal enrichment ordering +- **Directories touched**: internal, docs, specs +- **Authors**: jose, claude +``` + +## Integrating Into Session Start + +Pair `ctx changes` with the `/ctx-remember` ceremony for a complete +session-start picture: + +```bash +# 1. Load context (this also creates the session marker) +ctx agent --budget 4000 + +# 2. 
See what changed since your last session +ctx changes +``` + +Or script it: + +```bash +# .context/hooks/session-start.sh +ctx agent --budget 4000 +echo "---" +ctx changes +``` + +## Team Workflows + +When multiple people share a `.context/` directory, `ctx changes` +shows who changed what: + +```bash +# After pulling from remote +git pull +ctx changes --since 72h +``` + +This surfaces context file changes from teammates that you might +otherwise miss in the commit log. + +## Tips + +- **No changes?** If nothing shows up, the reference time might be + wrong. Use `--since 48h` to widen the window. +- **Works without git.** Context file changes are detected by + filesystem mtime, not git. Code changes require git. +- **Hook integration.** The `context-load-gate` hook writes the + session marker that `ctx changes` uses for auto-detection. If + you're not using the ctx plugin, markers won't exist and it falls + back to the event log or 24h window. diff --git a/docs/recipes/system-hooks-audit.md b/docs/recipes/system-hooks-audit.md index 78c87bfa..a10efbc6 100644 --- a/docs/recipes/system-hooks-audit.md +++ b/docs/recipes/system-hooks-audit.md @@ -222,7 +222,7 @@ throttle prevents repeated nudges. **Why**: Architecture documentation drifts silently as code evolves. This hook detects structural changes that the map hasn't caught up with and -suggests running `/ctx-map` to refresh. +suggests running `/ctx-architecture` to refresh. **Output**: VERBATIM relay when stale and modules changed, silent otherwise. @@ -230,9 +230,9 @@ suggests running `/ctx-map` to refresh. ┌─ Architecture Map Stale ──────────────────────────── │ ARCHITECTURE.md hasn't been refreshed since 2026-01-15 │ and there are commits touching 12 modules. -│ /ctx-map keeps architecture docs drift-free. +│ /ctx-architecture keeps architecture docs drift-free. │ -│ Want me to run /ctx-map to refresh? +│ Want me to run /ctx-architecture to refresh? 
└───────────────────────────────────────────────────── ``` diff --git a/docs/recipes/task-management.md b/docs/recipes/task-management.md index 61221e29..ce195a32 100644 --- a/docs/recipes/task-management.md +++ b/docs/recipes/task-management.md @@ -23,7 +23,7 @@ How do you manage work items that span multiple sessions without losing context? ```bash ctx add task "Fix race condition" --priority high # add ctx add task "Write tests" --section "Phase 2" # add to phase -ctx complete "race condition" # mark done +ctx tasks complete "race condition" # mark done ctx tasks snapshot "before-refactor" # backup ctx tasks archive # clean up ``` @@ -41,7 +41,7 @@ Read on for the full workflow and conversational patterns. | Tool | Type | Purpose | |----------------------|---------|---------------------------------------------| | `ctx add task` | Command | Add a new task to `TASKS.md` | -| `ctx complete` | Command | Mark a task as done by number or text | +| `ctx tasks complete` | Command | Mark a task as done by number or text | | `ctx tasks snapshot` | Command | Create a point-in-time backup of `TASKS.md` | | `ctx tasks archive` | Command | Move completed tasks to archive file | | `/ctx-add-task` | Skill | AI-assisted task creation with validation | @@ -96,7 +96,7 @@ status is tracked via checkboxes and inline tags. 
## Phase 1: Core CLI - [x] Implement ctx add command `#done:2026-02-01-143022` -- [x] Implement ctx complete command `#done:2026-02-03-091544` +- [x] Implement ctx tasks complete command `#done:2026-02-03-091544` - [ ] Add --section flag to ctx add task `#priority:medium` ## Phase 2: AI Integration @@ -166,10 +166,10 @@ When a task is done, mark it complete by number or partial text match: ```bash # By task number (as shown in TASKS.md) -ctx complete 3 +ctx tasks complete 3 # By partial text match -ctx complete "agent cooldown" +ctx tasks complete "agent cooldown" ``` The task's checkbox changes from `[ ]` to `[x]` and a `#done` timestamp is @@ -177,7 +177,7 @@ added. Tasks are never deleted: they stay in their phase section so history is preserved. !!! tip "Be Conversational" - You rarely need to run `ctx complete` yourself during an interactive session. + You rarely need to run `ctx tasks complete` yourself during an interactive session. When you say something like "*the rate limiter is done*" or "*we finished that*," the agent marks the task complete and moves on to suggesting what is next. @@ -250,7 +250,7 @@ These conversational prompts replace explicit commands during interactive sessio |----------------------------------------|----------------------------------------------------| | `ctx add task "Write tests for X"` | "We should add tests for this: track that?" | | `/ctx-next` | "What should we work on?" | -| `ctx complete "rate limiting"` | "The rate limiter is done, what's next?" | +| `ctx tasks complete "rate limiting"` | "The rate limiter is done, what's next?" | | `ctx tasks archive` | "`TASKS.md` is getting long, can you clean it up?" | | `ctx add task ... && ctx add task ...` | "Add follow-ups for what we just built." 
| @@ -417,10 +417,10 @@ ctx add task "Write integration tests for rate limiter" --section "Phase 2" # (from AI assistant) /ctx-next # Mark done by text -ctx complete "rate limiting" +ctx tasks complete "rate limiting" # Mark done by number -ctx complete 5 +ctx tasks complete 5 # Snapshot before a risky refactor ctx tasks snapshot "before-middleware-rewrite" @@ -470,6 +470,6 @@ Store short-lived sensitive notes in an encrypted scratchpad. * [Detecting and Fixing Drift](context-health.md): keeping `TASKS.md` accurate over time * [CLI Reference](../cli/context.md): - full documentation for `ctx add`, `ctx complete`, `ctx tasks` + full documentation for `ctx add`, `ctx tasks complete`, `ctx tasks` * [Context Files: `TASKS.md`](../home/context-files.md#tasksmd): format and conventions for `TASKS.md` diff --git a/docs/reference/skills.md b/docs/reference/skills.md index 1c0243f4..b283e24c 100644 --- a/docs/reference/skills.md +++ b/docs/reference/skills.md @@ -72,7 +72,7 @@ opinionated behavior on top. | [`/ctx-implement`](#ctx-implement) | Execute a plan step-by-step with verification | user-invocable | | [`/ctx-loop`](#ctx-loop) | Generate autonomous loop script | user-invocable | | [`/ctx-worktree`](#ctx-worktree) | Manage git worktrees for parallel agents | user-invocable | -| [`/ctx-map`](#ctx-map) | Build and maintain architecture maps | user-invocable | +| [`/ctx-architecture`](#ctx-architecture) | Build and maintain architecture maps | user-invocable | | [`/ctx-remind`](#ctx-remind) | Manage session-scoped reminders | user-invocable | | [`/ctx-doctor`](#ctx-doctor) | Troubleshoot ctx behavior with health checks and event analysis | user-invocable | | [`/ctx-skill-creator`](#ctx-skill-creator) | Create, improve, and test skills | user-invocable | @@ -606,7 +606,7 @@ grouping, and tear down with merge. --- -### `/ctx-map` +### `/ctx-architecture` Build and maintain architecture maps incrementally. 
Creates or refreshes `ARCHITECTURE.md` (*succinct project map, loaded at session start*) and diff --git a/docs/reference/versions.md b/docs/reference/versions.md index e843c384..de721bda 100644 --- a/docs/reference/versions.md +++ b/docs/reference/versions.md @@ -28,37 +28,37 @@ Tap the corresponding **view docs** to view the docs as they were at that releas | v0.1.1 | 2026-01-26 | [view docs](https://github.com/ActiveMemory/ctx/tree/v0.1.1/docs) | | v0.1.0 | 2026-01-25 | [view docs](https://github.com/ActiveMemory/ctx/tree/v0.1.0/docs) | -### v0.6.0 -- The Integration Release +### `v0.6.0`: The Integration Release Plugin architecture: hooks and skills converted from shell scripts to Go subcommands, shipped as a Claude Code marketplace plugin. Multi-tool hook generation for Cursor, Aider, Copilot, and Windsurf. Webhook notifications with encrypted URL storage. -### v0.3.0 -- The Discipline Release +### `v0.3.0`: The Discipline Release Journal static site generation via zensical. 49-skill audit and fix pass (positive framing, phantom reference removal, scope tightening). Context consolidation skill. golangci-lint v2 migration. -### v0.2.0 -- The Archaeology Release +### `v0.2.0`: The Archaeology Release Session journal system: `ctx recall export` converts Claude Code JSONL transcripts to browsable Markdown. Constants refactor with semantic prefixes (`Dir*`, `File*`, `Filename*`). CRLF handling for Windows compatibility. -### v0.1.2 +### `v0.1.2` Default Claude Code permissions deployed on `ctx init`. Prompting guide published as a standalone documentation page. -### v0.1.1 +### `v0.1.1` Bug fixes: hook schema key format corrected, JSON unicode escaping fixed in context file output. 
-### v0.1.0 -- Initial Release +### `v0.1.0`: Initial Release CLI with 15 subcommands, 6 context file types (CONSTITUTION, TASKS, CONVENTIONS, ARCHITECTURE, DECISIONS, LEARNINGS), Makefile build system, diff --git a/docs/security/agent-security.md b/docs/security/agent-security.md index 663f3d30..b100ced2 100644 --- a/docs/security/agent-security.md +++ b/docs/security/agent-security.md @@ -14,7 +14,7 @@ unattended shell with unrestricted access to your machine**. This is not a theoretical concern. AI coding agents execute shell commands, write files, make network requests, and modify project configuration. When -running autonomously (*overnight, in a loop, without a human watching*) the +running autonomously (*overnight, in a loop, without a human watching*), the attack surface is the full capability set of the operating system user account. @@ -36,9 +36,9 @@ inject content into any of these sources can redirect the agent's behavior. | **Prompt injection via fetched content** | The agent fetches a URL (documentation, API response, Stack Overflow answer) containing embedded instructions. | | **Poisoned project files** | A contributor adds adversarial instructions to `CLAUDE.md`, `.cursorrules`, or `.context/` files. The agent loads these at session start. | | **Self-modification between iterations** | In an autonomous loop, the agent modifies its own configuration files. The next iteration loads the modified config with no human review. | -| **Tool output injection** | A command's output (error messages, log lines, file contents) contains instructions the agent interprets and follows. | +| **Tool output injection** | A command's output (*error messages, log lines, file contents*) contains instructions the agent interprets and follows. | -### What a Compromised Agent Can Do +### What Can a Compromised Agent Do Depends entirely on what permissions and access the agent has: @@ -76,40 +76,63 @@ can override soft instructions. 
Long context windows dilute attention on rules stated early. Edge cases where instructions are ambiguous. **Verdict**: Necessary but not sufficient. Good for the common case. -Do not rely on it for security boundaries. +**Do not** rely on it for security boundaries. ### Layer 2: Application Controls (*Deterministic at Runtime, Mutable Across Iterations*) -AI tool runtimes (Claude Code, Cursor, etc.) provide permission systems: +AI tool runtimes (*Claude Code, Cursor, etc.*) provide permission systems: tool allowlists, command restrictions, confirmation prompts. -For Claude Code, an explicit allowlist in `.claude/settings.local.json`: - -```json -{ - "permissions": { - "allow": [ - "Bash(make:*)", - "Bash(go:*)", - "Bash(git:*)", - "Bash(ctx:*)", - "Read", - "Write", - "Edit" - ] - } -} +For Claude Code, `ctx init` writes both an allowlist and an explicit deny +list into `.claude/settings.local.json`. The golden images live in +`internal/assets/permissions/`: + +**Allowlist** (`allow.txt`): only these tools run without confirmation: + +```text +Bash(ctx:*) +Skill(ctx-add-convention) +Skill(ctx-add-decision) +... # all bundled ctx-* skills +``` + +**Deny list** (`deny.txt`): these are blocked even if the agent requests them: + +```text +# Dangerous operations +Bash(sudo *) +Bash(git push *) +Bash(git push) +Bash(rm -rf /*) +Bash(rm -rf ~*) +Bash(curl *) +Bash(wget *) +Bash(chmod 777 *) + +# Sensitive file reads +Read(**/.env) +Read(**/.env.*) +Read(**/*credentials*) +Read(**/*secret*) +Read(**/*.pem) +Read(**/*.key) + +# Sensitive file edits +Edit(**/.env) +Edit(**/.env.*) ``` -**What it catches**: The agent cannot run commands outside the allowlist. -If `rm`, `curl`, `sudo`, or `docker` are not listed, the agent cannot -invoke them regardless of what any prompt says. +**What it catches**: The agent cannot run commands outside the allowlist, +and the deny list blocks dangerous operations even if a future allowlist +change were to widen access. 
If `rm`, `curl`, `sudo`, or `docker` are +not allowed *and* `sudo`/`curl`/`wget` are explicitly denied, the agent +cannot invoke them regardless of what any prompt says. **What it misses**: The agent can modify the allowlist itself. In an -autonomous loop, the agent writes to `.claude/settings.local.json`, and -the next iteration loads the modified config. The application enforces -the rules, but the application reads the rules from files the agent can -write. +autonomous loop, if the agent writes to `.claude/settings.local.json`, and +the next iteration loads the modified config, then the protection is +effectively lost. The application enforces the rules, but the application +reads the rules from files the agent can write. **Verdict**: Strong first layer. Must be combined with self-modification prevention (Layer 3). @@ -146,11 +169,11 @@ An agent that cannot reach the internet cannot exfiltrate data. It also cannot ingest new instructions mid-loop from external documents, API responses, or hostile content. -| Scenario | Recommended control | -|-----------------------------------|--------------------------------------------------------------------------------------------------------------| -| Agent does not need the internet | `--network=none` (container) or outbound firewall drop-all | -| Agent needs to fetch dependencies | Allow specific registries (npmjs.com, proxy.golang.org, pypi.org) via firewall rules. Block everything else. | -| Agent needs API access | Allow specific API endpoints only. Use an HTTP proxy with allowlisting. | +| Scenario | Recommended control | +|-----------------------------------|----------------------------------------------------------------------------------------------------------------| +| Agent does not need the internet | `--network=none` (*container*) or outbound firewall drop-all | +| Agent needs to fetch dependencies | Allow specific registries (*npmjs.com, proxy.golang.org, pypi.org*) via firewall rules. Block everything else. 
| +| Agent needs API access | Allow specific API endpoints only. Use an HTTP proxy with allowlisting. | **What it catches**: Data exfiltration, phone-home payloads, downloading additional tools, and instruction injection via fetched content. @@ -210,7 +233,7 @@ A defense-in-depth setup for overnight autonomous runs: | Container | `--cap-drop=ALL --network=none`, rootless, no socket mount | Host escape, network exfiltration | | Resource limits | `--memory=4g --cpus=2`, disk quotas | Resource exhaustion | -Each layer is simple. The strength is in the *combination*. +Each layer is straightforward: The strength is in the *combination*. ## Common Mistakes @@ -243,13 +266,13 @@ considerations extend beyond single-agent hardening. ### Code Review for Context Files Treat `.context/` changes like code changes. Context files influence -agent behavior -- a modified CONSTITUTION.md or CONVENTIONS.md changes -what every agent on the team will do next session. Review them in PRs +agent behavior (*a modified `CONSTITUTION.md` or `CONVENTIONS.md` changes +what every agent on the team will do next session*). Review them in PRs with the same scrutiny you apply to production code. 
Watch for: -* Weakened constitutional rules (removed constraints, softened language) +* Weakened constitutional rules (*removed constraints, softened language*) * New decisions that contradict existing ones without acknowledging it * Learnings that encode incorrect assumptions * Task additions that bypass the team's prioritization process @@ -259,22 +282,22 @@ Watch for: `ctx init` configures `.gitignore` automatically, but verify these patterns are in place: -* **Always gitignored**: `.context.key` (encryption key -- now at - `~/.ctx/.ctx.key`), `.context/logs/`, `.context/journal/` -* **Team decision**: `scratchpad.enc` -- encrypted, safe to commit for - shared scratchpad state; gitignore if scratchpads are personal -* **Never committed**: `.env`, credentials, API keys (enforced by - drift secret detection) +* **Always gitignored**: `.ctx.key` (*encryption key*), + `.context/logs/`, `.context/journal/` +* **Team decision**: `scratchpad.enc` (*encrypted, safe to commit for + shared scratchpad state*); `.gitignore` if scratchpads are personal +* **Never committed**: `.env`, credentials, API keys (*enforced by + drift secret detection*) ### Multi-Developer Context Sharing -CONSTITUTION.md is the shared contract. All team members and their +`CONSTITUTION.md` is the shared contract. All team members and their agents inherit it. Changes require team consensus, not unilateral edits. When multiple agents write to the same context files concurrently -(e.g., two developers adding learnings simultaneously), git merge +(*e.g., two developers adding learnings simultaneously*), git merge conflicts are expected. Resolution is typically additive: accept both -additions. Destructive resolution (dropping one side) loses context. +additions. Destructive resolution (*dropping one side*) loses context. 
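The additive-resolution convention above can also be enforced mechanically. Git's built-in `union` merge driver keeps both sides of a conflict, which suits append-only context files. A minimal sketch — the specific file list is an assumption; adjust it to the context files your team actually shares:

```bash
# Sketch: declare append-only context files as union-merged so git
# keeps both sides of a conflict instead of asking you to choose.
# The file list below is an assumption; adjust to your .context/ layout.
cat >> .gitattributes <<'EOF'
.context/DECISIONS.md merge=union
.context/LEARNINGS.md merge=union
EOF
```

With this in place, two developers appending learnings in parallel merge cleanly. Files that need ordered or exclusive edits (*e.g., `CONSTITUTION.md`*) should stay on the default driver so conflicts still surface for review.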
### Team Conventions for Context Management diff --git a/editors/vscode/README.md b/editors/vscode/README.md index 402af09c..4bfe70f3 100644 --- a/editors/vscode/README.md +++ b/editors/vscode/README.md @@ -1,23 +1,33 @@ -# ctx — VS Code Chat Extension +```text +# / ctx: https://ctx.ist +# ,'`./ do you remember? +# `.,'\ +# \ Copyright 2026-present Context contributors. +# SPDX-License-Identifier: Apache-2.0 +``` + +## `ctx`: VS Code Chat Extension -A VS Code Chat Participant that brings [ctx](https://ctx.ist) — persistent project context for AI coding sessions — directly into GitHub Copilot Chat. +A VS Code Chat Participant that brings [ctx](https://ctx.ist): +(*persistent project context for AI coding sessions*) +directly into GitHub Copilot Chat. ## Usage Type `@ctx` in the VS Code Chat view, then use slash commands: -| Command | Description | -|---------|-------------| -| `@ctx /init` | Initialize a `.context/` directory with template files | -| `@ctx /status` | Show context summary with token estimate | -| `@ctx /agent` | Print AI-ready context packet | -| `@ctx /drift` | Detect stale or invalid context | -| `@ctx /recall` | Browse and search AI session history | -| `@ctx /hook` | Generate AI tool integration configs | -| `@ctx /add` | Add a task, decision, or learning | -| `@ctx /load` | Output assembled context Markdown | -| `@ctx /compact` | Archive completed tasks and clean up | -| `@ctx /sync` | Reconcile context with codebase | +| Command | Description | +|-----------------|--------------------------------------------------------| +| `@ctx /init` | Initialize a `.context/` directory with template files | +| `@ctx /status` | Show context summary with token estimate | +| `@ctx /agent` | Print AI-ready context packet | +| `@ctx /drift` | Detect stale or invalid context | +| `@ctx /recall` | Browse and search AI session history | +| `@ctx /hook` | Generate AI tool integration configs | +| `@ctx /add` | Add a task, decision, or learning | +| `@ctx /load` | 
Output assembled context Markdown | +| `@ctx /compact` | Archive completed tasks and clean up | +| `@ctx /sync` | Reconcile context with codebase | ## Prerequisites @@ -26,9 +36,9 @@ Type `@ctx` in the VS Code Chat view, then use slash commands: ## Configuration -| Setting | Default | Description | -|---------|---------|-------------| -| `ctx.executablePath` | `ctx` | Path to the ctx executable | +| Setting | Default | Description | +|----------------------|---------|----------------------------| +| `ctx.executablePath` | `ctx` | Path to the ctx executable | ## Development diff --git a/examples/demo/README.md b/examples/demo/README.md index 86688640..48be633c 100644 --- a/examples/demo/README.md +++ b/examples/demo/README.md @@ -17,7 +17,7 @@ ctx agent ctx add task "Implement feature X" # Mark a task complete -ctx complete "feature X" +ctx tasks complete "feature X" # Check for stale context ctx drift diff --git a/internal/assets/claude/skills/ctx-map/SKILL.md b/internal/assets/claude/skills/ctx-architecture/SKILL.md similarity index 97% rename from internal/assets/claude/skills/ctx-map/SKILL.md rename to internal/assets/claude/skills/ctx-architecture/SKILL.md index b3cb00ac..f401bd17 100644 --- a/internal/assets/claude/skills/ctx-map/SKILL.md +++ b/internal/assets/claude/skills/ctx-architecture/SKILL.md @@ -1,5 +1,5 @@ --- -name: ctx-map +name: ctx-architecture description: "Build and maintain architecture maps. Use to create or refresh ARCHITECTURE.md and DETAILED_DESIGN.md." allowed-tools: Bash(ctx:*), Bash(git:*), Bash(go:*), Read, Write, Edit, Glob, Grep --- @@ -187,10 +187,10 @@ If the user says "never", "don't ask again", or similar: ## Nudge Behavior -The agent MAY suggest `/ctx-map` during session start when: +The agent MAY suggest `/ctx-architecture` during session start when: - **No tracking file**: "This project doesn't have an architecture - map yet. Want me to run `/ctx-map`?" + map yet. Want me to run `/ctx-architecture`?" 
- **Stale (>30 days)**: "The architecture map hasn't been updated since and there are commits touching modules. Want me to refresh?" diff --git a/internal/assets/claude/skills/ctx-drift/SKILL.md b/internal/assets/claude/skills/ctx-drift/SKILL.md index 20bb6f1f..c164d51b 100644 --- a/internal/assets/claude/skills/ctx-drift/SKILL.md +++ b/internal/assets/claude/skills/ctx-drift/SKILL.md @@ -106,7 +106,7 @@ After both layers, do **not** dump raw output. Instead: | Many completed tasks | TASKS.md is cluttered | Run `ctx compact --archive` | | File not modified in 30+ days | Content may be outdated | Review and update or confirm current | | Constitution violation | A hard rule may be broken | Fix immediately | -| Missing packages | An `internal/` package is not in ARCHITECTURE.md | Add it with `/ctx-map` or document manually | +| Missing packages | An `internal/` package is not in ARCHITECTURE.md | Add it with `/ctx-architecture` or document manually | | Required file missing | A core context file does not exist | Create it with `ctx init` or manually | ## Auto-Fix diff --git a/internal/assets/claude/skills/ctx-import-plans/SKILL.md b/internal/assets/claude/skills/ctx-import-plans/SKILL.md index dd851b96..4f5b5f95 100644 --- a/internal/assets/claude/skills/ctx-import-plans/SKILL.md +++ b/internal/assets/claude/skills/ctx-import-plans/SKILL.md @@ -36,12 +36,12 @@ If no files are found, tell the user and stop. 
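The discovery step above can be sketched as a small helper. The `~/.claude/plans` path and the `.md` extension come from this skill's description; the GNU `find -printf` flag is an assumption about the host toolchain:

```bash
# Sketch: list candidate plan files, newest first, so recently
# modified plans surface at the top of the interactive picker.
list_plans() {
  find "${1:-$HOME/.claude/plans}" -maxdepth 1 -name '*.md' \
    -printf '%T@ %p\n' 2>/dev/null | sort -rn | cut -d' ' -f2-
}
```

An empty result means there is nothing to import, which is exactly the "tell the user and stop" case.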
The user may pass arguments to narrow the selection: -| Argument | Behavior | -|---|---| -| `--today` | Only plans modified today | -| `--since YYYY-MM-DD` | Only plans modified on or after the given date | -| `--all` | Import all plans without prompting | -| *(none)* | Interactive — present the list and ask the user to pick | +| Argument | Behavior | +|----------------------|--------------------------------------------------------| +| `--today` | Only plans modified today | +| `--since YYYY-MM-DD` | Only plans modified on or after the given date | +| `--all` | Import all plans without prompting | +| *(none)* | Interactive: present the list and ask the user to pick | **Filtering with `--today`:** ```bash @@ -77,10 +77,10 @@ For each selected plan: - Collapse multiple hyphens - Trim leading/trailing hyphens - Example: `Add Authentication Middleware` → `add-authentication-middleware` -3. **Check for conflicts** — if `specs/{slug}.md` already exists, ask +3. **Check for conflicts**: if `specs/{slug}.md` already exists, ask the user whether to overwrite or pick a different name 4. **Copy the file** to `specs/{slug}.md` -5. **Optionally add a task** — ask the user if they want a task in +5. **Optionally add a task**: ask the user if they want a task in TASKS.md referencing the imported spec (use `/ctx-add-task` if yes) ### 5. 
Report @@ -95,10 +95,10 @@ Imported 2 plan(s): ## Important Notes -- Plan filenames in `~/.claude/plans/` are typically UUIDs or hashes — +- Plan filenames in `~/.claude/plans/` are typically UUIDs or hashes: always use the H1 heading for the spec filename, not the original name - If a plan has no H1 heading, use the original filename (minus extension) as the slug -- Do not modify the original plan files — this is a copy, not a move +- Do not modify the original plan files: this is a copy, not a move - The `specs/` directory must exist (it should already be present in the project root) diff --git a/internal/assets/claude/skills/ctx-journal-enrich-all/SKILL.md b/internal/assets/claude/skills/ctx-journal-enrich-all/SKILL.md index 157d5862..140e5226 100644 --- a/internal/assets/claude/skills/ctx-journal-enrich-all/SKILL.md +++ b/internal/assets/claude/skills/ctx-journal-enrich-all/SKILL.md @@ -4,7 +4,7 @@ description: "Full journal pipeline: export unexported sessions, then batch-enri allowed-tools: Bash(ctx:*), Read, Glob, Grep, Edit, Write, Task --- -Full journal pipeline — export if needed, then batch-enrich. +Full journal pipeline: export if needed, then batch-enrich. ## When to Use @@ -34,14 +34,14 @@ JOURNAL_DIR="$CTX_DIR/journal" md_count=$(ls "$JOURNAL_DIR"/*.md 2>/dev/null | wc -l) if [ "$md_count" -eq 0 ]; then - echo "No journal entries found — exporting all sessions." + echo "No journal entries found: exporting all sessions." ctx recall export --all --yes else # Compare newest .md mtime against .jsonl files newest_md=$(stat -c %Y $(ls -t "$JOURNAL_DIR"/*.md | head -1)) unexported=$(find ~/.claude/projects -name "*.jsonl" -newermt @${newest_md} 2>/dev/null | wc -l) if [ "$unexported" -gt 0 ]; then - echo "$unexported unexported session(s) found — exporting first." + echo "$unexported unexported session(s) found: exporting first." ctx recall export --all --yes fi fi @@ -71,22 +71,22 @@ entries without an `enriched` date set. 
If `mark-journal --check` is unavailable (no state file, command fails), fall back to frontmatter inspection. An entry is considered **already enriched** if its YAML frontmatter contains **both** `type` -and `outcome` fields — these are set exclusively by enrichment, never +and `outcome` fields: these are set exclusively by enrichment, never by export. -Do NOT use `title` or `date` to detect enrichment — those are always +Do NOT use `title` or `date` to detect enrichment: those are always present from export. The enrichment-only fields are: -| Field | Set by | -|----------------|---------------| -| `title` | Export | -| `date` | Export | -| `time` | Export | -| `model` | Export | -| `tokens_in` | Export | -| `tokens_out` | Export | -| `session_id` | Export | -| `project` | Export | +| Field | Set by | +|----------------|----------------| +| `title` | Export | +| `date` | Export | +| `time` | Export | +| `model` | Export | +| `tokens_in` | Export | +| `tokens_out` | Export | +| `session_id` | Export | +| `project` | Export | | `type` | **Enrichment** | | `outcome` | **Enrichment** | | `topics` | **Enrichment** | @@ -101,7 +101,7 @@ Skip entries that are not worth enriching: - **Locked entries**: a file is locked if `.state.json` has a `locked` date OR the frontmatter contains `locked: true`. Never - modify locked files — neither metadata nor body. Check via: + modify locked files: neither metadata nor body. Check via: `ctx system mark-journal --check locked` or look for `locked: true` in the YAML frontmatter. - **Suggestion sessions**: files under ~20 lines or containing @@ -198,7 +198,7 @@ patterns, then inserts frontmatter and marks state automatically. ``` 2. Run the heuristic enrichment script. 
The script path is relative - to this skill's directory — copy it to /tmp or reference it via + to this skill's directory: copy it to /tmp or reference it via the full embedded path: ```bash python3 references/enrich-heuristic.py /tmp/enrich-list.txt @@ -209,11 +209,11 @@ patterns, then inserts frontmatter and marks state automatically. ### When to use heuristic vs. per-file enrichment -| Backlog size | Approach | -|-------------|----------| -| 1-5 entries | Read each file, enrich manually with full context | -| 6-20 entries | Sequential processing in the main conversation | -| 20+ entries | Use `enrich-heuristic.py` for bulk processing | +| Backlog size | Approach | +|--------------|---------------------------------------------------| +| 1-5 entries | Read each file, enrich manually with full context | +| 6-20 entries | Sequential processing in the main conversation | +| 20+ entries | Use `enrich-heuristic.py` for bulk processing | The heuristic script produces good-enough enrichment from titles and filenames. For higher quality, follow up with manual review diff --git a/internal/assets/claude/skills/ctx-journal-enrich/SKILL.md b/internal/assets/claude/skills/ctx-journal-enrich/SKILL.md index ae026703..81d23026 100644 --- a/internal/assets/claude/skills/ctx-journal-enrich/SKILL.md +++ b/internal/assets/claude/skills/ctx-journal-enrich/SKILL.md @@ -9,7 +9,7 @@ Enrich a session journal entry with structured metadata. 1. **Check if locked**: a file is locked if `.state.json` has a `locked` date OR the frontmatter contains `locked: true`. Locked - files must not be modified — skip them silently. Check via: + files must not be modified: skip them silently. Check via: `ctx system mark-journal --check locked` or look for `locked: true` in the YAML frontmatter. 2. 
**Check if already enriched**: check the state file via @@ -74,9 +74,9 @@ Read the journal entry and extract: --- title: "Session title" date: 2026-01-27 -model: claude-opus-4-6 # auto-populated at export -tokens_in: 234000 # auto-populated at export -tokens_out: 89000 # auto-populated at export +model: claude-opus-4-6 # auto-populated at export +tokens_in: 234000 # auto-populated at export +tokens_out: 89000 # auto-populated at export type: feature outcome: completed topics: diff --git a/internal/assets/claude/skills/ctx-loop/SKILL.md b/internal/assets/claude/skills/ctx-loop/SKILL.md index 126b190a..3cdcc737 100644 --- a/internal/assets/claude/skills/ctx-loop/SKILL.md +++ b/internal/assets/claude/skills/ctx-loop/SKILL.md @@ -45,13 +45,13 @@ Generate a ready-to-use autonomous loop shell script. ## Flags -| Flag | Short | Default | Purpose | -|--------------------|-------|--------------------|-------------------------------| -| `--prompt` | `-p` | `PROMPT.md` | Prompt file the loop reads | +| Flag | Short | Default | Purpose | +|--------------------|-------|--------------------|---------------------------------| +| `--prompt` | `-p` | `PROMPT.md` | Prompt file the loop reads | | `--tool` | `-t` | `claude` | AI tool: claude, aider, generic | -| `--max-iterations` | `-n` | `0` (unlimited) | Stop after N iterations | -| `--completion` | `-c` | `SYSTEM_CONVERGED` | Signal that ends the loop | -| `--output` | `-o` | `loop.sh` | Output script filename | +| `--max-iterations` | `-n` | `0` (unlimited) | Stop after N iterations | +| `--completion` | `-c` | `SYSTEM_CONVERGED` | Signal that ends the loop | +| `--output` | `-o` | `loop.sh` | Output script filename | ## Supported Tools @@ -94,7 +94,7 @@ chmod +x loop.sh # already done by ctx loop - The script captures AI tool errors with `|| true` so one failed iteration does not kill the loop - Autonomous agents benefit from explicit reasoning prompts in - PROMPT.md — adding "think step-by-step before each change" + PROMPT.md: 
adding "think step-by-step before each change" to the iteration prompt significantly improves accuracy and reduces cascading mistakes in unattended runs diff --git a/internal/assets/claude/skills/ctx-next/SKILL.md b/internal/assets/claude/skills/ctx-next/SKILL.md index 7ea869ff..2cf06b17 100644 --- a/internal/assets/claude/skills/ctx-next/SKILL.md +++ b/internal/assets/claude/skills/ctx-next/SKILL.md @@ -29,7 +29,7 @@ Analyze current tasks and recent session activity, then suggest ## Process -Do all of this **silently** — do not narrate the steps: +Do all of this **silently**: do not narrate the steps: 1. **Read TASKS.md** to get the full task list with statuses, priorities, and phases @@ -81,7 +81,7 @@ Present your recommendations like this: **2. [Task title or summary]** `#priority:X` > [1-2 sentence rationale] -**3. [Task title or summary]** *(optional — only if genuinely +**3. [Task title or summary]** *(optional: only if genuinely useful)* > [1-2 sentence rationale] @@ -92,11 +92,11 @@ useful)* ### Rules for recommendations: -- **1-3 items only** — more than 3 defeats the purpose -- **Be specific** — "Fix `block-non-path-ctx` hook" not +- **1-3 items only**: more than 3 defeats the purpose +- **Be specific**: "Fix `block-non-path-ctx` hook" not "work on hooks" - **Include the priority tag** so the user sees the weight -- **Rationale must reference context** — why *this* task, not +- **Rationale must reference context**: why *this* task, not just what it is. Connect to recent work, priority, or dependencies - If an in-progress task exists, it should almost always be @@ -110,12 +110,12 @@ useful)* > > **1. Fix `block-non-path-ctx` hook** `#priority:high` > > Still open from yesterday's session. The hook is too -> > aggressive — it blocks `git -C path` commands that don't +> > aggressive: it blocks `git -C path` commands that don't > > invoke ctx. Quick fix, clears a blocker. > > **2. 
Add `Context.File(name)` method** `#priority:high` > > Eliminates 10+ linear scan boilerplate instances across -> > 5 packages. High impact, low effort — good consolidation +> > 5 packages. High impact, low effort: good consolidation > > target. > > **3. Topics system (T1.1)** `#priority:medium` diff --git a/internal/assets/claude/skills/ctx-pad/SKILL.md b/internal/assets/claude/skills/ctx-pad/SKILL.md index ee839113..9431135d 100644 --- a/internal/assets/claude/skills/ctx-pad/SKILL.md +++ b/internal/assets/claude/skills/ctx-pad/SKILL.md @@ -23,23 +23,23 @@ command. ## Command Mapping -| User intent | Command | -|---|---| -| "show my scratchpad" / "what's on my pad" | `ctx pad` | -| "show me entry 3" / "what's in entry 3" | `ctx pad show 3` | -| "add a note: check DNS" / "jot down: check DNS" | `ctx pad add "check DNS"` | -| "delete the third one" / "remove entry 3" | `ctx pad rm 3` | -| "change entry 2 to ..." / "replace entry 2 with ..." | `ctx pad edit 2 "new text"` | -| "append '-- important' to entry 3" / "add to entry 3: ..." 
| `ctx pad edit 3 --append "-- important"` | -| "prepend 'URGENT:' to entry 1" | `ctx pad edit 1 --prepend "URGENT:"` | -| "move entry 4 to the top" / "prioritize entry 4" | `ctx pad mv 4 1` | -| "move entry 1 to the bottom" | `ctx pad mv 1 N` (where N = last position) | -| "import my notes from notes.txt" | `ctx pad import notes.txt` | -| "import from stdin" / pipe into pad | `cmd \| ctx pad import -` | -| "export all blobs" / "extract blobs to DIR" | `ctx pad export [DIR]` | -| "export blobs, overwrite existing" | `ctx pad export --force [DIR]` | -| "merge entries from another pad" | `ctx pad merge FILE...` | -| "merge with a different key" | `ctx pad merge --key /path/to/key FILE` | +| User intent | Command | +|------------------------------------------------------------|--------------------------------------------| +| "show my scratchpad" / "what's on my pad" | `ctx pad` | +| "show me entry 3" / "what's in entry 3" | `ctx pad show 3` | +| "add a note: check DNS" / "jot down: check DNS" | `ctx pad add "check DNS"` | +| "delete the third one" / "remove entry 3" | `ctx pad rm 3` | +| "change entry 2 to ..." / "replace entry 2 with ..." | `ctx pad edit 2 "new text"` | +| "append '-- important' to entry 3" / "add to entry 3: ..." 
| `ctx pad edit 3 --append "-- important"` | +| "prepend 'URGENT:' to entry 1" | `ctx pad edit 1 --prepend "URGENT:"` | +| "move entry 4 to the top" / "prioritize entry 4" | `ctx pad mv 4 1` | +| "move entry 1 to the bottom" | `ctx pad mv 1 N` (where N = last position) | +| "import my notes from notes.txt" | `ctx pad import notes.txt` | +| "import from stdin" / pipe into pad | `cmd \| ctx pad import -` | +| "export all blobs" / "extract blobs to DIR" | `ctx pad export [DIR]` | +| "export blobs, overwrite existing" | `ctx pad export --force [DIR]` | +| "merge entries from another pad" | `ctx pad merge FILE...` | +| "merge with a different key" | `ctx pad merge --key /path/to/key FILE` | ## Execution @@ -70,7 +70,7 @@ ctx pad edit 1 "updated note text" **Append to an entry:** ```bash -ctx pad edit 3 --append " -- this is important" +ctx pad edit 3 --append " - this is important" ``` **Prepend to an entry:** @@ -122,16 +122,16 @@ When the user's intent is ambiguous: - "prioritize" / "bump up" / "move to top" → **mv N 1** - "deprioritize" / "move to bottom" → **mv N last** -When the user says "add" — check context: +When the user says "add": check context: - "add a note" / "add to my pad" → `ctx pad add` (new entry) - "add to entry 3" / "add this to the third one" → `ctx pad edit 3 --append` (modify existing) ## Important Notes - Keep the encryption key path (`~/.ctx/.ctx.key`) internal to - `ctx pad` commands — exposing it grants full decryption access + `ctx pad` commands: exposing it grants full decryption access to all pad entries -- Always use `ctx pad` to access entries — reading `scratchpad.enc` +- Always use `ctx pad` to access entries: reading `scratchpad.enc` directly yields unreadable ciphertext - If the user gets a "no key" error, tell them to obtain the key file from a teammate diff --git a/internal/assets/claude/skills/ctx-pause/SKILL.md b/internal/assets/claude/skills/ctx-pause/SKILL.md index 7b2dec5a..9668d069 100644 --- 
a/internal/assets/claude/skills/ctx-pause/SKILL.md +++ b/internal/assets/claude/skills/ctx-pause/SKILL.md @@ -45,4 +45,4 @@ Then confirm to the user: - **Resume before wrap-up**: if the session evolves into real work, resume hooks before wrapping up to capture learnings and decisions - **Initial context load is unaffected**: the ~8k token startup injection - happens before any command runs — pause only affects subsequent hooks + happens before any command runs: pause only affects subsequent hooks diff --git a/internal/assets/claude/skills/ctx-prompt/SKILL.md b/internal/assets/claude/skills/ctx-prompt/SKILL.md index 57a46bff..937cca62 100644 --- a/internal/assets/claude/skills/ctx-prompt/SKILL.md +++ b/internal/assets/claude/skills/ctx-prompt/SKILL.md @@ -5,8 +5,8 @@ allowed-tools: Bash(ctx:*) --- Apply reusable prompt templates from `.context/prompts/` to the -current task. Prompt templates are plain markdown files — no -frontmatter, no trigger rules — for common patterns like code +current task. Prompt templates are plain Markdown files: no +frontmatter, no trigger rules: for common patterns like code review, refactoring, or explaining code. ## When to Use @@ -24,13 +24,13 @@ review, refactoring, or explaining code. 
## Command Mapping -| User intent | Command | -|---|---| -| "list my prompts" / "what prompts do I have" | `ctx prompt list` | -| "show the code-review prompt" | `ctx prompt show code-review` | -| "create a new prompt called debug" | `ctx prompt add debug --stdin` (then ask for content) | -| "add the refactor template" | `ctx prompt add refactor` | -| "delete the debug prompt" | `ctx prompt rm debug` | +| User intent | Command | +|----------------------------------------------|-------------------------------------------------------| +| "list my prompts" / "what prompts do I have" | `ctx prompt list` | +| "show the code-review prompt" | `ctx prompt show code-review` | +| "create a new prompt called debug" | `ctx prompt add debug --stdin` (then ask for content) | +| "add the refactor template" | `ctx prompt add refactor` | +| "delete the debug prompt" | `ctx prompt rm debug` | ## Execution @@ -50,7 +50,7 @@ ctx prompt show Read the prompt content, then **follow the instructions in the prompt** applied to the user's current context. The prompt template tells you -what to do — treat it as your working instructions. +what to do: treat it as your working instructions. ## Interpreting User Intent @@ -68,7 +68,7 @@ When the user wants to create a prompt: ## Important Notes -- Prompt templates are plain markdown — no frontmatter parsing needed +- Prompt templates are plain markdown: no frontmatter parsing needed - Templates live in `.context/prompts/` and are committed to git by default - `ctx init` stamps starter templates (code-review, refactor, explain) - If a prompt is not found, suggest running `ctx prompt list` to see diff --git a/internal/assets/claude/skills/ctx-recall/SKILL.md b/internal/assets/claude/skills/ctx-recall/SKILL.md index 80788a64..5370fb05 100644 --- a/internal/assets/claude/skills/ctx-recall/SKILL.md +++ b/internal/assets/claude/skills/ctx-recall/SKILL.md @@ -68,17 +68,17 @@ Use `--full` for the complete conversation. 
Export sessions to the journal directory as markdown. -| Flag | Default | Purpose | -|-----------------------|---------|---------------------------------------------------| -| `--all` | false | Export all sessions (only new files by default) | -| `--all-projects` | false | Include all projects | -| `--regenerate` | false | Re-export existing files (preserves frontmatter) | -| `--keep-frontmatter` | true | Preserve enriched YAML frontmatter during regen | -| `--yes`, `-y` | false | Skip confirmation prompt | -| `--dry-run` | false | Preview what would be exported | +| Flag | Default | Purpose | +|----------------------|---------|--------------------------------------------------| +| `--all` | false | Export all sessions (only new files by default) | +| `--all-projects` | false | Include all projects | +| `--regenerate` | false | Re-export existing files (preserves frontmatter) | +| `--keep-frontmatter` | true | Preserve enriched YAML frontmatter during regen | +| `--yes`, `-y` | false | Skip confirmation prompt | +| `--dry-run` | false | Preview what would be exported | Accepts a session ID (always writes), or `--all` to export -everything (safe by default — only new sessions, existing +everything (safe by default: only new sessions, existing files skipped). Use `--regenerate` with `--all` to re-export existing files; YAML frontmatter is preserved by default. Use `--keep-frontmatter=false` to discard enriched frontmatter. @@ -155,7 +155,7 @@ ctx recall show ```bash ctx recall export --all ``` -This only exports new sessions — existing files are skipped. +This only exports new sessions: existing files are skipped. If the user asks what to do next, mention that `/ctx-journal-enrich-all` can enrich the exported journals. 
@@ -172,6 +172,6 @@ Before reporting results, verify: or topic - [ ] For export, reminded the user about the normalize/enrich pipeline as next steps -- [ ] Used `--all` for bulk export (safe — only new sessions) +- [ ] Used `--all` for bulk export (safe: only new sessions) - [ ] Suggested `--dry-run` when user seems uncertain - [ ] Only used `--regenerate` when explicitly needed diff --git a/internal/assets/claude/skills/ctx-remember/SKILL.md b/internal/assets/claude/skills/ctx-remember/SKILL.md index f22e329d..c96bfa1b 100644 --- a/internal/assets/claude/skills/ctx-remember/SKILL.md +++ b/internal/assets/claude/skills/ctx-remember/SKILL.md @@ -23,17 +23,17 @@ tracking, then there will be something to remember." ## When NOT to Use -- Context was already loaded this session via `/ctx-agent` — don't +- Context was already loaded this session via `/ctx-agent`: don't re-fetch what you already have - Mid-session when you are actively working on a task and context - is fresh — don't interrupt flow + is fresh: don't interrupt flow - When the user is asking about a *specific* past session by name - or ID — use `/ctx-recall` instead, which has list/show/export + or ID: use `/ctx-recall` instead, which has list/show/export subcommands ## Process -Do all of this **silently** — narrating the steps makes the readback +Do all of this **silently**: narrating the steps makes the readback feel like a file search rather than genuine recall: 1. **Load context packet**: @@ -56,7 +56,7 @@ Present your findings as a structured readback with these sections: most recent session from the session list. **Active work**: Pending and in-progress tasks from TASKS.md. Use -a brief list — one line per task with its status. +a brief list: one line per task with its status. **Recent context**: 1-2 recent decisions or learnings that are relevant. Pick the most recent or most impactful. @@ -66,11 +66,11 @@ tasks, or ask the user for direction if priorities are unclear. 
## Readback Rules -- Open directly with the readback — instead of "I don't have memory", +- Open directly with the readback: instead of "I don't have memory", present what you found -- Skip preamble like "Let me check" — go straight to the structured +- Skip preamble like "Let me check": go straight to the structured readback -- Present findings as recall, not discovery — you are *remembering*, +- Present findings as recall, not discovery: you are *remembering*, not *searching* - Be honest about the mechanism only if the user explicitly asks *how* you remember (e.g., "It's stored in context files managed diff --git a/internal/assets/claude/skills/ctx-remind/SKILL.md b/internal/assets/claude/skills/ctx-remind/SKILL.md index ad06f890..24b4c7d6 100644 --- a/internal/assets/claude/skills/ctx-remind/SKILL.md +++ b/internal/assets/claude/skills/ctx-remind/SKILL.md @@ -20,19 +20,19 @@ command. - For structured tasks with status tracking (use `ctx add task`) - For sensitive values or quick notes (use `ctx pad`) - For architectural decisions (use `ctx add decision`) -- Create a reminder only when the user explicitly says "remind me" — +- Create a reminder only when the user explicitly says "remind me": for everything else, let the conversation proceed without creating records ## Command Mapping -| User intent | Command | -|---|---| -| "remind me to refactor swagger" | `ctx remind "refactor swagger"` | -| "remind me tomorrow to check CI" | `ctx remind "check CI" --after YYYY-MM-DD` | +| User intent | Command | +|--------------------------------------|-----------------------------------------------| +| "remind me to refactor swagger" | `ctx remind "refactor swagger"` | +| "remind me tomorrow to check CI" | `ctx remind "check CI" --after YYYY-MM-DD` | | "remind me next week to review auth" | `ctx remind "review auth" --after YYYY-MM-DD` | -| "what reminders do I have?" 
| `ctx remind list` | -| "dismiss reminder 3" | `ctx remind dismiss 3` | -| "clear all reminders" | `ctx remind dismiss --all` | +| "what reminders do I have?" | `ctx remind list` | +| "dismiss reminder 3" | `ctx remind dismiss 3` | +| "clear all reminders" | `ctx remind dismiss --all` | ## Execution @@ -66,23 +66,23 @@ ctx remind dismiss --all The CLI only accepts `YYYY-MM-DD` for `--after`. You must convert natural language dates to this format. -| User says | You run | -|---|---| -| "remind me next session" | `ctx remind "..."` (no `--after`) | -| "remind me tomorrow" | `ctx remind "..." --after YYYY-MM-DD` (tomorrow's date) | -| "remind me next week" | `ctx remind "..." --after YYYY-MM-DD` (7 days from now) | -| "remind me about X" | `ctx remind "X"` (no `--after`, immediate) | -| "remind me after Friday" | `ctx remind "..." --after YYYY-MM-DD` (next Saturday) | +| User says | You run | +|--------------------------|---------------------------------------------------------| +| "remind me next session" | `ctx remind "..."` (no `--after`) | +| "remind me tomorrow" | `ctx remind "..." --after YYYY-MM-DD` (tomorrow's date) | +| "remind me next week" | `ctx remind "..." --after YYYY-MM-DD` (7 days from now) | +| "remind me about X" | `ctx remind "X"` (no `--after`, immediate) | +| "remind me after Friday" | `ctx remind "..." --after YYYY-MM-DD` (next Saturday) | If the date is ambiguous (e.g., "after the release"), ask the user for a specific date. 
## Important Notes -- Reminders fire **every session** until dismissed — no throttle +- Reminders fire **every session** until dismissed: no throttle - The `--after` flag gates when a reminder starts appearing, not when it expires -- IDs are never reused — after dismissing ID 3, the next gets ID 4+ +- IDs are never reused: after dismissing ID 3, the next gets ID 4+ - Reminders are stored in `.context/reminders.json` (committed to git) - After creating or dismissing, show the command output so the user can confirm the action diff --git a/internal/assets/claude/skills/ctx-sanitize-permissions/SKILL.md b/internal/assets/claude/skills/ctx-sanitize-permissions/SKILL.md index f7ef6b83..7111edef 100644 --- a/internal/assets/claude/skills/ctx-sanitize-permissions/SKILL.md +++ b/internal/assets/claude/skills/ctx-sanitize-permissions/SKILL.md @@ -5,7 +5,7 @@ description: "Audit settings.local.json for dangerous permissions. Use periodica Audit `.claude/settings.local.json` permissions for entries that bypass safety hooks, grant overly broad access, or create injection -vectors. This is a defense-in-depth measure — hooks block dangerous +vectors. This is a defense-in-depth measure: hooks block dangerous commands at runtime, but pre-approved permissions skip the confirmation step that makes hooks visible. @@ -47,53 +47,53 @@ Flag any permission matching these categories: These pre-approve commands that safety hooks are designed to intercept. The hook still runs, but the user never sees the -confirmation dialog — so they cannot reject it. +confirmation dialog: so they cannot reject it. 
-| Pattern | Why Dangerous | -|---------|---------------| -| `Bash(git push:*)` | Bypasses push-blocking hook confirmation | -| `Bash(git push)` | Same — exact match variant | -| `Bash(git push --force:*)` | Force push with no confirmation | +| Pattern | Why Dangerous | +|----------------------------|------------------------------------------| +| `Bash(git push:*)` | Bypasses push-blocking hook confirmation | +| `Bash(git push)` | Same: exact match variant | +| `Bash(git push --force:*)` | Force push with no confirmation | #### Category B: Destructive Commands (High) -| Pattern | Why Dangerous | -|---------|---------------| -| `Bash(rm -rf:*)` | Recursive delete with no confirmation | -| `Bash(git reset --hard:*)` | Discards uncommitted work | -| `Bash(git checkout .:*)` | Discards all unstaged changes | -| `Bash(git clean -f:*)` | Deletes untracked files | -| `Bash(git branch -D:*)` | Force-deletes branches | -| `Bash(sudo:*)` | Escalated privileges | +| Pattern | Why Dangerous | +|----------------------------|---------------------------------------| +| `Bash(rm -rf:*)` | Recursive delete with no confirmation | +| `Bash(git reset --hard:*)` | Discards uncommitted work | +| `Bash(git checkout .:*)` | Discards all unstaged changes | +| `Bash(git clean -f:*)` | Deletes untracked files | +| `Bash(git branch -D:*)` | Force-deletes branches | +| `Bash(sudo:*)` | Escalated privileges | #### Category C: Config Injection Vectors (High) -These allow the agent to modify files that control its own behavior -— a self-modification vector that could be exploited via prompt +These allow the agent to modify files that control its own behavior: +a self-modification vector that could be exploited via prompt injection. 
-| Pattern | Why Dangerous | -|---------|---------------| -| Any `Bash(...)` that could write to `.claude/settings.local.json` | Agent modifies its own permissions | -| Any `Bash(...)` that could write to `CLAUDE.md` | Agent modifies its own instructions | -| Any `Bash(...)` that could write to `.claude/hooks/*.sh` | Agent modifies safety hooks | -| Any `Bash(...)` that could write to `.context/CONSTITUTION.md` | Agent modifies its own hard rules | +| Pattern | Why Dangerous | +|-------------------------------------------------------------------|-------------------------------------| +| Any `Bash(...)` that could write to `.claude/settings.local.json` | Agent modifies its own permissions | +| Any `Bash(...)` that could write to `CLAUDE.md` | Agent modifies its own instructions | +| Any `Bash(...)` that could write to `.claude/hooks/*.sh` | Agent modifies safety hooks | +| Any `Bash(...)` that could write to `.context/CONSTITUTION.md` | Agent modifies its own hard rules | These are harder to detect by pattern alone. Look for overly broad permissions like `Bash(echo:*)`, `Bash(cat:*)`, `Bash(tee:*)`, `Bash(cp:*)` that could be composed into writes to sensitive paths. -Flag them as **informational** — they have legitimate uses but are +Flag them as **informational**: they have legitimate uses but are worth noting. 
#### Category D: Overly Broad (Medium) -| Pattern | Why Dangerous | -|---------|---------------| -| `Bash(*:*)` or `Bash(*)` | Allows any command | -| `Bash(curl:*)` | Arbitrary network requests | -| `Bash(wget:*)` | Arbitrary downloads | -| `Bash(pip install:*)` | Arbitrary package installation | -| `Bash(npm install:*)` | Arbitrary package installation | +| Pattern | Why Dangerous | +|--------------------------|--------------------------------| +| `Bash(*:*)` or `Bash(*)` | Allows any command | +| `Bash(curl:*)` | Arbitrary network requests | +| `Bash(wget:*)` | Arbitrary downloads | +| `Bash(pip install:*)` | Arbitrary package installation | +| `Bash(npm install:*)` | Arbitrary package installation | ### Step 3: Check for Duplicates @@ -119,7 +119,7 @@ Sort both `permissions.allow` and `permissions.deny` arrays in This produces a stable, predictable order that makes it easy to spot duplicates, find specific entries, and review diffs. -Apply the sort directly to the file — this is a non-destructive +Apply the sort directly to the file: this is a non-destructive reformat. Show the user a summary of what moved (e.g., "Sorted 45 allow entries and 8 deny entries into 4 tool groups"). @@ -131,16 +131,16 @@ Format findings by severity: ## Permission Audit Results ### Critical (hook bypass) -- `Bash(git push:*)` — bypasses block-git-push.sh +- `Bash(git push:*)`: bypasses block-git-push.sh ### High (destructive / injection vector) -- `Bash(rm -rf:*)` — recursive delete, no confirmation +- `Bash(rm -rf:*)`: recursive delete, no confirmation ### Medium (overly broad) -- `Bash(curl:*)` — arbitrary network access +- `Bash(curl:*)`: arbitrary network access ### Informational -- `Bash(cat:*)` — could compose into config file writes +- `Bash(cat:*)`: could compose into config file writes - 3 duplicate entries found ### Clean @@ -152,7 +152,7 @@ Format findings by severity: For each finding, offer a specific action: - **Critical/High**: "Remove this permission? 
(y/n)" -- **Medium**: "This is broad — do you want to keep it?" +- **Medium**: "This is broad: do you want to keep it?" - **Duplicates**: "Remove N duplicate entries?" - **Informational**: Note only, no action needed @@ -162,12 +162,12 @@ directly. Show the diff before and after. ## Important Notes - Show the user exactly what will be removed and get explicit - confirmation before editing — preventing accidental lockout + confirmation before editing: preventing accidental lockout preserves user agency - Permissions the user just granted in this session are more - likely intentional — note them but do not alarm + likely intentional: note them but do not alarm - Some broad permissions are legitimate for development - workflows (e.g., `Bash(go test:*)`) — use judgment + workflows (e.g., `Bash(go test:*)`): use judgment - The goal is awareness, not lockdown. Flag risks, let the user decide diff --git a/internal/assets/claude/skills/ctx-skill-audit/SKILL.md b/internal/assets/claude/skills/ctx-skill-audit/SKILL.md index 43bc4e2a..540f3fd7 100644 --- a/internal/assets/claude/skills/ctx-skill-audit/SKILL.md +++ b/internal/assets/claude/skills/ctx-skill-audit/SKILL.md @@ -19,7 +19,7 @@ improvements. ## Before Auditing 1. Read `references/anthropic-best-practices.md` from this - skill's directory — it contains the condensed audit criteria. + skill's directory: it contains the condensed audit criteria. 2. Identify which skill(s) to audit. If the user names a specific skill, audit that one. If they say "audit all skills," plan a batch pass. @@ -50,7 +50,7 @@ comments. Edit only the files specified in the task. Preserve existing -tests and comments — add new ones only when the user requests +tests and comments: add new ones only when the user requests them. @@ -71,7 +71,7 @@ importance. You MUST ALWAYS run tests before reporting completion. 
-Run tests before reporting completion — untested changes +Run tests before reporting completion: untested changes create silent regressions that compound across sessions. @@ -139,7 +139,7 @@ exist. ### 8. Scope Discipline Check whether the skill encourages work beyond what's -requested — "while you're in there" improvements, unsolicited +requested: "while you're in there" improvements, unsolicited refactoring, or scope creep. Skills should state the minimum viable outcome. @@ -168,16 +168,16 @@ or too narrow (misses common phrasings). 1. Read the skill's SKILL.md. 2. Apply all 9 audit dimensions. 3. Report findings using the output format below. -4. Suggest specific rewrites for any failures — show the +4. Suggest specific rewrites for any failures: show the current text and the proposed replacement. ### Batch Audit 1. List all skills to audit (bundled, live, or both). -2. Audit each skill directly in the main conversation — +2. Audit each skill directly in the main conversation: spawning one subagent per skill adds latency and context overhead that outweighs parallelism for typical batch sizes. -3. Report concisely — only dimensions that fail or have notable +3. Report concisely: only dimensions that fail or have notable findings. 4. Summarize with a scorecard at the end. @@ -204,7 +204,7 @@ For each audited skill, report: **Suggested fixes:** - [Dimension 2] Line "You MUST ALWAYS run tests" → - "Run tests before completion — untested changes create + "Run tests before completion: untested changes create silent regressions." - [Dimension 4] Add example showing expected output format after the "Report results" section. 
@@ -219,7 +219,7 @@ For batch audits, end with a summary: |--------------------|-------|--------------------------| | ctx-commit | 8/9 | Missing example | | ctx-drift | 7/9 | 2 bare mandates | -| ctx-verify | 9/9 | — | +| ctx-verify | 9/9 | - | ``` ## Quality Checklist diff --git a/internal/assets/claude/skills/ctx-skill-audit/references/anthropic-best-practices.md b/internal/assets/claude/skills/ctx-skill-audit/references/anthropic-best-practices.md index 10e3eeac..55960f57 100644 --- a/internal/assets/claude/skills/ctx-skill-audit/references/anthropic-best-practices.md +++ b/internal/assets/claude/skills/ctx-skill-audit/references/anthropic-best-practices.md @@ -52,7 +52,7 @@ but the primary instruction should describe the desired behavior. NEVER use ellipses. Your response will be read aloud by a text-to-speech -engine, so avoid ellipses — the engine cannot pronounce them. +engine, so avoid ellipses: the engine cannot pronounce them. ## Context and Motivation @@ -103,7 +103,7 @@ Best practices: - Nest tags when content has natural hierarchy. - Tags are especially valuable when the skill injects external content (file contents, user input, tool output) alongside - instructions — the tags prevent the agent from confusing + instructions: the tags prevent the agent from confusing injected content with skill instructions. **When XML tags help most:** skills that template in variable @@ -119,7 +119,7 @@ patterns: - Be explicit about which tool to use and when: "Use the Edit tool for modifications" beats "modify the file." - If a skill references tools, state expected behavior clearly: - "Read the file first, then edit" — not "look at the file." + "Read the file first, then edit": not "look at the file." **Overtriggering risk:** Claude 4.5/4.6 models are more responsive to system prompts than earlier models. Skills that @@ -163,14 +163,14 @@ Claude 4.5/4.6 models are less verbose and more direct. 
Skills written for earlier models may have compensating instructions that are now counterproductive: -- **Excessive emphasis**: CRITICAL, MUST, NEVER, ALWAYS in caps - — earlier models needed strong signals; current models may +- **Excessive emphasis**: CRITICAL, MUST, NEVER, ALWAYS in caps: + earlier models needed strong signals; current models may overtrigger or treat these as higher priority than intended. - **Redundant capability reminders**: "You are an expert at X" - or "You have the ability to Y" — the model already knows its + or "You have the ability to Y": the model already knows its capabilities. - **Verbose output templates**: asking for detailed summaries - after every action — current models skip unnecessary summaries + after every action: current models skip unnecessary summaries by default, which is usually better. **Calibration test:** read the skill's instructions and ask: @@ -185,7 +185,7 @@ adding unnecessary abstractions, or building in flexibility that wasn't requested. Skills should: -- Scope actions to what's requested — a bug fix skill shouldn't +- Scope actions to what's requested: a bug fix skill shouldn't also clean up surrounding code. - Avoid encouraging "while you're in there" improvements. - State the minimum viable outcome, not the maximum possible. diff --git a/internal/assets/claude/skills/ctx-skill-creator/SKILL.md b/internal/assets/claude/skills/ctx-skill-creator/SKILL.md index 1efcd065..f553f34d 100644 --- a/internal/assets/claude/skills/ctx-skill-creator/SKILL.md +++ b/internal/assets/claude/skills/ctx-skill-creator/SKILL.md @@ -34,14 +34,14 @@ knows: delete). Target: >70% Expert, <10% Redundant. **Description is the trigger.** The `description` field in frontmatter determines when a skill activates. All "when to use" context belongs there, not in the body. 
Claude tends to -undertrigger — make descriptions a little "pushy" by listing +undertrigger: make descriptions a little "pushy" by listing concrete situations and synonyms the user might say: ```yaml -# Weak — too vague, will undertrigger +# Weak: too vague, will undertrigger description: "Use when starting any conversation" -# Strong — specific triggers, covers synonyms +# Strong: specific triggers, covers synonyms description: >- Use after writing code, before commits, or when CI might fail. Also use when the user says 'run checks', 'lint this', @@ -61,7 +61,7 @@ memorizing a rule it might misapply. **Match communication to user skill level.** Pay attention to context cues. Some users are experienced developers; others are -new to terminals entirely. Briefly explain terms ("assertions — +new to terminals entirely. Briefly explain terms ("assertions: automated checks that verify the output") when in doubt. ## Skill Anatomy @@ -92,9 +92,9 @@ skill-name/ Skills use a three-level loading system: -1. **Metadata** (name + description) — always in context (~100 words) -2. **SKILL.md body** — loaded when skill triggers (<500 lines) -3. **Bundled resources** — loaded as needed (unlimited size; +1. **Metadata** (name + description): always in context (~100 words) +2. **SKILL.md body**: loaded when skill triggers (<500 lines) +3. **Bundled resources**: loaded as needed (unlimited size; scripts can execute without loading into context) Keep SKILL.md under 500 lines. If approaching that limit, move @@ -151,7 +151,7 @@ Output: feat(auth): implement JWT-based authentication ### Workflow Patterns -**Sequential** — overview the steps upfront: +**Sequential**: overview the steps upfront: ```markdown 1. Check formatting (`gofmt -l .`) @@ -160,7 +160,7 @@ Output: feat(auth): implement JWT-based authentication 4. Report results ``` -**Conditional** — guide through decision points: +**Conditional**: guide through decision points: ```markdown 1. 
Determine the change type: @@ -175,7 +175,7 @@ Output: feat(auth): implement JWT-based authentication Start by understanding what the user wants. Two paths: **From conversation**: The user says "turn this into a skill." -Extract answers from the conversation history — the tools used, +Extract answers from the conversation history: the tools used, the sequence of steps, corrections made, input/output formats observed. Confirm your understanding before proceeding. @@ -189,7 +189,7 @@ observed. Confirm your understanding before proceeding. Proactively ask about edge cases, dependencies, and success criteria *before* writing. Don't wait for the user to think of -everything — come prepared with questions based on what you +everything: come prepared with questions based on what you know about the domain. Check for existing skills in `.claude/skills/` to avoid @@ -201,7 +201,7 @@ Write the skill following the anatomy above. As you write: - Tag paragraphs (Expert/Activation/Redundant) and remove anything Redundant -- Write the description field to be "pushy" — cover synonyms, +- Write the description field to be "pushy": cover synonyms, concrete situations, edge-case triggers - Explain reasoning behind instructions instead of relying on rigid MUST/NEVER directives @@ -209,7 +209,7 @@ Write the skill following the anatomy above. As you write: ### 4. Test -Propose 2-3 realistic test prompts — the kind of thing a real +Propose 2-3 realistic test prompts: the kind of thing a real user would actually say. Share them: "Here are a few test cases I'd like to try. Do these look right, or do you want to add more?" 
@@ -296,5 +296,5 @@ Before finalizing a skill: - Capability lists ("Masters X, Y, Z...") - README, CHANGELOG, or auxiliary documentation - References to files that do not exist -- Great content with a vague description — it will never +- Great content with a vague description: it will never trigger diff --git a/internal/assets/claude/skills/ctx-skill-creator/references/anthropic-best-practices.md b/internal/assets/claude/skills/ctx-skill-creator/references/anthropic-best-practices.md index 10e3eeac..55960f57 100644 --- a/internal/assets/claude/skills/ctx-skill-creator/references/anthropic-best-practices.md +++ b/internal/assets/claude/skills/ctx-skill-creator/references/anthropic-best-practices.md @@ -52,7 +52,7 @@ but the primary instruction should describe the desired behavior. NEVER use ellipses. Your response will be read aloud by a text-to-speech -engine, so avoid ellipses — the engine cannot pronounce them. +engine, so avoid ellipses: the engine cannot pronounce them. ## Context and Motivation @@ -103,7 +103,7 @@ Best practices: - Nest tags when content has natural hierarchy. - Tags are especially valuable when the skill injects external content (file contents, user input, tool output) alongside - instructions — the tags prevent the agent from confusing + instructions: the tags prevent the agent from confusing injected content with skill instructions. **When XML tags help most:** skills that template in variable @@ -119,7 +119,7 @@ patterns: - Be explicit about which tool to use and when: "Use the Edit tool for modifications" beats "modify the file." - If a skill references tools, state expected behavior clearly: - "Read the file first, then edit" — not "look at the file." + "Read the file first, then edit": not "look at the file." **Overtriggering risk:** Claude 4.5/4.6 models are more responsive to system prompts than earlier models. Skills that @@ -163,14 +163,14 @@ Claude 4.5/4.6 models are less verbose and more direct. 
Skills written for earlier models may have compensating instructions that are now counterproductive: -- **Excessive emphasis**: CRITICAL, MUST, NEVER, ALWAYS in caps - — earlier models needed strong signals; current models may +- **Excessive emphasis**: CRITICAL, MUST, NEVER, ALWAYS in caps: + earlier models needed strong signals; current models may overtrigger or treat these as higher priority than intended. - **Redundant capability reminders**: "You are an expert at X" - or "You have the ability to Y" — the model already knows its + or "You have the ability to Y": the model already knows its capabilities. - **Verbose output templates**: asking for detailed summaries - after every action — current models skip unnecessary summaries + after every action: current models skip unnecessary summaries by default, which is usually better. **Calibration test:** read the skill's instructions and ask: @@ -185,7 +185,7 @@ adding unnecessary abstractions, or building in flexibility that wasn't requested. Skills should: -- Scope actions to what's requested — a bug fix skill shouldn't +- Scope actions to what's requested: a bug fix skill shouldn't also clean up surrounding code. - Avoid encouraging "while you're in there" improvements. - State the minimum viable outcome, not the maximum possible. diff --git a/internal/assets/claude/skills/ctx-spec/SKILL.md b/internal/assets/claude/skills/ctx-spec/SKILL.md index 59176990..d5fb8cad 100644 --- a/internal/assets/claude/skills/ctx-spec/SKILL.md +++ b/internal/assets/claude/skills/ctx-spec/SKILL.md @@ -55,19 +55,19 @@ Work through each section **one at a time**. For each section: **Section order and prompts:** -| Section | Prompt | -|---------|--------| -| **Problem** | "What user-visible problem does this solve? Why now?" | -| **Approach** | "High-level: how does this work? Where does it fit?" | -| **Happy Path** | "Walk me through what happens when everything goes right." | -| **Edge Cases** | "What could go wrong? 
Think: empty input, partial failure, duplicates, concurrency, missing deps." | -| **Validation Rules** | "What input constraints are enforced? Where?" | -| **Error Handling** | "For each error condition: what message does the user see? How do they recover?" | -| **Interface** | "CLI command? Skill? Both? What flags?" | -| **Implementation** | "Which files change? Key functions? Existing helpers to reuse?" | -| **Configuration** | "Any .ctxrc keys, env vars, or settings?" | -| **Testing** | "Unit, integration, edge case tests?" | -| **Non-Goals** | "What does this intentionally NOT do?" | +| Section | Prompt | +|----------------------|----------------------------------------------------------------------------------------------------| +| **Problem** | "What user-visible problem does this solve? Why now?" | +| **Approach** | "High-level: how does this work? Where does it fit?" | +| **Happy Path** | "Walk me through what happens when everything goes right." | +| **Edge Cases** | "What could go wrong? Think: empty input, partial failure, duplicates, concurrency, missing deps." | +| **Validation Rules** | "What input constraints are enforced? Where?" | +| **Error Handling** | "For each error condition: what message does the user see? How do they recover?" | +| **Interface** | "CLI command? Skill? Both? What flags?" | +| **Implementation** | "Which files change? Key functions? Existing helpers to reuse?" | +| **Configuration** | "Any .ctxrc keys, env vars, or settings?" | +| **Testing** | "Unit, integration, edge case tests?" | +| **Non-Goals** | "What does this intentionally NOT do?" | **Spend extra time on Edge Cases and Error Handling.** These are where specs earn their value. Push for at least 3 edge cases and @@ -94,7 +94,7 @@ Write the completed spec to `specs/{feature-name}.md`. Not every spec needs every section. 
If a section clearly does not apply (e.g., no CLI for an internal refactor), the user can say -"skip" and the section is omitted entirely — not left with +"skip" and the section is omitted entirely: not left with placeholder text. ## Quality Checklist diff --git a/internal/assets/claude/skills/ctx-verify/SKILL.md b/internal/assets/claude/skills/ctx-verify/SKILL.md index c642b4e8..f006945e 100644 --- a/internal/assets/claude/skills/ctx-verify/SKILL.md +++ b/internal/assets/claude/skills/ctx-verify/SKILL.md @@ -29,31 +29,31 @@ Run the relevant verification command before claiming a result. ## Workflow 1. **Identify** what command proves the claim -2. **Think through** what a passing result looks like — and what - a false positive would look like — before running +2. **Think through** what a passing result looks like: and what + a false positive would look like: before running 3. **Run** the command (fresh, not a previous run) 4. **Read** the full output; check exit code, count failures 5. **Report** actual results with evidence -Run the verification command fresh each time — reusing earlier output +Run the verification command fresh each time: reusing earlier output is unreliable because code changes between runs and stale results have caused false confidence. 
## Claim-to-Evidence Map -| Claim | Required Evidence | -|-------------------|---------------------------------------------------------| -| Tests pass | Test command output showing 0 failures | -| Linter clean | `golangci-lint run` output showing 0 errors | -| Build succeeds | `go build` exit 0 (linter passing is not enough) | -| Bug fixed | Original symptom no longer reproduces | -| Regression tested | Red-green cycle: test fails without fix, passes with it | -| All checks pass | `make audit` output showing all steps pass | -| Files match | `diff` showing no differences (e.g., template vs live) | +| Claim | Required Evidence | +|-------------------|-----------------------------------------------------------------------| +| Tests pass | Test command output showing 0 failures | +| Linter clean | `golangci-lint run` output showing 0 errors | +| Build succeeds | `go build` exit 0 (linter passing is not enough) | +| Bug fixed | Original symptom no longer reproduces | +| Regression tested | Red-green cycle: test fails without fix, passes with it | +| All checks pass | `make audit` output showing all steps pass | +| Files match | `diff` showing no differences (e.g., template vs live) | | Design is sound | Assumptions listed, failure modes identified, alternatives considered | -| Doc is accurate | Claims traced to source code or config; no stale references | -| Skill works | Trigger conditions tested, output matches spec, edge cases covered | -| Config is correct | Values validated against schema or runtime; no stale references | +| Doc is accurate | Claims traced to source code or config; no stale references | +| Skill works | Trigger conditions tested, output matches spec, edge cases covered | +| Config is correct | Values validated against schema or runtime; no stale references | ## Self-Audit Questions @@ -66,7 +66,7 @@ complete, run this checklist on your own output: - What would a reviewer question first? 
If any answer reveals a gap, address it before reporting done. -This applies to all artifact types — not just code. +This applies to all artifact types: not just code. ## Transform Vague Tasks into Verifiable Goals diff --git a/internal/assets/claude/skills/ctx-worktree/SKILL.md b/internal/assets/claude/skills/ctx-worktree/SKILL.md index a60780d6..bf354c03 100644 --- a/internal/assets/claude/skills/ctx-worktree/SKILL.md +++ b/internal/assets/claude/skills/ctx-worktree/SKILL.md @@ -31,7 +31,7 @@ Create a new worktree as a sibling directory with a `work/` branch. **Process:** -1. **Check count** — refuse if 4 worktrees already exist: +1. **Check count**: refuse if 4 worktrees already exist: ```bash git worktree list ``` @@ -85,7 +85,7 @@ Merge a completed worktree back and clean up. git merge "work/" ``` If there are conflicts, stop and help the user resolve them. - TASKS.md conflicts are common — see guidance below. + TASKS.md conflicts are common: see guidance below. 3. **Remove the worktree**: ```bash @@ -105,17 +105,17 @@ Merge a completed worktree back and clean up. 
## Guardrails -- **Max 4 worktrees** — more than 4 parallel tracks makes merge +- **Max 4 worktrees**: more than 4 parallel tracks makes merge complexity outweigh productivity gains -- **Sibling directories only** — worktrees go in `../-`, +- **Sibling directories only**: worktrees go in `../-`, never inside the project tree -- **`work/` branch prefix** — all worktree branches use `work/` +- **`work/` branch prefix**: all worktree branches use `work/` for easy identification and cleanup -- **No `ctx init` in worktrees** — the context directory is tracked +- **No `ctx init` in worktrees**: the context directory is tracked in git; running init would overwrite shared context files -- **Manage from main checkout only** — create and teardown worktrees +- **Manage from main checkout only**: create and teardown worktrees from the main working tree, not from inside a worktree -- **TASKS.md conflict resolution** — when merging, TASKS.md will +- **TASKS.md conflict resolution**: when merging, TASKS.md will often conflict because multiple agents marked different tasks as complete. Resolution: accept all `[x]` completions from both sides. No task should go from `[x]` back to `[ ]`. @@ -128,10 +128,10 @@ the project). All worktrees on the same machine share this path, so One thing to watch: -- **Journal enrichment** — `ctx recall export` and `ctx journal enrich` +- **Journal enrichment**: `ctx recall export` and `ctx journal enrich` resolve paths relative to the current working directory. Files created in a worktree stay in that worktree and are discarded on - teardown. Enrich journals on the main branch after merging — the + teardown. Enrich journals on the main branch after merging: the JSONL session logs are intact regardless. ## Task Grouping Guidance @@ -140,20 +140,20 @@ Before creating worktrees, analyze the backlog to group tasks into non-overlapping tracks: 1. **Read TASKS.md** and identify all pending tasks -2. 
**Estimate blast radius** — which files/directories does each +2. **Estimate blast radius**: which files/directories does each task touch? -3. **Group by non-overlapping directories** — tasks that touch the +3. **Group by non-overlapping directories**: tasks that touch the same package or file must go in the same track 4. **Present the grouping** to the user before creating worktrees: ```text Proposed worktree groups: - work/docs — recipe updates, blog post, getting started guide + work/docs : recipe updates, blog post, getting started guide (touches: docs/) - work/crypto — P3.1-P3.3 encrypted scratchpad infra + work/crypto : P3.1-P3.3 encrypted scratchpad infra (touches: internal/crypto/, internal/config/) - work/pad-cli — P3.4-P3.9 pad CLI commands + work/pad-cli : P3.4-P3.9 pad CLI commands (touches: internal/cli/pad/) ``` diff --git a/internal/assets/claude/skills/ctx-wrap-up/SKILL.md b/internal/assets/claude/skills/ctx-wrap-up/SKILL.md index 74380dfb..bf9a14e7 100644 --- a/internal/assets/claude/skills/ctx-wrap-up/SKILL.md +++ b/internal/assets/claude/skills/ctx-wrap-up/SKILL.md @@ -8,7 +8,7 @@ Guide end-of-session context persistence. Gather signal from the session, propose candidates worth persisting, and persist approved items via `ctx add`. -This is a **ceremony skill** — invoke it explicitly as `/ctx-wrap-up` +This is a **ceremony skill**: invoke it explicitly as `/ctx-wrap-up` at session end, not conversationally. It pairs with `/ctx-remember` at session start. @@ -29,14 +29,14 @@ tracking, then there will be something to wrap up." 
- Nothing meaningful happened (only read files, quick lookup) - The user already persisted everything manually with `ctx add` -- Mid-session when the user is still in flow — use `/ctx-reflect` +- Mid-session when the user is still in flow: use `/ctx-reflect` instead for mid-session checkpoints ## Process ### Phase 1: Gather signal -Do this **silently** — do not narrate the steps: +Do this **silently**: do not narrate the steps: 1. Check what changed in the working tree: ```bash @@ -64,7 +64,7 @@ potential candidate, ask yourself: - Is this substantial enough to record, or is it trivial? Present candidates in a structured list, grouped by type. -Skip categories with no candidates — do not show empty sections. +Skip categories with no candidates: do not show empty sections. ``` ## Session Wrap-Up @@ -93,20 +93,20 @@ Persist all? Or select which to keep? ### Phase 3: Persist approved candidates Wait for the user to approve, select, or modify candidates. -Wait for the user to approve each item before persisting — +Wait for the user to approve each item before persisting: candidates proposed by the agent may be incomplete or mischaracterized, and the user is the final authority on what belongs in their context. For each approved candidate, run the appropriate command: -| Type | Command | -|------------|---------------------------------------------------------------------------------------| -| Learning | `ctx add learning "Title" --context "..." --lesson "..." --application "..."` | -| Decision | `ctx add decision "Title" --context "..." --rationale "..." --consequences "..."` | -| Convention | `ctx add convention "Description"` | -| Task (new) | `ctx add task "Description"` | -| Task (done)| Edit TASKS.md to mark complete | +| Type | Command | +|-------------|-----------------------------------------------------------------------------------| +| Learning | `ctx add learning "Title" --context "..." --lesson "..." 
--application "..."` | +| Decision | `ctx add decision "Title" --context "..." --rationale "..." --consequences "..."` | +| Convention | `ctx add convention "Description"` | +| Task (new) | `ctx add task "Description"` | +| Task (done) | Edit TASKS.md to mark complete | Report the result of each command. If any fail, report the error and continue with the remaining items. @@ -140,27 +140,27 @@ Do not auto-commit. The user decides. ### Good candidates - "PyMdownx `details` extension wraps content in `
<details>`
-  tags, breaking `<summary>` rendering in MkDocs" — specific
+  tags, breaking `<summary>` rendering in MkDocs": specific
   gotcha, actionable for future sessions
 - "Decision: use file-based cooldown tokens instead of env vars
-  because hooks run in subprocesses" — real trade-off with
+  because hooks run in subprocesses": real trade-off with
   rationale
-- "Convention: all skill descriptions use imperative mood" —
+- "Convention: all skill descriptions use imperative mood":
   codifies a pattern for consistency
 
 ### Weak candidates (do not propose)
 
-- "Go has good error handling" — general knowledge, not
+- "Go has good error handling": general knowledge, not
   project-specific
-- "We edited main.go" — obvious from the diff, not an insight
-- "Tests should pass before committing" — too generic to be
+- "We edited main.go": obvious from the diff, not an insight
+- "Tests should pass before committing": too generic to be
   useful
 - Anything already present in LEARNINGS.md or DECISIONS.md
 
 ## Relationship to /ctx-reflect
 
 `/ctx-reflect` is for mid-session checkpoints at natural
-breakpoints. `/ctx-wrap-up` is for end-of-session — it's more
+breakpoints. `/ctx-wrap-up` is for end-of-session: it's more
 thorough, covers the full session arc, and includes the commit
 offer. If the user already ran `/ctx-reflect` recently, avoid
 proposing the same candidates again.
diff --git a/internal/assets/commands/commands.yaml b/internal/assets/commands/commands.yaml
index 5e6f6d47..e5117dbc 100644
--- a/internal/assets/commands/commands.yaml
+++ b/internal/assets/commands/commands.yaml
@@ -3,331 +3,684 @@
 # Keys use dot notation: parent.subcommand (e.g., pad.show)
 
 add:
-  long: "Add a new decision, task, learning, or convention\nto the appropriate context file.\n\nTypes:\n  decision    Add\
-    \ to DECISIONS.md (requires --context, --rationale, --consequences)\n  learning    Add to LEARNINGS.md (requires --context,\
-    \ --lesson, --application)\n  task        Add to TASKS.md\n  convention  Add to CONVENTIONS.md\n\nContent can be provided\
-    \ as:\n  - Command argument: ctx add learning \"title here\"\n  - File: ctx add learning --file /path/to/content.md\n\
-    \  - Stdin: echo \"title\" | ctx add learning\n\nExamples:\n  ctx add decision \"Use PostgreSQL\" \\\n    --context \"\
-    Need a reliable database for production\" \\\n    --rationale \"PostgreSQL offers ACID compliance and JSON support\" \\\
-    \n    --consequences \"Team needs PostgreSQL training\"\n  ctx add learning \"Go embed requires files in same package\"\
-    \ \\\n    --context \"Tried to embed files from parent directory\" \\\n    --lesson \"go:embed only works with files in\
-    \ same or child directories\" \\\n    --application \"Keep embedded files in internal/templates/, not project root\"\n\
-    \  ctx add task \"Implement user authentication\" --priority high"
+  long: |-
+    Add a new decision, task, learning, or convention
+    to the appropriate context file.
+
+    Types:
+      decision    Add to DECISIONS.md (requires --context, --rationale, --consequences)
+      learning    Add to LEARNINGS.md (requires --context, --lesson, --application)
+      task        Add to TASKS.md
+      convention  Add to CONVENTIONS.md
+
+    Content can be provided as:
+      - Command argument: ctx add learning "title here"
+      - File: ctx add learning --file /path/to/content.md
+      - Stdin: echo "title" | ctx add learning
+
+    Examples:
+      ctx add decision "Use PostgreSQL" \
+        --context "Need a reliable database for production" \
+        --rationale "PostgreSQL offers ACID compliance and JSON support" \
+        --consequences "Team needs PostgreSQL training"
+      ctx add learning "Go embed requires files in same package" \
+        --context "Tried to embed files from parent directory" \
+        --lesson "go:embed only works with files in same or child directories" \
+        --application "Keep embedded files in internal/templates/, not project root"
+      ctx add task "Implement user authentication" --priority high
   short: Add a new item to a context file
 agent:
-  long: "Print a concise context packet optimized for AI consumption.\n\nThe output is designed to be copy-pasted into an\
-    \ AI chat\nor piped to a system prompt. It includes:\n  - Constitution rules (NEVER VIOLATE)\n  - Current tasks (budget-capped)\n\
-    \  - Key conventions (budget-capped)\n  - Recent decisions (scored by relevance, full body)\n  - Key learnings (scored\
-    \ by relevance, full body)\n\nThe --budget flag controls content selection. Entries are scored by\nrecency and relevance\
-    \ to active tasks, then included in priority order\nuntil the budget is consumed. Entries that don't fit get title-only\n\
-    summaries in an \"Also Noted\" section.\n\nUse --budget to set token budget (default from .ctxrc or 8000).\nUse --format\
-    \ to choose between Markdown (md) or JSON output.\n\nCooldown (for hooks and automation):\n  --session identifies the\
-    \ caller (e.g., $PPID). Without it, cooldown\n  is disabled and every call produces output. When --session is set,\n \
-    \ repeated calls within the --cooldown window (default 10m) are suppressed.\n\nExamples:\n  ctx agent                \
-    \              # Default budget, Markdown output\n  ctx agent --budget 4000                # Smaller context packet\n\
-    \  ctx agent --format json                # JSON output for programmatic use\n  ctx agent --session $PPID            \
-    \  # Cooldown scoped to calling process"
+  long: |-
+    Print a concise context packet optimized for AI consumption.
+
+    The output is designed to be copy-pasted into an AI chat
+    or piped to a system prompt. It includes:
+      - Constitution rules (NEVER VIOLATE)
+      - Current tasks (budget-capped)
+      - Key conventions (budget-capped)
+      - Recent decisions (scored by relevance, full body)
+      - Key learnings (scored by relevance, full body)
+
+    The --budget flag controls content selection. Entries are scored by
+    recency and relevance to active tasks, then included in priority order
+    until the budget is consumed. Entries that don't fit get title-only
+    summaries in an "Also Noted" section.
+
+    Use --budget to set token budget (default from .ctxrc or 8000).
+    Use --format to choose between Markdown (md) or JSON output.
+
+    Cooldown (for hooks and automation):
+      --session identifies the caller (e.g., $PPID). Without it, cooldown
+      is disabled and every call produces output. When --session is set,
+      repeated calls within the --cooldown window (default 10m) are suppressed.
+
+    Examples:
+      ctx agent                              # Default budget, Markdown output
+      ctx agent --budget 4000                # Smaller context packet
+      ctx agent --format json                # JSON output for programmatic use
+      ctx agent --session $PPID              # Cooldown scoped to calling process
   short: Print AI-ready context packet
 changes:
-  long: "Show changes in context files and code since the last AI session.\n\nAutomatically detects the last session boundary\
-    \ from state markers.\nUse --since to specify a custom time range (duration like \"24h\" or\ndate like \"2026-03-01\"\
-    ).\n\nExamples:\n  ctx changes                     # changes since last session\n  ctx changes --since 24h         # changes\
-    \ in last 24 hours\n  ctx changes --since 2026-03-01  # changes since specific date"
+  long: |-
+    Show changes in context files and code since the last AI session.
+
+    Automatically detects the last session boundary from state markers.
+    Use --since to specify a custom time range (duration like "24h" or
+    date like "2026-03-01").
+
+    Examples:
+      ctx changes                     # changes since last session
+      ctx changes --since 24h         # changes in last 24 hours
+      ctx changes --since 2026-03-01  # changes since specific date
   short: Show what changed since last session
 compact:
-  long: "Consolidate and clean up context files.\n\nActions performed:\n  - Move completed tasks to \"Completed (Recent)\"\
-    \ section\n  - Archive old completed tasks (with --archive)\n  - Archive old decisions and learnings (with --archive)\n\
-    \  - Remove empty sections from context files\n  - Report on potential duplicates\n\nUse --archive to create .context/archive/\
-    \ for old content.\n\nExamples:\n  ctx compact                  # Clean up context, move completed tasks\n  ctx compact\
-    \ --archive        # Also archive old tasks, decisions, and learnings"
+  long: |-
+    Consolidate and clean up context files.
+
+    Actions performed:
+      - Move completed tasks to "Completed (Recent)" section
+      - Archive old completed tasks (with --archive)
+      - Archive old decisions and learnings (with --archive)
+      - Remove empty sections from context files
+      - Report on potential duplicates
+
+    Use --archive to create .context/archive/ for old content.
+
+    Examples:
+      ctx compact                  # Clean up context, move completed tasks
+      ctx compact --archive        # Also archive old tasks, decisions, and learnings
   short: Archive completed tasks and clean up context
 complete:
-  long: "Mark a task as completed in TASKS.md.\n\nYou can specify a task by:\n  - Task number (e.g., \"ctx complete 3\")\n\
-    \  - Partial text match (e.g., \"ctx complete auth\")\n  - Full task text (e.g., \"ctx complete 'Implement user authentication'\"\
-    )\n\nThe task will be marked with [x] \nand optionally moved to the Completed section."
+  long: |-
+    Mark a task as completed in TASKS.md.
+
+    You can specify a task by:
+      - Task number (e.g., "ctx tasks complete 3")
+      - Partial text match (e.g., "ctx tasks complete auth")
+      - Full task text (e.g., "ctx tasks complete 'Implement user authentication'")
+
+    The task will be marked with [x] and optionally moved to the Completed section.
   short: Mark a task as completed
 config:
-  long: "Manage runtime configuration profiles.\n\nSubcommands:\n  switch [dev|base]    Switch .ctxrc profile (no arg = toggle)\n\
-    \  status               Show active .ctxrc profile\n  schema               Print JSON Schema for .ctxrc"
+  long: |-
+    Manage runtime configuration profiles.
+
+    Subcommands:
+      switch [dev|base]    Switch .ctxrc profile (no arg = toggle)
+      status               Show active .ctxrc profile
+      schema               Print JSON Schema for .ctxrc
   short: Manage runtime configuration
 config.schema:
-  long: "Print the JSON Schema for .ctxrc to stdout.\n\nPipe-friendly — redirect to a file for IDE integration:\n\n  ctx config\
-    \ schema > .ctxrc.schema.json\n\nVS Code integration (requires redhat.vscode-yaml extension):\n\n  // .vscode/settings.json\n\
-    \  {\n    \"yaml.schemas\": {\n      \"./.ctxrc.schema.json\": \".ctxrc\"\n    }\n  }"
+  long: |-
+    Print the JSON Schema for .ctxrc to stdout.
+
+    Pipe-friendly — redirect to a file for IDE integration:
+
+      ctx config schema > .ctxrc.schema.json
+
+    VS Code integration (requires redhat.vscode-yaml extension):
+
+      // .vscode/settings.json
+      {
+        "yaml.schemas": {
+          "./.ctxrc.schema.json": ".ctxrc"
+        }
+      }
   short: Print JSON Schema for .ctxrc
 config.status:
   short: Show active .ctxrc profile
 config.switchcmd:
-  long: 'Switch between .ctxrc configuration profiles.
-
+  long: |-
+    Switch between .ctxrc configuration profiles.
 
     With no argument, toggles between dev and base.
-
     Accepts "prod" as an alias for "base".
 
-
     Source files (.ctxrc.base, .ctxrc.dev) are committed to git.
-
-    The working copy (.ctxrc) is gitignored.'
+    The working copy (.ctxrc) is gitignored.
   short: Switch .ctxrc profile
 ctx:
-  long: "ctx (Context) maintains persistent context files that help\n  AI coding assistants understand your project's architecture,\
-    \ conventions,\n  decisions, and current tasks.\n\n  Use 'ctx init' to create a .context/ directory in your project,\n\
-    \  then use 'ctx status', 'ctx load', and 'ctx agent' to work with context."
+  long: |-
+    ctx (Context) maintains persistent context files that help
+    AI coding assistants understand your project's architecture,
+    conventions, decisions, and current tasks.
+
+    Use 'ctx init' to create a .context/ directory in your project,
+    then use 'ctx status', 'ctx load', and 'ctx agent' to work with context.
   short: Context - persistent context for AI coding assistants
 decision:
-  long: "Manage the DECISIONS.md file and its quick-reference index.\n\nThe decisions file maintains an auto-generated index\
-    \ at the top for quick\nscanning. Use the subcommands to manage this index.\n\nSubcommands:\n  reindex    Regenerate the\
-    \ quick-reference index\n\nExamples:\n  ctx decisions reindex"
+  long: |-
+    Manage the DECISIONS.md file and its quick-reference index.
+
+    The decisions file maintains an auto-generated index at the top for quick
+    scanning. Use the subcommands to manage this index.
+
+    Subcommands:
+      reindex    Regenerate the quick-reference index
+
+    Examples:
+      ctx decisions reindex
   short: Manage DECISIONS.md file
 decision.reindex:
-  long: "Regenerate the quick-reference index at the top of DECISIONS.md.\n\nThe index is a compact table showing date and\
-    \ title for each decision,\nallowing AI agents to quickly scan entries without reading the full file.\n\nThis command\
-    \ is useful after manual edits to DECISIONS.md or when\nmigrating existing files to use the index format.\n\nExamples:\n\
-    \  ctx decisions reindex"
+  long: |-
+    Regenerate the quick-reference index at the top of DECISIONS.md.
+
+    The index is a compact table showing date and title for each decision,
+    allowing AI agents to quickly scan entries without reading the full file.
+
+    This command is useful after manual edits to DECISIONS.md or when
+    migrating existing files to use the index format.
+
+    Examples:
+      ctx decisions reindex
   short: Regenerate the quick-reference index
 deps:
-  long: "Generate a dependency graph from source code.\n\nOutputs a Mermaid graph of internal package dependencies by default.\n\
-    Use --external to include external module dependencies.\n\nSupported project types: Go, Node.js, Python, Rust.\nAuto-detected\
-    \ from manifest files (go.mod, package.json,\nrequirements.txt/pyproject.toml, Cargo.toml). Use --type to override.\n\n\
-    Output formats:\n  mermaid   Mermaid graph definition (default)\n  table     Package | Imports table\n  json      Machine-readable\
-    \ adjacency list"
+  long: |-
+    Generate a dependency graph from source code.
+
+    Outputs a Mermaid graph of internal package dependencies by default.
+    Use --external to include external module dependencies.
+
+    Supported project types: Go, Node.js, Python, Rust.
+    Auto-detected from manifest files (go.mod, package.json,
+    requirements.txt/pyproject.toml, Cargo.toml). Use --type to override.
+
+    Output formats:
+      mermaid   Mermaid graph definition (default)
+      table     Package | Imports table
+      json      Machine-readable adjacency list
   short: Show package dependency graph
 doctor:
-  long: "Run mechanical health checks across context, hooks, and configuration.\n\nChecks:\n  - Context initialized and required\
-    \ files present\n  - .ctxrc validation (unknown fields, typos)\n  - Drift detected (stale paths, missing files)\n  - Plugin\
-    \ installed and enabled\n  - Event logging status\n  - Webhook configured\n  - Pending reminders\n  - Task completion\
-    \ ratio\n  - Context token size\n  - System resources (memory, swap, disk, load)\n\nUse --json for machine-readable output."
+  long: |-
+    Run mechanical health checks across context, hooks, and configuration.
+
+    Checks:
+      - Context initialized and required files present
+      - .ctxrc validation (unknown fields, typos)
+      - Drift detected (stale paths, missing files)
+      - Plugin installed and enabled
+      - Event logging status
+      - Webhook configured
+      - Pending reminders
+      - Task completion ratio
+      - Context token size
+      - System resources (memory, swap, disk, load)
+
+    Use --json for machine-readable output.
   short: Structural health check
 drift:
-  long: "Run drift detection to find stale paths,\nbroken references, and constitution violations.\n\nChecks performed:\n\
-    \  - Path references in ARCHITECTURE.md and CONVENTIONS.md exist\n  - Staleness indicators (many completed tasks)\n  -\
-    \ Constitution rule violations (potential secrets)\n  - Required files are present\n\nUse --json for machine-readable\
-    \ output."
+  long: |-
+    Run drift detection to find stale paths, broken references,
+    and constitution violations.
+
+    Checks performed:
+      - Path references in ARCHITECTURE.md and CONVENTIONS.md exist
+      - Staleness indicators (many completed tasks)
+      - Constitution rule violations (potential secrets)
+      - Required files are present
+
+    Use --json for machine-readable output.
   short: Detect stale or invalid context
 guide:
-  long: 'Use-case-oriented cheat sheet for ctx.
-
+  long: |-
+    Use-case-oriented cheat sheet for ctx.
 
     Shows core commands grouped by workflow, key skills, and common recipes.
-
     Default output fits one screen.
 
-
     Use --skills to list all available slash-command skills.
-
-    Use --commands to list all CLI commands.'
+    Use --commands to list all CLI commands.
   short: Quick-reference cheat sheet for ctx
 hook:
-  long: "Generate configuration and instructions\nfor integrating Context with AI tools.\n\nSupported tools:\n  claude-code\
-    \  - Anthropic's Claude Code CLI (use plugin instead)\n  cursor       - Cursor IDE\n  aider        - Aider AI coding assistant\n\
-    \  copilot      - GitHub Copilot\n  windsurf     - Windsurf IDE\n\nUse --write to generate the configuration file directly:\n\
-    \  ctx hook copilot --write    # Creates .github/copilot-instructions.md\n\nExample:\n  ctx hook cursor"
+  long: |-
+    Generate configuration and instructions for integrating
+    Context with AI tools.
+
+    Supported tools:
+      claude-code  - Anthropic's Claude Code CLI (use plugin instead)
+      cursor       - Cursor IDE
+      aider        - Aider AI coding assistant
+      copilot      - GitHub Copilot
+      windsurf     - Windsurf IDE
+
+    Use --write to generate the configuration file directly:
+      ctx hook copilot --write    # Creates .github/copilot-instructions.md
+
+    Example:
+      ctx hook cursor
   short: Generate AI tool integration configs
 initialize:
-  long: "Initialize a new .context/ directory with template files for\nmaintaining persistent context for AI coding assistants.\n\
-    \nThe following files are created:\n  - CONSTITUTION.md  — Hard invariants that must never be violated\n  - TASKS.md \
-    \        — Current and planned work\n  - DECISIONS.md     — Architectural decisions with rationale\n  - LEARNINGS.md \
-    \    — Lessons learned, gotchas, tips\n  - CONVENTIONS.md   — Project patterns and standards\n  - ARCHITECTURE.md  — System\
-    \ overview\n  - GLOSSARY.md      — Domain terms and abbreviations\n  - AGENT_PLAYBOOK.md — How AI agents should use this\
-    \ system\n\nAdditionally, in the project root:\n  - PROMPT.md              — Session prompt for AI agents\n  - IMPLEMENTATION_PLAN.md\
-    \ — High-level project direction\n  - CLAUDE.md              — Claude Code configuration\n\nUse --minimal to only create\
-    \ essential files\n(TASKS.md, DECISIONS.md, CONSTITUTION.md).\n\nUse --ralph for autonomous loop mode where the agent\
-    \ works without\nasking clarifying questions, uses completion signals, and follows\none-task-per-iteration discipline.\n\
-    \nBy default (without --ralph), the agent is encouraged to ask questions\nwhen requirements are unclear — better for collaborative\
-    \ sessions.\n\nIf the ctx Claude Code plugin is installed, init auto-enables it in\n~/.claude/settings.json so it works\
-    \ across all projects.\nUse --no-plugin-enable to skip this step.\n\nExamples:\n  ctx init           # Collaborative mode\
-    \ (agent asks questions)\n  ctx init --ralph   # Autonomous mode (agent works independently)\n  ctx init --minimal # Only\
-    \ essential files (TASKS, DECISIONS, CONSTITUTION)\n  ctx init --force   # Overwrite existing files without prompting\n\
-    \  ctx init --merge   # Auto-merge ctx content into existing files"
+  long: |-
+    Initialize a new .context/ directory with template files for
+    maintaining persistent context for AI coding assistants.
+
+    The following files are created:
+      - CONSTITUTION.md  — Hard invariants that must never be violated
+      - TASKS.md         — Current and planned work
+      - DECISIONS.md     — Architectural decisions with rationale
+      - LEARNINGS.md     — Lessons learned, gotchas, tips
+      - CONVENTIONS.md   — Project patterns and standards
+      - ARCHITECTURE.md  — System overview
+      - GLOSSARY.md      — Domain terms and abbreviations
+      - AGENT_PLAYBOOK.md — How AI agents should use this system
+
+    Additionally, in the project root:
+      - PROMPT.md              — Session prompt for AI agents
+      - IMPLEMENTATION_PLAN.md — High-level project direction
+      - CLAUDE.md              — Claude Code configuration
+
+    Use --minimal to only create essential files
+    (TASKS.md, DECISIONS.md, CONSTITUTION.md).
+
+    Use --ralph for autonomous loop mode where the agent works without
+    asking clarifying questions, uses completion signals, and follows
+    one-task-per-iteration discipline.
+
+    By default (without --ralph), the agent is encouraged to ask questions
+    when requirements are unclear — better for collaborative sessions.
+
+    If the ctx Claude Code plugin is installed, init auto-enables it in
+    ~/.claude/settings.json so it works across all projects.
+    Use --no-plugin-enable to skip this step.
+
+    Examples:
+      ctx init           # Collaborative mode (agent asks questions)
+      ctx init --ralph   # Autonomous mode (agent works independently)
+      ctx init --minimal # Only essential files (TASKS, DECISIONS, CONSTITUTION)
+      ctx init --force   # Overwrite existing files without prompting
+      ctx init --merge   # Auto-merge ctx content into existing files
   short: Initialize a new .context/ directory with template files
 journal:
-  long: "Work with exported session files in .context/journal/.\n\nThe journal system provides tools for analyzing, enriching,\
-    \ and\npublishing your AI session history.\n\nSubcommands:\n  site      Generate a static site from journal entries\n\
-    \  obsidian  Generate an Obsidian vault from journal entries\n\nExamples:\n  ctx journal site                    # Generate\
-    \ site in .context/journal-site/\n  ctx journal site --output ~/public  # Custom output directory\n  ctx journal site\
-    \ --serve            # Generate and serve locally\n  ctx journal obsidian                # Generate Obsidian vault"
+  long: |-
+    Work with exported session files in .context/journal/.
+
+    The journal system provides tools for analyzing, enriching, and
+    publishing your AI session history.
+
+    Subcommands:
+      site      Generate a static site from journal entries
+      obsidian  Generate an Obsidian vault from journal entries
+
+    Examples:
+      ctx journal site                    # Generate site in .context/journal-site/
+      ctx journal site --output ~/public  # Custom output directory
+      ctx journal site --serve            # Generate and serve locally
+      ctx journal obsidian                # Generate Obsidian vault
   short: Analyze and synthesize exported sessions
 journal.obsidian:
-  long: "Generate an Obsidian-compatible vault from .context/journal/ entries.\n\nCreates a vault structure with:\n  - Wikilinks\
-    \ for internal navigation\n  - MOC (Map of Content) pages for topics, files, and types\n  - Related sessions footer for\
-    \ graph connectivity\n  - Minimal .obsidian/ configuration\n\nExamples:\n  ctx journal obsidian                      \
-    \    # Generate in .context/journal-obsidian/\n  ctx journal obsidian --output ~/vaults/ctx    # Custom output directory"
+  long: |-
+    Generate an Obsidian-compatible vault from .context/journal/ entries.
+
+    Creates a vault structure with:
+      - Wikilinks for internal navigation
+      - MOC (Map of Content) pages for topics, files, and types
+      - Related sessions footer for graph connectivity
+      - Minimal .obsidian/ configuration
+
+    Examples:
+      ctx journal obsidian                          # Generate in .context/journal-obsidian/
+      ctx journal obsidian --output ~/vaults/ctx    # Custom output directory
   short: Generate an Obsidian vault from journal entries
 journal.site:
-  long: "Generate a zensical-compatible static site from .context/journal/ entries.\n\nCreates a site structure with:\n  -\
-    \ Index page with all sessions listed by date\n  - Individual pages for each journal entry\n  - Navigation and search\
-    \ support\n\nRequires zensical to be installed for building/serving:\n  pipx install zensical\n\nExamples:\n  ctx journal\
-    \ site                    # Generate in .context/journal-site/\n  ctx journal site --output ~/public  # Custom output\
-    \ directory\n  ctx journal site --build            # Generate and build HTML\n  ctx journal site --serve            #\
-    \ Generate and serve locally"
+  long: |-
+    Generate a zensical-compatible static site from .context/journal/ entries.
+
+    Creates a site structure with:
+      - Index page with all sessions listed by date
+      - Individual pages for each journal entry
+      - Navigation and search support
+
+    Requires zensical to be installed for building/serving:
+      pipx install zensical
+
+    Examples:
+      ctx journal site                    # Generate in .context/journal-site/
+      ctx journal site --output ~/public  # Custom output directory
+      ctx journal site --build            # Generate and build HTML
+      ctx journal site --serve            # Generate and serve locally
   short: Generate a static site from journal entries
 learnings:
-  long: "Manage the LEARNINGS.md file and its quick-reference index.\n\nThe learnings file maintains an auto-generated index\
-    \ at the top for quick\nscanning. Use the subcommands to manage this index.\n\nSubcommands:\n  reindex    Regenerate the\
-    \ quick-reference index\n\nExamples:\n  ctx learnings reindex"
+  long: |-
+    Manage the LEARNINGS.md file and its quick-reference index.
+
+    The learnings file maintains an auto-generated index at the top for quick
+    scanning. Use the subcommands to manage this index.
+
+    Subcommands:
+      reindex    Regenerate the quick-reference index
+
+    Examples:
+      ctx learnings reindex
   short: Manage LEARNINGS.md file
 learnings.reindex:
-  long: "Regenerate the quick-reference index at the top of LEARNINGS.md.\n\nThe index is a compact table showing date and\
-    \ title for each learning,\nallowing AI agents to quickly scan entries without reading the full file.\n\nThis command\
-    \ is useful after manual edits to LEARNINGS.md or when\nmigrating existing files to use the index format.\n\nExamples:\n\
-    \  ctx learnings reindex"
+  long: |-
+    Regenerate the quick-reference index at the top of LEARNINGS.md.
+
+    The index is a compact table showing date and title for each learning,
+    allowing AI agents to quickly scan entries without reading the full file.
+
+    This command is useful after manual edits to LEARNINGS.md or when
+    migrating existing files to use the index format.
+
+    Examples:
+      ctx learnings reindex
   short: Regenerate the quick-reference index
 load:
-  long: "Load and display the assembled context\nas it would be provided to an AI.\n\nThe context files are assembled in the\
-    \ recommended read order:\n  1. CONSTITUTION.md\n  2. TASKS.md\n  3. CONVENTIONS.md\n  4. ARCHITECTURE.md\n  5. DECISIONS.md\n\
-    \  6. LEARNINGS.md\n  7. GLOSSARY.md\n  8. AGENT_PLAYBOOK.md\n\nUse --raw to output raw file contents without headers\
-    \ or assembly.\nUse --budget to limit output to a specific token count (default from .ctxrc or 8000)."
+  long: |-
+    Load and display the assembled context as it would be provided to an AI.
+
+    The context files are assembled in the recommended read order:
+      1. CONSTITUTION.md
+      2. TASKS.md
+      3. CONVENTIONS.md
+      4. ARCHITECTURE.md
+      5. DECISIONS.md
+      6. LEARNINGS.md
+      7. GLOSSARY.md
+      8. AGENT_PLAYBOOK.md
+
+    Use --raw to output raw file contents without headers or assembly.
+    Use --budget to limit output to a specific token count (default from .ctxrc or 8000).
   short: Output assembled context Markdown
 loop:
-  long: "Generate a ready-to-use shell script for running a Ralph loop.\n\nA Ralph loop continuously runs an AI assistant\
-    \ with the same prompt until\na completion signal is detected. This enables iterative development where\nthe AI can build\
-    \ on its previous work.\n\nExamples:\n  ctx loop                           # Generate loop.sh for Claude\n  ctx loop --tool\
-    \ aider              # Generate for Aider\n  ctx loop --prompt TASKS.md         # Use custom prompt file\n  ctx loop --max-iterations\
-    \ 10       # Limit to 10 iterations\n  ctx loop -o my-loop.sh             # Output to custom file"
+  long: |-
+    Generate a ready-to-use shell script for running a Ralph loop.
+
+    A Ralph loop continuously runs an AI assistant with the same prompt until
+    a completion signal is detected. This enables iterative development where
+    the AI can build on its previous work.
+
+    Examples:
+      ctx loop                           # Generate loop.sh for Claude
+      ctx loop --tool aider              # Generate for Aider
+      ctx loop --prompt TASKS.md         # Use custom prompt file
+      ctx loop --max-iterations 10       # Limit to 10 iterations
+      ctx loop -o my-loop.sh             # Output to custom file
   short: Generate a Ralph loop script
 mcp:
   short: Model Context Protocol server
+mcp.serve:
+  long: |-
+    Start the MCP server, communicating via JSON-RPC 2.0 over stdin/stdout.
+
+    This command is intended to be invoked by MCP clients (AI tools), not
+    run directly by users. Configure your AI tool to run 'ctx mcp serve'
+    as an MCP server.
+  short: Start the MCP server (stdin/stdout)
 memory:
-  long: "Bridge Claude Code's auto memory (MEMORY.md) into .context/.\n\nDiscovers MEMORY.md from ~/.claude/projects/, mirrors\
-    \ it into\n.context/memory/mirror.md (git-tracked), and detects drift.\n\nSubcommands:\n  sync       Copy MEMORY.md to\
-    \ mirror, archive previous version\n  status     Show drift, timestamps, and entry counts\n  diff       Show what changed\
-    \ since last sync\n  import     Classify and promote entries to .context/ files\n  publish    Push curated .context/ content\
-    \ to MEMORY.md\n  unpublish  Remove published block from MEMORY.md"
+  long: |-
+    Bridge Claude Code's auto memory (MEMORY.md) into .context/.
+
+    Discovers MEMORY.md from ~/.claude/projects/, mirrors it into
+    .context/memory/mirror.md (git-tracked), and detects drift.
+
+    Subcommands:
+      sync       Copy MEMORY.md to mirror, archive previous version
+      status     Show drift, timestamps, and entry counts
+      diff       Show what changed since last sync
+      import     Classify and promote entries to .context/ files
+      publish    Push curated .context/ content to MEMORY.md
+      unpublish  Remove published block from MEMORY.md
   short: Bridge Claude Code auto memory into .context/
+memory.diff:
+  long: |-
+    Show a line-based diff between .context/memory/mirror.md and the
+    current MEMORY.md. No output when files are identical.
+  short: Show what changed since last sync
+memory.import:
+  long: |-
+    Classify and promote entries from Claude Code's MEMORY.md into
+    structured .context/ files using heuristic keyword matching.
+
+    Each entry is classified as a convention, decision, learning, task,
+    or skipped (session notes, generic text). Deduplication prevents
+    re-importing the same entry.
+
+    Exit codes:
+      0  Imported successfully (or nothing new to import)
+      1  MEMORY.md not found
+  short: Import entries from MEMORY.md into .context/ files
+memory.publish:
+  long: |-
+    Push curated .context/ content into Claude Code's MEMORY.md
+    so the agent sees structured project context on session start.
+
+    Content is wrapped in ctx-managed begin/end markers.
+    Claude-owned content outside the markers is preserved.
+
+    Exit codes:
+      0  Published successfully
+      1  MEMORY.md not found
+  short: Push curated context to MEMORY.md
 memory.status:
-  long: "Show memory bridge status: source location, last sync time,\nline counts, drift indicator, and archive count.\n\n\
-    Exit codes:\n  0  No drift\n  1  MEMORY.md not found\n  2  Drift detected (MEMORY.md changed since last sync)"
+  long: |-
+    Show memory bridge status: source location, last sync time,
+    line counts, drift indicator, and archive count.
+
+    Exit codes:
+      0  No drift
+      1  MEMORY.md not found
+      2  Drift detected (MEMORY.md changed since last sync)
   short: Show drift, timestamps, and entry counts
 memory.sync:
-  long: "Copy Claude Code's MEMORY.md to .context/memory/mirror.md.\n\nArchives the previous mirror before overwriting. Reports\
-    \ line counts\nand drift since last sync.\n\nExit codes:\n  0  Synced successfully\n  1  MEMORY.md not found (auto memory\
-    \ not active)"
+  long: |-
+    Copy Claude Code's MEMORY.md to .context/memory/mirror.md.
+
+    Archives the previous mirror before overwriting. Reports line counts
+    and drift since last sync.
+
+    Exit codes:
+      0  Synced successfully
+      1  MEMORY.md not found (auto memory not active)
   short: Copy MEMORY.md to mirror, archive previous version
+memory.unpublish:
+  long: |-
+    Remove the ctx-managed marker block from MEMORY.md,
+    preserving all Claude-owned content outside the markers.
+  short: Remove published context from MEMORY.md
 notify:
-  long: "Send a fire-and-forget webhook notification.\n\nRequires a configured webhook URL (see \"ctx notify setup\").\nSilent\
-    \ noop when no webhook is configured or the event is filtered.\n\nExamples:\n  ctx notify --event loop \"Loop completed\
-    \ after 5 iterations\"\n  ctx notify -e nudge -s session-abc \"Context checkpoint at prompt #20\"\n  ctx notify -e relay\
-    \ --hook check-version --variant mismatch \"Version mismatch\""
+  long: |-
+    Send a fire-and-forget webhook notification.
+
+    Requires a configured webhook URL (see "ctx notify setup").
+    Silent noop when no webhook is configured or the event is filtered.
+
+    Examples:
+      ctx notify --event loop "Loop completed after 5 iterations"
+      ctx notify -e nudge -s session-abc "Context checkpoint at prompt #20"
+      ctx notify -e relay --hook check-version --variant mismatch "Version mismatch"
   short: Send a webhook notification
 notify.setup:
-  long: 'Prompts for a webhook URL and encrypts it using the scratchpad key.
-
+  long: |-
+    Prompts for a webhook URL and encrypts it using the scratchpad key.
 
     The URL is stored in .context/.notify.enc (encrypted, safe to commit).
-
-    The key lives at ~/.ctx/.ctx.key (user-level, never committed).'
+    The key lives at ~/.ctx/.ctx.key (user-level, never committed).
   short: Configure webhook URL
 notify.test:
   long: Sends a test notification to the configured webhook and reports the HTTP status.
   short: Send a test notification
 pad:
-  long: "Manage an encrypted scratchpad stored in .context/.\n\nEntries are short one-liners encrypted with AES-256-GCM. The\
-    \ key is\nstored at ~/.ctx/.ctx.key (global, user-level). The encrypted file\n(.context/scratchpad.enc) is committed to\
-    \ git.\n\nFile blobs can be stored as entries using \"add --file\". Blob entries use\nthe format \"label:::base64data\"\
-    \ and are shown as \"label [BLOB]\" in the\nlist view. Use \"show N\" to decode or \"show N --out file\" to write to disk.\n\
-    \nWhen invoked without a subcommand, lists all entries.\n\nSubcommands:\n  show     Output raw text of an entry by number\n\
-    \  add      Append a new entry\n  rm       Remove an entry by number\n  edit     Replace an entry by number\n  mv    \
-    \   Move an entry to a different position\n  resolve  Show both sides of a merge conflict\n  import   Bulk-import lines\
-    \ from a file\n  export   Export blob entries to a directory as files\n  merge    Merge entries from scratchpad files"
+  long: |-
+    Manage an encrypted scratchpad stored in .context/.
+
+    Entries are short one-liners encrypted with AES-256-GCM. The key is
+    stored at ~/.ctx/.ctx.key (global, user-level). The encrypted file
+    (.context/scratchpad.enc) is committed to git.
+
+    File blobs can be stored as entries using "add --file". Blob entries use
+    the format "label:::base64data" and are shown as "label [BLOB]" in the
+    list view. Use "show N" to decode or "show N --out file" to write to disk.
+
+    When invoked without a subcommand, lists all entries.
+
+    Subcommands:
+      show     Output raw text of an entry by number
+      add      Append a new entry
+      rm       Remove an entry by number
+      edit     Replace an entry by number
+      mv       Move an entry to a different position
+      resolve  Show both sides of a merge conflict
+      import   Bulk-import lines from a file
+      export   Export blob entries to a directory as files
+      merge    Merge entries from scratchpad files
   short: Encrypted scratchpad for sensitive one-liners
 pad.add:
   short: Append a new entry to the scratchpad
 pad.edit:
-  long: "Replace, append to, or prepend to an entry by number.\n\nBy default, replaces the entire entry with the positional\
-    \ TEXT argument.\nUse --append to add text to the end of an existing entry, or --prepend\nto add text to the beginning.\n\
-    \nFor blob entries, use --file to replace file content and/or --label to\nchange the label.\n\nExamples:\n  ctx pad edit\
-    \ 2 \"new text\"           # replace entry 2\n  ctx pad edit 2 --append \"suffix\"    # append to entry 2\n  ctx pad edit\
-    \ 2 --prepend \"prefix\"   # prepend to entry 2\n  ctx pad edit 2 --file ./v2.md       # replace blob file content\n \
-    \ ctx pad edit 2 --label \"new name\"   # rename blob label\n  ctx pad edit 2 --file ./v2.md --label \"new\"  # replace\
-    \ both"
+  long: |-
+    Replace, append to, or prepend to an entry by number.
+
+    By default, replaces the entire entry with the positional TEXT argument.
+    Use --append to add text to the end of an existing entry, or --prepend
+    to add text to the beginning.
+
+    For blob entries, use --file to replace file content and/or --label to
+    change the label.
+
+    Examples:
+      ctx pad edit 2 "new text"           # replace entry 2
+      ctx pad edit 2 --append "suffix"    # append to entry 2
+      ctx pad edit 2 --prepend "prefix"   # prepend to entry 2
+      ctx pad edit 2 --file ./v2.md       # replace blob file content
+      ctx pad edit 2 --label "new name"   # rename blob label
+      ctx pad edit 2 --file ./v2.md --label "new"  # replace both
   short: Replace, append to, or prepend to an entry by number
 pad.export:
-  long: "Export all blob entries from the scratchpad to a directory as files.\nEach blob's label becomes the filename. Non-blob\
-    \ entries are skipped.\n\nWhen a file already exists, a unix timestamp is prepended to avoid\ncollisions. Use --force\
-    \ to overwrite instead.\n\nExamples:\n  ctx pad export\n  ctx pad export ./ideas\n  ctx pad export --dry-run\n  ctx pad\
-    \ export --force ./backup"
+  long: |-
+    Export all blob entries from the scratchpad to a directory as files.
+    Each blob's label becomes the filename. Non-blob entries are skipped.
+
+    When a file already exists, a Unix timestamp is prepended to avoid
+    collisions. Use --force to overwrite instead.
+
+    Examples:
+      ctx pad export
+      ctx pad export ./ideas
+      ctx pad export --dry-run
+      ctx pad export --force ./backup
   short: Export blob entries to a directory as files
 pad.imp:
-  long: "Import lines from a file into the scratchpad. Each non-empty line\nbecomes a separate entry. Use \"-\" to read from\
-    \ stdin.\n\nWith --blobs, import all first-level files from a directory as blob entries.\nEach file becomes a blob with\
-    \ the filename as its label. Subdirectories and\nnon-regular files are skipped.\n\nExamples:\n  ctx pad import notes.txt\n\
-    \  grep pattern file | ctx pad import -\n  ctx pad import --blobs ./ideas/"
+  long: |-
+    Import lines from a file into the scratchpad. Each non-empty line
+    becomes a separate entry. Use "-" to read from stdin.
+
+    With --blobs, import all first-level files from a directory as blob entries.
+    Each file becomes a blob with the filename as its label. Subdirectories and
+    non-regular files are skipped.
+
+    Examples:
+      ctx pad import notes.txt
+      grep pattern file | ctx pad import -
+      ctx pad import --blobs ./ideas/
   short: Bulk-import lines from a file into the scratchpad
 pad.merge:
-  long: "Merge entries from one or more scratchpad files into the current pad.\n\nEach input file is auto-detected as encrypted\
-    \ or plaintext: decryption is\nattempted first, and on failure the file is parsed as plain text. Entries\nare deduplicated\
-    \ by exact content — position does not matter.\n\nUse --key to provide a key file for encrypted pads from other projects.\n\
-    \nExamples:\n  ctx pad merge worktree/.context/scratchpad.enc\n  ctx pad merge notes.md backup.enc\n  ctx pad merge --key\
-    \ /other/.ctx.key foreign.enc\n  ctx pad merge --dry-run pad-a.enc pad-b.md"
+  long: |-
+    Merge entries from one or more scratchpad files into the current pad.
+
+    Each input file is auto-detected as encrypted or plaintext: decryption is
+    attempted first, and on failure the file is parsed as plaintext. Entries
+    are deduplicated by exact content — position does not matter.
+
+    Use --key to provide a key file for encrypted pads from other projects.
+
+    Examples:
+      ctx pad merge worktree/.context/scratchpad.enc
+      ctx pad merge notes.md backup.enc
+      ctx pad merge --key /other/.ctx.key foreign.enc
+      ctx pad merge --dry-run pad-a.enc pad-b.md
   short: Merge entries from scratchpad files into the current pad
 pad.mv:
   short: Move an entry from position N to position M
 pad.resolve:
-  long: 'Decrypt and display both sides of a merge conflict for the scratchpad.
-
+  long: |-
+    Decrypt and display both sides of a merge conflict for the scratchpad.
 
     Git stores conflict versions as .context/scratchpad.enc.ours and
-
     .context/scratchpad.enc.theirs during a merge conflict. This command
-
-    decrypts both and displays them for manual resolution.'
+    decrypts both and displays them for manual resolution.
   short: Show both sides of a merge conflict
 pad.rm:
   short: Remove an entry by number
 pad.show:
-  long: "Output the raw text of entry N with no numbering prefix.\n\nDesigned for unix pipe composability. The output contains\
-    \ just the entry\ntext followed by a single trailing newline.\n\nFor blob entries, the decoded file content is printed\
-    \ (or written to disk\nwith --out).\n\nExamples:\n  ctx pad show 3\n  ctx pad show 3 --out ./recovered.md\n  ctx pad edit\
-    \ 1 --append \"$(ctx pad show 3)\""
+  long: |-
+    Output the raw text of entry N with no numbering prefix.
+
+    Designed for Unix pipe composability. The output contains just the entry
+    text followed by a single trailing newline.
+
+    For blob entries, the decoded file content is printed (or written to disk
+    with --out).
+
+    Examples:
+      ctx pad show 3
+      ctx pad show 3 --out ./recovered.md
+      ctx pad edit 1 --append "$(ctx pad show 3)"
   short: Output raw text of an entry by number
 pause:
-  long: 'Pause all context nudge and reminder hooks for the current session.
-
+  long: |-
+    Pause all context nudge and reminder hooks for the current session.
+
     Security hooks (dangerous command blocking) and housekeeping hooks still fire.
 
-
     The session ID is read from stdin JSON (same as hooks) or --session-id flag.
-
-    Resume with: ctx resume'
+    Resume with: ctx resume
   short: Pause context hooks for this session
 permissions:
-  long: "Manage Claude Code permission snapshots.\n\nSave a curated settings.local.json as a golden image, then restore\n\
-    at session start to automatically drop session-accumulated permissions.\n\nSubcommands:\n  snapshot  Save settings.local.json\
-    \ as golden image\n  restore   Reset settings.local.json from golden image"
+  long: |-
+    Manage Claude Code permission snapshots.
+
+    Save a curated settings.local.json as a golden image, then restore
+    at session start to automatically drop session-accumulated permissions.
+
+    Subcommands:
+      snapshot  Save settings.local.json as golden image
+      restore   Reset settings.local.json from golden image
   short: Manage permission snapshots
 permissions.restore:
-  long: 'Replace .claude/settings.local.json with the golden image.
-
+  long: |-
+    Replace .claude/settings.local.json with the golden image.
 
     Prints a diff of dropped (session-accumulated) and restored permissions.
-
-    No-op if the files already match.'
+    No-op if the files already match.
   short: Reset settings.local.json from golden image
 permissions.snapshot:
-  long: 'Save .claude/settings.local.json as the golden image.
-
+  long: |-
+    Save .claude/settings.local.json as the golden image.
 
     The golden file (.claude/settings.golden.json) is a byte-for-byte copy
-
     of the current settings. It is meant to be committed to version control
-
     and shared with the team.
 
-
-    Overwrites any existing golden file.'
+    Overwrites any existing golden file.
   short: Save settings.local.json as golden image
 prompt:
-  long: "Manage prompt templates stored in .context/prompts/.\n\nPrompt templates are plain markdown files — no frontmatter,\
-    \ no build step.\nUse them as lightweight, reusable instructions for common tasks like\ncode reviews, refactoring, or\
-    \ explaining code.\n\nWhen invoked without a subcommand, lists all available prompts.\n\nSubcommands:\n  list     List\
-    \ available prompt templates\n  show     Print a prompt template to stdout\n  add      Create a new prompt from embedded\
-    \ template or stdin\n  rm       Remove a prompt template"
+  long: |-
+    Manage prompt templates stored in .context/prompts/.
+
+    Prompt templates are plain markdown files — no frontmatter, no build step.
+    Use them as lightweight, reusable instructions for common tasks like
+    code reviews, refactoring, or explaining code.
+
+    When invoked without a subcommand, lists all available prompts.
+
+    Subcommands:
+      list     List available prompt templates
+      show     Print a prompt template to stdout
+      add      Create a new prompt from embedded template or stdin
+      rm       Remove a prompt template
   short: Manage reusable prompt templates
 prompt.add:
-  long: "Create a new prompt template in .context/prompts/.\n\nBy default, creates from an embedded starter template if one\
-    \ exists\nwith the given name. Use --stdin to read content from standard input.\n\nExamples:\n  ctx prompt add code-review\n\
-    \  echo \"# My Prompt\" | ctx prompt add my-prompt --stdin"
+  long: |-
+    Create a new prompt template in .context/prompts/.
+
+    By default, creates from an embedded starter template if one exists
+    with the given name. Use --stdin to read content from standard input.
+
+    Examples:
+      ctx prompt add code-review
+      echo "# My Prompt" | ctx prompt add my-prompt --stdin
   short: Create a new prompt from embedded template or stdin
 prompt.list:
   short: List available prompt templates
@@ -336,75 +689,183 @@ prompt.rm:
 prompt.show:
   short: Print a prompt template to stdout
 recall:
-  long: "Browse and search AI session history from Claude Code and other tools.\n\nThe recall system parses JSONL session\
-    \ files and provides commands to\nlist sessions, view details, and search across your conversation history.\n\nSubcommands:\n\
-    \  list    List all parsed sessions\n  show    Show details of a specific session\n  export  Export sessions to editable\
-    \ journal files\n  lock    Protect journal entries from export regeneration\n  unlock  Remove lock protection from journal\
-    \ entries\n  sync    Sync lock state from journal frontmatter to state file\n\nExamples:\n  ctx recall list\n  ctx recall\
-    \ list --limit 5\n  ctx recall show abc123\n  ctx recall show --latest\n  ctx recall export --all\n  ctx recall lock 2026-01-21-session-abc12345.md\n\
-    \  ctx recall unlock --all\n  ctx recall sync"
+  long: |-
+    Browse and search AI session history from Claude Code and other tools.
+
+    The recall system parses JSONL session files and provides commands to
+    list sessions, view details, and search across your conversation history.
+
+    Subcommands:
+      list    List all parsed sessions
+      show    Show details of a specific session
+      export  Export sessions to editable journal files
+      lock    Protect journal entries from export regeneration
+      unlock  Remove lock protection from journal entries
+      sync    Sync lock state from journal frontmatter to state file
+
+    Examples:
+      ctx recall list
+      ctx recall list --limit 5
+      ctx recall show abc123
+      ctx recall show --latest
+      ctx recall export --all
+      ctx recall lock 2026-01-21-session-abc12345.md
+      ctx recall unlock --all
+      ctx recall sync
   short: Browse and search AI session history
 recall.export:
-  long: "Export AI sessions to .context/journal/ as editable Markdown files.\n\nExported files include session metadata, tool\
-    \ usage summary, and the full\nconversation. You can edit these files to add notes, highlight key moments,\nor clean up\
-    \ the transcript.\n\nBy default, only sessions from the current project are exported. Use\n--all-projects to include sessions\
-    \ from all projects.\n\nSafe by default: --all only exports new sessions. Existing files are\nskipped. Use --regenerate\
-    \ to re-export existing files (preserves YAML\nfrontmatter by default). Use --keep-frontmatter=false to discard\nenriched\
-    \ frontmatter during regeneration.\n\nLocked entries (via \"ctx recall lock\") are always skipped, regardless\nof flags.\n\
-    \nExamples:\n  ctx recall export abc123                              # Export one session\n  ctx recall export --all \
-    \                              # Export only new\n  ctx recall export --all --dry-run                     # Preview changes\n\
-    \  ctx recall export --all --regenerate                  # Re-export (prompts)\n  ctx recall export --all --regenerate\
-    \ -y               # Re-export, no prompt\n  ctx recall export --all --regenerate --keep-frontmatter=false -y  # Discard\
-    \ frontmatter"
+  long: |-
+    Export AI sessions to .context/journal/ as editable Markdown files.
+
+    Exported files include session metadata, tool usage summary, and the full
+    conversation. You can edit these files to add notes, highlight key moments,
+    or clean up the transcript.
+
+    By default, only sessions from the current project are exported. Use
+    --all-projects to include sessions from all projects.
+
+    Safe by default: --all only exports new sessions. Existing files are
+    skipped. Use --regenerate to re-export existing files (preserves YAML
+    frontmatter by default). Use --keep-frontmatter=false to discard
+    enriched frontmatter during regeneration.
+
+    Locked entries (via "ctx recall lock") are always skipped, regardless
+    of flags.
+
+    Examples:
+      ctx recall export abc123                              # Export one session
+      ctx recall export --all                               # Export only new
+      ctx recall export --all --dry-run                     # Preview changes
+      ctx recall export --all --regenerate                  # Re-export (prompts)
+      ctx recall export --all --regenerate -y               # Re-export, no prompt
+      ctx recall export --all --regenerate --keep-frontmatter=false -y  # Discard frontmatter
   short: Export sessions to editable journal files
 recall.list:
-  long: "List AI sessions from the current project.\n\nSessions are sorted by date (newest first) and display:\n  - Session\
-    \ slug (human-friendly name)\n  - Project name\n  - Start time and duration\n  - Turn count (user messages)\n  - Token\
-    \ usage\n\nBy default, only sessions from the current project are shown.\nUse --all-projects to see sessions from all\
-    \ projects.\n\nDate filtering: --since and --until accept YYYY-MM-DD format.\nBoth are inclusive.\n\nExamples:\n  ctx\
-    \ recall list\n  ctx recall list --limit 5\n  ctx recall list --all-projects\n  ctx recall list --project ctx\n  ctx recall\
-    \ list --tool claude-code\n  ctx recall list --since 2026-03-01\n  ctx recall list --since 2026-03-01 --until 2026-03-05"
+  long: |-
+    List AI sessions from the current project.
+
+    Sessions are sorted by date (newest first) and display:
+      - Session slug (human-friendly name)
+      - Project name
+      - Start time and duration
+      - Turn count (user messages)
+      - Token usage
+
+    By default, only sessions from the current project are shown.
+    Use --all-projects to see sessions from all projects.
+
+    Date filtering: --since and --until accept YYYY-MM-DD format.
+    Both are inclusive.
+
+    Examples:
+      ctx recall list
+      ctx recall list --limit 5
+      ctx recall list --all-projects
+      ctx recall list --project ctx
+      ctx recall list --tool claude-code
+      ctx recall list --since 2026-03-01
+      ctx recall list --since 2026-03-01 --until 2026-03-05
   short: List all parsed sessions
 recall.lock:
-  long: "Lock journal entries to prevent export --regenerate from overwriting them.\n\nLocked entries are skipped during export\
-    \ regardless of --regenerate or --force.\nUse \"ctx recall unlock\" to remove the protection.\n\nThe pattern matches against\
-    \ filenames by slug, date, or short ID (same\nmatching as export). Locking a multi-part entry locks all parts.\n\nThe\
-    \ lock is recorded in .context/journal/.state.json (source of truth) and\na \"locked: true\" line is added to the file's\
-    \ YAML frontmatter for visibility.\n\nExamples:\n  ctx recall lock 2026-01-21-session-abc12345.md\n  ctx recall lock abc12345\n\
-    \  ctx recall lock --all"
+  long: |-
+    Lock journal entries to prevent export --regenerate from overwriting them.
+
+    Locked entries are skipped during export regardless of --regenerate or --force.
+    Use "ctx recall unlock" to remove the protection.
+
+    The pattern matches against filenames by slug, date, or short ID (same
+    matching as export). Locking a multi-part entry locks all parts.
+
+    The lock is recorded in .context/journal/.state.json (source of truth) and
+    a "locked: true" line is added to the file's YAML frontmatter for visibility.
+
+    Examples:
+      ctx recall lock 2026-01-21-session-abc12345.md
+      ctx recall lock abc12345
+      ctx recall lock --all
   short: Protect journal entries from export regeneration
 recall.show:
-  long: "Show detailed information about a specific session.\n\nThe session ID can be:\n  - Full session UUID\n  - Partial\
-    \ match (first few characters)\n  - Session slug name\n\nUse --latest to show the most recent session.\nBy default, only\
-    \ searches sessions from the current project.\n\nExamples:\n  ctx recall show abc123\n  ctx recall show gleaming-wobbling-sutherland\n\
-    \  ctx recall show --latest\n  ctx recall show --latest --full\n  ctx recall show abc123 --all-projects"
+  long: |-
+    Show detailed information about a specific session.
+
+    The session ID can be:
+      - Full session UUID
+      - Partial match (first few characters)
+      - Session slug name
+
+    Use --latest to show the most recent session.
+    By default, only searches sessions from the current project.
+
+    Examples:
+      ctx recall show abc123
+      ctx recall show gleaming-wobbling-sutherland
+      ctx recall show --latest
+      ctx recall show --latest --full
+      ctx recall show abc123 --all-projects
   short: Show details of a specific session
 recall.sync:
-  long: "Scan journal markdowns and sync their lock state to .state.json.\n\nThis is the sister command to \"ctx recall lock\"\
-    . Instead of marking files\nlocked in state and updating frontmatter, it reads \"locked: true\" from\neach file's YAML\
-    \ frontmatter and updates .state.json to match.\n\nTypical workflow:\n  1. Enrich journal entries (add \"locked: true\"\
-    \ to frontmatter)\n  2. Run \"ctx recall sync\" to propagate lock state to .state.json\n\nFiles with \"locked: true\"\
-    \ in frontmatter will be marked locked in state.\nFiles without a \"locked:\" line (or with \"locked: false\") will have\
-    \ their\nlock cleared if one exists in state.\n\nExamples:\n  ctx recall sync"
+  long: |-
+    Scan journal markdowns and sync their lock state to .state.json.
+
+    This is the sister command to "ctx recall lock". Instead of marking files
+    locked in state and updating frontmatter, it reads "locked: true" from
+    each file's YAML frontmatter and updates .state.json to match.
+
+    Typical workflow:
+      1. Enrich journal entries (add "locked: true" to frontmatter)
+      2. Run "ctx recall sync" to propagate lock state to .state.json
+
+    Files with "locked: true" in frontmatter will be marked locked in state.
+    Files without a "locked:" line (or with "locked: false") will have their
+    lock cleared if one exists in state.
+
+    Examples:
+      ctx recall sync
   short: Sync lock state from journal frontmatter to state file
 recall.unlock:
-  long: "Unlock journal entries to allow export --regenerate to overwrite them.\n\nThe pattern matches against filenames by\
-    \ slug, date, or short ID (same\nmatching as export). Unlocking a multi-part entry unlocks all parts.\n\nExamples:\n \
-    \ ctx recall unlock 2026-01-21-session-abc12345.md\n  ctx recall unlock abc12345\n  ctx recall unlock --all"
+  long: |-
+    Unlock journal entries to allow export --regenerate to overwrite them.
+
+    The pattern matches against filenames by slug, date, or short ID (same
+    matching as export). Unlocking a multi-part entry unlocks all parts.
+
+    Examples:
+      ctx recall unlock 2026-01-21-session-abc12345.md
+      ctx recall unlock abc12345
+      ctx recall unlock --all
   short: Remove lock protection from journal entries
 reindex:
-  long: "Regenerate the quick-reference index at the top of both DECISIONS.md\nand LEARNINGS.md in a single invocation.\n\n\
-    This is a convenience wrapper around:\n  ctx decisions reindex\n  ctx learnings reindex\n\nThe index is a compact table\
-    \ showing date and title for each entry,\nallowing AI agents to quickly scan entries without reading the full file.\n\n\
-    Run this after manual edits to either file or when migrating existing\nfiles to use the index format.\n\nExamples:\n \
-    \ ctx reindex"
+  long: |-
+    Regenerate the quick-reference index at the top of both DECISIONS.md
+    and LEARNINGS.md in a single invocation.
+
+    This is a convenience wrapper around:
+      ctx decisions reindex
+      ctx learnings reindex
+
+    The index is a compact table showing date and title for each entry,
+    allowing AI agents to quickly scan entries without reading the full file.
+
+    Run this after manual edits to either file or when migrating existing
+    files to use the index format.
+
+    Examples:
+      ctx reindex
   short: Regenerate indices for DECISIONS.md and LEARNINGS.md
 remind:
-  long: "Manage session-scoped reminders stored in .context/reminders.json.\n\nReminders surface verbatim at session start\
-    \ and repeat every session until\ndismissed. Use --after to gate a reminder until a specific date.\n\nWhen invoked with\
-    \ a text argument, adds a reminder (equivalent to \"remind add\").\nWhen invoked with no arguments, lists all reminders.\n\
-    \nSubcommands:\n  add      Add a reminder (default action)\n  list     Show all pending reminders\n  dismiss  Dismiss\
-    \ one or all reminders"
+  long: |-
+    Manage session-scoped reminders stored in .context/reminders.json.
+
+    Reminders surface verbatim at session start and repeat every session until
+    dismissed. Use --after to gate a reminder until a specific date.
+
+    When invoked with a text argument, adds a reminder (equivalent to "remind add").
+    When invoked with no arguments, lists all reminders.
+
+    Subcommands:
+      add      Add a reminder (default action)
+      list     Show all pending reminders
+      dismiss  Dismiss one or all reminders
   short: Session-scoped reminders
 remind.add:
   short: Add a reminder
@@ -413,601 +874,513 @@ remind.dismiss:
 remind.list:
   short: Show all pending reminders
 resume:
-  long: 'Resume context hooks after a pause. Silent no-op if not paused.
-
+  long: |-
+    Resume context hooks after a pause. Silent no-op if not paused.
 
-    The session ID is read from stdin JSON (same as hooks) or --session-id flag.'
+    The session ID is read from stdin JSON (same as hooks) or --session-id flag.
   short: Resume context hooks for this session
 serve:
-  long: "Serve a static site using zensical.\n\nIf no directory is specified, serves the journal site (.context/journal-site).\n\
-    \nRequires zensical to be installed:\n  pipx install zensical\n\nExamples:\n  ctx serve                           # Serve\
-    \ journal site\n  ctx serve .context/journal-site     # Serve specific directory\n  ctx serve ./docs                 \
-    \   # Serve docs folder"
+  long: |-
+    Serve a static site using zensical.
+
+    If no directory is specified, serves the journal site (.context/journal-site).
+
+    Requires zensical to be installed:
+      pipx install zensical
+
+    Examples:
+      ctx serve                           # Serve journal site
+      ctx serve .context/journal-site     # Serve specific directory
+      ctx serve ./docs                    # Serve docs folder
   short: Serve a static site locally via zensical
 site:
-  long: "Manage the ctx.ist static site.\n\nSubcommands:\n  feed    Generate an Atom 1.0 feed from blog posts\n\nExamples:\n\
-    \  ctx site feed                              # Generate site/feed.xml\n  ctx site feed --out /tmp/feed.xml          #\
-    \ Custom output path\n  ctx site feed --base-url https://example.com"
+  long: |-
+    Manage the ctx.ist static site.
+
+    Subcommands:
+      feed    Generate an Atom 1.0 feed from blog posts
+
+    Examples:
+      ctx site feed                              # Generate site/feed.xml
+      ctx site feed --out /tmp/feed.xml          # Custom output path
+      ctx site feed --base-url https://example.com
   short: Site management commands
 site.feed:
-  long: "Generate an Atom 1.0 feed from finalized blog posts in docs/blog/.\n\nParses YAML frontmatter for title, date, author,\
-    \ and topics. Extracts\na summary from the first paragraph after the heading. Only posts with\nreviewed_and_finalized:\
-    \ true are included.\n\nExamples:\n  ctx site feed\n  ctx site feed --out /tmp/feed.xml\n  ctx site feed --base-url https://example.com"
+  long: |-
+    Generate an Atom 1.0 feed from finalized blog posts in docs/blog/.
+
+    Parses YAML frontmatter for title, date, author, and topics. Extracts
+    a summary from the first paragraph after the heading. Only posts with
+    reviewed_and_finalized: true are included.
+
+    Examples:
+      ctx site feed
+      ctx site feed --out /tmp/feed.xml
+      ctx site feed --base-url https://example.com
   short: Generate an Atom 1.0 feed from blog posts
 status:
-  long: "Display a summary of the current .context/ directory including:\n  - Number of context files\n  - Estimated token\
-    \ count\n  - Status of each file\n  - Recent activity\n\nUse --verbose to include content previews for each file."
+  long: |-
+    Display a summary of the current .context/ directory including:
+      - Number of context files
+      - Estimated token count
+      - Status of each file
+      - Recent activity
+
+    Use --verbose to include content previews for each file.
   short: Show context summary with token estimate
 sync:
-  long: "Scan the codebase and reconcile context files with current state.\n\nActions performed:\n  - Scan for new directories\
-    \ that should be in ARCHITECTURE.md\n  - Check for package.json/go.mod changes\n  - Identify stale references\n  - Suggest\
-    \ updates to context files\n\nUse --dry-run to see what would change without modifying files."
+  long: |-
+    Scan the codebase and reconcile context files with current state.
+
+    Actions performed:
+      - Scan for new directories that should be in ARCHITECTURE.md
+      - Check for package.json/go.mod changes
+      - Identify stale references
+      - Suggest updates to context files
+
+    Use --dry-run to see what would change without modifying files.
   short: Reconcile context with codebase
 system:
+  long: |-
+    System diagnostics and hook commands.
+
+    Subcommands:
+      backup               Backup context and Claude data
+      resources            Show system resource usage (memory, swap, disk, load)
+      bootstrap            Print context location for AI agents
+      message              Manage hook message templates (list/show/edit/reset)
+      stats                Show session token usage stats
+
+    Plumbing subcommands (used by skills and automation):
+      mark-journal         Update journal processing state
+      mark-wrapped-up      Suppress checkpoint nudges after wrap-up
+      pause                Pause context hooks for this session
+      resume               Resume context hooks for this session
+      prune                Clean stale per-session state files
+      events               Query the local hook event log
+
+    Hook subcommands (Claude Code plugin — safe to run manually):
+      context-load-gate           Context file read directive (PreToolUse)
+      check-context-size          Context size checkpoint
+      check-ceremonies            Session ceremony adoption nudge
+      check-persistence           Context persistence nudge
+      check-journal               Journal maintenance reminder
+      check-resources             Resource pressure warning (DANGER only)
+      check-knowledge             Knowledge file growth nudge
+      check-reminders             Pending reminders relay
+      check-version               Version update nudge
+      check-map-staleness         Architecture map staleness nudge
+      block-non-path-ctx          Block non-PATH ctx invocations
+      block-dangerous-commands    Block dangerous command patterns (project-local)
+      check-backup-age            Backup staleness check (project-local)
+      check-task-completion       Task completion nudge after edits
+      post-commit                 Post-commit context capture nudge
+      qa-reminder                 QA reminder before completion
+      specs-nudge                 Plan-to-specs directory nudge (PreToolUse)
+      check-memory-drift          Memory drift nudge (MEMORY.md changed)
+      heartbeat                   Session heartbeat webhook (no stdout)
   short: System diagnostics and hook commands
 system.backup:
-  long: "Create timestamped tar.gz archives of project context and/or global\nClaude Code data. Optionally copies archives\
-    \ to an SMB share.\n\nScopes:\n  project  .context/, .claude/, ideas/, ~/.bashrc\n  global   ~/.claude/ (excludes todos/)\n\
-    \  all      Both project and global (default)\n\nEnvironment:\n  CTX_BACKUP_SMB_URL    - SMB share URL (e.g. smb://host/share)\n\
-    \  CTX_BACKUP_SMB_SUBDIR - Subdirectory on share (default: ctx-sessions)"
+  long: |-
+    Create timestamped tar.gz archives of project context and/or global
+    Claude Code data. Optionally copies archives to an SMB share.
+
+    Scopes:
+      project  .context/, .claude/, ideas/, ~/.bashrc
+      global   ~/.claude/ (excludes todos/)
+      all      Both project and global (default)
+
+    Environment:
+      CTX_BACKUP_SMB_URL    - SMB share URL (e.g. smb://host/share)
+      CTX_BACKUP_SMB_SUBDIR - Subdirectory on share (default: ctx-sessions)
   short: Backup context and Claude data
 system.blockdangerouscommands:
-  long: 'Regex safety net for commands that the deny-list cannot express.
-
+  long: |-
+    Regex safety net for commands that the deny-list cannot express.
     Catches mid-command sudo, mid-command git push, and binary installs
-
     to bin directories.
 
-
     Hook event: PreToolUse (Bash)
-
     Output: {"decision":"block","reason":"..."} or silent
-
-    Silent when: command doesn''t match any dangerous pattern'
+    Silent when: command doesn't match any dangerous pattern
   short: Block dangerous command patterns (regex safety net)
 system.blocknonpathctx:
-  long: 'Blocks ./ctx, go run ./cmd/ctx, and absolute-path ctx invocations.
-
+  long: |-
+    Blocks ./ctx, go run ./cmd/ctx, and absolute-path ctx invocations.
     Enforces the CONSTITUTION.md rule: always use ctx from PATH.
-
     Outputs a JSON block decision that prevents the tool call.
 
-
     Hook event: PreToolUse (Bash)
-
     Output: {"decision":"block","reason":"..."} or silent
-
-    Silent when: command doesn''t invoke ctx via a non-PATH route'
+    Silent when: command doesn't invoke ctx via a non-PATH route
   short: Block non-PATH ctx invocations
 system.bootstrap:
   short: Print context location for AI agents
 system.checkbackupage:
-  long: "Checks if the .context backup is stale (>2 days old) or the SMB share\nis unmounted. Outputs a VERBATIM relay warning\
-    \ when issues are found.\nThrottled to once per day.\n\nEnvironment:\n  CTX_BACKUP_SMB_URL - SMB share URL (e.g. smb://myhost/myshare).\n\
-    \                       If unset, the SMB mount check is skipped.\n\nHook event: UserPromptSubmit\nOutput: VERBATIM relay\
-    \ with warning box, silent otherwise\nSilent when: backup is fresh, or already checked today"
+  long: |-
+    Checks if the .context backup is stale (>2 days old) or the SMB share
+    is unmounted. Outputs a VERBATIM relay warning when issues are found.
+    Throttled to once per day.
+
+    Environment:
+      CTX_BACKUP_SMB_URL - SMB share URL (e.g. smb://myhost/myshare).
+                           If unset, the SMB mount check is skipped.
+
+    Hook event: UserPromptSubmit
+    Output: VERBATIM relay with warning box, silent otherwise
+    Silent when: backup is fresh, or already checked today
   short: Backup staleness check hook
 system.checkceremonies:
-  long: 'Scans the last 3 journal entries for /ctx-remember and /ctx-wrap-up
-
+  long: |-
+    Scans the last 3 journal entries for /ctx-remember and /ctx-wrap-up
     usage. If either is missing, emits a VERBATIM relay nudge encouraging
-
     adoption. Throttled to once per day.
 
-
     Hook event: UserPromptSubmit
-
     Output: VERBATIM relay (when ceremonies missing), silent otherwise
-
-    Silent when: both ceremonies found in recent sessions'
+    Silent when: both ceremonies found in recent sessions
   short: Session ceremony nudge hook
 system.checkcontextsize:
-  long: "Counts prompts per session and emits VERBATIM relay reminders at\nadaptive intervals, prompting the user to consider\
-    \ wrapping up.\n\n  Prompts  1-15: silent\n  Prompts 16-30: every 5th prompt\n  Prompts   30+: every 3rd prompt\n\nAlso\
-    \ monitors actual context window token usage from session JSONL data.\nFires an independent warning when context window\
-    \ exceeds 80%, regardless\nof prompt count.\n\nHook event: UserPromptSubmit\nOutput: VERBATIM relay (when triggered),\
-    \ silent otherwise\nSilent when: early in session or between checkpoints"
-  short: Context size checkpoint hook
-system.checkjournal:
-  long: 'Detects unexported Claude Code sessions and unenriched journal entries,
+  long: |-
+    Counts prompts per session and emits VERBATIM relay reminders at
+    adaptive intervals, prompting the user to consider wrapping up.
 
-    then prints actionable commands. Throttled to once per day.
+      Prompts  1-15: silent
+      Prompts 16-30: every 5th prompt
+      Prompts   30+: every 3rd prompt
 
+    Also monitors actual context window token usage from session JSONL data.
+    Fires an independent warning when context window exceeds 80%, regardless
+    of prompt count.
 
     Hook event: UserPromptSubmit
+    Output: VERBATIM relay (when triggered), silent otherwise
+    Silent when: early in session or between checkpoints
+  short: Context size checkpoint hook
+system.checkjournal:
+  long: |-
+    Detects unexported Claude Code sessions and unenriched journal entries,
+    then prints actionable commands. Throttled to once per day.
 
+    Hook event: UserPromptSubmit
     Output: VERBATIM relay with export/enrich commands, silent otherwise
-
-    Silent when: no unexported sessions and no unenriched entries'
+    Silent when: no unexported sessions and no unenriched entries
   short: Journal export/enrich reminder hook
 system.checkknowledge:
-  long: "Counts entries in DECISIONS.md and LEARNINGS.md and lines in\nCONVENTIONS.md, and outputs a VERBATIM relay nudge\
-    \ when any file exceeds\nthe configured threshold. Throttled to once per day.\n\n  Learnings threshold:   entry_count_learnings\
-    \   (default 30)\n  Decisions threshold:   entry_count_decisions    (default 20)\n  Conventions threshold: convention_line_count\
-    \    (default 200)\n\nHook event: UserPromptSubmit\nOutput: VERBATIM relay (when thresholds exceeded), silent otherwise\n\
-    Silent when: below thresholds, already nudged today, or uninitialized"
+  long: |-
+    Counts entries in DECISIONS.md and LEARNINGS.md and lines in
+    CONVENTIONS.md, and outputs a VERBATIM relay nudge when any file exceeds
+    the configured threshold. Throttled to once per day.
+
+      Learnings threshold:   entry_count_learnings   (default 30)
+      Decisions threshold:   entry_count_decisions   (default 20)
+      Conventions threshold: convention_line_count   (default 200)
+
+    Hook event: UserPromptSubmit
+    Output: VERBATIM relay (when thresholds exceeded), silent otherwise
+    Silent when: below thresholds, already nudged today, or uninitialized
   short: Knowledge file growth nudge
 system.checkmapstaleness:
-  long: 'Checks whether map-tracking.json is stale (>30 days) and there are
-
+  long: |-
+    Checks whether map-tracking.json is stale (>30 days) and whether there are
     commits touching internal/ since the last map refresh. Outputs a VERBATIM
-
-    relay nudge suggesting /ctx-map when both conditions are met.
-
+    relay nudge suggesting /ctx-architecture when both conditions are met.
 
     Hook event: UserPromptSubmit
-
     Output: VERBATIM relay (when stale and modules changed), silent otherwise
-
     Silent when: map-tracking.json missing or fresh, opted out, no module
-
-    commits, already nudged today, or uninitialized'
+    commits, already nudged today, or uninitialized
   short: Architecture map staleness nudge
 system.checkmemorydrift:
   short: Memory drift nudge
 system.checkpersistence:
-  long: "Tracks prompts since the last .context/ file modification and nudges\nthe agent to persist learnings, decisions,\
-    \ or task updates.\n\n  Prompts  1-10: silent (too early)\n  Prompts 11-25: nudge once at prompt 20 since last modification\n\
-    \  Prompts   25+: every 15th prompt since last modification\n\nHook event: UserPromptSubmit\nOutput: agent directive (when\
-    \ triggered), silent otherwise\nSilent when: context files were recently modified"
+  long: |-
+    Tracks prompts since the last .context/ file modification and nudges
+    the agent to persist learnings, decisions, or task updates.
+
+      Prompts  1-10: silent (too early)
+      Prompts 11-25: nudge once at prompt 20 since last modification
+      Prompts   25+: every 15th prompt since last modification
+
+    Hook event: UserPromptSubmit
+    Output: agent directive (when triggered), silent otherwise
+    Silent when: context files were recently modified
   short: Persistence nudge hook
 system.checkreminders:
   short: Surface pending reminders at session start
 system.checkresources:
-  long: "Collects system resource metrics (memory, swap, disk, load) and outputs\na VERBATIM relay warning when any resource\
-    \ hits DANGER severity.\nSilent at WARNING level and below.\n\n  Memory DANGER: >= 90% used    Swap DANGER: >= 75% used\n\
-    \  Disk DANGER:   >= 95% full    Load DANGER: >= 1.5x CPUs\n\nFor full resource stats at any severity, use: ctx system\
-    \ resources\n\nHook event: UserPromptSubmit\nOutput: VERBATIM relay (DANGER only), silent otherwise\nSilent when: all\
-    \ resources below DANGER thresholds"
+  long: |-
+    Collects system resource metrics (memory, swap, disk, load) and outputs
+    a VERBATIM relay warning when any resource hits DANGER severity.
+    Silent at WARNING level and below.
+
+      Memory DANGER: >= 90% used    Swap DANGER: >= 75% used
+      Disk DANGER:   >= 95% full    Load DANGER: >= 1.5x CPUs
+
+    For full resource stats at any severity, use: ctx system resources
+
+    Hook event: UserPromptSubmit
+    Output: VERBATIM relay (DANGER only), silent otherwise
+    Silent when: all resources below DANGER thresholds
   short: Resource pressure hook
 system.checktaskcompletion:
-  long: 'Counts Edit/Write tool calls and periodically nudges the agent
-
+  long: |-
+    Counts Edit/Write tool calls and periodically nudges the agent
     to check whether any tasks should be marked done in TASKS.md.
 
-
     Hook event: PostToolUse (Edit, Write)
-
     Output: agent directive every N edits, silent otherwise
-
-    Silent when: counter below threshold, interval is 0, or session is paused'
+    Silent when: counter below threshold, interval is 0, or session is paused
   short: Task completion nudge after edits
 system.checkversion:
-  long: 'Compares the ctx binary version against the embedded plugin version.
-
+  long: |-
+    Compares the ctx binary version against the embedded plugin version.
     Warns when the binary is older than the plugin expects, which happens
-
-    when the marketplace plugin updates but the binary hasn''t been
-
+    when the marketplace plugin updates but the binary hasn't been
     reinstalled. Throttled to once per day. Skipped for dev builds.
 
-
     Hook event: UserPromptSubmit
-
     Output: VERBATIM relay with reinstall command, silent otherwise
-
-    Silent when: versions match, dev build, or already checked today'
+    Silent when: versions match, dev build, or already checked today
   short: Binary/plugin version drift detection hook
 system.contextloadgate:
-  long: 'Auto-injects project context into the agent''s context window.
-
+  long: |-
+    Auto-injects project context into the agent's context window.
     Fires on the first tool use per session via PreToolUse hook. Subsequent
-
     tool calls in the same session are silent (tracked by session marker file).
 
-
     Reads context files directly and injects content — no delegation to
-
     bootstrap command, no agent compliance required.
-
     See specs/context-load-gate-v2.md for design rationale.
 
-
     Hook event: PreToolUse (.*)
-
     Output: JSON HookResponse (additionalContext) on first tool use, silent otherwise
-
-    Silent when: marker exists for session_id, or context not initialized'
+    Silent when: marker exists for session_id, or context not initialized
   short: Auto-inject project context on first tool use
 system.events:
-  long: "Query the local event log (requires event_log: true in .ctxrc).\n\nReads events from .context/state/events.jsonl\
-    \ and outputs them in\nhuman-readable or raw JSONL format. All filter flags use intersection\n(AND) logic.\n\nFlags:\n\
-    \  --hook       Filter by hook name\n  --session    Filter by session ID\n  --event      Filter by event type (relay,\
-    \ nudge)\n  --last       Show last N events (default 50)\n  --json       Output raw JSONL (for piping to jq)\n  --all\
-    \        Include rotated log file"
+  long: |-
+    Query the local event log (requires event_log: true in .ctxrc).
+
+    Reads events from .context/state/events.jsonl and outputs them in
+    human-readable or raw JSONL format. All filter flags use intersection
+    (AND) logic.
+
+    Flags:
+      --hook       Filter by hook name
+      --session    Filter by session ID
+      --event      Filter by event type (relay, nudge)
+      --last       Show last N events (default 50)
+      --json       Output raw JSONL (for piping to jq)
+      --all        Include rotated log file
   short: Query the local hook event log
 system.heartbeat:
-  long: 'Sends a heartbeat webhook notification on every prompt, providing
-
+  long: |-
+    Sends a heartbeat webhook notification on every prompt, providing
     continuous session-alive visibility with metadata (prompt count, session ID,
-
     context modification status).
 
-
     Unlike other hooks, the heartbeat never produces stdout — the agent never
-
     sees it. It only fires a webhook and writes to the event log.
 
-
     Hook event: UserPromptSubmit
-
     Output: none (webhook + event log only)
-
-    Silent when: not initialized, paused, or no webhook configured'
+    Silent when: not initialized, paused, or no webhook configured
   short: Session heartbeat webhook
 system.markjournal:
+  long: |-
+    Mark a journal entry as having completed a processing stage.
+
+    Valid stages: %s
+
+    The state is recorded in .context/journal/.state.json with today's date.
+
+    Examples:
+      ctx system mark-journal 2026-01-21-session-abc12345.md exported
+      ctx system mark-journal 2026-01-21-session-abc12345.md enriched
+      ctx system mark-journal 2026-01-21-session-abc12345.md normalized
+      ctx system mark-journal 2026-01-21-session-abc12345.md fences_verified
   short: Update journal processing state
 system.markwrappedup:
-  long: 'Write a marker file that suppresses context checkpoint nudges
-
+  long: |-
+    Write a marker file that suppresses context checkpoint nudges
     for 2 hours. Called by /ctx-wrap-up after persisting context.
 
-
     The check-context-size hook checks this marker before emitting
-
     a checkpoint. If the marker exists and is less than 2 hours old,
-
     the nudge is suppressed.
 
-
-    This is a plumbing command — use /ctx-wrap-up instead.'
+    This is a plumbing command — use /ctx-wrap-up instead.
   short: Suppress checkpoint nudges after wrap-up
 system.message:
-  long: "Manage hook message templates.\n\nHook messages control what text hooks emit. The hook logic (when to\nfire, counting,\
-    \ state tracking) is universal. The messages are opinions\nthat can be customized per-project.\n\nSubcommands:\n  list\
-    \     Show all hook messages with category and override status\n  show     Print the effective message template for a\
-    \ hook/variant\n  edit     Copy the embedded default to .context/ for editing\n  reset    Delete a user override and revert\
-    \ to embedded default"
+  long: |-
+    Manage hook message templates.
+
+    Hook messages control what text hooks emit. The hook logic (when to
+    fire, counting, state tracking) is universal. The messages are opinions
+    that can be customized per-project.
+
+    Subcommands:
+      list     Show all hook messages with category and override status
+      show     Print the effective message template for a hook/variant
+      edit     Copy the embedded default to .context/ for editing
+      reset    Delete a user override and revert to embedded default
   short: Manage hook message templates
+system.message.edit:
+  short: Copy the embedded default to .context/ for editing
+system.message.list:
+  short: Show all hook messages with category and override status
+system.message.reset:
+  short: Delete a user override and revert to embedded default
+system.message.show:
+  short: Print the effective message template for a hook/variant
 system.pause:
-  long: 'Creates a session-scoped pause marker. While paused, all nudge
-
+  long: |-
+    Creates a session-scoped pause marker. While paused, all nudge
     and reminder hooks no-op. Security and housekeeping hooks still fire.
 
-
-    The session ID is read from stdin JSON (same as hooks) or --session-id flag.'
+    The session ID is read from stdin JSON (same as hooks) or --session-id flag.
   short: Pause context hooks for this session
 system.postcommit:
-  long: 'Detects git commit commands and nudges the agent to offer context
-
+  long: |-
+    Detects git commit commands and nudges the agent to offer context
     capture (decision or learning) and suggest running lints/tests.
-
     Skips amend commits.
 
-
     Hook event: PostToolUse (Bash)
-
     Output: agent directive after git commits, silent otherwise
-
-    Silent when: command is not a git commit, or is an amend'
+    Silent when: command is not a git commit, or is an amend
   short: Post-commit context capture nudge
 system.prune:
-  long: "Remove per-session state files from .context/state/ that are\nolder than the specified age. Session state files are\
-    \ identified by\nUUID suffixes (e.g. context-check-, heartbeat-).\n\nGlobal files without session\
-    \ IDs (events.jsonl, memory-import.json, etc.)\nare always preserved.\n\nExamples:\n  ctx system prune              #\
-    \ Prune files older than 7 days\n  ctx system prune --days 3     # Prune files older than 3 days\n  ctx system prune --dry-run\
-    \    # Show what would be pruned"
+  long: |-
+    Remove per-session state files from .context/state/ that are
+    older than the specified age. Session state files are identified by
+    UUID suffixes (e.g. context-check-, heartbeat-).
+
+    Global files without session IDs (events.jsonl, memory-import.json, etc.)
+    are always preserved.
+
+    Examples:
+      ctx system prune              # Prune files older than 7 days
+      ctx system prune --days 3     # Prune files older than 3 days
+      ctx system prune --dry-run    # Show what would be pruned
   short: Clean stale per-session state files
 system.qareminder:
-  long: 'Emits a hard reminder to lint and test the entire project before
-
+  long: |-
+    Emits a hard reminder to lint and test the entire project before
     committing. Fires on Bash tool use when the command contains "git",
-
     placing reinforcement at the commit sequence rather than during edits.
 
-
     Hook event: PreToolUse (Bash)
-
     Output: agent directive (when command contains "git" and .context/ is initialized)
-
-    Silent when: .context/ not initialized or command does not contain "git"'
+    Silent when: .context/ not initialized or command does not contain "git"
   short: QA reminder hook
 system.resources:
   short: Show system resource usage (memory, swap, disk, load)
 system.resume:
-  long: 'Removes the session-scoped pause marker. Hooks resume normal
-
+  long: |-
+    Removes the session-scoped pause marker. Hooks resume normal
     behavior. Silent no-op if not paused.
 
-
-    The session ID is read from stdin JSON (same as hooks) or --session-id flag.'
+    The session ID is read from stdin JSON (same as hooks) or --session-id flag.
   short: Resume context hooks for this session
 system.specsnudge:
-  long: 'Emits a directive reminding the agent to save plans to specs/
-
+  long: |-
+    Emits a directive reminding the agent to save plans to specs/
     for release tracking. Fires on EnterPlanMode tool use.
 
-
     Hook event: PreToolUse (EnterPlanMode)
-
     Output: agent directive (always, when .context/ is initialized)
-
-    Silent when: .context/ not initialized'
+    Silent when: .context/ not initialized
   short: Plan-to-specs directory nudge
 system.stats:
-  long: "Display per-session token usage statistics from stats JSONL files.\n\nBy default, shows the last 20 entries across\
-    \ all sessions. Use --follow\nto stream new entries as they arrive (like tail -f).\n\nFlags:\n  --follow, -f   Stream\
-    \ new entries as they arrive\n  --session, -s  Filter by session ID (prefix match)\n  --last, -n     Show last N entries\
-    \ (default 20)\n  --json, -j     Output raw JSONL"
+  long: |-
+    Display per-session token usage statistics from stats JSONL files.
+
+    By default, shows the last 20 entries across all sessions. Use --follow
+    to stream new entries as they arrive (like tail -f).
+
+    Flags:
+      --follow, -f   Stream new entries as they arrive
+      --session, -s  Filter by session ID (prefix match)
+      --last, -n     Show last N entries (default 20)
+      --json, -j     Output raw JSONL
   short: Show session token usage stats
 task:
-  long: "Manage task archival and snapshots.\n\nTasks can be archived to move completed items out of TASKS.md while\npreserving\
-    \ them for historical reference. Snapshots create point-in-time\ncopies without modifying the original.\n\nSubcommands:\n\
-    \  archive   Move completed tasks to timestamped archive file\n  snapshot  Create point-in-time snapshot of TASKS.md"
+  long: |-
+    Manage task archival and snapshots.
+
+    Tasks can be archived to move completed items out of TASKS.md while
+    preserving them for historical reference. Snapshots create point-in-time
+    copies without modifying the original.
+
+    Subcommands:
+      archive   Move completed tasks to timestamped archive file
+      snapshot  Create point-in-time snapshot of TASKS.md
   short: Manage task archival and snapshots
 task.archive:
-  long: "Move completed tasks from TASKS.md to an archive file.\n\nArchive files are stored in .context/archive/ with timestamped\
-    \ names:\n  .context/archive/tasks-YYYY-MM-DD.md\n\nThe archive preserves Phase structure for traceability. Completed\
-    \ tasks\n(marked with [x]) are moved; pending tasks ([ ]) remain in TASKS.md.\n\nUse --dry-run to preview changes without\
-    \ modifying files."
+  long: |-
+    Move completed tasks from TASKS.md to an archive file.
+
+    Archive files are stored in .context/archive/ with timestamped names:
+      .context/archive/tasks-YYYY-MM-DD.md
+
+    The archive preserves Phase structure for traceability. Completed tasks
+    (marked with [x]) are moved; pending tasks ([ ]) remain in TASKS.md.
+
+    Use --dry-run to preview changes without modifying files.
   short: Move completed tasks to timestamped archive file
 task.snapshot:
-  long: "Create a point-in-time snapshot of TASKS.md without modifying the original.\n\nSnapshots are stored in .context/archive/\
-    \ with timestamped names:\n  .context/archive/tasks-snapshot-YYYY-MM-DD-HHMM.md\n\nUnlike archive, snapshot copies the\
-    \ entire file as-is."
+  long: |-
+    Create a point-in-time snapshot of TASKS.md without modifying the original.
+
+    Snapshots are stored in .context/archive/ with timestamped names:
+      .context/archive/tasks-snapshot-YYYY-MM-DD-HHMM.md
+
+    Unlike archive, snapshot copies the entire file as-is.
   short: Create point-in-time snapshot of TASKS.md
 watch:
-  long: "Watch stdin or a log file for \ncommands and apply them.\n\nThis command parses AI output looking\
-    \ for structured update commands:\n\n  Simple formats (tasks, conventions, complete):\n    Implement user auth\n    Use kebab-case for files\n\
-    \    user auth\n\n  Structured formats (learnings, decisions) - all\
-    \ attributes required:\n    Title here\n\n    Use Redis\n\nLearnings require: context, lesson, application attributes.\nDecisions require:\
-    \ context, rationale, consequences attributes.\nUpdates missing required attributes will be rejected with an error.\n\n\
-    Use --log to watch a specific file instead of stdin.\nUse --dry-run to see what would be updated without making changes.\n\
-    \nPress Ctrl+C to stop watching."
+  long: |-
+    Watch stdin or a log file for
+    commands and apply them.
+
+    This command parses AI output looking for structured update commands:
+
+      Simple formats (tasks, conventions, complete):
+        Implement user auth
+        Use kebab-case for files
+        user auth
+
+      Structured formats (learnings, decisions) - all attributes required:
+        Title here
+
+        Use Redis
+
+    Learnings require: context, lesson, application attributes.
+    Decisions require: context, rationale, consequences attributes.
+    Updates missing required attributes will be rejected with an error.
+
+    Use --log to watch a specific file instead of stdin.
+    Use --dry-run to see what would be updated without making changes.
+
+    Press Ctrl+C to stop watching.
   short: Watch for context-update commands in AI output
 why:
-  long: "Surface ctx's philosophy documents in the terminal.\n\nDocuments:\n  manifesto    The ctx Manifesto — creation, not\
-    \ code\n  about        About ctx — what it is and why it exists\n  invariants   Design invariants — properties that must\
-    \ hold\n\nUsage:\n  ctx why              Interactive numbered menu\n  ctx why manifesto    Show the manifesto directly\n\
-    \  ctx why about        Show the about page\n  ctx why invariants   Show the design invariants"
+  long: |-
+    Surface ctx's philosophy documents in the terminal.
+
+    Documents:
+      manifesto    The ctx Manifesto — creation, not code
+      about        About ctx — what it is and why it exists
+      invariants   Design invariants — properties that must hold
+
+    Usage:
+      ctx why              Interactive numbered menu
+      ctx why manifesto    Show the manifesto directly
+      ctx why about        Show the about page
+      ctx why invariants   Show the design invariants
   short: Read the philosophy behind ctx
-
-# Global flag descriptions (prefix with _flags.)
-"_flags.context-dir":
-  short: "Override context directory path (default: .context)"
-"_flags.no-color":
-  short: "Disable colored output"
-"_flags.allow-outside-cwd":
-  short: "Allow context directory outside current working directory"
-"_flags.add.priority":
-  short: "Priority level for tasks (high, medium, low)"
-"_flags.add.section":
-  short: "Target section within file"
-"_flags.add.file":
-  short: "Read content from file instead of argument"
-"_flags.add.context":
-  short: "Context for decisions: what prompted this decision (required for decisions)"
-"_flags.add.rationale":
-  short: "Rationale for decisions: why this choice over alternatives (required for decisions)"
-"_flags.add.consequences":
-  short: "Consequences for decisions: what changes as a result (required for decisions)"
-"_flags.add.lesson":
-  short: "Lesson for learnings: the key insight (required for learnings)"
-"_flags.add.application":
-  short: "Application for learnings: how to apply this going forward (required for learnings)"
-"_flags.agent.budget":
-  short: "Token budget for context packet"
-"_flags.agent.cooldown":
-  short: "Suppress repeated output within this duration (0 to disable)"
-"_flags.agent.format":
-  short: "Output format: md or json"
-"_flags.agent.session":
-  short: "Session identifier for cooldown isolation (e.g., $PPID)"
-"_flags.changes.since":
-  short: "Time reference: duration (24h) or date (2026-03-01)"
-"_flags.compact.archive":
-  short: "Create .context/archive/ for old content"
-"_flags.deps.external":
-  short: "Include external module dependencies"
-"_flags.deps.format":
-  short: "Output format: mermaid, table, json"
-"_flags.deps.type":
-  short: "Force project type: go, node, python, rust"
-"_flags.doctor.json":
-  short: "Machine-readable JSON output"
-"_flags.drift.fix":
-  short: "Auto-fix supported issues (staleness, missing files)"
-"_flags.drift.json":
-  short: "Output as JSON"
-"_flags.guide.commands":
-  short: "List all CLI commands"
-"_flags.guide.skills":
-  short: "List all available skills"
-"_flags.hook.write":
-  short: "Write the configuration file instead of printing"
-"_flags.initialize.force":
-  short: "Overwrite existing context files"
-"_flags.initialize.merge":
-  short: "Auto-merge ctx content into existing CLAUDE.md and PROMPT.md"
-"_flags.initialize.minimal":
-  short: "Only create essential files (TASKS.md, DECISIONS.md, CONSTITUTION.md)"
-"_flags.initialize.no-plugin-enable":
-  short: "Skip auto-enabling the ctx plugin in global Claude Code settings"
-"_flags.initialize.ralph":
-  short: "Agent works autonomously without asking questions"
-"_flags.journal.obsidian.output":
-  short: "Output directory for vault"
-"_flags.journal.site.build":
-  short: "Run zensical build after generating"
-"_flags.journal.site.output":
-  short: "Output directory for site"
-"_flags.journal.site.serve":
-  short: "Run zensical serve after generating"
-"_flags.load.budget":
-  short: "Token budget for assembly"
-"_flags.load.raw":
-  short: "Output raw file contents without assembly"
-"_flags.loop.completion":
-  short: "Completion signal to detect"
-"_flags.loop.max-iterations":
-  short: "Maximum iterations (0 = unlimited)"
-"_flags.loop.output":
-  short: "Output script filename"
-"_flags.loop.prompt":
-  short: "Prompt file to use"
-"_flags.loop.tool":
-  short: "AI tool: claude, aider, or generic"
-"_flags.memory.budget":
-  short: "Line budget for published content"
-"_flags.memory.dry-run":
-  short: "Show classification plan without writing"
-"_flags.memory.sync.dry-run":
-  short: "Show what would happen without writing"
-"_flags.notify.event":
-  short: "Event name (required)"
-"_flags.notify.hook":
-  short: "Hook name for structured detail (optional)"
-"_flags.notify.session-id":
-  short: "Session ID (optional)"
-"_flags.notify.variant":
-  short: "Template variant for structured detail (optional)"
-"_flags.pad.add.file":
-  short: "ingest a file as a blob entry"
-"_flags.pad.edit.append":
-  short: "append text to the end of the entry"
-"_flags.pad.edit.file":
-  short: "replace blob file content"
-"_flags.pad.edit.label":
-  short: "replace blob label"
-"_flags.pad.edit.prepend":
-  short: "prepend text to the beginning of the entry"
-"_flags.pad.export.dry-run":
-  short: "print what would be exported without writing"
-"_flags.pad.export.force":
-  short: "overwrite existing files instead of timestamping"
-"_flags.pad.imp.blobs":
-  short: "import first-level files from a directory as blob entries"
-"_flags.pad.merge.dry-run":
-  short: "print what would be merged without writing"
-"_flags.pad.merge.key":
-  short: "path to key file for decrypting input files"
-"_flags.pad.show.out":
-  short: "write blob content to a file"
-"_flags.prompt.add.stdin":
-  short: "read prompt content from stdin"
-"_flags.recall.export.all":
-  short: "Export all sessions from current project"
-"_flags.recall.export.all-projects":
-  short: "Include sessions from all projects"
-"_flags.recall.export.dry-run":
-  short: "Show what would be exported without writing files"
-"_flags.recall.export.force":
-  short: "Overwrite existing files completely (discard frontmatter)"
-"_flags.recall.export.keep-frontmatter":
-  short: "Preserve enriched YAML frontmatter during regeneration"
-"_flags.recall.export.regenerate":
-  short: "Re-export existing files (preserves YAML frontmatter by default)"
-"_flags.recall.export.skip-existing":
-  short: "Skip files that already exist"
-"_flags.recall.export.yes":
-  short: "Skip confirmation prompt"
-"_flags.recall.list.all-projects":
-  short: "Include sessions from all projects"
-"_flags.recall.list.limit":
-  short: "Maximum sessions to display"
-"_flags.recall.list.project":
-  short: "Filter by project name"
-"_flags.recall.list.since":
-  short: "Show sessions on or after this date (YYYY-MM-DD)"
-"_flags.recall.list.tool":
-  short: "Filter by tool (e.g., claude-code)"
-"_flags.recall.list.until":
-  short: "Show sessions on or before this date (YYYY-MM-DD)"
-"_flags.recall.lock.all":
-  short: "Lock all journal entries"
-"_flags.recall.show.all-projects":
-  short: "Search sessions from all projects"
-"_flags.recall.show.full":
-  short: "Show full message content"
-"_flags.recall.show.latest":
-  short: "Show the most recent session"
-"_flags.recall.unlock.all":
-  short: "Unlock all journal entries"
-"_flags.remind.add.after":
-  short: "Don't surface until this date (YYYY-MM-DD)"
-"_flags.remind.after":
-  short: "Don't surface until this date (YYYY-MM-DD)"
-"_flags.remind.dismiss.all":
-  short: "Dismiss all reminders"
-"_flags.site.feed.base-url":
-  short: "Base URL for entry links"
-"_flags.site.feed.out":
-  short: "Output path for the generated feed"
-"_flags.status.json":
-  short: "Output as JSON"
-"_flags.status.verbose":
-  short: "Include file content previews"
-"_flags.sync.dry-run":
-  short: "Show what would change without modifying"
-"_flags.system.bootstrap.quiet":
-  short: "Output only the context directory path"
-"_flags.system.events.all":
-  short: "Include rotated log file"
-"_flags.system.events.event":
-  short: "Filter by event type"
-"_flags.system.events.hook":
-  short: "Filter by hook name"
-"_flags.system.events.json":
-  short: "Output raw JSONL"
-"_flags.system.events.last":
-  short: "Show last N events"
-"_flags.system.events.session":
-  short: "Filter by session ID"
-"_flags.system.prune.days":
-  short: "Prune files older than this many days"
-"_flags.system.prune.dry-run":
-  short: "Show what would be pruned without deleting"
-"_flags.system.stats.follow":
-  short: "Stream new entries as they arrive"
-"_flags.system.stats.json":
-  short: "Output raw JSONL"
-"_flags.system.stats.last":
-  short: "Show last N entries"
-"_flags.system.stats.session":
-  short: "Filter by session ID (prefix match)"
-"_flags.task.archive.dry-run":
-  short: "Preview changes without modifying files"
-"_flags.watch.dry-run":
-  short: "Show updates without applying"
-"_flags.watch.log":
-  short: "Log file to watch (default: stdin)"
-"_flags.pause.session-id":
-  short: "Session ID (overrides stdin)"
-"_flags.resume.session-id":
-  short: "Session ID (overrides stdin)"
-"_flags.system.pause.session-id":
-  short: "Session ID (overrides stdin)"
-"_flags.system.resume.session-id":
-  short: "Session ID (overrides stdin)"
-"_flags.system.backup.scope":
-  short: "Backup scope: project, global, or all"
-"_flags.system.backup.json":
-  short: "Output results as JSON"
-"_flags.system.bootstrap.json":
-  short: "Output in JSON format"
-"_flags.system.resources.json":
-  short: "Output in JSON format"
-"_flags.system.message.json":
-  short: "Output in JSON format"
-"_flags.system.markjournal.check":
-  short: "Check if stage is set (exit 1 if not)"
-"_flags.memory.import.dry-run":
-  short: "Show classification plan without writing"
-"_flags.memory.publish.budget":
-  short: "Line budget for published content"
-"_flags.memory.publish.dry-run":
-  short: "Show what would be published without writing"
-"_examples.decision":
-  short: "  ctx add decision \"Use PostgreSQL for primary database\"\n  ctx add decision \"Adopt Go 1.22 for range-over-func support\""
-"_examples.task":
-  short: "  ctx add task \"Implement user authentication\"\n  ctx add task \"Fix login bug\" --priority high"
-"_examples.learning":
-  short: "  ctx add learning \"Go embed requires files in same package\" \\\n    --context \"Tried to embed files from parent directory\" \\\n    --lesson \"go:embed only works with files in same or child directories\" \\\n    --application \"Keep embedded files in internal/templates/\""
-"_examples.convention":
-  short: "  ctx add convention \"Use camelCase for function names\"\n  ctx add convention \"All API responses use JSON\""
-"_examples.default":
-  short: "  ctx add  \"your content here\""
diff --git a/internal/assets/commands/examples.yaml b/internal/assets/commands/examples.yaml
new file mode 100644
index 00000000..e2e6f9fe
--- /dev/null
+++ b/internal/assets/commands/examples.yaml
@@ -0,0 +1,24 @@
+# Example usage text for ctx CLI.
+# Used by assets.ExampleDesc() for cobra Example fields.
+# Keys match entry types: decision, learning, task, convention
+
+convention:
+  short: |2-
+      ctx add convention "Use camelCase for function names"
+      ctx add convention "All API responses use JSON"
+decision:
+  short: |2-
+      ctx add decision "Use PostgreSQL for primary database"
+      ctx add decision "Adopt Go 1.22 for range-over-func support"
+default:
+  short: '  ctx add  "your content here"'
+learning:
+  short: |2-
+      ctx add learning "Go embed requires files in same package" \
+        --context "Tried to embed files from parent directory" \
+        --lesson "go:embed only works with files in same or child directories" \
+        --application "Keep embedded files in internal/templates/"
+task:
+  short: |2-
+      ctx add task "Implement user authentication"
+      ctx add task "Fix login bug" --priority high
diff --git a/internal/assets/commands/flags.yaml b/internal/assets/commands/flags.yaml
new file mode 100644
index 00000000..514a6d89
--- /dev/null
+++ b/internal/assets/commands/flags.yaml
@@ -0,0 +1,236 @@
+# Flag descriptions for ctx CLI.
+# Used by assets.FlagDesc() to populate cobra flag usage strings.
+# Keys use dot notation: scope.flag-name (e.g., add.file)
+
+add.application:
+  short: 'Application for learnings: how to apply this going forward (required for learnings)'
+add.consequences:
+  short: 'Consequences for decisions: what changes as a result (required for decisions)'
+add.context:
+  short: 'Context for decisions: what prompted this decision (required for decisions)'
+add.file:
+  short: Read content from file instead of argument
+add.lesson:
+  short: 'Lesson for learnings: the key insight (required for learnings)'
+add.priority:
+  short: Priority level for tasks (high, medium, low)
+add.rationale:
+  short: 'Rationale for decisions: why this choice over alternatives (required for decisions)'
+add.section:
+  short: Target section within file
+agent.budget:
+  short: Token budget for context packet
+agent.cooldown:
+  short: Suppress repeated output within this duration (0 to disable)
+agent.format:
+  short: 'Output format: md or json'
+agent.session:
+  short: Session identifier for cooldown isolation (e.g., $PPID)
+allow-outside-cwd:
+  short: Allow context directory outside current working directory
+changes.since:
+  short: 'Time reference: duration (24h) or date (2026-03-01)'
+compact.archive:
+  short: Create .context/archive/ for old content
+context-dir:
+  short: 'Override context directory path (default: .context)'
+deps.external:
+  short: Include external module dependencies
+deps.format:
+  short: 'Output format: mermaid, table, json'
+deps.type:
+  short: 'Force project type: go, node, python, rust'
+doctor.json:
+  short: Machine-readable JSON output
+drift.fix:
+  short: Auto-fix supported issues (staleness, missing files)
+drift.json:
+  short: Output as JSON
+guide.commands:
+  short: List all CLI commands
+guide.skills:
+  short: List all available skills
+hook.write:
+  short: Write the configuration file instead of printing
+initialize.force:
+  short: Overwrite existing context files
+initialize.merge:
+  short: Auto-merge ctx content into existing CLAUDE.md and PROMPT.md
+initialize.minimal:
+  short: Only create essential files (TASKS.md, DECISIONS.md, CONSTITUTION.md)
+initialize.no-plugin-enable:
+  short: Skip auto-enabling the ctx plugin in global Claude Code settings
+initialize.ralph:
+  short: Agent works autonomously without asking questions
+journal.obsidian.output:
+  short: Output directory for vault
+journal.site.build:
+  short: Run zensical build after generating
+journal.site.output:
+  short: Output directory for site
+journal.site.serve:
+  short: Run zensical serve after generating
+load.budget:
+  short: Token budget for assembly
+load.raw:
+  short: Output raw file contents without assembly
+loop.completion:
+  short: Completion signal to detect
+loop.max-iterations:
+  short: Maximum iterations (0 = unlimited)
+loop.output:
+  short: Output script filename
+loop.prompt:
+  short: Prompt file to use
+loop.tool:
+  short: 'AI tool: claude, aider, or generic'
+memory.budget:
+  short: Line budget for published content
+memory.dry-run:
+  short: Show classification plan without writing
+memory.import.dry-run:
+  short: Show classification plan without writing
+memory.publish.budget:
+  short: Line budget for published content
+memory.publish.dry-run:
+  short: Show what would be published without writing
+memory.sync.dry-run:
+  short: Show what would happen without writing
+no-color:
+  short: Disable colored output
+notify.event:
+  short: Event name (required)
+notify.hook:
+  short: Hook name for structured detail (optional)
+notify.session-id:
+  short: Session ID (optional)
+notify.variant:
+  short: Template variant for structured detail (optional)
+pad.add.file:
+  short: Ingest a file as a blob entry
+pad.edit.append:
+  short: Append text to the end of the entry
+pad.edit.file:
+  short: Replace blob file content
+pad.edit.label:
+  short: Replace blob label
+pad.edit.prepend:
+  short: Prepend text to the beginning of the entry
+pad.export.dry-run:
+  short: Print what would be exported without writing
+pad.export.force:
+  short: Overwrite existing files instead of timestamping
+pad.imp.blobs:
+  short: Import first-level files from a directory as blob entries
+pad.merge.dry-run:
+  short: Print what would be merged without writing
+pad.merge.key:
+  short: Path to key file for decrypting input files
+pad.show.out:
+  short: Write blob content to a file
+pause.session-id:
+  short: Session ID (overrides stdin)
+prompt.add.stdin:
+  short: Read prompt content from stdin
+recall.export.all:
+  short: Export all sessions from current project
+recall.export.all-projects:
+  short: Include sessions from all projects
+recall.export.dry-run:
+  short: Show what would be exported without writing files
+recall.export.keep-frontmatter:
+  short: Preserve enriched YAML frontmatter during regeneration
+recall.export.regenerate:
+  short: Re-export existing files (preserves YAML frontmatter by default)
+recall.export.skip-existing:
+  short: Skip files that already exist
+recall.export.yes:
+  short: Skip confirmation prompt
+recall.list.all-projects:
+  short: Include sessions from all projects
+recall.list.limit:
+  short: Maximum sessions to display
+recall.list.project:
+  short: Filter by project name
+recall.list.since:
+  short: Show sessions on or after this date (YYYY-MM-DD)
+recall.list.tool:
+  short: Filter by tool (e.g., claude-code)
+recall.list.until:
+  short: Show sessions on or before this date (YYYY-MM-DD)
+recall.lock.all:
+  short: Lock all journal entries
+recall.show.all-projects:
+  short: Search sessions from all projects
+recall.show.full:
+  short: Show full message content
+recall.show.latest:
+  short: Show the most recent session
+recall.unlock.all:
+  short: Unlock all journal entries
+remind.add.after:
+  short: Don't surface until this date (YYYY-MM-DD)
+remind.after:
+  short: Don't surface until this date (YYYY-MM-DD)
+remind.dismiss.all:
+  short: Dismiss all reminders
+resume.session-id:
+  short: Session ID (overrides stdin)
+site.feed.base-url:
+  short: Base URL for entry links
+site.feed.out:
+  short: Output path for the generated feed
+status.json:
+  short: Output as JSON
+status.verbose:
+  short: Include file content previews
+sync.dry-run:
+  short: Show what would change without modifying
+system.backup.json:
+  short: Output results as JSON
+system.backup.scope:
+  short: 'Backup scope: project, global, or all'
+system.bootstrap.json:
+  short: Output in JSON format
+system.bootstrap.quiet:
+  short: Output only the context directory path
+system.events.all:
+  short: Include rotated log file
+system.events.event:
+  short: Filter by event type
+system.events.hook:
+  short: Filter by hook name
+system.events.json:
+  short: Output raw JSONL
+system.events.last:
+  short: Show last N events
+system.events.session:
+  short: Filter by session ID
+system.markjournal.check:
+  short: Check if stage is set (exit 1 if not)
+system.message.json:
+  short: Output in JSON format
+system.pause.session-id:
+  short: Session ID (overrides stdin)
+system.prune.days:
+  short: Prune files older than this many days
+system.prune.dry-run:
+  short: Show what would be pruned without deleting
+system.resources.json:
+  short: Output in JSON format
+system.resume.session-id:
+  short: Session ID (overrides stdin)
+system.stats.follow:
+  short: Stream new entries as they arrive
+system.stats.json:
+  short: Output raw JSONL
+system.stats.last:
+  short: Show last N entries
+system.stats.session:
+  short: Filter by session ID (prefix match)
+task.archive.dry-run:
+  short: Preview changes without modifying files
+watch.dry-run:
+  short: Show updates without applying
+watch.log:
+  short: 'Log file to watch (default: stdin)'
diff --git a/internal/assets/commands/text.yaml b/internal/assets/commands/text.yaml
new file mode 100644
index 00000000..b98cb053
--- /dev/null
+++ b/internal/assets/commands/text.yaml
@@ -0,0 +1,1426 @@
+# User-facing text strings for ctx CLI.
+# Used by assets.TextDesc() for UI messages, prompts, and labels.
+# Keys use dot notation: scope.name (e.g., agent.instruction)
+
+agent.instruction:
+  short: 'Before starting work, confirm to the user: "I have read the required context files and I''m following project conventions."'
+backup.box-title:
+  short: Backup Warning
+backup.no-marker:
+  short: No backup marker found — backup may have never run.
+backup.relay-message:
+  short: Backup warning
+backup.relay-prefix:
+  short: 'IMPORTANT: Relay this backup warning to the user VERBATIM before answering their question.'
+backup.run-hint:
+  short: 'Run: ctx system backup'
+backup.smb-not-mounted:
+  short: SMB share (%s) is not mounted.
+backup.smb-unavailable:
+  short: Backups cannot run until it's available.
+backup.stale:
+  short: Last .context backup is %d days old.
+block.absolute-path:
+  short: 'Use ''ctx'' from PATH, not absolute paths. Ask the user to run: make build && sudo make install'
+block.constitution-suffix:
+  short: 'See CONSTITUTION.md: ctx Invocation Invariants'
+block.cp-to-bin:
+  short: Agent must not copy binaries to bin directories. Ask the user to run 'sudo make install' instead.
+block.dot-slash:
+  short: 'Use ''ctx'' from PATH, not ''./ctx'' or ''./dist/ctx''. Ask the user to run: make build && sudo make install'
+block.go-run:
+  short: 'Use ''ctx'' from PATH, not ''go run ./cmd/ctx''. Ask the user to run: make build && sudo make install'
+block.install-to-local-bin:
+  short: Do not copy binaries to ~/.local/bin — this overrides the system ctx in /usr/local/bin. Use 'ctx' from PATH.
+block.mid-git-push:
+  short: git push requires explicit user approval.
+block.mid-sudo:
+  short: Cannot use sudo (no password access). Use 'make build && sudo make install' manually if needed.
+block.non-path-relay-message:
+  short: Blocked non-PATH ctx invocation
+bootstrap.next-steps:
+  short: |-
+    1. Read AGENT_PLAYBOOK.md from the context directory
+    2. Run `ctx agent --budget 4000` for a content summary
+bootstrap.none:
+  short: (none)
+bootstrap.plugin-warning:
+  short: 'ctx plugin is installed but not enabled. Run ''ctx init'' to auto-enable, or add {"enabledPlugins": {"ctx@activememory-ctx": true}} to ~/.claude/settings.json'
+bootstrap.rules:
+  short: |-
+    1. Use context_dir above for ALL file reads/writes
+    2. Never say "I don't have memory" — context IS your memory
+    3. Read files silently, present as recall (not search)
+    4. Persist learnings/decisions before session ends
+    5. Run `ctx agent` for content summaries
+    6. Run `ctx status` for context health
+ceremony.box-both:
+  short: Session Ceremonies
+ceremony.box-remember:
+  short: Session Start
+ceremony.box-wrapup:
+  short: Session End
+ceremony.fallback-both:
+  short: |-
+    Your last 3 sessions didn't use /ctx-remember or
+    /ctx-wrap-up.
+
+    Start sessions with:  /ctx-remember
+      → Loads context, shows active tasks, picks up
+        where you left off. No re-explaining needed.
+
+    End sessions with:    /ctx-wrap-up
+      → Captures learnings and decisions so the next
+        session starts informed, not from scratch.
+
+    These take seconds and save minutes.
+ceremony.fallback-remember:
+  short: |-
+    Try starting this session with /ctx-remember
+
+    It loads your context, shows active tasks, and
+    picks up where you left off — no re-explaining.
+ceremony.fallback-wrapup:
+  short: |-
+    Your last 3 sessions didn't end with /ctx-wrap-up
+
+    It captures learnings and decisions so the next
+    session starts informed, not from scratch.
+ceremony.relay-message:
+  short: Session ceremony nudge
+ceremony.relay-prefix:
+  short: 'IMPORTANT: Relay this session tip to the user VERBATIM before answering their question.'
+changes.fallback-label:
+  short: 24 hour(s) ago (default)
+changes.since-prefix:
+  short: 'since '
+check-context-size.billing-box-title:
+  short: Billing Threshold
+check-context-size.billing-fallback:
+  short: |-
+    ⚠ Token usage (~%s) has exceeded your
+    billing_token_warn threshold (%s).
+    Additional tokens may incur extra cost.
+check-context-size.billing-log-format:
+  short: prompt#%d BILLING-WARNING tokens=%d threshold=%d
+check-context-size.billing-relay-format:
+  short: Billing threshold exceeded (%s tokens > %s)
+check-context-size.billing-relay-prefix:
+  short: 'IMPORTANT: Relay this billing warning to the user VERBATIM before answering their question.'
+check-context-size.checkpoint-box-title:
+  short: 'Context Checkpoint (prompt #%d)'
+check-context-size.checkpoint-fallback:
+  short: |-
+    This session is getting deep. Consider wrapping up
+    soon. If there are unsaved learnings, decisions, or
+    conventions, now is a good time to persist them.
+check-context-size.checkpoint-log-format:
+  short: prompt#%d CHECKPOINT tokens=%d pct=%d%%
+check-context-size.checkpoint-relay-format:
+  short: 'Context Checkpoint at prompt #%d'
+check-context-size.oversize-fallback:
+  short: |-
+    ⚠ Context injection is large (~%d tokens).
+    Run /ctx-consolidate to distill your context files.
+check-context-size.relay-prefix:
+  short: 'IMPORTANT: Relay this context checkpoint to the user VERBATIM before answering their question.'
+check-context-size.running-low-suffix:
+  short: ' — running low'
+check-context-size.silenced-billing-log:
+  short: prompt#%d billing-silenced tokens=%d threshold=%d
+check-context-size.silenced-checkpoint-log:
+  short: prompt#%d silenced-by-template
+check-context-size.silenced-window-log:
+  short: prompt#%d window-silenced pct=%d%%
+check-context-size.silent-log-format:
+  short: prompt#%d silent
+check-context-size.suppressed-log-format:
+  short: prompt#%d suppressed (wrapped up)
+check-context-size.token-low:
+  short: ⚠
+check-context-size.token-normal:
+  short: ⏱
+check-context-size.token-usage:
+  short: '%s Context window: ~%s tokens (~%d%% of %s)%s'
+check-context-size.window-box-title:
+  short: Context Window Warning
+check-context-size.window-fallback:
+  short: |-
+    ⚠ Context window is %d%% full (~%s tokens).
+    The session will lose older context soon. Consider wrapping up
+    or starting a fresh session with /ctx-wrap-up.
+check-context-size.window-log-format:
+  short: prompt#%d WINDOW-WARNING tokens=%d pct=%d%%
+check-context-size.window-relay-format:
+  short: Context window at %d%%
+check-journal.box-title:
+  short: Journal Reminder
+check-journal.fallback-both:
+  short: |-
+    You have %d new session(s) not yet exported.
+    %d existing entries need enrichment.
+
+    Process journal (exports and enriches):
+      /ctx-journal-enrich-all
+check-journal.fallback-unenriched:
+  short: |-
+    %d journal entries need enrichment.
+
+    Enrich:
+      /ctx-journal-enrich-all
+check-journal.fallback-unexported:
+  short: |-
+    You have %d new session(s) not yet exported.
+
+    Process journal (exports and enriches):
+      /ctx-journal-enrich-all
+check-journal.relay-format:
+  short: '%d unexported, %d unenriched'
+check-journal.relay-prefix:
+  short: 'IMPORTANT: Relay this journal reminder to the user VERBATIM before answering their question.'
+check-knowledge.box-title:
+  short: Knowledge File Growth
+check-knowledge.fallback:
+  short: |-
+    Large knowledge files dilute agent context. Consider:
+     • Review and remove outdated entries
+     • Use /ctx-consolidate to merge overlapping entries
+     • Use /ctx-drift for semantic drift (stale patterns)
+     • Move stale entries to .context/archive/ manually
+check-knowledge.finding-format:
+  short: |
+    %s has %d %s (recommended: ≤%d).
+check-knowledge.relay-message:
+  short: Knowledge file growth detected
+check-knowledge.relay-prefix:
+  short: 'IMPORTANT: Relay this knowledge health notice to the user VERBATIM before answering their question.'
+check-map-staleness.box-title:
+  short: Architecture Map Stale
+check-map-staleness.fallback:
+  short: |-
+    ARCHITECTURE.md hasn't been refreshed since %s
+    and there are commits touching %d modules.
+    /ctx-architecture keeps architecture docs drift-free.
+
+    Want me to run /ctx-architecture to refresh?
+check-map-staleness.relay-message:
+  short: Architecture map stale
+check-map-staleness.relay-prefix:
+  short: 'IMPORTANT: Relay this architecture map notice to the user VERBATIM before answering their question.'
+check-memory-drift.box-title:
+  short: Memory Drift
+check-memory-drift.content:
+  short: 'MEMORY.md has changed since last sync.%sRun: ctx memory sync'
+check-memory-drift.relay-message:
+  short: 'MEMORY.md has changed since last sync — run ctx memory sync'
+check-memory-drift.relay-prefix:
+  short: 'IMPORTANT: Relay this memory drift notice to the user VERBATIM before answering their question.'
+check-persistence.box-title:
+  short: Persistence Checkpoint
+check-persistence.box-title-format:
+  short: '%s (prompt #%d)'
+check-persistence.checkpoint-format:
+  short: 'Persistence Checkpoint at prompt #%d'
+check-persistence.fallback:
+  short: |-
+    No context files updated in %d+ prompts.
+    Have you discovered learnings, made decisions,
+    established conventions, or completed tasks
+    worth persisting?
+
+    Run /ctx-wrap-up to capture session context.
+check-persistence.init-log-format:
+  short: init count=1 mtime=%d
+check-persistence.modified-log-format:
+  short: prompt#%d context-modified, reset nudge counter
+check-persistence.relay-format:
+  short: No context updated in %d+ prompts
+check-persistence.relay-prefix:
+  short: 'IMPORTANT: Relay this persistence checkpoint to the user VERBATIM before answering their question.'
+check-persistence.silenced-log-format:
+  short: prompt#%d silenced-by-template
+check-persistence.silent-log-format:
+  short: prompt#%d silent since_nudge=%d
+check-persistence.state-format:
+  short: |
+    count=%d
+    last_nudge=%d
+    last_mtime=%d
+check-reminders.box-title:
+  short: Reminders
+check-reminders.dismiss-all-hint:
+  short: 'Dismiss all: ctx remind dismiss --all'
+check-reminders.dismiss-hint:
+  short: 'Dismiss: ctx remind dismiss '
+check-reminders.item-format:
+  short: ' [%d] %s'
+check-reminders.nudge-format:
+  short: You have %d pending reminders
+check-reminders.relay-prefix:
+  short: 'IMPORTANT: Relay these reminders to the user VERBATIM before answering their question.'
+check-resources.box-title:
+  short: Resource Alert
+check-resources.fallback-end:
+  short: and consider ending this session.
+check-resources.fallback-low:
+  short: System resources are critically low.
+check-resources.fallback-persist:
+  short: Persist unsaved context NOW with /ctx-wrap-up
+check-resources.relay-message:
+  short: System resources critically low
+check-resources.relay-prefix:
+  short: 'IMPORTANT: Relay this resource warning to the user VERBATIM.'
+check-task-completion.fallback:
+  short: If you completed a task, mark it [x] in TASKS.md.
+check-task-completion.nudge-message:
+  short: task completion nudge
+check-version.box-title:
+  short: Version Mismatch
+check-version.fallback:
+  short: |-
+    Your ctx binary is v%s but the plugin expects v%s.
+
+    Reinstall the binary to get the best out of ctx:
+      go install github.com/ActiveMemory/ctx/cmd/ctx@latest
+check-version.key-box-title:
+  short: Key Rotation
+check-version.key-fallback:
+  short: |-
+    Your encryption key is %d days old.
+    Consider rotating: ctx pad rotate-key
+check-version.key-relay-format:
+  short: Encryption key is %d days old
+check-version.key-relay-prefix:
+  short: 'IMPORTANT: Relay this security reminder to the user VERBATIM.'
+check-version.mismatch-relay-format:
+  short: Binary v%s vs plugin v%s
+check-version.relay-prefix:
+  short: 'IMPORTANT: Relay this version warning to the user VERBATIM before answering their question.'
+confirm.proceed:
+  short: 'Proceed? [y/N] '
+context-load-gate.file-header:
+  short: |+
+    --- %s ---
+    %s
+
+context-load-gate.footer:
+  short: |
+    Context: %d files loaded (~%d tokens). Order follows config.FileReadOrder.
+
+    TASKS.md contains the project's prioritized work items. Read it when discussing priorities, picking up work, or when the user asks about tasks.
+
+    For full decision or learning details, read the entry in DECISIONS.md or LEARNINGS.md by timestamp.
+context-load-gate.header:
+  short: |
+    PROJECT CONTEXT (auto-loaded by system hook — already in your context window)
+context-load-gate.index-fallback:
+  short: (no index entries)
+context-load-gate.index-header:
+  short: |+
+    --- %s (index — read full entries by date when relevant) ---
+    %s
+
+context-load-gate.oversize-action:
+  short: |
+    Action: Run /ctx-consolidate to distill context files.
+    Files with the most growth are the best candidates.
+context-load-gate.oversize-breakdown:
+  short: |
+    Per-file breakdown:
+context-load-gate.oversize-file-entry:
+  short: |2
+      %-22s %5d tokens
+context-load-gate.oversize-header:
+  short: |
+    Context injection oversize warning
+context-load-gate.oversize-injected:
+  short: |+
+    Injected:  %d tokens (threshold: %d)
+
+context-load-gate.oversize-timestamp:
+  short: |
+    Timestamp: %s
+context-load-gate.webhook:
+  short: 'context-load-gate: injected %d files (~%d tokens)'
+doctor.context-file.format:
+  short: '%-22s ~%d tokens'
+doctor.context-initialized.error:
+  short: Context not initialized — run ctx init
+doctor.context-initialized.ok:
+  short: Context initialized (.context/)
+doctor.context-size.format:
+  short: 'Context size: ~%d tokens (window: %d)'
+doctor.context-size.warning-suffix:
+  short: ' — consider ctx compact'
+doctor.ctxrc-validation.error:
+  short: '.ctxrc parse error: %v'
+doctor.ctxrc-validation.ok:
+  short: .ctxrc valid
+doctor.ctxrc-validation.ok-no-file:
+  short: No .ctxrc file (using defaults)
+doctor.ctxrc-validation.warning:
+  short: '.ctxrc has unknown fields: %s'
+doctor.drift.detected:
+  short: 'Drift: %s — run ctx drift for details'
+doctor.drift.ok:
+  short: No drift detected
+doctor.drift.violations:
+  short: '%d violations'
+doctor.drift.warning-load:
+  short: 'Could not load context for drift check: %v'
+doctor.drift.warnings:
+  short: '%d warnings'
+doctor.event-logging.info:
+  short: 'Event logging disabled (enable with event_log: true in .ctxrc)'
+doctor.event-logging.ok:
+  short: Event logging enabled
+doctor.output.header:
+  short: ctx doctor
+doctor.output.result-line:
+  short: '  %s %s'
+doctor.output.separator:
+  short: ==========
+doctor.output.summary:
+  short: 'Summary: %d warnings, %d errors'
+doctor.plugin-enabled-global.ok:
+  short: Plugin enabled globally (~/.claude/settings.json)
+doctor.plugin-enabled-local.ok:
+  short: Plugin enabled locally (.claude/settings.local.json)
+doctor.plugin-enabled.warning:
+  short: 'Plugin installed but not enabled — run ''ctx init'' to auto-enable, or add {"enabledPlugins": {"%s": true}} to ~/.claude/settings.json'
+doctor.plugin-installed.info:
+  short: ctx plugin not installed
+doctor.plugin-installed.ok:
+  short: ctx plugin installed
+doctor.recent-events.info:
+  short: No events in log
+doctor.recent-events.ok:
+  short: 'Last event: %s'
+doctor.reminders.info:
+  short: '%d pending reminders'
+doctor.reminders.ok:
+  short: No pending reminders
+doctor.required-files.error:
+  short: 'Missing required files (%d/%d): %s'
+doctor.required-files.ok:
+  short: Required files present (%d/%d)
+doctor.resource-disk.format:
+  short: Disk %d%% (%s / %s GB)
+doctor.resource-load.format:
+  short: Load %.2fx (%.1f / %d CPUs)
+doctor.resource-memory.format:
+  short: Memory %d%% (%s / %s GB)
+doctor.resource-swap.format:
+  short: Swap %d%% (%s / %s GB)
+doctor.task-completion.format:
+  short: 'Tasks: %d/%d completed (%d%%)'
+doctor.task-completion.warning-suffix:
+  short: ' — consider archiving with ctx tasks archive'
+doctor.webhook.info:
+  short: No webhook configured (optional — use ctx notify setup)
+doctor.webhook.ok:
+  short: Webhook configured
+drift.cleared:
+  short: ✓ Index cleared (no %s found)
+drift.dead-path:
+  short: references path that does not exist
+drift.entry-count:
+  short: 'has %d entries (recommended: ≤%d)'
+drift.missing-file:
+  short: required context file is missing
+drift.missing-package:
+  short: package %s is not documented
+drift.regenerated:
+  short: ✓ Index regenerated with %d entries
+drift.secret:
+  short: may contain secrets (constitution violation)
+drift.stale-age:
+  short: last modified %d days ago
+drift.staleness:
+  short: has many completed items (consider archiving)
+events.empty:
+  short: No events logged.
+events.human-format:
+  short: '%-19s  %-5s  %-24s  %s'
+heartbeat.log-plain:
+  short: prompt#%d context_modified=%t
+heartbeat.log-tokens:
+  short: prompt#%d context_modified=%t tokens=%s pct=%d%%
+heartbeat.notify-plain:
+  short: 'heartbeat: prompt #%d (context_modified=%t)'
+heartbeat.notify-tokens:
+  short: 'heartbeat: prompt #%d (context_modified=%t tokens=%s pct=%d%%)'
+hook.aider:
+  short: |
+    Aider Integration
+    =================
+
+    Add to your .aider.conf.yml:
+
+    ```yaml
+    read:
+      - .context/CONSTITUTION.md
+      - .context/TASKS.md
+      - .context/CONVENTIONS.md
+      - .context/ARCHITECTURE.md
+      - .context/DECISIONS.md
+    ```
+
+    Or pass context via command line:
+
+    ```bash
+    ctx agent | aider --message "$(cat -)"
+    ```
+hook.claude:
+  short: |
+    Claude Code Integration
+    =======================
+
+    Claude Code integration is now provided via the ctx plugin.
+
+    Install the plugin:
+      /plugin marketplace add ActiveMemory/ctx
+      /plugin install ctx@activememory-ctx
+
+    The plugin provides hooks (context monitoring, persistence
+    nudges, post-commit capture) and 25 skills automatically.
+hook.copilot:
+  short: |
+    GitHub Copilot Integration
+    ==========================
+
+    Add the following to .github/copilot-instructions.md,
+    or run with --write to generate the file directly:
+
+      ctx hook copilot --write
+hook.cursor:
+  short: |
+    Cursor IDE Integration
+    ======================
+
+    Add to your .cursorrules file:
+
+    ```markdown
+    # Project Context
+
+    Always read these files before making changes:
+    - .context/CONSTITUTION.md (NEVER violate these rules)
+    - .context/TASKS.md (current work)
+    - .context/CONVENTIONS.md (how we write code)
+    - .context/ARCHITECTURE.md (system structure)
+
+    Run 'ctx agent' for a context summary.
+    Run 'ctx drift' to check for stale context.
+    ```
+hook.supported-tools:
+  short: |
+    Supported tools:
+      claude-code  - Anthropic's Claude Code CLI (use plugin instead)
+      cursor       - Cursor IDE
+      aider        - Aider AI coding assistant
+      copilot      - GitHub Copilot
+      windsurf     - Windsurf IDE
+hook.windsurf:
+  short: |
+    Windsurf Integration
+    ====================
+
+    Add to your .windsurfrules file:
+
+    ```markdown
+    # Context
+
+    Read order for context:
+    1. .context/CONSTITUTION.md
+    2. .context/TASKS.md
+    3. .context/CONVENTIONS.md
+    4. .context/ARCHITECTURE.md
+    5. .context/DECISIONS.md
+
+    Run 'ctx agent' for AI-ready context packet.
+    ```
+import.count-convention:
+  short: '%d convention'
+import.count-decision:
+  short: '%d decision'
+import.count-learning:
+  short: '%d learning'
+import.count-task:
+  short: '%d task'
+journal.moc.browse-by:
+  short: '## Browse by'
+journal.moc.file-page-stats:
+  short: '**%d sessions** touching this file.'
+journal.moc.file-stats:
+  short: '**%d files** across **%d sessions** — **%d popular**, **%d long-tail**'
+journal.moc.files-description:
+  short: — sessions grouped by file touched
+journal.moc.nav-description:
+  short: Navigation hub for all journal entries.
+journal.moc.see-also:
+  short: '**See also**:'
+journal.moc.session-link:
+  short: '- [%s](%s%s) (%d sessions)%s'
+journal.moc.topic-page-stats:
+  short: '**%d sessions** with this topic.'
+journal.moc.topic-stats:
+  short: '**%d topics** across **%d sessions** — **%d popular**, **%d long-tail**'
+journal.moc.topics-description:
+  short: — sessions grouped by topic
+journal.moc.topics-label:
+  short: '**Topics**: '
+journal.moc.type-label:
+  short: '**Type**: '
+journal.moc.type-page-stats:
+  short: '**%d sessions** of type *%s*.'
+journal.moc.type-stats:
+  short: '**%d types** across **%d sessions**'
+journal.moc.types-description:
+  short: — sessions grouped by type
+mark-journal.checked:
+  short: '%s: %s = %s'
+mark-journal.marked:
+  short: '%s: marked %s'
+mark-wrapped-up.confirmed:
+  short: marked wrapped-up
+mcp.added-format:
+  short: Added %s to %s
+mcp.also-noted:
+  short: |+
+    ---
+    ## Also Noted
+
+mcp.completed-format:
+  short: 'Completed: %s'
+mcp.drift-issue-format:
+  short: |2
+      - [%s] %s: %s
+mcp.drift-passed:
+  short: |
+    Passed:
+mcp.drift-passed-format:
+  short: |2
+      - %s
+mcp.drift-status-format:
+  short: |+
+    Status: %s
+
+mcp.drift-violations:
+  short: |
+    Violations:
+mcp.drift-warnings:
+  short: |
+    Warnings:
+mcp.failed-marshal:
+  short: failed to marshal response
+mcp.file-not-found:
+  short: 'file not found: %s'
+mcp.invalid-params:
+  short: invalid params
+mcp.load-context:
+  short: 'failed to load context: %v'
+mcp.method-not-found:
+  short: 'method not found: %s'
+mcp.omitted-format:
+  short: |
+    - %s (omitted for budget)
+mcp.packet-header:
+  short: |+
+    # Context Packet
+
+mcp.parse-error:
+  short: parse error
+mcp.query-required:
+  short: query is required
+mcp.res-agent:
+  short: All context files assembled in priority read order
+mcp.res-architecture:
+  short: System architecture documentation
+mcp.res-constitution:
+  short: Hard rules that must never be violated
+mcp.res-conventions:
+  short: Code patterns and standards
+mcp.res-decisions:
+  short: Architectural decisions with rationale
+mcp.res-glossary:
+  short: Project-specific terminology
+mcp.res-learnings:
+  short: Gotchas, tips, and lessons learned
+mcp.res-playbook:
+  short: How agents should use this system
+mcp.res-tasks:
+  short: Current work items and their status
+mcp.section-format:
+  short: |+
+    ---
+    ## %s
+
+    %s
+
+mcp.status-context-format:
+  short: |
+    Context: %s
+mcp.status-empty:
+  short: EMPTY
+mcp.status-file-format:
+  short: |2
+      %-22s %6d tokens  [%s]
+mcp.status-files-format:
+  short: |
+    Files: %d
+mcp.status-ok:
+  short: OK
+mcp.status-tokens-format:
+  short: |+
+    Tokens: ~%d
+
+mcp.tool-add-desc:
+  short: Add a task, decision, learning, or convention to the context
+mcp.tool-complete-desc:
+  short: Mark a task as done by number or text match
+mcp.tool-drift-desc:
+  short: 'Detect stale or invalid context: dead paths, missing files, staleness'
+mcp.tool-prop-application:
+  short: How to apply this lesson (required for learnings)
+mcp.tool-prop-consequences:
+  short: Consequences (required for decisions)
+mcp.tool-prop-content:
+  short: Title or main content of the entry
+mcp.tool-prop-context:
+  short: Context field (required for decisions and learnings)
+mcp.tool-prop-lesson:
+  short: Lesson learned (required for learnings)
+mcp.tool-prop-priority:
+  short: Priority level (for tasks only)
+mcp.tool-prop-query:
+  short: Task number (e.g. '1') or search text to match
+mcp.tool-prop-rationale:
+  short: Rationale (required for decisions)
+mcp.tool-prop-type:
+  short: Entry type to add
+mcp.tool-status-desc:
+  short: 'Show context health: file count, token estimate, and file summaries'
+mcp.type-content-required:
+  short: type and content are required
+mcp.unknown-resource:
+  short: 'unknown resource: %s'
+mcp.unknown-tool:
+  short: 'unknown tool: %s'
+mcp.write-failed:
+  short: 'write failed: %v'
+memory.diff-new-format:
+  short: |
+    +++ %s (source)
+memory.diff-old-format:
+  short: |
+    --- %s (mirror)
+memory.import-review:
+  short: Imported from MEMORY.md — review and update as needed
+memory.import-source:
+  short: auto-memory import
+memory.publish-conventions:
+  short: '## Key Conventions'
+memory.publish-decisions:
+  short: '## Recent Decisions'
+memory.publish-learnings:
+  short: '## Recent Learnings'
+memory.publish-tasks:
+  short: '## Pending Tasks'
+memory.publish-title:
+  short: |+
+    # Project Context (managed by ctx)
+
+memory.select-content:
+  short: 'selecting content: %v'
+memory.write-memory:
+  short: 'writing MEMORY.md: %v'
+message.ctx-specific-warning:
+  short: |-
+    Warning: this message is ctx-specific (intended for ctx development).
+    Customizing it may produce unexpected results.
+message.edit-hint:
+  short: Edit this file to customize the message.
+message.list-header-category:
+  short: Category
+message.list-header-hook:
+  short: Hook
+message.list-header-override:
+  short: Override
+message.list-header-variant:
+  short: Variant
+message.no-override:
+  short: No override found for %s/%s. Already using embedded default.
+message.override-created:
+  short: Override created at %s
+message.override-label:
+  short: override
+message.override-removed:
+  short: Override removed for %s/%s. Using embedded default.
+message.source-default:
+  short: 'Source: embedded default'
+message.source-override:
+  short: 'Source: user override (%s)'
+message.template-vars-label:
+  short: 'Template variables: %s'
+message.template-vars-none:
+  short: 'Template variables: (none)'
+pad.key-created:
+  short: Scratchpad key created at %s
+parser.git-not-found:
+  short: git not found in PATH; install git to enable remote URL enrichment
+parser.session_prefix:
+  short: 'Session:'
+parser.session_prefix_alt:
+  short: 'Oturum:'
+pause.confirmed:
+  short: Context hooks paused for session %s
+post-commit.fallback:
+  short: 'Commit succeeded. 1. Offer context capture to the user: Decision (design choice?), Learning (gotcha?), or Neither. 2. Ask the user: "Want me to run lints and tests before you push?" Do NOT push. The user pushes manually.'
+post-commit.relay-message:
+  short: Commit succeeded, context capture offered
+prune.dry-run-line:
+  short: '  would prune: %s (age: %s)'
+prune.dry-run-summary:
+  short: Dry run — would prune %d files (skip %d recent, preserve %d global)
+prune.error-line:
+  short: '  error removing %s: %v'
+prune.summary:
+  short: Pruned %d files (skipped %d recent, preserved %d global)
+qa-reminder.fallback:
+  short: 'HARD GATE — DO NOT COMMIT without completing ALL of these steps first: (1) lint the ENTIRE project, (2) test the ENTIRE project, (3) verify a clean working tree (no modified or untracked files left behind). Not just the files you changed — the whole branch. If unrelated modified files remain, offer to commit them separately, stash them, or get explicit confirmation to leave them. Do NOT say ''I''ll do that at the end'' or ''I''ll handle that after committing.'' Run lint and tests BEFORE every git commit, every time, no exceptions.'
+qa-reminder.relay-message:
+  short: QA gate reminder emitted
+rc.parse_warning:
+  short: 'ctx: warning: failed to parse %s: %v (using defaults)'
+resources.alert-danger:
+  short: '  ✖ %s'
+resources.alert-disk:
+  short: Disk %.0f%% used (%s / %s GB)
+resources.alert-load:
+  short: Load %.2fx CPU count
+resources.alert-memory:
+  short: Memory %.0f%% used (%s / %s GB)
+resources.alert-swap:
+  short: Swap %.0f%% used (%s / %s GB)
+resources.alert-warning:
+  short: '  ⚠ %s'
+resources.alerts:
+  short: 'Alerts:'
+resources.all-clear:
+  short: All clear — no resource warnings.
+resources.header:
+  short: System Resources
+resources.separator:
+  short: ====================
+resources.status-danger:
+  short: ✖ DANGER
+resources.status-ok:
+  short: ✓ ok
+resources.status-warn:
+  short: ⚠ WARNING
+resume.confirmed:
+  short: Context hooks resumed for session %s
+specs-nudge.fallback:
+  short: Save your plan to specs/ — these documents track what was designed for the current release. Use specs/feature-name.md naming. If this is a quick fix that doesn't need a spec, proceed without one.
+specs-nudge.nudge-message:
+  short: plan-to-specs nudge emitted
+stats.empty:
+  short: No stats recorded yet.
+stats.header-format:
+  short: '%-19s  %-8s  %6s  %8s  %4s  %-12s'
+stats.line-format:
+  short: '%-19s  %-8s  %6d  %7s  %3d%%  %-12s'
+stopwords:
+  short: the and for that this with from are was were been have has had but not you all can her his she its our they will each make like use way may any into when which their about would there what also should after before than then them could more some other only just see add new update how
+summary.active:
+  short: '%d active'
+summary.completed:
+  short: '%d completed'
+summary.decision:
+  short: 1 decision
+summary.decisions:
+  short: '%d decisions'
+summary.empty:
+  short: empty
+summary.invariants:
+  short: '%d invariants'
+summary.loaded:
+  short: loaded
+summary.term:
+  short: 1 term
+summary.terms:
+  short: '%d terms'
+sync.config.description:
+  short: Found %s but %s not documented
+sync.config.suggestion:
+  short: Document %s in %s
+sync.deps.description:
+  short: Found %s (%s) but no dependency documentation
+sync.deps.suggestion:
+  short: Consider documenting key dependencies in %s or create %s
+sync.dir.description:
+  short: Directory '%s/' exists but not documented
+sync.dir.suggestion:
+  short: Add '%s/' to %s with description
+task-archive.content-preview:
+  short: 'Archived content preview:'
+task-archive.dry-run-header:
+  short: Dry run — no files modified
+task-archive.dry-run-summary:
+  short: Would archive %d completed tasks (keeping %d pending)
+task-archive.no-completed:
+  short: No completed tasks to archive.
+task-archive.pending-remain:
+  short: '  %d pending tasks remain in TASKS.md'
+task-archive.skip-incomplete:
+  short: No tasks to archive (%d skipped due to incomplete children).
+task-archive.skipping:
+  short: '! Skipping (has incomplete children): %s'
+task-archive.success:
+  short: ✓ Archived %d completed tasks to %s
+task-archive.success-with-age:
+  short: ✓ Archived %d tasks to %s (older than %d days)
+task-snapshot.created-format:
+  short: 'Created: %s'
+task-snapshot.header-format:
+  short: '# TASKS.md Snapshot — %s'
+task-snapshot.saved:
+  short: ✓ Snapshot saved to %s
+time.ago:
+  short: ' ago'
+time.day:
+  short: day
+time.hour:
+  short: hour
+time.just-now:
+  short: just now
+time.minute:
+  short: minute
+version-drift.relay-message:
+  short: versions out of sync
+watch.apply-failed:
+  short: |
+    ✗ Failed to apply [%s]: %v
+watch.apply-success:
+  short: |
+    ✓ Applied: [%s] %s
+watch.close-log-error:
+  short: 'failed to close log file: %v'
+watch.dry-run:
+  short: DRY RUN — No changes will be made
+watch.dry-run-preview:
+  short: |
+    ○ Would apply: [%s] %s
+watch.stop-hint:
+  short: Press Ctrl+C to stop
+watch.watching:
+  short: Watching for context updates...
+why.admonition-format:
+  short: '> **%s**'
+why.banner:
+  short: |2-
+
+       /    ctx:                         https://ctx.ist
+     ,'`./    do you remember?
+     `.,'\
+       \
+          {}  -> what
+          ctx -> why
+why.blockquote-prefix:
+  short: '> '
+why.bold-format:
+  short: '**%s**'
+why.menu-item-format:
+  short: '  [%d] %s'
+why.menu-prompt:
+  short: "\nSelect a document (1-3): "
+write.added-to:
+  short: ✓ Added to %s
+write.archived:
+  short: Archived previous mirror to %s
+write.backup-result:
+  short: '%s: %s (%s)'
+write.backup-smb-dest:
+  short: ' → %s'
+write.bootstrap-dir:
+  short: 'context_dir: %s'
+write.bootstrap-files:
+  short: 'Files:'
+write.bootstrap-next-steps:
+  short: 'Next steps:'
+write.bootstrap-numbered:
+  short: '  %d. %s'
+write.bootstrap-rules:
+  short: 'Rules:'
+write.bootstrap-sep:
+  short: =============
+write.bootstrap-title:
+  short: ctx bootstrap
+write.bootstrap-warning:
+  short: 'Warning: %s'
+write.completed-task:
+  short: '✓ Completed: %s'
+write.config-profile-base:
+  short: 'active: base (defaults)'
+write.config-profile-dev:
+  short: 'active: dev (verbose logging enabled)'
+write.config-profile-none:
+  short: 'active: none (%s does not exist)'
+write.deps-looking-for:
+  short: 'Looking for: go.mod, package.json, requirements.txt, pyproject.toml, Cargo.toml'
+write.deps-no-deps:
+  short: No dependencies found.
+write.deps-no-project:
+  short: No supported project detected.
+write.deps-use-type:
+  short: 'Use --type to force: %s'
+write.dry-run:
+  short: Dry run — no files will be written.
+write.exists-writing-as-alternative:
+  short: '  ! %s exists, writing as %s'
+write.hook-copilot-created:
+  short: '  ✓ %s'
+write.hook-copilot-force-hint:
+  short: '  Use --force to overwrite (not yet implemented).'
+write.hook-copilot-merged:
+  short: '  ✓ %s (merged)'
+write.hook-copilot-sessions-dir:
+  short: '  ✓ %s/'
+write.hook-copilot-skipped:
+  short: '  ○ %s (ctx content exists, skipped)'
+write.hook-copilot-summary:
+  short: |-
+    Copilot Chat (agent mode) will now:
+      1. Read .context/ files at session start
+      2. Save session summaries to .context/sessions/
+      3. Proactively update context during work
+write.hook-unknown-tool:
+  short: |
+    Unknown tool: %s
+write.import-added:
+  short: '     Added to %s'
+write.import-classified:
+  short: '     Classified: %s (keywords: %s)'
+write.import-classified-skip:
+  short: '     Classified: skip'
+write.import-duplicates:
+  short: 'Duplicates: %d entries (already imported)'
+write.import-entry:
+  short: '  -> %q'
+write.import-found:
+  short: '  Found %d entries'
+write.import-no-entries:
+  short: No entries found in %s.
+write.import-scanning:
+  short: Scanning %s for new entries...
+write.import-skipped:
+  short: 'Skipped: %d entries (session notes/unclassified)'
+write.import-summary:
+  short: 'Imported: %d entries'
+write.import-summary-dry-run:
+  short: 'Dry run — would import: %d entries'
+write.init-aborted:
+  short: Aborted.
+write.init-backup:
+  short: '  ✓ %s (backup)'
+write.init-created-dir:
+  short: '  ✓ %s/'
+write.init-created-with:
+  short: '  ✓ %s%s'
+write.init-creating-root-files:
+  short: Creating project root files...
+write.init-ctx-content-exists:
+  short: '  ○ %s (ctx content exists, skipped)'
+write.init-exists-skipped:
+  short: '  ○ %s (exists, skipped)'
+write.init-file-created:
+  short: '  ✓ %s'
+write.init-file-exists-no-ctx:
+  short: '%s exists but has no ctx content.'
+write.init-gitignore-review:
+  short: '  Review with: cat .gitignore'
+write.init-gitignore-updated:
+  short: '  ✓ .gitignore updated (%d entries added)'
+write.init-makefile-appended:
+  short: '  ✓ Makefile (appended %s include)'
+write.init-makefile-created:
+  short: '  ✓ Makefile (created with ctx include)'
+write.init-makefile-includes:
+  short: '  ○ Makefile (already includes %s)'
+write.init-merged:
+  short: '  ✓ %s (merged)'
+write.init-next-steps:
+  short: |-
+    Next steps:
+      1. Edit .context/TASKS.md to add your current tasks
+      2. Run 'ctx status' to see context summary
+      3. Run 'ctx agent' to get AI-ready context packet
+write.init-no-changes:
+  short: '  ○ %s (no changes needed)'
+write.init-overwrite-prompt:
+  short: '%s already exists. Overwrite? [y/N] '
+write.init-perms-allow:
+  short: '  ✓ %s (added ctx permissions)'
+write.init-perms-allow-deny:
+  short: '  ✓ %s (added ctx allow + deny permissions)'
+write.init-perms-deduped:
+  short: '  ✓ %s (removed duplicate permissions)'
+write.init-perms-deny:
+  short: '  ✓ %s (added ctx deny permissions)'
+write.init-perms-merged-deduped:
+  short: '  ✓ %s (added ctx permissions, removed duplicates)'
+write.init-plugin-already-enabled:
+  short: '  ○ Plugin already enabled globally'
+write.init-plugin-enabled:
+  short: '  ✓ Plugin enabled globally in %s'
+write.init-plugin-info:
+  short: |-
+    Claude Code users: install the ctx plugin for hooks & skills:
+      /plugin marketplace add ActiveMemory/ctx
+      /plugin install ctx@activememory-ctx
+write.init-plugin-note:
+  short: |-
+    Note: local plugin installs are not auto-enabled globally.
+    Run 'ctx init' again after installing the plugin to enable it,
+    or manually add to ~/.claude/settings.json:
+      {"enabledPlugins": {"ctx@activememory-ctx": true}}
+write.init-plugin-skipped:
+  short: '  ○ Plugin enablement skipped (plugin not installed)'
+write.init-scratchpad-key-created:
+  short: '  ✓ Scratchpad key created at %s'
+write.init-scratchpad-no-key:
+  short: '  ⚠ Encrypted scratchpad found but no key at %s'
+write.init-scratchpad-plaintext:
+  short: '  ✓ %s (plaintext scratchpad)'
+write.init-setting-up-permissions:
+  short: Setting up Claude Code permissions...
+write.init-skipped-dir:
+  short: '  ○ %s/ (exists, skipped)'
+write.init-skipped-plain:
+  short: '  ○ %s (skipped)'
+write.init-updated-ctx-section:
+  short: '  ✓ %s (updated ctx section)'
+write.init-updated-plan-section:
+  short: '  ✓ %s (updated plan section)'
+write.init-updated-prompt-section:
+  short: '  ✓ %s (updated prompt section)'
+write.init-warn-non-fatal:
+  short: '  ⚠ %s: %v'
+write.initialized:
+  short: Context initialized in %s/
+write.journal-orphan-removed:
+  short: '  removed orphan: %s'
+write.journal-site-alt:
+  short: '  ctx journal site --serve'
+write.journal-site-building:
+  short: Building site...
+write.journal-site-generated:
+  short: ✓ Generated site with %d entries in %s
+write.journal-site-next-steps:
+  short: '  cd %s && %s serve'
+write.journal-site-starting:
+  short: Starting local server...
+write.journal-sync-locked:
+  short: '  ✓ %s (locked)'
+write.journal-sync-locked-count:
+  short: |2-
+
+    Locked %d entry(s).
+write.journal-sync-match:
+  short: No changes — state already matches frontmatter.
+write.journal-sync-none:
+  short: No journal entries found.
+write.journal-sync-unlocked:
+  short: '  ✓ %s (unlocked)'
+write.journal-sync-unlocked-count:
+  short: |2-
+
+    Unlocked %d entry(s).
+write.lines:
+  short: '  Lines: %d'
+write.lines-previous:
+  short: ' (was %d)'
+write.lock-unlock-entry:
+  short: '  ok %s (%s)'
+write.lock-unlock-no-changes:
+  short: No changes — all matched entries already %s.
+write.lock-unlock-summary:
+  short: |2-
+
+    %s %d entry(s).
+write.loop-completion:
+  short: 'Completion signal: %s'
+write.loop-generated:
+  short: ✓ Generated %s
+write.loop-max-iterations:
+  short: 'Max iterations: %d'
+write.loop-prompt:
+  short: 'Prompt: %s'
+write.loop-run-cmd:
+  short: '  ./%s'
+write.loop-tool:
+  short: 'Tool: %s'
+write.loop-unlimited:
+  short: 'Max iterations: unlimited'
+write.memory-archives:
+  short: '  Archives:   %d snapshots in .context/%s/'
+write.memory-bridge-header:
+  short: Memory Bridge Status
+write.memory-drift-detected:
+  short: '  Drift:      detected (source is newer)'
+write.memory-drift-none:
+  short: '  Drift:      none'
+write.memory-last-sync:
+  short: '  Last sync:   %s (%s ago)'
+write.memory-last-sync-never:
+  short: '  Last sync:   never'
+write.memory-mirror:
+  short: '  Mirror:      %s'
+write.memory-mirror-lines:
+  short: '  Mirror:     %d lines'
+write.memory-mirror-not-synced:
+  short: '  Mirror:     not yet synced'
+write.memory-no-changes:
+  short: No changes since last sync.
+write.memory-source:
+  short: '  Source:      %s'
+write.memory-source-lines:
+  short: '  MEMORY.md:  %d lines'
+write.memory-source-lines-drift:
+  short: '  MEMORY.md:  %d lines (modified since last sync)'
+write.memory-source-not-active:
+  short: '  Source: auto memory not active (MEMORY.md not found)'
+write.mirror:
+  short: '  Mirror: %s'
+write.moving-task:
+  short: '✓ Moving completed task: %s'
+write.new-content:
+  short: '  New content: %d lines since last sync'
+write.obsidian-generated:
+  short: ✓ Generated Obsidian vault with %d entries in %s
+write.obsidian-next-steps:
+  short: '  Open Obsidian → Open folder as vault → Select %s'
+write.pad-blob-written:
+  short: Wrote %d bytes to %s
+write.pad-empty:
+  short: Scratchpad is empty.
+write.pad-entry-added:
+  short: Added entry %d.
+write.pad-entry-moved:
+  short: Moved entry %d to %d.
+write.pad-entry-removed:
+  short: Removed entry %d.
+write.pad-entry-updated:
+  short: Updated entry %d.
+write.pad-export-done:
+  short: '  + %s'
+write.pad-export-none:
+  short: No blob entries to export.
+write.pad-export-plan:
+  short: '  %s → %s'
+write.pad-export-summary:
+  short: '%s %d blobs.'
+write.pad-export-verb-done:
+  short: Exported
+write.pad-export-verb-dry-run:
+  short: Would export
+write.pad-export-write-failed:
+  short: '  ! failed to write %s: %v'
+write.pad-import-blob-added:
+  short: '  + %s'
+write.pad-import-blob-none:
+  short: No files to import.
+write.pad-import-blob-skipped:
+  short: '  ! skipped: %s (%v)'
+write.pad-import-blob-summary:
+  short: Done. Added %d, skipped %d.
+write.pad-import-blob-too-large:
+  short: '  ! skipped: %s (exceeds %d byte limit)'
+write.pad-import-close-warning:
+  short: 'warning: close %s: %v'
+write.pad-import-done:
+  short: Imported %d entries.
+write.pad-import-none:
+  short: No entries to import.
+write.pad-key-created:
+  short: Scratchpad key created at %s
+write.pad-merge-added:
+  short: '  + %-40s (from %s)'
+write.pad-merge-binary-warning:
+  short: '  ! %s appears to contain binary data; it may be encrypted (use --key)'
+write.pad-merge-blob-conflict:
+  short: '  ! blob %q has different content across sources; both kept'
+write.pad-merge-done:
+  short: Merged %d new %s (%d %s skipped).
+write.pad-merge-dry-run:
+  short: Would merge %d new %s (%d %s skipped).
+write.pad-merge-dupe:
+  short: '  = %-40s (duplicate, skipped)'
+write.pad-merge-none:
+  short: No entries to merge.
+write.pad-merge-none-new:
+  short: No new entries to merge (%d %s skipped).
+write.pad-resolve-entry:
+  short: '  %d. %s'
+write.pad-resolve-header:
+  short: === %s ===
+write.path-exists:
+  short: '  %s -> %s (exists)'
+write.paused:
+  short: Context hooks paused for session %s
+write.prefix-error:
+  short: 'Error: '
+write.prompt-created:
+  short: Created prompt %q.
+write.prompt-item:
+  short: '  %s'
+write.prompt-none:
+  short: No prompts found. Run 'ctx init' or 'ctx prompt add' to create prompts.
+write.prompt-removed:
+  short: Removed prompt %q.
+write.publish-block:
+  short: '  Published block:'
+write.publish-budget:
+  short: '  Budget: %d lines'
+write.publish-conventions:
+  short: '    %d key conventions (from CONVENTIONS.md)'
+write.publish-decisions:
+  short: '    %d recent decisions (from DECISIONS.md)'
+write.publish-done:
+  short: 'Published to MEMORY.md (markers:  ... )'
+write.publish-dry-run:
+  short: Dry run — no files written.
+write.publish-header:
+  short: Publishing .context/ -> MEMORY.md...
+write.publish-learnings:
+  short: '    %d recent learnings (from LEARNINGS.md)'
+write.publish-source-files:
+  short: '  Source files: TASKS.md, DECISIONS.md, CONVENTIONS.md, LEARNINGS.md'
+write.publish-tasks:
+  short: '    %d pending tasks (from TASKS.md)'
+write.publish-total:
+  short: '  Total: %d lines (within %d-line budget)'
+write.reminder-added:
+  short: '  + [%d] %s%s'
+write.reminder-after-suffix:
+  short: '  (after %s)'
+write.reminder-dismissed:
+  short: '  - [%d] %s'
+write.reminder-dismissed-all:
+  short: Dismissed %d reminders.
+write.reminder-item:
+  short: '  [%d] %s%s'
+write.reminder-none:
+  short: No reminders.
+write.reminder-not-due:
+  short: '  (after %s, not yet due)'
+write.restore-added:
+  short: '  + %s'
+write.restore-deny-dropped-header:
+  short: 'Dropped %d session deny rule(s):'
+write.restore-deny-restored-header:
+  short: 'Restored %d deny rule(s):'
+write.restore-done:
+  short: Restored from golden image.
+write.restore-dropped-header:
+  short: 'Dropped %d session allow permission(s):'
+write.restore-match:
+  short: Settings already match golden image.
+write.restore-no-local:
+  short: Restored golden image (no local settings existed).
+write.restore-perm-match:
+  short: Permission lists match; other settings differ.
+write.restore-removed:
+  short: '  - %s'
+write.restore-restored-header:
+  short: 'Restored %d allow permission(s):'
+write.resumed:
+  short: Context hooks resumed for session %s
+write.setup-done:
+  short: |-
+    Webhook configured: %s
+    Encrypted at: %s
+write.setup-prompt:
+  short: 'Enter webhook URL: '
+write.skill-line:
+  short: '  /%-22s %s'
+write.skills-header:
+  short: 'Available Skills:'
+write.snapshot-saved:
+  short: 'Saved golden image: %s'
+write.snapshot-updated:
+  short: 'Updated golden image: %s'
+write.source:
+  short: '  Source: %s'
+write.status-activity-header:
+  short: 'Recent Activity:'
+write.status-activity-item:
+  short: '  - %s modified %s'
+write.status-dir:
+  short: 'Context Directory: %s'
+write.status-drift:
+  short: '  Status: drift detected (source is newer)'
+write.status-file-compact:
+  short: '  %s %s (%s)'
+write.status-file-verbose:
+  short: '  %s %s (%s) [%s tokens, %s]'
+write.status-files:
+  short: 'Total Files: %d'
+write.status-files-header:
+  short: 'Files:'
+write.status-no-drift:
+  short: '  Status: no drift'
+write.status-preview-line:
+  short: '      (%s)'
+write.status-separator:
+  short: ====================
+write.status-title:
+  short: Context Status
+write.status-tokens:
+  short: 'Token Estimate: %s tokens'
+write.sync-action:
+  short: '%d. [%s] %s'
+write.sync-dry-run:
+  short: DRY RUN — No changes will be made
+write.sync-dry-run-summary:
+  short: Found %d items to sync. Run without --dry-run to apply suggestions.
+write.sync-header:
+  short: Sync Analysis
+write.sync-in-sync:
+  short: ✓ Context is in sync with codebase
+write.sync-separator:
+  short: =============
+write.sync-suggestion:
+  short: '   Suggestion: %s'
+write.sync-summary:
+  short: Found %d items. Review and update context files manually.
+write.synced:
+  short: Synced %s -> %s
+write.test-filtered:
+  short: |-
+    Note: event "test" is filtered by your .ctxrc notify.events config.
+    Sending anyway for testing purposes.
+write.test-no-webhook:
+  short: 'No webhook configured. Run: ctx notify setup'
+write.test-result:
+  short: 'Webhook responded: HTTP %d %s'
+write.test-working:
+  short: Webhook is working %s
+write.time-day-ago:
+  short: 1 day ago
+write.time-days-ago:
+  short: '%d days ago'
+write.time-hour-ago:
+  short: 1 hour ago
+write.time-hours-ago:
+  short: '%d hours ago'
+write.time-just-now:
+  short: just now
+write.time-minute-ago:
+  short: 1 minute ago
+write.time-minutes-ago:
+  short: '%d minutes ago'
+write.unpublish-done:
+  short: Removed published block from %s.
+write.unpublish-not-found:
+  short: No published block found in %s.
diff --git a/internal/assets/context/AGENT_PLAYBOOK.md b/internal/assets/context/AGENT_PLAYBOOK.md
index f0d78512..1e4ae64a 100644
--- a/internal/assets/context/AGENT_PLAYBOOK.md
+++ b/internal/assets/context/AGENT_PLAYBOOK.md
@@ -5,7 +5,7 @@
 Each session is a fresh execution in a shared workshop. Work
 continuity comes from artifacts left on the bench. Follow the
 cycle: **Work → Reflect → Persist**. After completing a task,
-making a decision, learning something, or hitting a milestone —
+making a decision, learning something, or hitting a milestone:
 persist before continuing. Don't wait for session end; it may
 never come cleanly.
 
@@ -16,7 +16,6 @@ Always use `ctx` from PATH:
 ctx status        # ✓ correct
 ctx agent         # ✓ correct
 ./dist/ctx        # ✗ avoid hardcoded paths
-go run ./cmd/ctx  # ✗ avoid unless developing ctx itself
 ```
 
 Check with `which ctx` if unsure whether it's installed.
@@ -36,7 +35,7 @@ Before implementing any non-trivial change, think through it step-by-step:
 3. **Anticipate failure**: what could go wrong? What are the edge cases?
 4. **Sequence**: what order minimizes risk and maximizes checkpoints?
 
-This applies to debugging too — reason through the cause before reaching
+This applies to debugging too: reason through the cause before reaching
 for a fix. Rushing to code before reasoning is the most common source of
 wasted work.
 
@@ -46,8 +45,8 @@ A session follows this arc:
 
 **Load → Orient → Pick → Work → Commit → Reflect**
 
-Not every session uses every step — a quick bugfix skips reflection, a
-research session skips committing — but the full flow is:
+Not every session uses every step (a quick bugfix skips reflection, a
+research session skips committing), but the full flow is:
 
 | Step        | What Happens                                       | Skill / Command  |
 |-------------|----------------------------------------------------|------------------|
@@ -70,42 +69,42 @@ Surface problems worth mentioning:
 - **Drift between files and code**: spot-check paths from
   ARCHITECTURE.md against the actual file tree
 
-One sentence is enough — don't turn startup into a maintenance session.
+One sentence is enough: don't turn startup into a maintenance session.
 
 ### Conversational Triggers
 
 Users rarely invoke skills explicitly. Recognize natural language:
 
-| User Says | Action |
-|-----------|--------|
-| "Do you remember?" / "What were we working on?" | `/ctx-remember` |
-| "How's our context looking?" | `/ctx-status` |
-| "What should we work on?" | `/ctx-next` |
-| "Commit this" / "Ship it" | `/ctx-commit` |
-| "The rate limiter is done" / "We finished that" | `ctx complete` (match to TASKS.md) |
-| "What did we learn?" | `/ctx-reflect` |
-| "Save that as a decision" | `/ctx-add-decision` |
-| "That's worth remembering" / "Any gotchas?" | `/ctx-add-learning` |
-| "Record that convention" | `/ctx-add-convention` |
-| "Add a task for that" | `/ctx-add-task` |
-| "Let's wrap up" | Reflect → persist outstanding items → present together |
+| User Says                                       | Action                                                 |
+|-------------------------------------------------|--------------------------------------------------------|
+| "Do you remember?" / "What were we working on?" | `/ctx-remember`                                        |
+| "How's our context looking?"                    | `/ctx-status`                                          |
+| "What should we work on?"                       | `/ctx-next`                                            |
+| "Commit this" / "Ship it"                       | `/ctx-commit`                                          |
+| "The rate limiter is done" / "We finished that" | `ctx tasks complete` (match to TASKS.md)               |
+| "What did we learn?"                            | `/ctx-reflect`                                         |
+| "Save that as a decision"                       | `/ctx-add-decision`                                    |
+| "That's worth remembering" / "Any gotchas?"     | `/ctx-add-learning`                                    |
+| "Record that convention"                        | `/ctx-add-convention`                                  |
+| "Add a task for that"                           | `/ctx-add-task`                                        |
+| "Let's wrap up"                                 | Reflect → persist outstanding items → present together |
 
 ## Proactive Persistence
 
 **Don't wait to be asked.** Identify persist-worthy moments in real time:
 
-| Event | Action |
-|-------|--------|
-| Completed a task | Mark done in TASKS.md, offer to add learnings |
-| Chose between design alternatives | Offer: *"Worth recording as a decision?"* |
-| Hit a subtle bug or gotcha | Offer: *"Want me to add this as a learning?"* |
-| Finished a feature or fix | Identify follow-up work, offer to add as tasks |
-| Resolved a tricky debugging session | Capture root cause before moving on |
-| Multi-step task or feature complete | Suggest reflection: *"Want me to capture what we learned?"* |
-| Session winding down | Offer: *"Want me to capture outstanding learnings or decisions?"* |
-| Shipped a feature or closed batch of tasks | Offer blog post or journal site rebuild |
-
-**Self-check**: periodically ask yourself — *"If this session ended
+| Event                                      | Action                                                            |
+|--------------------------------------------|-------------------------------------------------------------------|
+| Completed a task                           | Mark done in TASKS.md, offer to add learnings                     |
+| Chose between design alternatives          | Offer: *"Worth recording as a decision?"*                         |
+| Hit a subtle bug or gotcha                 | Offer: *"Want me to add this as a learning?"*                     |
+| Finished a feature or fix                  | Identify follow-up work, offer to add as tasks                    |
+| Resolved a tricky debugging session        | Capture root cause before moving on                               |
+| Multi-step task or feature complete        | Suggest reflection: *"Want me to capture what we learned?"*       |
+| Session winding down                       | Offer: *"Want me to capture outstanding learnings or decisions?"* |
+| Shipped a feature or closed batch of tasks | Offer blog post or journal site rebuild                           |
+
+**Self-check**: periodically ask yourself: *"If this session ended
 right now, would the next session know what happened?"* If no, persist
 something before continuing.
 
@@ -135,12 +134,12 @@ user. These apply unless the user overrides them for the session
 (e.g., "skip the alternatives, just build it").
 
 - **At design decisions**: always present 2+ approaches with
-  trade-offs before committing — don't silently pick one
+  trade-offs before committing: don't silently pick one
 - **At completion claims**: run self-audit questions (What did I
   assume? What didn't I check? Where am I least confident? What
   would a reviewer question?) before reporting done
 - **At ambiguous moments**: ask the user rather than inferring
-  intent — a quick question is cheaper than rework
+  intent: a quick question is cheaper than rework
 - **When producing artifacts**: flag assumptions and uncertainty
   areas inline, not buried in a footnote
 
@@ -149,23 +148,23 @@ and respect "no."
 
 ## Own the Whole Branch
 
-When working on a branch, you own every issue on it — lint failures, test
-failures, build errors — regardless of who introduced them. Never dismiss
+When working on a branch, you own every issue on it: lint failures, test
+failures, build errors, regardless of who introduced them. Never dismiss
 a problem as "pre-existing" or "not related to my changes."
 
 - **If `make lint` fails, fix it.** The branch must be green when you're done.
 - **If tests break, investigate.** Even if the failing test is in a file you
-  didn't touch, something you changed may have caused it — or it may have been
+  didn't touch, something you changed may have caused it, or it may have been
   broken before and it's still your job to fix it on this branch.
-- **Run the full validation suite** (`make lint`, `go test ./...`, `go build`)
-  before declaring any phase complete.
+- **Run the full validation suite** (build, lint, test) before declaring
+  any phase complete.
 
 ## How to Avoid Hallucinating Memory
 
 Never assume. If you don't see it in files, you don't know it.
 
 - Don't claim "we discussed X" without file evidence
-- Don't invent history — check context files and `ctx recall`
+- Don't invent history: check context files and `ctx recall`
 - If uncertain, say "I don't see this documented"
 - Trust files over intuition
 
@@ -173,20 +172,20 @@ Never assume. If you don't see it in files, you don't know it.
 
 Before implementing a feature or multi-task effort, follow this sequence:
 
-**1. Spec first** — Write a design document in `specs/` covering: problem,
+**1. Spec first**: Write a design document in `specs/` covering: problem,
 solution, storage, CLI surface, error cases, and non-goals. Keep it concise
 but complete enough that another session could implement from it alone.
 
-**2. Task it out** — Break the work into individual tasks in TASKS.md under
+**2. Task it out**: Break the work into individual tasks in TASKS.md under
 a dedicated Phase section. Each task should be independently completable and
 verifiable.
 
-**3. Cross-reference** — The Phase header in TASKS.md must reference the
+**3. Cross-reference**: The Phase header in TASKS.md must reference the
 spec: `Spec: \`specs/feature-name.md\``. The first task in the phase should
 include: "Read `specs/feature-name.md` before starting any PX task."
 
-**4. Read before building** — When picking up a task that references a spec,
-read the spec first. Don't rely on the task description alone — it's a
+**4. Read before building**: When picking up a task that references a spec,
+read the spec first. Don't rely on the task description alone: it's a
 summary, not the full design.
 
 ## When to Consolidate vs Add Features
@@ -201,13 +200,13 @@ When in doubt, ask: "Would a new contributor understand where this belongs?"
 
 ## Pre-Flight Checklist: CLI Code
 
-Before writing or modifying CLI code (`internal/cli/**/*.go`):
+Before writing or modifying CLI code:
 
-1. **Read CONVENTIONS.md** — load established patterns into context
-2. **Check similar commands** — how do existing commands handle output?
-3. **Use cmd methods for output** — `cmd.Printf`, `cmd.Println`,
+1. **Read CONVENTIONS.md**: load established patterns into context
+2. **Check similar commands**: how do existing commands handle output?
+3. **Use cmd methods for output**: `cmd.Printf`, `cmd.Println`,
    not `fmt.Printf`, `fmt.Println`
-4. **Follow docstring format** — see CONVENTIONS.md, Documentation section
+4. **Follow docstring format**: see CONVENTIONS.md, Documentation section
 
 ---
 
@@ -224,21 +223,21 @@ completing work, not as a separate task. Run `ctx drift` periodically.
 
 ### Context Sprawl
 
-Information scattered across multiple locations — same decision in
+Information scattered across multiple locations: same decision in
 DECISIONS.md and a session file, conventions split between
 CONVENTIONS.md and code comments. **Solution**: Single source of
 truth for each type of information. Use the defined file structure.
 
 ### Implicit Context
 
-Relying on knowledge not captured in artifacts — "everyone knows we
+Relying on knowledge not captured in artifacts: "everyone knows we
 don't do X" but it's not in CONSTITUTION.md, patterns followed but
 not in CONVENTIONS.md. **Solution**: If you reference something
 repeatedly, add it to the appropriate file.
 
 ### Over-Specification
 
-Context becomes so detailed it's impossible to maintain — 50+ rules
+Context becomes so detailed it's impossible to maintain: 50+ rules
 in CONVENTIONS.md, every minor choice gets a DECISIONS.md entry.
 **Solution**: Keep artifacts focused on decisions that affect behavior
 and alignment. Not everything needs documenting.
diff --git a/internal/assets/context/CONSTITUTION.md b/internal/assets/context/CONSTITUTION.md
index f4236e36..710bfa5d 100644
--- a/internal/assets/context/CONSTITUTION.md
+++ b/internal/assets/context/CONSTITUTION.md
@@ -23,7 +23,8 @@ These rules are INVIOLABLE. If a task requires violating these, the task is wron
 
 - [ ] All code must pass tests before commit
 - [ ] No TODO comments in main branch (move to TASKS.md)
-- [ ] Path construction uses stdlib — no string concatenation (security: prevents path traversal)
+- [ ] Path construction uses stdlib, never string concatenation
+  (security: prevents path traversal)
 
 ## Process Invariants
 
@@ -31,17 +32,20 @@ These rules are INVIOLABLE. If a task requires violating these, the task is wron
 
 ## TASKS.md Structure Invariants
 
-TASKS.md must remain a replayable checklist. Uncheck all items and re-run = verify/redo all tasks in order.
+TASKS.md must remain a replayable checklist: unchecking all items and
+re-running must verify or redo every task in order.
 
-- [ ] **Never move tasks** — tasks stay in their Phase section permanently
-- [ ] **Never remove Phase headers** — Phase labels provide structure and order
-- [ ] **Never merge or collapse Phase sections** — each phase is a logical unit
-- [ ] **Never delete tasks** — mark as `[x]` completed, or `[-]` skipped with reason
-- [ ] **Use inline labels for status** — add `#in-progress` to task text, don't move it
-- [ ] **No "In Progress" / "Next Up" sections** — these encourage moving tasks
-- [ ] **Ask before restructuring** — if structure changes seem needed, ask the user first
+- [ ] **Never move tasks**: tasks stay in their Phase section permanently
+- [ ] **Never remove Phase headers**: Phase labels provide structure and order
+- [ ] **Never merge or collapse Phase sections**: each phase is a logical unit
+- [ ] **Never delete tasks**: mark as `[x]` completed, or `[-]` skipped with reason
+- [ ] **Use inline labels for status**: add `#in-progress` to task text, don't move it
+- [ ] **No "In Progress" / "Next Up" sections**: these encourage moving tasks
+- [ ] **Ask before restructuring**: if structure changes seem needed, ask the user first
 
 ## Context Preservation Invariants
 
-- [ ] **Archival is allowed, deletion is not** — use `ctx tasks archive` to move completed tasks to `.context/archive/`, never delete context history
-- [ ] **Archive preserves structure** — archived tasks keep their Phase headers for traceability
+- [ ] **Archival is allowed, deletion is not**: use `ctx tasks archive` to move
+  completed tasks to `.context/archive/`, never delete context history
+- [ ] **Archive preserves structure**: archived tasks keep their Phase headers
+  for traceability
diff --git a/internal/assets/context/PROMPT.md b/internal/assets/context/PROMPT.md
index 5ca47994..1bf7d063 100644
--- a/internal/assets/context/PROMPT.md
+++ b/internal/assets/context/PROMPT.md
@@ -11,13 +11,13 @@
 
 ## Context Files
 
-| File | Purpose |
-|------|---------|
-| `.context/CONSTITUTION.md` | Hard rules — NEVER violate |
-| `.context/TASKS.md` | Current work items |
-| `.context/DECISIONS.md` | Architectural decisions with rationale |
-| `.context/LEARNINGS.md` | Gotchas and lessons learned |
-| `.context/CONVENTIONS.md` | Code patterns and standards |
+| File                         | Purpose                                  |
+|------------------------------|------------------------------------------|
+| `.context/CONSTITUTION.md`   | Hard rules: NEVER violate                |
+| `.context/TASKS.md`          | Current work items                       |
+| `.context/DECISIONS.md`      | Architectural decisions with rationale   |
+| `.context/LEARNINGS.md`      | Gotchas and lessons learned              |
+| `.context/CONVENTIONS.md`    | Code patterns and standards              |
 | `.context/AGENT_PLAYBOOK.md` | How to persist context, session patterns |
 
 ## Working Style
@@ -31,13 +31,13 @@
 
 After completing meaningful work, capture what matters:
 
-| Trigger | Action |
-|---------|--------|
-| Completed a task | Mark done in TASKS.md, add learnings if any |
-| Made a decision | `ctx add decision "..."` |
-| Discovered a gotcha | `ctx add learning "..."` |
-| Significant code changes | Consider what's worth capturing |
+| Trigger                  | Action                                      |
+|--------------------------|---------------------------------------------|
+| Completed a task         | Mark done in TASKS.md, add learnings if any |
+| Made a decision          | `ctx add decision "..."`                    |
+| Discovered a gotcha      | `ctx add learning "..."`                    |
+| Significant code changes | Consider what's worth capturing             |
 
-Don't wait for the session to end — it may never come cleanly.
+Don't wait for the session to end: it may never come cleanly.
 
 
diff --git a/internal/assets/context/TASKS.md b/internal/assets/context/TASKS.md
index 8cf07817..b5ceabeb 100644
--- a/internal/assets/context/TASKS.md
+++ b/internal/assets/context/TASKS.md
@@ -13,16 +13,16 @@ DO NOT UPDATE FOR:
 - Removing completed tasks (use ctx tasks archive instead)
 
 STRUCTURE RULES (see CONSTITUTION.md):
-- Tasks stay in their Phase section permanently — never move them
+- Tasks stay in their Phase section permanently: never move them
 - Use inline labels: #in-progress, #blocked, #priority:high
 - Mark completed: [x], skipped: [-] (with reason)
 - Never delete tasks, never remove Phase headers
 
 TASK STATUS LABELS:
-  `[ ]` — pending
-  `[x]` — completed
-  `[-]` — skipped (with reason)
-  `#in-progress` — currently being worked on (add inline, don't move task)
+  `[ ]`: pending
+  `[x]`: completed
+  `[-]`: skipped (with reason)
+  `#in-progress`: currently being worked on (add inline, don't move task)
 -->
 
 ### Phase 1: [Name] `#priority:high`
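
A hypothetical phase applying the labels above (task names are illustrative only):

```markdown
### Phase 2: Rate Limiter `#priority:high`

- [x] Read `specs/rate-limiter.md` before starting any P2 task
- [ ] Implement token bucket core `#in-progress`
- [-] Add Redis backend (skipped: single-node deployment only)
```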
diff --git a/internal/assets/embed.go b/internal/assets/embed.go
index e1c1825c..1b62c109 100644
--- a/internal/assets/embed.go
+++ b/internal/assets/embed.go
@@ -15,13 +15,854 @@ import (
 	"strings"
 	"sync"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"gopkg.in/yaml.v3"
 )
 
-//go:embed claude/.claude-plugin/plugin.json claude/CLAUDE.md claude/skills/*/references/*.md claude/skills/*/SKILL.md context/*.md project/* entry-templates/*.md hooks/messages/*/*.txt hooks/messages/registry.yaml prompt-templates/*.md ralph/*.md schema/*.json why/*.md permissions/*.txt commands/*.yaml
+//go:embed claude/.claude-plugin/plugin.json claude/CLAUDE.md claude/skills/*/references/*.md claude/skills/*/SKILL.md context/*.md project/* entry-templates/*.md hooks/*.md hooks/messages/*/*.txt hooks/messages/registry.yaml prompt-templates/*.md ralph/*.md schema/*.json why/*.md permissions/*.txt commands/*.yaml journal/*.css
 var FS embed.FS
 
+const (
+	FlagDescKeyAddApplication              = "add.application"
+	FlagDescKeyAddConsequences             = "add.consequences"
+	FlagDescKeyAddContext                  = "add.context"
+	FlagDescKeyAddFile                     = "add.file"
+	FlagDescKeyAddLesson                   = "add.lesson"
+	FlagDescKeyAddPriority                 = "add.priority"
+	FlagDescKeyAddRationale                = "add.rationale"
+	FlagDescKeyAddSection                  = "add.section"
+	FlagDescKeyAgentBudget                 = "agent.budget"
+	FlagDescKeyAgentCooldown               = "agent.cooldown"
+	FlagDescKeyAgentFormat                 = "agent.format"
+	FlagDescKeyAgentSession                = "agent.session"
+	FlagDescKeyChangesSince                = "changes.since"
+	FlagDescKeyCompactArchive              = "compact.archive"
+	FlagDescKeyDepsExternal                = "deps.external"
+	FlagDescKeyDepsFormat                  = "deps.format"
+	FlagDescKeyDepsType                    = "deps.type"
+	FlagDescKeyDoctorJson                  = "doctor.json"
+	FlagDescKeyDriftFix                    = "drift.fix"
+	FlagDescKeyDriftJson                   = "drift.json"
+	FlagDescKeyGuideCommands               = "guide.commands"
+	FlagDescKeyGuideSkills                 = "guide.skills"
+	FlagDescKeyHookWrite                   = "hook.write"
+	FlagDescKeyInitializeForce             = "initialize.force"
+	FlagDescKeyInitializeMerge             = "initialize.merge"
+	FlagDescKeyInitializeMinimal           = "initialize.minimal"
+	FlagDescKeyInitializeNoPluginEnable    = "initialize.no-plugin-enable"
+	FlagDescKeyInitializeRalph             = "initialize.ralph"
+	FlagDescKeyJournalObsidianOutput       = "journal.obsidian.output"
+	FlagDescKeyJournalSiteBuild            = "journal.site.build"
+	FlagDescKeyJournalSiteOutput           = "journal.site.output"
+	FlagDescKeyJournalSiteServe            = "journal.site.serve"
+	FlagDescKeyLoadBudget                  = "load.budget"
+	FlagDescKeyLoadRaw                     = "load.raw"
+	FlagDescKeyLoopCompletion              = "loop.completion"
+	FlagDescKeyLoopMaxIterations           = "loop.max-iterations"
+	FlagDescKeyLoopOutput                  = "loop.output"
+	FlagDescKeyLoopPrompt                  = "loop.prompt"
+	FlagDescKeyLoopTool                    = "loop.tool"
+	FlagDescKeyMemoryImportDryRun          = "memory.import.dry-run"
+	FlagDescKeyMemoryPublishBudget         = "memory.publish.budget"
+	FlagDescKeyMemoryPublishDryRun         = "memory.publish.dry-run"
+	FlagDescKeyMemorySyncDryRun            = "memory.sync.dry-run"
+	FlagDescKeyNotifyEvent                 = "notify.event"
+	FlagDescKeyNotifyHook                  = "notify.hook"
+	FlagDescKeyNotifySessionId             = "notify.session-id"
+	FlagDescKeyNotifyVariant               = "notify.variant"
+	FlagDescKeyPadAddFile                  = "pad.add.file"
+	FlagDescKeyPadEditAppend               = "pad.edit.append"
+	FlagDescKeyPadEditFile                 = "pad.edit.file"
+	FlagDescKeyPadEditLabel                = "pad.edit.label"
+	FlagDescKeyPadEditPrepend              = "pad.edit.prepend"
+	FlagDescKeyPadExportDryRun             = "pad.export.dry-run"
+	FlagDescKeyPadExportForce              = "pad.export.force"
+	FlagDescKeyPadImpBlobs                 = "pad.imp.blobs"
+	FlagDescKeyPadMergeDryRun              = "pad.merge.dry-run"
+	FlagDescKeyPadMergeKey                 = "pad.merge.key"
+	FlagDescKeyPadShowOut                  = "pad.show.out"
+	FlagDescKeyPauseSessionId              = "pause.session-id"
+	FlagDescKeyPromptAddStdin              = "prompt.add.stdin"
+	FlagDescKeyRecallExportAll             = "recall.export.all"
+	FlagDescKeyRecallExportAllProjects     = "recall.export.all-projects"
+	FlagDescKeyRecallExportDryRun          = "recall.export.dry-run"
+	FlagDescKeyRecallExportKeepFrontmatter = "recall.export.keep-frontmatter"
+	FlagDescKeyRecallExportRegenerate      = "recall.export.regenerate"
+	FlagDescKeyRecallExportSkipExisting    = "recall.export.skip-existing"
+	FlagDescKeyRecallExportYes             = "recall.export.yes"
+	FlagDescKeyRecallListAllProjects       = "recall.list.all-projects"
+	FlagDescKeyRecallListLimit             = "recall.list.limit"
+	FlagDescKeyRecallListProject           = "recall.list.project"
+	FlagDescKeyRecallListSince             = "recall.list.since"
+	FlagDescKeyRecallListTool              = "recall.list.tool"
+	FlagDescKeyRecallListUntil             = "recall.list.until"
+	FlagDescKeyRecallLockAll               = "recall.lock.all"
+	FlagDescKeyRecallShowAllProjects       = "recall.show.all-projects"
+	FlagDescKeyRecallShowFull              = "recall.show.full"
+	FlagDescKeyRecallShowLatest            = "recall.show.latest"
+	FlagDescKeyRecallUnlockAll             = "recall.unlock.all"
+	FlagDescKeyRemindAddAfter              = "remind.add.after"
+	FlagDescKeyRemindAfter                 = "remind.after"
+	FlagDescKeyRemindDismissAll            = "remind.dismiss.all"
+	FlagDescKeyResumeSessionId             = "resume.session-id"
+	FlagDescKeySiteFeedBaseUrl             = "site.feed.base-url"
+	FlagDescKeySiteFeedOut                 = "site.feed.out"
+	FlagDescKeyStatusJson                  = "status.json"
+	FlagDescKeyStatusVerbose               = "status.verbose"
+	FlagDescKeySyncDryRun                  = "sync.dry-run"
+	FlagDescKeySystemBackupJson            = "system.backup.json"
+	FlagDescKeySystemBackupScope           = "system.backup.scope"
+	FlagDescKeySystemBootstrapJson         = "system.bootstrap.json"
+	FlagDescKeySystemBootstrapQuiet        = "system.bootstrap.quiet"
+	FlagDescKeySystemEventsAll             = "system.events.all"
+	FlagDescKeySystemEventsEvent           = "system.events.event"
+	FlagDescKeySystemEventsHook            = "system.events.hook"
+	FlagDescKeySystemEventsJson            = "system.events.json"
+	FlagDescKeySystemEventsLast            = "system.events.last"
+	FlagDescKeySystemEventsSession         = "system.events.session"
+	FlagDescKeySystemMarkjournalCheck      = "system.markjournal.check"
+	FlagDescKeySystemMessageJson           = "system.message.json"
+	FlagDescKeySystemPauseSessionId        = "system.pause.session-id"
+	FlagDescKeySystemPruneDays             = "system.prune.days"
+	FlagDescKeySystemPruneDryRun           = "system.prune.dry-run"
+	FlagDescKeySystemResourcesJson         = "system.resources.json"
+	FlagDescKeySystemResumeSessionId       = "system.resume.session-id"
+	FlagDescKeySystemStatsFollow           = "system.stats.follow"
+	FlagDescKeySystemStatsJson             = "system.stats.json"
+	FlagDescKeySystemStatsLast             = "system.stats.last"
+	FlagDescKeySystemStatsSession          = "system.stats.session"
+	FlagDescKeyTaskArchiveDryRun           = "task.archive.dry-run"
+	FlagDescKeyWatchDryRun                 = "watch.dry-run"
+	FlagDescKeyWatchLog                    = "watch.log"
+)
+
+const (
+	CmdDescKeyAdd                          = "add"
+	CmdDescKeyAgent                        = "agent"
+	CmdDescKeyChanges                      = "changes"
+	CmdDescKeyCompact                      = "compact"
+	CmdDescKeyComplete                     = "complete"
+	CmdDescKeyConfig                       = "config"
+	CmdDescKeyConfigSchema                 = "config.schema"
+	CmdDescKeyConfigStatus                 = "config.status"
+	CmdDescKeyConfigSwitch                 = "config.switch"
+	CmdDescKeyCtx                          = "ctx"
+	CmdDescKeyDecision                     = "decision"
+	CmdDescKeyDecisionReindex              = "decision.reindex"
+	CmdDescKeyDeps                         = "deps"
+	CmdDescKeyDoctor                       = "doctor"
+	CmdDescKeyDrift                        = "drift"
+	CmdDescKeyGuide                        = "guide"
+	CmdDescKeyHook                         = "hook"
+	CmdDescKeyInitialize                   = "initialize"
+	CmdDescKeyJournal                      = "journal"
+	CmdDescKeyJournalObsidian              = "journal.obsidian"
+	CmdDescKeyJournalSite                  = "journal.site"
+	CmdDescKeyLearnings                    = "learnings"
+	CmdDescKeyLearningsReindex             = "learnings.reindex"
+	CmdDescKeyLoad                         = "load"
+	CmdDescKeyLoop                         = "loop"
+	CmdDescKeyMcp                          = "mcp"
+	CmdDescKeyMcpServe                     = "mcp.serve"
+	CmdDescKeyMemory                       = "memory"
+	CmdDescKeyMemoryDiff                   = "memory.diff"
+	CmdDescKeyMemoryImport                 = "memory.import"
+	CmdDescKeyMemoryPublish                = "memory.publish"
+	CmdDescKeyMemoryStatus                 = "memory.status"
+	CmdDescKeyMemorySync                   = "memory.sync"
+	CmdDescKeyMemoryUnpublish              = "memory.unpublish"
+	CmdDescKeyNotify                       = "notify"
+	CmdDescKeyNotifySetup                  = "notify.setup"
+	CmdDescKeyNotifyTest                   = "notify.test"
+	CmdDescKeyPad                          = "pad"
+	CmdDescKeyPadAdd                       = "pad.add"
+	CmdDescKeyPadEdit                      = "pad.edit"
+	CmdDescKeyPadExport                    = "pad.export"
+	CmdDescKeyPadImp                       = "pad.imp"
+	CmdDescKeyPadMerge                     = "pad.merge"
+	CmdDescKeyPadMv                        = "pad.mv"
+	CmdDescKeyPadResolve                   = "pad.resolve"
+	CmdDescKeyPadRm                        = "pad.rm"
+	CmdDescKeyPadShow                      = "pad.show"
+	CmdDescKeyPause                        = "pause"
+	CmdDescKeyPermissions                  = "permissions"
+	CmdDescKeyPermissionsRestore           = "permissions.restore"
+	CmdDescKeyPermissionsSnapshot          = "permissions.snapshot"
+	CmdDescKeyPrompt                       = "prompt"
+	CmdDescKeyPromptAdd                    = "prompt.add"
+	CmdDescKeyPromptList                   = "prompt.list"
+	CmdDescKeyPromptRm                     = "prompt.rm"
+	CmdDescKeyPromptShow                   = "prompt.show"
+	CmdDescKeyRecall                       = "recall"
+	CmdDescKeyRecallExport                 = "recall.export"
+	CmdDescKeyRecallList                   = "recall.list"
+	CmdDescKeyRecallLock                   = "recall.lock"
+	CmdDescKeyRecallShow                   = "recall.show"
+	CmdDescKeyRecallSync                   = "recall.sync"
+	CmdDescKeyRecallUnlock                 = "recall.unlock"
+	CmdDescKeyReindex                      = "reindex"
+	CmdDescKeyRemind                       = "remind"
+	CmdDescKeyRemindAdd                    = "remind.add"
+	CmdDescKeyRemindDismiss                = "remind.dismiss"
+	CmdDescKeyRemindList                   = "remind.list"
+	CmdDescKeyResume                       = "resume"
+	CmdDescKeyServe                        = "serve"
+	CmdDescKeySite                         = "site"
+	CmdDescKeySiteFeed                     = "site.feed"
+	CmdDescKeyStatus                       = "status"
+	CmdDescKeySync                         = "sync"
+	CmdDescKeySystem                       = "system"
+	CmdDescKeySystemBackup                 = "system.backup"
+	CmdDescKeySystemBlockDangerousCommands = "system.blockdangerouscommands"
+	CmdDescKeySystemBlockNonPathCtx        = "system.blocknonpathctx"
+	CmdDescKeySystemBootstrap              = "system.bootstrap"
+	CmdDescKeySystemCheckBackupAge         = "system.checkbackupage"
+	CmdDescKeySystemCheckCeremonies        = "system.checkceremonies"
+	CmdDescKeySystemCheckContextSize       = "system.checkcontextsize"
+	CmdDescKeySystemCheckJournal           = "system.checkjournal"
+	CmdDescKeySystemCheckKnowledge         = "system.checkknowledge"
+	CmdDescKeySystemCheckMapStaleness      = "system.checkmapstaleness"
+	CmdDescKeySystemCheckMemoryDrift       = "system.checkmemorydrift"
+	CmdDescKeySystemCheckPersistence       = "system.checkpersistence"
+	CmdDescKeySystemCheckReminders         = "system.checkreminders"
+	CmdDescKeySystemCheckResources         = "system.checkresources"
+	CmdDescKeySystemCheckTaskCompletion    = "system.checktaskcompletion"
+	CmdDescKeySystemCheckVersion           = "system.checkversion"
+	CmdDescKeySystemContextLoadGate        = "system.contextloadgate"
+	CmdDescKeySystemEvents                 = "system.events"
+	CmdDescKeySystemHeartbeat              = "system.heartbeat"
+	CmdDescKeySystemMarkJournal            = "system.markjournal"
+	CmdDescKeySystemMarkWrappedUp          = "system.markwrappedup"
+	CmdDescKeySystemMessage                = "system.message"
+	CmdDescKeySystemMessageEdit            = "system.message.edit"
+	CmdDescKeySystemMessageList            = "system.message.list"
+	CmdDescKeySystemMessageReset           = "system.message.reset"
+	CmdDescKeySystemMessageShow            = "system.message.show"
+	CmdDescKeySystemPause                  = "system.pause"
+	CmdDescKeySystemPostCommit             = "system.postcommit"
+	CmdDescKeySystemPrune                  = "system.prune"
+	CmdDescKeySystemQaReminder             = "system.qareminder"
+	CmdDescKeySystemResources              = "system.resources"
+	CmdDescKeySystemResume                 = "system.resume"
+	CmdDescKeySystemSpecsNudge             = "system.specsnudge"
+	CmdDescKeySystemStats                  = "system.stats"
+	CmdDescKeyTask                         = "task"
+	CmdDescKeyTaskArchive                  = "task.archive"
+	CmdDescKeyTaskSnapshot                 = "task.snapshot"
+	CmdDescKeyWatch                        = "watch"
+	CmdDescKeyWhy                          = "why"
+)
+
+const (
+	TextDescKeyAgentInstruction                 = "agent.instruction"
+	TextDescKeyBackupBoxTitle                   = "backup.box-title"
+	TextDescKeyBackupNoMarker                   = "backup.no-marker"
+	TextDescKeyBackupRelayMessage               = "backup.relay-message"
+	TextDescKeyBackupRelayPrefix                = "backup.relay-prefix"
+	TextDescKeyBackupRunHint                    = "backup.run-hint"
+	TextDescKeyBackupSMBNotMounted              = "backup.smb-not-mounted"
+	TextDescKeyBackupSMBUnavailable             = "backup.smb-unavailable"
+	TextDescKeyBackupStale                      = "backup.stale"
+	TextDescKeyBootstrapNextSteps               = "bootstrap.next-steps"
+	TextDescKeyBootstrapNone                    = "bootstrap.none"
+	TextDescKeyBootstrapPluginWarning           = "bootstrap.plugin-warning"
+	TextDescKeyBootstrapRules                   = "bootstrap.rules"
+	TextDescKeyContextLoadGateFileHeader        = "context-load-gate.file-header"
+	TextDescKeyContextLoadGateFooter            = "context-load-gate.footer"
+	TextDescKeyContextLoadGateHeader            = "context-load-gate.header"
+	TextDescKeyContextLoadGateIndexFallback     = "context-load-gate.index-fallback"
+	TextDescKeyContextLoadGateIndexHeader       = "context-load-gate.index-header"
+	TextDescKeyContextLoadGateOversizeAction    = "context-load-gate.oversize-action"
+	TextDescKeyContextLoadGateOversizeBreakdown = "context-load-gate.oversize-breakdown"
+	TextDescKeyContextLoadGateOversizeFileEntry = "context-load-gate.oversize-file-entry"
+	TextDescKeyContextLoadGateOversizeHeader    = "context-load-gate.oversize-header"
+	TextDescKeyContextLoadGateOversizeInjected  = "context-load-gate.oversize-injected"
+	TextDescKeyContextLoadGateOversizeTimestamp = "context-load-gate.oversize-timestamp"
+	TextDescKeyContextLoadGateWebhook           = "context-load-gate.webhook"
+
+	TextDescKeyHeartbeatLogTokens    = "heartbeat.log-tokens"
+	TextDescKeyHeartbeatLogPlain     = "heartbeat.log-plain"
+	TextDescKeyHeartbeatNotifyTokens = "heartbeat.notify-tokens"
+	TextDescKeyHeartbeatNotifyPlain  = "heartbeat.notify-plain"
+
+	TextDescKeyEventsEmpty       = "events.empty"
+	TextDescKeyEventsHumanFormat = "events.human-format"
+
+	TextDescKeyStatsEmpty        = "stats.empty"
+	TextDescKeyStatsHeaderFormat = "stats.header-format"
+	TextDescKeyStatsLineFormat   = "stats.line-format"
+
+	TextDescKeyCheckContextSizeBillingBoxTitle       = "check-context-size.billing-box-title"
+	TextDescKeyCheckContextSizeBillingFallback       = "check-context-size.billing-fallback"
+	TextDescKeyCheckContextSizeBillingRelayFormat    = "check-context-size.billing-relay-format"
+	TextDescKeyCheckContextSizeBillingRelayPrefix    = "check-context-size.billing-relay-prefix"
+	TextDescKeyCheckContextSizeCheckpointBoxTitle    = "check-context-size.checkpoint-box-title"
+	TextDescKeyCheckContextSizeCheckpointFallback    = "check-context-size.checkpoint-fallback"
+	TextDescKeyCheckContextSizeCheckpointRelayFormat = "check-context-size.checkpoint-relay-format"
+	TextDescKeyCheckContextSizeOversizeFallback      = "check-context-size.oversize-fallback"
+	TextDescKeyCheckContextSizeRelayPrefix           = "check-context-size.relay-prefix"
+	TextDescKeyCheckContextSizeRunningLowSuffix      = "check-context-size.running-low-suffix"
+	TextDescKeyCheckContextSizeSilentLogFormat       = "check-context-size.silent-log-format"
+	TextDescKeyCheckContextSizeSilencedCheckpointLog = "check-context-size.silenced-checkpoint-log"
+	TextDescKeyCheckContextSizeCheckpointLogFormat   = "check-context-size.checkpoint-log-format"
+	TextDescKeyCheckContextSizeSuppressedLogFormat   = "check-context-size.suppressed-log-format"
+	TextDescKeyCheckContextSizeSilencedWindowLog     = "check-context-size.silenced-window-log"
+	TextDescKeyCheckContextSizeWindowLogFormat       = "check-context-size.window-log-format"
+	TextDescKeyCheckContextSizeSilencedBillingLog    = "check-context-size.silenced-billing-log"
+	TextDescKeyCheckContextSizeBillingLogFormat      = "check-context-size.billing-log-format"
+	TextDescKeyCheckContextSizeTokenLow              = "check-context-size.token-low"
+	TextDescKeyCheckContextSizeTokenNormal           = "check-context-size.token-normal"
+	TextDescKeyCheckContextSizeTokenUsage            = "check-context-size.token-usage"
+	TextDescKeyCheckContextSizeWindowBoxTitle        = "check-context-size.window-box-title"
+	TextDescKeyCheckContextSizeWindowFallback        = "check-context-size.window-fallback"
+	TextDescKeyCheckContextSizeWindowRelayFormat     = "check-context-size.window-relay-format"
+	TextDescKeyCheckJournalBoxTitle                  = "check-journal.box-title"
+	TextDescKeyCheckJournalFallbackBoth              = "check-journal.fallback-both"
+	TextDescKeyCheckJournalFallbackUnenriched        = "check-journal.fallback-unenriched"
+	TextDescKeyCheckJournalFallbackUnexported        = "check-journal.fallback-unexported"
+	TextDescKeyCheckJournalRelayFormat               = "check-journal.relay-format"
+	TextDescKeyCheckJournalRelayPrefix               = "check-journal.relay-prefix"
+	TextDescKeyCheckKnowledgeBoxTitle                = "check-knowledge.box-title"
+	TextDescKeyCheckKnowledgeFallback                = "check-knowledge.fallback"
+	TextDescKeyCheckKnowledgeFindingFormat           = "check-knowledge.finding-format"
+	TextDescKeyCheckKnowledgeRelayMessage            = "check-knowledge.relay-message"
+	TextDescKeyCheckKnowledgeRelayPrefix             = "check-knowledge.relay-prefix"
+	TextDescKeyCheckPersistenceBoxTitle              = "check-persistence.box-title"
+	TextDescKeyCheckPersistenceBoxTitleFormat        = "check-persistence.box-title-format"
+	TextDescKeyCheckPersistenceCheckpointFormat      = "check-persistence.checkpoint-format"
+	TextDescKeyCheckPersistenceFallback              = "check-persistence.fallback"
+	TextDescKeyCheckPersistenceInitLogFormat         = "check-persistence.init-log-format"
+	TextDescKeyCheckPersistenceModifiedLogFormat     = "check-persistence.modified-log-format"
+	TextDescKeyCheckPersistenceRelayFormat           = "check-persistence.relay-format"
+	TextDescKeyCheckPersistenceRelayPrefix           = "check-persistence.relay-prefix"
+	TextDescKeyCheckPersistenceSilencedLogFormat     = "check-persistence.silenced-log-format"
+	TextDescKeyCheckPersistenceSilentLogFormat       = "check-persistence.silent-log-format"
+	TextDescKeyCheckPersistenceStateFormat           = "check-persistence.state-format"
+	TextDescKeyCheckVersionBoxTitle                  = "check-version.box-title"
+	TextDescKeyCheckVersionFallback                  = "check-version.fallback"
+	TextDescKeyCheckVersionKeyBoxTitle               = "check-version.key-box-title"
+	TextDescKeyCheckVersionKeyFallback               = "check-version.key-fallback"
+	TextDescKeyCheckVersionKeyRelayFormat            = "check-version.key-relay-format"
+	TextDescKeyCheckVersionKeyRelayPrefix            = "check-version.key-relay-prefix"
+	TextDescKeyCheckVersionMismatchRelayFormat       = "check-version.mismatch-relay-format"
+	TextDescKeyCheckVersionRelayPrefix               = "check-version.relay-prefix"
+	TextDescKeyCheckMapStalenessBoxTitle             = "check-map-staleness.box-title"
+	TextDescKeyCheckMapStalenessFallback             = "check-map-staleness.fallback"
+	TextDescKeyCheckMapStalenessRelayMessage         = "check-map-staleness.relay-message"
+	TextDescKeyCheckMapStalenessRelayPrefix          = "check-map-staleness.relay-prefix"
+	TextDescKeyCheckMemoryDriftBoxTitle              = "check-memory-drift.box-title"
+	TextDescKeyCheckMemoryDriftContent               = "check-memory-drift.content"
+	TextDescKeyCheckMemoryDriftRelayMessage          = "check-memory-drift.relay-message"
+	TextDescKeyCheckMemoryDriftRelayPrefix           = "check-memory-drift.relay-prefix"
+	TextDescKeyCeremonyBoxBoth                       = "ceremony.box-both"
+	TextDescKeyCeremonyBoxRemember                   = "ceremony.box-remember"
+	TextDescKeyCeremonyBoxWrapup                     = "ceremony.box-wrapup"
+	TextDescKeyCeremonyFallbackBoth                  = "ceremony.fallback-both"
+	TextDescKeyCeremonyFallbackRemember              = "ceremony.fallback-remember"
+	TextDescKeyCeremonyFallbackWrapup                = "ceremony.fallback-wrapup"
+	TextDescKeyCeremonyRelayMessage                  = "ceremony.relay-message"
+	TextDescKeyCeremonyRelayPrefix                   = "ceremony.relay-prefix"
+	TextDescKeyCheckRemindersBoxTitle                = "check-reminders.box-title"
+	TextDescKeyCheckRemindersDismissHint             = "check-reminders.dismiss-hint"
+	TextDescKeyCheckRemindersDismissAllHint          = "check-reminders.dismiss-all-hint"
+	TextDescKeyCheckRemindersItemFormat              = "check-reminders.item-format"
+	TextDescKeyCheckRemindersNudgeFormat             = "check-reminders.nudge-format"
+	TextDescKeyCheckRemindersRelayPrefix             = "check-reminders.relay-prefix"
+
+	TextDescKeyCheckResourcesBoxTitle        = "check-resources.box-title"
+	TextDescKeyCheckResourcesFallbackLow     = "check-resources.fallback-low"
+	TextDescKeyCheckResourcesFallbackPersist = "check-resources.fallback-persist"
+	TextDescKeyCheckResourcesFallbackEnd     = "check-resources.fallback-end"
+	TextDescKeyCheckResourcesRelayMessage    = "check-resources.relay-message"
+	TextDescKeyCheckResourcesRelayPrefix     = "check-resources.relay-prefix"
+
+	TextDescKeyCheckTaskCompletionFallback     = "check-task-completion.fallback"
+	TextDescKeyCheckTaskCompletionNudgeMessage = "check-task-completion.nudge-message"
+
+	TextDescKeyVersionDriftRelayMessage = "version-drift.relay-message"
+
+	TextDescKeyChangesFallbackLabel              = "changes.fallback-label"
+	TextDescKeyChangesSincePrefix                = "changes.since-prefix"
+	TextDescKeyDoctorContextFileFormat           = "doctor.context-file.format"
+	TextDescKeyDoctorContextInitializedError     = "doctor.context-initialized.error"
+	TextDescKeyDoctorContextInitializedOk        = "doctor.context-initialized.ok"
+	TextDescKeyDoctorContextSizeFormat           = "doctor.context-size.format"
+	TextDescKeyDoctorContextSizeWarningSuffix    = "doctor.context-size.warning-suffix"
+	TextDescKeyDoctorCtxrcValidationError        = "doctor.ctxrc-validation.error"
+	TextDescKeyDoctorCtxrcValidationOk           = "doctor.ctxrc-validation.ok"
+	TextDescKeyDoctorCtxrcValidationOkNoFile     = "doctor.ctxrc-validation.ok-no-file"
+	TextDescKeyDoctorCtxrcValidationWarning      = "doctor.ctxrc-validation.warning"
+	TextDescKeyDoctorDriftDetected               = "doctor.drift.detected"
+	TextDescKeyDoctorDriftOk                     = "doctor.drift.ok"
+	TextDescKeyDoctorDriftViolations             = "doctor.drift.violations"
+	TextDescKeyDoctorDriftWarningLoad            = "doctor.drift.warning-load"
+	TextDescKeyDoctorDriftWarnings               = "doctor.drift.warnings"
+	TextDescKeyDoctorEventLoggingInfo            = "doctor.event-logging.info"
+	TextDescKeyDoctorEventLoggingOk              = "doctor.event-logging.ok"
+	TextDescKeyDoctorOutputHeader                = "doctor.output.header"
+	TextDescKeyDoctorOutputResultLine            = "doctor.output.result-line"
+	TextDescKeyDoctorOutputSeparator             = "doctor.output.separator"
+	TextDescKeyDoctorOutputSummary               = "doctor.output.summary"
+	TextDescKeyDoctorPluginEnabledGlobalOk       = "doctor.plugin-enabled-global.ok"
+	TextDescKeyDoctorPluginEnabledLocalOk        = "doctor.plugin-enabled-local.ok"
+	TextDescKeyDoctorPluginEnabledWarning        = "doctor.plugin-enabled.warning"
+	TextDescKeyDoctorPluginInstalledInfo         = "doctor.plugin-installed.info"
+	TextDescKeyDoctorPluginInstalledOk           = "doctor.plugin-installed.ok"
+	TextDescKeyDoctorRecentEventsInfo            = "doctor.recent-events.info"
+	TextDescKeyDoctorRecentEventsOk              = "doctor.recent-events.ok"
+	TextDescKeyDoctorRemindersInfo               = "doctor.reminders.info"
+	TextDescKeyDoctorRemindersOk                 = "doctor.reminders.ok"
+	TextDescKeyDoctorRequiredFilesError          = "doctor.required-files.error"
+	TextDescKeyDoctorRequiredFilesOk             = "doctor.required-files.ok"
+	TextDescKeyDoctorResourceDiskFormat          = "doctor.resource-disk.format"
+	TextDescKeyDoctorResourceLoadFormat          = "doctor.resource-load.format"
+	TextDescKeyDoctorResourceMemoryFormat        = "doctor.resource-memory.format"
+	TextDescKeyDoctorResourceSwapFormat          = "doctor.resource-swap.format"
+	TextDescKeyDoctorTaskCompletionFormat        = "doctor.task-completion.format"
+	TextDescKeyDoctorTaskCompletionWarningSuffix = "doctor.task-completion.warning-suffix"
+	TextDescKeyDoctorWebhookInfo                 = "doctor.webhook.info"
+	TextDescKeyDoctorWebhookOk                   = "doctor.webhook.ok"
+	TextDescKeyHookAider                         = "hook.aider"
+	TextDescKeyImportCountConvention             = "import.count-convention"
+	TextDescKeyImportCountDecision               = "import.count-decision"
+	TextDescKeyImportCountLearning               = "import.count-learning"
+	TextDescKeyImportCountTask                   = "import.count-task"
+	TextDescKeyHookClaude                        = "hook.claude"
+	TextDescKeyHookCopilot                       = "hook.copilot"
+	TextDescKeyHookCursor                        = "hook.cursor"
+	TextDescKeyHookSupportedTools                = "hook.supported-tools"
+	TextDescKeyHookWindsurf                      = "hook.windsurf"
+	TextDescKeyTimeAgo                           = "time.ago"
+	TextDescKeyTimeDay                           = "time.day"
+	TextDescKeyTimeHour                          = "time.hour"
+	TextDescKeyTimeJustNow                       = "time.just-now"
+	TextDescKeyTimeMinute                        = "time.minute"
+
+	TextDescKeyConfirmProceed           = "confirm.proceed"
+	TextDescKeySyncDepsDescription      = "sync.deps.description"
+	TextDescKeySyncDepsSuggestion       = "sync.deps.suggestion"
+	TextDescKeySyncConfigDescription    = "sync.config.description"
+	TextDescKeySyncConfigSuggestion     = "sync.config.suggestion"
+	TextDescKeySyncDirDescription       = "sync.dir.description"
+	TextDescKeySyncDirSuggestion        = "sync.dir.suggestion"
+	TextDescKeyBlockNonPathRelayMessage = "block.non-path-relay-message"
+	TextDescKeyBlockConstitutionSuffix  = "block.constitution-suffix"
+	TextDescKeyBlockMidSudo             = "block.mid-sudo"
+	TextDescKeyBlockMidGitPush          = "block.mid-git-push"
+	TextDescKeyBlockCpToBin             = "block.cp-to-bin"
+	TextDescKeyBlockInstallToLocalBin   = "block.install-to-local-bin"
+	TextDescKeyBlockDotSlash            = "block.dot-slash"
+	TextDescKeyBlockGoRun               = "block.go-run"
+	TextDescKeyBlockAbsolutePath        = "block.absolute-path"
+	TextDescKeyPadKeyCreated            = "pad.key-created"
+	TextDescKeyParserGitNotFound        = "parser.git-not-found"
+	TextDescKeyParserSessionPrefix      = "parser.session_prefix"
+	TextDescKeyParserSessionPrefixAlt   = "parser.session_prefix_alt"
+	TextDescKeyPauseConfirmed           = "pause.confirmed"
+	TextDescKeyPostCommitFallback       = "post-commit.fallback"
+	TextDescKeyPostCommitRelayMessage   = "post-commit.relay-message"
+
+	TextDescKeyPruneDryRunLine    = "prune.dry-run-line"
+	TextDescKeyPruneDryRunSummary = "prune.dry-run-summary"
+	TextDescKeyPruneErrorLine     = "prune.error-line"
+	TextDescKeyPruneSummary       = "prune.summary"
+
+	TextDescKeyMarkJournalChecked     = "mark-journal.checked"
+	TextDescKeyMarkJournalMarked      = "mark-journal.marked"
+	TextDescKeyMarkWrappedUpConfirmed = "mark-wrapped-up.confirmed"
+
+	TextDescKeyMessageCtxSpecificWarning = "message.ctx-specific-warning"
+	TextDescKeyMessageEditHint           = "message.edit-hint"
+	TextDescKeyMessageListHeaderCategory = "message.list-header-category"
+	TextDescKeyMessageListHeaderHook     = "message.list-header-hook"
+	TextDescKeyMessageListHeaderOverride = "message.list-header-override"
+	TextDescKeyMessageListHeaderVariant  = "message.list-header-variant"
+	TextDescKeyMessageNoOverride         = "message.no-override"
+	TextDescKeyMessageOverrideCreated    = "message.override-created"
+	TextDescKeyMessageOverrideLabel      = "message.override-label"
+	TextDescKeyMessageOverrideRemoved    = "message.override-removed"
+	TextDescKeyMessageSourceDefault      = "message.source-default"
+	TextDescKeyMessageSourceOverride     = "message.source-override"
+	TextDescKeyMessageTemplateVarsLabel  = "message.template-vars-label"
+	TextDescKeyMessageTemplateVarsNone   = "message.template-vars-none"
+
+	TextDescKeySpecsNudgeFallback     = "specs-nudge.fallback"
+	TextDescKeySpecsNudgeNudgeMessage = "specs-nudge.nudge-message"
+
+	TextDescKeyQaReminderFallback     = "qa-reminder.fallback"
+	TextDescKeyQaReminderRelayMessage = "qa-reminder.relay-message"
+
+	TextDescKeyResourcesAlertDisk    = "resources.alert-disk"
+	TextDescKeyResourcesAlertLoad    = "resources.alert-load"
+	TextDescKeyResourcesAlertMemory  = "resources.alert-memory"
+	TextDescKeyResourcesAlertSwap    = "resources.alert-swap"
+	TextDescKeyResourcesAlertDanger  = "resources.alert-danger"
+	TextDescKeyResourcesAlertWarning = "resources.alert-warning"
+	TextDescKeyResourcesAlerts       = "resources.alerts"
+	TextDescKeyResourcesAllClear     = "resources.all-clear"
+	TextDescKeyResourcesHeader       = "resources.header"
+	TextDescKeyResourcesSeparator    = "resources.separator"
+	TextDescKeyResourcesStatusDanger = "resources.status-danger"
+	TextDescKeyResourcesStatusOk     = "resources.status-ok"
+	TextDescKeyResourcesStatusWarn   = "resources.status-warn"
+	TextDescKeyResumeConfirmed       = "resume.confirmed"
+
+	TextDescKeyRcParseWarning = "rc.parse_warning"
+
+	TextDescKeySummaryActive     = "summary.active"
+	TextDescKeySummaryCompleted  = "summary.completed"
+	TextDescKeySummaryDecision   = "summary.decision"
+	TextDescKeySummaryDecisions  = "summary.decisions"
+	TextDescKeySummaryEmpty      = "summary.empty"
+	TextDescKeySummaryInvariants = "summary.invariants"
+	TextDescKeySummaryLoaded     = "summary.loaded"
+	TextDescKeySummaryTerm       = "summary.term"
+	TextDescKeySummaryTerms      = "summary.terms"
+
+	TextDescKeyTaskArchiveContentPreview = "task-archive.content-preview"
+	TextDescKeyTaskArchiveDryRunHeader   = "task-archive.dry-run-header"
+	TextDescKeyTaskArchiveDryRunSummary  = "task-archive.dry-run-summary"
+	TextDescKeyTaskArchiveNoCompleted    = "task-archive.no-completed"
+	TextDescKeyTaskArchivePendingRemain  = "task-archive.pending-remain"
+	TextDescKeyTaskArchiveSkipIncomplete = "task-archive.skip-incomplete"
+	TextDescKeyTaskArchiveSkipping       = "task-archive.skipping"
+	TextDescKeyTaskArchiveSuccess        = "task-archive.success"
+	TextDescKeyTaskArchiveSuccessWithAge = "task-archive.success-with-age"
+	TextDescKeyTaskSnapshotHeaderFormat  = "task-snapshot.header-format"
+	TextDescKeyTaskSnapshotCreatedFormat = "task-snapshot.created-format"
+	TextDescKeyTaskSnapshotSaved         = "task-snapshot.saved"
+	TextDescKeyWatchCloseLogError        = "watch.close-log-error"
+	TextDescKeyWatchDryRun               = "watch.dry-run"
+	TextDescKeyWatchStopHint             = "watch.stop-hint"
+	TextDescKeyWhyAdmonitionFormat       = "why.admonition-format"
+	TextDescKeyWhyBanner                 = "why.banner"
+	TextDescKeyWhyBlockquotePrefix       = "why.blockquote-prefix"
+	TextDescKeyWhyBoldFormat             = "why.bold-format"
+	TextDescKeyWhyMenuItemFormat         = "why.menu-item-format"
+	TextDescKeyWhyMenuPrompt             = "why.menu-prompt"
+
+	TextDescKeyWatchApplyFailed   = "watch.apply-failed"
+	TextDescKeyWatchApplySuccess  = "watch.apply-success"
+	TextDescKeyWatchDryRunPreview = "watch.dry-run-preview"
+	TextDescKeyWatchWatching      = "watch.watching"
+
+	TextDescKeyDriftCleared = "drift.cleared"
+
+	TextDescKeyMemoryDiffOldFormat = "memory.diff-old-format"
+	TextDescKeyMemoryDiffNewFormat = "memory.diff-new-format"
+	TextDescKeyMemoryImportSource  = "memory.import-source"
+	TextDescKeyMemoryPublishTitle  = "memory.publish-title"
+	TextDescKeyMemoryPublishTasks  = "memory.publish-tasks"
+	TextDescKeyMemoryPublishDec    = "memory.publish-decisions"
+	TextDescKeyMemoryPublishConv   = "memory.publish-conventions"
+	TextDescKeyMemoryPublishLrn    = "memory.publish-learnings"
+	TextDescKeyMemorySelectContent = "memory.select-content"
+	TextDescKeyMemoryWriteMemory   = "memory.write-memory"
+	TextDescKeyMemoryImportReview  = "memory.import-review"
+
+	TextDescKeyMCPResConstitution = "mcp.res-constitution"
+	TextDescKeyMCPResTasks        = "mcp.res-tasks"
+	TextDescKeyMCPResConventions  = "mcp.res-conventions"
+	TextDescKeyMCPResArchitecture = "mcp.res-architecture"
+	TextDescKeyMCPResDecisions    = "mcp.res-decisions"
+	TextDescKeyMCPResLearnings    = "mcp.res-learnings"
+	TextDescKeyMCPResGlossary     = "mcp.res-glossary"
+	TextDescKeyMCPResPlaybook     = "mcp.res-playbook"
+	TextDescKeyMCPResAgent        = "mcp.res-agent"
+	TextDescKeyMCPFailedMarshal   = "mcp.failed-marshal"
+	TextDescKeyMCPLoadContext     = "mcp.load-context"
+	TextDescKeyMCPMethodNotFound  = "mcp.method-not-found"
+	TextDescKeyMCPPacketHeader    = "mcp.packet-header"
+	TextDescKeyMCPParseError      = "mcp.parse-error"
+	TextDescKeyMCPFileNotFound    = "mcp.file-not-found"
+	TextDescKeyMCPInvalidParams   = "mcp.invalid-params"
+	TextDescKeyMCPUnknownResource = "mcp.unknown-resource"
+	TextDescKeyMCPUnknownTool     = "mcp.unknown-tool"
+
+	TextDescKeyMCPToolStatusDesc      = "mcp.tool-status-desc"
+	TextDescKeyMCPToolAddDesc         = "mcp.tool-add-desc"
+	TextDescKeyMCPToolCompleteDesc    = "mcp.tool-complete-desc"
+	TextDescKeyMCPToolDriftDesc       = "mcp.tool-drift-desc"
+	TextDescKeyMCPToolPropType        = "mcp.tool-prop-type"
+	TextDescKeyMCPToolPropContent     = "mcp.tool-prop-content"
+	TextDescKeyMCPToolPropPriority    = "mcp.tool-prop-priority"
+	TextDescKeyMCPToolPropContext     = "mcp.tool-prop-context"
+	TextDescKeyMCPToolPropRationale   = "mcp.tool-prop-rationale"
+	TextDescKeyMCPToolPropConseq      = "mcp.tool-prop-consequences"
+	TextDescKeyMCPToolPropLesson      = "mcp.tool-prop-lesson"
+	TextDescKeyMCPToolPropApplication = "mcp.tool-prop-application"
+	TextDescKeyMCPToolPropQuery       = "mcp.tool-prop-query"
+	TextDescKeyMCPTypeContentRequired = "mcp.type-content-required"
+	TextDescKeyMCPQueryRequired       = "mcp.query-required"
+	TextDescKeyMCPWriteFailed         = "mcp.write-failed"
+	TextDescKeyMCPAddedFormat         = "mcp.added-format"
+	TextDescKeyMCPCompletedFormat     = "mcp.completed-format"
+	TextDescKeyMCPStatusContextFormat = "mcp.status-context-format"
+	TextDescKeyMCPStatusFilesFormat   = "mcp.status-files-format"
+	TextDescKeyMCPStatusTokensFormat  = "mcp.status-tokens-format"
+	TextDescKeyMCPStatusFileFormat    = "mcp.status-file-format"
+	TextDescKeyMCPStatusOK            = "mcp.status-ok"
+	TextDescKeyMCPStatusEmpty         = "mcp.status-empty"
+	TextDescKeyMCPDriftStatusFormat   = "mcp.drift-status-format"
+	TextDescKeyMCPDriftViolations     = "mcp.drift-violations"
+	TextDescKeyMCPDriftWarnings       = "mcp.drift-warnings"
+	TextDescKeyMCPDriftPassed         = "mcp.drift-passed"
+	TextDescKeyMCPDriftIssueFormat    = "mcp.drift-issue-format"
+	TextDescKeyMCPDriftPassedFormat   = "mcp.drift-passed-format"
+	TextDescKeyMCPSectionFormat       = "mcp.section-format"
+	TextDescKeyMCPAlsoNoted           = "mcp.also-noted"
+	TextDescKeyMCPOmittedFormat       = "mcp.omitted-format"
+	TextDescKeyDriftDeadPath          = "drift.dead-path"
+	TextDescKeyDriftEntryCount        = "drift.entry-count"
+	TextDescKeyDriftMissingFile       = "drift.missing-file"
+	TextDescKeyDriftRegenerated       = "drift.regenerated"
+	TextDescKeyDriftMissingPackage    = "drift.missing-package"
+	TextDescKeyDriftSecret            = "drift.secret"
+	TextDescKeyDriftStaleAge          = "drift.stale-age"
+	TextDescKeyDriftStaleness         = "drift.staleness"
+
+	TextDescKeyJournalMocSessionLink    = "journal.moc.session-link"
+	TextDescKeyJournalMocNavDescription = "journal.moc.nav-description"
+	TextDescKeyJournalMocBrowseBy       = "journal.moc.browse-by"
+	TextDescKeyJournalMocTopicsDesc     = "journal.moc.topics-description"
+	TextDescKeyJournalMocFilesDesc      = "journal.moc.files-description"
+	TextDescKeyJournalMocTypesDesc      = "journal.moc.types-description"
+
+	TextDescKeyWriteAddedTo                    = "write.added-to"
+	TextDescKeyWriteArchived                   = "write.archived"
+	TextDescKeyWriteBackupResult               = "write.backup-result"
+	TextDescKeyWriteBackupSMBDest              = "write.backup-smb-dest"
+	TextDescKeyWriteBootstrapDir               = "write.bootstrap-dir"
+	TextDescKeyWriteBootstrapFiles             = "write.bootstrap-files"
+	TextDescKeyWriteBootstrapNextSteps         = "write.bootstrap-next-steps"
+	TextDescKeyWriteBootstrapNumbered          = "write.bootstrap-numbered"
+	TextDescKeyWriteBootstrapRules             = "write.bootstrap-rules"
+	TextDescKeyWriteBootstrapSep               = "write.bootstrap-sep"
+	TextDescKeyWriteBootstrapTitle             = "write.bootstrap-title"
+	TextDescKeyWriteBootstrapWarning           = "write.bootstrap-warning"
+	TextDescKeyWriteCompletedTask              = "write.completed-task"
+	TextDescKeyWriteConfigProfileBase          = "write.config-profile-base"
+	TextDescKeyWriteConfigProfileDev           = "write.config-profile-dev"
+	TextDescKeyWriteConfigProfileNone          = "write.config-profile-none"
+	TextDescKeyWriteDepsLookingFor             = "write.deps-looking-for"
+	TextDescKeyWriteDepsNoDeps                 = "write.deps-no-deps"
+	TextDescKeyWriteDepsNoProject              = "write.deps-no-project"
+	TextDescKeyWriteDepsUseType                = "write.deps-use-type"
+	TextDescKeyWriteDryRun                     = "write.dry-run"
+	TextDescKeyWriteExistsWritingAsAlternative = "write.exists-writing-as-alternative"
+	TextDescKeyWriteHookCopilotCreated         = "write.hook-copilot-created"
+	TextDescKeyWriteHookCopilotForceHint       = "write.hook-copilot-force-hint"
+	TextDescKeyWriteHookCopilotMerged          = "write.hook-copilot-merged"
+	TextDescKeyWriteHookCopilotSessionsDir     = "write.hook-copilot-sessions-dir"
+	TextDescKeyWriteHookCopilotSkipped         = "write.hook-copilot-skipped"
+	TextDescKeyWriteHookCopilotSummary         = "write.hook-copilot-summary"
+	TextDescKeyWriteHookUnknownTool            = "write.hook-unknown-tool"
+	TextDescKeyWriteImportAdded                = "write.import-added"
+	TextDescKeyWriteImportClassified           = "write.import-classified"
+	TextDescKeyWriteImportClassifiedSkip       = "write.import-classified-skip"
+	TextDescKeyWriteImportDuplicates           = "write.import-duplicates"
+	TextDescKeyWriteImportEntry                = "write.import-entry"
+	TextDescKeyWriteImportFound                = "write.import-found"
+	TextDescKeyWriteImportNoEntries            = "write.import-no-entries"
+	TextDescKeyWriteImportScanning             = "write.import-scanning"
+	TextDescKeyWriteImportSkipped              = "write.import-skipped"
+	TextDescKeyWriteImportSummary              = "write.import-summary"
+	TextDescKeyWriteImportSummaryDryRun        = "write.import-summary-dry-run"
+	TextDescKeyWriteInitAborted                = "write.init-aborted"
+	TextDescKeyWriteInitBackup                 = "write.init-backup"
+	TextDescKeyWriteInitCreatedDir             = "write.init-created-dir"
+	TextDescKeyWriteInitCreatedWith            = "write.init-created-with"
+	TextDescKeyWriteInitCreatingRootFiles      = "write.init-creating-root-files"
+	TextDescKeyWriteInitCtxContentExists       = "write.init-ctx-content-exists"
+	TextDescKeyWriteInitExistsSkipped          = "write.init-exists-skipped"
+	TextDescKeyWriteInitFileCreated            = "write.init-file-created"
+	TextDescKeyWriteInitFileExistsNoCtx        = "write.init-file-exists-no-ctx"
+	TextDescKeyWriteInitGitignoreReview        = "write.init-gitignore-review"
+	TextDescKeyWriteInitGitignoreUpdated       = "write.init-gitignore-updated"
+	TextDescKeyWriteInitMakefileAppended       = "write.init-makefile-appended"
+	TextDescKeyWriteInitMakefileCreated        = "write.init-makefile-created"
+	TextDescKeyWriteInitMakefileIncludes       = "write.init-makefile-includes"
+	TextDescKeyWriteInitMerged                 = "write.init-merged"
+	TextDescKeyWriteInitNextSteps              = "write.init-next-steps"
+	TextDescKeyWriteInitNoChanges              = "write.init-no-changes"
+	TextDescKeyWriteInitOverwritePrompt        = "write.init-overwrite-prompt"
+	TextDescKeyWriteInitPermsAllow             = "write.init-perms-allow"
+	TextDescKeyWriteInitPermsAllowDeny         = "write.init-perms-allow-deny"
+	TextDescKeyWriteInitPermsDeduped           = "write.init-perms-deduped"
+	TextDescKeyWriteInitPermsDeny              = "write.init-perms-deny"
+	TextDescKeyWriteInitPermsMergedDeduped     = "write.init-perms-merged-deduped"
+	TextDescKeyWriteInitPluginAlreadyEnabled   = "write.init-plugin-already-enabled"
+	TextDescKeyWriteInitPluginEnabled          = "write.init-plugin-enabled"
+	TextDescKeyWriteInitPluginInfo             = "write.init-plugin-info"
+	TextDescKeyWriteInitPluginNote             = "write.init-plugin-note"
+	TextDescKeyWriteInitPluginSkipped          = "write.init-plugin-skipped"
+	TextDescKeyWriteInitScratchpadKeyCreated   = "write.init-scratchpad-key-created"
+	TextDescKeyWriteInitScratchpadNoKey        = "write.init-scratchpad-no-key"
+	TextDescKeyWriteInitScratchpadPlaintext    = "write.init-scratchpad-plaintext"
+	TextDescKeyWriteInitSettingUpPermissions   = "write.init-setting-up-permissions"
+	TextDescKeyWriteInitSkippedDir             = "write.init-skipped-dir"
+	TextDescKeyWriteInitSkippedPlain           = "write.init-skipped-plain"
+	TextDescKeyWriteInitUpdatedCtxSection      = "write.init-updated-ctx-section"
+	TextDescKeyWriteInitUpdatedPlanSection     = "write.init-updated-plan-section"
+	TextDescKeyWriteInitUpdatedPromptSection   = "write.init-updated-prompt-section"
+	TextDescKeyWriteInitWarnNonFatal           = "write.init-warn-non-fatal"
+	TextDescKeyWriteInitialized                = "write.initialized"
+	TextDescKeyWriteJournalOrphanRemoved       = "write.journal-orphan-removed"
+	TextDescKeyWriteJournalSiteAlt             = "write.journal-site-alt"
+	TextDescKeyWriteJournalSiteBuilding        = "write.journal-site-building"
+	TextDescKeyWriteJournalSiteGenerated       = "write.journal-site-generated"
+	TextDescKeyWriteJournalSiteNextSteps       = "write.journal-site-next-steps"
+	TextDescKeyWriteJournalSiteStarting        = "write.journal-site-starting"
+	TextDescKeyWriteJournalSyncLocked          = "write.journal-sync-locked"
+	TextDescKeyWriteJournalSyncLockedCount     = "write.journal-sync-locked-count"
+	TextDescKeyWriteJournalSyncMatch           = "write.journal-sync-match"
+	TextDescKeyWriteJournalSyncNone            = "write.journal-sync-none"
+	TextDescKeyWriteJournalSyncUnlocked        = "write.journal-sync-unlocked"
+	TextDescKeyWriteJournalSyncUnlockedCount   = "write.journal-sync-unlocked-count"
+	TextDescKeyWriteLines                      = "write.lines"
+	TextDescKeyWriteLinesPrevious              = "write.lines-previous"
+	TextDescKeyWriteLockUnlockEntry            = "write.lock-unlock-entry"
+	TextDescKeyWriteLockUnlockNoChanges        = "write.lock-unlock-no-changes"
+	TextDescKeyWriteLockUnlockSummary          = "write.lock-unlock-summary"
+	TextDescKeyWriteLoopCompletion             = "write.loop-completion"
+	TextDescKeyWriteLoopGenerated              = "write.loop-generated"
+	TextDescKeyWriteLoopMaxIterations          = "write.loop-max-iterations"
+	TextDescKeyWriteLoopPrompt                 = "write.loop-prompt"
+	TextDescKeyWriteLoopRunCmd                 = "write.loop-run-cmd"
+	TextDescKeyWriteLoopTool                   = "write.loop-tool"
+	TextDescKeyWriteLoopUnlimited              = "write.loop-unlimited"
+	TextDescKeyWriteMemoryArchives             = "write.memory-archives"
+	TextDescKeyWriteMemoryBridgeHeader         = "write.memory-bridge-header"
+	TextDescKeyWriteMemoryDriftDetected        = "write.memory-drift-detected"
+	TextDescKeyWriteMemoryDriftNone            = "write.memory-drift-none"
+	TextDescKeyWriteMemoryLastSync             = "write.memory-last-sync"
+	TextDescKeyWriteMemoryLastSyncNever        = "write.memory-last-sync-never"
+	TextDescKeyWriteMemoryMirror               = "write.memory-mirror"
+	TextDescKeyWriteMemoryMirrorLines          = "write.memory-mirror-lines"
+	TextDescKeyWriteMemoryMirrorNotSynced      = "write.memory-mirror-not-synced"
+	TextDescKeyWriteMemoryNoChanges            = "write.memory-no-changes"
+	TextDescKeyWriteMemorySource               = "write.memory-source"
+	TextDescKeyWriteMemorySourceLines          = "write.memory-source-lines"
+	TextDescKeyWriteMemorySourceLinesDrift     = "write.memory-source-lines-drift"
+	TextDescKeyWriteMemorySourceNotActive      = "write.memory-source-not-active"
+	TextDescKeyWriteMirror                     = "write.mirror"
+	TextDescKeyWriteMovingTask                 = "write.moving-task"
+	TextDescKeyWriteNewContent                 = "write.new-content"
+	TextDescKeyWriteObsidianGenerated          = "write.obsidian-generated"
+	TextDescKeyWriteObsidianNextSteps          = "write.obsidian-next-steps"
+	TextDescKeyWritePadBlobWritten             = "write.pad-blob-written"
+	TextDescKeyWritePadEmpty                   = "write.pad-empty"
+	TextDescKeyWritePadEntryAdded              = "write.pad-entry-added"
+	TextDescKeyWritePadEntryMoved              = "write.pad-entry-moved"
+	TextDescKeyWritePadEntryRemoved            = "write.pad-entry-removed"
+	TextDescKeyWritePadEntryUpdated            = "write.pad-entry-updated"
+	TextDescKeyWritePadExportDone              = "write.pad-export-done"
+	TextDescKeyWritePadExportNone              = "write.pad-export-none"
+	TextDescKeyWritePadExportPlan              = "write.pad-export-plan"
+	TextDescKeyWritePadExportSummary           = "write.pad-export-summary"
+	TextDescKeyWritePadExportVerbDone          = "write.pad-export-verb-done"
+	TextDescKeyWritePadExportVerbDryRun        = "write.pad-export-verb-dry-run"
+	TextDescKeyWritePadExportWriteFailed       = "write.pad-export-write-failed"
+	TextDescKeyWritePadImportBlobAdded         = "write.pad-import-blob-added"
+	TextDescKeyWritePadImportBlobNone          = "write.pad-import-blob-none"
+	TextDescKeyWritePadImportBlobSkipped       = "write.pad-import-blob-skipped"
+	TextDescKeyWritePadImportBlobSummary       = "write.pad-import-blob-summary"
+	TextDescKeyWritePadImportBlobTooLarge      = "write.pad-import-blob-too-large"
+	TextDescKeyWritePadImportCloseWarning      = "write.pad-import-close-warning"
+	TextDescKeyWritePadImportDone              = "write.pad-import-done"
+	TextDescKeyWritePadImportNone              = "write.pad-import-none"
+	TextDescKeyWritePadKeyCreated              = "write.pad-key-created"
+	TextDescKeyWritePadMergeAdded              = "write.pad-merge-added"
+	TextDescKeyWritePadMergeBinaryWarning      = "write.pad-merge-binary-warning"
+	TextDescKeyWritePadMergeBlobConflict       = "write.pad-merge-blob-conflict"
+	TextDescKeyWritePadMergeDone               = "write.pad-merge-done"
+	TextDescKeyWritePadMergeDryRun             = "write.pad-merge-dry-run"
+	TextDescKeyWritePadMergeDupe               = "write.pad-merge-dupe"
+	TextDescKeyWritePadMergeNone               = "write.pad-merge-none"
+	TextDescKeyWritePadMergeNoneNew            = "write.pad-merge-none-new"
+	TextDescKeyWritePadResolveEntry            = "write.pad-resolve-entry"
+	TextDescKeyWritePadResolveHeader           = "write.pad-resolve-header"
+	TextDescKeyWritePathExists                 = "write.path-exists"
+	TextDescKeyWritePaused                     = "write.paused"
+	TextDescKeyWritePrefixError                = "write.prefix-error"
+	TextDescKeyWritePromptCreated              = "write.prompt-created"
+	TextDescKeyWritePromptItem                 = "write.prompt-item"
+	TextDescKeyWritePromptNone                 = "write.prompt-none"
+	TextDescKeyWritePromptRemoved              = "write.prompt-removed"
+	TextDescKeyWritePublishBlock               = "write.publish-block"
+	TextDescKeyWritePublishBudget              = "write.publish-budget"
+	TextDescKeyWritePublishConventions         = "write.publish-conventions"
+	TextDescKeyWritePublishDecisions           = "write.publish-decisions"
+	TextDescKeyWritePublishDone                = "write.publish-done"
+	TextDescKeyWritePublishDryRun              = "write.publish-dry-run"
+	TextDescKeyWritePublishHeader              = "write.publish-header"
+	TextDescKeyWritePublishLearnings           = "write.publish-learnings"
+	TextDescKeyWritePublishSourceFiles         = "write.publish-source-files"
+	TextDescKeyWritePublishTasks               = "write.publish-tasks"
+	TextDescKeyWritePublishTotal               = "write.publish-total"
+	TextDescKeyWriteReminderAdded              = "write.reminder-added"
+	TextDescKeyWriteReminderAfterSuffix        = "write.reminder-after-suffix"
+	TextDescKeyWriteReminderDismissed          = "write.reminder-dismissed"
+	TextDescKeyWriteReminderDismissedAll       = "write.reminder-dismissed-all"
+	TextDescKeyWriteReminderItem               = "write.reminder-item"
+	TextDescKeyWriteReminderNone               = "write.reminder-none"
+	TextDescKeyWriteReminderNotDue             = "write.reminder-not-due"
+	TextDescKeyWriteRestoreAdded               = "write.restore-added"
+	TextDescKeyWriteRestoreDenyDroppedHeader   = "write.restore-deny-dropped-header"
+	TextDescKeyWriteRestoreDenyRestoredHeader  = "write.restore-deny-restored-header"
+	TextDescKeyWriteRestoreDone                = "write.restore-done"
+	TextDescKeyWriteRestoreDroppedHeader       = "write.restore-dropped-header"
+	TextDescKeyWriteRestoreMatch               = "write.restore-match"
+	TextDescKeyWriteRestoreNoLocal             = "write.restore-no-local"
+	TextDescKeyWriteRestorePermMatch           = "write.restore-perm-match"
+	TextDescKeyWriteRestoreRemoved             = "write.restore-removed"
+	TextDescKeyWriteRestoreRestoredHeader      = "write.restore-restored-header"
+	TextDescKeyWriteResumed                    = "write.resumed"
+	TextDescKeyWriteSetupDone                  = "write.setup-done"
+	TextDescKeyWriteSetupPrompt                = "write.setup-prompt"
+	TextDescKeyWriteSkillLine                  = "write.skill-line"
+	TextDescKeyWriteSkillsHeader               = "write.skills-header"
+	TextDescKeyWriteSnapshotSaved              = "write.snapshot-saved"
+	TextDescKeyWriteSnapshotUpdated            = "write.snapshot-updated"
+	TextDescKeyWriteSource                     = "write.source"
+	TextDescKeyWriteStatusActivityHeader       = "write.status-activity-header"
+	TextDescKeyWriteStatusActivityItem         = "write.status-activity-item"
+	TextDescKeyWriteStatusDir                  = "write.status-dir"
+	TextDescKeyWriteStatusDrift                = "write.status-drift"
+	TextDescKeyWriteStatusFileCompact          = "write.status-file-compact"
+	TextDescKeyWriteStatusFileVerbose          = "write.status-file-verbose"
+	TextDescKeyWriteStatusFiles                = "write.status-files"
+	TextDescKeyWriteStatusFilesHeader          = "write.status-files-header"
+	TextDescKeyWriteStatusNoDrift              = "write.status-no-drift"
+	TextDescKeyWriteStatusPreviewLine          = "write.status-preview-line"
+	TextDescKeyWriteStatusSeparator            = "write.status-separator"
+	TextDescKeyWriteStatusTitle                = "write.status-title"
+	TextDescKeyWriteStatusTokens               = "write.status-tokens"
+	TextDescKeyWriteSyncAction                 = "write.sync-action"
+	TextDescKeyWriteSyncDryRun                 = "write.sync-dry-run"
+	TextDescKeyWriteSyncDryRunSummary          = "write.sync-dry-run-summary"
+	TextDescKeyWriteSyncHeader                 = "write.sync-header"
+	TextDescKeyWriteSyncInSync                 = "write.sync-in-sync"
+	TextDescKeyWriteSyncSeparator              = "write.sync-separator"
+	TextDescKeyWriteSyncSuggestion             = "write.sync-suggestion"
+	TextDescKeyWriteSyncSummary                = "write.sync-summary"
+	TextDescKeyWriteSynced                     = "write.synced"
+	TextDescKeyWriteTestFiltered               = "write.test-filtered"
+	TextDescKeyWriteTestNoWebhook              = "write.test-no-webhook"
+	TextDescKeyWriteTestResult                 = "write.test-result"
+	TextDescKeyWriteTestWorking                = "write.test-working"
+	TextDescKeyWriteTimeDayAgo                 = "write.time-day-ago"
+	TextDescKeyWriteTimeDaysAgo                = "write.time-days-ago"
+	TextDescKeyWriteTimeHourAgo                = "write.time-hour-ago"
+	TextDescKeyWriteTimeHoursAgo               = "write.time-hours-ago"
+	TextDescKeyWriteTimeJustNow                = "write.time-just-now"
+	TextDescKeyWriteTimeMinuteAgo              = "write.time-minute-ago"
+	TextDescKeyWriteTimeMinutesAgo             = "write.time-minutes-ago"
+	TextDescKeyWriteUnpublishDone              = "write.unpublish-done"
+	TextDescKeyWriteUnpublishNotFound          = "write.unpublish-not-found"
+)
+
 // Template reads a template file by name from the embedded filesystem.
 //
 // Parameters:
@@ -278,6 +1119,24 @@ func HookMessageRegistry() ([]byte, error) {
 	return FS.ReadFile("hooks/messages/registry.yaml")
 }
 
+// CopilotInstructions reads the embedded Copilot instructions template.
+//
+// Returns:
+//   - []byte: Template content from hooks/copilot-instructions.md
+//   - error: Non-nil if the file is not found or read fails
+func CopilotInstructions() ([]byte, error) {
+	return FS.ReadFile("hooks/copilot-instructions.md")
+}
+
+// JournalExtraCSS reads the embedded extra.css for journal site generation.
+//
+// Returns:
+//   - []byte: CSS content
+//   - error: Non-nil if the file is not found or read fails
+func JournalExtraCSS() ([]byte, error) {
+	return FS.ReadFile("journal/extra.css")
+}
+
 // ListHookMessages returns available hook message directory names.
 //
 // Each hook is a directory under hooks/messages/ containing one or
@@ -333,7 +1192,7 @@ func ListHookVariants(hook string) ([]string, error) {
 //   - []byte: Document content from why/
 //   - error: Non-nil if the file is not found or read fails
 func WhyDoc(name string) ([]byte, error) {
-	return FS.ReadFile("why/" + name + config.ExtMarkdown)
+	return FS.ReadFile("why/" + name + file.ExtMarkdown)
 }
 
 // ListWhyDocs returns available "why" document names (without extension).
@@ -351,7 +1210,7 @@ func ListWhyDocs() ([]string, error) {
 	for _, entry := range entries {
 		if !entry.IsDir() {
 			name := entry.Name()
-			if len(name) > 3 && name[len(name)-3:] == config.ExtMarkdown {
+			if len(name) > 3 && name[len(name)-3:] == file.ExtMarkdown {
 				names = append(names, name[:len(name)-3])
 			}
 		}
@@ -371,6 +1230,12 @@ func Schema() ([]byte, error) {
 var (
 	commandsOnce sync.Once
 	commandsMap  map[string]commandEntry
+	flagsOnce    sync.Once
+	flagsMap     map[string]commandEntry
+	textOnce     sync.Once
+	textMap      map[string]commandEntry
+	examplesOnce sync.Once
+	examplesMap  map[string]commandEntry
 )
 
 type commandEntry struct {
@@ -378,21 +1243,33 @@ type commandEntry struct {
 	Long  string `yaml:"long"`
 }
 
-// loadCommands parses the embedded commands.yaml once.
+// loadYAML parses an embedded YAML file into a commandEntry map.
+func loadYAML(path string) map[string]commandEntry {
+	data, readErr := FS.ReadFile(path)
+	if readErr != nil {
+		return make(map[string]commandEntry)
+	}
+	m := make(map[string]commandEntry)
+	if parseErr := yaml.Unmarshal(data, &m); parseErr != nil {
+		return make(map[string]commandEntry)
+	}
+	return m
+}
+
 func loadCommands() {
-	commandsOnce.Do(func() {
-		data, readErr := FS.ReadFile("commands/commands.yaml")
-		if readErr != nil {
-			commandsMap = make(map[string]commandEntry)
-			return
-		}
-		m := make(map[string]commandEntry)
-		if parseErr := yaml.Unmarshal(data, &m); parseErr != nil {
-			commandsMap = make(map[string]commandEntry)
-			return
-		}
-		commandsMap = m
-	})
+	commandsOnce.Do(func() { commandsMap = loadYAML("commands/commands.yaml") })
+}
+
+func loadFlags() {
+	flagsOnce.Do(func() { flagsMap = loadYAML("commands/flags.yaml") })
+}
+
+func loadText() {
+	textOnce.Do(func() { textMap = loadYAML("commands/text.yaml") })
+}
+
+func loadExamples() {
+	examplesOnce.Do(func() { examplesMap = loadYAML("commands/examples.yaml") })
 }
 
 // CommandDesc returns the Short and Long descriptions for a command.
@@ -415,19 +1292,19 @@ func CommandDesc(key string) (short, long string) {
 	return entry.Short, entry.Long
 }
 
-// FlagDesc returns the description for a global flag.
+// FlagDesc returns the description for a flag.
 //
-// Keys use the format "_flags.<name>" (e.g., "_flags.context-dir").
+// Keys use dot notation: "add.file", "context-dir".
 // Returns an empty string if the key is not found.
 //
 // Parameters:
-//   - name: Flag name (without the _flags. prefix)
+//   - name: Flag key in dot notation
 //
 // Returns:
 //   - string: Flag description
 func FlagDesc(name string) string {
-	loadCommands()
-	entry, ok := commandsMap["_flags."+name]
+	loadFlags()
+	entry, ok := flagsMap[name]
 	if !ok {
 		return ""
 	}
@@ -436,23 +1313,66 @@ func FlagDesc(name string) string {
 
 // ExampleDesc returns example usage text for a given key.
 //
-// Keys use the format "_examples.<name>" (e.g., "_examples.decision").
+// Keys match entry types: "decision", "learning", "task", "convention".
 // Returns an empty string if the key is not found.
 //
 // Parameters:
-//   - name: Example key (without the _examples. prefix)
+//   - name: Entry type key
 //
 // Returns:
 //   - string: Example text
 func ExampleDesc(name string) string {
-	loadCommands()
-	entry, ok := commandsMap["_examples."+name]
+	loadExamples()
+	entry, ok := examplesMap[name]
 	if !ok {
 		return ""
 	}
 	return entry.Short
 }
 
+// TextDesc returns a user-facing text string by key.
+//
+// Keys use dot notation: "agent.instruction", "backup.run-hint".
+// Returns an empty string if the key is not found.
+//
+// Parameters:
+//   - name: Text key in dot notation
+//
+// Returns:
+//   - string: Text content
+func TextDesc(name string) string {
+	loadText()
+	entry, ok := textMap[name]
+	if !ok {
+		return ""
+	}
+	return entry.Short
+}
+
+var (
+	stopWordsOnce sync.Once
+	stopWordsMap  map[string]bool
+)
+
+// StopWords returns the default set of stop words for keyword extraction.
+//
+// Loaded from the embedded text.yaml asset under "stopwords".
+// The result is cached after the first call.
+//
+// Returns:
+//   - map[string]bool: Set of lowercase stop words
+func StopWords() map[string]bool {
+	stopWordsOnce.Do(func() {
+		raw := TextDesc("stopwords")
+		words := strings.Fields(raw)
+		stopWordsMap = make(map[string]bool, len(words))
+		for _, w := range words {
+			stopWordsMap[w] = true
+		}
+	})
+	return stopWordsMap
+}
+
 var (
 	allowOnce  sync.Once
 	allowPerms []string
@@ -466,7 +1386,7 @@ var (
 // Lines are trimmed; empty lines and lines starting with '#' are skipped.
 func parsePermissions(data []byte) []string {
 	var result []string
-	for _, line := range strings.Split(string(data), config.NewlineLF) {
+	for _, line := range strings.Split(string(data), token.NewlineLF) {
 		line = strings.TrimSpace(line)
 		if line == "" || strings.HasPrefix(line, "#") {
 			continue
diff --git a/internal/assets/entry-templates/decision.md b/internal/assets/entry-templates/decision.md
index b31d13ca..7bd80301 100644
--- a/internal/assets/entry-templates/decision.md
+++ b/internal/assets/entry-templates/decision.md
@@ -5,8 +5,8 @@
 **Context**: [What situation prompted this decision? What constraints exist?]
 
 **Alternatives Considered**:
-1. **[Option A]**: [Description] — Pros: [...] / Cons: [...]
-2. **[Option B]**: [Description] — Pros: [...] / Cons: [...]
+1. **[Option A]**: [Description]: Pros: [...] / Cons: [...]
+2. **[Option B]**: [Description]: Pros: [...] / Cons: [...]
 
 **Decision**: [What was decided?]
 
diff --git a/internal/config/heading.go b/internal/assets/heading.go
similarity index 94%
rename from internal/config/heading.go
rename to internal/assets/heading.go
index 9d62dca4..72afd427 100644
--- a/internal/config/heading.go
+++ b/internal/assets/heading.go
@@ -1,12 +1,10 @@
-//	/    ctx:                         https://ctx.ist
-//
+//   /    ctx:                         https://ctx.ist
 // ,'`./    do you remember?
-//
-//	`.,'\
-//	  \    Copyright 2026-present Context contributors.
-//	                SPDX-License-Identifier: Apache-2.0
+// `.,'\
+//   \    Copyright 2026-present Context contributors.
+//                 SPDX-License-Identifier: Apache-2.0
 
-package config
+package assets
 
 // Learnings
 const (
diff --git a/internal/assets/hooks/copilot-instructions.md b/internal/assets/hooks/copilot-instructions.md
new file mode 100644
index 00000000..bb62c209
--- /dev/null
+++ b/internal/assets/hooks/copilot-instructions.md
@@ -0,0 +1,106 @@
+# Project Context
+
+
+
+
+## Context System
+
+This project uses Context (`ctx`) for persistent AI context
+management. Your memory is NOT ephemeral: it lives in `.context/` files.
+
+## On Session Start
+
+Read these files **in order** before starting any work:
+
+1. `.context/CONSTITUTION.md`: Hard rules, NEVER violate
+2. `.context/TASKS.md`: Current work items
+3. `.context/CONVENTIONS.md`: Code patterns and standards
+4. `.context/ARCHITECTURE.md`: System structure
+5. `.context/DECISIONS.md`: Architectural decisions with rationale
+6. `.context/LEARNINGS.md`: Gotchas, tips, lessons learned
+7. `.context/GLOSSARY.md`: Domain terms and abbreviations
+8. `.context/AGENT_PLAYBOOK.md`: How to use this context system
+
+After reading, confirm: "I have read the required context files and I'm
+following project conventions."
+
+## When Asked "Do You Remember?"
+
+**Do this FIRST (silently):**
+- Read `.context/TASKS.md`
+- Read `.context/DECISIONS.md` and `.context/LEARNINGS.md`
+- Check `.context/sessions/` for recent session files
+
+**Then respond with a structured readback:**
+1. **Last session**: cite the most recent session topic and date
+2. **Active work**: list pending or in-progress tasks
+3. **Recent context**: mention 1-2 recent decisions or learnings
+4. **Next step**: offer to continue or ask what to focus on
+
+**Never** lead with "I don't have memory." The `.context/` files are
+your memory. Read them silently, then present what you found as recall.
+
+## Session Persistence
+
+After completing meaningful work, save a session summary to
+`.context/sessions/`.
+
+### Session File Format
+
+Create a file named `YYYY-MM-DD-topic.md`:
+
+```markdown
+# Session: YYYY-MM-DD - Brief Topic Description
+
+## What Was Done
+- Describe completed work items
+
+## Decisions
+- Key decisions made and their rationale
+
+## Learnings
+- Gotchas, tips, or insights discovered
+
+## Next Steps
+- Follow-up work or remaining items
+```
+
+### When to Save
+
+- After completing a task or feature
+- After making architectural decisions
+- After a debugging session
+- Before ending the session
+- At natural breakpoints in long sessions
+
+## Context Updates During Work
+
+Proactively update context files as you work:
+
+| Event                       | Action                           |
+|-----------------------------|----------------------------------|
+| Made architectural decision | Add to `.context/DECISIONS.md`   |
+| Discovered gotcha/bug       | Add to `.context/LEARNINGS.md`   |
+| Established new pattern     | Add to `.context/CONVENTIONS.md` |
+| Completed task              | Mark [x] in `.context/TASKS.md`  |
+
+## Self-Check
+
+Periodically ask yourself:
+
+> "If this session ended right now, would the next session know what happened?"
+
+If no: save a session file or update context files before continuing.
+
+## CLI Commands
+
+If `ctx` is installed, use these commands:
+
+```bash
+ctx status        # Context summary and health check
+ctx agent         # AI-ready context packet
+ctx drift         # Check for stale context
+ctx recall list   # Recent session history
+```
+
+
diff --git a/internal/assets/hooks/messages/check-map-staleness/stale.txt b/internal/assets/hooks/messages/check-map-staleness/stale.txt
index 1464eb2b..b7cf0d4b 100644
--- a/internal/assets/hooks/messages/check-map-staleness/stale.txt
+++ b/internal/assets/hooks/messages/check-map-staleness/stale.txt
@@ -1,5 +1,5 @@
 ARCHITECTURE.md hasn't been refreshed since {{.LastRefreshDate}}
 and there are commits touching {{.ModuleCount}} modules.
-/ctx-map keeps architecture docs drift-free.
+/ctx-architecture keeps architecture docs drift-free.
 
-Want me to run /ctx-map to refresh?
\ No newline at end of file
+Want me to run /ctx-architecture to refresh?
\ No newline at end of file
diff --git a/internal/assets/hooks/messages/check-memory-drift/nudge.txt b/internal/assets/hooks/messages/check-memory-drift/nudge.txt
new file mode 100644
index 00000000..8c0be7d6
--- /dev/null
+++ b/internal/assets/hooks/messages/check-memory-drift/nudge.txt
@@ -0,0 +1,2 @@
+MEMORY.md has changed since last sync.
+Run: ctx memory sync
\ No newline at end of file
diff --git a/internal/assets/hooks/messages/registry.yaml b/internal/assets/hooks/messages/registry.yaml
index 44b218a4..d46f8742 100644
--- a/internal/assets/hooks/messages/registry.yaml
+++ b/internal/assets/hooks/messages/registry.yaml
@@ -116,6 +116,11 @@
   description: Architecture map staleness nudge
   vars: [LastRefreshDate, ModuleCount]
 
+- hook: check-memory-drift
+  variant: nudge
+  category: customizable
+  description: Memory drift nudge when MEMORY.md has changed
+
 - hook: check-persistence
   variant: nudge
   category: customizable
diff --git a/internal/assets/hooks/messages/registry_test.go b/internal/assets/hooks/messages/registry_test.go
index 5445f95d..1df62ed7 100644
--- a/internal/assets/hooks/messages/registry_test.go
+++ b/internal/assets/hooks/messages/registry_test.go
@@ -13,8 +13,8 @@ func TestRegistryCount(t *testing.T) {
 	if registryErr != nil {
 		t.Fatalf("Registry() parse error: %v", registryErr)
 	}
-	if len(entries) != 30 {
-		t.Errorf("Registry() returned %d entries, want 30", len(entries))
+	if len(entries) != 31 {
+		t.Errorf("Registry() returned %d entries, want 31", len(entries))
 	}
 }
 
diff --git a/internal/cli/journal/cmd/site/extra.css b/internal/assets/journal/extra.css
similarity index 100%
rename from internal/cli/journal/cmd/site/extra.css
rename to internal/assets/journal/extra.css
diff --git a/internal/config/label.go b/internal/assets/label.go
similarity index 64%
rename from internal/config/label.go
rename to internal/assets/label.go
index d22e78a0..c8a22fdc 100644
--- a/internal/config/label.go
+++ b/internal/assets/label.go
@@ -4,7 +4,7 @@
 //   \    Copyright 2026-present Context contributors.
 //                 SPDX-License-Identifier: Apache-2.0
 
-package config
+package assets
 
 // Bold metadata field prefixes in journal/session Markdown.
 const (
@@ -40,8 +40,8 @@ const (
 
 // Conversation role display labels used in exported journal entries.
 const (
-	// LabelRoleUser is the display label for user turns.
-	LabelRoleUser = "User"
+	// RoleUser is the display label for user turns.
+	RoleUser = "User"
 	// LabelRoleAssistant is the display label for assistant turns.
 	LabelRoleAssistant = "Assistant"
 )
@@ -56,9 +56,23 @@ const (
 const (
 	// FrontmatterTitle is the YAML frontmatter key for the entry title.
 	FrontmatterTitle = "title"
+	// FrontmatterDate is the YAML frontmatter key for the entry date.
+	FrontmatterDate = "date"
+	// FrontmatterType is the YAML frontmatter key for the session type.
+	FrontmatterType = "type"
+	// FrontmatterOutcome is the YAML frontmatter key for the session outcome.
+	FrontmatterOutcome = "outcome"
+	// FrontmatterTopics is the YAML frontmatter key for the topics list.
+	FrontmatterTopics = "topics"
+	// FrontmatterTechnologies is the YAML frontmatter key for the technologies list.
+	FrontmatterTechnologies = "technologies"
+	// FrontmatterKeyFiles is the YAML frontmatter key for the key files list.
+	FrontmatterKeyFiles = "key_files"
 	// FrontmatterLocked is the YAML frontmatter key and journal state
 	// marker for locked entries.
 	FrontmatterLocked = "locked"
+	// Unlocked is the display label for unlocked entries.
+	Unlocked = "unlocked"
 )
 
 // Additional bold metadata field prefixes for session show output.
@@ -91,6 +105,63 @@ const (
 	ColTokens = "Tokens"
 )
 
+// Claude Code tool names used in session transcripts.
+const (
+	ToolRead      = "Read"
+	ToolWrite     = "Write"
+	ToolEdit      = "Edit"
+	ToolBash      = "Bash"
+	ToolGrep      = "Grep"
+	ToolGlob      = "Glob"
+	ToolWebFetch  = "WebFetch"
+	ToolWebSearch = "WebSearch"
+	ToolTask      = "Task"
+)
+
+// Plain-text metadata labels used in HTML table rows.
+const (
+	MetaLabelID       = "ID"
+	MetaLabelDate     = "Date"
+	MetaLabelTime     = "Time"
+	MetaLabelDuration = "Duration"
+	MetaLabelTool     = "Tool"
+	MetaLabelProject  = "Project"
+	MetaLabelBranch   = "Branch"
+	MetaLabelModel    = "Model"
+	MetaLabelTurns    = "Turns"
+	MetaLabelTokens   = "Tokens"
+	MetaLabelParts    = "Parts"
+)
+
+// YAML frontmatter field keys for journal export.
+const (
+	FmKeyDate      = "date"
+	FmKeyTime      = "time"
+	FmKeyProject   = "project"
+	FmKeyBranch    = "branch"
+	FmKeyModel     = "model"
+	FmKeyTokensIn  = "tokens_in"
+	FmKeyTokensOut = "tokens_out"
+	FmKeySessionID = "session_id"
+	FmKeyTitle     = "title"
+)
+
+// Claude Code tool input JSON keys for display formatting.
+const (
+	ToolInputFilePath    = "file_path"
+	ToolInputCommand     = "command"
+	ToolInputPattern     = "pattern"
+	ToolInputURL         = "url"
+	ToolInputQuery       = "query"
+	ToolInputDescription = "description"
+)
+
+// Tool display limits.
+const (
+	// ToolDisplayMaxLen is the max length for tool parameter display before truncation.
+	ToolDisplayMaxLen = 100
+)
+
 // CLI flag names used in multiple commands.
 const (
 	// FlagSince is the --since flag name.
@@ -135,8 +206,14 @@ const (
 
 // Journal turn markers for content transformation.
 const (
-	// LabelBoldReminder is the bold-style system reminder prefix.
-	LabelBoldReminder = "**System Reminder**:"
-	// LabelToolOutput is the turn role label for tool output turns.
-	LabelToolOutput = "Tool Output"
+	// BoldReminder is the bold-style system reminder prefix.
+	BoldReminder = "**System Reminder**:"
+	// ToolOutput is the turn role label for tool output turns.
+	ToolOutput = "Tool Output"
+)
+
+// Loop output markers.
+const (
+	// LoopComplete is the banner printed when the loop finishes.
+	LoopComplete = "=== Loop Complete ==="
 )
diff --git a/internal/config/obsidian.go b/internal/assets/obsidian.go
similarity index 57%
rename from internal/config/obsidian.go
rename to internal/assets/obsidian.go
index 9c5bbcdf..1c46c28c 100644
--- a/internal/config/obsidian.go
+++ b/internal/assets/obsidian.go
@@ -1,21 +1,10 @@
 //   /    ctx:                         https://ctx.ist
 // ,'`./    do you remember?
-// `.,'\
+// `.,'\\
 //   \    Copyright 2026-present Context contributors.
 //                 SPDX-License-Identifier: Apache-2.0
 
-package config
-
-// Obsidian vault output directory constants.
-const (
-	// ObsidianDirName is the default output directory for the Obsidian vault
-	// within .context/.
-	ObsidianDirName = "journal-obsidian"
-	// ObsidianDirEntries is the subdirectory for journal entry files.
-	ObsidianDirEntries = "entries"
-	// ObsidianConfigDir is the Obsidian configuration directory name.
-	ObsidianConfigDir = ".obsidian"
-)
+package assets
 
 // Obsidian vault configuration.
 const (
@@ -27,23 +16,6 @@ const (
   "strictLineBreaks": false
 }
 `
-	// ObsidianAppConfigFile is the Obsidian app configuration filename.
-	ObsidianAppConfigFile = "app.json"
-)
-
-// Obsidian MOC (Map of Content) page filenames.
-const (
-	// ObsidianMOCPrefix is prepended to MOC filenames so they sort first
-	// in the Obsidian file explorer.
-	ObsidianMOCPrefix = "_"
-	// ObsidianHomeMOC is the root navigation hub filename.
-	ObsidianHomeMOC = "Home.md"
-	// ObsidianTopicsMOC is the topics index MOC filename.
-	ObsidianTopicsMOC = "_Topics.md"
-	// ObsidianFilesMOC is the key files index MOC filename.
-	ObsidianFilesMOC = "_Key Files.md"
-	// ObsidianTypesMOC is the session types index MOC filename.
-	ObsidianTypesMOC = "_Session Types.md"
 )
 
 // Obsidian vault format templates.
diff --git a/internal/config/pattern.go b/internal/assets/pattern.go
similarity index 97%
rename from internal/config/pattern.go
rename to internal/assets/pattern.go
index 3d8fb66b..f6e63cdd 100644
--- a/internal/config/pattern.go
+++ b/internal/assets/pattern.go
@@ -1,10 +1,10 @@
 //   /    ctx:                         https://ctx.ist
 // ,'`./    do you remember?
-// `.,'\
+// `.,'\\
 //   \    Copyright 2026-present Context contributors.
 //                 SPDX-License-Identifier: Apache-2.0
 
-package config
+package assets
 
 // Pattern represents a config file pattern and its documentation topic.
 //
diff --git a/internal/assets/permissions/allow.txt b/internal/assets/permissions/allow.txt
index 1777c428..9a43ac46 100644
--- a/internal/assets/permissions/allow.txt
+++ b/internal/assets/permissions/allow.txt
@@ -24,7 +24,7 @@ Skill(ctx-journal-enrich)
 Skill(ctx-journal-enrich-all)
 Skill(ctx-journal-normalize)
 Skill(ctx-loop)
-Skill(ctx-map)
+Skill(ctx-architecture)
 Skill(ctx-next)
 Skill(ctx-pad)
 Skill(ctx-pause)
diff --git a/internal/assets/project/IMPLEMENTATION_PLAN.md b/internal/assets/project/IMPLEMENTATION_PLAN.md
index 9b2ae2ff..d4d682db 100644
--- a/internal/assets/project/IMPLEMENTATION_PLAN.md
+++ b/internal/assets/project/IMPLEMENTATION_PLAN.md
@@ -13,9 +13,9 @@ This file provides high-level direction. Detailed tasks live in `.context/TASKS.
 
 What does "done" look like for this project?
 
-1. **Goal** — Define your end state
-2. **Validation** — How will you know it works?
-3. **Handoff** — Can someone else pick this up?
+1. **Goal**: Define your end state
+2. **Validation**: How will you know it works?
+3. **Handoff**: Can someone else pick this up?
 
 ## Notes
 
diff --git a/internal/assets/project/ideas-README.md b/internal/assets/project/ideas-README.md
index 1da3e07c..a8d7c598 100644
--- a/internal/assets/project/ideas-README.md
+++ b/internal/assets/project/ideas-README.md
@@ -21,4 +21,4 @@ When an idea reaches a reviewable state, promote it to `specs/`:
 2. Move to `specs/` (or write a fresh spec inspired by the idea)
 3. Add a Phase to TASKS.md referencing the spec
 
-Ideas that stay here indefinitely are fine — not everything needs to ship.
+Ideas that stay here indefinitely are fine: not everything needs to ship.
diff --git a/internal/assets/project/specs-README.md b/internal/assets/project/specs-README.md
index 67329c84..815619d6 100644
--- a/internal/assets/project/specs-README.md
+++ b/internal/assets/project/specs-README.md
@@ -2,19 +2,19 @@
 
 Formalized plans for features, refactors, and non-trivial changes.
 
-A spec is what comes out of a planning session — problem statement,
+A spec is what comes out of a planning session: problem statement,
 proposed solution, CLI surface, storage, error cases, and non-goals.
 It's complete enough that another session could implement from it alone.
 
 ## Lifecycle
 
-1. **Draft** — write the spec in this directory
-2. **Reference** — add a Phase to TASKS.md with `Spec: specs/<name>.md`
-3. **Implement** — follow the spec, checking off tasks as you go
-4. **Archive** — move to `specs/done/` when all tasks are complete
+1. **Draft**: write the spec in this directory
+2. **Reference**: add a Phase to TASKS.md with `Spec: specs/<name>.md`
+3. **Implement**: follow the spec, checking off tasks as you go
+4. **Archive**: move to `specs/done/` when all tasks are complete
 
 ## Tips
 
 - Keep specs concise. A page is usually enough.
-- Non-goals are as important as goals — they prevent scope creep.
+- Non-goals are as important as goals: they prevent scope creep.
 - If a spec grows beyond two pages, split it.
diff --git a/internal/assets/prompt-templates/refactor.md b/internal/assets/prompt-templates/refactor.md
index c003da6a..e4528418 100644
--- a/internal/assets/prompt-templates/refactor.md
+++ b/internal/assets/prompt-templates/refactor.md
@@ -2,10 +2,10 @@
 
 Refactor the specified code following these rules:
 
-1. **Write or verify tests first** — confirm existing behavior is captured before changing structure.
-2. **Preserve all existing behavior** — refactoring changes structure, not outcomes.
-3. **Make one structural change at a time** — keep each step reviewable and revertible.
-4. **Run tests after each step** — catch regressions immediately, not at the end.
-5. **Check project conventions** — consult `.context/CONVENTIONS.md` to ensure the refactored code follows established patterns.
+1. **Write or verify tests first**: confirm existing behavior is captured before changing structure.
+2. **Preserve all existing behavior**: refactoring changes structure, not outcomes.
+3. **Make one structural change at a time**: keep each step reviewable and revertible.
+4. **Run tests after each step**: catch regressions immediately, not at the end.
+5. **Check project conventions**: consult `.context/CONVENTIONS.md` to ensure the refactored code follows established patterns.
 
 If a refactoring step would change observable behavior, stop and flag it as a separate task.
diff --git a/internal/assets/ralph/PROMPT.md b/internal/assets/ralph/PROMPT.md
index c8e178c2..f77e3ba9 100644
--- a/internal/assets/ralph/PROMPT.md
+++ b/internal/assets/ralph/PROMPT.md
@@ -9,11 +9,11 @@ You are working on this project autonomously. Follow these steps each iteration.
 
 Read these files in order:
 
-1. `.context/CONSTITUTION.md` — NEVER violate these rules
-2. `.context/TASKS.md` — Find work to do
-3. `.context/CONVENTIONS.md` — Follow these patterns
-4. `.context/DECISIONS.md` — Understand past choices
-5. `.context/LEARNINGS.md` — Avoid known pitfalls
+1. `.context/CONSTITUTION.md`: NEVER violate these rules
+2. `.context/TASKS.md`: Find work to do
+3. `.context/CONVENTIONS.md`: Follow these patterns
+4. `.context/DECISIONS.md`: Understand past choices
+5. `.context/LEARNINGS.md`: Avoid known pitfalls
 
 ## 2. Pick One Task
 
@@ -33,7 +33,7 @@ From `.context/TASKS.md`, select ONE task that is:
 
 After completing work:
 
-- Mark task complete: `ctx complete "<task>"`
+- Mark task complete: `ctx tasks complete "<task>"`
 - Add learnings: `ctx add learning "..."`
 - Add decisions: `ctx add decision "..."`
 
@@ -45,18 +45,18 @@ Create a focused commit with a clear message. Include `.context/` changes.
 
 End your response with exactly ONE of:
 
-| Signal | When to Use |
-|--------|-------------|
-| `SYSTEM_CONVERGED` | All tasks in TASKS.md are complete |
-| `SYSTEM_BLOCKED` | Cannot proceed without human input (explain why) |
-| *(no signal)* | More work remains, continue to next iteration |
+| Signal             | When to Use                                      |
+|--------------------|--------------------------------------------------|
+| `SYSTEM_CONVERGED` | All tasks in TASKS.md are complete               |
+| `SYSTEM_BLOCKED`   | Cannot proceed without human input (explain why) |
+| *(no signal)*      | More work remains, continue to next iteration    |
 
 ## Rules
 
-- **ONE task per iteration** — stay focused
-- **NEVER skip tests** — verify your work
-- **NEVER violate CONSTITUTION.md** — hard rules are inviolable
-- **Commit after each task** — preserve progress
-- **Don't ask questions** — if blocked, emit SYSTEM_BLOCKED with explanation
+- **ONE task per iteration**: stay focused
+- **NEVER skip tests**: verify your work
+- **NEVER violate CONSTITUTION.md**: hard rules are inviolable
+- **Commit after each task**: preserve progress
+- **Don't ask questions**: if blocked, emit SYSTEM_BLOCKED with explanation
 
 
diff --git a/internal/config/tpl_entry.go b/internal/assets/tpl_entry.go
similarity index 97%
rename from internal/config/tpl_entry.go
rename to internal/assets/tpl_entry.go
index c2b76a13..b375c703 100644
--- a/internal/config/tpl_entry.go
+++ b/internal/assets/tpl_entry.go
@@ -1,10 +1,10 @@
 //   /    ctx:                         https://ctx.ist
 // ,'`./    do you remember?
-// `.,'\
+// `.,'\\
 //   \    Copyright 2026-present Context contributors.
 //                 SPDX-License-Identifier: Apache-2.0
 
-package config
+package assets
 
 // Markdown format templates for context entries.
 //
diff --git a/internal/config/tpl_journal.go b/internal/assets/tpl_journal.go
similarity index 97%
rename from internal/config/tpl_journal.go
rename to internal/assets/tpl_journal.go
index 42180dfb..30233030 100644
--- a/internal/config/tpl_journal.go
+++ b/internal/assets/tpl_journal.go
@@ -4,7 +4,7 @@
 //   \    Copyright 2026-present Context contributors.
 //                 SPDX-License-Identifier: Apache-2.0
 
-package config
+package assets
 
 // Journal site format templates.
 //
@@ -213,9 +213,4 @@ combine_header_slug = true
 	// TplZensicalExtraCSS is the extra_css line for zensical.toml.
 	// Must appear under [project] (after nav, before [project.theme]).
 	TplZensicalExtraCSS = `extra_css = ["stylesheets/extra.css"]`
-
-	// JournalExtraCSS is deprecated — the stylesheet is now embedded
-	// from internal/cli/journal/extra.css via go:embed.
-	// This constant is kept only as a fallback reference.
-	JournalExtraCSS = ""
 )
diff --git a/internal/config/tpl_loop.go b/internal/assets/tpl_loop.go
similarity index 99%
rename from internal/config/tpl_loop.go
rename to internal/assets/tpl_loop.go
index e7574ac5..92136ed9 100644
--- a/internal/config/tpl_loop.go
+++ b/internal/assets/tpl_loop.go
@@ -4,7 +4,7 @@
 //   \    Copyright 2026-present Context contributors.
 //                 SPDX-License-Identifier: Apache-2.0
 
-package config
+package assets
 
 // Agent load and loop script templates.
 const (
diff --git a/internal/config/tpl_recall.go b/internal/assets/tpl_recall.go
similarity index 88%
rename from internal/config/tpl_recall.go
rename to internal/assets/tpl_recall.go
index 94e5f0a3..fcf6b636 100644
--- a/internal/config/tpl_recall.go
+++ b/internal/assets/tpl_recall.go
@@ -4,7 +4,7 @@
 //   \    Copyright 2026-present Context contributors.
 //                 SPDX-License-Identifier: Apache-2.0
 
-package config
+package assets
 
 // Recall export format templates.
 //
@@ -95,4 +95,20 @@ const (
 	// TplMetaRow formats a single row in an HTML metadata table.
 	// Args: label, value.
 	TplMetaRow = "<tr><td>%s</td><td>%s</td></tr>"
+
+	// TplFmQuoted formats a YAML frontmatter quoted string field.
+	// Args: key, value.
+	TplFmQuoted = "%s: %q"
+
+	// TplFmString formats a YAML frontmatter bare string field.
+	// Args: key, value.
+	TplFmString = "%s: %s"
+
+	// TplFmInt formats a YAML frontmatter integer field.
+	// Args: key, value.
+	TplFmInt = "%s: %d"
+
+	// TplToolDisplay formats a tool name with its key parameter.
+	// Args: tool name, parameter value.
+	TplToolDisplay = "%s: %s"
 )
diff --git a/internal/assets/why/about.md b/internal/assets/why/about.md
index 121fe7ae..059bc7f6 100644
--- a/internal/assets/why/about.md
+++ b/internal/assets/why/about.md
@@ -35,7 +35,7 @@ It lives in a `.context/` directory in your repo.
 
 * A session is *interactive*.
 * `ctx` enables **cognitive continuity**.
-* **Cognitive continuity** enables durable, *symbiotic-like* human–AI workflows.
+* **Cognitive continuity** enables durable, *symbiotic-like* human-AI workflows.
 
 !!! tip "Context Files"
     Context files let AI tools **remember** [*decisions*, *conventions*, and *learnings*](context-files.md):
@@ -95,7 +95,7 @@ Without persistent memory, every session starts at zero.
     AI:   "Per DECISIONS.md (2026-02-12): JWT, not sessions: 
            The project uses PostgreSQL with JWT
            authentication. The API follows snake_case conventions.
-           Last session you completed the login endpoint — the
+           Last session you completed the login endpoint: the
            registration endpoint is next on TASKS.md. Want me to
            start there?"
 
@@ -141,5 +141,5 @@ makes the *next* session **smarter**.
 
 **Ready to Get Started?**
 
-* [Getting Started →](getting-started.md) — full installation and setup
-* [Your First Session →](first-session.md) — step-by-step walkthrough from `ctx init` to verified recall
+* [Getting Started →](getting-started.md): full installation and setup
+* [Your First Session →](first-session.md): step-by-step walkthrough from `ctx init` to verified recall
diff --git a/internal/assets/why/design-invariants.md b/internal/assets/why/design-invariants.md
index 62a8bf84..c8a2d4cd 100644
--- a/internal/assets/why/design-invariants.md
+++ b/internal/assets/why/design-invariants.md
@@ -9,6 +9,8 @@ title: Invariants
 icon: lucide/anchor
 ---
 
+![ctx](../images/ctx-banner.png)
+
 # The System Explains Itself
 
 These are the properties that **must hold** for any valid `ctx` implementation.
diff --git a/internal/bootstrap/bootstrap.go b/internal/bootstrap/bootstrap.go
index 6d86b891..8427a93e 100644
--- a/internal/bootstrap/bootstrap.go
+++ b/internal/bootstrap/bootstrap.go
@@ -22,7 +22,6 @@ import (
 	"github.com/ActiveMemory/ctx/internal/cli/agent"
 	"github.com/ActiveMemory/ctx/internal/cli/changes"
 	"github.com/ActiveMemory/ctx/internal/cli/compact"
-	"github.com/ActiveMemory/ctx/internal/cli/complete"
 	"github.com/ActiveMemory/ctx/internal/cli/config"
 	"github.com/ActiveMemory/ctx/internal/cli/decision"
 	"github.com/ActiveMemory/ctx/internal/cli/deps"
@@ -59,7 +58,7 @@ import (
 // Initialize registers all ctx subcommands with the root command.
 //
 // This function attaches all available subcommands to the provided root
-// command, including init, status, load, add, complete, agent, drift,
+// command, including init, status, load, add, agent, drift,
 // sync, compact, decision, watch, hook, learnings, tasks, loop, recall,
 // journal, and serve.
 //
@@ -74,7 +73,6 @@ func Initialize(cmd *cobra.Command) *cobra.Command {
 		agent.Cmd,
 		changes.Cmd,
 		compact.Cmd,
-		complete.Cmd,
 		config.Cmd,
 		decision.Cmd,
 		deps.Cmd,
diff --git a/internal/bootstrap/bootstrap_test.go b/internal/bootstrap/bootstrap_test.go
index 6c2ef441..021a24ac 100644
--- a/internal/bootstrap/bootstrap_test.go
+++ b/internal/bootstrap/bootstrap_test.go
@@ -11,9 +11,11 @@ import (
 	"path/filepath"
 	"testing"
 
+	"github.com/ActiveMemory/ctx/internal/config/cli"
+	"github.com/ActiveMemory/ctx/internal/config/ctx"
+	"github.com/ActiveMemory/ctx/internal/config/flag"
 	"github.com/spf13/cobra"
 
-	"github.com/ActiveMemory/ctx/internal/config"
 	"github.com/ActiveMemory/ctx/internal/rc"
 )
 
@@ -42,7 +44,7 @@ func TestRootCmd(t *testing.T) {
 	}
 
 	// Check global flags exist
-	contextDirFlag := cmd.PersistentFlags().Lookup(config.FlagContextDir)
+	contextDirFlag := cmd.PersistentFlags().Lookup(flag.ContextDir)
 	if contextDirFlag == nil {
 		t.Error("--context-dir flag not found")
 	}
@@ -63,7 +65,6 @@ func TestInitialize(t *testing.T) {
 		"status",
 		"load",
 		"add",
-		"complete",
 		"agent",
 		"drift",
 		"sync",
@@ -109,7 +110,7 @@ func TestRootCmdVersion(t *testing.T) {
 func TestRootCmdAllowOutsideCwdFlag(t *testing.T) {
 	cmd := RootCmd()
 
-	flag := cmd.PersistentFlags().Lookup(config.FlagAllowOutsideCwd)
+	flag := cmd.PersistentFlags().Lookup(flag.AllowOutsideCwd)
 	if flag == nil {
 		t.Fatal("--allow-outside-cwd flag not found")
 	}
@@ -123,7 +124,7 @@ func TestRootCmdPersistentPreRun_ContextDir(t *testing.T) {
 
 	dummy := &cobra.Command{
 		Use:         "dummy",
-		Annotations: map[string]string{config.AnnotationSkipInit: "true"},
+		Annotations: map[string]string{cli.AnnotationSkipInit: "true"},
 		Run:         func(cmd *cobra.Command, args []string) {},
 	}
 	cmd.AddCommand(dummy)
@@ -147,7 +148,7 @@ func TestRootCmdPersistentPreRun_DefaultFlags(t *testing.T) {
 
 	dummy := &cobra.Command{
 		Use:         "dummy",
-		Annotations: map[string]string{config.AnnotationSkipInit: "true"},
+		Annotations: map[string]string{cli.AnnotationSkipInit: "true"},
 		Run:         func(cmd *cobra.Command, args []string) {},
 	}
 	cmd.AddCommand(dummy)
@@ -185,7 +186,7 @@ func TestRootCmdPersistentPreRun_BoundaryViolation(t *testing.T) {
 	cmd := RootCmd()
 	dummy := &cobra.Command{
 		Use:         "dummy",
-		Annotations: map[string]string{config.AnnotationSkipInit: "true"},
+		Annotations: map[string]string{cli.AnnotationSkipInit: "true"},
 		Run:         func(cmd *cobra.Command, args []string) {},
 	}
 	cmd.AddCommand(dummy)
@@ -223,7 +224,7 @@ func TestInitGuard_AllowsAnnotatedCommand(t *testing.T) {
 	cmd := RootCmd()
 	dummy := &cobra.Command{
 		Use:         "dummy",
-		Annotations: map[string]string{config.AnnotationSkipInit: "true"},
+		Annotations: map[string]string{cli.AnnotationSkipInit: "true"},
 		Run:         func(cmd *cobra.Command, args []string) {},
 	}
 	cmd.AddCommand(dummy)
@@ -298,7 +299,7 @@ func TestInitGuard_AllowsInitializedCommand(t *testing.T) {
 	tmp := t.TempDir()
 
 	// Create required context files so Initialized() returns true.
-	for _, f := range config.FilesRequired {
+	for _, f := range ctx.FilesRequired {
 		path := filepath.Join(tmp, f)
 		if writeErr := os.WriteFile(path, []byte("# "+f+"\n"), 0o600); writeErr != nil {
 			t.Fatalf("setup: %v", writeErr)
diff --git a/internal/bootstrap/cmd.go b/internal/bootstrap/cmd.go
index eba0d2d3..74c5a6a1 100644
--- a/internal/bootstrap/cmd.go
+++ b/internal/bootstrap/cmd.go
@@ -14,7 +14,9 @@ import (
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/assets"
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/cli"
+	"github.com/ActiveMemory/ctx/internal/config/flag"
+	ctxcontext "github.com/ActiveMemory/ctx/internal/context"
 	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 	"github.com/ActiveMemory/ctx/internal/rc"
 	"github.com/ActiveMemory/ctx/internal/validation"
@@ -37,12 +39,12 @@ var version = "dev"
 // Returns:
 //   - *cobra.Command: The configured root command with usage and version info
 func RootCmd() *cobra.Command {
-	config.BinaryVersion = version
+	const completionCmd = "completion"
 
 	var contextDir string
 	var allowOutsideCwd bool
 
-	short, long := assets.CommandDesc("ctx")
+	short, long := assets.CommandDesc(assets.CmdDescKeyCtx)
 
 	cmd := &cobra.Command{
 		Use:     "ctx",
@@ -70,12 +72,12 @@ func RootCmd() *cobra.Command {
 			if cmd.Hidden {
 				return nil
 			}
-			if p := cmd.Parent(); p != nil && p.Name() == config.CmdCompletion {
+			if p := cmd.Parent(); p != nil && p.Name() == completionCmd {
 				return nil
 			}
 
 			// Skip init check for annotated commands.
-			if _, ok := cmd.Annotations[config.AnnotationSkipInit]; ok {
+			if _, ok := cmd.Annotations[cli.AnnotationSkipInit]; ok {
 				return nil
 			}
 
@@ -85,7 +87,7 @@ func RootCmd() *cobra.Command {
 			}
 
 			// Require initialization.
-			if !config.Initialized(rc.ContextDir()) {
+			if !ctxcontext.Initialized(rc.ContextDir()) {
 				return ctxerr.NotInitialized()
 			}
 
@@ -101,15 +103,15 @@ func RootCmd() *cobra.Command {
 	// Global flags available to all subcommands
 	cmd.PersistentFlags().StringVar(
 		&contextDir,
-		config.FlagContextDir,
+		flag.ContextDir,
 		"",
-		assets.FlagDesc(config.FlagContextDir),
+		assets.FlagDesc(flag.ContextDir),
 	)
 	cmd.PersistentFlags().BoolVar(
 		&allowOutsideCwd,
-		config.FlagAllowOutsideCwd,
+		flag.AllowOutsideCwd,
 		false,
-		assets.FlagDesc(config.FlagAllowOutsideCwd),
+		assets.FlagDesc(flag.AllowOutsideCwd),
 	)
 
 	return cmd
diff --git a/internal/cli/add/cmd/coverage_test.go b/internal/cli/add/cmd/coverage_test.go
index 4166c091..b7486878 100644
--- a/internal/cli/add/cmd/coverage_test.go
+++ b/internal/cli/add/cmd/coverage_test.go
@@ -12,14 +12,16 @@ import (
 	"strings"
 	"testing"
 
+	"github.com/ActiveMemory/ctx/internal/assets"
 	"github.com/ActiveMemory/ctx/internal/cli/add/cmd/root"
+	"github.com/ActiveMemory/ctx/internal/config/marker"
+	"github.com/ActiveMemory/ctx/internal/write/add"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/cli/add/core"
 	"github.com/ActiveMemory/ctx/internal/cli/initialize"
-	"github.com/ActiveMemory/ctx/internal/config"
+	entrytype "github.com/ActiveMemory/ctx/internal/config/entry"
 	"github.com/ActiveMemory/ctx/internal/entry"
-	"github.com/ActiveMemory/ctx/internal/write"
 )
 
 // ---------------------------------------------------------------------------
@@ -27,16 +29,16 @@ import (
 // ---------------------------------------------------------------------------
 
 func TestErrNoContent(t *testing.T) {
-	err := write.ErrNoContent()
+	err := add.ErrNoContent()
 	if err == nil || err.Error() != "no content provided" {
 		t.Errorf("ErrNoContent() = %v, want 'no content provided'", err)
 	}
 }
 
 func TestErrNoContentProvided(t *testing.T) {
-	for _, fType := range []string{"decision", "task", "learning", "convention", "unknown"} {
+	for _, fType := range []string{entrytype.Decision, entrytype.Task, entrytype.Learning, entrytype.Convention, entrytype.Unknown} {
 		t.Run(fType, func(t *testing.T) {
-			err := write.ErrNoContentProvided(fType, core.ExamplesForType(fType))
+			err := add.ErrNoContentProvided(fType, core.ExamplesForType(fType))
 			if err == nil {
 				t.Fatal("expected non-nil error")
 			}
@@ -52,7 +54,7 @@ func TestErrNoContentProvided(t *testing.T) {
 }
 
 func TestErrFileRead(t *testing.T) {
-	err := write.ErrFileRead("/some/path", os.ErrNotExist)
+	err := add.ErrFileRead("/some/path", os.ErrNotExist)
 	if err == nil {
 		t.Fatal("expected non-nil error")
 	}
@@ -62,7 +64,7 @@ func TestErrFileRead(t *testing.T) {
 }
 
 func TestErrFileWrite(t *testing.T) {
-	err := write.ErrFileWriteAdd("/some/path", os.ErrPermission)
+	err := add.ErrFileWriteAdd("/some/path", os.ErrPermission)
 	if err == nil {
 		t.Fatal("expected non-nil error")
 	}
@@ -72,7 +74,7 @@ func TestErrFileWrite(t *testing.T) {
 }
 
 func TestErrStdinRead(t *testing.T) {
-	err := write.ErrStdinRead(os.ErrClosed)
+	err := add.ErrStdinRead(os.ErrClosed)
 	if err == nil {
 		t.Fatal("expected non-nil error")
 	}
@@ -82,7 +84,7 @@ func TestErrStdinRead(t *testing.T) {
 }
 
 func TestErrIndexUpdate(t *testing.T) {
-	err := write.ErrIndexUpdate("/some/file", os.ErrPermission)
+	err := add.ErrIndexUpdate("/some/file", os.ErrPermission)
 	if err == nil {
 		t.Fatal("expected non-nil error")
 	}
@@ -92,7 +94,7 @@ func TestErrIndexUpdate(t *testing.T) {
 }
 
 func TestErrUnknownType(t *testing.T) {
-	err := write.ErrUnknownType("foobar")
+	err := add.ErrUnknownType("foobar")
 	if err == nil {
 		t.Fatal("expected non-nil error")
 	}
@@ -106,7 +108,7 @@ func TestErrUnknownType(t *testing.T) {
 }
 
 func TestErrFileNotFound(t *testing.T) {
-	err := write.ErrFileNotFound("/missing/file")
+	err := add.ErrFileNotFound("/missing/file")
 	if err == nil {
 		t.Fatal("expected non-nil error")
 	}
@@ -120,7 +122,7 @@ func TestErrFileNotFound(t *testing.T) {
 }
 
 func TestErrMissingFields(t *testing.T) {
-	err := write.ErrMissingFields("decision", []string{"context", "rationale"})
+	err := add.ErrMissingFields("decision", []string{"context", "rationale"})
 	if err == nil {
 		t.Fatal("expected non-nil error")
 	}
@@ -142,11 +144,11 @@ func TestExamplesForType(t *testing.T) {
 		fType    string
 		contains string
 	}{
-		{"decision", "ctx add decision"},
-		{"task", "ctx add task"},
-		{"learning", "ctx add learning"},
-		{"convention", "ctx add convention"},
-		{"unknown", "ctx add "},
+		{entrytype.Decision, "ctx add decision"},
+		{entrytype.Task, "ctx add task"},
+		{entrytype.Learning, "ctx add learning"},
+		{entrytype.Convention, "ctx add convention"},
+		{entrytype.Unknown, "ctx add "},
 	}
 	for _, tt := range tests {
 		t.Run(tt.fType, func(t *testing.T) {
@@ -336,12 +338,12 @@ func TestInsertAfterHeader_HeaderAtEndOfFile(t *testing.T) {
 
 func TestInsertAfterHeader_WithCtxMarkers(t *testing.T) {
 	content := "# Learnings\n" +
-		config.CtxMarkerStart + "\nsome context\n" + config.CommentClose + "\n\n" +
+		marker.CtxMarkerStart + "\nsome context\n" + marker.CommentClose + "\n\n" +
 		"## [2026-01-01] Existing\n"
 	entry := "## [2026-01-02] New\n"
 
 	// The header "# Learnings" is found, then markers are skipped
-	result := core.InsertAfterHeader(content, entry, config.HeadingLearnings)
+	result := core.InsertAfterHeader(content, entry, assets.HeadingLearnings)
 	resultStr := string(result)
 
 	if !strings.Contains(resultStr, "New") {
@@ -351,10 +353,10 @@ func TestInsertAfterHeader_WithCtxMarkers(t *testing.T) {
 
 func TestInsertAfterHeader_CtxMarkerWithoutClose(t *testing.T) {
 	// ctx marker start present but no close marker
-	content := "# Learnings\n" + config.CtxMarkerStart + "\nunclosed marker content\nExisting\n"
+	content := "# Learnings\n" + marker.CtxMarkerStart + "\nunclosed marker content\nExisting\n"
 	entry := "## New entry\n"
 
-	result := core.InsertAfterHeader(content, entry, config.HeadingLearnings)
+	result := core.InsertAfterHeader(content, entry, assets.HeadingLearnings)
 	resultStr := string(result)
 
 	if !strings.Contains(resultStr, "New entry") {
@@ -809,10 +811,10 @@ func TestContainsNewLine(t *testing.T) {
 }
 
 func TestStartsWithCtxMarker(t *testing.T) {
-	if !core.StartsWithCtxMarker(config.CtxMarkerStart + " rest") {
+	if !core.StartsWithCtxMarker(marker.CtxMarkerStart + " rest") {
 		t.Error("should detect CtxMarkerStart")
 	}
-	if !core.StartsWithCtxMarker(config.CtxMarkerEnd + " rest") {
+	if !core.StartsWithCtxMarker(marker.CtxMarkerEnd + " rest") {
 		t.Error("should detect CtxMarkerEnd")
 	}
 	if core.StartsWithCtxMarker("no marker here") {
diff --git a/internal/cli/add/cmd/doc.go b/internal/cli/add/cmd/doc.go
new file mode 100644
index 00000000..d6f4f4a7
--- /dev/null
+++ b/internal/cli/add/cmd/doc.go
@@ -0,0 +1,11 @@
+//   /    ctx:                         https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//   \    Copyright 2026-present Context contributors.
+//                 SPDX-License-Identifier: Apache-2.0
+
+// Package cmd provides coverage tests for the ctx add subcommand tree.
+//
+// It verifies that all registered subcommands are reachable and properly
+// wired into the add command group.
+package cmd
diff --git a/internal/cli/add/cmd/root/cmd.go b/internal/cli/add/cmd/root/cmd.go
index 5d098d7c..e279dc5b 100644
--- a/internal/cli/add/cmd/root/cmd.go
+++ b/internal/cli/add/cmd/root/cmd.go
@@ -43,7 +43,7 @@ func Cmd() *cobra.Command {
 		application  string
 	)
 
-	short, long := assets.CommandDesc("add")
+	short, long := assets.CommandDesc(assets.CmdDescKeyAdd)
 
 	cmd := &cobra.Command{
 		Use:       "add <type> [content]",
@@ -68,7 +68,7 @@ func Cmd() *cobra.Command {
 	cmd.Flags().StringVarP(
 		&priority,
 		"priority", "p", "",
-		assets.FlagDesc("add.priority"),
+		assets.FlagDesc(assets.FlagDescKeyAddPriority),
 	)
 	_ = cmd.RegisterFlagCompletionFunc(
 		"priority", func(_ *cobra.Command, _ []string, _ string) (
@@ -79,37 +79,37 @@ func Cmd() *cobra.Command {
 	cmd.Flags().StringVarP(
 		&section,
 		"section", "s", "",
-		assets.FlagDesc("add.section"),
+		assets.FlagDesc(assets.FlagDescKeyAddSection),
 	)
 	cmd.Flags().StringVarP(
 		&fromFile,
 		"file", "f", "",
-		assets.FlagDesc("add.file"),
+		assets.FlagDesc(assets.FlagDescKeyAddFile),
 	)
 	cmd.Flags().StringVarP(
 		&context,
 		"context", "c", "",
-		assets.FlagDesc("add.context"),
+		assets.FlagDesc(assets.FlagDescKeyAddContext),
 	)
 	cmd.Flags().StringVarP(
 		&rationale,
 		"rationale", "r", "",
-		assets.FlagDesc("add.rationale"),
+		assets.FlagDesc(assets.FlagDescKeyAddRationale),
 	)
 	cmd.Flags().StringVar(
 		&consequences,
 		"consequences", "",
-		assets.FlagDesc("add.consequences"),
+		assets.FlagDesc(assets.FlagDescKeyAddConsequences),
 	)
 	cmd.Flags().StringVarP(
 		&lesson,
 		"lesson", "l", "",
-		assets.FlagDesc("add.lesson"),
+		assets.FlagDesc(assets.FlagDescKeyAddLesson),
 	)
 	cmd.Flags().StringVarP(
 		&application,
 		"application", "a", "",
-		assets.FlagDesc("add.application"),
+		assets.FlagDesc(assets.FlagDescKeyAddApplication),
 	)
 
 	return cmd
diff --git a/internal/cli/add/cmd/root/doc.go b/internal/cli/add/cmd/root/doc.go
new file mode 100644
index 00000000..33ef2fae
--- /dev/null
+++ b/internal/cli/add/cmd/root/doc.go
@@ -0,0 +1,11 @@
+//   /    ctx:                         https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//   \    Copyright 2026-present Context contributors.
+//                 SPDX-License-Identifier: Apache-2.0
+
+// Package root implements the ctx add command.
+//
+// It appends new decisions, tasks, learnings, or conventions to the
+// appropriate context file.
+package root
diff --git a/internal/cli/add/cmd/root/run.go b/internal/cli/add/cmd/root/run.go
index 980f91ad..3e3f5d24 100644
--- a/internal/cli/add/cmd/root/run.go
+++ b/internal/cli/add/cmd/root/run.go
@@ -9,10 +9,11 @@ package root
 import (
 	"strings"
 
+	entry2 "github.com/ActiveMemory/ctx/internal/config/entry"
+	"github.com/ActiveMemory/ctx/internal/write/add"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/cli/add/core"
-	"github.com/ActiveMemory/ctx/internal/config"
 	"github.com/ActiveMemory/ctx/internal/entry"
 	"github.com/ActiveMemory/ctx/internal/write"
 )
@@ -38,7 +39,7 @@ func Run(cmd *cobra.Command, args []string, flags Config) error {
 
 	content, extractErr := core.ExtractContent(args, flags)
 	if extractErr != nil || content == "" {
-		return write.ErrNoContentProvided(fType, core.ExamplesForType(fType))
+		return add.ErrNoContentProvided(fType, core.ExamplesForType(fType))
 	}
 
 	params := entry.Params{
@@ -57,9 +58,9 @@ func Run(cmd *cobra.Command, args []string, flags Config) error {
 		return validateErr
 	}
 
-	fName, ok := config.FileType[fType]
+	fName, ok := entry2.ToCtxFile[fType]
 	if !ok {
-		return write.ErrUnknownType(fType)
+		return add.ErrUnknownType(fType)
 	}
 
 	if writeErr := entry.Write(params); writeErr != nil {
diff --git a/internal/cli/add/core/append.go b/internal/cli/add/core/append.go
index 3fd85de3..d1e4ba14 100644
--- a/internal/cli/add/core/append.go
+++ b/internal/cli/add/core/append.go
@@ -6,7 +6,7 @@
 
 package core
 
-import "github.com/ActiveMemory/ctx/internal/config"
+import "github.com/ActiveMemory/ctx/internal/assets"
 
 // AppendEntry inserts a formatted entry into existing file content.
 //
@@ -34,7 +34,7 @@ func AppendEntry(
 		return InsertTask(entry, existingStr, section)
 	// Decisions: insert before existing entries for reverse-chronological order
 	case FileTypeIsDecision(fileType):
-		return InsertDecision(existingStr, entry, config.HeadingDecisions)
+		return InsertDecision(existingStr, entry, assets.HeadingDecisions)
 	// Learnings: insert before existing entries for reverse-chronological order
 	case FileTypeIsLearning(fileType):
 		return InsertLearning(existingStr, entry)
diff --git a/internal/cli/add/core/before.go b/internal/cli/add/core/before.go
index c459cb83..0e040314 100644
--- a/internal/cli/add/core/before.go
+++ b/internal/cli/add/core/before.go
@@ -9,7 +9,8 @@ package core
 import (
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )
 
 // insertBeforeFirstEntry scans for the first "## [" marker not inside an
@@ -27,7 +28,7 @@ func insertBeforeFirstEntry(content, entry, header string) []byte {
 	search := content
 	offset := 0
 	for {
-		rel := strings.Index(search, config.HeadingLearningStart)
+		rel := strings.Index(search, assets.HeadingLearningStart)
 		if rel == -1 {
 			break
 		}
@@ -35,12 +36,12 @@ func insertBeforeFirstEntry(content, entry, header string) []byte {
 		if !IsInsideHTMLComment(content, entryIdx) {
 			return []byte(
 				content[:entryIdx] + entry +
-					config.NewlineLF + config.Separator +
-					config.NewlineLF + config.NewlineLF +
+					token.NewlineLF + token.Separator +
+					token.NewlineLF + token.NewlineLF +
 					content[entryIdx:],
 			)
 		}
-		offset = entryIdx + len(config.HeadingLearningStart)
+		offset = entryIdx + len(assets.HeadingLearningStart)
 		search = content[offset:]
 	}
 
diff --git a/internal/cli/add/core/content.go b/internal/cli/add/core/content.go
index 2cf8679a..01c7bbce 100644
--- a/internal/cli/add/core/content.go
+++ b/internal/cli/add/core/content.go
@@ -11,8 +11,8 @@ import (
 	"os"
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
-	"github.com/ActiveMemory/ctx/internal/write"
+	"github.com/ActiveMemory/ctx/internal/config/token"
+	"github.com/ActiveMemory/ctx/internal/write/add"
 )
 
 // ExtractContent retrieves content from various sources for adding entries.
@@ -34,7 +34,7 @@ func ExtractContent(args []string, flags Config) (string, error) {
 		// Read from the file
 		fileContent, err := os.ReadFile(flags.FromFile)
 		if err != nil {
-			return "", write.ErrFileRead(flags.FromFile, err)
+			return "", add.ErrFileRead(flags.FromFile, err)
 		}
 		return strings.TrimSpace(string(fileContent)), nil
 	}
@@ -54,9 +54,9 @@ func ExtractContent(args []string, flags Config) (string, error) {
 			lines = append(lines, scanner.Text())
 		}
 		if err := scanner.Err(); err != nil {
-			return "", write.ErrStdinRead(err)
+			return "", add.ErrStdinRead(err)
 		}
-		return strings.TrimSpace(strings.Join(lines, config.NewlineLF)), nil
+		return strings.TrimSpace(strings.Join(lines, token.NewlineLF)), nil
 	}
-	return "", write.ErrNoContent()
+	return "", add.ErrNoContent()
 }
diff --git a/internal/cli/add/core/example.go b/internal/cli/add/core/example.go
index bb68ee6d..21121f2e 100644
--- a/internal/cli/add/core/example.go
+++ b/internal/cli/add/core/example.go
@@ -8,7 +8,6 @@ package core
 
 import (
 	"github.com/ActiveMemory/ctx/internal/assets"
-	"github.com/ActiveMemory/ctx/internal/config"
 )
 
 // ExamplesForType returns example usage strings for a given entry type.
@@ -24,13 +23,7 @@ import (
 func ExamplesForType(fileType string) string {
 	const defaultKeyName = "default"
 
-	key := config.UserInputToEntry(fileType)
-
-	if key == "" {
-		key = defaultKeyName
-	}
-
-	if desc := assets.ExampleDesc(key); desc != "" {
+	if desc := assets.ExampleDesc(fileType); desc != "" {
 		return desc
 	}
 
diff --git a/internal/cli/add/core/fmt.go b/internal/cli/add/core/fmt.go
index 11ba86b4..7c0fc3db 100644
--- a/internal/cli/add/core/fmt.go
+++ b/internal/cli/add/core/fmt.go
@@ -10,7 +10,8 @@ import (
 	"fmt"
 	"time"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	time2 "github.com/ActiveMemory/ctx/internal/config/time"
 )
 
 // FormatTask formats a task entry as a Markdown checkbox item.
@@ -26,12 +27,12 @@ import (
 //   - string: Formatted task line with trailing newline
 func FormatTask(content string, priority string) string {
 	// Use YYYY-MM-DD-HHMMSS timestamp for session correlation
-	timestamp := time.Now().Format(config.TimestampCompact)
+	timestamp := time.Now().Format(time2.TimestampCompact)
 	var priorityTag string
 	if priority != "" {
-		priorityTag = fmt.Sprintf(config.TplTaskPriority, priority)
+		priorityTag = fmt.Sprintf(assets.TplTaskPriority, priority)
 	}
-	return fmt.Sprintf(config.TplTask, content, priorityTag, timestamp)
+	return fmt.Sprintf(assets.TplTask, content, priorityTag, timestamp)
 }
 
 // FormatLearning formats a learning entry as a structured Markdown section.
@@ -48,9 +49,9 @@ func FormatTask(content string, priority string) string {
 // Returns:
 //   - string: Formatted learning section with all fields
 func FormatLearning(title, context, lesson, application string) string {
-	timestamp := time.Now().Format(config.TimestampCompact)
+	timestamp := time.Now().Format(time2.TimestampCompact)
 	return fmt.Sprintf(
-		config.TplLearning, timestamp, title, context, lesson, application,
+		assets.TplLearning, timestamp, title, context, lesson, application,
 	)
 }
 
@@ -64,7 +65,7 @@ func FormatLearning(title, context, lesson, application string) string {
 // Returns:
 //   - string: Formatted convention line with trailing newline
 func FormatConvention(content string) string {
-	return fmt.Sprintf(config.TplConvention, content)
+	return fmt.Sprintf(assets.TplConvention, content)
 }
 
 // FormatDecision formats a decision entry as a structured Markdown section.
@@ -81,9 +82,9 @@ func FormatConvention(content string) string {
 // Returns:
 //   - string: Formatted decision section with all ADR fields
 func FormatDecision(title, context, rationale, consequences string) string {
-	timestamp := time.Now().Format(config.TimestampCompact)
+	timestamp := time.Now().Format(time2.TimestampCompact)
 	return fmt.Sprintf(
-		config.TplDecision,
+		assets.TplDecision,
 		timestamp, title, context, title, rationale, consequences,
 	)
 }
diff --git a/internal/cli/add/core/index.go b/internal/cli/add/core/index.go
index cc0d6695..7cf54273 100644
--- a/internal/cli/add/core/index.go
+++ b/internal/cli/add/core/index.go
@@ -10,7 +10,7 @@ package core
 // that delegate to the internal/index package.
 
 import (
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
 	"github.com/ActiveMemory/ctx/internal/index"
 )
 
@@ -72,7 +72,7 @@ func GenerateIndexTable(entries []IndexEntry, columnHeader string) string {
 // Returns:
 //   - string: Markdown table or empty string if no entries
 func GenerateIndex(entries []DecisionEntry) string {
-	return index.GenerateTable(entries, config.ColumnDecision)
+	return index.GenerateTable(entries, assets.ColumnDecision)
 }
 
 // UpdateIndex regenerates the decision index in DECISIONS.md content.
diff --git a/internal/cli/add/core/index_test.go b/internal/cli/add/core/index_test.go
index 32dc05ac..9851d8be 100644
--- a/internal/cli/add/core/index_test.go
+++ b/internal/cli/add/core/index_test.go
@@ -10,7 +10,7 @@ import (
 	"strings"
 	"testing"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/marker"
 )
 
 // TestDelegation verifies that the wrapper functions correctly delegate
@@ -52,7 +52,7 @@ func TestDelegation(t *testing.T) {
 
 	// Test UpdateIndex delegation
 	updated := UpdateIndex(content)
-	if !strings.Contains(updated, config.IndexStart) {
+	if !strings.Contains(updated, marker.IndexStart) {
 		t.Error("UpdateIndex() missing INDEX:START marker")
 	}
 
@@ -68,7 +68,7 @@ func TestDelegation(t *testing.T) {
 **Application**: Test
 `
 	updatedLearning := UpdateLearningsIndex(learningContent)
-	if !strings.Contains(updatedLearning, config.IndexStart) {
+	if !strings.Contains(updatedLearning, marker.IndexStart) {
 		t.Error("UpdateLearningsIndex() missing INDEX:START marker")
 	}
 	if !strings.Contains(updatedLearning, "| Date | Learning |") {
diff --git a/internal/cli/add/core/insert.go b/internal/cli/add/core/insert.go
index ab5f196a..4dec040b 100644
--- a/internal/cli/add/core/insert.go
+++ b/internal/cli/add/core/insert.go
@@ -9,7 +9,9 @@ package core
 import (
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/marker"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )
 
 // InsertAfterHeader finds a header line and inserts content after it.
@@ -50,7 +52,7 @@ func InsertAfterHeader(content, entry, header string) []byte {
 		}
 
 		// Not an HTML comment: we found the insertion point.
-		if !strings.HasPrefix(content[insertPoint:], config.CommentOpen) {
+		if !strings.HasPrefix(content[insertPoint:], marker.CommentOpen) {
 			break
 		}
 
@@ -60,7 +62,7 @@ func InsertAfterHeader(content, entry, header string) []byte {
 			break
 		}
 
-		insertPoint += endIdx + len(config.CommentClose)
+		insertPoint += endIdx + len(marker.CommentClose)
 		insertPoint = SkipWhitespace(content, insertPoint)
 	}
 
@@ -79,9 +81,9 @@ func InsertAfterHeader(content, entry, header string) []byte {
 //   - []byte: Content with entry appended
 func AppendAtEnd(content, entry string) []byte {
 	if !EndsWithNewline(content) {
-		content += config.NewlineLF
+		content += token.NewlineLF
 	}
-	return []byte(content + config.NewlineLF + entry)
+	return []byte(content + token.NewlineLF + entry)
 }
 
 // InsertTask inserts a task entry into TASKS.md.
@@ -105,17 +107,17 @@ func InsertTask(entry, existingStr, section string) []byte {
 	}
 
 	// Default: insert before the first unchecked task.
-	pendingIdx := strings.Index(existingStr, config.PrefixTaskUndone)
+	pendingIdx := strings.Index(existingStr, marker.PrefixTaskUndone)
 	if pendingIdx != -1 {
 		return []byte(existingStr[:pendingIdx] + entry +
-			config.NewlineLF + existingStr[pendingIdx:])
+			token.NewlineLF + existingStr[pendingIdx:])
 	}
 
 	// No unchecked tasks: append at the end.
 	if !EndsWithNewline(existingStr) {
-		existingStr += config.NewlineLF
+		existingStr += token.NewlineLF
 	}
-	return []byte(existingStr + config.NewlineLF + entry)
+	return []byte(existingStr + token.NewlineLF + entry)
 }
 
 // InsertTaskAfterSection inserts a task after a named section header.
@@ -137,20 +139,20 @@ func InsertTaskAfterSection(entry, content, section string) []byte {
 	found, idx := Contains(content, header)
 	if !found {
 		if !EndsWithNewline(content) {
-			content += config.NewlineLF
+			content += token.NewlineLF
 		}
-		return []byte(content + config.NewlineLF + entry)
+		return []byte(content + token.NewlineLF + entry)
 	}
 
 	hasNewLine, lineEnd := ContainsNewLine(content[idx:])
 	if hasNewLine {
 		insertPoint := idx + lineEnd
 		insertPoint = SkipNewline(content, insertPoint)
-		return []byte(content[:insertPoint] + config.NewlineLF +
+		return []byte(content[:insertPoint] + token.NewlineLF +
 			entry + content[insertPoint:])
 	}
 
-	return []byte(content + config.NewlineLF + entry)
+	return []byte(content + token.NewlineLF + entry)
 }
 
 // IsInsideHTMLComment reports whether the position idx in content falls
@@ -164,19 +166,19 @@ func InsertTaskAfterSection(entry, content, section string) []byte {
 //   - bool: True if idx is between a <!-- and its matching -->
 func IsInsideHTMLComment(content string, idx int) bool {
 	// Find the last <!-- before idx; check whether a --> closes that block before idx
-	closeIdx := strings.Index(content[openIdx:], config.CommentClose)
+	closeIdx := strings.Index(content[openIdx:], marker.CommentClose)
 	if closeIdx == -1 {
 		// Unclosed comment — treat as inside
 		return true
 	}
 	// The comment closes at openIdx+closeIdx; if that position is >= idx,
 	// the position is still inside the comment.
-	return openIdx+closeIdx+len(config.CommentClose) > idx
+	return openIdx+closeIdx+len(marker.CommentClose) > idx
 }
 
 // InsertDecision inserts a decision entry before existing entries.
@@ -209,5 +211,5 @@ func InsertDecision(content, entry, header string) []byte {
 // Returns:
 //   - []byte: Modified content with entry inserted
 func InsertLearning(content, entry string) []byte {
-	return insertBeforeFirstEntry(content, entry, config.HeadingLearnings)
+	return insertBeforeFirstEntry(content, entry, assets.HeadingLearnings)
 }
diff --git a/internal/cli/add/core/normalize.go b/internal/cli/add/core/normalize.go
index c0dfa45d..97c418aa 100644
--- a/internal/cli/add/core/normalize.go
+++ b/internal/cli/add/core/normalize.go
@@ -9,7 +9,7 @@ package core
 import (
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )
 
 // CheckRequired returns the names of any fields whose values are empty.
@@ -41,8 +41,8 @@ func CheckRequired(fields [][2]string) []string {
 // Returns:
 //   - string: Normalized section heading (e.g., "## Phase 1")
 func NormalizeTargetSection(section string) string {
-	if !strings.HasPrefix(section, config.HeadingLevelTwoStart) {
-		return config.HeadingLevelTwoStart + section
+	if !strings.HasPrefix(section, token.HeadingLevelTwoStart) {
+		return token.HeadingLevelTwoStart + section
 	}
 	return section
 }
diff --git a/internal/cli/add/core/pos.go b/internal/cli/add/core/pos.go
index 5195cc2b..4cd40c7b 100644
--- a/internal/cli/add/core/pos.go
+++ b/internal/cli/add/core/pos.go
@@ -6,7 +6,9 @@
 
 package core
 
-import "github.com/ActiveMemory/ctx/internal/config"
+import (
+	"github.com/ActiveMemory/ctx/internal/config/token"
+)
 
 // SkipNewline advances pos past a newline (CRLF or LF) if present.
 //
@@ -20,12 +22,12 @@ func SkipNewline(s string, pos int) int {
 	if pos >= len(s) {
 		return pos
 	}
-	if pos+len(config.NewlineCRLF) <= len(s) &&
-		s[pos] == config.NewlineCRLF[0] && s[pos+1] == config.NewlineCRLF[1] {
-		return pos + len(config.NewlineCRLF)
+	if pos+len(token.NewlineCRLF) <= len(s) &&
+		s[pos] == token.NewlineCRLF[0] && s[pos+1] == token.NewlineCRLF[1] {
+		return pos + len(token.NewlineCRLF)
 	}
-	if s[pos] == config.NewlineLF[0] {
-		return pos + len(config.NewlineLF)
+	if s[pos] == token.NewlineLF[0] {
+		return pos + len(token.NewlineLF)
 	}
 	return pos
 }
@@ -42,7 +44,7 @@ func SkipWhitespace(s string, pos int) int {
 	for pos < len(s) {
 		if n := SkipNewline(s, pos); n > pos {
 			pos = n
-		} else if s[pos] == config.Space[0] || s[pos] == config.Tab[0] {
+		} else if s[pos] == token.Space[0] || s[pos] == token.Tab[0] {
 			pos++
 		} else {
 			break
@@ -60,11 +62,11 @@ func SkipWhitespace(s string, pos int) int {
 //   - int: Index of the first newline (-1 if not found)
 func FindNewline(s string) int {
 	for i := 0; i < len(s); i++ {
-		if i+len(config.NewlineCRLF) <= len(s) &&
-			s[i] == config.NewlineCRLF[0] && s[i+1] == config.NewlineCRLF[1] {
+		if i+len(token.NewlineCRLF) <= len(s) &&
+			s[i] == token.NewlineCRLF[0] && s[i+1] == token.NewlineCRLF[1] {
 			return i
 		}
-		if s[i] == config.NewlineLF[0] {
+		if s[i] == token.NewlineLF[0] {
 			return i
 		}
 	}
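The newline-skipping logic in pos.go checks CRLF before LF so the `\r` is never left behind. A minimal standalone sketch (using hypothetical stand-ins for the `token` package constants, which are assumed here to be `"\r\n"` and `"\n"`):

```go
package main

import "fmt"

// Hypothetical stand-ins for token.NewlineCRLF and token.NewlineLF.
const (
	newlineCRLF = "\r\n"
	newlineLF   = "\n"
)

// skipNewline mirrors the shape of core.SkipNewline: advance past one
// newline at pos, if present. CRLF must be tested first, otherwise the
// LF branch would consume only the \n of a \r\n pair.
func skipNewline(s string, pos int) int {
	if pos >= len(s) {
		return pos
	}
	if pos+len(newlineCRLF) <= len(s) &&
		s[pos] == newlineCRLF[0] && s[pos+1] == newlineCRLF[1] {
		return pos + len(newlineCRLF)
	}
	if s[pos] == newlineLF[0] {
		return pos + len(newlineLF)
	}
	return pos
}

func main() {
	fmt.Println(skipNewline("a\r\nb", 1)) // CRLF: advances by 2
	fmt.Println(skipNewline("a\nb", 1))   // LF: advances by 1
	fmt.Println(skipNewline("abc", 1))    // no newline: position unchanged
}
```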
diff --git a/internal/cli/add/core/predicate.go b/internal/cli/add/core/predicate.go
index 62dce4ca..173bf931 100644
--- a/internal/cli/add/core/predicate.go
+++ b/internal/cli/add/core/predicate.go
@@ -6,17 +6,19 @@
 
 package core
 
-import "github.com/ActiveMemory/ctx/internal/config"
+import (
+	"github.com/ActiveMemory/ctx/internal/config/entry"
+)
 
 // FileTypeIsTask reports whether fileType represents a task entry.
 //
 // Parameters:
-//   - fileType: The type string to check (e.g., "task", "tasks")
+//   - fileType: The type string to check
 //
 // Returns:
 //   - bool: True if fileType is a task type
 func FileTypeIsTask(fileType string) bool {
-	return config.UserInputToEntry(fileType) == config.EntryTask
+	return entry.FromUserInput(fileType) == entry.Task
 }
 
 // FileTypeIsDecision reports whether fileType represents a decision entry.
@@ -27,7 +29,7 @@ func FileTypeIsTask(fileType string) bool {
 // Returns:
 //   - bool: True if fileType is a decision type
 func FileTypeIsDecision(fileType string) bool {
-	return config.UserInputToEntry(fileType) == config.EntryDecision
+	return entry.FromUserInput(fileType) == entry.Decision
 }
 
 // FileTypeIsLearning reports whether fileType represents a learning entry.
@@ -38,5 +40,5 @@ func FileTypeIsDecision(fileType string) bool {
 // Returns:
 //   - bool: True if fileType is a learning type
 func FileTypeIsLearning(fileType string) bool {
-	return config.UserInputToEntry(fileType) == config.EntryLearning
+	return entry.FromUserInput(fileType) == entry.Learning
 }
diff --git a/internal/cli/add/core/strings.go b/internal/cli/add/core/strings.go
index a10ad0ee..b908f400 100644
--- a/internal/cli/add/core/strings.go
+++ b/internal/cli/add/core/strings.go
@@ -9,7 +9,8 @@ package core
 import (
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/marker"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )
 
 // EndsWithNewline reports whether s ends with a newline (CRLF or LF).
@@ -20,8 +21,8 @@ import (
 // Returns:
 //   - bool: True if s ends with a newline
 func EndsWithNewline(s string) bool {
-	return strings.HasSuffix(s, config.NewlineCRLF) ||
-		strings.HasSuffix(s, config.NewlineLF)
+	return strings.HasSuffix(s, token.NewlineCRLF) ||
+		strings.HasSuffix(s, token.NewlineLF)
 }
 
 // Contains reports whether content contains the header and returns its index.
@@ -61,7 +62,7 @@ func ContainsNewLine(content string) (bool, int) {
 //   - bool: True if comment close marker is found
 //   - int: Index of marker (-1 if not found)
 func ContainsEndComment(content string) (bool, int) {
-	commentEnd := strings.Index(content, config.CommentClose)
+	commentEnd := strings.Index(content, marker.CommentClose)
 	return commentEnd != -1, commentEnd
 }
 
@@ -73,6 +74,6 @@ func ContainsEndComment(content string) (bool, int) {
 // Returns:
 //   - bool: True if s starts with CtxMarkerStart or CtxMarkerEnd
 func StartsWithCtxMarker(s string) bool {
-	return strings.HasPrefix(s, config.CtxMarkerStart) ||
-		strings.HasPrefix(s, config.CtxMarkerEnd)
+	return strings.HasPrefix(s, marker.CtxMarkerStart) ||
+		strings.HasPrefix(s, marker.CtxMarkerEnd)
 }
diff --git a/internal/cli/agent/agent.go b/internal/cli/agent/agent.go
index 8fef5618..ef4806f9 100644
--- a/internal/cli/agent/agent.go
+++ b/internal/cli/agent/agent.go
@@ -7,68 +7,12 @@
 package agent
 
 import (
-	"time"
-
 	"github.com/spf13/cobra"
 
-	"github.com/ActiveMemory/ctx/internal/assets"
 	agentroot "github.com/ActiveMemory/ctx/internal/cli/agent/cmd/root"
-	"github.com/ActiveMemory/ctx/internal/cli/agent/core"
-	"github.com/ActiveMemory/ctx/internal/config"
-	"github.com/ActiveMemory/ctx/internal/rc"
 )
 
 // Cmd returns the "ctx agent" command for generating AI-ready context packets.
-//
-// The command reads context files from .context/ and outputs a concise packet
-// optimized for AI consumption, including constitution rules, active tasks,
-// conventions, and recent decisions.
-//
-// Flags:
-//   - --budget: Token budget for the context packet (default 8000)
-//   - --format: Output format, "md" for Markdown or "json" (default "md")
-//   - --cooldown: Suppress repeated output within this duration (default 10m)
-//   - --session: Session identifier for cooldown tombstone isolation
-//
-// Returns:
-//   - *cobra.Command: Configured agent command with flags registered
 func Cmd() *cobra.Command {
-	var (
-		budget   int
-		format   string
-		cooldown time.Duration
-		session  string
-	)
-
-	short, long := assets.CommandDesc("agent")
-
-	cmd := &cobra.Command{
-		Use:   "agent",
-		Short: short,
-		Long:  long,
-		RunE: func(cmd *cobra.Command, args []string) error {
-			if !cmd.Flags().Changed("budget") {
-				budget = rc.TokenBudget()
-			}
-			return agentroot.Run(cmd, budget, format, cooldown, session)
-		},
-	}
-
-	cmd.Flags().IntVar(
-		&budget,
-		"budget", rc.DefaultTokenBudget, assets.FlagDesc("agent.budget"),
-	)
-	cmd.Flags().StringVar(
-		&format, "format", config.FormatMarkdown, assets.FlagDesc("agent.format"),
-	)
-	cmd.Flags().DurationVar(
-		&cooldown, "cooldown", core.DefaultCooldown,
-		assets.FlagDesc("agent.cooldown"),
-	)
-	cmd.Flags().StringVar(
-		&session, "session", "",
-		assets.FlagDesc("agent.session"),
-	)
-
-	return cmd
+	return agentroot.Cmd()
 }
diff --git a/internal/cli/agent/cmd/root/cmd.go b/internal/cli/agent/cmd/root/cmd.go
index d37451a9..dfb6c407 100644
--- a/internal/cli/agent/cmd/root/cmd.go
+++ b/internal/cli/agent/cmd/root/cmd.go
@@ -5,3 +5,69 @@
 //                 SPDX-License-Identifier: Apache-2.0
 
 package root
+
+import (
+	"time"
+
+	"github.com/ActiveMemory/ctx/internal/config/fmt"
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/cli/agent/core"
+	"github.com/ActiveMemory/ctx/internal/rc"
+)
+
+// Cmd returns the "ctx agent" command for generating AI-ready context packets.
+//
+// The command reads context files from .context/ and outputs a concise packet
+// optimized for AI consumption, including constitution rules, active tasks,
+// conventions, and recent decisions.
+//
+// Flags:
+//   - --budget: Token budget for the context packet (default 8000)
+//   - --format: Output format, "md" for Markdown or "json" (default "md")
+//   - --cooldown: Suppress repeated output within this duration (default 10m)
+//   - --session: Session identifier for cooldown tombstone isolation
+//
+// Returns:
+//   - *cobra.Command: Configured agent command with flags registered
+func Cmd() *cobra.Command {
+	var (
+		budget   int
+		format   string
+		cooldown time.Duration
+		session  string
+	)
+
+	short, long := assets.CommandDesc(assets.CmdDescKeyAgent)
+
+	cmd := &cobra.Command{
+		Use:   "agent",
+		Short: short,
+		Long:  long,
+		RunE: func(cmd *cobra.Command, args []string) error {
+			if !cmd.Flags().Changed("budget") {
+				budget = rc.TokenBudget()
+			}
+			return Run(cmd, budget, format, cooldown, session)
+		},
+	}
+
+	cmd.Flags().IntVar(
+		&budget,
+		"budget", rc.DefaultTokenBudget, assets.FlagDesc(assets.FlagDescKeyAgentBudget),
+	)
+	cmd.Flags().StringVar(
+		&format, "format", fmt.FormatMarkdown, assets.FlagDesc(assets.FlagDescKeyAgentFormat),
+	)
+	cmd.Flags().DurationVar(
+		&cooldown, "cooldown", core.DefaultCooldown,
+		assets.FlagDesc(assets.FlagDescKeyAgentCooldown),
+	)
+	cmd.Flags().StringVar(
+		&session, "session", "",
+		assets.FlagDesc(assets.FlagDescKeyAgentSession),
+	)
+
+	return cmd
+}
diff --git a/internal/cli/agent/cmd/root/doc.go b/internal/cli/agent/cmd/root/doc.go
new file mode 100644
index 00000000..a8eb9460
--- /dev/null
+++ b/internal/cli/agent/cmd/root/doc.go
@@ -0,0 +1,11 @@
+//   /    ctx:                         https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//   \    Copyright 2026-present Context contributors.
+//                 SPDX-License-Identifier: Apache-2.0
+
+// Package root implements the ctx agent command.
+//
+// It prints a concise, AI-ready context packet assembled from .context/
+// files within a configurable token budget.
+package root
diff --git a/internal/cli/agent/cmd/root/run.go b/internal/cli/agent/cmd/root/run.go
index fefe720f..a7e55010 100644
--- a/internal/cli/agent/cmd/root/run.go
+++ b/internal/cli/agent/cmd/root/run.go
@@ -8,19 +8,16 @@ package root
 
 import (
 	"errors"
-	"fmt"
 	"time"
 
+	"github.com/ActiveMemory/ctx/internal/config/fmt"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/cli/agent/core"
-	"github.com/ActiveMemory/ctx/internal/config"
 	"github.com/ActiveMemory/ctx/internal/context"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 )
 
-// DefaultCooldown is re-exported from core for backward compatibility.
-const DefaultCooldown = core.DefaultCooldown
-
 // Run executes the agent command logic.
 //
 // When a session and cooldown are provided, it checks a tombstone file
@@ -54,15 +51,13 @@ func Run(
 	if err != nil {
 		var notFoundError *context.NotFoundError
 		if errors.As(err, &notFoundError) {
-			return fmt.Errorf(
-				"no .context/ directory found. Run 'ctx init' first",
-			)
+			return ctxerr.NotInitialized()
 		}
 		return err
 	}
 
 	var outputErr error
-	if format == config.FormatJSON {
+	if format == fmt.FormatJSON {
 		outputErr = core.OutputAgentJSON(cmd, ctx, budget)
 	} else {
 		outputErr = core.OutputAgentMarkdown(cmd, ctx, budget)
diff --git a/internal/cli/agent/core/budget.go b/internal/cli/agent/core/budget.go
index 754dc066..b6ae2e80 100644
--- a/internal/cli/agent/core/budget.go
+++ b/internal/cli/agent/core/budget.go
@@ -11,16 +11,15 @@ import (
 	"strings"
 	"time"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/agent"
+	ctxCfg "github.com/ActiveMemory/ctx/internal/config/ctx"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"github.com/ActiveMemory/ctx/internal/context"
 	"github.com/ActiveMemory/ctx/internal/index"
 )
 
-// Budget tier allocation percentages.
-const (
-	TaskBudgetPct       = 0.40
-	ConventionBudgetPct = 0.20
-)
+// Budget tier allocation percentages are defined in the config/agent package.
 
 // AssembledPacket holds the budget-aware output sections ready for rendering.
 //
@@ -54,7 +53,7 @@ type AssembledPacket struct {
 //   - Tier 1 (always): constitution, read order, instruction
 //   - Tier 2 (40%): active tasks
 //   - Tier 3 (20%): conventions
-//   - Tier 4+5 (remaining): decisions + learnings, scored by relevance
+//   - Tier 4+5 (remaining): decisions and learnings, scored by relevance
 //
 // Parameters:
 //   - ctx: Loaded context containing the files
@@ -65,10 +64,8 @@ type AssembledPacket struct {
 func AssembleBudgetPacket(ctx *context.Context, budget int) *AssembledPacket {
 	now := time.Now()
 	pkt := &AssembledPacket{
-		Budget: budget,
-		Instruction: "Before starting work, confirm to the user: " +
-			"\"I have read the required context files and " +
-			"I'm following project conventions.\"",
+		Budget:      budget,
+		Instruction: assets.TextDesc(assets.TextDescKeyAgentInstruction),
 	}
 
 	remaining := budget
@@ -87,8 +84,8 @@ func AssembleBudgetPacket(ctx *context.Context, budget int) *AssembledPacket {
 		return pkt
 	}
 
-	// Tier 2: Tasks (up to 40% of original budget)
-	taskCap := int(float64(budget) * TaskBudgetPct)
+	// Tier 2: Tasks (up to 40% of the original budget)
+	taskCap := int(float64(budget) * agent.TaskBudgetPct)
 	allTasks := ExtractActiveTasks(ctx)
 	pkt.Tasks = FitItemsInBudget(allTasks, taskCap)
 	taskTokens := EstimateSliceTokens(pkt.Tasks)
@@ -99,8 +96,8 @@ func AssembleBudgetPacket(ctx *context.Context, budget int) *AssembledPacket {
 		return pkt
 	}
 
-	// Tier 3: Conventions (up to 20% of original budget)
-	convCap := int(float64(budget) * ConventionBudgetPct)
+	// Tier 3: Conventions (up to 20% of the original budget)
+	convCap := int(float64(budget) * agent.ConventionBudgetPct)
 	allConventions := ExtractAllConventions(ctx)
 	pkt.Conventions = FitItemsInBudget(allConventions, convCap)
 	convTokens := EstimateSliceTokens(pkt.Conventions)
@@ -115,13 +112,13 @@ func AssembleBudgetPacket(ctx *context.Context, budget int) *AssembledPacket {
 	keywords := ExtractTaskKeywords(pkt.Tasks)
 
 	// Tier 4+5: Decisions + Learnings (share remaining budget)
-	decisionBlocks := ParseEntryBlocks(ctx, config.FileDecision)
-	learningBlocks := ParseEntryBlocks(ctx, config.FileLearning)
+	decisionBlocks := ParseEntryBlocks(ctx, ctxCfg.Decision)
+	learningBlocks := ParseEntryBlocks(ctx, ctxCfg.Learning)
 
 	scoredDecisions := ScoreEntries(decisionBlocks, keywords, now)
 	scoredLearnings := ScoreEntries(learningBlocks, keywords, now)
 
-	// Split remaining budget: proportional to content size, minimum 30% each
+	// Split the remaining budget: proportional to content size, minimum 30% each
 	decTokens, learnTokens := SplitBudget(
 		remaining, scoredDecisions, scoredLearnings,
 	)
@@ -149,7 +146,7 @@ func AssembleBudgetPacket(ctx *context.Context, budget int) *AssembledPacket {
 // Returns:
 //   - []string: All convention bullet items; nil if the file is not found
 func ExtractAllConventions(ctx *context.Context) []string {
-	if f := ctx.File(config.FileConvention); f != nil {
+	if f := ctx.File(ctxCfg.Convention); f != nil {
 		return ExtractBulletItems(string(f.Content), 1000)
 	}
 	return nil
@@ -159,10 +156,10 @@ func ExtractAllConventions(ctx *context.Context) []string {
 //
 // Parameters:
 //   - ctx: Loaded context
-//   - fileName: Name of the file to parse (e.g., config.FileDecision)
+//   - fileName: Name of the file to parse (e.g., ctxCfg.Decision)
 //
 // Returns:
-//   - []index.EntryBlock: Parsed entry blocks; nil if file not found
+//   - []index.EntryBlock: Parsed entry blocks; nil if the file is not found
 func ParseEntryBlocks(ctx *context.Context, fileName string) []index.EntryBlock {
 	if f := ctx.File(fileName); f != nil {
 		return index.ParseEntryBlocks(string(f.Content))
@@ -220,7 +217,7 @@ func SplitBudget(total int, a, b []ScoredEntry) (int, int) {
 
 // FillSection selects scored entries to fill a budget, with graceful degradation.
 //
-// Includes full entries by score order until ~80% of budget is consumed.
+// Includes full entries by score order until ~80% of the budget is consumed.
 // Remaining entries get title-only summaries.
 //
 // Parameters:
@@ -329,7 +326,7 @@ func TotalEntryTokens(entries []ScoredEntry) int {
 //   - string: Formatted Markdown output
 func RenderMarkdownPacket(pkt *AssembledPacket) string {
 	var sb strings.Builder
-	nl := config.NewlineLF
+	nl := token.NewlineLF
 
 	sb.WriteString("# Context Packet" + nl)
 	sb.WriteString(
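The tiered allocation in AssembleBudgetPacket is simple percentage arithmetic: tasks get up to 40% of the budget, conventions up to 20%, and decisions plus learnings share whatever is left after actual usage. A sketch of that split, assuming the `agent` constants hold the 40%/20% values described above (the names here are illustrative stand-ins):

```go
package main

import "fmt"

// Assumed values for agent.TaskBudgetPct and agent.ConventionBudgetPct,
// matching the 40%/20% tiers described in the doc comments.
const (
	taskBudgetPct       = 0.40
	conventionBudgetPct = 0.20
)

// tierCaps shows how the token budget is carved up: taskCap and convCap
// are upper bounds, while remaining reflects tokens actually consumed,
// so under-used tiers leave more room for decisions and learnings.
func tierCaps(budget, taskTokens, convTokens int) (taskCap, convCap, remaining int) {
	taskCap = int(float64(budget) * taskBudgetPct)
	convCap = int(float64(budget) * conventionBudgetPct)
	remaining = budget - taskTokens - convTokens
	return
}

func main() {
	taskCap, convCap, remaining := tierCaps(8000, 2500, 1200)
	fmt.Println(taskCap, convCap, remaining) // 3200 1600 4300
}
```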
diff --git a/internal/cli/agent/core/cooldown.go b/internal/cli/agent/core/cooldown.go
index 84c47a88..8d71f56c 100644
--- a/internal/cli/agent/core/cooldown.go
+++ b/internal/cli/agent/core/cooldown.go
@@ -11,16 +11,14 @@ import (
 	"path/filepath"
 	"time"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/agent"
+	"github.com/ActiveMemory/ctx/internal/config/dir"
 	"github.com/ActiveMemory/ctx/internal/rc"
 )
 
 // DefaultCooldown is the default cooldown duration between context packet
 // emissions within the same session.
-const DefaultCooldown = 10 * time.Minute
-
-// tombstonePrefix is the filename prefix for cooldown tombstone files.
-const tombstonePrefix = "ctx-agent-"
+const DefaultCooldown = agent.DefaultCooldown
 
 // CooldownActive checks whether the cooldown tombstone for the given
 // session is still fresh.
@@ -62,7 +60,7 @@ func TouchTombstone(session string) {
 // Returns:
 //   - string: absolute path in the system temp directory
 func TombstonePath(session string) string {
-	stateDir := filepath.Join(rc.ContextDir(), config.DirState)
+	stateDir := filepath.Join(rc.ContextDir(), dir.State)
 	_ = os.MkdirAll(stateDir, 0o750)
-	return filepath.Join(stateDir, tombstonePrefix+session)
+	return filepath.Join(stateDir, agent.TombstonePrefix+session)
 }
diff --git a/internal/cli/agent/core/extract.go b/internal/cli/agent/core/extract.go
index 701a77c7..723c87d8 100644
--- a/internal/cli/agent/core/extract.go
+++ b/internal/cli/agent/core/extract.go
@@ -9,7 +9,9 @@ package core
 import (
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	ctxCfg "github.com/ActiveMemory/ctx/internal/config/ctx"
+	"github.com/ActiveMemory/ctx/internal/config/regex"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"github.com/ActiveMemory/ctx/internal/context"
 	"github.com/ActiveMemory/ctx/internal/task"
 )
@@ -25,7 +27,7 @@ import (
 // Returns:
 //   - []string: Bullet item text without the "- " prefix
 func ExtractBulletItems(content string, limit int) []string {
-	matches := config.RegExBulletItem.FindAllStringSubmatch(content, -1)
+	matches := regex.BulletItem.FindAllStringSubmatch(content, -1)
 	items := make([]string, 0, limit)
 	for i, m := range matches {
 		if i >= limit {
@@ -33,7 +35,7 @@ func ExtractBulletItems(content string, limit int) []string {
 		}
 		text := strings.TrimSpace(m[1])
 		// Skip empty or header-only items
-		if text != "" && !strings.HasPrefix(text, "#") {
+		if text != "" && !strings.HasPrefix(text, token.PrefixHeading) {
 			items = append(items, text)
 		}
 	}
@@ -50,7 +52,7 @@ func ExtractBulletItems(content string, limit int) []string {
 // Returns:
 //   - []string: Text content of each checkbox item
 func ExtractCheckboxItems(content string) []string {
-	matches := config.RegExTask.FindAllStringSubmatch(content, -1)
+	matches := regex.Task.FindAllStringSubmatch(content, -1)
 	items := make([]string, 0, len(matches))
 	for _, m := range matches {
 		items = append(items, strings.TrimSpace(task.Content(m)))
@@ -66,7 +68,7 @@ func ExtractCheckboxItems(content string) []string {
 // Returns:
 //   - []string: List of constitution rules; nil if the file is not found
 func ExtractConstitutionRules(ctx *context.Context) []string {
-	if f := ctx.File(config.FileConstitution); f != nil {
+	if f := ctx.File(ctxCfg.Constitution); f != nil {
 		return ExtractCheckboxItems(string(f.Content))
 	}
 	return nil
@@ -83,7 +85,7 @@ func ExtractConstitutionRules(ctx *context.Context) []string {
 // Returns:
 //   - []string: Unchecked task items with "- [ ]" prefix
 func ExtractUncheckedTasks(content string) []string {
-	matches := config.RegExTaskMultiline.FindAllStringSubmatch(content, -1)
+	matches := regex.TaskMultiline.FindAllStringSubmatch(content, -1)
 	items := make([]string, 0, len(matches))
 	for _, m := range matches {
 		if task.Pending(m) {
@@ -102,7 +104,7 @@ func ExtractUncheckedTasks(content string) []string {
 //   - []string: List of active tasks with "- [ ]" prefix; nil if
 //     the file is not found
 func ExtractActiveTasks(ctx *context.Context) []string {
-	if f := ctx.File(config.FileTask); f != nil {
+	if f := ctx.File(ctxCfg.Task); f != nil {
 		return ExtractUncheckedTasks(string(f.Content))
 	}
 	return nil
diff --git a/internal/cli/agent/core/score.go b/internal/cli/agent/core/score.go
index 83c023fb..a9769d79 100644
--- a/internal/cli/agent/core/score.go
+++ b/internal/cli/agent/core/score.go
@@ -10,6 +10,9 @@ import (
 	"strings"
 	"time"
 
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/agent"
+	time2 "github.com/ActiveMemory/ctx/internal/config/time"
 	"github.com/ActiveMemory/ctx/internal/context"
 	"github.com/ActiveMemory/ctx/internal/index"
 )
@@ -41,20 +44,20 @@ type ScoredEntry struct {
 // Returns:
 //   - float64: Recency score between 0.2 and 1.0
 func RecencyScore(eb *index.EntryBlock, now time.Time) float64 {
-	entryDate, err := time.ParseInLocation("2006-01-02", eb.Entry.Date, time.Local)
+	entryDate, err := time.ParseInLocation(time2.DateFormat, eb.Entry.Date, time.Local)
 	if err != nil {
 		return 0.2
 	}
 	days := int(now.Sub(entryDate).Hours() / 24)
 	switch {
-	case days <= 7:
-		return 1.0
-	case days <= 30:
-		return 0.7
-	case days <= 90:
-		return 0.4
+	case days <= agent.RecencyDaysWeek:
+		return agent.RecencyScoreWeek
+	case days <= agent.RecencyDaysMonth:
+		return agent.RecencyScoreMonth
+	case days <= agent.RecencyDaysQuarter:
+		return agent.RecencyScoreQuarter
 	default:
-		return 0.2
+		return agent.RecencyScoreOld
 	}
 }
 
@@ -80,10 +83,10 @@ func RelevanceScore(eb *index.EntryBlock, keywords []string) float64 {
 			matches++
 		}
 	}
-	if matches >= 3 {
+	if matches >= agent.RelevanceMatchCap {
 		return 1.0
 	}
-	return float64(matches) / 3.0
+	return float64(matches) / float64(agent.RelevanceMatchCap)
 }
 
 // ScoreEntry computes the combined relevance score for an entry block.
@@ -105,21 +108,9 @@ func ScoreEntry(eb *index.EntryBlock, keywords []string, now time.Time) float64
 	return RecencyScore(eb, now) + RelevanceScore(eb, keywords)
 }
 
-// StopWords is a set of common English words to exclude from keyword extraction.
-var StopWords = map[string]bool{
-	"the": true, "and": true, "for": true, "that": true, "this": true,
-	"with": true, "from": true, "are": true, "was": true, "were": true,
-	"been": true, "have": true, "has": true, "had": true, "but": true,
-	"not": true, "you": true, "all": true, "can": true, "her": true,
-	"his": true, "she": true, "its": true, "our": true, "they": true,
-	"will": true, "each": true, "make": true, "like": true, "use": true,
-	"way": true, "may": true, "any": true, "into": true, "when": true,
-	"which": true, "their": true, "about": true, "would": true,
-	"there": true, "what": true, "also": true, "should": true,
-	"after": true, "before": true, "than": true, "then": true,
-	"them": true, "could": true, "more": true, "some": true,
-	"other": true, "only": true, "just": true, "see": true,
-	"add": true, "new": true, "update": true, "how": true,
+// stopWords returns the set of stop words from assets.
+func stopWords() map[string]bool {
+	return assets.StopWords()
 }
 
 // ExtractTaskKeywords extracts meaningful keywords from task text.
@@ -142,7 +133,7 @@ func ExtractTaskKeywords(tasks []string) []string {
 			return !isAlnum && r != '-' && r != '_'
 		})
 		for _, w := range words {
-			if len(w) < 3 || StopWords[w] || seen[w] {
+			if len(w) < 3 || stopWords()[w] || seen[w] {
 				continue
 			}
 			seen[w] = true
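The recency tiers moved into the `agent` constants keep the original step function: full weight inside a week, decaying to a 0.2 floor past a quarter. A standalone sketch of that tiering, using the values visible in the removed code (the constant names are assumptions mirroring the `agent.Recency*` identifiers):

```go
package main

import "fmt"

// Assumed values mirroring the agent.Recency* constants referenced in
// the diff; the thresholds and scores come from the removed literals.
const (
	recencyDaysWeek    = 7
	recencyDaysMonth   = 30
	recencyDaysQuarter = 90
)

const (
	recencyScoreWeek    = 1.0
	recencyScoreMonth   = 0.7
	recencyScoreQuarter = 0.4
	recencyScoreOld     = 0.2
)

// recencyScore mirrors the tiering in core.RecencyScore: newer entries
// score higher, with a floor of 0.2 for anything older than a quarter.
func recencyScore(days int) float64 {
	switch {
	case days <= recencyDaysWeek:
		return recencyScoreWeek
	case days <= recencyDaysMonth:
		return recencyScoreMonth
	case days <= recencyDaysQuarter:
		return recencyScoreQuarter
	default:
		return recencyScoreOld
	}
}

func main() {
	fmt.Println(recencyScore(3), recencyScore(20), recencyScore(60), recencyScore(200))
}
```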
diff --git a/internal/cli/agent/core/sort.go b/internal/cli/agent/core/sort.go
index 5258f718..d73f0458 100644
--- a/internal/cli/agent/core/sort.go
+++ b/internal/cli/agent/core/sort.go
@@ -9,13 +9,13 @@ package core
 import (
 	"path/filepath"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	ctxCfg "github.com/ActiveMemory/ctx/internal/config/ctx"
 	"github.com/ActiveMemory/ctx/internal/context"
 )
 
 // GetReadOrder returns context file paths in the recommended reading order.
 //
-// Files are ordered according to [config.FileReadOrder] and filtered to
+// Files are ordered according to [config.ReadOrder] and filtered to
 // exclude empty files. Paths are returned as full paths relative to the
 // context directory.
 //
@@ -26,7 +26,7 @@ import (
 //   - []string: File paths in reading order (e.g., ".context/CONSTITUTION.md")
 func GetReadOrder(ctx *context.Context) []string {
 	var order []string
-	for _, name := range config.FileReadOrder {
+	for _, name := range ctxCfg.ReadOrder {
 		if f := ctx.File(name); f != nil && !f.IsEmpty {
 			order = append(order, filepath.Join(ctx.Dir, f.Name))
 		}
diff --git a/internal/cli/changes/changes.go b/internal/cli/changes/changes.go
index 247d06ad..916e9bb5 100644
--- a/internal/cli/changes/changes.go
+++ b/internal/cli/changes/changes.go
@@ -9,29 +9,10 @@ package changes
 import (
 	"github.com/spf13/cobra"
 
-	"github.com/ActiveMemory/ctx/internal/assets"
 	changesroot "github.com/ActiveMemory/ctx/internal/cli/changes/cmd/root"
 )
 
 // Cmd returns the changes command.
-//
-// Returns:
-//   - *cobra.Command: Configured changes command with flags registered
 func Cmd() *cobra.Command {
-	var since string
-
-	short, long := assets.CommandDesc("changes")
-
-	cmd := &cobra.Command{
-		Use:   "changes",
-		Short: short,
-		Long:  long,
-		RunE: func(cmd *cobra.Command, _ []string) error {
-			return changesroot.Run(cmd, since)
-		},
-	}
-
-	cmd.Flags().StringVar(&since, "since", "", assets.FlagDesc("changes.since"))
-
-	return cmd
+	return changesroot.Cmd()
 }
diff --git a/internal/cli/changes/cmd/root/cmd.go b/internal/cli/changes/cmd/root/cmd.go
index d37451a9..382dbba5 100644
--- a/internal/cli/changes/cmd/root/cmd.go
+++ b/internal/cli/changes/cmd/root/cmd.go
@@ -5,3 +5,32 @@
 //                 SPDX-License-Identifier: Apache-2.0
 
 package root
+
+import (
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+)
+
+// Cmd returns the changes command.
+//
+// Returns:
+//   - *cobra.Command: Configured changes command with flags registered
+func Cmd() *cobra.Command {
+	var since string
+
+	short, long := assets.CommandDesc(assets.CmdDescKeyChanges)
+
+	cmd := &cobra.Command{
+		Use:   "changes",
+		Short: short,
+		Long:  long,
+		RunE: func(cmd *cobra.Command, _ []string) error {
+			return Run(cmd, since)
+		},
+	}
+
+	cmd.Flags().StringVar(&since, "since", "", assets.FlagDesc(assets.FlagDescKeyChangesSince))
+
+	return cmd
+}
diff --git a/internal/cli/changes/cmd/root/doc.go b/internal/cli/changes/cmd/root/doc.go
new file mode 100644
index 00000000..edcc3127
--- /dev/null
+++ b/internal/cli/changes/cmd/root/doc.go
@@ -0,0 +1,11 @@
+//   /    ctx:                         https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//   \    Copyright 2026-present Context contributors.
+//                 SPDX-License-Identifier: Apache-2.0
+
+// Package root implements the ctx changes command.
+//
+// It shows what changed in context files since the last session or a
+// specified time range.
+package root
diff --git a/internal/cli/changes/cmd/root/run.go b/internal/cli/changes/cmd/root/run.go
index eb86716d..d6edef3e 100644
--- a/internal/cli/changes/cmd/root/run.go
+++ b/internal/cli/changes/cmd/root/run.go
@@ -7,11 +7,10 @@
 package root
 
 import (
-	"fmt"
-
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/cli/changes/core"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 )
 
 // Run executes the changes command logic.
@@ -28,7 +27,7 @@ import (
 func Run(cmd *cobra.Command, since string) error {
 	refTime, refLabel, err := core.DetectReferenceTime(since)
 	if err != nil {
-		return fmt.Errorf("detecting reference time: %w", err)
+		return ctxerr.DetectReferenceTime(err)
 	}
 
 	ctxChanges, _ := core.FindContextChanges(refTime)
diff --git a/internal/cli/changes/core/cmd_test.go b/internal/cli/changes/core/cmd_test.go
index 91c33d53..4fe42e97 100644
--- a/internal/cli/changes/core/cmd_test.go
+++ b/internal/cli/changes/core/cmd_test.go
@@ -13,7 +13,7 @@ import (
 	"testing"
 	"time"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/dir"
 	"github.com/ActiveMemory/ctx/internal/rc"
 )
 
@@ -186,24 +186,6 @@ func TestRenderChanges_NoChanges(t *testing.T) {
 	}
 }
 
-func TestItoa(t *testing.T) {
-	tests := []struct {
-		n    int
-		want string
-	}{
-		{0, "0"},
-		{1, "1"},
-		{42, "42"},
-		{-5, "-5"},
-		{100, "100"},
-	}
-	for _, tt := range tests {
-		if got := Itoa(tt.n); got != tt.want {
-			t.Errorf("Itoa(%d) = %q, want %q", tt.n, got, tt.want)
-		}
-	}
-}
-
 func TestDetectReferenceTime_SinceFlag(t *testing.T) {
 	_, label, detectErr := DetectReferenceTime("6h")
 	if detectErr != nil {
@@ -219,7 +201,7 @@ func TestDetectReferenceTime_Fallback(t *testing.T) {
 	t.Setenv("CTX_DIR", tmp)
 	rc.Reset()
 
-	stateDir := filepath.Join(tmp, config.DirState)
+	stateDir := filepath.Join(tmp, dir.State)
 	mkErr := os.MkdirAll(stateDir, 0o755)
 	if mkErr != nil {
 		t.Fatalf("MkdirAll: %v", mkErr)
@@ -239,7 +221,7 @@ func TestDetectReferenceTime_FromMarkers(t *testing.T) {
 	t.Setenv("CTX_DIR", tmp)
 	rc.Reset()
 
-	stateDir := filepath.Join(tmp, config.DirState)
+	stateDir := filepath.Join(tmp, dir.State)
 	mkErr := os.MkdirAll(stateDir, 0o755)
 	if mkErr != nil {
 		t.Fatalf("MkdirAll: %v", mkErr)
diff --git a/internal/cli/changes/core/detect.go b/internal/cli/changes/core/detect.go
index 52fc2166..2a79e5d8 100644
--- a/internal/cli/changes/core/detect.go
+++ b/internal/cli/changes/core/detect.go
@@ -10,10 +10,15 @@ import (
 	"os"
 	"path/filepath"
 	"sort"
+	"strconv"
 	"strings"
 	"time"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+	"github.com/ActiveMemory/ctx/internal/config/load_gate"
+	time2 "github.com/ActiveMemory/ctx/internal/config/time"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"github.com/ActiveMemory/ctx/internal/rc"
 )
 
@@ -49,7 +54,7 @@ func DetectReferenceTime(since string) (time.Time, string, error) {
 
 	// Fallback: 24h ago.
 	t := time.Now().Add(-24 * time.Hour)
-	return t, "24 hour(s) ago (default)", nil
+	return t, assets.TextDesc(assets.TextDescKeyChangesFallbackLabel), nil
 }
 
 // ParseSinceFlag parses a duration (like "24h") or date (like "2026-03-01").
@@ -69,8 +74,8 @@ func ParseSinceFlag(since string) (time.Time, string, error) {
 	}
 
 	// Try date.
-	if t, err := time.Parse("2006-01-02", since); err == nil {
-		return t, "since " + since, nil
+	if t, err := time.Parse(time2.DateFormat, since); err == nil {
+		return t, assets.TextDesc(assets.TextDescKeyChangesSincePrefix) + since, nil
 	}
 
 	// Try RFC3339.
@@ -88,7 +93,7 @@ func ParseSinceFlag(since string) (time.Time, string, error) {
 //   - time.Time: Marker file modification time
 //   - bool: True if a valid marker was found
 func DetectFromMarkers() (time.Time, bool) {
-	stateDir := filepath.Join(rc.ContextDir(), config.DirState)
+	stateDir := filepath.Join(rc.ContextDir(), dir.State)
 	entries, err := os.ReadDir(stateDir)
 	if err != nil {
 		return time.Time{}, false
@@ -100,7 +105,7 @@ func DetectFromMarkers() (time.Time, bool) {
 
 	var markers []markerInfo
 	for _, e := range entries {
-		if !strings.HasPrefix(e.Name(), "ctx-loaded-") {
+		if !strings.HasPrefix(e.Name(), load_gate.PrefixCtxLoaded) {
 			continue
 		}
 		info, infoErr := e.Info()
@@ -130,17 +135,17 @@ func DetectFromMarkers() (time.Time, bool) {
 //   - time.Time: Event timestamp
 //   - bool: True if a valid event was found
 func DetectFromEvents() (time.Time, bool) {
-	eventsPath := filepath.Join(rc.ContextDir(), config.DirState, "events.jsonl")
+	eventsPath := filepath.Join(rc.ContextDir(), dir.State, "events.jsonl")
 	data, err := os.ReadFile(eventsPath) //nolint:gosec // state dir path
 	if err != nil {
 		return time.Time{}, false
 	}
 
-	lines := strings.Split(strings.TrimSpace(string(data)), config.NewlineLF)
+	lines := strings.Split(strings.TrimSpace(string(data)), token.NewlineLF)
 	// Scan in reverse for last context-load-gate event.
 	for i := len(lines) - 1; i >= 0; i-- {
 		line := lines[i]
-		if !strings.Contains(line, "context-load-gate") {
+		if !strings.Contains(line, load_gate.EventContextLoadGate) {
 			continue
 		}
 		if t, ok := ExtractTimestamp(line); ok {
@@ -161,7 +166,7 @@ func DetectFromEvents() (time.Time, bool) {
 //   - time.Time: Parsed timestamp
 //   - bool: True if extraction succeeded
 func ExtractTimestamp(jsonLine string) (time.Time, bool) {
-	const key = `"timestamp":"`
+	key := load_gate.JSONKeyTimestamp
 	idx := strings.Index(jsonLine, key)
 	if idx < 0 {
 		return time.Time{}, false
@@ -186,18 +191,19 @@ func ExtractTimestamp(jsonLine string) (time.Time, bool) {
 // Returns:
 //   - string: Human-readable time description
 func HumanAgo(d time.Duration) string {
+	ago := assets.TextDesc(assets.TextDescKeyTimeAgo)
 	switch {
 	case d < time.Minute:
-		return "just now"
+		return assets.TextDesc(assets.TextDescKeyTimeJustNow)
 	case d < time.Hour:
 		m := int(d.Minutes())
-		return Pluralize(m, "minute") + " ago"
+		return Pluralize(m, assets.TextDesc(assets.TextDescKeyTimeMinute)) + ago
 	case d < 24*time.Hour:
 		h := int(d.Hours())
-		return Pluralize(h, "hour") + " ago"
+		return Pluralize(h, assets.TextDesc(assets.TextDescKeyTimeHour)) + ago
 	default:
 		days := int(d.Hours() / 24)
-		return Pluralize(days, "day") + " ago"
+		return Pluralize(days, assets.TextDesc(assets.TextDescKeyTimeDay)) + ago
 	}
 }
 
@@ -213,27 +219,5 @@ func Pluralize(n int, unit string) string {
 	if n == 1 {
 		return "1 " + unit
 	}
-	return Itoa(n) + " " + unit + "s"
-}
-
-// Itoa is a minimal int-to-string without importing strconv.
-//
-// Parameters:
-//   - n: Integer to convert
-//
-// Returns:
-//   - string: String representation
-func Itoa(n int) string {
-	if n == 0 {
-		return "0"
-	}
-	if n < 0 {
-		return "-" + Itoa(-n)
-	}
-	var digits []byte
-	for n > 0 {
-		digits = append([]byte{byte('0' + n%10)}, digits...)
-		n /= 10
-	}
-	return string(digits)
+	return strconv.Itoa(n) + " " + unit + "s"
 }
diff --git a/internal/cli/changes/core/format.go b/internal/cli/changes/core/format.go
index 38b4a752..f02fa96c 100644
--- a/internal/cli/changes/core/format.go
+++ b/internal/cli/changes/core/format.go
@@ -10,7 +10,8 @@ import (
 	"fmt"
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/time"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )
 
 // RenderChanges renders the full CLI output for `ctx changes`.
@@ -22,7 +23,9 @@ import (
 //
 // Returns:
 //   - string: Formatted Markdown output
-func RenderChanges(refLabel string, ctxChanges []ContextChange, code CodeSummary) string {
+func RenderChanges(
+	refLabel string, ctxChanges []ContextChange, code CodeSummary,
+) string {
 	var b strings.Builder
 
 	b.WriteString("## Changes Since Last Session\n\n")
@@ -32,9 +35,9 @@ func RenderChanges(refLabel string, ctxChanges []ContextChange, code CodeSummary
 		b.WriteString("### Context File Changes\n")
 		for _, c := range ctxChanges {
 			b.WriteString(fmt.Sprintf("- `%s` — modified %s\n",
-				c.Name, c.ModTime.Format("2006-01-02 15:04")))
+				c.Name, c.ModTime.Format(time.DateTimeFormat)))
 		}
-		b.WriteString(config.NewlineLF)
+		b.WriteString(token.NewlineLF)
 	}
 
 	if code.CommitCount > 0 {
@@ -52,7 +55,7 @@ func RenderChanges(refLabel string, ctxChanges []ContextChange, code CodeSummary
 			b.WriteString(fmt.Sprintf("- **Authors**: %s\n",
 				strings.Join(code.Authors, ", ")))
 		}
-		b.WriteString(config.NewlineLF)
+		b.WriteString(token.NewlineLF)
 	}
 
 	if len(ctxChanges) == 0 && code.CommitCount == 0 {
@@ -95,5 +98,5 @@ func RenderChangesForHook(refLabel string, ctxChanges []ContextChange, code Code
 		return ""
 	}
 
-	return "Changes since last session: " + strings.Join(parts, ". ") + config.NewlineLF
+	return "Changes since last session: " + strings.Join(parts, ". ") + token.NewlineLF
 }
diff --git a/internal/cli/changes/core/scan.go b/internal/cli/changes/core/scan.go
index 4ad8e5d6..0bef7466 100644
--- a/internal/cli/changes/core/scan.go
+++ b/internal/cli/changes/core/scan.go
@@ -13,7 +13,9 @@ import (
 	"strings"
 	"time"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/token"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 	"github.com/ActiveMemory/ctx/internal/rc"
 )
 
@@ -48,7 +50,7 @@ func FindContextChanges(refTime time.Time) ([]ContextChange, error) {
 
 	var changes []ContextChange
 	for _, e := range entries {
-		if e.IsDir() || !strings.HasSuffix(e.Name(), config.ExtMarkdown) {
+		if e.IsDir() || !strings.HasSuffix(e.Name(), file.ExtMarkdown) {
 			continue
 		}
 		info, infoErr := e.Info()
@@ -93,7 +95,7 @@ func SummarizeCodeChanges(refTime time.Time) (CodeSummary, error) {
 	if lines == "" {
 		return summary, nil
 	}
-	commitLines := strings.Split(lines, config.NewlineLF)
+	commitLines := strings.Split(lines, token.NewlineLF)
 	summary.CommitCount = len(commitLines)
 
 	// Latest commit message (first line of oneline output).
@@ -132,6 +134,9 @@ func SummarizeCodeChanges(refTime time.Time) (CodeSummary, error) {
 //   - []byte: Raw git output
 //   - error: Non-nil if git fails
 func GitLogSince(t time.Time, extraArgs ...string) ([]byte, error) {
+	if _, lookErr := exec.LookPath("git"); lookErr != nil {
+		return nil, ctxerr.GitNotFound()
+	}
 	args := []string{"log", "--since", t.Format(time.RFC3339)}
 	args = append(args, extraArgs...)
 	return exec.Command("git", args...).Output() //nolint:gosec // args are literal flags + time.Format output
@@ -146,7 +151,7 @@ func GitLogSince(t time.Time, extraArgs ...string) ([]byte, error) {
 //   - []string: Sorted unique top-level directory names
 func UniqueTopDirs(output string) []string {
 	seen := make(map[string]bool)
-	for _, line := range strings.Split(strings.TrimSpace(output), config.NewlineLF) {
+	for _, line := range strings.Split(strings.TrimSpace(output), token.NewlineLF) {
 		line = strings.TrimSpace(line)
 		if line == "" {
 			continue
@@ -175,7 +180,7 @@ func UniqueTopDirs(output string) []string {
 //   - []string: Sorted unique non-empty lines
 func UniqueLines(output string) []string {
 	seen := make(map[string]bool)
-	for _, line := range strings.Split(strings.TrimSpace(output), config.NewlineLF) {
+	for _, line := range strings.Split(strings.TrimSpace(output), token.NewlineLF) {
 		line = strings.TrimSpace(line)
 		if line != "" {
 			seen[line] = true
diff --git a/internal/cli/compact/cmd/root/cmd.go b/internal/cli/compact/cmd/root/cmd.go
index d37451a9..843da211 100644
--- a/internal/cli/compact/cmd/root/cmd.go
+++ b/internal/cli/compact/cmd/root/cmd.go
@@ -5,3 +5,44 @@
 //                 SPDX-License-Identifier: Apache-2.0
 
 package root
+
+import (
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+)
+
+// Cmd returns the "ctx compact" command for cleaning up context files.
+//
+// The command moves completed tasks to a "Completed (Recent)" section,
+// optionally archives old content, and removes empty sections from all
+// context files.
+//
+// Flags:
+//   - --archive: Create .context/archive/ for old completed tasks
+//
+// Returns:
+//   - *cobra.Command: Configured compact command with flags registered
+func Cmd() *cobra.Command {
+	var archive bool
+
+	short, long := assets.CommandDesc(assets.CmdDescKeyCompact)
+
+	cmd := &cobra.Command{
+		Use:   "compact",
+		Short: short,
+		Long:  long,
+		RunE: func(cmd *cobra.Command, _ []string) error {
+			return Run(cmd, archive)
+		},
+	}
+
+	cmd.Flags().BoolVar(
+		&archive,
+		"archive",
+		false,
+		assets.FlagDesc(assets.FlagDescKeyCompactArchive),
+	)
+
+	return cmd
+}
diff --git a/internal/cli/compact/cmd/root/doc.go b/internal/cli/compact/cmd/root/doc.go
new file mode 100644
index 00000000..fa3f21fa
--- /dev/null
+++ b/internal/cli/compact/cmd/root/doc.go
@@ -0,0 +1,11 @@
+//   /    ctx:                         https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//   \    Copyright 2026-present Context contributors.
+//                 SPDX-License-Identifier: Apache-2.0
+
+// Package root implements the ctx compact command.
+//
+// It archives completed tasks and cleans up context files by removing
+// empty sections and consolidating old content.
+package root
diff --git a/internal/cli/compact/cmd/root/run.go b/internal/cli/compact/cmd/root/run.go
index 435a61ce..006c7f4d 100644
--- a/internal/cli/compact/cmd/root/run.go
+++ b/internal/cli/compact/cmd/root/run.go
@@ -14,7 +14,8 @@ import (
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/cli/compact/core"
-	"github.com/ActiveMemory/ctx/internal/config"
+	ctxCfg "github.com/ActiveMemory/ctx/internal/config/ctx"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
 	"github.com/ActiveMemory/ctx/internal/context"
 	"github.com/ActiveMemory/ctx/internal/rc"
 )
@@ -61,12 +62,12 @@ func Run(cmd *cobra.Command, archive bool) error {
 
 	// Process other files for empty sections
 	for _, f := range ctx.Files {
-		if f.Name == config.FileTask {
+		if f.Name == ctxCfg.Task {
 			continue
 		}
 		cleaned, count := core.RemoveEmptySections(string(f.Content))
 		if count > 0 {
-			if err := os.WriteFile(f.Path, []byte(cleaned), config.PermFile); err == nil {
+			if err := os.WriteFile(f.Path, []byte(cleaned), fs.PermFile); err == nil {
 				cmd.Println(
 					fmt.Sprintf("✓ Removed %d empty sections from %s", count, f.Name),
 				)
diff --git a/internal/cli/compact/compact.go b/internal/cli/compact/compact.go
index bb1934f9..1f344a7c 100644
--- a/internal/cli/compact/compact.go
+++ b/internal/cli/compact/compact.go
@@ -9,41 +9,10 @@ package compact
 import (
 	"github.com/spf13/cobra"
 
-	"github.com/ActiveMemory/ctx/internal/assets"
 	compactroot "github.com/ActiveMemory/ctx/internal/cli/compact/cmd/root"
 )
 
 // Cmd returns the "ctx compact" command for cleaning up context files.
-//
-// The command moves completed tasks to a "Completed (Recent)" section,
-// optionally archives old content, and removes empty sections from all
-// context files.
-//
-// Flags:
-//   - --archive: Create .context/archive/ for old completed tasks
-//
-// Returns:
-//   - *cobra.Command: Configured compact command with flags registered
 func Cmd() *cobra.Command {
-	var archive bool
-
-	short, long := assets.CommandDesc("compact")
-
-	cmd := &cobra.Command{
-		Use:   "compact",
-		Short: short,
-		Long:  long,
-		RunE: func(cmd *cobra.Command, args []string) error {
-			return compactroot.Run(cmd, archive)
-		},
-	}
-
-	cmd.Flags().BoolVar(
-		&archive,
-		"archive",
-		false,
-		assets.FlagDesc("compact.archive"),
-	)
-
-	return cmd
+	return compactroot.Cmd()
 }
diff --git a/internal/cli/compact/compact_test.go b/internal/cli/compact/compact_test.go
index acd37e35..dc0746d9 100644
--- a/internal/cli/compact/compact_test.go
+++ b/internal/cli/compact/compact_test.go
@@ -11,8 +11,8 @@ import (
 	"testing"
 
 	"github.com/ActiveMemory/ctx/internal/cli/add"
-	"github.com/ActiveMemory/ctx/internal/cli/complete"
 	"github.com/ActiveMemory/ctx/internal/cli/initialize"
+	taskcomplete "github.com/ActiveMemory/ctx/internal/cli/task/cmd/complete"
 )
 
 // TestCompactCommand tests the compact command.
@@ -72,7 +72,7 @@ func TestCompactWithTasks(t *testing.T) {
 		t.Fatalf("add task failed: %v", err)
 	}
 
-	completeCmd := complete.Cmd()
+	completeCmd := taskcomplete.Cmd()
 	completeCmd.SetArgs([]string{"Task to complete"})
 	if err := completeCmd.Execute(); err != nil {
 		t.Fatalf("complete task failed: %v", err)
diff --git a/internal/cli/compact/core/archive.go b/internal/cli/compact/core/archive.go
index 8b824548..3d5b0280 100644
--- a/internal/cli/compact/core/archive.go
+++ b/internal/cli/compact/core/archive.go
@@ -12,14 +12,19 @@ import (
 	"path/filepath"
 	"time"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/archive"
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	time2 "github.com/ActiveMemory/ctx/internal/config/time"
+	"github.com/ActiveMemory/ctx/internal/config/token"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 	"github.com/ActiveMemory/ctx/internal/rc"
 )
 
 // WriteArchive writes content to a dated archive file in .context/archive/.
 //
 // Creates the archive directory if needed. If a file for today already exists,
-// the new content is appended. Otherwise a new file is created with a header.
+// the new content is appended. Otherwise, a new file is created with a header.
 //
 // Parameters:
 //   - prefix: File name prefix (e.g., "tasks", "decisions", "learnings")
@@ -28,28 +33,29 @@ import (
 //
 // Returns the path to the written archive file.
 func WriteArchive(prefix, heading, content string) (string, error) {
-	archiveDir := filepath.Join(rc.ContextDir(), config.DirArchive)
-	if err := os.MkdirAll(archiveDir, config.PermExec); err != nil {
-		return "", fmt.Errorf("failed to create archive directory: %w", err)
+	archiveDir := filepath.Join(rc.ContextDir(), dir.Archive)
+	if mkErr := os.MkdirAll(archiveDir, fs.PermExec); mkErr != nil {
+		return "", ctxerr.CreateArchiveDir(mkErr)
 	}
 
 	now := time.Now()
+	dateStr := now.Format(time2.DateFormat)
 	archiveFile := filepath.Join(
 		archiveDir,
-		fmt.Sprintf("%s-%s.md", prefix, now.Format("2006-01-02")),
+		fmt.Sprintf(archive.TplArchiveFilename, prefix, dateStr),
 	)
 
-	nl := config.NewlineLF
+	nl := token.NewlineLF
 	var finalContent string
-	if existing, err := os.ReadFile(filepath.Clean(archiveFile)); err == nil {
+	if existing, readErr := os.ReadFile(filepath.Clean(archiveFile)); readErr == nil {
 		finalContent = string(existing) + nl + content
 	} else {
-		finalContent = heading + " - " +
-			now.Format("2006-01-02") + nl + nl + content
+		finalContent = heading + archive.ArchiveDateSep +
+			dateStr + nl + nl + content
 	}
 
-	if err := os.WriteFile(archiveFile, []byte(finalContent), config.PermFile); err != nil {
-		return "", fmt.Errorf("failed to write archive: %w", err)
+	if writeErr := os.WriteFile(archiveFile, []byte(finalContent), fs.PermFile); writeErr != nil {
+		return "", ctxerr.WriteArchive(writeErr)
 	}
 
 	return archiveFile, nil
diff --git a/internal/cli/compact/core/block.go b/internal/cli/compact/core/block.go
index 50d9c1de..33d0a309 100644
--- a/internal/cli/compact/core/block.go
+++ b/internal/cli/compact/core/block.go
@@ -10,7 +10,9 @@ import (
 	"strings"
 	"time"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/regex"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"github.com/ActiveMemory/ctx/internal/task"
 )
 
@@ -44,17 +46,17 @@ func ParseTaskBlocks(lines []string) []TaskBlock {
 		line := lines[i]
 
 		// Track if we're in the Completed section
-		if strings.HasPrefix(line, config.HeadingCompleted) {
+		if strings.HasPrefix(line, assets.HeadingCompleted) {
 			inCompletedSection = true
 			i++
 			continue
 		}
-		if strings.HasPrefix(line, config.HeadingLevelTwoStart) && inCompletedSection {
+		if strings.HasPrefix(line, token.HeadingLevelTwoStart) && inCompletedSection {
 			inCompletedSection = false
 		}
 
 		// Skip if in the Completed section or not a checked task
-		match := config.RegExTask.FindStringSubmatch(line)
+		match := regex.Task.FindStringSubmatch(line)
 		if inCompletedSection || match == nil || !task.Completed(match) {
 			i++
 			continue
@@ -83,7 +85,7 @@ func ParseTaskBlocks(lines []string) []TaskBlock {
 // Returns:
 //   - string: All lines joined with newlines
 func (b *TaskBlock) BlockContent() string {
-	return strings.Join(b.Lines, config.NewlineLF)
+	return strings.Join(b.Lines, token.NewlineLF)
 }
 
 // ParentTaskText extracts just the task text from the parent line.
@@ -94,7 +96,7 @@ func (b *TaskBlock) ParentTaskText() string {
 	if len(b.Lines) == 0 {
 		return ""
 	}
-	match := config.RegExTask.FindStringSubmatch(b.Lines[0])
+	match := regex.Task.FindStringSubmatch(b.Lines[0])
 	if match != nil {
 		return task.Content(match)
 	}
diff --git a/internal/cli/compact/core/parse.go b/internal/cli/compact/core/parse.go
index 8a9a2a2f..e50f67c0 100644
--- a/internal/cli/compact/core/parse.go
+++ b/internal/cli/compact/core/parse.go
@@ -10,7 +10,9 @@ import (
 	"strings"
 	"time"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/regex"
+	time2 "github.com/ActiveMemory/ctx/internal/config/time"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"github.com/ActiveMemory/ctx/internal/task"
 )
 
@@ -22,7 +24,7 @@ import (
 // Returns:
 //   - int: Number of leading whitespace characters (spaces and tabs)
 func indentLevel(line string) int {
-	return len(line) - len(strings.TrimLeft(line, config.Whitespace))
+	return len(line) - len(strings.TrimLeft(line, token.Whitespace))
 }
 
 // parseBlockAt parses a task block starting at the given index.
@@ -85,7 +87,7 @@ func parseBlockAt(lines []string, startIdx int) TaskBlock {
 		block.EndIndex = i + 1
 
 		// Check if this is an unchecked task
-		nestedMatch := config.RegExTask.FindStringSubmatch(line)
+		nestedMatch := regex.Task.FindStringSubmatch(line)
 		if nestedMatch != nil && task.Pending(nestedMatch) {
 			block.IsArchivable = false
 		}
@@ -102,13 +104,13 @@ func parseBlockAt(lines []string, startIdx int) TaskBlock {
 // Returns:
 //   - *time.Time: Parsed time, or nil if no valid timestamp is found
 func parseDoneTimestamp(line string) *time.Time {
-	match := config.RegExTaskDoneTimestamp.FindStringSubmatch(line)
+	match := regex.TaskDoneTimestamp.FindStringSubmatch(line)
 	if len(match) < 2 {
 		return nil
 	}
 
 	// Parse YYYY-MM-DD-HHMMSS format
-	t, err := time.Parse(config.TimestampCompact, match[1])
+	t, err := time.Parse(time2.TimestampCompact, match[1])
 	if err != nil {
 		return nil
 	}
diff --git a/internal/cli/compact/core/sanitize.go b/internal/cli/compact/core/sanitize.go
index c2a0112c..2a3b4c42 100644
--- a/internal/cli/compact/core/sanitize.go
+++ b/internal/cli/compact/core/sanitize.go
@@ -10,7 +10,7 @@ import (
 	"strings"
 	"unicode/utf8"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )
 
 // RemoveEmptySections removes Markdown sections that contain no content.
@@ -25,7 +25,7 @@ import (
 //   - string: Content with empty sections removed
 //   - int: Number of sections removed
 func RemoveEmptySections(content string) (string, int) {
-	lines := strings.Split(content, config.NewlineLF)
+	lines := strings.Split(content, token.NewlineLF)
 	var result []string
 	removed := 0
 
@@ -34,7 +34,7 @@ func RemoveEmptySections(content string) (string, int) {
 		line := lines[i]
 
 		// Check if this is a section header
-		if strings.HasPrefix(line, config.HeadingLevelTwoStart) {
+		if strings.HasPrefix(line, token.HeadingLevelTwoStart) {
 			// Look ahead to see if the section is empty
 			sectionStart := i
 			i++
@@ -46,8 +46,8 @@ func RemoveEmptySections(content string) (string, int) {
 
 			// Check if we hit another section or end of the file
 			if i >= len(lines) ||
-				strings.HasPrefix(lines[i], config.HeadingLevelTwoStart) ||
-				strings.HasPrefix(lines[i], config.HeadingLevelOneStart) {
+				strings.HasPrefix(lines[i], token.HeadingLevelTwoStart) ||
+				strings.HasPrefix(lines[i], token.HeadingLevelOneStart) {
 				// Section is empty, skip it
 				removed++
 				continue
@@ -62,7 +62,7 @@ func RemoveEmptySections(content string) (string, int) {
 		i++
 	}
 
-	return strings.Join(result, config.NewlineLF), removed
+	return strings.Join(result, token.NewlineLF), removed
 }
 
 // TruncateString shortens a string to maxLen, adding "..." if truncated.
@@ -78,5 +78,5 @@ func TruncateString(s string, maxLen int) string {
 		return s
 	}
 	runes := []rune(s)
-	return string(runes[:maxLen-3]) + config.Ellipsis
+	return string(runes[:maxLen-3]) + token.Ellipsis
 }
diff --git a/internal/cli/compact/core/task.go b/internal/cli/compact/core/task.go
index 1ef9bfa7..9b3cdebb 100644
--- a/internal/cli/compact/core/task.go
+++ b/internal/cli/compact/core/task.go
@@ -7,15 +7,18 @@
 package core
 
 import (
-	"fmt"
 	"os"
 	"strings"
 
 	"github.com/spf13/cobra"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	ctxCfg "github.com/ActiveMemory/ctx/internal/config/ctx"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"github.com/ActiveMemory/ctx/internal/context"
 	"github.com/ActiveMemory/ctx/internal/rc"
+	"github.com/ActiveMemory/ctx/internal/write"
 )
 
 // CompactTasks moves completed tasks to the "Completed" section in TASKS.md.
@@ -36,14 +39,14 @@ import (
 func CompactTasks(
 	cmd *cobra.Command, ctx *context.Context, archive bool,
 ) (int, error) {
-	tasksFile := ctx.File(config.FileTask)
+	tasksFile := ctx.File(ctxCfg.Task)
 
 	if tasksFile == nil {
 		return 0, nil
 	}
 
 	content := string(tasksFile.Content)
-	lines := strings.Split(content, config.NewlineLF)
+	lines := strings.Split(content, token.NewlineLF)
 
 	// Parse task blocks
 	blocks := ParseTaskBlocks(lines)
@@ -53,15 +56,9 @@ func CompactTasks(
 	for _, block := range blocks {
 		if block.IsArchivable {
 			archivableBlocks = append(archivableBlocks, block)
-			cmd.Println(fmt.Sprintf(
-				"✓ Moving completed task: %s",
-				TruncateString(block.ParentTaskText(), 50),
-			))
+			write.InfoMovingTask(cmd, TruncateString(block.ParentTaskText(), 50))
 		} else {
-			cmd.Println(fmt.Sprintf(
-				"! Skipping (has incomplete children): %s",
-				TruncateString(block.ParentTaskText(), 50),
-			))
+			write.InfoSkippingTask(cmd, TruncateString(block.ParentTaskText(), 50))
 		}
 	}
 
@@ -74,11 +71,11 @@ func CompactTasks(
 
 	// Add blocks to the Completed section
 	for i, line := range newLines {
-		if strings.HasPrefix(line, config.HeadingCompleted) {
+		if strings.HasPrefix(line, assets.HeadingCompleted) {
 			// Find the next line that's either empty or another section
 			insertIdx := i + 1
 			for insertIdx < len(newLines) && newLines[insertIdx] != "" &&
-				!strings.HasPrefix(newLines[insertIdx], config.HeadingLevelTwoStart) {
+				!strings.HasPrefix(newLines[insertIdx], token.HeadingLevelTwoStart) {
 				insertIdx++
 			}
 
@@ -108,25 +105,22 @@ func CompactTasks(
 		}
 
 		if len(blocksToArchive) > 0 {
-			nl := config.NewlineLF
+			nl := token.NewlineLF
 			var archiveContent string
 			for _, block := range blocksToArchive {
 				archiveContent += block.BlockContent() + nl + nl
 			}
-			if archiveFile, err := WriteArchive("tasks", config.HeadingArchivedTasks, archiveContent); err == nil {
-				cmd.Println(fmt.Sprintf(
-					"✓ Archived %d tasks to %s (older than %d days)",
-					len(blocksToArchive), archiveFile, archiveDays,
-				))
+			if archiveFile, archiveErr := WriteArchive("tasks", assets.HeadingArchivedTasks, archiveContent); archiveErr == nil {
+				write.InfoArchivedTasks(cmd, len(blocksToArchive), archiveFile, archiveDays)
 			}
 		}
 	}
 
 	// Write back
-	newContent := strings.Join(newLines, config.NewlineLF)
+	newContent := strings.Join(newLines, token.NewlineLF)
 	if newContent != content {
 		if err := os.WriteFile(
-			tasksFile.Path, []byte(newContent), config.PermFile,
+			tasksFile.Path, []byte(newContent), fs.PermFile,
 		); err != nil {
 			return 0, err
 		}
diff --git a/internal/cli/compact/exports.go b/internal/cli/compact/exports.go
deleted file mode 100644
index f407d781..00000000
--- a/internal/cli/compact/exports.go
+++ /dev/null
@@ -1,52 +0,0 @@
-//   /    ctx:                         https://ctx.ist
-// ,'`./    do you remember?
-// `.,'\
-//   \    Copyright 2026-present Context contributors.
-//                 SPDX-License-Identifier: Apache-2.0
-
-package compact
-
-import "github.com/ActiveMemory/ctx/internal/cli/compact/core"
-
-// TaskBlock is re-exported from core for backward compatibility.
-//
-// See core.TaskBlock for full documentation.
-type TaskBlock = core.TaskBlock
-
-// WriteArchive delegates to core.WriteArchive for backward compatibility.
-//
-// Parameters:
-//   - prefix: File name prefix (e.g., "tasks", "decisions", "learnings")
-//   - heading: Markdown heading for new archive files
-//   - content: The content to archive
-//
-// Returns:
-//   - string: Path to the written archive file
-//   - error: Non-nil if directory creation or file write fails
-func WriteArchive(prefix, heading, content string) (string, error) {
-	return core.WriteArchive(prefix, heading, content)
-}
-
-// ParseTaskBlocks delegates to core.ParseTaskBlocks for backward compatibility.
-//
-// Parameters:
-//   - lines: Slice of lines from the tasks file
-//
-// Returns:
-//   - []TaskBlock: All completed top-level task blocks found
-func ParseTaskBlocks(lines []string) []TaskBlock {
-	return core.ParseTaskBlocks(lines)
-}
-
-// RemoveBlocksFromLines delegates to core.RemoveBlocksFromLines for backward
-// compatibility.
-//
-// Parameters:
-//   - lines: Original lines from the file
-//   - blocks: Task blocks to remove (must be sorted by StartIndex)
-//
-// Returns:
-//   - []string: New lines with blocks removed
-func RemoveBlocksFromLines(lines []string, blocks []TaskBlock) []string {
-	return core.RemoveBlocksFromLines(lines, blocks)
-}
diff --git a/internal/cli/complete/complete_test.go b/internal/cli/complete/complete_test.go
deleted file mode 100644
index 62bcc444..00000000
--- a/internal/cli/complete/complete_test.go
+++ /dev/null
@@ -1,64 +0,0 @@
-//   /    ctx:                         https://ctx.ist
-// ,'`./    do you remember?
-// `.,'\
-//   \    Copyright 2026-present Context contributors.
-//                 SPDX-License-Identifier: Apache-2.0
-
-package complete
-
-import (
-	"os"
-	"path/filepath"
-	"strings"
-	"testing"
-
-	"github.com/ActiveMemory/ctx/internal/cli/add"
-	"github.com/ActiveMemory/ctx/internal/cli/initialize"
-)
-
-// TestCompleteCommand tests the complete command.
-func TestCompleteCommand(t *testing.T) {
-	tmpDir, err := os.MkdirTemp("", "cli-complete-test-*")
-	if err != nil {
-		t.Fatalf("failed to create temp dir: %v", err)
-	}
-	defer func() { _ = os.RemoveAll(tmpDir) }()
-
-	origDir, _ := os.Getwd()
-	if err = os.Chdir(tmpDir); err != nil {
-		t.Fatalf("failed to chdir: %v", err)
-	}
-	defer func() { _ = os.Chdir(origDir) }()
-
-	// First init
-	initCmd := initialize.Cmd()
-	initCmd.SetArgs([]string{})
-	if err = initCmd.Execute(); err != nil {
-		t.Fatalf("init failed: %v", err)
-	}
-
-	// Add a task
-	addCmd := add.Cmd()
-	addCmd.SetArgs([]string{"task", "Task to complete"})
-	if err = addCmd.Execute(); err != nil {
-		t.Fatalf("add task command failed: %v", err)
-	}
-
-	// Complete the task
-	completeCmd := Cmd()
-	completeCmd.SetArgs([]string{"Task to complete"})
-	if err = completeCmd.Execute(); err != nil {
-		t.Fatalf("complete command failed: %v", err)
-	}
-
-	// Verify the task was completed
-	tasksPath := filepath.Join(tmpDir, ".context", "TASKS.md")
-	content, err := os.ReadFile(filepath.Clean(tasksPath))
-	if err != nil {
-		t.Fatalf("failed to read TASKS.md: %v", err)
-	}
-
-	if !strings.Contains(string(content), "- [x]") {
-		t.Errorf("task was not marked as complete")
-	}
-}
diff --git a/internal/cli/complete/doc.go b/internal/cli/complete/doc.go
deleted file mode 100644
index c95b8607..00000000
--- a/internal/cli/complete/doc.go
+++ /dev/null
@@ -1,12 +0,0 @@
-//   /    ctx:                         https://ctx.ist
-// ,'`./    do you remember?
-// `.,'\
-//   \    Copyright 2026-present Context contributors.
-//                 SPDX-License-Identifier: Apache-2.0
-
-// Package complete implements the "ctx complete" command for marking
-// tasks as done in TASKS.md.
-//
-// Tasks can be identified by number or partial text match. The command
-// updates TASKS.md by changing "- [ ]" to "- [x]" for the matched task.
-package complete
diff --git a/internal/cli/config/cmd/schema/cmd.go b/internal/cli/config/cmd/schema/cmd.go
index baa3f2c9..31a79503 100644
--- a/internal/cli/config/cmd/schema/cmd.go
+++ b/internal/cli/config/cmd/schema/cmd.go
@@ -7,11 +7,10 @@
 package schema
 
 import (
-	"fmt"
-
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/assets"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 )
 
 // Cmd returns the "ctx config schema" subcommand.
@@ -19,7 +18,7 @@ import (
 // Returns:
 //   - *cobra.Command: Configured schema subcommand
 func Cmd() *cobra.Command {
-	short, long := assets.CommandDesc("config.schema")
+	short, long := assets.CommandDesc(assets.CmdDescKeyConfigSchema)
 
 	return &cobra.Command{
 		Use:   "schema",
@@ -29,7 +28,7 @@ func Cmd() *cobra.Command {
 		RunE: func(cmd *cobra.Command, _ []string) error {
 			data, readErr := assets.Schema()
 			if readErr != nil {
-				return fmt.Errorf("read embedded schema: %w", readErr)
+				return ctxerr.ReadEmbeddedSchema(readErr)
 			}
 			cmd.Print(string(data))
 			return nil
diff --git a/internal/cli/config/cmd/schema/doc.go b/internal/cli/config/cmd/schema/doc.go
new file mode 100644
index 00000000..cb7017aa
--- /dev/null
+++ b/internal/cli/config/cmd/schema/doc.go
@@ -0,0 +1,10 @@
+//   /    ctx:                         https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//   \    Copyright 2026-present Context contributors.
+//                 SPDX-License-Identifier: Apache-2.0
+
+// Package schema implements the ctx config schema subcommand.
+//
+// It prints the embedded JSON schema for the .ctxrc configuration file.
+package schema
diff --git a/internal/cli/config/cmd/status/cmd.go b/internal/cli/config/cmd/status/cmd.go
index 55731001..22d210b3 100644
--- a/internal/cli/config/cmd/status/cmd.go
+++ b/internal/cli/config/cmd/status/cmd.go
@@ -7,11 +7,10 @@
 package status
 
 import (
+	internalConfig "github.com/ActiveMemory/ctx/internal/config/cli"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/assets"
-	internalConfig "github.com/ActiveMemory/ctx/internal/config"
-
 	"github.com/ActiveMemory/ctx/internal/cli/config/core"
 )
 
@@ -20,7 +19,7 @@ import (
 // Returns:
 //   - *cobra.Command: Configured status subcommand
 func Cmd() *cobra.Command {
-	short, _ := assets.CommandDesc("config.status")
+	short, _ := assets.CommandDesc(assets.CmdDescKeyConfigStatus)
 
 	return &cobra.Command{
 		Use:         "status",
@@ -32,7 +31,7 @@ func Cmd() *cobra.Command {
 			if rootErr != nil {
 				return rootErr
 			}
-			return RunStatus(cmd, root)
+			return Run(cmd, root)
 		},
 	}
 }
diff --git a/internal/cli/config/cmd/status/doc.go b/internal/cli/config/cmd/status/doc.go
new file mode 100644
index 00000000..ed7ea1b9
--- /dev/null
+++ b/internal/cli/config/cmd/status/doc.go
@@ -0,0 +1,10 @@
+//   /    ctx:                         https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//   \    Copyright 2026-present Context contributors.
+//                 SPDX-License-Identifier: Apache-2.0
+
+// Package status implements the ctx config status subcommand.
+//
+// It displays the active configuration profile and resolved settings.
+package status
diff --git a/internal/cli/config/cmd/status/run.go b/internal/cli/config/cmd/status/run.go
index 2cc48d4c..70714160 100644
--- a/internal/cli/config/cmd/status/run.go
+++ b/internal/cli/config/cmd/status/run.go
@@ -7,14 +7,13 @@
 package status
 
 import (
-	"fmt"
-
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/cli/config/core"
+	"github.com/ActiveMemory/ctx/internal/write"
 )
 
-// RunStatus prints the active .ctxrc profile.
+// Run prints the active .ctxrc profile.
 //
 // Parameters:
 //   - cmd: Cobra command for output
@@ -22,15 +21,15 @@ import (
 //
 // Returns:
 //   - error: Always nil (included for RunE compatibility)
-func RunStatus(cmd *cobra.Command, root string) error {
-	profile := core.DetectProfile(root)
+func Run(cmd *cobra.Command, root string) error {
+	profile := core.DetectProfile()
 	switch profile {
 	case core.ProfileDev:
-		cmd.Println("active: dev (verbose logging enabled)")
+		write.InfoConfigProfileDev(cmd)
 	case core.ProfileBase:
-		cmd.Println("active: base (defaults)")
+		write.InfoConfigProfileBase(cmd)
 	default:
-		cmd.Println(fmt.Sprintf("active: none (%s does not exist)", core.FileCtxRC))
+		write.InfoConfigProfileNone(cmd, core.FileCtxRC)
 	}
 	return nil
 }
diff --git a/internal/cli/config/cmd/status/run_test.go b/internal/cli/config/cmd/status/run_test.go
index f0295781..77f07073 100644
--- a/internal/cli/config/cmd/status/run_test.go
+++ b/internal/cli/config/cmd/status/run_test.go
@@ -16,11 +16,12 @@ import (
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/cli/config/core"
+	"github.com/ActiveMemory/ctx/internal/rc"
 )
 
 const (
-	devContent  = "notify:\n  events:\n    - loop\n"
-	baseContent = "# .ctxrc\n# context_dir: .context\n"
+	devContent  = "profile: dev\nnotify:\n  events:\n    - loop\n"
+	baseContent = "profile: base\n# context_dir: .context\n"
 )
 
 func newTestCmd() *cobra.Command {
@@ -34,6 +35,17 @@ func cmdOutput(cmd *cobra.Command) string {
 	return cmd.OutOrStdout().(*bytes.Buffer).String()
 }
 
+func chdirWithCleanup(t *testing.T, dir string) {
+	t.Helper()
+	origDir, _ := os.Getwd()
+	_ = os.Chdir(dir)
+	rc.Reset()
+	t.Cleanup(func() {
+		_ = os.Chdir(origDir)
+		rc.Reset()
+	})
+}
+
 func TestStatus_Dev(t *testing.T) {
 	root := t.TempDir()
 	if writeErr := os.WriteFile(
@@ -41,9 +53,10 @@ func TestStatus_Dev(t *testing.T) {
 	); writeErr != nil {
 		t.Fatal(writeErr)
 	}
+	chdirWithCleanup(t, root)
 
 	cmd := newTestCmd()
-	if statusErr := RunStatus(cmd, root); statusErr != nil {
+	if statusErr := Run(cmd, root); statusErr != nil {
 		t.Fatalf("unexpected error: %v", statusErr)
 	}
 
@@ -60,9 +73,10 @@ func TestStatus_Base(t *testing.T) {
 	); writeErr != nil {
 		t.Fatal(writeErr)
 	}
+	chdirWithCleanup(t, root)
 
 	cmd := newTestCmd()
-	if statusErr := RunStatus(cmd, root); statusErr != nil {
+	if statusErr := Run(cmd, root); statusErr != nil {
 		t.Fatalf("unexpected error: %v", statusErr)
 	}
 
@@ -74,9 +88,10 @@ func TestStatus_Base(t *testing.T) {
 
 func TestStatus_Missing(t *testing.T) {
 	root := t.TempDir()
+	chdirWithCleanup(t, root)
 
 	cmd := newTestCmd()
-	if statusErr := RunStatus(cmd, root); statusErr != nil {
+	if statusErr := Run(cmd, root); statusErr != nil {
 		t.Fatalf("unexpected error: %v", statusErr)
 	}
 
diff --git a/internal/cli/config/cmd/switchcmd/cmd.go b/internal/cli/config/cmd/switchcmd/cmd.go
index f0a9fa3a..3c652001 100644
--- a/internal/cli/config/cmd/switchcmd/cmd.go
+++ b/internal/cli/config/cmd/switchcmd/cmd.go
@@ -7,11 +7,10 @@
 package switchcmd
 
 import (
+	internalConfig "github.com/ActiveMemory/ctx/internal/config/cli"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/assets"
-	internalConfig "github.com/ActiveMemory/ctx/internal/config"
-
 	"github.com/ActiveMemory/ctx/internal/cli/config/core"
 )
 
@@ -20,7 +19,7 @@ import (
 // Returns:
 //   - *cobra.Command: Configured switch subcommand
 func Cmd() *cobra.Command {
-	short, long := assets.CommandDesc("config.switch")
+	short, long := assets.CommandDesc(assets.CmdDescKeyConfigSwitch)
 
 	return &cobra.Command{
 		Use:         "switch [dev|base]",
@@ -33,7 +32,7 @@ func Cmd() *cobra.Command {
 			if rootErr != nil {
 				return rootErr
 			}
-			return RunSwitch(cmd, root, args)
+			return Run(cmd, root, args)
 		},
 	}
 }
diff --git a/internal/cli/config/cmd/switchcmd/doc.go b/internal/cli/config/cmd/switchcmd/doc.go
new file mode 100644
index 00000000..52ea2ac7
--- /dev/null
+++ b/internal/cli/config/cmd/switchcmd/doc.go
@@ -0,0 +1,10 @@
+//   /    ctx:                         https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//   \    Copyright 2026-present Context contributors.
+//                 SPDX-License-Identifier: Apache-2.0
+
+// Package switchcmd implements the ctx config switch subcommand.
+//
+// It switches the active .ctxrc configuration between dev and base profiles.
+package switchcmd
diff --git a/internal/cli/config/cmd/switchcmd/run.go b/internal/cli/config/cmd/switchcmd/run.go
index 5fc09961..8a9f251e 100644
--- a/internal/cli/config/cmd/switchcmd/run.go
+++ b/internal/cli/config/cmd/switchcmd/run.go
@@ -7,14 +7,13 @@
 package switchcmd
 
 import (
-	"fmt"
-
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/cli/config/core"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 )
 
-// RunSwitch executes the profile switch logic.
+// Run executes the profile switch logic.
 //
 // Parameters:
 //   - cmd: Cobra command for output
@@ -23,58 +22,39 @@ import (
 //
 // Returns:
 //   - error: Non-nil on unknown profile or copy failure
-func RunSwitch(cmd *cobra.Command, root string, args []string) error {
+func Run(cmd *cobra.Command, root string, args []string) error {
 	var target string
 	if len(args) > 0 {
 		target = args[0]
 	}
 
 	// Normalize "prod" alias.
-	if target == "prod" {
+	if target == core.ProfileProd {
 		target = core.ProfileBase
 	}
 
+	var profile string
 	switch target {
 	case core.ProfileDev:
-		return switchTo(cmd, root, core.ProfileDev)
+		profile = core.ProfileDev
 	case core.ProfileBase:
-		return switchTo(cmd, root, core.ProfileBase)
+		profile = core.ProfileBase
 	case "":
 		// Toggle.
-		current := core.DetectProfile(root)
+		current := core.DetectProfile()
 		if current == core.ProfileDev {
-			return switchTo(cmd, root, core.ProfileBase)
+			profile = core.ProfileBase
+		} else {
+			profile = core.ProfileDev
 		}
-		return switchTo(cmd, root, core.ProfileDev)
 	default:
-		return fmt.Errorf(
-			"unknown profile %q: must be dev, base, or prod", target)
-	}
-}
-
-// switchTo copies the requested profile and prints a status message.
-func switchTo(cmd *cobra.Command, root, profile string) error {
-	current := core.DetectProfile(root)
-	if current == profile {
-		cmd.Println(fmt.Sprintf("already on %s profile", profile))
-		return nil
-	}
-
-	var srcFile string
-	if profile == core.ProfileDev {
-		srcFile = core.FileCtxRCDev
-	} else {
-		srcFile = core.FileCtxRCBase
-	}
-
-	if copyErr := core.CopyProfile(root, srcFile); copyErr != nil {
-		return copyErr
+		return ctxerr.UnknownProfile(target)
 	}
 
-	if current == "" {
-		cmd.Println(fmt.Sprintf("created %s from %s profile", core.FileCtxRC, profile))
-	} else {
-		cmd.Println(fmt.Sprintf("switched to %s profile", profile))
+	msg, switchErr := core.SwitchTo(root, profile)
+	if switchErr != nil {
+		return switchErr
 	}
+	cmd.Println(msg)
 	return nil
 }
diff --git a/internal/cli/config/cmd/switchcmd/run_test.go b/internal/cli/config/cmd/switchcmd/run_test.go
index a8f55d64..aa8c3e4d 100644
--- a/internal/cli/config/cmd/switchcmd/run_test.go
+++ b/internal/cli/config/cmd/switchcmd/run_test.go
@@ -16,11 +16,12 @@ import (
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/cli/config/core"
+	"github.com/ActiveMemory/ctx/internal/rc"
 )
 
 const (
-	devContent  = "notify:\n  events:\n    - loop\n"
-	baseContent = "# .ctxrc\n# context_dir: .context\n"
+	devContent  = "profile: dev\nnotify:\n  events:\n    - loop\n"
+	baseContent = "profile: base\n# context_dir: .context\n"
 )
 
 func setupProfiles(t *testing.T) string {
@@ -37,6 +38,15 @@ func setupProfiles(t *testing.T) string {
 	); writeErr != nil {
 		t.Fatal(writeErr)
 	}
+
+	origDir, _ := os.Getwd()
+	_ = os.Chdir(root)
+	rc.Reset()
+	t.Cleanup(func() {
+		_ = os.Chdir(origDir)
+		rc.Reset()
+	})
+
 	return root
 }
 
@@ -60,7 +70,7 @@ func TestSwitch_DevToBase(t *testing.T) {
 	}
 
 	cmd := newTestCmd()
-	if switchErr := RunSwitch(cmd, root, []string{"base"}); switchErr != nil {
+	if switchErr := Run(cmd, root, []string{"base"}); switchErr != nil {
 		t.Fatalf("unexpected error: %v", switchErr)
 	}
 
@@ -69,7 +79,8 @@ func TestSwitch_DevToBase(t *testing.T) {
 		t.Errorf("expected 'switched to base', got: %s", out)
 	}
 
-	if got := core.DetectProfile(root); got != core.ProfileBase {
+	rc.Reset()
+	if got := core.DetectProfile(); got != core.ProfileBase {
 		t.Errorf("profile should be base after switch, got %q", got)
 	}
 }
@@ -83,7 +94,7 @@ func TestSwitch_BaseToDev(t *testing.T) {
 	}
 
 	cmd := newTestCmd()
-	if switchErr := RunSwitch(cmd, root, []string{"dev"}); switchErr != nil {
+	if switchErr := Run(cmd, root, []string{"dev"}); switchErr != nil {
 		t.Fatalf("unexpected error: %v", switchErr)
 	}
 
@@ -92,7 +103,8 @@ func TestSwitch_BaseToDev(t *testing.T) {
 		t.Errorf("expected 'switched to dev', got: %s", out)
 	}
 
-	if got := core.DetectProfile(root); got != core.ProfileDev {
+	rc.Reset()
+	if got := core.DetectProfile(); got != core.ProfileDev {
 		t.Errorf("profile should be dev after switch, got %q", got)
 	}
 }
@@ -106,7 +118,7 @@ func TestSwitch_AlreadyOnProfile(t *testing.T) {
 	}
 
 	cmd := newTestCmd()
-	if switchErr := RunSwitch(cmd, root, []string{"dev"}); switchErr != nil {
+	if switchErr := Run(cmd, root, []string{"dev"}); switchErr != nil {
 		t.Fatalf("unexpected error: %v", switchErr)
 	}
 
@@ -125,7 +137,7 @@ func TestSwitch_ProdAlias(t *testing.T) {
 	}
 
 	cmd := newTestCmd()
-	if switchErr := RunSwitch(cmd, root, []string{"prod"}); switchErr != nil {
+	if switchErr := Run(cmd, root, []string{"prod"}); switchErr != nil {
 		t.Fatalf("unexpected error: %v", switchErr)
 	}
 
@@ -144,11 +156,12 @@ func TestSwitch_Toggle_DevToBase(t *testing.T) {
 	}
 
 	cmd := newTestCmd()
-	if switchErr := RunSwitch(cmd, root, nil); switchErr != nil {
+	if switchErr := Run(cmd, root, nil); switchErr != nil {
 		t.Fatalf("unexpected error: %v", switchErr)
 	}
 
-	if got := core.DetectProfile(root); got != core.ProfileBase {
+	rc.Reset()
+	if got := core.DetectProfile(); got != core.ProfileBase {
 		t.Errorf("toggle from dev should go to base, got %q", got)
 	}
 }
@@ -162,11 +175,12 @@ func TestSwitch_Toggle_BaseToDev(t *testing.T) {
 	}
 
 	cmd := newTestCmd()
-	if switchErr := RunSwitch(cmd, root, nil); switchErr != nil {
+	if switchErr := Run(cmd, root, nil); switchErr != nil {
 		t.Fatalf("unexpected error: %v", switchErr)
 	}
 
-	if got := core.DetectProfile(root); got != core.ProfileDev {
+	rc.Reset()
+	if got := core.DetectProfile(); got != core.ProfileDev {
 		t.Errorf("toggle from base should go to dev, got %q", got)
 	}
 }
@@ -175,11 +189,12 @@ func TestSwitch_Toggle_MissingCtxrc(t *testing.T) {
 	root := setupProfiles(t)
 
 	cmd := newTestCmd()
-	if switchErr := RunSwitch(cmd, root, nil); switchErr != nil {
+	if switchErr := Run(cmd, root, nil); switchErr != nil {
 		t.Fatalf("unexpected error: %v", switchErr)
 	}
 
-	if got := core.DetectProfile(root); got != core.ProfileDev {
+	rc.Reset()
+	if got := core.DetectProfile(); got != core.ProfileDev {
 		t.Errorf("toggle from missing should go to dev, got %q", got)
 	}
 }
@@ -188,7 +203,7 @@ func TestSwitch_InvalidProfile(t *testing.T) {
 	root := setupProfiles(t)
 
 	cmd := newTestCmd()
-	switchErr := RunSwitch(cmd, root, []string{"invalid"})
+	switchErr := Run(cmd, root, []string{"invalid"})
 	if switchErr == nil {
 		t.Fatal("expected error for invalid profile")
 	}
diff --git a/internal/cli/config/config.go b/internal/cli/config/config.go
index 307d942e..99cc1e84 100644
--- a/internal/cli/config/config.go
+++ b/internal/cli/config/config.go
@@ -20,7 +20,7 @@ import (
 // Returns:
 //   - *cobra.Command: Configured config command with subcommands
 func Cmd() *cobra.Command {
-	short, long := assets.CommandDesc("config")
+	short, long := assets.CommandDesc(assets.CmdDescKeyConfig)
 
 	cmd := &cobra.Command{
 		Use:   "config",
diff --git a/internal/cli/config/core/core.go b/internal/cli/config/core/core.go
index 1ae69174..00104d9a 100644
--- a/internal/cli/config/core/core.go
+++ b/internal/cli/config/core/core.go
@@ -8,45 +8,35 @@
 package core
 
 import (
-	"fmt"
 	"os"
 	"os/exec"
 	"path/filepath"
 	"strings"
 
-	internalConfig "github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
+	"github.com/ActiveMemory/ctx/internal/rc"
+	"github.com/ActiveMemory/ctx/internal/validation"
 )
 
-// Profile file names and identifiers.
+// Profile file names and identifiers — aliased from internal/config/file.
 const (
-	FileCtxRC     = ".ctxrc"
-	FileCtxRCBase = ".ctxrc.base"
-	FileCtxRCDev  = ".ctxrc.dev"
-
-	ProfileDev  = "dev"
-	ProfileBase = "base"
+	FileCtxRC     = file.CtxRC
+	FileCtxRCBase = file.CtxRCBase
+	FileCtxRCDev  = file.CtxRCDev
+	ProfileDev    = file.ProfileDev
+	ProfileBase   = file.ProfileBase
+	ProfileProd   = file.ProfileProd
 )
 
-// DetectProfile reads .ctxrc and returns "dev" or "base" based on the
-// presence of an uncommented "notify:" line. Returns "" if the file is missing.
-//
-// Parameters:
-//   - root: Git repository root directory
+// DetectProfile returns the active profile name from the parsed .ctxrc.
+// Returns "" if .ctxrc is missing or has no profile field.
 //
 // Returns:
-//   - string: Profile name ("dev", "base", or "" if missing)
-func DetectProfile(root string) string {
-	data, readErr := os.ReadFile(filepath.Join(root, FileCtxRC)) //nolint:gosec // project-local config file
-	if readErr != nil {
-		return ""
-	}
-
-	for _, line := range strings.Split(string(data), internalConfig.NewlineLF) {
-		if strings.HasPrefix(strings.TrimSpace(line), "notify:") {
-			return ProfileDev
-		}
-	}
-	return ProfileBase
+//   - string: Profile name ("dev", "base", or "")
+func DetectProfile() string {
+	return rc.RC().Profile
 }
 
 // CopyProfile copies a source profile file to .ctxrc.
@@ -58,25 +48,61 @@ func DetectProfile(root string) string {
 // Returns:
 //   - error: Non-nil on read or write failure
 func CopyProfile(root, srcFile string) error {
-	src := filepath.Join(root, srcFile)
-	data, readErr := os.ReadFile(src) //nolint:gosec // project-local file
+	data, readErr := validation.SafeReadFile(root, srcFile)
 	if readErr != nil {
-		return fmt.Errorf("read %s: %w", srcFile, readErr)
+		return ctxerr.ReadProfile(srcFile, readErr)
 	}
 
 	dst := filepath.Join(root, FileCtxRC)
-	return os.WriteFile(dst, data, internalConfig.PermFile)
+	return os.WriteFile(dst, data, fs.PermFile)
 }
 
-// GitRoot returns the git repository root directory.
+// SwitchTo copies the requested profile to .ctxrc and returns a status message.
+//
+// If the requested profile is already active, returns a no-op message.
+// If .ctxrc did not previously exist, returns a "created" message.
+//
+// Parameters:
+//   - root: Git repository root directory
+//   - profile: Target profile name (ProfileDev or ProfileBase)
 //
 // Returns:
-//   - string: Absolute path to the git root
-//   - error: Non-nil when not inside a git repository
+//   - string: Status message for the user
+//   - error: Non-nil if the profile file copy fails
+func SwitchTo(root, profile string) (string, error) {
+	current := DetectProfile()
+	if current == profile {
+		return "already on " + profile + " profile", nil
+	}
+
+	srcFile := FileCtxRCBase
+	if profile == ProfileDev {
+		srcFile = FileCtxRCDev
+	}
+
+	if copyErr := CopyProfile(root, srcFile); copyErr != nil {
+		return "", copyErr
+	}
+
+	if current == "" {
+		return "created " + FileCtxRC + " from " + profile + " profile", nil
+	}
+	return "switched to " + profile + " profile", nil
+}
+
+// GitRoot returns the git repository root directory.
+//
+// Returns an error if git is not installed or the current directory is
+// not inside a git repository. Features that depend on git should
+// degrade gracefully when this returns an error.
 func GitRoot() (string, error) {
+	if _, lookErr := exec.LookPath("git"); lookErr != nil {
+		return "", ctxerr.GitNotFound()
+	}
+
 	out, execErr := exec.Command("git", "rev-parse", "--show-toplevel").Output()
 	if execErr != nil {
-		return "", fmt.Errorf("not in a git repository: %w", execErr)
+		return "", ctxerr.NotInGitRepo(execErr)
 	}
 	return strings.TrimSpace(string(out)), nil
 }
diff --git a/internal/cli/config/core/core_test.go b/internal/cli/config/core/core_test.go
index 44314d7f..799c9e65 100644
--- a/internal/cli/config/core/core_test.go
+++ b/internal/cli/config/core/core_test.go
@@ -10,13 +10,26 @@ import (
 	"os"
 	"path/filepath"
 	"testing"
+
+	"github.com/ActiveMemory/ctx/internal/rc"
 )
 
 const (
-	devContent  = "notify:\n  events:\n    - loop\n"
-	baseContent = "# .ctxrc\n# context_dir: .context\n"
+	devContent  = "profile: dev\nnotify:\n  events:\n    - loop\n"
+	baseContent = "profile: base\n# context_dir: .context\n"
 )
 
+func chdirWithCleanup(t *testing.T, dir string) {
+	t.Helper()
+	origDir, _ := os.Getwd()
+	_ = os.Chdir(dir)
+	rc.Reset()
+	t.Cleanup(func() {
+		_ = os.Chdir(origDir)
+		rc.Reset()
+	})
+}
+
 // TestCopyProfile_MissingSource verifies error on nonexistent source file.
 func TestCopyProfile_MissingSource(t *testing.T) {
 	root := t.TempDir()
@@ -61,8 +74,9 @@ func TestDetectProfile_Dev(t *testing.T) {
 	); writeErr != nil {
 		t.Fatal(writeErr)
 	}
+	chdirWithCleanup(t, root)
 
-	got := DetectProfile(root)
+	got := DetectProfile()
 	if got != ProfileDev {
 		t.Errorf("expected dev, got %q", got)
 	}
@@ -76,8 +90,9 @@ func TestDetectProfile_Base(t *testing.T) {
 	); writeErr != nil {
 		t.Fatal(writeErr)
 	}
+	chdirWithCleanup(t, root)
 
-	got := DetectProfile(root)
+	got := DetectProfile()
 	if got != ProfileBase {
 		t.Errorf("expected base, got %q", got)
 	}
@@ -86,7 +101,8 @@ func TestDetectProfile_Base(t *testing.T) {
 // TestDetectProfile_Missing verifies empty string for missing file.
 func TestDetectProfile_Missing(t *testing.T) {
 	root := t.TempDir()
-	got := DetectProfile(root)
+	chdirWithCleanup(t, root)
+	got := DetectProfile()
 	if got != "" {
 		t.Errorf("expected empty for missing file, got %q", got)
 	}
diff --git a/internal/cli/config/core/doc.go b/internal/cli/config/core/doc.go
new file mode 100644
index 00000000..df42b177
--- /dev/null
+++ b/internal/cli/config/core/doc.go
@@ -0,0 +1,11 @@
+//   /    ctx:                         https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//   \    Copyright 2026-present Context contributors.
+//                 SPDX-License-Identifier: Apache-2.0
+
+// Package core provides shared helpers for config subcommands.
+//
+// It handles profile detection, profile switching, and .ctxrc file
+// resolution used by the config status and switch commands.
+package core
diff --git a/internal/cli/decision/cmd/reindex/cmd.go b/internal/cli/decision/cmd/reindex/cmd.go
index 06d81483..cd9a348d 100644
--- a/internal/cli/decision/cmd/reindex/cmd.go
+++ b/internal/cli/decision/cmd/reindex/cmd.go
@@ -18,11 +18,11 @@ import (
 // Returns:
 //   - *cobra.Command: Command for regenerating the DECISIONS.md index
 func Cmd() *cobra.Command {
-	short, long := assets.CommandDesc("decision.reindex")
+	short, long := assets.CommandDesc(assets.CmdDescKeyDecisionReindex)
 	return &cobra.Command{
 		Use:   "reindex",
 		Short: short,
 		Long:  long,
-		RunE:  run,
+		RunE:  Run,
 	}
 }
diff --git a/internal/cli/decision/cmd/reindex/doc.go b/internal/cli/decision/cmd/reindex/doc.go
new file mode 100644
index 00000000..1bb231dd
--- /dev/null
+++ b/internal/cli/decision/cmd/reindex/doc.go
@@ -0,0 +1,10 @@
+//   /    ctx:                         https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//   \    Copyright 2026-present Context contributors.
+//                 SPDX-License-Identifier: Apache-2.0
+
+// Package reindex implements the ctx decision reindex subcommand.
+//
+// It regenerates the DECISIONS.md index from individual decision files.
+package reindex
diff --git a/internal/cli/decision/cmd/reindex/run.go b/internal/cli/decision/cmd/reindex/run.go
index 5cf96a48..5dd516d2 100644
--- a/internal/cli/decision/cmd/reindex/run.go
+++ b/internal/cli/decision/cmd/reindex/run.go
@@ -9,28 +9,28 @@ package reindex
 import (
 	"path/filepath"
 
+	"github.com/ActiveMemory/ctx/internal/config/ctx"
 	"github.com/spf13/cobra"
 
-	"github.com/ActiveMemory/ctx/internal/config"
 	"github.com/ActiveMemory/ctx/internal/index"
 	"github.com/ActiveMemory/ctx/internal/rc"
 )
 
-// run regenerates the DECISIONS.md index.
+// Run regenerates the DECISIONS.md index.
 //
 // Parameters:
 //   - cmd: Cobra command for output messages
 //   - args: Command arguments (unused)
 //
 // Returns:
-//   - error: Non-nil if file read/write fails
-func run(cmd *cobra.Command, _ []string) error {
-	filePath := filepath.Join(rc.ContextDir(), config.FileDecision)
+//   - error: Non-nil if the file read/write fails
+func Run(cmd *cobra.Command, _ []string) error {
+	filePath := filepath.Join(rc.ContextDir(), ctx.Decision)
 	return index.ReindexFile(
 		cmd.OutOrStdout(),
 		filePath,
-		config.FileDecision,
+		ctx.Decision,
 		index.UpdateDecisions,
-		config.EntryPlural[config.EntryDecision],
+		"decisions",
 	)
 }
diff --git a/internal/cli/decision/decision.go b/internal/cli/decision/decision.go
index 13083f0f..b6fe8a2b 100644
--- a/internal/cli/decision/decision.go
+++ b/internal/cli/decision/decision.go
@@ -22,7 +22,7 @@ import (
 // Returns:
 //   - *cobra.Command: The decisions command with subcommands
 func Cmd() *cobra.Command {
-	short, long := assets.CommandDesc("decision")
+	short, long := assets.CommandDesc(assets.CmdDescKeyDecision)
 	cmd := &cobra.Command{
 		Use:   "decisions",
 		Short: short,
diff --git a/internal/cli/decision/decision_test.go b/internal/cli/decision/decision_test.go
index 36abcc57..80d98726 100644
--- a/internal/cli/decision/decision_test.go
+++ b/internal/cli/decision/decision_test.go
@@ -11,7 +11,8 @@ import (
 	"path/filepath"
 	"testing"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/ctx"
+	"github.com/ActiveMemory/ctx/internal/config/dir"
 	"github.com/ActiveMemory/ctx/internal/rc"
 )
 
@@ -85,7 +86,7 @@ func TestRunReindex_WithFile(t *testing.T) {
 	defer rc.Reset()
 
 	// Create the context directory and DECISIONS.md file
-	ctxDir := filepath.Join(tempDir, config.DirContext)
+	ctxDir := filepath.Join(tempDir, dir.Context)
 	_ = os.MkdirAll(ctxDir, 0750)
 
 	content := `# Decisions
@@ -96,7 +97,7 @@ func TestRunReindex_WithFile(t *testing.T) {
 **Rationale:** YAML is human-readable
 **Consequences:** Added yaml dependency
 `
-	_ = os.WriteFile(filepath.Join(ctxDir, config.FileDecision), []byte(content), 0600)
+	_ = os.WriteFile(filepath.Join(ctxDir, ctx.Decision), []byte(content), 0600)
 
 	cmd := Cmd()
 	cmd.SetArgs([]string{"reindex"})
@@ -107,7 +108,7 @@ func TestRunReindex_WithFile(t *testing.T) {
 	}
 
 	// Verify the file was updated
-	updated, err := os.ReadFile(filepath.Join(ctxDir, config.FileDecision)) //nolint:gosec // test temp path
+	updated, err := os.ReadFile(filepath.Join(ctxDir, ctx.Decision)) //nolint:gosec // test temp path
 	if err != nil {
 		t.Fatalf("failed to read updated file: %v", err)
 	}
@@ -126,9 +127,9 @@ func TestRunReindex_EmptyFile(t *testing.T) {
 	defer rc.Reset()
 
 	// Create the context directory and empty DECISIONS.md
-	ctxDir := filepath.Join(tempDir, config.DirContext)
+	ctxDir := filepath.Join(tempDir, dir.Context)
 	_ = os.MkdirAll(ctxDir, 0750)
-	_ = os.WriteFile(filepath.Join(ctxDir, config.FileDecision), []byte("# Decisions\n"), 0600)
+	_ = os.WriteFile(filepath.Join(ctxDir, ctx.Decision), []byte("# Decisions\n"), 0600)
 
 	cmd := Cmd()
 	cmd.SetArgs([]string{"reindex"})
diff --git a/internal/cli/decision/doc.go b/internal/cli/decision/doc.go
index 2fc82701..a29ad8c6 100644
--- a/internal/cli/decision/doc.go
+++ b/internal/cli/decision/doc.go
@@ -4,5 +4,5 @@
 //   \    Copyright 2026-present Context contributors.
 //                 SPDX-License-Identifier: Apache-2.0
 
-// Package decision manages DECISIONS.md file and its quick-reference index.
+// Package decision manages the DECISIONS.md file and its quick-reference index.
 package decision
diff --git a/internal/cli/deps/cmd/root/cmd.go b/internal/cli/deps/cmd/root/cmd.go
index d37451a9..82238f41 100644
--- a/internal/cli/deps/cmd/root/cmd.go
+++ b/internal/cli/deps/cmd/root/cmd.go
@@ -5,3 +5,42 @@
 //                 SPDX-License-Identifier: Apache-2.0
 
 package root
+
+import (
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+)
+
+// Cmd returns the deps command.
+//
+// Flags:
+//   - --format: Output format (mermaid, table, json)
+//   - --external: Include external module dependencies
+//   - --type: Force project type override
+//
+// Returns:
+//   - *cobra.Command: Configured deps command with flags registered
+func Cmd() *cobra.Command {
+	var (
+		format   string
+		external bool
+		projType string
+	)
+
+	short, long := assets.CommandDesc(assets.CmdDescKeyDeps)
+	cmd := &cobra.Command{
+		Use:   "deps",
+		Short: short,
+		Long:  long,
+		RunE: func(cmd *cobra.Command, _ []string) error {
+			return Run(cmd, format, external, projType)
+		},
+	}
+
+	cmd.Flags().StringVar(&format, "format", "mermaid", assets.FlagDesc(assets.FlagDescKeyDepsFormat))
+	cmd.Flags().BoolVar(&external, "external", false, assets.FlagDesc(assets.FlagDescKeyDepsExternal))
+	cmd.Flags().StringVar(&projType, "type", "", assets.FlagDesc(assets.FlagDescKeyDepsType))
+
+	return cmd
+}
diff --git a/internal/cli/deps/cmd/root/doc.go b/internal/cli/deps/cmd/root/doc.go
new file mode 100644
index 00000000..62da7a43
--- /dev/null
+++ b/internal/cli/deps/cmd/root/doc.go
@@ -0,0 +1,11 @@
+//   /    ctx:                         https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//   \    Copyright 2026-present Context contributors.
+//                 SPDX-License-Identifier: Apache-2.0
+
+// Package root implements the ctx deps command.
+//
+// It shows the package dependency graph for the current project in
+// Mermaid, table, or JSON format.
+package root
diff --git a/internal/cli/deps/cmd/root/run.go b/internal/cli/deps/cmd/root/run.go
index ab800683..520d0279 100644
--- a/internal/cli/deps/cmd/root/run.go
+++ b/internal/cli/deps/cmd/root/run.go
@@ -7,19 +7,21 @@
 package root
 
 import (
-	"fmt"
 	"strings"
 
+	"github.com/ActiveMemory/ctx/internal/config/fmt"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/cli/deps/core"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
+	"github.com/ActiveMemory/ctx/internal/write"
 )
 
 // Run executes the deps command logic.
 //
 // Parameters:
 //   - cmd: Cobra command for output stream
-//   - format: Output format ("mermaid", "table", or "json")
+//   - format: Output format (fmt.FormatMermaid, fmt.FormatTable, or fmt.FormatJSON)
 //   - external: If true, include external module dependencies
 //   - projType: Force project type override; empty for auto-detect
 //
@@ -27,24 +29,26 @@ import (
 //   - error: Non-nil if format is invalid, project type unknown,
 //     or graph building fails
 func Run(cmd *cobra.Command, format string, external bool, projType string) error {
+	supportedFormats := strings.Join([]string{
+		fmt.FormatMermaid, fmt.FormatTable, fmt.FormatJSON,
+	}, ", ")
+
 	switch format {
-	case "mermaid", "table", "json":
+	case fmt.FormatMermaid, fmt.FormatTable, fmt.FormatJSON:
 	default:
-		return fmt.Errorf("unknown format %q (supported: mermaid, table, json)", format)
+		return ctxerr.UnknownFormat(format, supportedFormats)
 	}
 
 	var builder core.GraphBuilder
 	if projType != "" {
 		builder = core.FindBuilder(projType)
 		if builder == nil {
-			return fmt.Errorf("unknown project type %q (supported: %s)", projType, strings.Join(core.BuilderNames(), ", "))
+			return ctxerr.UnknownProjectType(projType, strings.Join(core.BuilderNames(), ", "))
 		}
 	} else {
 		builder = core.DetectBuilder()
 		if builder == nil {
-			cmd.Println("No supported project detected.")
-			cmd.Println("Looking for: go.mod, package.json, requirements.txt, pyproject.toml, Cargo.toml")
-			cmd.Println("Use --type to force: " + strings.Join(core.BuilderNames(), ", "))
+			write.InfoDepsNoProject(cmd, strings.Join(core.BuilderNames(), ", "))
 			return nil
 		}
 	}
@@ -55,16 +59,16 @@ func Run(cmd *cobra.Command, format string, external bool, projType string) erro
 	}
 
 	if len(graph) == 0 {
-		cmd.Println("No dependencies found.")
+		write.InfoDepsNoDeps(cmd)
 		return nil
 	}
 
 	switch format {
-	case "mermaid":
+	case config.FormatMermaid:
 		cmd.Print(core.RenderMermaid(graph))
-	case "table":
+	case config.FormatTable:
 		cmd.Print(core.RenderTable(graph))
-	case "json":
+	case config.FormatJSON:
 		cmd.Print(core.RenderJSON(graph))
 	}
 
diff --git a/internal/cli/deps/core/builder.go b/internal/cli/deps/core/builder.go
index 910bad32..d75e2654 100644
--- a/internal/cli/deps/core/builder.go
+++ b/internal/cli/deps/core/builder.go
@@ -22,7 +22,7 @@ type GraphBuilder interface {
 }
 
 // Builders is the ordered registry of graph builders.
-// Detection walks this list; first match wins.
+// Detection walks this list; the first match wins.
 var Builders = []GraphBuilder{
 	&GoBuilder{},
 	&NodeBuilder{},
diff --git a/internal/cli/deps/core/format.go b/internal/cli/deps/core/format.go
index 2ebbf84b..e80ba971 100644
--- a/internal/cli/deps/core/format.go
+++ b/internal/cli/deps/core/format.go
@@ -12,7 +12,7 @@ import (
 	"sort"
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )
 
 // MermaidID converts a package path to a valid Mermaid node ID.
@@ -83,7 +83,7 @@ func RenderTable(graph map[string][]string) string {
 //   - string: Pretty-printed JSON
 func RenderJSON(graph map[string][]string) string {
 	data, _ := json.MarshalIndent(graph, "", "  ")
-	return string(data) + config.NewlineLF
+	return string(data) + token.NewlineLF
 }
 
 // SortedKeys returns the keys of a map sorted alphabetically.
diff --git a/internal/cli/deps/core/python.go b/internal/cli/deps/core/python.go
index 09dace2c..7254d119 100644
--- a/internal/cli/deps/core/python.go
+++ b/internal/cli/deps/core/python.go
@@ -12,7 +12,7 @@ import (
 	"sort"
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )
 
 // PythonEcosystem is the ecosystem label for Python projects.
@@ -192,7 +192,7 @@ func BuildPyprojectGraph(includeDevDeps bool) (map[string][]string, error) {
 // Returns:
 //   - []string: Extracted dependency names
 func ParsePyprojectDeps(content string, sectionSuffix string) []string {
-	lines := strings.Split(content, config.NewlineLF)
+	lines := strings.Split(content, token.NewlineLF)
 	var deps []string
 	inSection := false
 	inArray := false
diff --git a/internal/cli/deps/deps.go b/internal/cli/deps/deps.go
index 130c9896..5ac2ea20 100644
--- a/internal/cli/deps/deps.go
+++ b/internal/cli/deps/deps.go
@@ -9,39 +9,10 @@ package deps
 import (
 	"github.com/spf13/cobra"
 
-	"github.com/ActiveMemory/ctx/internal/assets"
 	depsroot "github.com/ActiveMemory/ctx/internal/cli/deps/cmd/root"
 )
 
 // Cmd returns the deps command.
-//
-// Flags:
-//   - --format: Output format (mermaid, table, json)
-//   - --external: Include external module dependencies
-//   - --type: Force project type override
-//
-// Returns:
-//   - *cobra.Command: Configured deps command with flags registered
 func Cmd() *cobra.Command {
-	var (
-		format   string
-		external bool
-		projType string
-	)
-
-	short, long := assets.CommandDesc("deps")
-	cmd := &cobra.Command{
-		Use:   "deps",
-		Short: short,
-		Long:  long,
-		RunE: func(cmd *cobra.Command, _ []string) error {
-			return depsroot.Run(cmd, format, external, projType)
-		},
-	}
-
-	cmd.Flags().StringVar(&format, "format", "mermaid", assets.FlagDesc("deps.format"))
-	cmd.Flags().BoolVar(&external, "external", false, assets.FlagDesc("deps.external"))
-	cmd.Flags().StringVar(&projType, "type", "", assets.FlagDesc("deps.type"))
-
-	return cmd
+	return depsroot.Cmd()
 }
diff --git a/internal/cli/doc.go b/internal/cli/doc.go
new file mode 100644
index 00000000..1f80ae1c
--- /dev/null
+++ b/internal/cli/doc.go
@@ -0,0 +1,9 @@
+//   /    ctx:                         https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//   \    Copyright 2026-present Context contributors.
+//                 SPDX-License-Identifier: Apache-2.0
+
+// Package cli contains integration tests that verify the assembled CLI
+// binary behaves correctly end-to-end.
+package cli
diff --git a/internal/cli/doctor/cmd/root/cmd.go b/internal/cli/doctor/cmd/root/cmd.go
index d37451a9..3d117135 100644
--- a/internal/cli/doctor/cmd/root/cmd.go
+++ b/internal/cli/doctor/cmd/root/cmd.go
@@ -5,3 +5,33 @@
 //                 SPDX-License-Identifier: Apache-2.0
 
 package root
+
+import (
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/cli"
+)
+
+// Cmd returns the "ctx doctor" command.
+//
+// Flags:
+//   - --json, -j: Machine-readable JSON output
+//
+// Returns:
+//   - *cobra.Command: Configured doctor command with flags registered
+func Cmd() *cobra.Command {
+	short, long := assets.CommandDesc(assets.CmdDescKeyDoctor)
+	cmd := &cobra.Command{
+		Use:         "doctor",
+		Short:       short,
+		Annotations: map[string]string{cli.AnnotationSkipInit: cli.AnnotationTrue},
+		Long:        long,
+		RunE: func(cmd *cobra.Command, _ []string) error {
+			jsonOut, _ := cmd.Flags().GetBool("json")
+			return Run(cmd, jsonOut)
+		},
+	}
+	cmd.Flags().BoolP("json", "j", false, assets.FlagDesc(assets.FlagDescKeyDoctorJson))
+	return cmd
+}
diff --git a/internal/cli/doctor/cmd/root/doc.go b/internal/cli/doctor/cmd/root/doc.go
new file mode 100644
index 00000000..081eb2b4
--- /dev/null
+++ b/internal/cli/doctor/cmd/root/doc.go
@@ -0,0 +1,11 @@
+//   /    ctx:                         https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//   \    Copyright 2026-present Context contributors.
+//                 SPDX-License-Identifier: Apache-2.0
+
+// Package root implements the ctx doctor command.
+//
+// It performs a structural health check on the .context/ directory,
+// verifying required files, formatting, and configuration integrity.
+package root
diff --git a/internal/cli/doctor/core/checks.go b/internal/cli/doctor/core/checks.go
index 7002387f..7e5df222 100644
--- a/internal/cli/doctor/core/checks.go
+++ b/internal/cli/doctor/core/checks.go
@@ -13,8 +13,16 @@ import (
 	"path/filepath"
 	"strings"
 
+	"github.com/ActiveMemory/ctx/internal/assets"
 	"github.com/ActiveMemory/ctx/internal/cli/initialize"
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/claude"
+	"github.com/ActiveMemory/ctx/internal/config/crypto"
+	"github.com/ActiveMemory/ctx/internal/config/ctx"
+	"github.com/ActiveMemory/ctx/internal/config/doctor"
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/marker"
+	"github.com/ActiveMemory/ctx/internal/config/regex"
+	"github.com/ActiveMemory/ctx/internal/config/reminder"
 	"github.com/ActiveMemory/ctx/internal/context"
 	"github.com/ActiveMemory/ctx/internal/drift"
 	"github.com/ActiveMemory/ctx/internal/eventlog"
@@ -29,17 +37,17 @@ import (
 func CheckContextInitialized(report *Report) {
 	if context.Exists("") {
 		report.Results = append(report.Results, Result{
-			Name:     "context_initialized",
-			Category: "Structure",
+			Name:     doctor.CheckContextInit,
+			Category: doctor.CategoryStructure,
 			Status:   StatusOK,
-			Message:  "Context initialized (.context/)",
+			Message:  assets.TextDesc(assets.TextDescKeyDoctorContextInitializedOk),
 		})
 	} else {
 		report.Results = append(report.Results, Result{
-			Name:     "context_initialized",
-			Category: "Structure",
+			Name:     doctor.CheckContextInit,
+			Category: doctor.CategoryStructure,
 			Status:   StatusError,
-			Message:  "Context not initialized — run ctx init",
+			Message:  assets.TextDesc(assets.TextDescKeyDoctorContextInitializedError),
 		})
 	}
 }
@@ -51,29 +59,29 @@ func CheckContextInitialized(report *Report) {
 func CheckRequiredFiles(report *Report) {
 	dir := rc.ContextDir()
 	var missing []string
-	for _, f := range config.FilesRequired {
+	for _, f := range ctx.FilesRequired {
 		path := filepath.Join(dir, f)
 		if _, statErr := os.Stat(path); os.IsNotExist(statErr) {
 			missing = append(missing, f)
 		}
 	}
 
-	total := len(config.FilesRequired)
+	total := len(ctx.FilesRequired)
 	present := total - len(missing)
 
 	if len(missing) == 0 {
 		report.Results = append(report.Results, Result{
-			Name:     "required_files",
-			Category: "Structure",
+			Name:     doctor.CheckRequiredFiles,
+			Category: doctor.CategoryStructure,
 			Status:   StatusOK,
-			Message:  fmt.Sprintf("Required files present (%d/%d)", present, total),
+			Message:  fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDoctorRequiredFilesOk), present, total),
 		})
 	} else {
 		report.Results = append(report.Results, Result{
-			Name:     "required_files",
-			Category: "Structure",
+			Name:     doctor.CheckRequiredFiles,
+			Category: doctor.CategoryStructure,
 			Status:   StatusError,
-			Message:  fmt.Sprintf("Missing required files (%d/%d): %s", present, total, strings.Join(missing, ", ")),
+			Message:  fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDoctorRequiredFilesError), present, total, strings.Join(missing, ", ")),
 		})
 	}
 }
@@ -83,14 +91,14 @@ func CheckRequiredFiles(report *Report) {
 // Parameters:
 //   - report: Report to append the result to
 func CheckCtxrcValidation(report *Report) {
-	data, readErr := os.ReadFile(config.FileContextRC) //nolint:gosec // project-local config file
+	data, readErr := os.ReadFile(file.CtxRC) //nolint:gosec // project-local config file
 	if readErr != nil {
 		// No .ctxrc is fine — defaults are used.
 		report.Results = append(report.Results, Result{
-			Name:     "ctxrc_validation",
-			Category: "Structure",
+			Name:     doctor.CheckCtxrcValidation,
+			Category: doctor.CategoryStructure,
 			Status:   StatusOK,
-			Message:  "No .ctxrc file (using defaults)",
+			Message:  assets.TextDesc(assets.TextDescKeyDoctorCtxrcValidationOkNoFile),
 		})
 		return
 	}
@@ -98,29 +106,29 @@ func CheckCtxrcValidation(report *Report) {
 	warnings, validateErr := rc.Validate(data)
 	if validateErr != nil {
 		report.Results = append(report.Results, Result{
-			Name:     "ctxrc_validation",
-			Category: "Structure",
+			Name:     doctor.CheckCtxrcValidation,
+			Category: doctor.CategoryStructure,
 			Status:   StatusError,
-			Message:  fmt.Sprintf(".ctxrc parse error: %v", validateErr),
+			Message:  fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDoctorCtxrcValidationError), validateErr),
 		})
 		return
 	}
 
 	if len(warnings) > 0 {
 		report.Results = append(report.Results, Result{
-			Name:     "ctxrc_validation",
-			Category: "Structure",
+			Name:     doctor.CheckCtxrcValidation,
+			Category: doctor.CategoryStructure,
 			Status:   StatusWarning,
-			Message:  fmt.Sprintf(".ctxrc has unknown fields: %s", strings.Join(warnings, "; ")),
+			Message:  fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDoctorCtxrcValidationWarning), strings.Join(warnings, "; ")),
 		})
 		return
 	}
 
 	report.Results = append(report.Results, Result{
-		Name:     "ctxrc_validation",
-		Category: "Structure",
+		Name:     doctor.CheckCtxrcValidation,
+		Category: doctor.CategoryStructure,
 		Status:   StatusOK,
-		Message:  ".ctxrc valid",
+		Message:  assets.TextDesc(assets.TextDescKeyDoctorCtxrcValidationOk),
 	})
 }
 
@@ -136,10 +144,10 @@ func CheckDrift(report *Report) {
 	ctx, loadErr := context.Load("")
 	if loadErr != nil {
 		report.Results = append(report.Results, Result{
-			Name:     "drift",
-			Category: "Quality",
+			Name:     doctor.CheckDrift,
+			Category: doctor.CategoryQuality,
 			Status:   StatusWarning,
-			Message:  fmt.Sprintf("Could not load context for drift check: %v", loadErr),
+			Message:  fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDoctorDriftWarningLoad), loadErr),
 		})
 		return
 	}
@@ -150,20 +158,20 @@ func CheckDrift(report *Report) {
 
 	if warnCount == 0 && violCount == 0 {
 		report.Results = append(report.Results, Result{
-			Name:     "drift",
-			Category: "Quality",
+			Name:     doctor.CheckDrift,
+			Category: doctor.CategoryQuality,
 			Status:   StatusOK,
-			Message:  "No drift detected",
+			Message:  assets.TextDesc(assets.TextDescKeyDoctorDriftOk),
 		})
 		return
 	}
 
 	var parts []string
 	if violCount > 0 {
-		parts = append(parts, fmt.Sprintf("%d violations", violCount))
+		parts = append(parts, fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDoctorDriftViolations), violCount))
 	}
 	if warnCount > 0 {
-		parts = append(parts, fmt.Sprintf("%d warnings", warnCount))
+		parts = append(parts, fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDoctorDriftWarnings), warnCount))
 	}
 
 	status := StatusWarning
@@ -172,10 +180,10 @@ func CheckDrift(report *Report) {
 	}
 
 	report.Results = append(report.Results, Result{
-		Name:     "drift",
-		Category: "Quality",
+		Name:     doctor.CheckDrift,
+		Category: doctor.CategoryQuality,
 		Status:   status,
-		Message:  fmt.Sprintf("Drift: %s — run ctx drift for details", strings.Join(parts, ", ")),
+		Message:  fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDoctorDriftDetected), strings.Join(parts, ", ")),
 	})
 }
 
@@ -187,19 +195,19 @@ func CheckPluginEnablement(report *Report) {
 	installed := initialize.PluginInstalled()
 	if !installed {
 		report.Results = append(report.Results, Result{
-			Name:     "plugin_installed",
-			Category: "Plugin",
+			Name:     doctor.CheckPluginInstalled,
+			Category: doctor.CategoryPlugin,
 			Status:   StatusInfo,
-			Message:  "ctx plugin not installed",
+			Message:  assets.TextDesc(assets.TextDescKeyDoctorPluginInstalledInfo),
 		})
 		return
 	}
 
 	report.Results = append(report.Results, Result{
-		Name:     "plugin_installed",
-		Category: "Plugin",
+		Name:     doctor.CheckPluginInstalled,
+		Category: doctor.CategoryPlugin,
 		Status:   StatusOK,
-		Message:  "ctx plugin installed",
+		Message:  assets.TextDesc(assets.TextDescKeyDoctorPluginInstalledOk),
 	})
 
 	globalEnabled := initialize.PluginEnabledGlobally()
@@ -207,30 +215,28 @@ func CheckPluginEnablement(report *Report) {
 
 	if globalEnabled {
 		report.Results = append(report.Results, Result{
-			Name:     "plugin_enabled_global",
-			Category: "Plugin",
+			Name:     doctor.CheckPluginEnabledGlobal,
+			Category: doctor.CategoryPlugin,
 			Status:   StatusOK,
-			Message:  "Plugin enabled globally (~/.claude/settings.json)",
+			Message:  assets.TextDesc(assets.TextDescKeyDoctorPluginEnabledGlobalOk),
 		})
 	}
 
 	if localEnabled {
 		report.Results = append(report.Results, Result{
-			Name:     "plugin_enabled_local",
-			Category: "Plugin",
+			Name:     doctor.CheckPluginEnabledLocal,
+			Category: doctor.CategoryPlugin,
 			Status:   StatusOK,
-			Message:  "Plugin enabled locally (.claude/settings.local.json)",
+			Message:  assets.TextDesc(assets.TextDescKeyDoctorPluginEnabledLocalOk),
 		})
 	}
 
 	if !globalEnabled && !localEnabled {
 		report.Results = append(report.Results, Result{
-			Name:     "plugin_enabled",
-			Category: "Plugin",
+			Name:     doctor.CheckPluginEnabled,
+			Category: doctor.CategoryPlugin,
 			Status:   StatusWarning,
-			Message: "Plugin installed but not enabled — run 'ctx init' to auto-enable, " +
-				"or add {\"enabledPlugins\": {\"" + config.PluginID +
-				"\": true}} to ~/.claude/settings.json",
+			Message:  fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDoctorPluginEnabledWarning), claude.PluginID),
 		})
 	}
 }
@@ -242,17 +248,17 @@ func CheckPluginEnablement(report *Report) {
 func CheckEventLogging(report *Report) {
 	if rc.EventLog() {
 		report.Results = append(report.Results, Result{
-			Name:     "event_logging",
-			Category: "Hooks",
+			Name:     doctor.CheckEventLogging,
+			Category: doctor.CategoryHooks,
 			Status:   StatusOK,
-			Message:  "Event logging enabled",
+			Message:  assets.TextDesc(assets.TextDescKeyDoctorEventLoggingOk),
 		})
 	} else {
 		report.Results = append(report.Results, Result{
-			Name:     "event_logging",
-			Category: "Hooks",
+			Name:     doctor.CheckEventLogging,
+			Category: doctor.CategoryHooks,
 			Status:   StatusInfo,
-			Message:  "Event logging disabled (enable with event_log: true in .ctxrc)",
+			Message:  assets.TextDesc(assets.TextDescKeyDoctorEventLoggingInfo),
 		})
 	}
 }
@@ -263,20 +269,20 @@ func CheckEventLogging(report *Report) {
 //   - report: Report to append the result to
 func CheckWebhook(report *Report) {
 	dir := rc.ContextDir()
-	encPath := filepath.Join(dir, ".notify.enc")
+	encPath := filepath.Join(dir, crypto.NotifyEnc)
 	if _, statErr := os.Stat(encPath); statErr == nil {
 		report.Results = append(report.Results, Result{
-			Name:     "webhook",
-			Category: "Hooks",
+			Name:     doctor.CheckWebhook,
+			Category: doctor.CategoryHooks,
 			Status:   StatusOK,
-			Message:  "Webhook configured",
+			Message:  assets.TextDesc(assets.TextDescKeyDoctorWebhookOk),
 		})
 	} else {
 		report.Results = append(report.Results, Result{
-			Name:     "webhook",
-			Category: "Hooks",
+			Name:     doctor.CheckWebhook,
+			Category: doctor.CategoryHooks,
 			Status:   StatusInfo,
-			Message:  "No webhook configured (optional — use ctx notify setup)",
+			Message:  assets.TextDesc(assets.TextDescKeyDoctorWebhookInfo),
 		})
 	}
 }
@@ -287,14 +293,14 @@ func CheckWebhook(report *Report) {
 //   - report: Report to append the result to
 func CheckReminders(report *Report) {
 	dir := rc.ContextDir()
-	remindersPath := filepath.Join(dir, "reminders.json")
+	remindersPath := filepath.Join(dir, reminder.Reminders)
 	data, readErr := os.ReadFile(remindersPath) //nolint:gosec // project-local path
 	if readErr != nil {
 		report.Results = append(report.Results, Result{
-			Name:     "reminders",
-			Category: "State",
+			Name:     doctor.CheckReminders,
+			Category: doctor.CategoryState,
 			Status:   StatusOK,
-			Message:  "No pending reminders",
+			Message:  assets.TextDesc(assets.TextDescKeyDoctorRemindersOk),
 		})
 		return
 	}
@@ -302,10 +308,10 @@ func CheckReminders(report *Report) {
 	var reminders []any
 	if unmarshalErr := json.Unmarshal(data, &reminders); unmarshalErr != nil {
 		report.Results = append(report.Results, Result{
-			Name:     "reminders",
-			Category: "State",
+			Name:     doctor.CheckReminders,
+			Category: doctor.CategoryState,
 			Status:   StatusOK,
-			Message:  "No pending reminders",
+			Message:  assets.TextDesc(assets.TextDescKeyDoctorRemindersOk),
 		})
 		return
 	}
@@ -313,17 +319,17 @@ func CheckReminders(report *Report) {
 	count := len(reminders)
 	if count == 0 {
 		report.Results = append(report.Results, Result{
-			Name:     "reminders",
-			Category: "State",
+			Name:     doctor.CheckReminders,
+			Category: doctor.CategoryState,
 			Status:   StatusOK,
-			Message:  "No pending reminders",
+			Message:  assets.TextDesc(assets.TextDescKeyDoctorRemindersOk),
 		})
 	} else {
 		report.Results = append(report.Results, Result{
-			Name:     "reminders",
-			Category: "State",
+			Name:     doctor.CheckReminders,
+			Category: doctor.CategoryState,
 			Status:   StatusInfo,
-			Message:  fmt.Sprintf("%d pending reminders", count),
+			Message:  fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDoctorRemindersInfo), count),
 		})
 	}
 }
@@ -334,16 +340,16 @@ func CheckReminders(report *Report) {
 //   - report: Report to append the result to
 func CheckTaskCompletion(report *Report) {
 	dir := rc.ContextDir()
-	tasksPath := filepath.Join(dir, config.FileTask)
+	tasksPath := filepath.Join(dir, ctx.Task)
 	data, readErr := os.ReadFile(tasksPath) //nolint:gosec // project-local path
 	if readErr != nil {
 		return // no tasks file, skip
 	}
 
-	matches := config.RegExTaskMultiline.FindAllStringSubmatch(string(data), -1)
+	matches := regex.TaskMultiline.FindAllStringSubmatch(string(data), -1)
 	var completed, pending int
 	for _, m := range matches {
-		if len(m) > 2 && m[2] == config.MarkTaskComplete {
+		if len(m) > 2 && m[2] == marker.MarkTaskComplete {
 			completed++
 		} else {
 			pending++
@@ -356,19 +362,19 @@ func CheckTaskCompletion(report *Report) {
 	}
 
 	ratio := completed * 100 / total
-	msg := fmt.Sprintf("Tasks: %d/%d completed (%d%%)", completed, total, ratio)
+	msg := fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDoctorTaskCompletionFormat), completed, total, ratio)
 
 	if ratio >= 80 && completed > 5 {
 		report.Results = append(report.Results, Result{
-			Name:     "task_completion",
-			Category: "State",
+			Name:     doctor.CheckTaskCompletion,
+			Category: doctor.CategoryState,
 			Status:   StatusWarning,
-			Message:  msg + " — consider archiving with ctx tasks archive",
+			Message:  msg + assets.TextDesc(assets.TextDescKeyDoctorTaskCompletionWarningSuffix),
 		})
 	} else {
 		report.Results = append(report.Results, Result{
-			Name:     "task_completion",
-			Category: "State",
+			Name:     doctor.CheckTaskCompletion,
+			Category: doctor.CategoryState,
 			Status:   StatusOK,
 			Message:  msg,
 		})
@@ -380,11 +386,11 @@ func CheckTaskCompletion(report *Report) {
 // Parameters:
 //   - report: Report to append the result to
 func CheckContextTokenSize(report *Report) {
-	// Only count files in FileReadOrder — these are the files actually
+	// Only count files in ReadOrder — these are the files actually
 	// loaded into agent context. Other .md files (DETAILED_DESIGN.md,
 	// map-tracking, etc.) exist on disk but aren't injected.
-	indexed := make(map[string]bool, len(config.FileReadOrder))
-	for _, f := range config.FileReadOrder {
+	indexed := make(map[string]bool, len(ctx.ReadOrder))
+	for _, f := range ctx.ReadOrder {
 		indexed[f] = true
 	}
 
@@ -410,20 +416,20 @@ func CheckContextTokenSize(report *Report) {
 	}
 
 	window := rc.ContextWindow()
-	msg := fmt.Sprintf("Context size: ~%d tokens (window: %d)", totalTokens, window)
+	msg := fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDoctorContextSizeFormat), totalTokens, window)
 
 	warnThreshold := window / 5 // 20% of context window
 	if totalTokens > warnThreshold {
 		report.Results = append(report.Results, Result{
-			Name:     "context_size",
-			Category: "Size",
+			Name:     doctor.CheckContextSize,
+			Category: doctor.CategorySize,
 			Status:   StatusWarning,
-			Message:  msg + " — consider ctx compact",
+			Message:  msg + assets.TextDesc(assets.TextDescKeyDoctorContextSizeWarningSuffix),
 		})
 	} else {
 		report.Results = append(report.Results, Result{
-			Name:     "context_size",
-			Category: "Size",
+			Name:     doctor.CheckContextSize,
+			Category: doctor.CategorySize,
 			Status:   StatusOK,
 			Message:  msg,
 		})
@@ -432,10 +438,10 @@ func CheckContextTokenSize(report *Report) {
 	// Add per-file breakdown as info results.
 	for _, ft := range breakdown {
 		report.Results = append(report.Results, Result{
-			Name:     "context_file_" + ft.name,
-			Category: "Size",
+			Name:     doctor.CheckContextFilePrefix + ft.name,
+			Category: doctor.CategorySize,
 			Status:   StatusInfo,
-			Message:  fmt.Sprintf("%-22s ~%d tokens", ft.name, ft.tokens),
+			Message:  fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDoctorContextFileFormat), ft.name, ft.tokens),
 		})
 	}
 }
@@ -452,19 +458,19 @@ func CheckRecentEventActivity(report *Report) {
 	events, queryErr := eventlog.Query(eventlog.QueryOpts{Last: 1})
 	if queryErr != nil || len(events) == 0 {
 		report.Results = append(report.Results, Result{
-			Name:     "recent_events",
-			Category: "Events",
+			Name:     doctor.CheckRecentEvents,
+			Category: doctor.CategoryEvents,
 			Status:   StatusInfo,
-			Message:  "No events in log",
+			Message:  assets.TextDesc(assets.TextDescKeyDoctorRecentEventsInfo),
 		})
 		return
 	}
 
 	report.Results = append(report.Results, Result{
-		Name:     "recent_events",
-		Category: "Events",
+		Name:     doctor.CheckRecentEvents,
+		Category: doctor.CategoryEvents,
 		Status:   StatusOK,
-		Message:  fmt.Sprintf("Last event: %s", events[len(events)-1].Timestamp),
+		Message:  fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDoctorRecentEventsOk), events[len(events)-1].Timestamp),
 	})
 }
 
@@ -495,13 +501,13 @@ func AddResourceResults(report *Report, snap sysinfo.Snapshot) {
 	// Memory.
 	if snap.Memory.Supported && snap.Memory.TotalBytes > 0 {
 		pct := ResourcePct(snap.Memory.UsedBytes, snap.Memory.TotalBytes)
-		msg := fmt.Sprintf("Memory %d%% (%s / %s GB)",
+		msg := fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDoctorResourceMemoryFormat),
 			pct,
 			sysinfo.FormatGiB(snap.Memory.UsedBytes),
 			sysinfo.FormatGiB(snap.Memory.TotalBytes))
 		report.Results = append(report.Results, Result{
-			Name:     "resource_memory",
-			Category: "Resources",
+			Name:     doctor.CheckResourceMemory,
+			Category: doctor.CategoryResources,
 			Status:   SeverityToStatus(sevMap["memory"]),
 			Message:  msg,
 		})
@@ -510,13 +516,13 @@ func AddResourceResults(report *Report, snap sysinfo.Snapshot) {
 	// Swap (only when swap is configured).
 	if snap.Memory.Supported && snap.Memory.SwapTotalBytes > 0 {
 		pct := ResourcePct(snap.Memory.SwapUsedBytes, snap.Memory.SwapTotalBytes)
-		msg := fmt.Sprintf("Swap %d%% (%s / %s GB)",
+		msg := fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDoctorResourceSwapFormat),
 			pct,
 			sysinfo.FormatGiB(snap.Memory.SwapUsedBytes),
 			sysinfo.FormatGiB(snap.Memory.SwapTotalBytes))
 		report.Results = append(report.Results, Result{
-			Name:     "resource_swap",
-			Category: "Resources",
+			Name:     doctor.CheckResourceSwap,
+			Category: doctor.CategoryResources,
 			Status:   SeverityToStatus(sevMap["swap"]),
 			Message:  msg,
 		})
@@ -525,13 +531,13 @@ func AddResourceResults(report *Report, snap sysinfo.Snapshot) {
 	// Disk.
 	if snap.Disk.Supported && snap.Disk.TotalBytes > 0 {
 		pct := ResourcePct(snap.Disk.UsedBytes, snap.Disk.TotalBytes)
-		msg := fmt.Sprintf("Disk %d%% (%s / %s GB)",
+		msg := fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDoctorResourceDiskFormat),
 			pct,
 			sysinfo.FormatGiB(snap.Disk.UsedBytes),
 			sysinfo.FormatGiB(snap.Disk.TotalBytes))
 		report.Results = append(report.Results, Result{
-			Name:     "resource_disk",
-			Category: "Resources",
+			Name:     doctor.CheckResourceDisk,
+			Category: doctor.CategoryResources,
 			Status:   SeverityToStatus(sevMap["disk"]),
 			Message:  msg,
 		})
@@ -540,11 +546,11 @@ func AddResourceResults(report *Report, snap sysinfo.Snapshot) {
 	// Load (1-minute average relative to CPU count).
 	if snap.Load.Supported && snap.Load.NumCPU > 0 {
 		ratio := snap.Load.Load1 / float64(snap.Load.NumCPU)
-		msg := fmt.Sprintf("Load %.2fx (%.1f / %d CPUs)",
+		msg := fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDoctorResourceLoadFormat),
 			ratio, snap.Load.Load1, snap.Load.NumCPU)
 		report.Results = append(report.Results, Result{
-			Name:     "resource_load",
-			Category: "Resources",
+			Name:     doctor.CheckResourceLoad,
+			Category: doctor.CategoryResources,
 			Status:   SeverityToStatus(sevMap["load"]),
 			Message:  msg,
 		})
diff --git a/internal/cli/doctor/core/output.go b/internal/cli/doctor/core/output.go
index 95d6b8ca..83c38f33 100644
--- a/internal/cli/doctor/core/output.go
+++ b/internal/cli/doctor/core/output.go
@@ -10,6 +10,8 @@ import (
 	"encoding/json"
 	"fmt"
 
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/doctor"
 	"github.com/spf13/cobra"
 )
 
@@ -39,12 +41,21 @@ func OutputJSON(cmd *cobra.Command, report *Report) error {
 // Returns:
 //   - error: Always nil (satisfies interface)
 func OutputHuman(cmd *cobra.Command, report *Report) error {
-	cmd.Println("ctx doctor")
-	cmd.Println("==========")
+	cmd.Println(assets.TextDesc(assets.TextDescKeyDoctorOutputHeader))
+	cmd.Println(assets.TextDesc(assets.TextDescKeyDoctorOutputSeparator))
 	cmd.Println()
 
 	// Group by category.
-	categories := []string{"Structure", "Quality", "Plugin", "Hooks", "State", "Size", "Resources", "Events"}
+	categories := []string{
+		doctor.CategoryStructure,
+		doctor.CategoryQuality,
+		doctor.CategoryPlugin,
+		doctor.CategoryHooks,
+		doctor.CategoryState,
+		doctor.CategorySize,
+		doctor.CategoryResources,
+		doctor.CategoryEvents,
+	}
 	grouped := make(map[string][]Result)
 	for _, r := range report.Results {
 		grouped[r.Category] = append(grouped[r.Category], r)
@@ -58,26 +69,32 @@ func OutputHuman(cmd *cobra.Command, report *Report) error {
 		cmd.Println(cat)
 		for _, r := range results {
 			icon := statusIcon(r.Status)
-			cmd.Println(fmt.Sprintf("  %s %s", icon, r.Message))
+			cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDoctorOutputResultLine), icon, r.Message))
 		}
 		cmd.Println()
 	}
 
-	cmd.Println(fmt.Sprintf("Summary: %d warnings, %d errors", report.Warnings, report.Errors))
+	cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDoctorOutputSummary), report.Warnings, report.Errors))
 	return nil
 }
 
 // statusIcon returns a unicode icon for the given status string.
+//
+// Parameters:
+//   - status: One of StatusOK, StatusWarning, StatusError, or StatusInfo
+//
+// Returns:
+//   - string: A single Unicode character representing the status
 func statusIcon(status string) string {
 	switch status {
 	case StatusOK:
-		return "\u2713" // check mark
+		return "✓"
 	case StatusWarning:
-		return "\u26a0" // warning sign
+		return "⚠"
 	case StatusError:
-		return "\u2717" // ballot x
+		return "✗"
 	case StatusInfo:
-		return "\u25cb" // white circle
+		return "○"
 	default:
 		return "?"
 	}
diff --git a/internal/cli/doctor/core/types.go b/internal/cli/doctor/core/types.go
index 4a1c8940..bd3f43c8 100644
--- a/internal/cli/doctor/core/types.go
+++ b/internal/cli/doctor/core/types.go
@@ -6,12 +6,16 @@
 
 package core
 
-// Status constants for check results.
+import (
+	"github.com/ActiveMemory/ctx/internal/config/stats"
+)
+
+// Status constants — aliased from config for local use.
 const (
-	StatusOK      = "ok"
-	StatusWarning = "warning"
-	StatusError   = "error"
-	StatusInfo    = "info"
+	StatusOK      = stats.StatusOK
+	StatusWarning = stats.StatusWarning
+	StatusError   = stats.StatusError
+	StatusInfo    = stats.StatusInfo
 )
 
 // Result represents a single check outcome.
diff --git a/internal/cli/doctor/doctor.go b/internal/cli/doctor/doctor.go
index 513da3a4..6b0ba546 100644
--- a/internal/cli/doctor/doctor.go
+++ b/internal/cli/doctor/doctor.go
@@ -11,30 +11,10 @@ package doctor
 import (
 	"github.com/spf13/cobra"
 
-	"github.com/ActiveMemory/ctx/internal/assets"
 	"github.com/ActiveMemory/ctx/internal/cli/doctor/cmd/root"
-	"github.com/ActiveMemory/ctx/internal/config"
 )
 
 // Cmd returns the "ctx doctor" command.
-//
-// Flags:
-//   - --json, -j: Machine-readable JSON output
-//
-// Returns:
-//   - *cobra.Command: Configured doctor command with flags registered
 func Cmd() *cobra.Command {
-	short, long := assets.CommandDesc("doctor")
-	cmd := &cobra.Command{
-		Use:         "doctor",
-		Short:       short,
-		Annotations: map[string]string{config.AnnotationSkipInit: "true"},
-		Long:        long,
-		RunE: func(cmd *cobra.Command, _ []string) error {
-			jsonOut, _ := cmd.Flags().GetBool("json")
-			return root.Run(cmd, jsonOut)
-		},
-	}
-	cmd.Flags().BoolP("json", "j", false, assets.FlagDesc("doctor.json"))
-	return cmd
+	return root.Cmd()
 }
diff --git a/internal/cli/doctor/doctor_test.go b/internal/cli/doctor/doctor_test.go
index cfeadadf..730056cd 100644
--- a/internal/cli/doctor/doctor_test.go
+++ b/internal/cli/doctor/doctor_test.go
@@ -15,7 +15,9 @@ import (
 	"testing"
 
 	"github.com/ActiveMemory/ctx/internal/cli/doctor/core"
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/claude"
+	"github.com/ActiveMemory/ctx/internal/config/ctx"
+	"github.com/ActiveMemory/ctx/internal/config/doctor"
 	"github.com/ActiveMemory/ctx/internal/rc"
 	"github.com/ActiveMemory/ctx/internal/sysinfo"
 )
@@ -27,7 +29,7 @@ func setupContextDir(t *testing.T) string {
 	rc.Reset()
 
 	// Create required files.
-	for _, f := range config.FilesRequired {
+	for _, f := range ctx.FilesRequired {
 		path := filepath.Join(dir, f)
 		if writeErr := os.WriteFile(path, []byte("# "+f+"\n"), 0o600); writeErr != nil {
 			t.Fatal(writeErr)
@@ -126,7 +128,7 @@ func TestDoctor_HighCompletion(t *testing.T) {
 		tasks += "- [x] Completed task\n"
 	}
 	tasks += "- [ ] Pending task\n"
-	tasksPath := filepath.Join(dir, config.FileTask)
+	tasksPath := filepath.Join(dir, ctx.Task)
 	if writeErr := os.WriteFile(tasksPath, []byte(tasks), 0o600); writeErr != nil {
 		t.Fatal(writeErr)
 	}
@@ -151,7 +153,7 @@ func TestDoctor_ContextSizeBreakdown(t *testing.T) {
 	if writeErr := os.WriteFile(archPath, []byte(strings.Repeat("word ", 500)), 0o600); writeErr != nil {
 		t.Fatal(writeErr)
 	}
-	tasksPath := filepath.Join(dir, config.FileTask)
+	tasksPath := filepath.Join(dir, ctx.Task)
 	if writeErr := os.WriteFile(tasksPath, []byte(strings.Repeat("task ", 200)), 0o600); writeErr != nil {
 		t.Fatal(writeErr)
 	}
@@ -247,7 +249,7 @@ func TestDoctor_PluginInstalledNotEnabled(t *testing.T) {
 	pluginsData := map[string]any{
 		"version": 2,
 		"plugins": map[string]any{
-			config.PluginID: []map[string]string{
+			claude.PluginID: []map[string]string{
 				{"scope": "user", "version": "0.7.2"},
 			},
 		},
@@ -328,7 +330,7 @@ func TestAddResourceResults_AllHealthy(t *testing.T) {
 		if r.Status != core.StatusOK {
 			t.Errorf("result %s: expected ok, got %s", r.Name, r.Status)
 		}
-		if r.Category != "Resources" {
+		if r.Category != doctor.CategoryResources {
 			t.Errorf("result %s: expected Resources category, got %s", r.Name, r.Category)
 		}
 	}
diff --git a/internal/cli/drift/cmd/root/cmd.go b/internal/cli/drift/cmd/root/cmd.go
index d37451a9..0ed52ebe 100644
--- a/internal/cli/drift/cmd/root/cmd.go
+++ b/internal/cli/drift/cmd/root/cmd.go
@@ -5,3 +5,46 @@
 //                 SPDX-License-Identifier: Apache-2.0
 
 package root
+
+import (
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+)
+
+// Cmd returns the "ctx drift" command for detecting stale context.
+//
+// The command checks for broken path references, staleness indicators,
+// constitution violations, and missing required files.
+//
+// Flags:
+//   - --json: Output results as JSON for machine parsing
+//   - --fix: Auto-fix supported issues (staleness, missing_file)
+//
+// Returns:
+//   - *cobra.Command: Configured drift command with flags registered
+func Cmd() *cobra.Command {
+	var (
+		jsonOutput bool
+		fix        bool
+	)
+
+	short, long := assets.CommandDesc(assets.CmdDescKeyDrift)
+	cmd := &cobra.Command{
+		Use:   "drift",
+		Short: short,
+		Long:  long,
+		RunE: func(cmd *cobra.Command, args []string) error {
+			return Run(cmd, jsonOutput, fix)
+		},
+	}
+
+	cmd.Flags().BoolVar(
+		&jsonOutput, "json", false, assets.FlagDesc(assets.FlagDescKeyDriftJson),
+	)
+	cmd.Flags().BoolVar(&fix,
+		"fix", false, assets.FlagDesc(assets.FlagDescKeyDriftFix),
+	)
+
+	return cmd
+}
diff --git a/internal/cli/drift/cmd/root/doc.go b/internal/cli/drift/cmd/root/doc.go
new file mode 100644
index 00000000..053001a2
--- /dev/null
+++ b/internal/cli/drift/cmd/root/doc.go
@@ -0,0 +1,12 @@
+//   /    ctx:                         https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//   \    Copyright 2026-present Context contributors.
+//                 SPDX-License-Identifier: Apache-2.0
+
+// Package root implements the ctx drift command.
+//
+// It detects stale or invalid context by checking for broken path
+// references, staleness indicators, constitution violations, and
+// missing required files.
+package root
diff --git a/internal/cli/drift/cmd/root/run.go b/internal/cli/drift/cmd/root/run.go
index 9df410af..46b82547 100644
--- a/internal/cli/drift/cmd/root/run.go
+++ b/internal/cli/drift/cmd/root/run.go
@@ -15,6 +15,7 @@ import (
 	"github.com/ActiveMemory/ctx/internal/cli/drift/core"
 	"github.com/ActiveMemory/ctx/internal/context"
 	"github.com/ActiveMemory/ctx/internal/drift"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 )
 
 // Run executes the drift command logic.
@@ -35,7 +36,7 @@ func Run(cmd *cobra.Command, jsonOutput, fix bool) error {
 	if err != nil {
 		var notFoundError *context.NotFoundError
 		if errors.As(err, &notFoundError) {
-			return core.ErrNoContext()
+			return ctxerr.NotInitialized()
 		}
 		return err
 	}
diff --git a/internal/cli/drift/core/err.go b/internal/cli/drift/core/err.go
deleted file mode 100644
index 29b01482..00000000
--- a/internal/cli/drift/core/err.go
+++ /dev/null
@@ -1,77 +0,0 @@
-//   /    ctx:                         https://ctx.ist
-// ,'`./    do you remember?
-// `.,'\
-//   \    Copyright 2026-present Context contributors.
-//                 SPDX-License-Identifier: Apache-2.0
-
-package core
-
-import "fmt"
-
-// ErrTasksNotFound returns an error when TASKS.md is not in the context.
-//
-// Returns:
-//   - error: Descriptive error
-func ErrTasksNotFound() error {
-	return fmt.Errorf("TASKS.md not found")
-}
-
-// ErrNoCompletedTasks returns an error when there are no completed tasks to archive.
-//
-// Returns:
-//   - error: Descriptive error
-func ErrNoCompletedTasks() error {
-	return fmt.Errorf("no completed tasks to archive")
-}
-
-// ErrMkdir wraps a directory creation failure.
-//
-// Parameters:
-//   - path: Directory path that failed
-//   - err: Underlying error
-//
-// Returns:
-//   - error: Wrapped error with path context
-func ErrMkdir(path string, err error) error {
-	return fmt.Errorf("failed to create %s: %w", path, err)
-}
-
-// ErrFileWrite wraps a file write failure.
-//
-// Parameters:
-//   - path: File path that failed
-//   - err: Underlying error
-//
-// Returns:
-//   - error: Wrapped error with path context
-func ErrFileWrite(path string, err error) error {
-	return fmt.Errorf("failed to write %s: %w", path, err)
-}
-
-// ErrNoTemplate returns an error when no template is available for a file.
-//
-// Parameters:
-//   - filename: Name of the file without a template
-//   - err: Underlying error
-//
-// Returns:
-//   - error: Wrapped error with filename context
-func ErrNoTemplate(filename string, err error) error {
-	return fmt.Errorf("no template available for %s: %w", filename, err)
-}
-
-// ErrViolationsFound returns an error when drift violations are detected.
-//
-// Returns:
-//   - error: Descriptive error
-func ErrViolationsFound() error {
-	return fmt.Errorf("drift detection found violations")
-}
-
-// ErrNoContext returns an error when .context/ directory is not found.
-//
-// Returns:
-//   - error: Descriptive error
-func ErrNoContext() error {
-	return fmt.Errorf("no .context/ directory found. Run 'ctx init' first")
-}
diff --git a/internal/cli/drift/core/fix.go b/internal/cli/drift/core/fix.go
index e5d4f5ea..c6ef3f60 100644
--- a/internal/cli/drift/core/fix.go
+++ b/internal/cli/drift/core/fix.go
@@ -12,13 +12,18 @@ import (
 	"path/filepath"
 	"strings"
 
+	"github.com/ActiveMemory/ctx/internal/config/marker"
+	"github.com/ActiveMemory/ctx/internal/config/regex"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/assets"
-	"github.com/ActiveMemory/ctx/internal/cli/compact"
-	"github.com/ActiveMemory/ctx/internal/config"
+	compactCore "github.com/ActiveMemory/ctx/internal/cli/compact/core"
+	ctxCfg "github.com/ActiveMemory/ctx/internal/config/ctx"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
 	"github.com/ActiveMemory/ctx/internal/context"
 	"github.com/ActiveMemory/ctx/internal/drift"
+	ctxErr "github.com/ActiveMemory/ctx/internal/err"
 	"github.com/ActiveMemory/ctx/internal/rc"
 	"github.com/ActiveMemory/ctx/internal/task"
 )
@@ -104,13 +109,13 @@ func ApplyFixes(
 // Returns:
 //   - error: Non-nil if file operations fail
 func FixStaleness(cmd *cobra.Command, ctx *context.Context) error {
-	tasksFile := ctx.File(config.FileTask)
+	tasksFile := ctx.File(ctxCfg.Task)
 
 	if tasksFile == nil {
-		return ErrTasksNotFound()
+		return ctxErr.TaskFileNotFound()
 	}
 
-	nl := config.NewlineLF
+	nl := token.NewlineLF
 	content := string(tasksFile.Content)
 	lines := strings.Split(content, nl)
 
@@ -121,19 +126,19 @@ func FixStaleness(cmd *cobra.Command, ctx *context.Context) error {
 
 	for _, line := range lines {
 		// Track if we're in the Completed section
-		if strings.HasPrefix(line, config.HeadingCompleted) {
+		if strings.HasPrefix(line, assets.HeadingCompleted) {
 			inCompletedSection = true
 			newLines = append(newLines, line)
 			continue
 		}
 		if strings.HasPrefix(
-			line, config.HeadingLevelTwoStart,
+			line, token.HeadingLevelTwoStart,
 		) && inCompletedSection {
 			inCompletedSection = false
 		}
 
 		// Collect completed tasks from the Completed section for archiving
-		match := config.RegExTask.FindStringSubmatch(line)
+		match := regex.Task.FindStringSubmatch(line)
 		if inCompletedSection && match != nil && task.Completed(match) {
 			completedTasks = append(completedTasks, task.Content(match))
 			continue // Remove from the file
@@ -143,16 +148,16 @@ func FixStaleness(cmd *cobra.Command, ctx *context.Context) error {
 	}
 
 	if len(completedTasks) == 0 {
-		return ErrNoCompletedTasks()
+		return ctxErr.NoCompletedTasks()
 	}
 
 	// Build archive content
 	var archiveContent string
 	for _, t := range completedTasks {
-		archiveContent += config.PrefixTaskDone + " " + t + nl
+		archiveContent += marker.PrefixTaskDone + " " + t + nl
 	}
 
-	archiveFile, writeErr := compact.WriteArchive("tasks", config.HeadingArchivedTasks, archiveContent)
+	archiveFile, writeErr := compactCore.WriteArchive("tasks", assets.HeadingArchivedTasks, archiveContent)
 	if writeErr != nil {
 		return writeErr
 	}
@@ -160,9 +165,9 @@ func FixStaleness(cmd *cobra.Command, ctx *context.Context) error {
 	// Write updated TASKS.md
 	newContent := strings.Join(newLines, nl)
 	if writeErr := os.WriteFile(
-		tasksFile.Path, []byte(newContent), config.PermFile,
+		tasksFile.Path, []byte(newContent), fs.PermFile,
 	); writeErr != nil {
-		return ErrFileWrite(tasksFile.Path, writeErr)
+		return ctxErr.TaskFileWrite(writeErr)
 	}
 
 	cmd.Println(fmt.Sprintf("  Archived %d completed tasks to %s",
@@ -181,20 +186,20 @@ func FixStaleness(cmd *cobra.Command, ctx *context.Context) error {
 func FixMissingFile(filename string) error {
 	content, err := assets.Template(filename)
 	if err != nil {
-		return ErrNoTemplate(filename, err)
+		return ctxErr.NoTemplate(filename, err)
 	}
 
 	targetPath := filepath.Join(rc.ContextDir(), filename)
 
 	// Ensure .context/ directory exists
-	if mkErr := os.MkdirAll(rc.ContextDir(), config.PermExec); mkErr != nil {
-		return ErrMkdir(rc.ContextDir(), mkErr)
+	if mkErr := os.MkdirAll(rc.ContextDir(), fs.PermExec); mkErr != nil {
+		return ctxErr.Mkdir(rc.ContextDir(), mkErr)
 	}
 
 	if writeErr := os.WriteFile(
-		targetPath, content, config.PermFile,
+		targetPath, content, fs.PermFile,
 	); writeErr != nil {
-		return ErrFileWrite(targetPath, writeErr)
+		return ctxErr.FileWrite(targetPath, writeErr)
 	}
 
 	return nil
diff --git a/internal/cli/drift/core/out.go b/internal/cli/drift/core/out.go
index 50d0a023..3ef0e8b9 100644
--- a/internal/cli/drift/core/out.go
+++ b/internal/cli/drift/core/out.go
@@ -14,6 +14,7 @@ import (
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/drift"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 )
 
 // OutputDriftText writes the drift report as formatted text with colors.
@@ -116,7 +117,7 @@ func OutputDriftText(cmd *cobra.Command, report *drift.Report) error {
 	case drift.StatusViolation:
 		cmd.Println()
 		cmd.Println("Status: VIOLATION — Constitution violations detected")
-		return ErrViolationsFound()
+		return ctxerr.DriftViolations()
 	case drift.StatusWarning:
 		cmd.Println()
 		cmd.Println("Status: WARNING — Issues detected that should be addressed")
diff --git a/internal/cli/drift/drift.go b/internal/cli/drift/drift.go
index 75050e6f..8d2fa919 100644
--- a/internal/cli/drift/drift.go
+++ b/internal/cli/drift/drift.go
@@ -9,41 +9,10 @@ package drift
 import (
 	"github.com/spf13/cobra"
 
-	"github.com/ActiveMemory/ctx/internal/assets"
 	driftroot "github.com/ActiveMemory/ctx/internal/cli/drift/cmd/root"
 )
 
 // Cmd returns the "ctx drift" command for detecting stale context.
-//
-// The command checks for broken path references, staleness indicators,
-// constitution violations, and missing required files.
-//
-// Flags:
-//   - --json: Output results as JSON for machine parsing
-//   - --fix: Auto-fix supported issues (staleness, missing_file)
-//
-// Returns:
-//   - *cobra.Command: Configured drift command with flags registered
 func Cmd() *cobra.Command {
-	var (
-		jsonOutput bool
-		fix        bool
-	)
-
-	short, long := assets.CommandDesc("drift")
-	cmd := &cobra.Command{
-		Use:   "drift",
-		Short: short,
-		Long:  long,
-		RunE: func(cmd *cobra.Command, args []string) error {
-			return driftroot.Run(cmd, jsonOutput, fix)
-		},
-	}
-
-	cmd.Flags().BoolVar(&jsonOutput, "json", false, assets.FlagDesc("drift.json"))
-	cmd.Flags().BoolVar(&fix,
-		"fix", false, assets.FlagDesc("drift.fix"),
-	)
-
-	return cmd
+	return driftroot.Cmd()
 }
diff --git a/internal/cli/drift/drift_test.go b/internal/cli/drift/drift_test.go
index 0d5512e3..b9036738 100644
--- a/internal/cli/drift/drift_test.go
+++ b/internal/cli/drift/drift_test.go
@@ -15,7 +15,8 @@ import (
 	"testing"
 
 	"github.com/ActiveMemory/ctx/internal/cli/initialize"
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/ctx"
+	"github.com/ActiveMemory/ctx/internal/config/dir"
 	"github.com/ActiveMemory/ctx/internal/rc"
 )
 
@@ -104,7 +105,7 @@ func TestRunDrift_NoContext(t *testing.T) {
 	if runErr == nil {
 		t.Fatal("expected error when no .context/ exists")
 	}
-	if !strings.Contains(runErr.Error(), "no .context/ directory found") {
+	if !strings.Contains(runErr.Error(), "not initialized") {
 		t.Errorf("unexpected error: %v", runErr)
 	}
 }
@@ -144,7 +145,7 @@ func TestRunDrift_WithFix(t *testing.T) {
 	defer cleanup()
 
 	// Write TASKS.md with completed tasks to trigger staleness fix
-	tasksPath := filepath.Join(tmpDir, config.DirContext, config.FileTask)
+	tasksPath := filepath.Join(tmpDir, dir.Context, ctx.Task)
 	tasksContent := "# Tasks\n\n## In Progress\n\n- [ ] Do something\n\n## Completed\n\n- [x] Done thing 1\n- [x] Done thing 2\n- [x] Done thing 3\n- [x] Done thing 4\n- [x] Done thing 5\n- [x] Done thing 6\n"
 	if err := os.WriteFile(tasksPath, []byte(tasksContent), 0600); err != nil {
 		t.Fatalf("failed to write TASKS.md: %v", err)
@@ -165,7 +166,7 @@ func TestRunDrift_JSONWithViolations(t *testing.T) {
 	defer cleanup()
 
 	// Create a file that looks like it has secrets to trigger a violation
-	constPath := filepath.Join(tmpDir, config.DirContext, "CONSTITUTION.md")
+	constPath := filepath.Join(tmpDir, dir.Context, "CONSTITUTION.md")
 	constContent := "# Constitution\n\n- NEVER commit secrets\n"
 	if err := os.WriteFile(constPath, []byte(constContent), 0600); err != nil {
 		t.Fatalf("failed to write CONSTITUTION.md: %v", err)
@@ -185,7 +186,7 @@ func TestRunDrift_FixWithStaleness(t *testing.T) {
 	defer cleanup()
 
 	// Create TASKS.md with many completed tasks to trigger staleness
-	tasksPath := filepath.Join(tmpDir, config.DirContext, config.FileTask)
+	tasksPath := filepath.Join(tmpDir, dir.Context, ctx.Task)
 	var sb strings.Builder
 	sb.WriteString("# Tasks\n\n## In Progress\n\n- [ ] Active task\n\n## Completed\n\n")
 	for i := 0; i < 10; i++ {
@@ -212,7 +213,7 @@ func TestRunDrift_FixTriggersRecheck(t *testing.T) {
 	defer cleanup()
 
 	// Remove a required file so fixMissingFile gets called and succeeds
-	constPath := filepath.Join(tmpDir, config.DirContext, "CONSTITUTION.md")
+	constPath := filepath.Join(tmpDir, dir.Context, "CONSTITUTION.md")
 	_ = os.Remove(constPath)
 
 	// Use Cmd directly with captured output
@@ -250,7 +251,7 @@ func TestRunDrift_GenericError(t *testing.T) {
 	defer rc.Reset()
 
 	// Create .context as a file, not a directory.
-	if err := os.WriteFile(filepath.Join(tmpDir, config.DirContext), []byte("not a dir"), 0600); err != nil {
+	if err := os.WriteFile(filepath.Join(tmpDir, dir.Context), []byte("not a dir"), 0600); err != nil {
 		t.Fatalf("failed to create fake .context: %v", err)
 	}
 
diff --git a/internal/cli/guide/cmd/root/cmd.go b/internal/cli/guide/cmd/root/cmd.go
index d37451a9..ee92dca6 100644
--- a/internal/cli/guide/cmd/root/cmd.go
+++ b/internal/cli/guide/cmd/root/cmd.go
@@ -5,3 +5,47 @@
 //                 SPDX-License-Identifier: Apache-2.0
 
 package root
+
+import (
+	"github.com/ActiveMemory/ctx/internal/config/cli"
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+)
+
+// Cmd returns the "ctx guide" cobra command.
+//
+// Returns:
+//   - *cobra.Command: Configured guide command with flags registered
+func Cmd() *cobra.Command {
+	var (
+		showSkills   bool
+		showCommands bool
+	)
+
+	short, long := assets.CommandDesc(assets.CmdDescKeyGuide)
+	cmd := &cobra.Command{
+		Use:         "guide",
+		Short:       short,
+		Annotations: map[string]string{cli.AnnotationSkipInit: ""},
+		Long:        long,
+		RunE: func(cmd *cobra.Command, args []string) error {
+			return Run(cmd, showSkills, showCommands)
+		},
+	}
+
+	cmd.Flags().BoolVar(
+		&showSkills,
+		"skills",
+		false,
+		assets.FlagDesc(assets.FlagDescKeyGuideSkills),
+	)
+	cmd.Flags().BoolVar(
+		&showCommands,
+		"commands",
+		false,
+		assets.FlagDesc(assets.FlagDescKeyGuideCommands),
+	)
+
+	return cmd
+}
diff --git a/internal/cli/guide/cmd/root/doc.go b/internal/cli/guide/cmd/root/doc.go
new file mode 100644
index 00000000..43904f28
--- /dev/null
+++ b/internal/cli/guide/cmd/root/doc.go
@@ -0,0 +1,11 @@
+//   /    ctx:                         https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//   \    Copyright 2026-present Context contributors.
+//                 SPDX-License-Identifier: Apache-2.0
+
+// Package root implements the ctx guide command.
+//
+// It prints a quick-reference cheat sheet covering ctx commands, skills,
+// and workflows.
+package root
diff --git a/internal/cli/guide/cmd/root/skills.go b/internal/cli/guide/cmd/root/skills.go
index b1b47911..978186e7 100644
--- a/internal/cli/guide/cmd/root/skills.go
+++ b/internal/cli/guide/cmd/root/skills.go
@@ -7,42 +7,38 @@
 package root
 
 import (
-	"bytes"
-	"fmt"
 	"strings"
 
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"github.com/spf13/cobra"
 	"gopkg.in/yaml.v3"
 
 	"github.com/ActiveMemory/ctx/internal/claude"
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/write"
 )
 
-// skillMeta holds the frontmatter fields we care about.
-type skillMeta struct {
-	Name        string `yaml:"name"`
-	Description string `yaml:"description"`
-}
-
 // parseSkillFrontmatter extracts YAML frontmatter from a SKILL.md file.
 //
+// Parameters:
+//   - content: Raw SKILL.md content
+//
 // Returns:
 //   - skillMeta: Parsed name and description (zero value if no frontmatter)
 //   - error: Non-nil if YAML parsing fails
 func parseSkillFrontmatter(content []byte) (skillMeta, error) {
-	const sep = "---"
-
 	text := string(content)
-	if !strings.HasPrefix(text, sep+config.NewlineLF) {
+	prefix := token.Separator + token.NewlineLF
+	if !strings.HasPrefix(text, prefix) {
 		return skillMeta{}, nil
 	}
 
-	end := strings.Index(text[4:], config.NewlineLF+sep)
+	offset := len(prefix)
+	end := strings.Index(text[offset:], token.NewlineLF+token.Separator)
 	if end < 0 {
 		return skillMeta{}, nil
 	}
 
-	block := []byte(text[4 : 4+end])
+	block := []byte(text[offset : offset+end])
 	var meta skillMeta
 	if yamlErr := yaml.Unmarshal(block, &meta); yamlErr != nil {
 		return skillMeta{}, yamlErr
@@ -51,28 +47,38 @@ func parseSkillFrontmatter(content []byte) (skillMeta, error) {
 }
 
 // truncateDescription returns the first sentence or up to maxLen characters.
+//
+// Parameters:
+//   - desc: Full description text
+//   - maxLen: Maximum character length
+//
+// Returns:
+//   - string: Truncated description
 func truncateDescription(desc string, maxLen int) string {
-	// Try sentence boundary first.
 	if idx := strings.Index(desc, ". "); idx >= 0 && idx < maxLen {
 		return desc[:idx+1]
 	}
 	if len(desc) <= maxLen {
 		return desc
 	}
-	return desc[:maxLen] + "..."
+	return desc[:maxLen] + token.Ellipsis
 }
 
 // listSkills prints all available skills with their descriptions.
+//
+// Parameters:
+//   - cmd: Cobra command for output
+//
+// Returns:
+//   - error: Non-nil if skill listing fails
 func listSkills(cmd *cobra.Command) error {
 	names, skillsErr := claude.Skills()
 	if skillsErr != nil {
 		return skillsErr
 	}
 
-	cmd.Println("Available Skills:")
-	cmd.Println()
+	write.InfoSkillsHeader(cmd)
 
-	var buf bytes.Buffer
 	for _, name := range names {
 		content, readErr := claude.SkillContent(name)
 		if readErr != nil {
@@ -85,9 +91,7 @@ func listSkills(cmd *cobra.Command) error {
 		}
 
 		desc := truncateDescription(meta.Description, 70)
-		buf.Reset()
-		fmt.Fprintf(&buf, "  /%-22s %s", name, desc)
-		cmd.Println(buf.String())
+		write.InfoSkillLine(cmd, name, desc)
 	}
 	return nil
 }
diff --git a/internal/cli/guide/cmd/root/types.go b/internal/cli/guide/cmd/root/types.go
new file mode 100644
index 00000000..99a584b9
--- /dev/null
+++ b/internal/cli/guide/cmd/root/types.go
@@ -0,0 +1,13 @@
+//   /    ctx:                         https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//   \    Copyright 2026-present Context contributors.
+//                 SPDX-License-Identifier: Apache-2.0
+
+package root
+
+// skillMeta holds the frontmatter fields extracted from a SKILL.md file.
+type skillMeta struct {
+	Name        string `yaml:"name"`
+	Description string `yaml:"description"`
+}
diff --git a/internal/cli/guide/guide.go b/internal/cli/guide/guide.go
index 47370f67..0939d1e5 100644
--- a/internal/cli/guide/guide.go
+++ b/internal/cli/guide/guide.go
@@ -9,34 +9,10 @@ package guide
 import (
 	"github.com/spf13/cobra"
 
-	"github.com/ActiveMemory/ctx/internal/assets"
 	guideroot "github.com/ActiveMemory/ctx/internal/cli/guide/cmd/root"
-	"github.com/ActiveMemory/ctx/internal/config"
 )
 
 // Cmd returns the "ctx guide" cobra command.
-//
-// Returns:
-//   - *cobra.Command: Configured guide command with flags registered
 func Cmd() *cobra.Command {
-	var (
-		showSkills   bool
-		showCommands bool
-	)
-
-	short, long := assets.CommandDesc("guide")
-	cmd := &cobra.Command{
-		Use:         "guide",
-		Short:       short,
-		Annotations: map[string]string{config.AnnotationSkipInit: ""},
-		Long:        long,
-		RunE: func(cmd *cobra.Command, args []string) error {
-			return guideroot.Run(cmd, showSkills, showCommands)
-		},
-	}
-
-	cmd.Flags().BoolVar(&showSkills, "skills", false, assets.FlagDesc("guide.skills"))
-	cmd.Flags().BoolVar(&showCommands, "commands", false, assets.FlagDesc("guide.commands"))
-
-	return cmd
+	return guideroot.Cmd()
 }
diff --git a/internal/cli/hook/cmd/root/cmd.go b/internal/cli/hook/cmd/root/cmd.go
index d37451a9..ad4cdc2b 100644
--- a/internal/cli/hook/cmd/root/cmd.go
+++ b/internal/cli/hook/cmd/root/cmd.go
@@ -5,3 +5,43 @@
 //                 SPDX-License-Identifier: Apache-2.0
 
 package root
+
+import (
+	"github.com/ActiveMemory/ctx/internal/config/cli"
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+)
+
+// Cmd returns the "ctx hook" command for generating AI tool integrations.
+//
+// The command outputs configuration snippets and instructions for integrating
+// Context with various AI coding tools like Claude Code, Cursor, Aider, etc.
+//
+// Flags:
+//   - --write, -w: Write the configuration file instead of printing
+//
+// Returns:
+//   - *cobra.Command: Configured hook command that accepts a tool name argument
+func Cmd() *cobra.Command {
+	var write bool
+
+	short, long := assets.CommandDesc(assets.CmdDescKeyHook)
+	cmd := &cobra.Command{
+		Use:         "hook ",
+		Short:       short,
+		Annotations: map[string]string{cli.AnnotationSkipInit: cli.AnnotationTrue},
+		Long:        long,
+		Args:        cobra.ExactArgs(1),
+		RunE: func(cmd *cobra.Command, args []string) error {
+			return Run(cmd, args, write)
+		},
+	}
+
+	cmd.Flags().BoolVarP(
+		&write, "write", "w", false,
+		assets.FlagDesc(assets.FlagDescKeyHookWrite),
+	)
+
+	return cmd
+}
diff --git a/internal/cli/hook/cmd/root/doc.go b/internal/cli/hook/cmd/root/doc.go
new file mode 100644
index 00000000..04c5337c
--- /dev/null
+++ b/internal/cli/hook/cmd/root/doc.go
@@ -0,0 +1,11 @@
+//   /    ctx:                         https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//   \    Copyright 2026-present Context contributors.
+//                 SPDX-License-Identifier: Apache-2.0
+
+// Package root implements the ctx hook command.
+//
+// It generates configuration snippets for integrating ctx with AI coding
+// tools such as Claude Code, Cursor, and Aider.
+package root
diff --git a/internal/cli/hook/cmd/root/run.go b/internal/cli/hook/cmd/root/run.go
index becd6dfb..1b1486f8 100644
--- a/internal/cli/hook/cmd/root/run.go
+++ b/internal/cli/hook/cmd/root/run.go
@@ -7,137 +7,21 @@
 package root
 
 import (
-	"fmt"
 	"os"
 	"path/filepath"
 	"strings"
 
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	"github.com/ActiveMemory/ctx/internal/config/marker"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"github.com/spf13/cobra"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
+	"github.com/ActiveMemory/ctx/internal/write"
 )
 
-// CopilotInstructions is the comprehensive GitHub Copilot integration
-// template for .github/copilot-instructions.md.
-//
-// It instructs Copilot Chat (agent mode) to load .context/ files at
-// session start, persist session summaries to .context/sessions/, and
-// proactively update context files during work.
-const CopilotInstructions = `# Project Context
-
-
-
-
-## Context System
-
-This project uses Context (` + "`ctx`" + `) for persistent AI context
-management. Your memory is NOT ephemeral — it lives in ` + "`.context/`" + ` files.
-
-## On Session Start
-
-Read these files **in order** before starting any work:
-
-1. ` + "`.context/CONSTITUTION.md`" + ` — Hard rules, NEVER violate
-2. ` + "`.context/TASKS.md`" + ` — Current work items
-3. ` + "`.context/CONVENTIONS.md`" + ` — Code patterns and standards
-4. ` + "`.context/ARCHITECTURE.md`" + ` — System structure
-5. ` + "`.context/DECISIONS.md`" + ` — Architectural decisions with rationale
-6. ` + "`.context/LEARNINGS.md`" + ` — Gotchas, tips, lessons learned
-7. ` + "`.context/GLOSSARY.md`" + ` — Domain terms and abbreviations
-8. ` + "`.context/AGENT_PLAYBOOK.md`" + ` — How to use this context system
-
-After reading, confirm: "I have read the required context files and I'm
-following project conventions."
-
-## When Asked "Do You Remember?"
-
-**Do this FIRST (silently):**
-- Read ` + "`.context/TASKS.md`" + `
-- Read ` + "`.context/DECISIONS.md`" + ` and ` + "`.context/LEARNINGS.md`" + `
-- Check ` + "`.context/sessions/`" + ` for recent session files
-
-**Then respond with a structured readback:**
-1. **Last session**: cite the most recent session topic and date
-2. **Active work**: list pending or in-progress tasks
-3. **Recent context**: mention 1-2 recent decisions or learnings
-4. **Next step**: offer to continue or ask what to focus on
-
-**Never** lead with "I don't have memory." The ` + "`.context/`" + ` files are
-your memory. Read them silently, then present what you found as recall.
-
-## Session Persistence
-
-After completing meaningful work, save a session summary to
-` + "`.context/sessions/`" + `.
-
-### Session File Format
-
-Create a file named ` + "`YYYY-MM-DD-topic.md`" + `:
-
-` + "```" + `markdown
-# Session: YYYY-MM-DD — Brief Topic Description
-
-## What Was Done
-- Describe completed work items
-
-## Decisions
-- Key decisions made and their rationale
-
-## Learnings
-- Gotchas, tips, or insights discovered
-
-## Next Steps
-- Follow-up work or remaining items
-` + "```" + `
-
-### When to Save
-
-- After completing a task or feature
-- After making architectural decisions
-- After a debugging session
-- Before ending the session
-- At natural breakpoints in long sessions
-
-## Context Updates During Work
-
-Proactively update context files as you work:
-
-| Event                       | Action                              |
-|-----------------------------|-------------------------------------|
-| Made architectural decision | Add to ` + "`.context/DECISIONS.md`" + `  |
-| Discovered gotcha/bug       | Add to ` + "`.context/LEARNINGS.md`" + `  |
-| Established new pattern     | Add to ` + "`.context/CONVENTIONS.md`" + ` |
-| Completed task              | Mark [x] in ` + "`.context/TASKS.md`" + ` |
-
-## Self-Check
-
-Periodically ask yourself:
-
-> "If this session ended right now, would the next session know what happened?"
-
-If no — save a session file or update context files before continuing.
-
-## CLI Commands
-
-If ` + "`ctx`" + ` is installed, use these commands:
-
-` + "```" + `bash
-ctx status        # Context summary and health check
-ctx agent         # AI-ready context packet
-ctx drift         # Check for stale context
-ctx recall list   # Recent session history
-` + "```" + `
-
-
-`
-
-// ToolConfigFiles maps tool names to their configuration file paths.
-var ToolConfigFiles = map[string]string{
-	"copilot":  filepath.Join(".github", "copilot-instructions.md"),
-	"cursor":   ".cursorrules",
-	"windsurf": ".windsurfrules",
-}
-
 // Run executes the hook command logic.
 //
 // Outputs integration instructions and configuration snippets for the
@@ -147,111 +31,42 @@ var ToolConfigFiles = map[string]string{
 // Parameters:
 //   - cmd: Cobra command for output stream
 //   - args: Command arguments; args[0] is the tool name
-//   - write: If true, write the configuration file instead of printing
+//   - writeFile: If true, write the configuration file instead of printing
 //
 // Returns:
 //   - error: Non-nil if the tool is not supported or file write fails
-func Run(cmd *cobra.Command, args []string, write bool) error {
+func Run(cmd *cobra.Command, args []string, writeFile bool) error {
 	tool := strings.ToLower(args[0])
 
 	switch tool {
 	case "claude-code", "claude":
-		cmd.Println("Claude Code Integration")
-		cmd.Println("=======================")
-		cmd.Println()
-		cmd.Println("Claude Code integration is now provided via the ctx plugin.")
-		cmd.Println()
-		cmd.Println("Install the plugin:")
-		cmd.Println("  /plugin marketplace add ActiveMemory/ctx")
-		cmd.Println("  /plugin install ctx@activememory-ctx")
-		cmd.Println()
-		cmd.Println("The plugin provides hooks (context monitoring, persistence")
-		cmd.Println("nudges, post-commit capture) and 25 skills automatically.")
+		write.InfoHookTool(cmd, assets.TextDesc(assets.TextDescKeyHookClaude))
 
 	case "cursor":
-		cmd.Println("Cursor IDE Integration")
-		cmd.Println("======================")
-		cmd.Println()
-		cmd.Println("Add to your .cursorrules file:")
-		cmd.Println()
-		cmd.Println("```markdown")
-		cmd.Print(`# Project Context
-
-Always read these files before making changes:
-- .context/CONSTITUTION.md (NEVER violate these rules)
-- .context/TASKS.md (current work)
-- .context/CONVENTIONS.md (how we write code)
-- .context/ARCHITECTURE.md (system structure)
-
-Run 'ctx agent' for a context summary.
-Run 'ctx drift' to check for stale context.
-`)
-		cmd.Println("```")
+		write.InfoHookTool(cmd, assets.TextDesc(assets.TextDescKeyHookCursor))
 
 	case "aider":
-		cmd.Println("Aider Integration")
-		cmd.Println("=================")
-		cmd.Println()
-		cmd.Println("Add to your .aider.conf.yml:")
-		cmd.Println()
-		cmd.Println("```yaml")
-		cmd.Println(`read:
-  - .context/CONSTITUTION.md
-  - .context/TASKS.md
-  - .context/CONVENTIONS.md
-  - .context/ARCHITECTURE.md
-  - .context/DECISIONS.md`)
-		cmd.Println("```")
-		cmd.Println()
-		cmd.Println("Or pass context via command line:")
-		cmd.Println()
-		cmd.Println("```bash")
-		cmd.Println(`ctx agent | aider --message "$(cat -)"`)
-		cmd.Println("```")
+		write.InfoHookTool(cmd, assets.TextDesc(assets.TextDescKeyHookAider))
 
 	case "copilot":
-		if write {
+		if writeFile {
 			return WriteCopilotInstructions(cmd)
 		}
-		cmd.Println("GitHub Copilot Integration")
-		cmd.Println("==========================")
-		cmd.Println()
-		cmd.Println("Add the following to .github/copilot-instructions.md,")
-		cmd.Println("or run with --write to generate the file directly:")
-		cmd.Println()
-		cmd.Println("  ctx hook copilot --write")
+		write.InfoHookTool(cmd, assets.TextDesc(assets.TextDescKeyHookCopilot))
 		cmd.Println()
-		cmd.Print(CopilotInstructions)
+		content, readErr := assets.CopilotInstructions()
+		if readErr != nil {
+			return readErr
+		}
+		cmd.Print(string(content))
 
 	case "windsurf":
-		cmd.Println("Windsurf Integration")
-		cmd.Println("====================")
-		cmd.Println()
-		cmd.Println("Add to your .windsurfrules file:")
-		cmd.Println()
-		cmd.Println("```markdown")
-		cmd.Print(`# Context
-
-Read order for context:
-1. .context/CONSTITUTION.md
-2. .context/TASKS.md
-3. .context/CONVENTIONS.md
-4. .context/ARCHITECTURE.md
-5. .context/DECISIONS.md
-
-Run 'ctx agent' for AI-ready context packet.
-`)
-		cmd.Println("```")
+		write.InfoHookTool(cmd, assets.TextDesc(assets.TextDescKeyHookWindsurf))
 
 	default:
-		cmd.Println(fmt.Sprintf("Unknown tool: %s\n", tool))
-		cmd.Println("Supported tools:")
-		cmd.Println("  claude-code  - Anthropic's Claude Code CLI (use plugin instead)")
-		cmd.Println("  cursor       - Cursor IDE")
-		cmd.Println("  aider        - Aider AI coding assistant")
-		cmd.Println("  copilot      - GitHub Copilot")
-		cmd.Println("  windsurf     - Windsurf IDE")
-		return fmt.Errorf("unsupported tool: %s", tool)
+		write.InfoHookUnknownTool(cmd, tool)
+		write.InfoHookTool(cmd, assets.TextDesc(assets.TextDescKeyHookSupportedTools))
+		return ctxerr.UnsupportedTool(tool)
 	}
 
 	return nil
@@ -273,56 +88,53 @@ func WriteCopilotInstructions(cmd *cobra.Command) error {
 	targetFile := filepath.Join(targetDir, "copilot-instructions.md")
 
 	// Create .github/ directory if needed
-	if err := os.MkdirAll(targetDir, config.PermExec); err != nil {
-		return fmt.Errorf("failed to create %s: %w", targetDir, err)
+	if err := os.MkdirAll(targetDir, fs.PermExec); err != nil {
+		return ctxerr.Mkdir(targetDir, err)
+	}
+
+	// Load the copilot instructions from embedded assets
+	instructions, readErr := assets.CopilotInstructions()
+	if readErr != nil {
+		return readErr
 	}
 
 	// Check if file exists
-	existingContent, err := os.ReadFile(filepath.Clean(targetFile)) //nolint:gosec // targetFile is constructed from constants, not user input
+	existingContent, err := os.ReadFile(filepath.Clean(targetFile))
 	fileExists := err == nil
 
 	if fileExists {
 		existingStr := string(existingContent)
-		if strings.Contains(existingStr, "") {
-			cmd.Println(fmt.Sprintf(
-				"  ○ %s (ctx content exists, skipped)", targetFile,
-			))
-			cmd.Println("  Use --force to overwrite (not yet implemented).")
+		if strings.Contains(existingStr, marker.CopilotMarkerStart) {
+			write.InfoHookCopilotSkipped(cmd, targetFile)
 			return nil
 		}
 
 		// File exists without ctx markers: append ctx content
-		merged := existingStr + config.NewlineLF + CopilotInstructions
-		if err := os.WriteFile(targetFile, []byte(merged), config.PermFile); err != nil {
-			return fmt.Errorf("failed to write %s: %w", targetFile, err)
+		merged := existingStr + token.NewlineLF + string(instructions)
+		if writeErr := os.WriteFile(targetFile, []byte(merged), fs.PermFile); writeErr != nil {
+			return ctxerr.FileWrite(targetFile, writeErr)
 		}
-		cmd.Println(fmt.Sprintf("  ✓ %s (merged)", targetFile))
+		write.InfoHookCopilotMerged(cmd, targetFile)
 		return nil
 	}
 
 	// File doesn't exist: create it
-	if err := os.WriteFile(
-		targetFile, []byte(CopilotInstructions), config.PermFile,
-	); err != nil {
-		return fmt.Errorf("failed to write %s: %w", targetFile, err)
+	if writeErr := os.WriteFile(
+		targetFile, instructions, fs.PermFile,
+	); writeErr != nil {
+		return ctxerr.FileWrite(targetFile, writeErr)
 	}
-	cmd.Println(fmt.Sprintf("  ✓ %s", targetFile))
+	write.InfoHookCopilotCreated(cmd, targetFile)
 
 	// Also create .context/sessions/ if it doesn't exist
-	sessionsDir := filepath.Join(config.DirContext, config.DirSessions)
-	if err := os.MkdirAll(sessionsDir, config.PermExec); err != nil {
-		cmd.Println(fmt.Sprintf(
-			"  ⚠ %s: %v", sessionsDir, err,
-		))
+	sessionsDir := filepath.Join(dir.Context, dir.Sessions)
+	if mkErr := os.MkdirAll(sessionsDir, fs.PermExec); mkErr != nil {
+		write.WarnFileErr(cmd, sessionsDir, mkErr)
 	} else {
-		cmd.Println(fmt.Sprintf("  ✓ %s/", sessionsDir))
+		write.InfoHookCopilotSessionsDir(cmd, sessionsDir)
 	}
 
-	cmd.Println()
-	cmd.Println("Copilot Chat (agent mode) will now:")
-	cmd.Println("  1. Read .context/ files at session start")
-	cmd.Println("  2. Save session summaries to .context/sessions/")
-	cmd.Println("  3. Proactively update context during work")
+	write.InfoHookCopilotSummary(cmd)
 
 	return nil
 }
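The rewrites above route every user-facing line through named `write.Info*` helpers backed by keyed text, so `cmd/` and `core/` packages carry no inline strings. A minimal sketch of that pattern — the table keys, message strings, and helper names below are illustrative assumptions, not the actual `internal/write` or `internal/assets` API:

```go
package main

import "fmt"

// textTable stands in for the YAML-backed assets.TextDesc lookup.
// Keys and message strings here are illustrative assumptions.
var textTable = map[string]string{
	"init.file_created":   "  ✓ %s",
	"init.exists_skipped": "  ○ %s (exists, skipped)",
}

// InfoInitFileCreated mirrors the shape of a write.Info* helper:
// the caller passes only data; the message text lives in one table.
func InfoInitFileCreated(name string) string {
	return fmt.Sprintf(textTable["init.file_created"], name)
}

// InfoInitExistsSkipped formats the "already exists" notice.
func InfoInitExistsSkipped(name string) string {
	return fmt.Sprintf(textTable["init.exists_skipped"], name)
}

func main() {
	fmt.Println(InfoInitFileCreated("TASKS.md"))
	fmt.Println(InfoInitExistsSkipped("DECISIONS.md"))
}
```

Centralizing text this way is what makes the i18n-readiness decision (externalized descriptions in embedded YAML) cheap to extend later.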
diff --git a/internal/cli/hook/hook.go b/internal/cli/hook/hook.go
index 05921c0a..fea9ee33 100644
--- a/internal/cli/hook/hook.go
+++ b/internal/cli/hook/hook.go
@@ -9,40 +9,10 @@ package hook
 import (
 	"github.com/spf13/cobra"
 
-	"github.com/ActiveMemory/ctx/internal/assets"
 	hookroot "github.com/ActiveMemory/ctx/internal/cli/hook/cmd/root"
-	"github.com/ActiveMemory/ctx/internal/config"
 )
 
 // Cmd returns the "ctx hook" command for generating AI tool integrations.
-//
-// The command outputs configuration snippets and instructions for integrating
-// Context with various AI coding tools like Claude Code, Cursor, Aider, etc.
-//
-// Flags:
-//   - --write, -w: Write the configuration file instead of printing
-//
-// Returns:
-//   - *cobra.Command: Configured hook command that accepts a tool name argument
 func Cmd() *cobra.Command {
-	var write bool
-
-	short, long := assets.CommandDesc("hook")
-	cmd := &cobra.Command{
-		Use:         "hook <tool>",
-		Short:       short,
-		Annotations: map[string]string{config.AnnotationSkipInit: "true"},
-		Long:        long,
-		Args:        cobra.ExactArgs(1),
-		RunE: func(cmd *cobra.Command, args []string) error {
-			return hookroot.Run(cmd, args, write)
-		},
-	}
-
-	cmd.Flags().BoolVarP(
-		&write, "write", "w", false,
-		assets.FlagDesc("hook.write"),
-	)
-
-	return cmd
+	return hookroot.Cmd()
 }
diff --git a/internal/cli/initialize/cmd/root/cmd.go b/internal/cli/initialize/cmd/root/cmd.go
index d37451a9..d9452efd 100644
--- a/internal/cli/initialize/cmd/root/cmd.go
+++ b/internal/cli/initialize/cmd/root/cmd.go
@@ -5,3 +5,73 @@
 //                 SPDX-License-Identifier: Apache-2.0
 
 package root
+
+import (
+	"github.com/ActiveMemory/ctx/internal/config/cli"
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+)
+
+// Cmd returns the "ctx init" command for initializing a .context/ directory.
+//
+// The command creates template files for maintaining persistent context
+// for AI coding assistants. Files include constitution rules, tasks,
+// decisions, learnings, conventions, and architecture documentation.
+//
+// Flags:
+//   - --force, -f: Overwrite existing context files without prompting
+//   - --minimal, -m: Only create essential files
+//     (TASKS, DECISIONS, CONSTITUTION)
+//   - --merge: Auto-merge ctx content into existing CLAUDE.md and PROMPT.md
+//   - --ralph: Use autonomous loop templates (no clarifying questions,
+//     one-task-per-iteration, completion signals)
+//   - --no-plugin-enable: Skip auto-enabling the ctx plugin in
+//     ~/.claude/settings.json
+//
+// Returns:
+//   - *cobra.Command: Configured init command with flags registered
+func Cmd() *cobra.Command {
+	var (
+		force          bool
+		minimal        bool
+		merge          bool
+		ralph          bool
+		noPluginEnable bool
+	)
+
+	short, long := assets.CommandDesc(assets.CmdDescKeyInitialize)
+	cmd := &cobra.Command{
+		Use:         "init",
+		Short:       short,
+		Annotations: map[string]string{cli.AnnotationSkipInit: cli.AnnotationTrue},
+		Long:        long,
+		RunE: func(cmd *cobra.Command, args []string) error {
+			return Run(cmd, force, minimal, merge, ralph, noPluginEnable)
+		},
+	}
+
+	cmd.Flags().BoolVarP(
+		&force,
+		"force", "f", false, assets.FlagDesc(assets.FlagDescKeyInitializeForce),
+	)
+	cmd.Flags().BoolVarP(
+		&minimal,
+		"minimal", "m", false,
+		assets.FlagDesc(assets.FlagDescKeyInitializeMinimal),
+	)
+	cmd.Flags().BoolVar(
+		&merge, "merge", false,
+		assets.FlagDesc(assets.FlagDescKeyInitializeMerge),
+	)
+	cmd.Flags().BoolVar(
+		&ralph, "ralph", false,
+		assets.FlagDesc(assets.FlagDescKeyInitializeRalph),
+	)
+	cmd.Flags().BoolVar(
+		&noPluginEnable, "no-plugin-enable", false,
+		assets.FlagDesc(assets.FlagDescKeyInitializeNoPluginEnable),
+	)
+
+	return cmd
+}
diff --git a/internal/cli/initialize/cmd/root/doc.go b/internal/cli/initialize/cmd/root/doc.go
new file mode 100644
index 00000000..004792d6
--- /dev/null
+++ b/internal/cli/initialize/cmd/root/doc.go
@@ -0,0 +1,11 @@
+//   /    ctx:                         https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//   \    Copyright 2026-present Context contributors.
+//                 SPDX-License-Identifier: Apache-2.0
+
+// Package root implements the ctx init command.
+//
+// It initializes a new .context/ directory with template files for
+// maintaining persistent context for AI coding assistants.
+package root
diff --git a/internal/cli/initialize/cmd/root/run.go b/internal/cli/initialize/cmd/root/run.go
index b8eb3bb3..57de60ba 100644
--- a/internal/cli/initialize/cmd/root/run.go
+++ b/internal/cli/initialize/cmd/root/run.go
@@ -8,20 +8,29 @@ package root
 
 import (
 	"bufio"
-	"fmt"
 	"os"
 	"path/filepath"
 	"strings"
 
+	"github.com/ActiveMemory/ctx/internal/config/cli"
+	"github.com/ActiveMemory/ctx/internal/config/ctx"
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	"github.com/ActiveMemory/ctx/internal/config/pad"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/assets"
 	"github.com/ActiveMemory/ctx/internal/cli/initialize/core"
-	"github.com/ActiveMemory/ctx/internal/config"
 	"github.com/ActiveMemory/ctx/internal/crypto"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 	"github.com/ActiveMemory/ctx/internal/rc"
+	"github.com/ActiveMemory/ctx/internal/write"
 )
 
+// gitignoreHeader is the section comment prepended to ctx-managed entries.
+const gitignoreHeader = "# ctx managed entries"
+
 // Run executes the init command logic.
 //
 // Creates a .context/ directory with template files. Handles existing
@@ -51,34 +60,34 @@ func Run(cmd *cobra.Command, force, minimal, merge, ralph, noPluginEnable bool)
 	if _, err := os.Stat(contextDir); err == nil {
 		if !force && hasEssentialFiles(contextDir) {
 			// Prompt for confirmation
-			cmd.Print(fmt.Sprintf("%s already exists. Overwrite? [y/N] ", contextDir))
+			write.InfoInitOverwritePrompt(cmd, contextDir)
 			reader := bufio.NewReader(os.Stdin)
 			response, err := reader.ReadString('\n')
 			if err != nil {
-				return fmt.Errorf("failed to read input: %w", err)
+				return ctxerr.ReadInput(err)
 			}
 			response = strings.TrimSpace(strings.ToLower(response))
-			if response != config.ConfirmShort && response != config.ConfirmLong {
-				cmd.Println("Aborted.")
+			if response != cli.ConfirmShort && response != cli.ConfirmLong {
+				write.InfoInitAborted(cmd)
 				return nil
 			}
 		}
 	}
 
 	// Create .context/ directory
-	if err := os.MkdirAll(contextDir, config.PermExec); err != nil {
-		return fmt.Errorf("failed to create %s: %w", contextDir, err)
+	if err := os.MkdirAll(contextDir, fs.PermExec); err != nil {
+		return ctxerr.Mkdir(contextDir, err)
 	}
 
 	// Get the list of templates to create
 	var templatesToCreate []string
 	if minimal {
-		templatesToCreate = config.FilesRequired
+		templatesToCreate = ctx.FilesRequired
 	} else {
 		var listErr error
 		templatesToCreate, listErr = assets.List()
 		if listErr != nil {
-			return fmt.Errorf("failed to list templates: %w", listErr)
+			return ctxerr.ListTemplates(listErr)
 		}
 	}
 
@@ -88,114 +97,98 @@ func Run(cmd *cobra.Command, force, minimal, merge, ralph, noPluginEnable bool)
 
 		// Check if the file exists and --force not set
 		if _, err := os.Stat(targetPath); err == nil && !force {
-			cmd.Println(fmt.Sprintf(
-				"  ○ %s (exists, skipped)\n", name,
-			))
+			write.InfoInitExistsSkipped(cmd, name)
 			continue
 		}
 
 		content, err := assets.Template(name)
 		if err != nil {
-			return fmt.Errorf("failed to read template %s: %w", name, err)
+			return ctxerr.ReadTemplate(name, err)
 		}
 
-		if err := os.WriteFile(targetPath, content, config.PermFile); err != nil {
-			return fmt.Errorf("failed to write %s: %w", targetPath, err)
+		if err := os.WriteFile(targetPath, content, fs.PermFile); err != nil {
+			return ctxerr.FileWrite(targetPath, err)
 		}
 
-		cmd.Println(fmt.Sprintf("  ✓ %s", name))
+		write.InfoInitFileCreated(cmd, name)
 	}
 
-	cmd.Println(fmt.Sprintf("\nContext initialized in %s/", contextDir))
+	write.InfoInitialized(cmd, contextDir)
 
 	// Create entry templates in .context/templates/
 	if err := core.CreateEntryTemplates(cmd, contextDir, force); err != nil {
 		// Non-fatal: warn but continue
-		cmd.Println(fmt.Sprintf("  ⚠ Entry templates: %v", err))
+		write.InfoInitWarnNonFatal(cmd, "Entry templates", err)
 	}
 
 	// Create prompt templates in .context/prompts/
 	if err := core.CreatePromptTemplates(cmd, contextDir, force); err != nil {
 		// Non-fatal: warn but continue
-		cmd.Println(fmt.Sprintf("  ⚠ Prompt templates: %v", err))
+		write.InfoInitWarnNonFatal(cmd, "Prompt templates", err)
 	}
 
 	// Migrate legacy key files and promote to global path.
-	config.MigrateKeyFile(contextDir)
+	crypto.MigrateKeyFile(contextDir)
 
 	// Set up scratchpad
 	if err := initScratchpad(cmd, contextDir); err != nil {
 		// Non-fatal: warn but continue
-		cmd.Println(fmt.Sprintf("  ⚠ Scratchpad: %v", err))
+		write.InfoInitWarnNonFatal(cmd, "Scratchpad", err)
 	}
 
 	// Create project root files
-	cmd.Println("\nCreating project root files...")
+	write.InfoInitCreatingRootFiles(cmd)
 
 	// Create specs/ and ideas/ directories with README.md
 	if err := core.CreateProjectDirs(cmd); err != nil {
-		cmd.Println(fmt.Sprintf("  ⚠ Project dirs: %v", err))
+		write.InfoInitWarnNonFatal(cmd, "Project dirs", err)
 	}
 
 	// Create PROMPT.md (uses ralph template if --ralph flag set)
 	if err := core.HandlePromptMd(cmd, force, merge, ralph); err != nil {
 		// Non-fatal: warn but continue
-		cmd.Println(fmt.Sprintf("  ⚠ PROMPT.md: %v", err))
+		write.InfoInitWarnNonFatal(cmd, "PROMPT.md", err)
 	}
 
 	// Create IMPLEMENTATION_PLAN.md
 	if err := core.HandleImplementationPlan(cmd, force, merge); err != nil {
 		// Non-fatal: warn but continue
-		cmd.Println(fmt.Sprintf(
-			"  ⚠ IMPLEMENTATION_PLAN.md: %v\n", err,
-		))
+		write.InfoInitWarnNonFatal(cmd, "IMPLEMENTATION_PLAN.md", err)
 	}
 
 	// Merge permissions into settings.local.json (no hook scaffolding)
-	cmd.Println("\nSetting up Claude Code permissions...")
+	write.InfoInitSettingUpPermissions(cmd)
 	if err := core.MergeSettingsPermissions(cmd); err != nil {
 		// Non-fatal: warn but continue
-		cmd.Println(fmt.Sprintf("  ⚠ Permissions: %v", err))
+		write.InfoInitWarnNonFatal(cmd, "Permissions", err)
 	}
 
 	// Auto-enable plugin globally unless suppressed
 	if !noPluginEnable {
 		if pluginErr := core.EnablePluginGlobally(cmd); pluginErr != nil {
 			// Non-fatal: warn but continue
-			cmd.Println(fmt.Sprintf("  ⚠ Plugin enablement: %v", pluginErr))
+			write.InfoInitWarnNonFatal(cmd, "Plugin enablement", pluginErr)
 		}
 	}
 
 	// Handle CLAUDE.md creation/merge
 	if err := core.HandleClaudeMd(cmd, force, merge); err != nil {
 		// Non-fatal: warn but continue
-		cmd.Println(fmt.Sprintf("  ⚠ CLAUDE.md: %v", err))
+		write.InfoInitWarnNonFatal(cmd, "CLAUDE.md", err)
 	}
 
 	// Deploy Makefile.ctx and amend user Makefile
 	if err := core.HandleMakefileCtx(cmd); err != nil {
 		// Non-fatal: warn but continue
-		cmd.Println(fmt.Sprintf("  ⚠ Makefile: %v", err))
+		write.InfoInitWarnNonFatal(cmd, "Makefile", err)
 	}
 
 	// Update .gitignore with recommended entries
 	if err := ensureGitignoreEntries(cmd); err != nil {
-		cmd.Println(fmt.Sprintf("  ⚠ .gitignore: %v", err))
+		write.InfoInitWarnNonFatal(cmd, ".gitignore", err)
 	}
 
-	cmd.Println("\nNext steps:")
-	cmd.Println("  1. Edit .context/TASKS.md to add your current tasks")
-	cmd.Println("  2. Run 'ctx status' to see context summary")
-	cmd.Println("  3. Run 'ctx agent' to get AI-ready context packet")
-	cmd.Println()
-	cmd.Println("Claude Code users: install the ctx plugin for hooks & skills:")
-	cmd.Println("  /plugin marketplace add ActiveMemory/ctx")
-	cmd.Println("  /plugin install ctx@activememory-ctx")
-	cmd.Println()
-	cmd.Println("Note: local plugin installs are not auto-enabled globally.")
-	cmd.Println("Run 'ctx init' again after installing the plugin to enable it,")
-	cmd.Println("or manually add to ~/.claude/settings.json:")
-	cmd.Println("  {\"enabledPlugins\": {\"ctx@activememory-ctx\": true}}")
+	write.InfoInitNextSteps(cmd)
 
 	return nil
 }
@@ -219,50 +212,49 @@ func Run(cmd *cobra.Command, force, minimal, merge, ralph, noPluginEnable bool)
 func initScratchpad(cmd *cobra.Command, contextDir string) error {
 	if !rc.ScratchpadEncrypt() {
 		// Plaintext mode: create empty scratchpad.md if not present
-		mdPath := filepath.Join(contextDir, config.FileScratchpadMd)
+		mdPath := filepath.Join(contextDir, pad.Md)
 		if _, err := os.Stat(mdPath); err != nil {
-			if err := os.WriteFile(mdPath, nil, config.PermFile); err != nil {
-				return fmt.Errorf("failed to create %s: %w", mdPath, err)
+			if err := os.WriteFile(mdPath, nil, fs.PermFile); err != nil {
+				return ctxerr.FileWrite(mdPath, err)
 			}
-			cmd.Println(fmt.Sprintf("  ✓ %s (plaintext scratchpad)", mdPath))
+			write.InfoInitScratchpadPlaintext(cmd, mdPath)
 		} else {
-			cmd.Println(fmt.Sprintf("  ○ %s (exists, skipped)", mdPath))
+			write.InfoInitExistsSkipped(cmd, mdPath)
 		}
 		return nil
 	}
 
 	// Encrypted mode
 	kPath := rc.KeyPath()
-	encPath := filepath.Join(contextDir, config.FileScratchpadEnc)
+	encPath := filepath.Join(contextDir, pad.Enc)
 
 	// Check if key already exists (idempotent)
 	if _, err := os.Stat(kPath); err == nil {
-		cmd.Println(fmt.Sprintf("  ○ %s (exists, skipped)", kPath))
+		write.InfoInitExistsSkipped(cmd, kPath)
 		return nil
 	}
 
 	// Warn if encrypted file exists but no key
 	if _, err := os.Stat(encPath); err == nil {
-		cmd.Println(fmt.Sprintf("  ⚠ Encrypted scratchpad found but no key at %s",
-			kPath))
+		write.InfoInitScratchpadNoKey(cmd, kPath)
 		return nil
 	}
 
 	// Ensure key directory exists.
-	if mkdirErr := os.MkdirAll(filepath.Dir(kPath), config.PermKeyDir); mkdirErr != nil {
-		return fmt.Errorf("failed to create key dir: %w", mkdirErr)
+	if mkdirErr := os.MkdirAll(filepath.Dir(kPath), fs.PermKeyDir); mkdirErr != nil {
+		return ctxerr.MkdirKeyDir(mkdirErr)
 	}
 
 	// Generate key
 	key, err := crypto.GenerateKey()
 	if err != nil {
-		return fmt.Errorf("failed to generate scratchpad key: %w", err)
+		return ctxerr.GenerateKey(err)
 	}
 
 	if err := crypto.SaveKey(kPath, key); err != nil {
-		return fmt.Errorf("failed to save scratchpad key: %w", err)
+		return ctxerr.SaveKey(err)
 	}
-	cmd.Println(fmt.Sprintf("  ✓ Scratchpad key created at %s", kPath))
+	write.InfoInitScratchpadKeyCreated(cmd, kPath)
 
 	return nil
 }
@@ -272,7 +264,7 @@ func initScratchpad(cmd *cobra.Command, contextDir string) error {
 // directory with only logs/ or other non-essential content is considered
 // uninitialized.
 func hasEssentialFiles(contextDir string) bool {
-	for _, f := range config.FilesRequired {
+	for _, f := range ctx.FilesRequired {
 		if _, err := os.Stat(filepath.Join(contextDir, f)); err == nil {
 			return true
 		}
@@ -292,13 +284,13 @@ func ensureGitignoreEntries(cmd *cobra.Command) error {
 
 	// Build set of existing trimmed lines.
 	existing := make(map[string]bool)
-	for _, line := range strings.Split(string(content), config.NewlineLF) {
+	for _, line := range strings.Split(string(content), token.NewlineLF) {
 		existing[strings.TrimSpace(line)] = true
 	}
 
 	// Collect missing entries.
 	var missing []string
-	for _, entry := range config.GitignoreEntries {
+	for _, entry := range file.Gitignore {
 		if !existing[entry] {
 			missing = append(missing, entry)
 		}
@@ -310,19 +302,19 @@ func ensureGitignoreEntries(cmd *cobra.Command) error {
 
 	// Build block to append.
 	var sb strings.Builder
-	if len(content) > 0 && !strings.HasSuffix(string(content), config.NewlineLF) {
-		sb.WriteString(config.NewlineLF)
+	if len(content) > 0 && !strings.HasSuffix(string(content), token.NewlineLF) {
+		sb.WriteString(token.NewlineLF)
 	}
-	sb.WriteString("\n# ctx managed entries\n")
+	sb.WriteString(token.NewlineLF + gitignoreHeader + token.NewlineLF)
 	for _, entry := range missing {
-		sb.WriteString(entry + config.NewlineLF)
+		sb.WriteString(entry + token.NewlineLF)
 	}
 
-	if err := os.WriteFile(gitignorePath, append(content, []byte(sb.String())...), config.PermFile); err != nil {
+	if err := os.WriteFile(gitignorePath, append(content, []byte(sb.String())...), fs.PermFile); err != nil {
 		return err
 	}
 
-	cmd.Println(fmt.Sprintf("  ✓ .gitignore updated (%d entries added)", len(missing)))
-	cmd.Println("  Review with: cat .gitignore")
+	write.InfoInitGitignoreUpdated(cmd, len(missing))
+	write.InfoInitGitignoreReview(cmd)
 	return nil
 }
diff --git a/internal/cli/initialize/core/claude.go b/internal/cli/initialize/core/claude.go
index dc2f9e6b..d4acdf50 100644
--- a/internal/cli/initialize/core/claude.go
+++ b/internal/cli/initialize/core/claude.go
@@ -13,10 +13,16 @@ import (
 	"strings"
 	"time"
 
+	"github.com/ActiveMemory/ctx/internal/config/claude"
+	"github.com/ActiveMemory/ctx/internal/config/cli"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	"github.com/ActiveMemory/ctx/internal/config/marker"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/assets"
-	"github.com/ActiveMemory/ctx/internal/config"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
+	"github.com/ActiveMemory/ctx/internal/write"
 )
 
 // HandleClaudeMd creates or merges CLAUDE.md with ctx content.
@@ -31,57 +37,57 @@ import (
 func HandleClaudeMd(cmd *cobra.Command, force, autoMerge bool) error {
 	templateContent, err := assets.ClaudeMd()
 	if err != nil {
-		return fmt.Errorf("failed to read CLAUDE.md template: %w", err)
+		return ctxerr.ReadInitTemplate("CLAUDE.md", err)
 	}
-	existingContent, err := os.ReadFile(config.FileClaudeMd)
+	existingContent, err := os.ReadFile(claude.Md)
 	fileExists := err == nil
 	if !fileExists {
-		if err := os.WriteFile(config.FileClaudeMd, templateContent, config.PermFile); err != nil {
-			return fmt.Errorf("failed to write %s: %w", config.FileClaudeMd, err)
+		if err := os.WriteFile(claude.Md, templateContent, fs.PermFile); err != nil {
+			return ctxerr.FileWrite(claude.Md, err)
 		}
-		cmd.Println(fmt.Sprintf("  ✓ %s", config.FileClaudeMd))
+		write.InitCreated(cmd, claude.Md)
 		return nil
 	}
 	existingStr := string(existingContent)
-	hasCtxMarkers := strings.Contains(existingStr, config.CtxMarkerStart)
+	hasCtxMarkers := strings.Contains(existingStr, marker.CtxMarkerStart)
 	if hasCtxMarkers {
 		if !force {
-			cmd.Println(fmt.Sprintf("  ○ %s (ctx content exists, skipped)\n", config.FileClaudeMd))
+			write.InitCtxContentExists(cmd, claude.Md)
 			return nil
 		}
 		return UpdateCtxSection(cmd, existingStr, templateContent)
 	}
 	if !autoMerge {
-		cmd.Println(fmt.Sprintf("\n%s exists but has no ctx content.\n", config.FileClaudeMd))
+		write.InitFileExistsNoCtx(cmd, claude.Md)
 		cmd.Println("Would you like to append ctx context management instructions?")
 		cmd.Print("[y/N] ")
 		reader := bufio.NewReader(os.Stdin)
 		response, err := reader.ReadString('\n')
 		if err != nil {
-			return fmt.Errorf("failed to read input: %w", err)
+			return ctxerr.ReadInput(err)
 		}
 		response = strings.TrimSpace(strings.ToLower(response))
-		if response != config.ConfirmShort && response != config.ConfirmLong {
-			cmd.Println(fmt.Sprintf("  ○ %s (skipped)", config.FileClaudeMd))
+		if response != cli.ConfirmShort && response != cli.ConfirmLong {
+			write.InitSkippedPlain(cmd, claude.Md)
 			return nil
 		}
 	}
 	timestamp := time.Now().Unix()
-	backupName := fmt.Sprintf("%s.%d.bak", config.FileClaudeMd, timestamp)
-	if err := os.WriteFile(backupName, existingContent, config.PermFile); err != nil {
-		return fmt.Errorf("failed to create backup %s: %w", backupName, err)
+	backupName := fmt.Sprintf("%s.%d.bak", claude.Md, timestamp)
+	if err := os.WriteFile(backupName, existingContent, fs.PermFile); err != nil {
+		return ctxerr.CreateBackup(backupName, err)
 	}
-	cmd.Println(fmt.Sprintf("  ✓ %s (backup)", backupName))
+	write.InitBackup(cmd, backupName)
 	insertPos := FindInsertionPoint(existingStr)
 	var mergedContent string
 	if insertPos == 0 {
-		mergedContent = string(templateContent) + config.NewlineLF + existingStr
+		mergedContent = string(templateContent) + token.NewlineLF + existingStr
 	} else {
-		mergedContent = existingStr[:insertPos] + config.NewlineLF + string(templateContent) + config.NewlineLF + existingStr[insertPos:]
+		mergedContent = existingStr[:insertPos] + token.NewlineLF + string(templateContent) + token.NewlineLF + existingStr[insertPos:]
 	}
-	if err := os.WriteFile(config.FileClaudeMd, []byte(mergedContent), config.PermFile); err != nil {
-		return fmt.Errorf("failed to write merged %s: %w", config.FileClaudeMd, err)
+	if err := os.WriteFile(claude.Md, []byte(mergedContent), fs.PermFile); err != nil {
+		return ctxerr.WriteMerged(claude.Md, err)
 	}
-	cmd.Println(fmt.Sprintf("  ✓ %s (merged)", config.FileClaudeMd))
+	write.InitMerged(cmd, claude.Md)
 	return nil
 }
diff --git a/internal/cli/initialize/core/dirs.go b/internal/cli/initialize/core/dirs.go
index 624d6738..659fec5d 100644
--- a/internal/cli/initialize/core/dirs.go
+++ b/internal/cli/initialize/core/dirs.go
@@ -7,21 +7,24 @@
 package core
 
 import (
-	"fmt"
 	"os"
 	"path/filepath"
 
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/assets"
-	"github.com/ActiveMemory/ctx/internal/config"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
+	"github.com/ActiveMemory/ctx/internal/write"
 )
 
 // ProjectDirs lists the project-root directories created by ctx init,
 // each with an explanatory README.md.
 var ProjectDirs = []string{
-	config.DirSpecs,
-	config.DirIdeas,
+	dir.Specs,
+	dir.Ideas,
 }
 
 // CreateProjectDirs creates project-root directories (specs/, ideas/) with
@@ -36,26 +39,25 @@ var ProjectDirs = []string{
 func CreateProjectDirs(cmd *cobra.Command) error {
 	for _, dir := range ProjectDirs {
 		if _, statErr := os.Stat(dir); statErr == nil {
-			cmd.Println(fmt.Sprintf("  ○ %s/ (exists, skipped)", dir))
+			write.InitSkippedDir(cmd, dir)
 			continue
 		}
 
-		if mkdirErr := os.MkdirAll(dir, config.PermExec); mkdirErr != nil {
-			return fmt.Errorf("failed to create %s/: %w", dir, mkdirErr)
+		if mkdirErr := os.MkdirAll(dir, fs.PermExec); mkdirErr != nil {
+			return ctxerr.Mkdir(dir+"/", mkdirErr)
 		}
 
 		readme, readErr := assets.ProjectReadme(dir)
 		if readErr != nil {
-			return fmt.Errorf("failed to read %s README template: %w",
-				dir, readErr)
+			return ctxerr.ReadProjectReadme(dir, readErr)
 		}
 
-		readmePath := filepath.Join(dir, config.FilenameReadme)
-		if writeErr := os.WriteFile(readmePath, readme, config.PermFile); writeErr != nil {
-			return fmt.Errorf("failed to write %s: %w", readmePath, writeErr)
+		readmePath := filepath.Join(dir, file.Readme)
+		if writeErr := os.WriteFile(readmePath, readme, fs.PermFile); writeErr != nil {
+			return ctxerr.FileWrite(readmePath, writeErr)
 		}
 
-		cmd.Println(fmt.Sprintf("  ✓ %s/", dir))
+		write.InitCreatedDir(cmd, dir)
 	}
 
 	return nil
diff --git a/internal/cli/initialize/core/fs.go b/internal/cli/initialize/core/fs.go
index 83ba714e..859f6346 100644
--- a/internal/cli/initialize/core/fs.go
+++ b/internal/cli/initialize/core/fs.go
@@ -12,7 +12,12 @@ import (
 	"strings"
 	"time"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/claude"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	"github.com/ActiveMemory/ctx/internal/config/marker"
+	"github.com/ActiveMemory/ctx/internal/config/token"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
+	"github.com/ActiveMemory/ctx/internal/write"
 	"github.com/spf13/cobra"
 )
 
@@ -24,7 +29,7 @@ import (
 // Returns:
 //   - int: Position to insert at
 func FindInsertionPoint(content string) int {
-	lines := strings.Split(content, config.NewlineLF)
+	lines := strings.Split(content, token.NewlineLF)
 	pos := 0
 	for i, line := range lines {
 		trimmed := strings.TrimSpace(line)
@@ -69,33 +74,33 @@ func FindInsertionPoint(content string) int {
 // Returns:
 //   - error: Non-nil if markers are missing or file operations fail
 func UpdateCtxSection(cmd *cobra.Command, existing string, newTemplate []byte) error {
-	startIdx := strings.Index(existing, config.CtxMarkerStart)
+	startIdx := strings.Index(existing, marker.CtxMarkerStart)
 	if startIdx == -1 {
-		return fmt.Errorf("ctx start marker not found")
+		return ctxerr.MarkerNotFound("ctx")
 	}
-	endIdx := strings.Index(existing, config.CtxMarkerEnd)
+	endIdx := strings.Index(existing, marker.CtxMarkerEnd)
 	if endIdx == -1 {
 		endIdx = len(existing)
 	} else {
-		endIdx += len(config.CtxMarkerEnd)
+		endIdx += len(marker.CtxMarkerEnd)
 	}
 	templateStr := string(newTemplate)
-	templateStart := strings.Index(templateStr, config.CtxMarkerStart)
-	templateEnd := strings.Index(templateStr, config.CtxMarkerEnd)
+	templateStart := strings.Index(templateStr, marker.CtxMarkerStart)
+	templateEnd := strings.Index(templateStr, marker.CtxMarkerEnd)
 	if templateStart == -1 || templateEnd == -1 {
-		return fmt.Errorf("template missing ctx markers")
+		return ctxerr.TemplateMissingMarkers("ctx")
 	}
-	ctxContent := templateStr[templateStart : templateEnd+len(config.CtxMarkerEnd)]
+	ctxContent := templateStr[templateStart : templateEnd+len(marker.CtxMarkerEnd)]
 	newContent := existing[:startIdx] + ctxContent + existing[endIdx:]
 	timestamp := time.Now().Unix()
-	backupName := fmt.Sprintf("%s.%d.bak", config.FileClaudeMd, timestamp)
-	if err := os.WriteFile(backupName, []byte(existing), config.PermFile); err != nil {
-		return fmt.Errorf("failed to create backup: %w", err)
+	backupName := fmt.Sprintf("%s.%d.bak", claude.Md, timestamp)
+	if err := os.WriteFile(backupName, []byte(existing), fs.PermFile); err != nil {
+		return ctxerr.CreateBackupGeneric(err)
 	}
-	cmd.Println(fmt.Sprintf("  ✓ %s (backup)", backupName))
-	if err := os.WriteFile(config.FileClaudeMd, []byte(newContent), config.PermFile); err != nil {
-		return fmt.Errorf("failed to update %s: %w", config.FileClaudeMd, err)
+	write.InitBackup(cmd, backupName)
+	if err := os.WriteFile(claude.Md, []byte(newContent), fs.PermFile); err != nil {
+		return ctxerr.FileUpdate(claude.Md, err)
 	}
-	cmd.Println(fmt.Sprintf("  ✓ %s (updated ctx section)\n", config.FileClaudeMd))
+	write.InitUpdatedCtxSection(cmd, claude.Md)
 	return nil
 }
diff --git a/internal/cli/initialize/core/hook.go b/internal/cli/initialize/core/hook.go
index 73fff78f..fd5c691c 100644
--- a/internal/cli/initialize/core/hook.go
+++ b/internal/cli/initialize/core/hook.go
@@ -9,15 +9,18 @@ package core
 import (
 	"bytes"
 	"encoding/json"
-	"fmt"
 	"os"
 	"strings"
 
+	claude2 "github.com/ActiveMemory/ctx/internal/config/claude"
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/assets"
 	"github.com/ActiveMemory/ctx/internal/claude"
-	"github.com/ActiveMemory/ctx/internal/config"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
+	"github.com/ActiveMemory/ctx/internal/write"
 )
 
 // MergeSettingsPermissions merges ctx permissions into settings.local.json.
@@ -29,11 +32,11 @@ import (
 //   - error: Non-nil if file operations fail
 func MergeSettingsPermissions(cmd *cobra.Command) error {
 	var settings claude.Settings
-	existingContent, err := os.ReadFile(config.FileSettings)
+	existingContent, err := os.ReadFile(claude2.Settings)
 	fileExists := err == nil
 	if fileExists {
 		if err := json.Unmarshal(existingContent, &settings); err != nil {
-			return fmt.Errorf("failed to parse existing %s: %w", config.FileSettings, err)
+			return ctxerr.ParseFile(claude2.Settings, err)
 		}
 	}
 	allowModified := MergePermissions(&settings.Permissions.Allow, assets.DefaultAllowPermissions())
@@ -41,39 +44,39 @@ func MergeSettingsPermissions(cmd *cobra.Command) error {
 	allowDeduped := DeduplicatePermissions(&settings.Permissions.Allow)
 	denyDeduped := DeduplicatePermissions(&settings.Permissions.Deny)
 	if !allowModified && !denyModified && !allowDeduped && !denyDeduped {
-		cmd.Println(fmt.Sprintf("  ○ %s (no changes needed)\n", config.FileSettings))
+		write.InitNoChanges(cmd, claude2.Settings)
 		return nil
 	}
-	if err := os.MkdirAll(config.DirClaude, config.PermExec); err != nil {
-		return fmt.Errorf("failed to create %s: %w", config.DirClaude, err)
+	if err := os.MkdirAll(dir.Claude, fs.PermExec); err != nil {
+		return ctxerr.Mkdir(dir.Claude, err)
 	}
 	var buf bytes.Buffer
 	encoder := json.NewEncoder(&buf)
 	encoder.SetEscapeHTML(false)
 	encoder.SetIndent("", "  ")
 	if err := encoder.Encode(settings); err != nil {
-		return fmt.Errorf("failed to marshal settings: %w", err)
+		return ctxerr.MarshalSettings(err)
 	}
-	if err := os.WriteFile(config.FileSettings, buf.Bytes(), config.PermFile); err != nil {
-		return fmt.Errorf("failed to write %s: %w", config.FileSettings, err)
+	if err := os.WriteFile(claude2.Settings, buf.Bytes(), fs.PermFile); err != nil {
+		return ctxerr.FileWrite(claude2.Settings, err)
 	}
 	if fileExists {
 		deduped := allowDeduped || denyDeduped
 		merged := allowModified || denyModified
 		switch {
 		case merged && deduped:
-			cmd.Println(fmt.Sprintf("  ✓ %s (added ctx permissions, removed duplicates)", config.FileSettings))
+			write.InitPermsMergedDeduped(cmd, claude2.Settings)
 		case deduped:
-			cmd.Println(fmt.Sprintf("  ✓ %s (removed duplicate permissions)", config.FileSettings))
+			write.InitPermsDeduped(cmd, claude2.Settings)
 		case allowModified && denyModified:
-			cmd.Println(fmt.Sprintf("  ✓ %s (added ctx allow + deny permissions)", config.FileSettings))
+			write.InitPermsAllowDeny(cmd, claude2.Settings)
 		case denyModified:
-			cmd.Println(fmt.Sprintf("  ✓ %s (added ctx deny permissions)", config.FileSettings))
+			write.InitPermsDeny(cmd, claude2.Settings)
 		default:
-			cmd.Println(fmt.Sprintf("  ✓ %s (added ctx permissions)", config.FileSettings))
+			write.InitPermsAllow(cmd, claude2.Settings)
 		}
 	} else {
-		cmd.Println(fmt.Sprintf("  ✓ %s", config.FileSettings))
+		write.InitCreated(cmd, claude2.Settings)
 	}
 	return nil
 }
diff --git a/internal/cli/initialize/core/makefile.go b/internal/cli/initialize/core/makefile.go
index f276cd9a..59326490 100644
--- a/internal/cli/initialize/core/makefile.go
+++ b/internal/cli/initialize/core/makefile.go
@@ -7,14 +7,17 @@
 package core
 
 import (
-	"fmt"
 	"os"
 	"strings"
 
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	"github.com/ActiveMemory/ctx/internal/config/project"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/assets"
-	"github.com/ActiveMemory/ctx/internal/config"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
+	"github.com/ActiveMemory/ctx/internal/write"
 )
 
 // IncludeDirective is the line appended to the user's Makefile to pull
@@ -31,33 +34,33 @@ const IncludeDirective = "-include Makefile.ctx"
 func HandleMakefileCtx(cmd *cobra.Command) error {
 	content, err := assets.MakefileCtx()
 	if err != nil {
-		return fmt.Errorf("failed to read Makefile.ctx template: %w", err)
+		return ctxerr.ReadInitTemplate("Makefile.ctx", err)
 	}
-	if err = os.WriteFile(config.FileMakefileCtx, content, config.PermFile); err != nil {
-		return fmt.Errorf("failed to write %s: %w", config.FileMakefileCtx, err)
+	if err = os.WriteFile(project.MakefileCtx, content, fs.PermFile); err != nil {
+		return ctxerr.FileWrite(project.MakefileCtx, err)
 	}
-	cmd.Println(fmt.Sprintf("  ✓ %s", config.FileMakefileCtx))
+	write.InitCreated(cmd, project.MakefileCtx)
 	existing, err := os.ReadFile("Makefile")
 	if err != nil {
-		minimal := IncludeDirective + config.NewlineLF
-		if err := os.WriteFile("Makefile", []byte(minimal), config.PermFile); err != nil {
-			return fmt.Errorf("failed to create Makefile: %w", err)
+		minimal := IncludeDirective + token.NewlineLF
+		if err := os.WriteFile("Makefile", []byte(minimal), fs.PermFile); err != nil {
+			return ctxerr.CreateMakefile(err)
 		}
-		cmd.Println("  ✓ Makefile (created with ctx include)")
+		write.InitMakefileCreated(cmd)
 		return nil
 	}
 	if strings.Contains(string(existing), IncludeDirective) {
-		cmd.Println(fmt.Sprintf("  ○ Makefile (already includes %s)\n", config.FileMakefileCtx))
+		write.InitMakefileIncludes(cmd, project.MakefileCtx)
 		return nil
 	}
 	amended := string(existing)
-	if !strings.HasSuffix(amended, config.NewlineLF) {
-		amended += config.NewlineLF
+	if !strings.HasSuffix(amended, token.NewlineLF) {
+		amended += token.NewlineLF
 	}
-	amended += config.NewlineLF + IncludeDirective + config.NewlineLF
-	if err := os.WriteFile("Makefile", []byte(amended), config.PermFile); err != nil {
-		return fmt.Errorf("failed to amend Makefile: %w", err)
+	amended += token.NewlineLF + IncludeDirective + token.NewlineLF
+	if err := os.WriteFile("Makefile", []byte(amended), fs.PermFile); err != nil {
+		return ctxerr.FileAmend("Makefile", err)
 	}
-	cmd.Println(fmt.Sprintf("  ✓ Makefile (appended %s include)\n", config.FileMakefileCtx))
+	write.InitMakefileAppended(cmd, project.MakefileCtx)
 	return nil
 }
diff --git a/internal/cli/initialize/core/plan.go b/internal/cli/initialize/core/plan.go
index b65e917d..5bd66f10 100644
--- a/internal/cli/initialize/core/plan.go
+++ b/internal/cli/initialize/core/plan.go
@@ -13,10 +13,16 @@ import (
 	"strings"
 	"time"
 
+	"github.com/ActiveMemory/ctx/internal/config/cli"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	"github.com/ActiveMemory/ctx/internal/config/marker"
+	"github.com/ActiveMemory/ctx/internal/config/project"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/assets"
-	"github.com/ActiveMemory/ctx/internal/config"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
+	"github.com/ActiveMemory/ctx/internal/write"
 )
 
 // HandleImplementationPlan creates or merges IMPLEMENTATION_PLAN.md.
@@ -29,60 +35,60 @@ import (
 // Returns:
 //   - error: Non-nil if file operations fail
 func HandleImplementationPlan(cmd *cobra.Command, force, autoMerge bool) error {
-	templateContent, err := assets.ProjectFile(config.FileImplementationPlan)
+	templateContent, err := assets.ProjectFile(project.ImplementationPlan)
 	if err != nil {
-		return fmt.Errorf("failed to read IMPLEMENTATION_PLAN.md template: %w", err)
+		return ctxerr.ReadInitTemplate("IMPLEMENTATION_PLAN.md", err)
 	}
-	existingContent, err := os.ReadFile(config.FileImplementationPlan)
+	existingContent, err := os.ReadFile(project.ImplementationPlan)
 	fileExists := err == nil
 	if !fileExists {
-		if err := os.WriteFile(config.FileImplementationPlan, templateContent, config.PermFile); err != nil {
-			return fmt.Errorf("failed to write %s: %w", config.FileImplementationPlan, err)
+		if err := os.WriteFile(project.ImplementationPlan, templateContent, fs.PermFile); err != nil {
+			return ctxerr.FileWrite(project.ImplementationPlan, err)
 		}
-		cmd.Println(fmt.Sprintf("  ✓ %s", config.FileImplementationPlan))
+		write.InitCreated(cmd, project.ImplementationPlan)
 		return nil
 	}
 	existingStr := string(existingContent)
-	hasCtxMarkers := strings.Contains(existingStr, config.PlanMarkerStart)
+	hasCtxMarkers := strings.Contains(existingStr, marker.PlanMarkerStart)
 	if hasCtxMarkers {
 		if !force {
-			cmd.Println(fmt.Sprintf("  ○ %s (ctx content exists, skipped)\n", config.FileImplementationPlan))
+			write.InitCtxContentExists(cmd, project.ImplementationPlan)
 			return nil
 		}
 		return UpdatePlanSection(cmd, existingStr, templateContent)
 	}
 	if !autoMerge {
-		cmd.Println(fmt.Sprintf("\n%s exists but has no ctx content.\n", config.FileImplementationPlan))
+		write.InitFileExistsNoCtx(cmd, project.ImplementationPlan)
 		cmd.Println("Would you like to merge ctx implementation plan template?")
 		cmd.Print("[y/N] ")
 		reader := bufio.NewReader(os.Stdin)
 		response, err := reader.ReadString('\n')
 		if err != nil {
-			return fmt.Errorf("failed to read input: %w", err)
+			return ctxerr.ReadInput(err)
 		}
 		response = strings.TrimSpace(strings.ToLower(response))
-		if response != config.ConfirmShort && response != config.ConfirmLong {
-			cmd.Println(fmt.Sprintf("  ○ %s (skipped)\n", config.FileImplementationPlan))
+		if response != cli.ConfirmShort && response != cli.ConfirmLong {
+			write.InitSkippedPlain(cmd, project.ImplementationPlan)
 			return nil
 		}
 	}
 	timestamp := time.Now().Unix()
-	backupName := fmt.Sprintf("%s.%d.bak", config.FileImplementationPlan, timestamp)
-	if err := os.WriteFile(backupName, existingContent, config.PermFile); err != nil {
-		return fmt.Errorf("failed to create backup %s: %w", backupName, err)
+	backupName := fmt.Sprintf("%s.%d.bak", project.ImplementationPlan, timestamp)
+	if err := os.WriteFile(backupName, existingContent, fs.PermFile); err != nil {
+		return ctxerr.CreateBackup(backupName, err)
 	}
-	cmd.Println(fmt.Sprintf("  ✓ %s (backup)", backupName))
+	write.InitBackup(cmd, backupName)
 	insertPos := FindInsertionPoint(existingStr)
 	var mergedContent string
 	if insertPos == 0 {
-		mergedContent = string(templateContent) + config.NewlineLF + existingStr
+		mergedContent = string(templateContent) + token.NewlineLF + existingStr
 	} else {
-		mergedContent = existingStr[:insertPos] + config.NewlineLF + string(templateContent) + config.NewlineLF + existingStr[insertPos:]
+		mergedContent = existingStr[:insertPos] + token.NewlineLF + string(templateContent) + token.NewlineLF + existingStr[insertPos:]
 	}
-	if err := os.WriteFile(config.FileImplementationPlan, []byte(mergedContent), config.PermFile); err != nil {
-		return fmt.Errorf("failed to write merged %s: %w", config.FileImplementationPlan, err)
+	if err := os.WriteFile(project.ImplementationPlan, []byte(mergedContent), fs.PermFile); err != nil {
+		return ctxerr.WriteMerged(project.ImplementationPlan, err)
 	}
-	cmd.Println(fmt.Sprintf("  ✓ %s (merged)", config.FileImplementationPlan))
+	write.InitMerged(cmd, project.ImplementationPlan)
 	return nil
 }
 
@@ -97,33 +103,33 @@ func HandleImplementationPlan(cmd *cobra.Command, force, autoMerge bool) error {
 // Returns:
 //   - error: Non-nil if markers are missing or file operations fail
 func UpdatePlanSection(cmd *cobra.Command, existing string, newTemplate []byte) error {
-	startIdx := strings.Index(existing, config.PlanMarkerStart)
+	startIdx := strings.Index(existing, marker.PlanMarkerStart)
 	if startIdx == -1 {
-		return fmt.Errorf("plan start marker not found")
+		return ctxerr.MarkerNotFound("plan")
 	}
-	endIdx := strings.Index(existing, config.PlanMarkerEnd)
+	endIdx := strings.Index(existing, marker.PlanMarkerEnd)
 	if endIdx == -1 {
 		endIdx = len(existing)
 	} else {
-		endIdx += len(config.PlanMarkerEnd)
+		endIdx += len(marker.PlanMarkerEnd)
 	}
 	templateStr := string(newTemplate)
-	templateStart := strings.Index(templateStr, config.PlanMarkerStart)
-	templateEnd := strings.Index(templateStr, config.PlanMarkerEnd)
+	templateStart := strings.Index(templateStr, marker.PlanMarkerStart)
+	templateEnd := strings.Index(templateStr, marker.PlanMarkerEnd)
 	if templateStart == -1 || templateEnd == -1 {
-		return fmt.Errorf("template missing plan markers")
+		return ctxerr.TemplateMissingMarkers("plan")
 	}
-	planContent := templateStr[templateStart : templateEnd+len(config.PlanMarkerEnd)]
+	planContent := templateStr[templateStart : templateEnd+len(marker.PlanMarkerEnd)]
 	newContent := existing[:startIdx] + planContent + existing[endIdx:]
 	timestamp := time.Now().Unix()
-	backupName := fmt.Sprintf("%s.%d.bak", config.FileImplementationPlan, timestamp)
-	if err := os.WriteFile(backupName, []byte(existing), config.PermFile); err != nil {
-		return fmt.Errorf("failed to create backup: %w", err)
+	backupName := fmt.Sprintf("%s.%d.bak", project.ImplementationPlan, timestamp)
+	if err := os.WriteFile(backupName, []byte(existing), fs.PermFile); err != nil {
+		return ctxerr.CreateBackupGeneric(err)
 	}
-	cmd.Println(fmt.Sprintf("  ✓ %s (backup)", backupName))
-	if err := os.WriteFile(config.FileImplementationPlan, []byte(newContent), config.PermFile); err != nil {
-		return fmt.Errorf("failed to update %s: %w", config.FileImplementationPlan, err)
+	write.InitBackup(cmd, backupName)
+	if err := os.WriteFile(project.ImplementationPlan, []byte(newContent), fs.PermFile); err != nil {
+		return ctxerr.FileUpdate(project.ImplementationPlan, err)
 	}
-	cmd.Println(fmt.Sprintf("  ✓ %s (updated plan section)\n", config.FileImplementationPlan))
+	write.InitUpdatedPlanSection(cmd, project.ImplementationPlan)
 	return nil
 }
diff --git a/internal/cli/initialize/core/plugin.go b/internal/cli/initialize/core/plugin.go
index b67f021d..492b1a27 100644
--- a/internal/cli/initialize/core/plugin.go
+++ b/internal/cli/initialize/core/plugin.go
@@ -9,13 +9,16 @@ package core
 import (
 	"bytes"
 	"encoding/json"
-	"fmt"
 	"os"
 	"path/filepath"
 
+	"github.com/ActiveMemory/ctx/internal/config/claude"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	"github.com/ActiveMemory/ctx/internal/write/add"
 	"github.com/spf13/cobra"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
+	"github.com/ActiveMemory/ctx/internal/write"
 )
 
 type installedPlugins struct {
@@ -34,32 +37,32 @@ type globalSettings map[string]json.RawMessage
 func EnablePluginGlobally(cmd *cobra.Command) error {
 	homeDir, homeErr := os.UserHomeDir()
 	if homeErr != nil {
-		return fmt.Errorf("cannot determine home directory: %w", homeErr)
+		return ctxerr.HomeDir(homeErr)
 	}
 	claudeDir := filepath.Join(homeDir, ".claude")
-	installedPath := filepath.Join(claudeDir, config.FileInstalledPlugins)
+	installedPath := filepath.Join(claudeDir, claude.InstalledPlugins)
 	installedData, readErr := os.ReadFile(installedPath) //nolint:gosec // G304: path from os.UserHomeDir
 	if readErr != nil {
-		cmd.Println("  ○ Plugin enablement skipped (plugin not installed)")
+		write.InitPluginSkipped(cmd)
 		return nil
 	}
 	var installed installedPlugins
 	if parseErr := json.Unmarshal(installedData, &installed); parseErr != nil {
-		return fmt.Errorf("failed to parse %s: %w", installedPath, parseErr)
+		return ctxerr.ParseFile(installedPath, parseErr)
 	}
-	if _, found := installed.Plugins[config.PluginID]; !found {
-		cmd.Println("  ○ Plugin enablement skipped (plugin not installed)")
+	if _, found := installed.Plugins[claude.PluginID]; !found {
+		write.InitPluginSkipped(cmd)
 		return nil
 	}
-	settingsPath := filepath.Join(claudeDir, config.FileGlobalSettings)
+	settingsPath := filepath.Join(claudeDir, claude.GlobalSettings)
 	var settings globalSettings
 	existingData, readErr := os.ReadFile(settingsPath) //nolint:gosec // G304: path from os.UserHomeDir
 	if readErr != nil && !os.IsNotExist(readErr) {
-		return fmt.Errorf("failed to read %s: %w", settingsPath, readErr)
+		return add.ErrFileRead(settingsPath, readErr)
 	}
 	if readErr == nil {
 		if parseErr := json.Unmarshal(existingData, &settings); parseErr != nil {
-			return fmt.Errorf("failed to parse %s: %w", settingsPath, parseErr)
+			return ctxerr.ParseFile(settingsPath, parseErr)
 		}
 	} else {
 		settings = make(globalSettings)
@@ -67,8 +70,8 @@ func EnablePluginGlobally(cmd *cobra.Command) error {
 	if raw, ok := settings["enabledPlugins"]; ok {
 		var enabled map[string]bool
 		if parseErr := json.Unmarshal(raw, &enabled); parseErr == nil {
-			if enabled[config.PluginID] {
-				cmd.Println("  ○ Plugin already enabled globally")
+			if enabled[claude.PluginID] {
+				write.InitPluginAlreadyEnabled(cmd)
 				return nil
 			}
 		}
@@ -81,10 +84,10 @@ func EnablePluginGlobally(cmd *cobra.Command) error {
 	} else {
 		enabled = make(map[string]bool)
 	}
-	enabled[config.PluginID] = true
+	enabled[claude.PluginID] = true
 	enabledJSON, marshalErr := json.Marshal(enabled)
 	if marshalErr != nil {
-		return fmt.Errorf("failed to marshal enabledPlugins: %w", marshalErr)
+		return ctxerr.MarshalPlugins(marshalErr)
 	}
 	settings["enabledPlugins"] = enabledJSON
 	var buf bytes.Buffer
@@ -92,12 +95,12 @@ func EnablePluginGlobally(cmd *cobra.Command) error {
 	encoder.SetEscapeHTML(false)
 	encoder.SetIndent("", "  ")
 	if encodeErr := encoder.Encode(settings); encodeErr != nil {
-		return fmt.Errorf("failed to marshal settings: %w", encodeErr)
+		return ctxerr.MarshalSettings(encodeErr)
 	}
-	if writeErr := os.WriteFile(settingsPath, buf.Bytes(), config.PermFile); writeErr != nil {
-		return fmt.Errorf("failed to write %s: %w", settingsPath, writeErr)
+	if writeErr := os.WriteFile(settingsPath, buf.Bytes(), fs.PermFile); writeErr != nil {
+		return ctxerr.FileWrite(settingsPath, writeErr)
 	}
-	cmd.Println(fmt.Sprintf("  ✓ Plugin enabled globally in %s", settingsPath))
+	write.InitPluginEnabled(cmd, settingsPath)
 	return nil
 }
 
@@ -108,7 +111,7 @@ func PluginInstalled() bool {
 	if homeErr != nil {
 		return false
 	}
-	installedPath := filepath.Join(homeDir, ".claude", config.FileInstalledPlugins)
+	installedPath := filepath.Join(homeDir, ".claude", claude.InstalledPlugins)
 	data, readErr := os.ReadFile(installedPath) //nolint:gosec // G304: path from os.UserHomeDir
 	if readErr != nil {
 		return false
@@ -117,7 +120,7 @@ func PluginInstalled() bool {
 	if parseErr := json.Unmarshal(data, &installed); parseErr != nil {
 		return false
 	}
-	_, found := installed.Plugins[config.PluginID]
+	_, found := installed.Plugins[claude.PluginID]
 	return found
 }
 
@@ -128,7 +131,7 @@ func PluginEnabledGlobally() bool {
 	if homeErr != nil {
 		return false
 	}
-	settingsPath := filepath.Join(homeDir, ".claude", config.FileGlobalSettings)
+	settingsPath := filepath.Join(homeDir, ".claude", claude.GlobalSettings)
 	data, readErr := os.ReadFile(settingsPath) //nolint:gosec // G304: path from os.UserHomeDir
 	if readErr != nil {
 		return false
@@ -145,13 +148,13 @@ func PluginEnabledGlobally() bool {
 	if parseErr := json.Unmarshal(raw, &enabled); parseErr != nil {
 		return false
 	}
-	return enabled[config.PluginID]
+	return enabled[claude.PluginID]
 }
 
 // PluginEnabledLocally reports whether the ctx plugin is enabled in
 // .claude/settings.local.json in the current project.
 func PluginEnabledLocally() bool {
-	data, readErr := os.ReadFile(config.FileSettings)
+	data, readErr := os.ReadFile(claude.Settings)
 	if readErr != nil {
 		return false
 	}
@@ -167,5 +170,5 @@ func PluginEnabledLocally() bool {
 	if parseErr := json.Unmarshal(epRaw, &enabled); parseErr != nil {
 		return false
 	}
-	return enabled[config.PluginID]
+	return enabled[claude.PluginID]
 }
diff --git a/internal/cli/initialize/core/prompt.go b/internal/cli/initialize/core/prompt.go
index 801faf15..250527a8 100644
--- a/internal/cli/initialize/core/prompt.go
+++ b/internal/cli/initialize/core/prompt.go
@@ -13,10 +13,16 @@ import (
 	"strings"
 	"time"
 
+	"github.com/ActiveMemory/ctx/internal/config/cli"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	"github.com/ActiveMemory/ctx/internal/config/loop"
+	"github.com/ActiveMemory/ctx/internal/config/marker"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/assets"
-	"github.com/ActiveMemory/ctx/internal/config"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
+	"github.com/ActiveMemory/ctx/internal/write"
 )
 
 // HandlePromptMd creates or merges PROMPT.md with ctx content.
@@ -33,70 +39,70 @@ func HandlePromptMd(cmd *cobra.Command, force, autoMerge, ralph bool) error {
 	var templateContent []byte
 	var err error
 	if ralph {
-		templateContent, err = assets.RalphTemplate(config.FilePromptMd)
+		templateContent, err = assets.RalphTemplate(loop.PromptMd)
 		if err != nil {
-			return fmt.Errorf("failed to read ralph PROMPT.md template: %w", err)
+			return ctxerr.ReadInitTemplate("ralph PROMPT.md", err)
 		}
 	} else {
-		templateContent, err = assets.Template(config.FilePromptMd)
+		templateContent, err = assets.Template(loop.PromptMd)
 		if err != nil {
-			return fmt.Errorf("failed to read PROMPT.md template: %w", err)
+			return ctxerr.ReadInitTemplate("PROMPT.md", err)
 		}
 	}
-	existingContent, err := os.ReadFile(config.FilePromptMd)
+	existingContent, err := os.ReadFile(loop.PromptMd)
 	fileExists := err == nil
 	if !fileExists {
-		if err := os.WriteFile(config.FilePromptMd, templateContent, config.PermFile); err != nil {
-			return fmt.Errorf("failed to write %s: %w", config.FilePromptMd, err)
+		if err := os.WriteFile(loop.PromptMd, templateContent, fs.PermFile); err != nil {
+			return ctxerr.FileWrite(loop.PromptMd, err)
 		}
 		mode := ""
 		if ralph {
 			mode = " (ralph mode)"
 		}
-		cmd.Println(fmt.Sprintf("  ✓ %s%s", config.FilePromptMd, mode))
+		write.InitCreatedWith(cmd, loop.PromptMd, mode)
 		return nil
 	}
 	existingStr := string(existingContent)
-	hasCtxMarkers := strings.Contains(existingStr, config.PromptMarkerStart)
+	hasCtxMarkers := strings.Contains(existingStr, marker.PromptMarkerStart)
 	if hasCtxMarkers {
 		if !force {
-			cmd.Println(fmt.Sprintf("  ○ %s (ctx content exists, skipped)\n", config.FilePromptMd))
+			write.InitCtxContentExists(cmd, loop.PromptMd)
 			return nil
 		}
 		return UpdatePromptSection(cmd, existingStr, templateContent)
 	}
 	if !autoMerge {
-		cmd.Println(fmt.Sprintf("\n%s exists but has no ctx content.\n", config.FilePromptMd))
+		write.InitFileExistsNoCtx(cmd, loop.PromptMd)
 		cmd.Println("Would you like to merge ctx prompt instructions?")
 		cmd.Print("[y/N] ")
 		reader := bufio.NewReader(os.Stdin)
 		response, err := reader.ReadString('\n')
 		if err != nil {
-			return fmt.Errorf("failed to read input: %w", err)
+			return ctxerr.ReadInput(err)
 		}
 		response = strings.TrimSpace(strings.ToLower(response))
-		if response != config.ConfirmShort && response != config.ConfirmLong {
-			cmd.Println(fmt.Sprintf("  ○ %s (skipped)", config.FilePromptMd))
+		if response != cli.ConfirmShort && response != cli.ConfirmLong {
+			write.InitSkippedPlain(cmd, loop.PromptMd)
 			return nil
 		}
 	}
 	timestamp := time.Now().Unix()
-	backupName := fmt.Sprintf("%s.%d.bak", config.FilePromptMd, timestamp)
-	if err := os.WriteFile(backupName, existingContent, config.PermFile); err != nil {
-		return fmt.Errorf("failed to create backup %s: %w", backupName, err)
+	backupName := fmt.Sprintf("%s.%d.bak", loop.PromptMd, timestamp)
+	if err := os.WriteFile(backupName, existingContent, fs.PermFile); err != nil {
+		return ctxerr.CreateBackup(backupName, err)
 	}
-	cmd.Println(fmt.Sprintf("  ✓ %s (backup)", backupName))
+	write.InitBackup(cmd, backupName)
 	insertPos := FindInsertionPoint(existingStr)
 	var mergedContent string
 	if insertPos == 0 {
-		mergedContent = string(templateContent) + config.NewlineLF + existingStr
+		mergedContent = string(templateContent) + token.NewlineLF + existingStr
 	} else {
-		mergedContent = existingStr[:insertPos] + config.NewlineLF + string(templateContent) + config.NewlineLF + existingStr[insertPos:]
+		mergedContent = existingStr[:insertPos] + token.NewlineLF + string(templateContent) + token.NewlineLF + existingStr[insertPos:]
 	}
-	if err := os.WriteFile(config.FilePromptMd, []byte(mergedContent), config.PermFile); err != nil {
-		return fmt.Errorf("failed to write merged %s: %w", config.FilePromptMd, err)
+	if err := os.WriteFile(loop.PromptMd, []byte(mergedContent), fs.PermFile); err != nil {
+		return ctxerr.WriteMerged(loop.PromptMd, err)
 	}
-	cmd.Println(fmt.Sprintf("  ✓ %s (merged)", config.FilePromptMd))
+	write.InitMerged(cmd, loop.PromptMd)
 	return nil
 }
 
@@ -111,33 +117,33 @@ func HandlePromptMd(cmd *cobra.Command, force, autoMerge, ralph bool) error {
 // Returns:
 //   - error: Non-nil if markers are missing or file operations fail
 func UpdatePromptSection(cmd *cobra.Command, existing string, newTemplate []byte) error {
-	startIdx := strings.Index(existing, config.PromptMarkerStart)
+	startIdx := strings.Index(existing, marker.PromptMarkerStart)
 	if startIdx == -1 {
-		return fmt.Errorf("prompt start marker not found")
+		return ctxerr.MarkerNotFound("prompt")
 	}
-	endIdx := strings.Index(existing, config.PromptMarkerEnd)
+	endIdx := strings.Index(existing, marker.PromptMarkerEnd)
 	if endIdx == -1 {
 		endIdx = len(existing)
 	} else {
-		endIdx += len(config.PromptMarkerEnd)
+		endIdx += len(marker.PromptMarkerEnd)
 	}
 	templateStr := string(newTemplate)
-	templateStart := strings.Index(templateStr, config.PromptMarkerStart)
-	templateEnd := strings.Index(templateStr, config.PromptMarkerEnd)
+	templateStart := strings.Index(templateStr, marker.PromptMarkerStart)
+	templateEnd := strings.Index(templateStr, marker.PromptMarkerEnd)
 	if templateStart == -1 || templateEnd == -1 {
-		return fmt.Errorf("template missing prompt markers")
+		return ctxerr.TemplateMissingMarkers("prompt")
 	}
-	promptContent := templateStr[templateStart : templateEnd+len(config.PromptMarkerEnd)]
+	promptContent := templateStr[templateStart : templateEnd+len(marker.PromptMarkerEnd)]
 	newContent := existing[:startIdx] + promptContent + existing[endIdx:]
 	timestamp := time.Now().Unix()
-	backupName := fmt.Sprintf("%s.%d.bak", config.FilePromptMd, timestamp)
-	if err := os.WriteFile(backupName, []byte(existing), config.PermFile); err != nil {
-		return fmt.Errorf("failed to create backup: %w", err)
+	backupName := fmt.Sprintf("%s.%d.bak", loop.PromptMd, timestamp)
+	if err := os.WriteFile(backupName, []byte(existing), fs.PermFile); err != nil {
+		return ctxerr.CreateBackupGeneric(err)
 	}
-	cmd.Println(fmt.Sprintf("  ✓ %s (backup)", backupName))
-	if err := os.WriteFile(config.FilePromptMd, []byte(newContent), config.PermFile); err != nil {
-		return fmt.Errorf("failed to update %s: %w", config.FilePromptMd, err)
+	write.InitBackup(cmd, backupName)
+	if err := os.WriteFile(loop.PromptMd, []byte(newContent), fs.PermFile); err != nil {
+		return ctxerr.FileUpdate(loop.PromptMd, err)
 	}
-	cmd.Println(fmt.Sprintf("  ✓ %s (updated prompt section)\n", config.FilePromptMd))
+	write.InitUpdatedPromptSection(cmd, loop.PromptMd)
 	return nil
 }
diff --git a/internal/cli/initialize/core/prompt_tpl.go b/internal/cli/initialize/core/prompt_tpl.go
index 6a2e58a5..39dd93bd 100644
--- a/internal/cli/initialize/core/prompt_tpl.go
+++ b/internal/cli/initialize/core/prompt_tpl.go
@@ -7,14 +7,16 @@
 package core
 
 import (
-	"fmt"
 	"os"
 	"path/filepath"
 
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/assets"
-	"github.com/ActiveMemory/ctx/internal/config"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
+	"github.com/ActiveMemory/ctx/internal/write"
 )
 
 // CreatePromptTemplates creates prompt template files in .context/prompts/.
@@ -27,28 +29,28 @@ import (
 // Returns:
 //   - error: Non-nil if directory creation or file write fails
 func CreatePromptTemplates(cmd *cobra.Command, contextDir string, force bool) error {
-	promptDir := filepath.Join(contextDir, config.DirPrompts)
-	if err := os.MkdirAll(promptDir, config.PermExec); err != nil {
-		return fmt.Errorf("failed to create %s: %w", promptDir, err)
+	promptDir := filepath.Join(contextDir, dir.Prompts)
+	if err := os.MkdirAll(promptDir, fs.PermExec); err != nil {
+		return ctxerr.Mkdir(promptDir, err)
 	}
 	promptTemplates, err := assets.ListPromptTemplates()
 	if err != nil {
-		return fmt.Errorf("failed to list prompt templates: %w", err)
+		return ctxerr.ListPromptTemplates(err)
 	}
 	for _, name := range promptTemplates {
 		targetPath := filepath.Join(promptDir, name)
 		if _, err := os.Stat(targetPath); err == nil && !force {
-			cmd.Println(fmt.Sprintf("  ○ prompts/%s (exists, skipped)", name))
+			write.InitSkipped(cmd, "prompts/"+name)
 			continue
 		}
 		content, err := assets.PromptTemplate(name)
 		if err != nil {
-			return fmt.Errorf("failed to read prompt template %s: %w", name, err)
+			return ctxerr.ReadPromptTemplate(name, err)
 		}
-		if err := os.WriteFile(targetPath, content, config.PermFile); err != nil {
-			return fmt.Errorf("failed to write %s: %w", targetPath, err)
+		if err := os.WriteFile(targetPath, content, fs.PermFile); err != nil {
+			return ctxerr.FileWrite(targetPath, err)
 		}
-		cmd.Println(fmt.Sprintf("  ✓ prompts/%s", name))
+		write.InitCreated(cmd, "prompts/"+name)
 	}
 	return nil
 }
diff --git a/internal/cli/initialize/core/tpl.go b/internal/cli/initialize/core/tpl.go
index 4a67749e..d66d34e3 100644
--- a/internal/cli/initialize/core/tpl.go
+++ b/internal/cli/initialize/core/tpl.go
@@ -7,14 +7,15 @@
 package core
 
 import (
-	"fmt"
 	"os"
 	"path/filepath"
 
+	"github.com/ActiveMemory/ctx/internal/config/fs"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/assets"
-	"github.com/ActiveMemory/ctx/internal/config"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
+	"github.com/ActiveMemory/ctx/internal/write"
 )
 
 // CreateEntryTemplates creates entry template files in .context/templates/.
@@ -28,27 +29,27 @@ import (
 //   - error: Non-nil if directory creation or file write fails
 func CreateEntryTemplates(cmd *cobra.Command, contextDir string, force bool) error {
 	templatesDir := filepath.Join(contextDir, "templates")
-	if err := os.MkdirAll(templatesDir, config.PermExec); err != nil {
-		return fmt.Errorf("failed to create %s: %w", templatesDir, err)
+	if err := os.MkdirAll(templatesDir, fs.PermExec); err != nil {
+		return ctxerr.Mkdir(templatesDir, err)
 	}
 	entryTemplates, err := assets.ListEntry()
 	if err != nil {
-		return fmt.Errorf("failed to list entry templates: %w", err)
+		return ctxerr.ListEntryTemplates(err)
 	}
 	for _, name := range entryTemplates {
 		targetPath := filepath.Join(templatesDir, name)
 		if _, err := os.Stat(targetPath); err == nil && !force {
-			cmd.Println(fmt.Sprintf("  ○ templates/%s (exists, skipped)", name))
+			write.InitSkipped(cmd, "templates/"+name)
 			continue
 		}
 		content, err := assets.Entry(name)
 		if err != nil {
-			return fmt.Errorf("failed to read entry template %s: %w", name, err)
+			return ctxerr.ReadEntryTemplate(name, err)
 		}
-		if err := os.WriteFile(targetPath, content, config.PermFile); err != nil {
-			return fmt.Errorf("failed to write %s: %w", targetPath, err)
+		if err := os.WriteFile(targetPath, content, fs.PermFile); err != nil {
+			return ctxerr.FileWrite(targetPath, err)
 		}
-		cmd.Println(fmt.Sprintf("  ✓ templates/%s", name))
+		write.InitCreated(cmd, "templates/"+name)
 	}
 	return nil
 }
diff --git a/internal/cli/initialize/core/validate.go b/internal/cli/initialize/core/validate.go
index f9005878..c20a2b01 100644
--- a/internal/cli/initialize/core/validate.go
+++ b/internal/cli/initialize/core/validate.go
@@ -10,9 +10,9 @@ import (
 	"os"
 	"os/exec"
 
+	"github.com/ActiveMemory/ctx/internal/config/env"
 	"github.com/spf13/cobra"
 
-	"github.com/ActiveMemory/ctx/internal/config"
 	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 	"github.com/ActiveMemory/ctx/internal/write"
 )
@@ -25,7 +25,7 @@ import (
 // Returns:
 //   - error: Non-nil if ctx is not found in PATH
 func CheckCtxInPath(cmd *cobra.Command) error {
-	if os.Getenv(config.EnvSkipPathCheck) == config.EnvTrue {
+	if os.Getenv(env.SkipPathCheck) == env.True {
 		return nil
 	}
 	_, err := exec.LookPath("ctx")
diff --git a/internal/cli/initialize/init.go b/internal/cli/initialize/init.go
index 947c458d..89da9758 100644
--- a/internal/cli/initialize/init.go
+++ b/internal/cli/initialize/init.go
@@ -9,10 +9,8 @@ package initialize
 import (
 	"github.com/spf13/cobra"
 
-	"github.com/ActiveMemory/ctx/internal/assets"
 	initroot "github.com/ActiveMemory/ctx/internal/cli/initialize/cmd/root"
 	"github.com/ActiveMemory/ctx/internal/cli/initialize/core"
-	"github.com/ActiveMemory/ctx/internal/config"
 )
 
 // PluginInstalled reports whether the ctx plugin is registered in
@@ -31,64 +29,6 @@ var PluginEnabledGlobally = core.PluginEnabledGlobally
 var PluginEnabledLocally = core.PluginEnabledLocally
 
 // Cmd returns the "ctx init" command for initializing a .context/ directory.
-//
-// The command creates template files for maintaining persistent context
-// for AI coding assistants. Files include constitution rules, tasks,
-// decisions, learnings, conventions, and architecture documentation.
-//
-// Flags:
-//   - --force, -f: Overwrite existing context files without prompting
-//   - --minimal, -m: Only create essential files
-//     (TASKS, DECISIONS, CONSTITUTION)
-//   - --merge: Auto-merge ctx content into existing CLAUDE.md and PROMPT.md
-//   - --ralph: Use autonomous loop templates (no clarifying questions,
-//     one-task-per-iteration, completion signals)
-//   - --no-plugin-enable: Skip auto-enabling the ctx plugin in
-//     ~/.claude/settings.json
-//
-// Returns:
-//   - *cobra.Command: Configured init command with flags registered
 func Cmd() *cobra.Command {
-	var (
-		force          bool
-		minimal        bool
-		merge          bool
-		ralph          bool
-		noPluginEnable bool
-	)
-
-	short, long := assets.CommandDesc("initialize")
-	cmd := &cobra.Command{
-		Use:         "init",
-		Short:       short,
-		Annotations: map[string]string{config.AnnotationSkipInit: "true"},
-		Long:        long,
-		RunE: func(cmd *cobra.Command, args []string) error {
-			return initroot.Run(cmd, force, minimal, merge, ralph, noPluginEnable)
-		},
-	}
-
-	cmd.Flags().BoolVarP(
-		&force,
-		"force", "f", false, assets.FlagDesc("initialize.force"),
-	)
-	cmd.Flags().BoolVarP(
-		&minimal,
-		"minimal", "m", false,
-		assets.FlagDesc("initialize.minimal"),
-	)
-	cmd.Flags().BoolVar(
-		&merge, "merge", false,
-		assets.FlagDesc("initialize.merge"),
-	)
-	cmd.Flags().BoolVar(
-		&ralph, "ralph", false,
-		assets.FlagDesc("initialize.ralph"),
-	)
-	cmd.Flags().BoolVar(
-		&noPluginEnable, "no-plugin-enable", false,
-		assets.FlagDesc("initialize.no-plugin-enable"),
-	)
-
-	return cmd
+	return initroot.Cmd()
 }
diff --git a/internal/cli/initialize/init_test.go b/internal/cli/initialize/init_test.go
index 2f69479d..72959fb1 100644
--- a/internal/cli/initialize/init_test.go
+++ b/internal/cli/initialize/init_test.go
@@ -14,7 +14,9 @@ import (
 	"testing"
 
 	"github.com/ActiveMemory/ctx/internal/claude"
-	"github.com/ActiveMemory/ctx/internal/config"
+	claude2 "github.com/ActiveMemory/ctx/internal/config/claude"
+	"github.com/ActiveMemory/ctx/internal/config/ctx"
+	"github.com/ActiveMemory/ctx/internal/config/env"
 )
 
 // TestInitCommand tests the init command creates the .context directory.
@@ -347,7 +349,7 @@ func TestRunInit_Minimal(t *testing.T) {
 	}
 	defer func() { _ = os.Chdir(origDir) }()
 	t.Setenv("HOME", tmpDir)
-	t.Setenv(config.EnvSkipPathCheck, config.EnvTrue)
+	t.Setenv(env.SkipPathCheck, env.True)
 
 	cmd := Cmd()
 	cmd.SetArgs([]string{"--minimal"})
@@ -355,14 +357,14 @@ func TestRunInit_Minimal(t *testing.T) {
 		t.Fatalf("init --minimal failed: %v", err)
 	}
 
-	for _, name := range config.FilesRequired {
+	for _, name := range ctx.FilesRequired {
 		path := filepath.Join(".context", name)
 		if _, err := os.Stat(path); err != nil {
 			t.Errorf("required file %s missing with --minimal: %v", name, err)
 		}
 	}
 
-	glossaryPath := filepath.Join(".context", config.FileGlossary)
+	glossaryPath := filepath.Join(".context", ctx.Glossary)
 	if _, err := os.Stat(glossaryPath); err == nil {
 		t.Error("GLOSSARY.md should not exist with --minimal")
 	}
@@ -381,7 +383,7 @@ func TestRunInit_Force(t *testing.T) {
 	}
 	defer func() { _ = os.Chdir(origDir) }()
 	t.Setenv("HOME", tmpDir)
-	t.Setenv(config.EnvSkipPathCheck, config.EnvTrue)
+	t.Setenv(env.SkipPathCheck, env.True)
 
 	cmd := Cmd()
 	cmd.SetArgs([]string{})
@@ -395,7 +397,7 @@ func TestRunInit_Force(t *testing.T) {
 		t.Fatalf("init --force failed: %v", err)
 	}
 
-	if _, err := os.Stat(filepath.Join(".context", config.FileConstitution)); err != nil {
+	if _, err := os.Stat(filepath.Join(".context", ctx.Constitution)); err != nil {
 		t.Error("CONSTITUTION.md missing after force reinit")
 	}
 }
@@ -413,9 +415,9 @@ func TestRunInit_Merge(t *testing.T) {
 	}
 	defer func() { _ = os.Chdir(origDir) }()
 	t.Setenv("HOME", tmpDir)
-	t.Setenv(config.EnvSkipPathCheck, config.EnvTrue)
+	t.Setenv(env.SkipPathCheck, env.True)
 
-	if err = os.WriteFile(config.FileClaudeMd, []byte("# My Project\n\nExisting.\n"), 0600); err != nil {
+	if err = os.WriteFile(claude2.Md, []byte("# My Project\n\nExisting.\n"), 0600); err != nil {
 		t.Fatal(err)
 	}
 
@@ -425,7 +427,7 @@ func TestRunInit_Merge(t *testing.T) {
 		t.Fatalf("init --merge failed: %v", err)
 	}
 
-	content, _ := os.ReadFile(config.FileClaudeMd)
+	content, _ := os.ReadFile(claude2.Md)
 	if !strings.Contains(string(content), "My Project") {
 		t.Error("original content lost with --merge")
 	}
diff --git a/internal/cli/journal/cmd/obsidian/cmd.go b/internal/cli/journal/cmd/obsidian/cmd.go
index 07e56f12..4f6b7085 100644
--- a/internal/cli/journal/cmd/obsidian/cmd.go
+++ b/internal/cli/journal/cmd/obsidian/cmd.go
@@ -9,10 +9,10 @@ package obsidian
 import (
 	"path/filepath"
 
+	"github.com/ActiveMemory/ctx/internal/config/obsidian"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/assets"
-	"github.com/ActiveMemory/ctx/internal/config"
 	"github.com/ActiveMemory/ctx/internal/rc"
 )
 
@@ -24,19 +24,20 @@ import (
 func Cmd() *cobra.Command {
 	var output string
 
-	short, long := assets.CommandDesc("journal.obsidian")
+	short, long := assets.CommandDesc(assets.CmdDescKeyJournalObsidian)
 	cmd := &cobra.Command{
 		Use:   "obsidian",
 		Short: short,
 		Long:  long,
 		RunE: func(cmd *cobra.Command, args []string) error {
-			return runJournalObsidian(cmd, output)
+			return Run(cmd, output)
 		},
 	}
 
-	defaultOutput := filepath.Join(rc.ContextDir(), config.ObsidianDirName)
+	defaultOutput := filepath.Join(rc.ContextDir(), obsidian.DirName)
 	cmd.Flags().StringVarP(
-		&output, "output", "o", defaultOutput, assets.FlagDesc("journal.obsidian.output"),
+		&output, "output", "o",
+		defaultOutput, assets.FlagDesc(assets.FlagDescKeyJournalObsidianOutput),
 	)
 
 	return cmd
diff --git a/internal/cli/journal/cmd/obsidian/doc.go b/internal/cli/journal/cmd/obsidian/doc.go
new file mode 100644
index 00000000..5ecc50b9
--- /dev/null
+++ b/internal/cli/journal/cmd/obsidian/doc.go
@@ -0,0 +1,11 @@
+//   /    ctx:                         https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//   \    Copyright 2026-present Context contributors.
+//                 SPDX-License-Identifier: Apache-2.0
+
+// Package obsidian implements the ctx journal obsidian subcommand.
+//
+// It generates an Obsidian vault from journal entries, enabling visual
+// exploration of session history.
+package obsidian
diff --git a/internal/cli/journal/cmd/obsidian/run.go b/internal/cli/journal/cmd/obsidian/run.go
index 66a823c2..d3cd25dc 100644
--- a/internal/cli/journal/cmd/obsidian/run.go
+++ b/internal/cli/journal/cmd/obsidian/run.go
@@ -11,10 +11,14 @@ import (
 	"os"
 	"path/filepath"
 
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	"github.com/ActiveMemory/ctx/internal/config/obsidian"
 	"github.com/spf13/cobra"
 
+	"github.com/ActiveMemory/ctx/internal/assets"
 	"github.com/ActiveMemory/ctx/internal/cli/journal/core"
-	"github.com/ActiveMemory/ctx/internal/config"
 	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 	"github.com/ActiveMemory/ctx/internal/rc"
 	"github.com/ActiveMemory/ctx/internal/write"
@@ -24,7 +28,7 @@ import (
 // related sessions footer.
 const ObsidianMaxRelated = 5
 
-// runJournalObsidian generates an Obsidian vault from journal entries.
+// Run generates an Obsidian vault from journal entries.
 //
 // Pipeline:
 //  1. Scan entries (reuse core.ScanJournalEntries)
@@ -42,8 +46,8 @@ const ObsidianMaxRelated = 5
 //
 // Returns:
 //   - error: Non-nil if generation fails
-func runJournalObsidian(cmd *cobra.Command, output string) error {
-	return BuildObsidianVault(cmd, filepath.Join(rc.ContextDir(), config.DirJournal), output)
+func Run(cmd *cobra.Command, output string) error {
+	return BuildObsidianVault(cmd, filepath.Join(rc.ContextDir(), dir.Journal), output)
 }
 
 // BuildObsidianVault generates an Obsidian vault from journal entries in
@@ -73,34 +77,34 @@ func BuildObsidianVault(cmd *cobra.Command, journalDir, output string) error {
 	// Create output directory structure
 	dirs := []string{
 		output,
-		filepath.Join(output, config.ObsidianDirEntries),
-		filepath.Join(output, config.ObsidianConfigDir),
-		filepath.Join(output, config.JournalDirTopics),
-		filepath.Join(output, config.JournalDirFiles),
-		filepath.Join(output, config.JournalDirTypes),
+		filepath.Join(output, obsidian.DirEntries),
+		filepath.Join(output, obsidian.DirConfig),
+		filepath.Join(output, dir.JournTopics),
+		filepath.Join(output, dir.JournalFiles),
+		filepath.Join(output, dir.JournalTypes),
 	}
 	for _, dir := range dirs {
-		if mkErr := os.MkdirAll(dir, config.PermExec); mkErr != nil {
+		if mkErr := os.MkdirAll(dir, fs.PermExec); mkErr != nil {
 			return ctxerr.Mkdir(dir, mkErr)
 		}
 	}
 
 	// Write .obsidian/app.json
 	appConfigPath := filepath.Join(
-		output, config.ObsidianConfigDir, config.ObsidianAppConfigFile,
+		output, obsidian.DirConfig, obsidian.AppConfigFile,
 	)
 	if writeErr := os.WriteFile(
-		appConfigPath, []byte(config.ObsidianAppConfig), config.PermFile,
+		appConfigPath, []byte(assets.ObsidianAppConfig), fs.PermFile,
 	); writeErr != nil {
 		return ctxerr.FileWrite(appConfigPath, writeErr)
 	}
 
 	// Write README
-	readmePath := filepath.Join(output, config.FilenameReadme)
+	readmePath := filepath.Join(output, file.Readme)
 	if writeErr := os.WriteFile(
 		readmePath,
-		[]byte(fmt.Sprintf(config.ObsidianReadme, journalDir)),
-		config.PermFile,
+		[]byte(fmt.Sprintf(assets.ObsidianReadme, journalDir)),
+		fs.PermFile,
 	); writeErr != nil {
 		return ctxerr.FileWrite(readmePath, writeErr)
 	}
@@ -123,7 +127,7 @@ func BuildObsidianVault(cmd *cobra.Command, journalDir, output string) error {
 	// Transform and write entries
 	for _, entry := range entries {
 		src := entry.Path
-		dst := filepath.Join(output, config.ObsidianDirEntries, entry.Filename)
+		dst := filepath.Join(output, obsidian.DirEntries, entry.Filename)
 
 		content, readErr := os.ReadFile(filepath.Clean(src))
 		if readErr != nil {
@@ -144,14 +148,14 @@ func BuildObsidianVault(cmd *cobra.Command, journalDir, output string) error {
 
 		// Transform for Obsidian
 		sourcePath := filepath.Join(
-			config.DirContext, config.DirJournal, entry.Filename,
+			dir.Context, dir.Journal, entry.Filename,
 		)
 		transformed := core.TransformFrontmatter(normalized, sourcePath)
 		transformed = core.ConvertMarkdownLinks(transformed)
 		transformed += core.GenerateRelatedFooter(entry, topicIndex, ObsidianMaxRelated)
 
 		if writeErr := os.WriteFile(
-			dst, []byte(transformed), config.PermFile,
+			dst, []byte(transformed), fs.PermFile,
 		); writeErr != nil {
 			write.WarnFileErr(cmd, entry.Filename, writeErr)
 			continue
@@ -160,11 +164,11 @@ func BuildObsidianVault(cmd *cobra.Command, journalDir, output string) error {
 
 	// Write topic MOC and pages
 	if len(topics) > 0 {
-		topicsDir := filepath.Join(output, config.JournalDirTopics)
-		mocPath := filepath.Join(output, config.ObsidianTopicsMOC)
+		topicsDir := filepath.Join(output, dir.JournTopics)
+		mocPath := filepath.Join(output, obsidian.MOCTopics)
 		if writeErr := os.WriteFile(
 			mocPath, []byte(core.GenerateObsidianTopicsMOC(topics)),
-			config.PermFile,
+			fs.PermFile,
 		); writeErr != nil {
 			return ctxerr.FileWrite(mocPath, writeErr)
 		}
@@ -173,10 +177,10 @@ func BuildObsidianVault(cmd *cobra.Command, journalDir, output string) error {
 			if !t.Popular {
 				continue
 			}
-			pagePath := filepath.Join(topicsDir, t.Name+config.ExtMarkdown)
+			pagePath := filepath.Join(topicsDir, t.Name+file.ExtMarkdown)
 			if writeErr := os.WriteFile(
 				pagePath, []byte(core.GenerateObsidianTopicPage(t)),
-				config.PermFile,
+				fs.PermFile,
 			); writeErr != nil {
 				write.WarnFileErr(cmd, pagePath, writeErr)
 			}
@@ -185,11 +189,11 @@ func BuildObsidianVault(cmd *cobra.Command, journalDir, output string) error {
 
 	// Write key files MOC and pages
 	if len(keyFiles) > 0 {
-		filesDir := filepath.Join(output, config.JournalDirFiles)
-		mocPath := filepath.Join(output, config.ObsidianFilesMOC)
+		filesDir := filepath.Join(output, dir.JournalFiles)
+		mocPath := filepath.Join(output, obsidian.MOCFiles)
 		if writeErr := os.WriteFile(
 			mocPath, []byte(core.GenerateObsidianFilesMOC(keyFiles)),
-			config.PermFile,
+			fs.PermFile,
 		); writeErr != nil {
 			return ctxerr.FileWrite(mocPath, writeErr)
 		}
@@ -199,10 +203,10 @@ func BuildObsidianVault(cmd *cobra.Command, journalDir, output string) error {
 				continue
 			}
 			slug := core.KeyFileSlug(kf.Path)
-			pagePath := filepath.Join(filesDir, slug+config.ExtMarkdown)
+			pagePath := filepath.Join(filesDir, slug+file.ExtMarkdown)
 			if writeErr := os.WriteFile(
 				pagePath, []byte(core.GenerateObsidianFilePage(kf)),
-				config.PermFile,
+				fs.PermFile,
 			); writeErr != nil {
 				write.WarnFileErr(cmd, pagePath, writeErr)
 			}
@@ -211,20 +215,20 @@ func BuildObsidianVault(cmd *cobra.Command, journalDir, output string) error {
 
 	// Write types MOC and pages
 	if len(sessionTypes) > 0 {
-		typesDir := filepath.Join(output, config.JournalDirTypes)
-		mocPath := filepath.Join(output, config.ObsidianTypesMOC)
+		typesDir := filepath.Join(output, dir.JournalTypes)
+		mocPath := filepath.Join(output, obsidian.MOCTypes)
 		if writeErr := os.WriteFile(
 			mocPath, []byte(core.GenerateObsidianTypesMOC(sessionTypes)),
-			config.PermFile,
+			fs.PermFile,
 		); writeErr != nil {
 			return ctxerr.FileWrite(mocPath, writeErr)
 		}
 
 		for _, st := range sessionTypes {
-			pagePath := filepath.Join(typesDir, st.Name+config.ExtMarkdown)
+			pagePath := filepath.Join(typesDir, st.Name+file.ExtMarkdown)
 			if writeErr := os.WriteFile(
 				pagePath,
-				[]byte(core.GenerateObsidianTypePage(st)), config.PermFile,
+				[]byte(core.GenerateObsidianTypePage(st)), fs.PermFile,
 			); writeErr != nil {
 				write.WarnFileErr(cmd, pagePath, writeErr)
 			}
@@ -232,25 +236,19 @@ func BuildObsidianVault(cmd *cobra.Command, journalDir, output string) error {
 	}
 
 	// Write Home.md
-	homePath := filepath.Join(output, config.ObsidianHomeMOC)
+	homePath := filepath.Join(output, obsidian.MOCHome)
 	if writeErr := os.WriteFile(
 		homePath,
 		[]byte(core.GenerateHomeMOC(
 			regularEntries,
 			len(topics) > 0, len(keyFiles) > 0, len(sessionTypes) > 0,
 		)),
-		config.PermFile,
+		fs.PermFile,
 	); writeErr != nil {
 		return ctxerr.FileWrite(homePath, writeErr)
 	}
 
-	cmd.Println(fmt.Sprintf(
-		"\u2713 Generated Obsidian vault with %d entries in %s",
-		len(entries), output,
-	))
-	cmd.Println()
-	cmd.Println("Next steps:")
-	cmd.Println("  Open Obsidian \u2192 Open folder as vault \u2192 Select " + output)
+	write.InfoObsidianGenerated(cmd, len(entries), output)
 
 	return nil
 }
diff --git a/internal/cli/journal/cmd/obsidian/run_test.go b/internal/cli/journal/cmd/obsidian/run_test.go
index 6c973ee7..c7578c59 100644
--- a/internal/cli/journal/cmd/obsidian/run_test.go
+++ b/internal/cli/journal/cmd/obsidian/run_test.go
@@ -12,16 +12,20 @@ import (
 	"strings"
 	"testing"
 
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	"github.com/ActiveMemory/ctx/internal/config/obsidian"
 	"github.com/spf13/cobra"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
 )
 
 func TestRunJournalObsidianIntegration(t *testing.T) {
 	// Create a temporary journal directory with test entries
 	tmpDir := t.TempDir()
-	journalDir := filepath.Join(tmpDir, config.DirContext, config.DirJournal)
-	if mkErr := os.MkdirAll(journalDir, config.PermExec); mkErr != nil {
+	journalDir := filepath.Join(tmpDir, dir.Context, dir.Journal)
+	if mkErr := os.MkdirAll(journalDir, fs.PermExec); mkErr != nil {
 		t.Fatal(mkErr)
 	}
 
@@ -82,7 +86,7 @@ Just a plain session without enrichment.
 
 	for name, content := range entries {
 		path := filepath.Join(journalDir, name)
-		if writeErr := os.WriteFile(path, []byte(content), config.PermFile); writeErr != nil {
+		if writeErr := os.WriteFile(path, []byte(content), fs.PermFile); writeErr != nil {
 			t.Fatal(writeErr)
 		}
 	}
@@ -101,17 +105,17 @@ Just a plain session without enrichment.
 	}
 
 	// Verify vault structure
-	assertFileExists(t, filepath.Join(outputDir, config.ObsidianConfigDir, config.ObsidianAppConfigFile))
-	assertFileExists(t, filepath.Join(outputDir, config.ObsidianHomeMOC))
-	assertFileExists(t, filepath.Join(outputDir, config.FilenameReadme))
+	assertFileExists(t, filepath.Join(outputDir, obsidian.DirConfig, obsidian.AppConfigFile))
+	assertFileExists(t, filepath.Join(outputDir, obsidian.MOCHome))
+	assertFileExists(t, filepath.Join(outputDir, file.Readme))
 
 	// Verify entries were written
-	assertFileExists(t, filepath.Join(outputDir, config.ObsidianDirEntries, "2026-02-14-add-caching-abc12345.md"))
-	assertFileExists(t, filepath.Join(outputDir, config.ObsidianDirEntries, "2026-02-13-fix-cache-def67890.md"))
+	assertFileExists(t, filepath.Join(outputDir, obsidian.DirEntries, "2026-02-14-add-caching-abc12345.md"))
+	assertFileExists(t, filepath.Join(outputDir, obsidian.DirEntries, "2026-02-13-fix-cache-def67890.md"))
 
 	// Verify .obsidian/app.json content
 	appConfig, readErr := os.ReadFile(filepath.Join( //nolint:gosec // test file path
-		outputDir, config.ObsidianConfigDir, config.ObsidianAppConfigFile))
+		outputDir, obsidian.DirConfig, obsidian.AppConfigFile))
 	if readErr != nil {
 		t.Fatal(readErr)
 	}
@@ -120,7 +124,7 @@ Just a plain session without enrichment.
 	}
 
 	// Verify Home.md contains wikilinks
-	home, readErr := os.ReadFile(filepath.Join(outputDir, config.ObsidianHomeMOC)) //nolint:gosec // test file path
+	home, readErr := os.ReadFile(filepath.Join(outputDir, obsidian.MOCHome)) //nolint:gosec // test file path
 	if readErr != nil {
 		t.Fatal(readErr)
 	}
@@ -131,7 +135,7 @@ Just a plain session without enrichment.
 
 	// Verify entry has transformed frontmatter (topics -> tags)
 	entry1Out, readErr := os.ReadFile(filepath.Join( //nolint:gosec // test file path
-		outputDir, config.ObsidianDirEntries, "2026-02-14-add-caching-abc12345.md"))
+		outputDir, obsidian.DirEntries, "2026-02-14-add-caching-abc12345.md"))
 	if readErr != nil {
 		t.Fatal(readErr)
 	}
@@ -147,13 +151,13 @@ Just a plain session without enrichment.
 	}
 
 	// Verify entry has related footer
-	if !strings.Contains(entry1Str, config.ObsidianRelatedHeading) {
+	if !strings.Contains(entry1Str, assets.ObsidianRelatedHeading) {
 		t.Error("entry missing related sessions footer")
 	}
 
 	// Verify topic MOC was created (caching has 2 entries = popular)
-	assertFileExists(t, filepath.Join(outputDir, config.ObsidianTopicsMOC))
-	topicsMOC, readErr := os.ReadFile(filepath.Join(outputDir, config.ObsidianTopicsMOC)) //nolint:gosec // test file path
+	assertFileExists(t, filepath.Join(outputDir, obsidian.MOCTopics))
+	topicsMOC, readErr := os.ReadFile(filepath.Join(outputDir, obsidian.MOCTopics)) //nolint:gosec // test file path
 	if readErr != nil {
 		t.Fatal(readErr)
 	}
@@ -163,7 +167,7 @@ Just a plain session without enrichment.
 
 	// Verify popular topic page was created
 	assertFileExists(t, filepath.Join(
-		outputDir, config.JournalDirTopics, "caching.md"))
+		outputDir, dir.JournTopics, "caching.md"))
 }
 
 func assertFileExists(t *testing.T, path string) {
diff --git a/internal/cli/journal/cmd/site/cmd.go b/internal/cli/journal/cmd/site/cmd.go
index 55115129..4d2417a1 100644
--- a/internal/cli/journal/cmd/site/cmd.go
+++ b/internal/cli/journal/cmd/site/cmd.go
@@ -26,7 +26,7 @@ func Cmd() *cobra.Command {
 		build  bool
 	)
 
-	short, long := assets.CommandDesc("journal.site")
+	short, long := assets.CommandDesc(assets.CmdDescKeyJournalSite)
 	cmd := &cobra.Command{
 		Use:   "site",
 		Short: short,
@@ -38,13 +38,13 @@ func Cmd() *cobra.Command {
 
 	defaultOutput := filepath.Join(rc.ContextDir(), "journal-site")
 	cmd.Flags().StringVarP(
-		&output, "output", "o", defaultOutput, assets.FlagDesc("journal.site.output"),
+		&output, "output", "o", defaultOutput, assets.FlagDesc(assets.FlagDescKeyJournalSiteOutput),
 	)
 	cmd.Flags().BoolVar(
-		&build, "build", false, assets.FlagDesc("journal.site.build"),
+		&build, "build", false, assets.FlagDesc(assets.FlagDescKeyJournalSiteBuild),
 	)
 	cmd.Flags().BoolVar(
-		&serve, "serve", false, assets.FlagDesc("journal.site.serve"),
+		&serve, "serve", false, assets.FlagDesc(assets.FlagDescKeyJournalSiteServe),
 	)
 
 	return cmd
diff --git a/internal/cli/journal/cmd/site/run.go b/internal/cli/journal/cmd/site/run.go
index 43200d62..4db28e00 100644
--- a/internal/cli/journal/cmd/site/run.go
+++ b/internal/cli/journal/cmd/site/run.go
@@ -7,25 +7,24 @@
 package site
 
 import (
-	_ "embed"
-	"fmt"
 	"os"
 	"os/exec"
 	"path/filepath"
 
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	"github.com/ActiveMemory/ctx/internal/config/zensical"
 	"github.com/spf13/cobra"
 
+	"github.com/ActiveMemory/ctx/internal/assets"
 	"github.com/ActiveMemory/ctx/internal/cli/journal/core"
-	"github.com/ActiveMemory/ctx/internal/config"
 	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 	"github.com/ActiveMemory/ctx/internal/journal/state"
 	"github.com/ActiveMemory/ctx/internal/rc"
 	"github.com/ActiveMemory/ctx/internal/write"
 )
 
-//go:embed extra.css
-var extraCSS []byte
-
 // runZensical executes zensical build or serve in the output directory.
 //
 // Parameters:
@@ -36,12 +35,13 @@ var extraCSS []byte
 //   - error: Non-nil if zensical is not found or fails
 func runZensical(dir, command string) error {
 	// Check if zensical is available
-	_, lookErr := exec.LookPath(config.BinZensical)
+	_, lookErr := exec.LookPath(zensical.Bin)
 	if lookErr != nil {
 		return ctxerr.ZensicalNotFound()
 	}
 
-	cmd := exec.Command(config.BinZensical, command) //nolint:gosec // G204: binary is a constant, command is from caller
+	// G204: binary is a constant, command is from the caller
+	cmd := exec.Command(zensical.Bin, command) //nolint:gosec
 	cmd.Dir = dir
 	cmd.Stdout = os.Stdout
 	cmd.Stderr = os.Stderr
@@ -66,7 +66,7 @@ func runZensical(dir, command string) error {
 func runJournalSite(
 	cmd *cobra.Command, output string, build, serve bool,
 ) error {
-	journalDir := filepath.Join(rc.ContextDir(), config.DirJournal)
+	journalDir := filepath.Join(rc.ContextDir(), dir.Journal)
 
 	// Check if the journal directory exists
 	if _, statErr := os.Stat(journalDir); os.IsNotExist(statErr) {
@@ -76,7 +76,7 @@ func runJournalSite(
 	// Load journal state for per-file processing flags
 	jstate, loadErr := state.Load(journalDir)
 	if loadErr != nil {
-		return fmt.Errorf("load journal state: %w", loadErr)
+		return ctxerr.LoadJournalStateErr(loadErr)
 	}
 
 	// Scan journal files
@@ -90,28 +90,32 @@ func runJournalSite(
 	}
 
 	// Create output directory structure
-	docsDir := filepath.Join(output, config.JournalDirDocs)
-	if mkErr := os.MkdirAll(docsDir, config.PermExec); mkErr != nil {
+	docsDir := filepath.Join(output, dir.JournalDocs)
+	if mkErr := os.MkdirAll(docsDir, fs.PermExec); mkErr != nil {
 		return ctxerr.Mkdir(docsDir, mkErr)
 	}
 
-	// Write stylesheet for overflow control
-	stylesDir := filepath.Join(docsDir, "stylesheets")
-	if mkErr := os.MkdirAll(stylesDir, config.PermExec); mkErr != nil {
+	// Write the stylesheet for overflow control
+	stylesDir := filepath.Join(docsDir, zensical.Stylesheets)
+	if mkErr := os.MkdirAll(stylesDir, fs.PermExec); mkErr != nil {
 		return ctxerr.Mkdir(stylesDir, mkErr)
 	}
-	cssPath := filepath.Join(stylesDir, "extra.css")
+	cssPath := filepath.Join(stylesDir, zensical.ExtraCSS)
+	cssData, cssReadErr := assets.JournalExtraCSS()
+	if cssReadErr != nil {
+		return cssReadErr
+	}
 	if writeErr := os.WriteFile(
-		cssPath, extraCSS, config.PermFile,
+		cssPath, cssData, fs.PermFile,
 	); writeErr != nil {
 		return ctxerr.FileWrite(cssPath, writeErr)
 	}
 
 	// Write README
-	readmePath := filepath.Join(output, config.FilenameReadme)
+	readmePath := filepath.Join(output, file.Readme)
 	if writeErr := os.WriteFile(
 		readmePath,
-		[]byte(core.GenerateSiteReadme(journalDir)), config.PermFile,
+		[]byte(core.GenerateSiteReadme(journalDir)), fs.PermFile,
 	); writeErr != nil {
 		return ctxerr.FileWrite(readmePath, writeErr)
 	}
@@ -141,21 +145,21 @@ func runJournalSite(
 		)
 		if normalized != string(content) {
 			if writeErr := os.WriteFile(
-				src, []byte(normalized), config.PermFile,
+				src, []byte(normalized), fs.PermFile,
 			); writeErr != nil {
 				write.WarnFileErr(cmd, entry.Filename, writeErr)
 			}
 		}
 
 		// Generate site copy with Markdown fixes
-		fv := jstate.IsFencesVerified(entry.Filename)
+		fv := jstate.FencesVerified(entry.Filename)
 		withLinks := core.InjectSourceLink(normalized, src)
 		if entry.Summary != "" {
 			withLinks = core.InjectSummary(withLinks, entry.Summary)
 		}
 		siteContent := core.NormalizeContent(withLinks, fv)
 		if writeErr := os.WriteFile(
-			dst, []byte(siteContent), config.PermFile,
+			dst, []byte(siteContent), fs.PermFile,
 		); writeErr != nil {
 			write.WarnFileErr(cmd, entry.Filename, writeErr)
 			continue
@@ -164,7 +168,7 @@ func runJournalSite(
 
 	// Remove orphan site files — entries whose source was renamed or deleted.
 	knownFiles := make(map[string]bool, len(entries)+1)
-	knownFiles[config.FilenameIndex] = true
+	knownFiles[file.Index] = true
 	for _, e := range entries {
 		knownFiles[e.Filename] = true
 	}
@@ -175,16 +179,16 @@ func runJournalSite(
 			}
 			orphanPath := filepath.Join(docsDir, f.Name())
 			if rmErr := os.Remove(orphanPath); rmErr == nil {
-				cmd.Println(fmt.Sprintf("  removed orphan: %s", f.Name()))
+				write.InfoJournalOrphanRemoved(cmd, f.Name())
 			}
 		}
 	}
 
 	// Generate index.md
 	indexContent := core.GenerateIndex(entries)
-	indexPath := filepath.Join(docsDir, config.FilenameIndex)
+	indexPath := filepath.Join(docsDir, file.Index)
 	if writeErr := os.WriteFile(
-		indexPath, []byte(indexContent), config.PermFile,
+		indexPath, []byte(indexContent), fs.PermFile,
 	); writeErr != nil {
 		return ctxerr.FileWrite(indexPath, writeErr)
 	}
@@ -202,17 +206,17 @@ func runJournalSite(
 
 	if len(topics) > 0 {
 		if writeErr := core.WriteSection(
-			docsDir, config.JournalDirTopics,
+			docsDir, dir.JournTopics,
 			core.GenerateTopicsIndex(topics),
 			func(dir string) {
 				for _, t := range topics {
 					if !t.Popular {
 						continue
 					}
-					pagePath := filepath.Join(dir, t.Name+config.ExtMarkdown)
+					pagePath := filepath.Join(dir, t.Name+file.ExtMarkdown)
 					if pageErr := os.WriteFile(
 						pagePath, []byte(core.GenerateTopicPage(t)),
-						config.PermFile,
+						fs.PermFile,
 					); pageErr != nil {
 						write.WarnFileErr(cmd, pagePath, pageErr)
 					}
@@ -235,7 +239,7 @@ func runJournalSite(
 
 	if len(keyFiles) > 0 {
 		if writeErr := core.WriteSection(
-			docsDir, config.JournalDirFiles,
+			docsDir, dir.JournalFiles,
 			core.GenerateKeyFilesIndex(keyFiles),
 			func(dir string) {
 				for _, kf := range keyFiles {
@@ -243,11 +247,11 @@ func runJournalSite(
 						continue
 					}
 					slug := core.KeyFileSlug(kf.Path)
-					pagePath := filepath.Join(dir, slug+config.ExtMarkdown)
+					pagePath := filepath.Join(dir, slug+file.ExtMarkdown)
 					if pageErr := os.WriteFile(
 						pagePath, []byte(
 							core.GenerateKeyFilePage(kf)),
-						config.PermFile,
+						fs.PermFile,
 					); pageErr != nil {
 						write.WarnFileErr(cmd, pagePath, pageErr)
 					}
@@ -271,14 +275,14 @@ func runJournalSite(
 	if len(sessionTypes) > 0 {
 		if writeErr := core.WriteSection(
 			docsDir,
-			config.JournalDirTypes,
+			dir.JournalTypes,
 			core.GenerateTypesIndex(sessionTypes),
 			func(dir string) {
 				for _, st := range sessionTypes {
-					pagePath := filepath.Join(dir, st.Name+config.ExtMarkdown)
+					pagePath := filepath.Join(dir, st.Name+file.ExtMarkdown)
 					if pageErr := os.WriteFile(
 						pagePath,
-						[]byte(core.GenerateTypePage(st)), config.PermFile,
+						[]byte(core.GenerateTypePage(st)), fs.PermFile,
 					); pageErr != nil {
 						write.WarnFileErr(cmd, pagePath, pageErr)
 					}
@@ -292,35 +296,23 @@ func runJournalSite(
 	tomlContent := core.GenerateZensicalToml(
 		entries, topics, keyFiles, sessionTypes,
 	)
-	tomlPath := filepath.Join(output, config.FileZensicalToml)
+	tomlPath := filepath.Join(output, zensical.Toml)
 	if writeErr := os.WriteFile(
 		tomlPath,
-		[]byte(tomlContent), config.PermFile,
+		[]byte(tomlContent), fs.PermFile,
 	); writeErr != nil {
 		return ctxerr.FileWrite(tomlPath, writeErr)
 	}
 
-	cmd.Println(fmt.Sprintf(
-		"\u2713 Generated site with %d entries in %s",
-		len(entries), output,
-	))
-
-	// Build or serve if requested
 	if serve {
-		cmd.Println()
-		cmd.Println("Starting local server...")
+		write.InfoJournalSiteStarting(cmd)
 		return runZensical(output, "serve")
 	} else if build {
-		cmd.Println()
-		cmd.Println("Building site...")
+		write.InfoJournalSiteBuilding(cmd)
 		return runZensical(output, "build")
 	}
 
-	cmd.Println()
-	cmd.Println("Next steps:")
-	cmd.Println(fmt.Sprintf("  cd %s && %s serve", output, config.BinZensical))
-	cmd.Println("  or")
-	cmd.Println("  ctx journal site --serve")
+	write.InfoJournalSiteGenerated(cmd, len(entries), output, zensical.Bin)
 
 	return nil
 }
diff --git a/internal/cli/journal/core/collapse.go b/internal/cli/journal/core/collapse.go
index cb367c3c..c6cb5e71 100644
--- a/internal/cli/journal/core/collapse.go
+++ b/internal/cli/journal/core/collapse.go
@@ -10,7 +10,10 @@ import (
 	"fmt"
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/journal"
+	"github.com/ActiveMemory/ctx/internal/config/regex"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )
 
 // CollapseToolOutputs wraps long Tool Output turn bodies in collapsible
@@ -23,13 +26,13 @@ import (
 // Returns:
 //   - string: Content with long tool outputs wrapped in <details> tags
 func CollapseToolOutputs(content string) string {
-	lines := strings.Split(content, config.NewlineLF)
+	lines := strings.Split(content, token.NewlineLF)
 	var out []string
 	i := 0
 	for i < len(lines) {
 		trimmed := strings.TrimSpace(lines[i])
-		matches := config.RegExTurnHeader.FindStringSubmatch(trimmed)
+		matches := regex.TurnHeader.FindStringSubmatch(trimmed)
 
 		// Non-header lines pass through unchanged
 		if matches == nil {
@@ -49,7 +52,7 @@ func CollapseToolOutputs(content string) string {
 		}
 		bodyEnd := bodyStart
 		for bodyEnd < len(lines) {
-			if config.RegExTurnHeader.MatchString(
+			if regex.TurnHeader.MatchString(
 				strings.TrimSpace(lines[bodyEnd]),
 			) {
 				break
@@ -58,7 +61,7 @@
 		}
 
 		// Non-tool-output turns pass through unchanged
-		if role != config.LabelToolOutput {
+		if role != assets.ToolOutput {
 			for k := i; k < bodyEnd; k++ {
 				out = append(out, lines[k])
 			}
@@ -75,23 +78,23 @@ func CollapseToolOutputs(content string) string {
 		}
 
 		body := strings.TrimSpace(
-			strings.Join(lines[bodyStart:bodyEnd], config.NewlineLF),
+			strings.Join(lines[bodyStart:bodyEnd], token.NewlineLF),
 		)
 		alreadyWrapped := strings.HasPrefix(body, "<details>")
-		if nonBlank > config.RecallDetailsThreshold && !alreadyWrapped {
+		if nonBlank > journal.DetailsThreshold && !alreadyWrapped {
 			summary := fmt.Sprintf(
-				config.TplRecallDetailsSummary, nonBlank,
+				assets.TplRecallDetailsSummary, nonBlank,
 			)
 			out = append(out, header, "")
 			out = append(out,
-				fmt.Sprintf(config.TplRecallDetailsOpen, summary),
+				fmt.Sprintf(assets.TplRecallDetailsOpen, summary),
 			)
 			out = append(out, "")
 			for k := bodyStart; k < bodyEnd; k++ {
 				out = append(out, lines[k])
 			}
-			out = append(out, config.TplRecallDetailsClose, "")
+			out = append(out, assets.TplRecallDetailsClose, "")
 		} else {
 			for k := i; k < bodyEnd; k++ {
 				out = append(out, lines[k])
@@ -101,5 +104,5 @@ func CollapseToolOutputs(content string) string {
 		i = bodyEnd
 	}
 
-	return strings.Join(out, config.NewlineLF)
+	return strings.Join(out, token.NewlineLF)
 }
diff --git a/internal/cli/journal/core/collapse_test.go b/internal/cli/journal/core/collapse_test.go
index 940ae153..0d8a4ad0 100644
--- a/internal/cli/journal/core/collapse_test.go
+++ b/internal/cli/journal/core/collapse_test.go
@@ -11,7 +11,9 @@ import (
 	"strings"
 	"testing"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/journal"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )
 
 // helper: build a turn header line.
@@ -25,11 +27,11 @@ func bodyLines(n int) string {
 	for i := 1; i <= n; i++ {
 		lines = append(lines, fmt.Sprintf("line %d", i))
 	}
-	return strings.Join(lines, config.NewlineLF)
+	return strings.Join(lines, token.NewlineLF)
 }
 
 func TestCollapseToolOutputs_LongOutputWrapped(t *testing.T) {
-	header := turnHeader(1, config.LabelToolOutput, "10:00:00")
+	header := turnHeader(1, assets.ToolOutput, "10:00:00")
 	body := bodyLines(12)
 	input := header + "\n\n" + body + "\n"
@@ -53,7 +55,7 @@ func TestCollapseToolOutputs_LongOutputWrapped(t *testing.T) {
 }
 
 func TestCollapseToolOutputs_ShortOutputUnchanged(t *testing.T) {
-	header := turnHeader(1, config.LabelToolOutput, "10:00:00")
+	header := turnHeader(1, assets.ToolOutput, "10:00:00")
 	body := bodyLines(5)
 	input := header + "\n\n" + body + "\n"
@@ -68,8 +70,8 @@ func TestCollapseToolOutputs_ShortOutputUnchanged(t *testing.T) {
 }
 
 func TestCollapseToolOutputs_ExactThresholdUnchanged(t *testing.T) {
-	header := turnHeader(1, config.LabelToolOutput, "10:00:00")
-	body := bodyLines(config.RecallDetailsThreshold)
+	header := turnHeader(1, assets.ToolOutput, "10:00:00")
+	body := bodyLines(journal.DetailsThreshold)
 	input := header + "\n\n" + body + "\n"
 
 	got := CollapseToolOutputs(input)
@@ -80,7 +82,7 @@ func TestCollapseToolOutputs_ExactThresholdUnchanged(t *testing.T) {
 }
 
 func TestCollapseToolOutputs_AlreadyWrappedNotDoubled(t *testing.T) {
-	header := turnHeader(1, config.LabelToolOutput, "10:00:00")
+	header := turnHeader(1, assets.ToolOutput, "10:00:00")
 	body := "<details>\n15 lines\n\n" + bodyLines(15) + "\n</details>"
 	input := header + "\n\n" + body + "\n"
@@ -107,9 +109,9 @@ func TestCollapseToolOutputs_NonToolTurnsUntouched(t *testing.T) {
 }
 
 func TestCollapseToolOutputs_MixedTurns(t *testing.T) {
-	short := turnHeader(1, config.LabelToolOutput, "10:00:00") +
+	short := turnHeader(1, assets.ToolOutput, "10:00:00") +
 		"\n\n" + bodyLines(3) + "\n"
-	long := turnHeader(2, config.LabelToolOutput, "10:01:00") +
+	long := turnHeader(2, assets.ToolOutput, "10:01:00") +
 		"\n\n" + bodyLines(15) + "\n"
 	user := turnHeader(3, "User", "10:02:00") +
 		"\n\n" + bodyLines(20) + "\n"
diff --git a/internal/cli/journal/core/consolidate.go b/internal/cli/journal/core/consolidate.go
index 654a761d..cf94aa8f 100644
--- a/internal/cli/journal/core/consolidate.go
+++ b/internal/cli/journal/core/consolidate.go
@@ -10,7 +10,8 @@ import (
 	"fmt"
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/regex"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )
 
 // ConsolidateToolRuns collapses consecutive turns with identical body content
@@ -23,13 +24,13 @@ import (
 // Returns:
 //   - string: Content with consecutive identical turns collapsed
 func ConsolidateToolRuns(content string) string {
-	lines := strings.Split(content, config.NewlineLF)
+	lines := strings.Split(content, token.NewlineLF)
 	var out []string
 
 	i := 0
 	for i < len(lines) {
 		// Check if this line is a turn header
-		if !config.RegExTurnHeader.MatchString(strings.TrimSpace(lines[i])) {
+		if !regex.TurnHeader.MatchString(strings.TrimSpace(lines[i])) {
 			out = append(out, lines[i])
 			i++
 			continue
@@ -43,7 +44,7 @@ func ConsolidateToolRuns(content string) string {
 		count := 1
 		j := bodyEnd
 		for j < len(lines) {
-			if !config.RegExTurnHeader.MatchString(strings.TrimSpace(lines[j])) {
+			if !regex.TurnHeader.MatchString(strings.TrimSpace(lines[j])) {
 				break
 			}
 			nextBody, nextBodyEnd := ExtractTurnBody(lines, j+1)
@@ -57,7 +58,7 @@ func ConsolidateToolRuns(content string) string {
 		if count > 1 {
 			out = append(out,
 				header, "", body, "",
-				fmt.Sprintf("(\u00d7%d)", count), "",
+				fmt.Sprintf("(×%d)", count), "",
 			)
 		} else {
 			// Keep original lines (preserves blank lines as-is)
@@ -68,5 +69,5 @@
 		i = j
 	}
 
-	return strings.Join(out, config.NewlineLF)
+	return strings.Join(out, token.NewlineLF)
 }
diff --git a/internal/cli/journal/core/fmt.go b/internal/cli/journal/core/fmt.go
index 1df22ab5..5381e1e2 100644
--- a/internal/cli/journal/core/fmt.go
+++ b/internal/cli/journal/core/fmt.go
@@ -10,7 +10,9 @@ import (
 	"fmt"
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )
 
 // FormatSize formats a file size in human-readable form.
@@ -59,6 +61,6 @@ func KeyFileSlug(path string) string {
 // Returns:
 //   - string: Formatted line (e.g., "- [topic](topic.md) (3 sessions)\n")
 func FormatSessionLink(label, slug string, count int) string {
-	return fmt.Sprintf("- [%s](%s%s) (%d sessions)%s",
-		label, slug, config.ExtMarkdown, count, config.NewlineLF)
+	return fmt.Sprintf(assets.TextDesc(assets.TextDescKeyJournalMocSessionLink),
+		label, slug, file.ExtMarkdown, count, token.NewlineLF)
 }
diff --git a/internal/cli/journal/core/frontmatter.go b/internal/cli/journal/core/frontmatter.go
index fb1a82dd..62190b34 100644
--- a/internal/cli/journal/core/frontmatter.go
+++ b/internal/cli/journal/core/frontmatter.go
@@ -9,25 +9,11 @@ package core
 import (
 	"strings"
 
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"gopkg.in/yaml.v3"
-
-	"github.com/ActiveMemory/ctx/internal/config"
 )
 
-// ObsidianFrontmatter represents the YAML frontmatter for Obsidian vault
-// entries. Extends JournalFrontmatter with Obsidian-specific fields.
-type ObsidianFrontmatter struct {
-	Title        string   `yaml:"title"`
-	Date         string   `yaml:"date"`
-	Type         string   `yaml:"type,omitempty"`
-	Outcome      string   `yaml:"outcome,omitempty"`
-	Tags         []string `yaml:"tags,omitempty"`
-	Technologies []string `yaml:"technologies,omitempty"`
-	KeyFiles     []string `yaml:"key_files,omitempty"`
-	Aliases      []string `yaml:"aliases,omitempty"`
-	SourceFile   string   `yaml:"source_file,omitempty"`
-}
-
 // TransformFrontmatter converts journal frontmatter to Obsidian format.
 //
 // Changes applied:
@@ -43,20 +29,20 @@ type ObsidianFrontmatter struct {
 // Returns:
 //   - string: Content with transformed frontmatter
 func TransformFrontmatter(content, sourcePath string) string {
-	nl := config.NewlineLF
-	fmOpen := len(config.Separator + nl)
+	nl := token.NewlineLF
+	fmOpen := len(token.Separator + nl)
 
-	if !strings.HasPrefix(content, config.Separator+nl) {
+	if !strings.HasPrefix(content, token.Separator+nl) {
 		return content
 	}
 
-	endIdx := strings.Index(content[fmOpen:], nl+config.Separator+nl)
+	endIdx := strings.Index(content[fmOpen:], nl+token.Separator+nl)
 	if endIdx < 0 {
 		return content
 	}
 
 	fmRaw := content[fmOpen : fmOpen+endIdx]
-	afterFM := content[fmOpen+endIdx+len(nl+config.Separator+nl):]
+	afterFM := content[fmOpen+endIdx+len(nl+token.Separator+nl):]
 
 	// Parse the original frontmatter into a generic map to preserve
 	// unknown fields, then extract known fields for transformation.
@@ -68,26 +54,26 @@ func TransformFrontmatter(content, sourcePath string) string {
 	// Build the Obsidian frontmatter
 	ofm := ObsidianFrontmatter{}
-	if v, ok := raw["title"].(string); ok {
+	if v, ok := raw[assets.FrontmatterTitle].(string); ok {
 		ofm.Title = v
 	}
-	if v, ok := raw["date"].(string); ok {
+	if v, ok := raw[assets.FrontmatterDate].(string); ok {
 		ofm.Date = v
 	}
-	if v, ok := raw["type"].(string); ok {
+	if v, ok := raw[assets.FrontmatterType].(string); ok {
 		ofm.Type = v
 	}
-	if v, ok := raw["outcome"].(string); ok {
+	if v, ok := raw[assets.FrontmatterOutcome].(string); ok {
 		ofm.Outcome = v
 	}
 
 	// topics -> tags
-	ofm.Tags = ExtractStringSlice(raw, "topics")
+	ofm.Tags = ExtractStringSlice(raw, assets.FrontmatterTopics)
 
-	ofm.Technologies = ExtractStringSlice(raw, "technologies")
-	ofm.KeyFiles = ExtractStringSlice(raw, "key_files")
+	ofm.Technologies = ExtractStringSlice(raw, assets.FrontmatterTechnologies)
+	ofm.KeyFiles = ExtractStringSlice(raw, assets.FrontmatterKeyFiles)
 
-	// Add aliases from title
+	// Add aliases from the title
 	if ofm.Title != "" {
 		ofm.Aliases = []string{ofm.Title}
 	}
@@ -103,9 +89,9 @@ func TransformFrontmatter(content, sourcePath string) string {
 	}
 
 	var sb strings.Builder
-	sb.WriteString(config.Separator + nl)
+	sb.WriteString(token.Separator + nl)
 	sb.Write(out)
-	sb.WriteString(config.Separator + nl)
+	sb.WriteString(token.Separator + nl)
 	sb.WriteString(afterFM)
 
 	return sb.String()
diff --git a/internal/cli/journal/core/generate.go b/internal/cli/journal/core/generate.go
index b684c9f6..1d9f32ac 100644
--- a/internal/cli/journal/core/generate.go
+++ b/internal/cli/journal/core/generate.go
@@ -12,7 +12,12 @@ import (
 	"strings"
 	"unicode/utf8"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/journal"
+	"github.com/ActiveMemory/ctx/internal/config/token"
+	"github.com/ActiveMemory/ctx/internal/config/zensical"
 )
 
 // GenerateSiteReadme creates a README for the journal-site directory.
@@ -23,7 +28,7 @@ import (
 // Returns:
 //   - string: Markdown README content with regeneration instructions
 func GenerateSiteReadme(journalDir string) string {
-	return fmt.Sprintf(config.TplJournalSiteReadme, journalDir)
+	return fmt.Sprintf(assets.TplJournalSiteReadme, journalDir)
 }
 
 // GenerateIndex creates the index.md content for the journal site.
@@ -35,7 +40,7 @@ func GenerateSiteReadme(journalDir string) string {
 //   - string: Markdown content for index.md
 func GenerateIndex(entries []JournalEntry) string {
 	var sb strings.Builder
-	nl := config.NewlineLF
+	nl := token.NewlineLF
 
 	// Separate regular sessions from suggestions and multi-part continuations
 	var regular, suggestions []JournalEntry
@@ -51,16 +56,16 @@ func GenerateIndex(entries []JournalEntry) string {
 		}
 	}
 
-	sb.WriteString(config.JournalHeadingSessionJournal + nl + nl)
-	sb.WriteString(config.TplJournalIndexIntro + nl + nl)
-	sb.WriteString(fmt.Sprintf(config.TplJournalIndexStats+
+	sb.WriteString(assets.JournalHeadingSessionJournal + nl + nl)
+	sb.WriteString(assets.TplJournalIndexIntro + nl + nl)
+	sb.WriteString(fmt.Sprintf(assets.TplJournalIndexStats+
 		nl+nl, len(regular), len(suggestions)))
 
 	// Group regular sessions by month
 	months, monthOrder := GroupByMonth(regular)
 
 	for _, month := range monthOrder {
-		sb.WriteString(fmt.Sprintf(config.TplJournalMonthHeading+nl+nl, month))
+		sb.WriteString(fmt.Sprintf(assets.TplJournalMonthHeading+nl+nl, month))
 
 		for _, e := range months[month] {
 			sb.WriteString(FormatIndexEntry(e, nl))
@@ -70,9 +75,9 @@ func GenerateIndex(entries []JournalEntry) string {
 
 	// Suggestions section
 	if len(suggestions) > 0 {
-		sb.WriteString(config.Separator + nl + nl)
-		sb.WriteString(config.JournalHeadingSuggestions + nl + nl)
-		sb.WriteString(config.TplJournalSuggestionsNote + nl + nl)
+		sb.WriteString(token.Separator + nl + nl)
+		sb.WriteString(assets.JournalHeadingSuggestions + nl + nl)
+		sb.WriteString(assets.TplJournalSuggestionsNote + nl + nl)
 
 		for _, e := range suggestions {
 			sb.WriteString(FormatIndexEntry(e, nl))
@@ -92,11 +97,11 @@ func GenerateIndex(entries []JournalEntry) string {
 // Returns:
 //   - string: Formatted line (e.g., "- 14:30 [title](link.md) (project) `1.2KB`")
 func FormatIndexEntry(e JournalEntry, nl string) string {
-	link := strings.TrimSuffix(e.Filename, config.ExtMarkdown)
+	link := strings.TrimSuffix(e.Filename, file.ExtMarkdown)
 
 	timeStr := ""
-	if e.Time != "" && len(e.Time) >= config.JournalTimePrefixLen {
-		timeStr = e.Time[:config.JournalTimePrefixLen] + " "
+	if e.Time != "" && len(e.Time) >= journal.TimePrefixLen {
+		timeStr = e.Time[:journal.TimePrefixLen] + " "
 	}
 
 	project := ""
@@ -107,10 +112,10 @@ func FormatIndexEntry(e JournalEntry, nl string) string {
 	size := FormatSize(e.Size)
 
 	line := fmt.Sprintf(
-		config.TplJournalIndexEntry+nl, timeStr, e.Title, link, project, size,
+		assets.TplJournalIndexEntry+nl, timeStr, e.Title, link, project, size,
 	)
 	if e.Summary != "" {
-		line += fmt.Sprintf(config.TplJournalIndexSummary+nl, e.Summary)
+		line += fmt.Sprintf(assets.TplJournalIndexSummary+nl, e.Summary)
 	}
 	return line
 }
@@ -125,17 +130,17 @@ func FormatIndexEntry(e JournalEntry, nl string) string {
 // Returns:
 //   - string: Content with the summary admonition injected
 func InjectSummary(content, summary string) string {
-	nl := config.NewlineLF
+	nl := token.NewlineLF
 	admonition := fmt.Sprintf(
-		config.TplJournalSummaryAdmonition+nl+nl, summary,
+		assets.TplJournalSummaryAdmonition+nl+nl, summary,
 	)
 
 	// Insert after frontmatter closing delimiter
-	fmOpen := len(config.Separator + nl)
-	fmClose := len(nl + config.Separator + nl)
-	if strings.HasPrefix(content, config.Separator+nl) {
+	fmOpen := len(token.Separator + nl)
+	fmClose := len(nl + token.Separator + nl)
+	if strings.HasPrefix(content, token.Separator+nl) {
 		if end := strings.Index(content[fmOpen:], nl+
-			config.Separator+nl); end >= 0 {
+			token.Separator+nl); end >= 0 {
 			insertAt := fmOpen + end + fmClose
 			// Skip past any existing blank lines + source link after frontmatter
 			rest := content[insertAt:]
@@ -158,22 +163,22 @@ func InjectSummary(content, summary string) string {
 // Returns:
 //   - string: Content with the source link injected
 func InjectSourceLink(content, sourcePath string) string {
-	nl := config.NewlineLF
+	nl := token.NewlineLF
 	absPath, pathErr := filepath.Abs(sourcePath)
 	if pathErr != nil {
 		absPath = sourcePath
 	}
 	relPath := filepath.Join(
-		config.DirContext, config.DirJournal, filepath.Base(absPath),
+		dir.Context, dir.Journal, filepath.Base(absPath),
 	)
-	link := fmt.Sprintf(config.TplJournalSourceLink+nl+nl,
+	link := fmt.Sprintf(assets.TplJournalSourceLink+nl+nl,
 		absPath, relPath, relPath)
 
-	fmOpen := len(config.Separator + nl)
-	fmClose := len(nl + config.Separator + nl)
-	if strings.HasPrefix(content, config.Separator+nl) {
+	fmOpen := len(token.Separator + nl)
+	fmClose := len(nl + token.Separator + nl)
+	if strings.HasPrefix(content, token.Separator+nl) {
 		if end := strings.Index(content[fmOpen:], nl+
-			config.Separator+nl); end >= 0 {
+			token.Separator+nl); end >= 0 {
 			insertAt := fmOpen + end + fmClose
 			return content[:insertAt] + nl + link + content[insertAt:]
 		}
@@ -198,30 +203,30 @@ func GenerateZensicalToml(
 	keyFiles []KeyFileData, sessionTypes []TypeData,
 ) string {
 	var sb strings.Builder
-	nl := config.NewlineLF
+	nl := token.NewlineLF
 
-	sb.WriteString(config.TplZensicalProject + nl)
+	sb.WriteString(assets.TplZensicalProject + nl)
 
 	// Build navigation
-	sb.WriteString(config.TomlNavOpen + nl)
-	sb.WriteString(fmt.Sprintf(config.TplJournalNavItem+nl,
-		config.JournalLabelHome, config.FilenameIndex))
+	sb.WriteString(zensical.TomlNavOpen + nl)
+	sb.WriteString(fmt.Sprintf(assets.TplJournalNavItem+nl,
+		assets.JournalLabelHome, file.Index))
 
 	if len(topics) > 0 {
-		sb.WriteString(fmt.Sprintf(config.TplJournalNavItem+nl,
-			config.JournalLabelTopics,
-			filepath.Join(config.JournalDirTopics, config.FilenameIndex)),
+		sb.WriteString(fmt.Sprintf(assets.TplJournalNavItem+nl,
+			assets.JournalLabelTopics,
+			filepath.Join(dir.JournTopics, file.Index)),
 		)
 	}
 	if len(keyFiles) > 0 {
-		sb.WriteString(fmt.Sprintf(config.TplJournalNavItem+nl,
-			config.JournalLabelFiles,
-			filepath.Join(config.JournalDirFiles, config.FilenameIndex)),
+		sb.WriteString(fmt.Sprintf(assets.TplJournalNavItem+nl,
+			assets.JournalLabelFiles,
+			filepath.Join(dir.JournalFiles, file.Index)),
 		)
 	}
 	if len(sessionTypes) > 0 {
-		sb.WriteString(fmt.Sprintf(config.TplJournalNavItem+nl,
-			config.JournalLabelTypes,
-			filepath.Join(config.JournalDirTypes, config.FilenameIndex)),
+		sb.WriteString(fmt.Sprintf(assets.TplJournalNavItem+nl,
+			assets.JournalLabelTypes,
+			filepath.Join(dir.JournalTypes, file.Index)),
 		)
 	}
@@ -239,30 +244,30 @@
 
 	// Group recent entries (last N, excluding suggestions)
 	recent := regular
-	if len(recent) > config.JournalMaxRecentSessions {
-		recent = recent[:config.JournalMaxRecentSessions]
+	if len(recent) > journal.MaxRecentSessions {
+		recent = recent[:journal.MaxRecentSessions]
 	}
 
 	sb.WriteString(fmt.Sprintf(
-		config.TplJournalNavSection+nl, config.JournalHeadingRecentSessions),
+		assets.TplJournalNavSection+nl, assets.JournalHeadingRecentSessions),
 	)
 	for _, e := range recent {
 		title := e.Title
-		if utf8.RuneCountInString(title) > config.JournalMaxNavTitleLen {
+		if utf8.RuneCountInString(title) > journal.MaxNavTitleLen {
 			runes := []rune(title)
-			title = string(runes[:config.JournalMaxNavTitleLen]) + config.Ellipsis
+			title = string(runes[:journal.MaxNavTitleLen]) + token.Ellipsis
 		}
 		title = strings.ReplaceAll(title, `"`, `\"`)
 		sb.WriteString(fmt.Sprintf(
-			config.TplJournalNavSessionItem+nl, title, e.Filename),
+			assets.TplJournalNavSessionItem+nl, title, e.Filename),
 		)
 	}
-	sb.WriteString(config.TomlNavSectionClose + nl)
-	sb.WriteString(config.TomlNavClose + nl + nl)
+	sb.WriteString(zensical.TomlNavSectionClose + nl)
+	sb.WriteString(zensical.TomlNavClose + nl + nl)
 
-	sb.WriteString(config.TplZensicalExtraCSS + nl)
+	sb.WriteString(assets.TplZensicalExtraCSS + nl)
 
-	sb.WriteString(config.TplZensicalTheme)
+	sb.WriteString(assets.TplZensicalTheme)
 
 	return sb.String()
 }
diff --git a/internal/cli/journal/core/group.go b/internal/cli/journal/core/group.go
index 3e54685f..662d7e34 100644
--- a/internal/cli/journal/core/group.go
+++ b/internal/cli/journal/core/group.go
@@ -9,7 +9,7 @@ package core
 import (
 	"sort"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/journal"
 )
 
 // GroupByMonth groups journal entries by their YYYY-MM date prefix,
@@ -28,8 +28,8 @@ func GroupByMonth(
 	var monthOrder []string
 
 	for _, e := range entries {
-		if len(e.Date) >= config.JournalMonthPrefixLen {
-			month := e.Date[:config.JournalMonthPrefixLen]
+		if len(e.Date) >= journal.MonthPrefixLen {
+			month := e.Date[:journal.MonthPrefixLen]
 			if _, exists := months[month]; !exists {
 				monthOrder = append(monthOrder, month)
 			}
@@ -65,7 +65,7 @@ func BuildGroupedIndex(
 		result = append(result, GroupedIndex{
 			Key:     key,
 			Entries: ents,
-			Popular: len(ents) >= config.JournalPopularityThreshold,
+			Popular: len(ents) >= journal.PopularityThreshold,
 		})
 	}
diff --git a/internal/cli/journal/core/index.go b/internal/cli/journal/core/index.go
index 133d4c54..b9b7f8f2 100644
--- a/internal/cli/journal/core/index.go
+++ b/internal/cli/journal/core/index.go
@@ -10,7 +10,8 @@ import (
 	"fmt"
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )
 
 // BuildTopicIndex aggregates entries by topic and returns sorted topic data.
@@ -45,7 +46,7 @@ func BuildTopicIndex(entries []JournalEntry) []TopicData { // - string: Markdown content for topics/index.md func GenerateTopicsIndex(topics []TopicData) string { var sb strings.Builder - nl := config.NewlineLF + nl := token.NewlineLF var popular, longtail []TopicData for _, t := range topics { @@ -56,18 +57,18 @@ func GenerateTopicsIndex(topics []TopicData) string { } } - sb.WriteString(config.JournalHeadingTopics + nl + nl) + sb.WriteString(assets.JournalHeadingTopics + nl + nl) sb.WriteString(fmt.Sprintf( - config.TplJournalTopicStats+nl+nl, + assets.TplJournalTopicStats+nl+nl, len(topics), CountUniqueSessions(topics), len(popular), len(longtail))) WritePopularAndLongtail(&sb, - len(popular), config.JournalHeadingPopularTopics, + len(popular), assets.JournalHeadingPopularTopics, func(i int) (string, string, int) { return popular[i].Name, popular[i].Name, len(popular[i].Entries) }, - len(longtail), config.JournalHeadingLongtailTopics, - config.TplJournalLongtailEntry, + len(longtail), assets.JournalHeadingLongtailTopics, + assets.TplJournalLongtailEntry, func(i int) (string, JournalEntry) { return longtail[i].Name, longtail[i].Entries[0] }, @@ -86,8 +87,8 @@ func GenerateTopicsIndex(topics []TopicData) string { // - string: Markdown content for the topic page func GenerateTopicPage(topic TopicData) string { return GenerateGroupedPage( - fmt.Sprintf(config.TplJournalPageHeading, topic.Name), - fmt.Sprintf(config.TplJournalTopicPageStats, len(topic.Entries)), + fmt.Sprintf(assets.TplJournalPageHeading, topic.Name), + fmt.Sprintf(assets.TplJournalTopicPageStats, len(topic.Entries)), topic.Entries, ) } @@ -125,7 +126,7 @@ func BuildKeyFileIndex(entries []JournalEntry) []KeyFileData { // - string: Markdown content for files/index.md func GenerateKeyFilesIndex(keyFiles []KeyFileData) string { var sb strings.Builder - nl := config.NewlineLF + nl := token.NewlineLF var popular, longtail []KeyFileData for _, kf := range keyFiles { @@ -147,21 +148,21 @@ 
func GenerateKeyFilesIndex(keyFiles []KeyFileData) string { } } - sb.WriteString(config.JournalHeadingKeyFiles + nl + nl) + sb.WriteString(assets.JournalHeadingKeyFiles + nl + nl) sb.WriteString(fmt.Sprintf( - config.TplJournalFileStats+nl+nl, + assets.TplJournalFileStats+nl+nl, len(keyFiles), totalSessions, len(popular), len(longtail)), ) WritePopularAndLongtail(&sb, - len(popular), config.JournalHeadingFrequentlyTouched, + len(popular), assets.JournalHeadingFrequentlyTouched, func(i int) (string, string, int) { return "`" + popular[i].Path + "`", KeyFileSlug(popular[i].Path), len(popular[i].Entries) }, - len(longtail), config.JournalHeadingSingleSession, - config.TplJournalLongtailCodeEntry, + len(longtail), assets.JournalHeadingSingleSession, + assets.TplJournalLongtailCodeEntry, func(i int) (string, JournalEntry) { return longtail[i].Path, longtail[i].Entries[0] }, @@ -180,8 +181,8 @@ func GenerateKeyFilesIndex(keyFiles []KeyFileData) string { // - string: Markdown content for the key file page func GenerateKeyFilePage(kf KeyFileData) string { return GenerateGroupedPage( - fmt.Sprintf(config.TplJournalCodePageHeading, kf.Path), - fmt.Sprintf(config.TplJournalFilePageStats, len(kf.Entries)), + fmt.Sprintf(assets.TplJournalCodePageHeading, kf.Path), + fmt.Sprintf(assets.TplJournalFilePageStats, len(kf.Entries)), kf.Entries, ) } @@ -216,16 +217,16 @@ func BuildTypeIndex(entries []JournalEntry) []TypeData { // - string: Markdown content for types/index.md func GenerateTypesIndex(sessionTypes []TypeData) string { var sb strings.Builder - nl := config.NewlineLF + nl := token.NewlineLF totalSessions := 0 for _, st := range sessionTypes { totalSessions += len(st.Entries) } - sb.WriteString(config.JournalHeadingSessionTypes + nl + nl) + sb.WriteString(assets.JournalHeadingSessionTypes + nl + nl) sb.WriteString(fmt.Sprintf( - config.TplJournalTypeStats+nl+nl, len(sessionTypes), totalSessions), + assets.TplJournalTypeStats+nl+nl, len(sessionTypes), totalSessions), ) for 
_, st := range sessionTypes { @@ -246,8 +247,8 @@ func GenerateTypesIndex(sessionTypes []TypeData) string { // - string: Markdown content for the session type page func GenerateTypePage(st TypeData) string { return GenerateGroupedPage( - fmt.Sprintf(config.TplJournalPageHeading, st.Name), - fmt.Sprintf(config.TplJournalTypePageStats, len(st.Entries), st.Name), + fmt.Sprintf(assets.TplJournalPageHeading, st.Name), + fmt.Sprintf(assets.TplJournalTypePageStats, len(st.Entries), st.Name), st.Entries, ) } diff --git a/internal/cli/journal/core/moc.go b/internal/cli/journal/core/moc.go index bd972c59..9820854a 100644 --- a/internal/cli/journal/core/moc.go +++ b/internal/cli/journal/core/moc.go @@ -10,7 +10,10 @@ import ( "fmt" "strings" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/config/file" + "github.com/ActiveMemory/ctx/internal/config/journal" + "github.com/ActiveMemory/ctx/internal/config/token" ) // GenerateHomeMOC creates the root navigation hub for the Obsidian vault. @@ -30,36 +33,39 @@ func GenerateHomeMOC( hasTopics, hasFiles, hasTypes bool, ) string { var sb strings.Builder - nl := config.NewlineLF + nl := token.NewlineLF - sb.WriteString("# Session Journal" + nl + nl) - sb.WriteString("Navigation hub for all journal entries." 
+ nl + nl) + sb.WriteString(assets.JournalHeadingSessionJournal + nl + nl) + sb.WriteString(assets.TextDesc(assets.TextDescKeyJournalMocNavDescription) + nl + nl) - sb.WriteString("## Browse by" + nl + nl) + sb.WriteString(assets.TextDesc(assets.TextDescKeyJournalMocBrowseBy) + nl + nl) if hasTopics { sb.WriteString(fmt.Sprintf( - "- %s — sessions grouped by topic"+nl, - FormatWikilink("_Topics", "Topics"))) + "- %s %s"+nl, + FormatWikilink("_Topics", "Topics"), + assets.TextDesc(assets.TextDescKeyJournalMocTopicsDesc))) } if hasFiles { sb.WriteString(fmt.Sprintf( - "- %s — sessions grouped by file touched"+nl, - FormatWikilink("_Key Files", "Key Files"))) + "- %s %s"+nl, + FormatWikilink("_Key Files", "Key Files"), + assets.TextDesc(assets.TextDescKeyJournalMocFilesDesc))) } if hasTypes { sb.WriteString(fmt.Sprintf( - "- %s — sessions grouped by type"+nl, - FormatWikilink("_Session Types", "Session Types"))) + "- %s %s"+nl, + FormatWikilink("_Session Types", "Session Types"), + assets.TextDesc(assets.TextDescKeyJournalMocTypesDesc))) } sb.WriteString(nl) - // Recent sessions (up to JournalMaxRecentSessions) + // Recent sessions (up to MaxRecentSessions) recent := entries - if len(recent) > config.JournalMaxRecentSessions { - recent = recent[:config.JournalMaxRecentSessions] + if len(recent) > journal.MaxRecentSessions { + recent = recent[:journal.MaxRecentSessions] } - sb.WriteString("## Recent Sessions" + nl + nl) + sb.WriteString(token.HeadingLevelTwoStart + assets.JournalHeadingRecentSessions + nl + nl) for _, e := range recent { sb.WriteString(FormatWikilinkEntry(e) + nl) } @@ -80,7 +86,7 @@ func GenerateHomeMOC( // - string: Markdown content for _Topics.md func GenerateObsidianTopicsMOC(topics []TopicData) string { var sb strings.Builder - nl := config.NewlineLF + nl := token.NewlineLF var popular, longtail []TopicData for _, t := range topics { @@ -110,7 +116,7 @@ func GenerateObsidianTopicsMOC(topics []TopicData) string { sb.WriteString("## Long-tail 
Topics" + nl + nl) for _, t := range longtail { e := t.Entries[0] - link := strings.TrimSuffix(e.Filename, config.ExtMarkdown) + link := strings.TrimSuffix(e.Filename, file.ExtMarkdown) sb.WriteString(fmt.Sprintf("- **%s** — %s"+nl, t.Name, FormatWikilink(link, e.Title))) } @@ -145,7 +151,7 @@ func GenerateObsidianTopicPage(topic TopicData) string { // - string: Markdown content for _Key Files.md func GenerateObsidianFilesMOC(keyFiles []KeyFileData) string { var sb strings.Builder - nl := config.NewlineLF + nl := token.NewlineLF var popular, longtail []KeyFileData for _, kf := range keyFiles { @@ -187,7 +193,7 @@ func GenerateObsidianFilesMOC(keyFiles []KeyFileData) string { sb.WriteString("## Single Session" + nl + nl) for _, kf := range longtail { e := kf.Entries[0] - link := strings.TrimSuffix(e.Filename, config.ExtMarkdown) + link := strings.TrimSuffix(e.Filename, file.ExtMarkdown) sb.WriteString(fmt.Sprintf("- `%s` — %s"+nl, kf.Path, FormatWikilink(link, e.Title))) } @@ -223,7 +229,7 @@ func GenerateObsidianFilePage(kf KeyFileData) string { // - string: Markdown content for _Session Types.md func GenerateObsidianTypesMOC(sessionTypes []TypeData) string { var sb strings.Builder - nl := config.NewlineLF + nl := token.NewlineLF totalSessions := 0 for _, st := range sessionTypes { @@ -274,7 +280,7 @@ func GenerateObsidianGroupedPage( heading, stats string, entries []JournalEntry, ) string { var sb strings.Builder - nl := config.NewlineLF + nl := token.NewlineLF sb.WriteString(heading + nl + nl) sb.WriteString(stats + nl + nl) @@ -312,10 +318,10 @@ func GenerateRelatedFooter( } var sb strings.Builder - nl := config.NewlineLF + nl := token.NewlineLF - sb.WriteString(nl + config.Separator + nl + nl) - sb.WriteString(config.ObsidianRelatedHeading + nl + nl) + sb.WriteString(nl + token.Separator + nl + nl) + sb.WriteString(assets.ObsidianRelatedHeading + nl + nl) // Topic links if len(entry.Topics) > 0 { @@ -324,7 +330,7 @@ func GenerateRelatedFooter( 
 			FormatWikilink("_Topics", "Topics MOC"))
 		for _, t := range entry.Topics {
 			topicLinks = append(topicLinks,
-				fmt.Sprintf(config.ObsidianWikilinkPlain, t))
+				fmt.Sprintf(assets.ObsidianWikilinkPlain, t))
 		}
 		sb.WriteString("**Topics**: " + strings.Join(topicLinks, " · ") + nl + nl)
 	}
@@ -332,15 +338,15 @@ func GenerateRelatedFooter(
 	// Type link
 	if entry.Type != "" {
 		sb.WriteString(fmt.Sprintf("**Type**: %s"+nl+nl,
-			fmt.Sprintf(config.ObsidianWikilinkPlain, entry.Type)))
+			fmt.Sprintf(assets.ObsidianWikilinkPlain, entry.Type)))
 	}

 	// See also: other entries sharing topics
 	related := CollectRelated(entry, topicIndex, maxRelated)
 	if len(related) > 0 {
-		sb.WriteString(config.ObsidianSeeAlso + nl)
+		sb.WriteString(assets.ObsidianSeeAlso + nl)
 		for _, rel := range related {
-			link := strings.TrimSuffix(rel.Filename, config.ExtMarkdown)
+			link := strings.TrimSuffix(rel.Filename, file.ExtMarkdown)
 			sb.WriteString(fmt.Sprintf("- %s"+nl,
 				FormatWikilink(link, rel.Title)))
 		}
diff --git a/internal/cli/journal/core/moc_test.go b/internal/cli/journal/core/moc_test.go
index caeba75f..f9c0a1d5 100644
--- a/internal/cli/journal/core/moc_test.go
+++ b/internal/cli/journal/core/moc_test.go
@@ -10,7 +10,7 @@ import (
 	"strings"
 	"testing"

-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
 )

 func TestGenerateHomeMOC(t *testing.T) {
@@ -113,7 +113,7 @@ func TestGenerateRelatedFooter(t *testing.T) {

 	got := GenerateRelatedFooter(entry, topicIndex, 5)

-	if !strings.Contains(got, config.ObsidianRelatedHeading) {
+	if !strings.Contains(got, assets.ObsidianRelatedHeading) {
 		t.Error("missing related heading")
 	}
 	if !strings.Contains(got, "[[_Topics|Topics MOC]]") {
diff --git a/internal/cli/journal/core/normalize.go b/internal/cli/journal/core/normalize.go
index 4afc3073..d89ba894 100644
--- a/internal/cli/journal/core/normalize.go
+++ b/internal/cli/journal/core/normalize.go
@@ -13,7 +13,10 @@ import (
 	"strings"
 	"unicode/utf8"

-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/journal"
+	"github.com/ActiveMemory/ctx/internal/config/regex"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )

 // HTML tag constants for pre-formatted blocks.
@@ -53,21 +56,21 @@ func NormalizeContent(content string, fencesVerified bool) string {
 	content = WrapToolOutputs(content)
 	content = WrapUserTurns(content)

-	lines := strings.Split(content, config.NewlineLF)
+	lines := strings.Split(content, token.NewlineLF)
 	var out []string
 	inFrontmatter := false
 	inPreBlock := false // inside <pre>...</pre> from WrapToolOutputs/WrapUserTurns

 	for i, line := range lines {
 		// Skip frontmatter
-		if i == 0 && strings.TrimSpace(line) == config.Separator {
+		if i == 0 && strings.TrimSpace(line) == token.Separator {
 			inFrontmatter = true
 			out = append(out, line)
 			continue
 		}
 		if inFrontmatter {
 			out = append(out, line)
-			if strings.TrimSpace(line) == config.Separator {
+			if strings.TrimSpace(line) == token.Separator {
 				inFrontmatter = false
 			}
 			continue
@@ -90,49 +93,49 @@ func NormalizeContent(content string, fencesVerified bool) string {
 		}

 		// Sanitize H1 headings: strip Claude tags, truncate to max title len
-		if strings.HasPrefix(line, config.HeadingLevelOneStart) {
-			heading := strings.TrimPrefix(line, config.HeadingLevelOneStart)
+		if strings.HasPrefix(line, token.HeadingLevelOneStart) {
+			heading := strings.TrimPrefix(line, token.HeadingLevelOneStart)
 			heading = strings.TrimSpace(
-				config.RegExClaudeTag.ReplaceAllString(heading, ""),
+				regex.SystemClaudeTag.ReplaceAllString(heading, ""),
 			)
-			if utf8.RuneCountInString(heading) > config.RecallMaxTitleLen {
+			if utf8.RuneCountInString(heading) > journal.MaxTitleLen {
 				runes := []rune(heading)
-				truncated := string(runes[:config.RecallMaxTitleLen])
+				truncated := string(runes[:journal.MaxTitleLen])
 				if idx := strings.LastIndex(truncated, " "); idx > 0 {
 					truncated = truncated[:idx]
 				}
 				heading = truncated
 			}
-			line = config.HeadingLevelOneStart + heading
+			line = token.HeadingLevelOneStart + heading
 		}

 		// Demote headings to bold: ## Foo → **Foo**
 		// Preserves turn headers (### N. Role (HH:MM:SS)) and the H1 title.
-		if hm := config.RegExMarkdownHeading.FindStringSubmatch(line); hm != nil {
-			if hm[1] != "#" && !config.RegExTurnHeader.MatchString(strings.TrimSpace(line)) {
+		if hm := regex.MarkdownHeading.FindStringSubmatch(line); hm != nil {
+			if hm[1] != "#" && !regex.TurnHeader.MatchString(strings.TrimSpace(line)) {
 				line = "**" + hm[2] + "**"
 			}
 		}

 		// Insert blank line before list items when previous line is non-empty.
 		// Python-Markdown requires a blank line before the first list item.
-		if config.RegExListStart.MatchString(line) &&
+		if regex.ListStart.MatchString(line) &&
 			len(out) > 0 && strings.TrimSpace(out[len(out)-1]) != "" {
 			out = append(out, "")
 		}

 		// Strip bold from tool-use lines
-		line = config.RegExToolBold.ReplaceAllString(line, `🔧 $1`)
+		line = regex.ToolBold.ReplaceAllString(line, `🔧 $1`)

 		// Escape glob stars
 		if !strings.HasPrefix(line, " ") {
-			line = config.RegExGlobStar.ReplaceAllString(line, `\*$1`)
+			line = regex.GlobStar.ReplaceAllString(line, `\*$1`)
 		}

 		// Replace inline code spans containing angle brackets:
 		// `", ">")
@@ -142,7 +145,7 @@ func NormalizeContent(content string, fencesVerified bool) string {
 		out = append(out, line)
 	}

-	return strings.Join(out, config.NewlineLF)
+	return strings.Join(out, token.NewlineLF)
 }

 // WrapToolOutputs finds Tool Output sections and wraps their body in
@@ -164,7 +167,7 @@ func NormalizeContent(content string, fencesVerified bool) string {
 // another session's file) because the real next turn (### 42.) is always
 // the smallest number > N.
 func WrapToolOutputs(content string) string {
-	lines := strings.Split(content, config.NewlineLF)
+	lines := strings.Split(content, token.NewlineLF)
 	mask := PreBlockMask(lines)
 	turnSeq := CollectTurnNumbers(lines)
 	var out []string
@@ -177,10 +180,10 @@ func WrapToolOutputs(content string) string {
 			i++
 			continue
 		}
-		m := config.RegExTurnHeader.FindStringSubmatch(
+		m := regex.TurnHeader.FindStringSubmatch(
 			strings.TrimSpace(lines[i]),
 		)
-		if m == nil || m[2] != config.LabelToolOutput {
+		if m == nil || m[2] != assets.ToolOutput {
 			out = append(out, lines[i])
 			i++
 			continue
@@ -206,7 +209,7 @@ func WrapToolOutputs(content string) string {
 			if mask[j] {
 				continue
 			}
-			nm := config.RegExTurnHeader.FindStringSubmatch(
+			nm := regex.TurnHeader.FindStringSubmatch(
 				strings.TrimSpace(lines[j]),
 			)
 			if nm != nil {
@@ -265,7 +268,7 @@ func WrapToolOutputs(content string) string {
 		}
 	}

-	return strings.Join(out, config.NewlineLF)
+	return strings.Join(out, token.NewlineLF)
 }

 // WrapUserTurns finds User turn bodies and wraps them in
@@ -288,7 +291,7 @@ func WrapToolOutputs(content string) string {
 // Boundary detection reuses the same pre-scan + last-match-wins approach
 // as WrapToolOutputs.
 func WrapUserTurns(content string) string {
-	lines := strings.Split(content, config.NewlineLF)
+	lines := strings.Split(content, token.NewlineLF)
 	mask := PreBlockMask(lines)
 	turnSeq := CollectTurnNumbers(lines)
 	var out []string
@@ -301,10 +304,10 @@ func WrapUserTurns(content string) string {
 			i++
 			continue
 		}
-		m := config.RegExTurnHeader.FindStringSubmatch(
+		m := regex.TurnHeader.FindStringSubmatch(
 			strings.TrimSpace(lines[i]),
 		)
-		if m == nil || m[2] != config.LabelRoleUser {
+		if m == nil || m[2] != assets.RoleUser {
 			out = append(out, lines[i])
 			i++
 			continue
@@ -325,7 +328,7 @@ func WrapUserTurns(content string) string {
 			if mask[j] {
 				continue
 			}
-			nm := config.RegExTurnHeader.FindStringSubmatch(
+			nm := regex.TurnHeader.FindStringSubmatch(
 				strings.TrimSpace(lines[j]),
 			)
 			if nm != nil {
@@ -367,7 +370,7 @@ func WrapUserTurns(content string) string {
 		out = append(out, "")
 	}
 
-	return strings.Join(out, config.NewlineLF)
+	return strings.Join(out, token.NewlineLF)
 }
 
 // StripPreWrapper removes <pre>, </pre>,
@@ -485,7 +488,7 @@ func CollectTurnNumbers(lines []string) []int {
 		if mask[i] {
 			continue
 		}
-		if m := config.RegExTurnHeader.FindStringSubmatch(
+		if m := regex.TurnHeader.FindStringSubmatch(
 			strings.TrimSpace(line),
 		); m != nil {
 			num, _ := strconv.Atoi(m[1])
@@ -519,7 +522,7 @@ func SplitTrailingFooter(body []string) ([]string, []string) {
 	// Find the last "---" separator and check if a "**Part " line follows.
 	sepIdx := -1
 	for j := len(body) - 1; j >= 0; j-- {
-		if strings.TrimSpace(body[j]) == config.Separator {
+		if strings.TrimSpace(body[j]) == token.Separator {
 			sepIdx = j
 			break
 		}
diff --git a/internal/cli/journal/core/parse.go b/internal/cli/journal/core/parse.go
index 5d3e1a5e..1ca45167 100644
--- a/internal/cli/journal/core/parse.go
+++ b/internal/cli/journal/core/parse.go
@@ -12,9 +12,12 @@ import (
 	"sort"
 	"strings"

+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/journal"
+	"github.com/ActiveMemory/ctx/internal/config/regex"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"gopkg.in/yaml.v3"
-
-	"github.com/ActiveMemory/ctx/internal/config"
 )

 // ScanJournalEntries reads all journal Markdown files and extracts metadata.
@@ -33,7 +36,7 @@ func ScanJournalEntries(journalDir string) ([]JournalEntry, error) {
 	var entries []JournalEntry
 	for _, f := range files {
-		if f.IsDir() || !strings.HasSuffix(f.Name(), config.ExtMarkdown) {
+		if f.IsDir() || !strings.HasSuffix(f.Name(), file.ExtMarkdown) {
 			continue
 		}

@@ -68,14 +71,14 @@ func ParseJournalEntry(path, filename string) JournalEntry {
 	}

 	// Extract date from the filename (YYYY-MM-DD-slug-id.md)
-	if len(filename) >= config.JournalDatePrefixLen {
-		entry.Date = filename[:config.JournalDatePrefixLen]
+	if len(filename) >= journal.DatePrefixLen {
+		entry.Date = filename[:journal.DatePrefixLen]
 	}

 	// Read the file to extract metadata
 	content, readErr := os.ReadFile(filepath.Clean(path))
 	if readErr != nil {
-		entry.Title = strings.TrimSuffix(filename, config.ExtMarkdown)
+		entry.Title = strings.TrimSuffix(filename, file.ExtMarkdown)
 		return entry
 	}

@@ -85,11 +88,11 @@ func ParseJournalEntry(path, filename string) JournalEntry {
 	contentStr := string(content)

 	// Parse YAML frontmatter if present
-	nl := config.NewlineLF
-	fmOpen := len(config.Separator + nl)
-	if strings.HasPrefix(contentStr, config.Separator+nl) {
+	nl := token.NewlineLF
+	fmOpen := len(token.Separator + nl)
+	if strings.HasPrefix(contentStr, token.Separator+nl) {
 		if end := strings.Index(
-			contentStr[fmOpen:], nl+config.Separator+nl,
+			contentStr[fmOpen:], nl+token.Separator+nl,
 		); end >= 0 {
 			fmRaw := contentStr[fmOpen : fmOpen+end]
 			var fm JournalFrontmatter
@@ -121,7 +124,7 @@ func ParseJournalEntry(path, filename string) JournalEntry {
 	}

 	// Check for suggestion mode sessions
-	if strings.Contains(contentStr, config.LabelSuggestionMode) {
+	if strings.Contains(contentStr, assets.LabelSuggestionMode) {
 		entry.Suggestive = true
 	}

@@ -132,22 +135,22 @@ func ParseJournalEntry(path, filename string) JournalEntry {

 		// Title from first H1 (only if frontmatter didn't set it)
 		if strings.HasPrefix(
-			line, config.HeadingLevelOneStart,
+			line, token.HeadingLevelOneStart,
 		) && entry.Title == "" {
-			entry.Title = strings.TrimPrefix(line, config.HeadingLevelOneStart)
+			entry.Title = strings.TrimPrefix(line, token.HeadingLevelOneStart)
 		}

 		// Time from metadata
-		if strings.HasPrefix(line, config.MetadataTime) {
+		if strings.HasPrefix(line, assets.MetadataTime) {
 			entry.Time = strings.TrimSpace(
-				strings.TrimPrefix(line, config.MetadataTime),
+				strings.TrimPrefix(line, assets.MetadataTime),
 			)
 		}

 		// Project from metadata
-		if strings.HasPrefix(line, config.MetadataProject) {
+		if strings.HasPrefix(line, assets.MetadataProject) {
 			entry.Project = strings.TrimSpace(
-				strings.TrimPrefix(line, config.MetadataProject),
+				strings.TrimPrefix(line, assets.MetadataProject),
 			)
 		}

@@ -158,13 +161,13 @@ func ParseJournalEntry(path, filename string) JournalEntry {
 	}

 	if entry.Title == "" {
-		entry.Title = strings.TrimSuffix(filename, config.ExtMarkdown)
+		entry.Title = strings.TrimSuffix(filename, file.ExtMarkdown)
 	}

 	// Strip Claude Code internal markup tags from titles
-	entry.Title = strings.TrimSpace(config.RegExClaudeTag.ReplaceAllString(entry.Title, ""))
+	entry.Title = strings.TrimSpace(regex.SystemClaudeTag.ReplaceAllString(entry.Title, ""))

-	// Sanitize characters that break markdown link text: angle brackets
+	// Sanitize characters that break Markdown link text: angle brackets
 	// become HTML entities; backticks and # are stripped (they add no
 	// meaning inside [...] link labels).
 	entry.Title = strings.NewReplacer(
diff --git a/internal/cli/journal/core/reduce.go b/internal/cli/journal/core/reduce.go
index 9ff3f807..5b01c285 100644
--- a/internal/cli/journal/core/reduce.go
+++ b/internal/cli/journal/core/reduce.go
@@ -10,7 +10,10 @@ import (
 	"encoding/json"
 	"strings"

-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/marker"
+	"github.com/ActiveMemory/ctx/internal/config/regex"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )

 // StripFences removes all code fence markers from content, leaving the inner
@@ -33,34 +36,34 @@ func StripFences(content string, fencesVerified bool) string {
 		return content
 	}

-	lines := strings.Split(content, config.NewlineLF)
+	lines := strings.Split(content, token.NewlineLF)
 	var out []string
 	inFrontmatter := false

 	for i, line := range lines {
 		// Preserve frontmatter
-		if i == 0 && strings.TrimSpace(line) == config.Separator {
+		if i == 0 && strings.TrimSpace(line) == token.Separator {
 			inFrontmatter = true
 			out = append(out, line)
 			continue
 		}
 		if inFrontmatter {
 			out = append(out, line)
-			if strings.TrimSpace(line) == config.Separator {
+			if strings.TrimSpace(line) == token.Separator {
 				inFrontmatter = false
 			}
 			continue
 		}

 		// Remove fence markers
-		if config.RegExFenceLine.MatchString(line) {
+		if regex.CodeFenceLine.MatchString(line) {
 			continue
 		}

 		out = append(out, line)
 	}

-	return strings.Join(out, config.NewlineLF)
+	return strings.Join(out, token.NewlineLF)
 }

 // StripSystemReminders removes internal Claude Code blocks from journal content.
@@ -81,7 +84,7 @@ func StripFences(content string, fencesVerified bool) string {
 // Returns:
 // - string: Content with all internal blocks removed
 func StripSystemReminders(content string) string {
-	lines := strings.Split(content, config.NewlineLF)
+	lines := strings.Split(content, token.NewlineLF)
 	var out []string
 	inTagReminder := false
 	inBoldReminder := false
@@ -92,19 +95,19 @@ func StripSystemReminders(content string) string {
 		trimmed := strings.TrimSpace(line)

 		// XML-style: ...
-		if trimmed == config.TagSystemReminderOpen {
+		if trimmed == marker.TagSystemReminderOpen {
 			inTagReminder = true
 			continue
 		}
 		if inTagReminder {
-			if trimmed == config.TagSystemReminderClose {
+			if trimmed == marker.TagSystemReminderClose {
 				inTagReminder = false
 			}
 			continue
 		}

 		// Bold-style: **System Reminder**: ... (runs until blank line)
-		if strings.HasPrefix(trimmed, config.LabelBoldReminder) {
+		if strings.HasPrefix(trimmed, assets.BoldReminder) {
 			inBoldReminder = true
 			continue
 		}
@@ -118,12 +121,12 @@ func StripSystemReminders(content string) string {
 		// Context compaction: standalone on its own line.
 		// Single-line N lines (ours) won't match
 		// because trimmed != "" when there's inline content.
-		if trimmed == config.TagCompactionSummaryOpen {
+		if trimmed == marker.TagCompactionSummaryOpen {
 			inCompaction = true
 			continue
 		}
 		if inCompaction {
-			if trimmed == config.TagCompactionSummaryClose {
+			if trimmed == marker.TagCompactionSummaryClose {
 				inCompaction = false
 			}
 			continue
@@ -131,7 +134,7 @@ func StripSystemReminders(content string) string {

 		// Compaction boilerplate: "If you need specific details from
 		// before compaction..." paragraph (runs until blank line)
-		if strings.HasPrefix(trimmed, config.CompactionBoilerplatePrefix) {
+		if strings.HasPrefix(trimmed, marker.CompactionBoilerplatePrefix) {
 			inBoilerplate = true
 			continue
 		}
@@ -145,7 +148,7 @@ func StripSystemReminders(content string) string {
 		out = append(out, line)
 	}

-	return strings.Join(out, config.NewlineLF)
+	return strings.Join(out, token.NewlineLF)
 }

 // CleanToolOutputJSON extracts plain text from Tool Output turns whose body is
@@ -158,15 +161,15 @@ func StripSystemReminders(content string) string {
 // Returns:
 // - string: Content with JSON tool output replaced by plain text
 func CleanToolOutputJSON(content string) string {
-	lines := strings.Split(content, config.NewlineLF)
+	lines := strings.Split(content, token.NewlineLF)
 	var out []string

 	i := 0
 	for i < len(lines) {
-		matches := config.RegExTurnHeader.FindStringSubmatch(
+		matches := regex.TurnHeader.FindStringSubmatch(
 			strings.TrimSpace(lines[i]),
 		)
-		if matches == nil || matches[2] != config.LabelToolOutput {
+		if matches == nil || matches[2] != assets.ToolOutput {
 			out = append(out, lines[i])
 			i++
 			continue
@@ -179,7 +182,7 @@ func CleanToolOutputJSON(content string) string {
 		// Collect body until next header
 		bodyStart := i
 		for i < len(lines) {
-			if config.RegExTurnHeader.MatchString(strings.TrimSpace(lines[i])) {
+			if regex.TurnHeader.MatchString(strings.TrimSpace(lines[i])) {
 				break
 			}
 			i++
@@ -216,5 +219,5 @@ func CleanToolOutputJSON(content string) string {
 		out = append(out, bodyLines...)
 	}

-	return strings.Join(out, config.NewlineLF)
+	return strings.Join(out, token.NewlineLF)
 }
diff --git a/internal/cli/journal/core/section.go b/internal/cli/journal/core/section.go
index d3164b43..10f72c34 100644
--- a/internal/cli/journal/core/section.go
+++ b/internal/cli/journal/core/section.go
@@ -12,7 +12,12 @@ import (
 	"path/filepath"
 	"strings"

-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	"github.com/ActiveMemory/ctx/internal/config/journal"
+	"github.com/ActiveMemory/ctx/internal/config/regex"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 )

@@ -33,13 +38,13 @@ func WriteSection(
 	writePages func(dir string),
 ) error {
 	dir := filepath.Join(docsDir, subdir)
-	if mkErr := os.MkdirAll(dir, config.PermExec); mkErr != nil {
+	if mkErr := os.MkdirAll(dir, fs.PermExec); mkErr != nil {
 		return ctxerr.Mkdir(dir, mkErr)
 	}

-	indexPath := filepath.Join(dir, config.FilenameIndex)
+	indexPath := filepath.Join(dir, file.Index)
 	if writeErr := os.WriteFile(
-		indexPath, []byte(indexContent), config.PermFile,
+		indexPath, []byte(indexContent), fs.PermFile,
 	); writeErr != nil {
 		return ctxerr.FileWrite(indexPath, writeErr)
 	}
@@ -61,20 +66,20 @@ func WriteMonthSections(
 	months map[string][]JournalEntry, monthOrder []string, linkPrefix string,
 ) {
-	nl := config.NewlineLF
+	nl := token.NewlineLF
 	for _, month := range monthOrder {
-		fmt.Fprintf(sb, config.TplJournalMonthHeading+nl+nl, month)
+		_, _ = fmt.Fprintf(sb, assets.TplJournalMonthHeading+nl+nl, month)
 		for _, e := range months[month] {
-			link := strings.TrimSuffix(e.Filename, config.ExtMarkdown)
+			link := strings.TrimSuffix(e.Filename, file.ExtMarkdown)
 			timeStr := ""
-			if e.Time != "" && len(e.Time) >= config.JournalTimePrefixLen {
-				timeStr = e.Time[:config.JournalTimePrefixLen] + " "
+			if e.Time != "" && len(e.Time) >= journal.TimePrefixLen {
+				timeStr = e.Time[:journal.TimePrefixLen] + " "
 			}
-			fmt.Fprintf(sb,
-				config.TplJournalSubpageEntry+nl,
+			_, _ = fmt.Fprintf(sb,
+				assets.TplJournalSubpageEntry+nl,
 				timeStr, e.Title, linkPrefix, link)
 			if e.Summary != "" {
-				fmt.Fprintf(sb, config.TplJournalIndexSummary+nl, e.Summary)
+				_, _ = fmt.Fprintf(sb, assets.TplJournalIndexSummary+nl, e.Summary)
 			}
 		}
 		sb.WriteString(nl)
@@ -93,13 +98,13 @@ func WriteMonthSections(
 // - string: Complete Markdown page content
 func GenerateGroupedPage(heading, stats string, entries []JournalEntry) string {
 	var sb strings.Builder
-	nl := config.NewlineLF
+	nl := token.NewlineLF

 	sb.WriteString(heading + nl + nl)
 	sb.WriteString(stats + nl + nl)

 	months, monthOrder := GroupByMonth(entries)
-	WriteMonthSections(&sb, months, monthOrder, config.LinkPrefixParent)
+	WriteMonthSections(&sb, months, monthOrder, token.LinkPrefixParent)

 	return sb.String()
 }
@@ -124,7 +129,7 @@ func WritePopularAndLongtail(
 	ltCount int, ltHeading, ltTpl string, ltItem func(int) (string, JournalEntry),
 ) {
-	nl := config.NewlineLF
+	nl := token.NewlineLF

 	if popCount > 0 {
 		sb.WriteString(popHeading + nl + nl)
@@ -139,8 +144,8 @@ func WritePopularAndLongtail(
 		sb.WriteString(ltHeading + nl + nl)
 		for i := range ltCount {
 			label, e := ltItem(i)
-			link := strings.TrimSuffix(e.Filename, config.ExtMarkdown)
-			fmt.Fprintf(sb, ltTpl+nl, label, e.Title, link)
+			link := strings.TrimSuffix(e.Filename, file.ExtMarkdown)
+			_, _ = fmt.Fprintf(sb, ltTpl+nl, label, e.Title, link)
 		}
 		sb.WriteString(nl)
 	}
@@ -155,5 +160,5 @@ func WritePopularAndLongtail(
 // Returns:
 // - bool: True if the filename matches the multipart continuation pattern
 func ContinuesMultipart(filename string) bool {
-	return config.RegExMultiPart.MatchString(filename)
+	return regex.MultiPart.MatchString(filename)
 }
diff --git a/internal/cli/journal/core/turn.go b/internal/cli/journal/core/turn.go
index 4ff0b3a5..6a3b6963 100644
--- a/internal/cli/journal/core/turn.go
+++ b/internal/cli/journal/core/turn.go
@@ -9,7 +9,8 @@ package core
 import (
 	"strings"

-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/regex"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )

 // ExtractTurnBody extracts the body text from lines[start:] until the next
@@ -31,14 +32,14 @@ func ExtractTurnBody(lines []string, start int) (string, int) {
 	// Collect body until next turn header
 	bodyEnd := bodyStart
 	for bodyEnd < len(lines) {
-		if config.RegExTurnHeader.MatchString(strings.TrimSpace(lines[bodyEnd])) {
+		if regex.TurnHeader.MatchString(strings.TrimSpace(lines[bodyEnd])) {
 			break
 		}
 		bodyEnd++
 	}
 	// Trim trailing blank lines for comparison
 	body := strings.TrimSpace(
-		strings.Join(lines[bodyStart:bodyEnd], config.NewlineLF),
+		strings.Join(lines[bodyStart:bodyEnd], token.NewlineLF),
 	)
 	return body, bodyEnd
 }
@@ -54,13 +55,13 @@ func ExtractTurnBody(lines []string, start int) (string, int) {
 // Returns:
 // - string: Content with consecutive same-role turns merged
 func MergeConsecutiveTurns(content string) string {
-	lines := strings.Split(content, config.NewlineLF)
+	lines := strings.Split(content, token.NewlineLF)
 	var out []string

 	i := 0
 	for i < len(lines) {
 		trimmed := strings.TrimSpace(lines[i])
-		matches := config.RegExTurnHeader.FindStringSubmatch(trimmed)
+		matches := regex.TurnHeader.FindStringSubmatch(trimmed)
 		if matches == nil {
 			out = append(out, lines[i])
 			i++
@@ -77,7 +78,7 @@ func MergeConsecutiveTurns(content string) string {
 		for {
 			// Collect body lines until the next header or EOF
 			for j < len(lines) {
-				if config.RegExTurnHeader.MatchString(strings.TrimSpace(lines[j])) {
+				if regex.TurnHeader.MatchString(strings.TrimSpace(lines[j])) {
 					break
 				}
 				body = append(body, lines[j])
@@ -87,7 +88,7 @@ func MergeConsecutiveTurns(content string) string {
 			if j >= len(lines) {
 				break
 			}
-			nextMatches := config.RegExTurnHeader.FindStringSubmatch(
+			nextMatches := regex.TurnHeader.FindStringSubmatch(
 				strings.TrimSpace(lines[j]),
 			)
 			if nextMatches == nil || nextMatches[2] != role {
@@ -102,5 +103,5 @@ func MergeConsecutiveTurns(content string) string {
 		i = j
 	}

-	return strings.Join(out, config.NewlineLF)
+	return strings.Join(out, token.NewlineLF)
 }
diff --git a/internal/cli/journal/core/types.go b/internal/cli/journal/core/types.go
index 91f17191..d1badced 100644
--- a/internal/cli/journal/core/types.go
+++ b/internal/cli/journal/core/types.go
@@ -71,3 +71,17 @@ type TypeData struct {
 	Name    string
 	Entries []JournalEntry
 }
+
+// ObsidianFrontmatter represents the YAML frontmatter for Obsidian vault
+// entries. Extends JournalFrontmatter with Obsidian-specific fields.
+type ObsidianFrontmatter struct {
+	Title        string   `yaml:"title"`
+	Date         string   `yaml:"date"`
+	Type         string   `yaml:"type,omitempty"`
+	Outcome      string   `yaml:"outcome,omitempty"`
+	Tags         []string `yaml:"tags,omitempty"`
+	Technologies []string `yaml:"technologies,omitempty"`
+	KeyFiles     []string `yaml:"key_files,omitempty"`
+	Aliases      []string `yaml:"aliases,omitempty"`
+	SourceFile   string   `yaml:"source_file,omitempty"`
+}
diff --git a/internal/cli/journal/core/wikilink.go b/internal/cli/journal/core/wikilink.go
index 9d5b143f..0811bb4c 100644
--- a/internal/cli/journal/core/wikilink.go
+++ b/internal/cli/journal/core/wikilink.go
@@ -12,11 +12,13 @@ import (
 	"regexp"
 	"strings"

-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )

 // RegexMarkdownLink matches Markdown links: [display](target)
-var RegexMarkdownLink = regexp.MustCompile(`\[([^\]]+)\]\(([^)]+)\)`)
+var RegexMarkdownLink = regexp.MustCompile(`\[([^]]+)]\(([^)]+)\)`)

 // ConvertMarkdownLinks replaces internal Markdown links with Obsidian
 // wikilinks. External links (http/https) are left unchanged.
@@ -45,7 +47,7 @@ func ConvertMarkdownLinks(content string) string {
 		// Strip path prefix (e.g., "../topics/", "../") and .md extension
 		target = filepath.Base(target)
-		target = strings.TrimSuffix(target, config.ExtMarkdown)
+		target = strings.TrimSuffix(target, file.ExtMarkdown)

 		return FormatWikilink(target, display)
 	})
@@ -64,9 +66,9 @@ func ConvertMarkdownLinks(content string) string {
 // - string: Formatted wikilink
 func FormatWikilink(target, display string) string {
 	if target == display {
-		return fmt.Sprintf(config.ObsidianWikilinkPlain, target)
+		return fmt.Sprintf(assets.ObsidianWikilinkPlain, target)
 	}
-	return fmt.Sprintf(config.ObsidianWikilinkFmt, target, display)
+	return fmt.Sprintf(assets.ObsidianWikilinkFmt, target, display)
 }

 // FormatWikilinkEntry formats a journal entry as a wikilink list item.
@@ -79,14 +81,14 @@ func FormatWikilink(target, display string) string {
 // Returns:
 // - string: Formatted list item with wikilink
 func FormatWikilinkEntry(e JournalEntry) string {
-	link := strings.TrimSuffix(e.Filename, config.ExtMarkdown)
+	link := strings.TrimSuffix(e.Filename, file.ExtMarkdown)

 	var meta []string
 	if e.Type != "" {
-		meta = append(meta, config.Backtick+e.Type+config.Backtick)
+		meta = append(meta, token.Backtick+e.Type+token.Backtick)
 	}
 	if e.Outcome != "" {
-		meta = append(meta, config.Backtick+e.Outcome+config.Backtick)
+		meta = append(meta, token.Backtick+e.Outcome+token.Backtick)
 	}

 	suffix := ""
diff --git a/internal/cli/journal/core/wrap.go b/internal/cli/journal/core/wrap.go
index 8deda618..f0818883 100644
--- a/internal/cli/journal/core/wrap.go
+++ b/internal/cli/journal/core/wrap.go
@@ -9,7 +9,8 @@ package core
 import (
 	"strings"

-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/journal"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )

 // SoftWrapContent wraps long lines in source journal files to ~80 characters.
@@ -23,35 +24,35 @@ import (
 // Returns:
 // - string: Content with long lines soft-wrapped at word boundaries
 func SoftWrapContent(content string) string {
-	lines := strings.Split(content, config.NewlineLF)
+	lines := strings.Split(content, token.NewlineLF)
 	var out []string
 	inFrontmatter := false

 	for i, line := range lines {
 		// Skip frontmatter
-		if i == 0 && strings.TrimSpace(line) == config.Separator {
+		if i == 0 && strings.TrimSpace(line) == token.Separator {
 			inFrontmatter = true
 			out = append(out, line)
 			continue
 		}
 		if inFrontmatter {
 			out = append(out, line)
-			if strings.TrimSpace(line) == config.Separator {
+			if strings.TrimSpace(line) == token.Separator {
 				inFrontmatter = false
 			}
 			continue
 		}

 		// Wrap long lines (skip tables)
-		if len(line) > config.JournalLineWrapWidth &&
+		if len(line) > journal.LineWrapWidth &&
 			!strings.HasPrefix(strings.TrimSpace(line), "|") {
-			out = append(out, SoftWrap(line, config.JournalLineWrapWidth)...)
+			out = append(out, SoftWrap(line, journal.LineWrapWidth)...)
 		} else {
 			out = append(out, line)
 		}
 	}

-	return strings.Join(out, config.NewlineLF)
+	return strings.Join(out, token.NewlineLF)
 }

 // SoftWrap breaks a long line at word boundaries, preserving leading indent.
@@ -63,7 +64,7 @@ func SoftWrapContent(content string) string {
 // Returns:
 // - []string: Wrapped lines preserving the original indentation
 func SoftWrap(line string, width int) []string {
-	trimmed := strings.TrimLeft(line, config.Whitespace)
+	trimmed := strings.TrimLeft(line, token.Whitespace)
 	indent := line[:len(line)-len(trimmed)]

 	words := strings.Fields(trimmed)
diff --git a/internal/cli/journal/journal.go b/internal/cli/journal/journal.go
index 8012d94d..e92bf719 100644
--- a/internal/cli/journal/journal.go
+++ b/internal/cli/journal/journal.go
@@ -22,7 +22,7 @@ import (
 // Returns:
 // - *cobra.Command: The journal command with subcommands
 func Cmd() *cobra.Command {
-	short, long := assets.CommandDesc("journal")
+	short, long := assets.CommandDesc(assets.CmdDescKeyJournal)
 	cmd := &cobra.Command{
 		Use:   "journal",
 		Short: short,
diff --git a/internal/cli/learnings/cmd/reindex/cmd.go b/internal/cli/learnings/cmd/reindex/cmd.go
index ef1c49be..64fbcdb7 100644
--- a/internal/cli/learnings/cmd/reindex/cmd.go
+++ b/internal/cli/learnings/cmd/reindex/cmd.go
@@ -18,11 +18,11 @@ import (
 // Returns:
 // - *cobra.Command: Command for regenerating the LEARNINGS.md index
 func Cmd() *cobra.Command {
-	short, long := assets.CommandDesc("learnings.reindex")
+	short, long := assets.CommandDesc(assets.CmdDescKeyLearningsReindex)
 	return &cobra.Command{
 		Use:   "reindex",
 		Short: short,
 		Long:  long,
-		RunE:  run,
+		RunE:  Run,
 	}
 }
diff --git a/internal/cli/learnings/cmd/reindex/doc.go b/internal/cli/learnings/cmd/reindex/doc.go
new file mode 100644
index 00000000..c2313733
--- /dev/null
+++ b/internal/cli/learnings/cmd/reindex/doc.go
@@ -0,0 +1,10 @@
+//   /    ctx: https://ctx.ist
+// ,'`./  do you remember?
+// `.,'\
+//      \ Copyright 2026-present Context contributors.
+//        SPDX-License-Identifier: Apache-2.0
+
+// Package reindex implements the ctx learnings reindex subcommand.
+//
+// It regenerates the LEARNINGS.md index from individual learning files.
+package reindex
diff --git a/internal/cli/learnings/cmd/reindex/run.go b/internal/cli/learnings/cmd/reindex/run.go
index aacd4c33..db423325 100644
--- a/internal/cli/learnings/cmd/reindex/run.go
+++ b/internal/cli/learnings/cmd/reindex/run.go
@@ -9,14 +9,14 @@ package reindex
 import (
 	"path/filepath"

+	"github.com/ActiveMemory/ctx/internal/config/ctx"
 	"github.com/spf13/cobra"

-	"github.com/ActiveMemory/ctx/internal/config"
 	"github.com/ActiveMemory/ctx/internal/index"
 	"github.com/ActiveMemory/ctx/internal/rc"
 )

-// run regenerates the LEARNINGS.md index.
+// Run regenerates the LEARNINGS.md index.
 //
 // Parameters:
 // - cmd: Cobra command for output messages
@@ -24,13 +24,13 @@ import (
 //
 // Returns:
 // - error: Non-nil if file read/write fails
-func run(cmd *cobra.Command, _ []string) error {
-	filePath := filepath.Join(rc.ContextDir(), config.FileLearning)
+func Run(cmd *cobra.Command, _ []string) error {
+	filePath := filepath.Join(rc.ContextDir(), ctx.Learning)

 	return index.ReindexFile(
 		cmd.OutOrStdout(),
 		filePath,
-		config.FileLearning,
+		ctx.Learning,
 		index.UpdateLearnings,
-		config.EntryPlural[config.EntryLearning],
+		"learnings",
 	)
 }
diff --git a/internal/cli/learnings/learnings.go b/internal/cli/learnings/learnings.go
index 7a6731bb..1a681654 100644
--- a/internal/cli/learnings/learnings.go
+++ b/internal/cli/learnings/learnings.go
@@ -21,7 +21,7 @@ import (
 // Returns:
 // - *cobra.Command: The learnings command with subcommands
 func Cmd() *cobra.Command {
-	short, long := assets.CommandDesc("learnings")
+	short, long := assets.CommandDesc(assets.CmdDescKeyLearnings)
 	cmd := &cobra.Command{
 		Use:   "learnings",
 		Short: short,
diff --git a/internal/cli/learnings/learnings_test.go b/internal/cli/learnings/learnings_test.go
index 093b03ac..25815e1d 100644
--- a/internal/cli/learnings/learnings_test.go
+++ b/internal/cli/learnings/learnings_test.go
@@ -11,7 +11,8 @@ import (
 	"path/filepath"
 	"testing"

-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/ctx"
+	"github.com/ActiveMemory/ctx/internal/config/dir"
 	"github.com/ActiveMemory/ctx/internal/rc"
 )

@@ -85,7 +86,7 @@ func TestRunReindex_WithFile(t *testing.T) {
 	defer rc.Reset()

 	// Create the context directory and LEARNINGS.md file
-	ctxDir := filepath.Join(tempDir, config.DirContext)
+	ctxDir := filepath.Join(tempDir, dir.Context)
 	_ = os.MkdirAll(ctxDir, 0750)
 	content := `# Learnings

@@ -96,7 +97,7 @@ func TestRunReindex_WithFile(t *testing.T) {
 **Lesson:** Validate at boundaries
 **Application:** Add validation to all handlers
 `
-	_ = os.WriteFile(filepath.Join(ctxDir, config.FileLearning), []byte(content), 0600)
+	_ = os.WriteFile(filepath.Join(ctxDir, ctx.Learning), []byte(content), 0600)

 	cmd := Cmd()
 	cmd.SetArgs([]string{"reindex"})
@@ -107,7 +108,7 @@ func TestRunReindex_WithFile(t *testing.T) {
 	}

 	// Verify the file was updated
-	updated, err := os.ReadFile(filepath.Join(ctxDir, config.FileLearning)) //nolint:gosec // test temp path
+	updated, err := os.ReadFile(filepath.Join(ctxDir, ctx.Learning)) //nolint:gosec // test temp path
 	if err != nil {
 		t.Fatalf("failed to read updated file: %v", err)
 	}
@@ -126,9 +127,9 @@ func TestRunReindex_EmptyFile(t *testing.T) {
 	defer rc.Reset()

 	// Create the context directory and empty LEARNINGS.md
-	ctxDir := filepath.Join(tempDir, config.DirContext)
+	ctxDir := filepath.Join(tempDir, dir.Context)
 	_ = os.MkdirAll(ctxDir, 0750)
-	_ = os.WriteFile(filepath.Join(ctxDir, config.FileLearning), []byte("# Learnings\n"), 0600)
+	_ = os.WriteFile(filepath.Join(ctxDir, ctx.Learning), []byte("# Learnings\n"), 0600)

 	cmd := Cmd()
 	cmd.SetArgs([]string{"reindex"})
diff --git a/internal/cli/load/cmd/root/cmd.go b/internal/cli/load/cmd/root/cmd.go
index d37451a9..dac67658 100644
--- a/internal/cli/load/cmd/root/cmd.go
+++ b/internal/cli/load/cmd/root/cmd.go
@@ -5,3 +5,53 @@
 // SPDX-License-Identifier: Apache-2.0

 package root
+
+import (
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/rc"
+)
+
+// Cmd returns the "ctx load" command for outputting assembled context.
+//
+// The command loads context files from .context/ and outputs them in the
+// recommended read order, suitable for providing to an AI assistant.
+//
+// Flags:
+// - --budget: Token budget for assembly (default 8000)
+// - --raw: Output raw file contents without headers or assembly
+//
+// Returns:
+// - *cobra.Command: Configured load command with flags registered
+func Cmd() *cobra.Command {
+	var (
+		budget int
+		raw    bool
+	)
+
+	short, long := assets.CommandDesc(assets.CmdDescKeyLoad)
+	cmd := &cobra.Command{
+		Use:   "load",
+		Short: short,
+		Long:  long,
+		RunE: func(cmd *cobra.Command, args []string) error {
+			// Use configured budget if flag not explicitly set
+			if !cmd.Flags().Changed("budget") {
+				budget = rc.TokenBudget()
+			}
+			return Run(cmd, budget, raw)
+		},
+	}
+
+	cmd.Flags().IntVar(
+		&budget, "budget",
+		rc.DefaultTokenBudget,
+		assets.FlagDesc(assets.FlagDescKeyLoadBudget),
+	)
+	cmd.Flags().BoolVar(
+		&raw, "raw", false, assets.FlagDesc(assets.FlagDescKeyLoadRaw),
+	)
+
+	return cmd
+}
diff --git a/internal/cli/load/cmd/root/doc.go b/internal/cli/load/cmd/root/doc.go
new file mode 100644
index 00000000..266ea02a
--- /dev/null
+++ b/internal/cli/load/cmd/root/doc.go
@@ -0,0 +1,11 @@
+//   /    ctx: https://ctx.ist
+// ,'`./  do you remember?
+// `.,'\
+//      \ Copyright 2026-present Context contributors.
+//        SPDX-License-Identifier: Apache-2.0
+
+// Package root implements the ctx load command.
+//
+// It outputs assembled context Markdown from .context/ files in the
+// recommended read order within a configurable token budget.
+package root diff --git a/internal/cli/load/cmd/root/run.go b/internal/cli/load/cmd/root/run.go index 9e73d57f..94cb8cbd 100644 --- a/internal/cli/load/cmd/root/run.go +++ b/internal/cli/load/cmd/root/run.go @@ -8,12 +8,13 @@ package root import ( "errors" - "fmt" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/load/core" "github.com/ActiveMemory/ctx/internal/context" + ctxerr "github.com/ActiveMemory/ctx/internal/err" + "github.com/ActiveMemory/ctx/internal/write" ) // Run executes the load command logic. @@ -33,14 +34,18 @@ func Run(cmd *cobra.Command, budget int, raw bool) error { if err != nil { var notFoundError *context.NotFoundError if errors.As(err, ¬FoundError) { - return fmt.Errorf("no .context/ directory found. Run 'ctx init' first") + return ctxerr.NotInitialized() } return err } + files := core.SortByReadOrder(ctx.Files) + if raw { - return core.OutputRaw(cmd, ctx) + return write.LoadRaw(cmd, files) } - return core.OutputAssembled(cmd, ctx, budget) + return write.LoadAssembled( + cmd, files, budget, ctx.TotalTokens, core.FileNameToTitle, + ) } diff --git a/internal/cli/load/core/convert.go b/internal/cli/load/core/convert.go index 4a6d04d5..8a638b75 100644 --- a/internal/cli/load/core/convert.go +++ b/internal/cli/load/core/convert.go @@ -9,7 +9,7 @@ package core import ( "strings" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/file" ) // FileNameToTitle converts a context file name to a human-readable title. 
@@ -25,7 +25,7 @@ import ( // - string: Title case representation of the file name func FileNameToTitle(name string) string { // Remove .md extension - name = strings.TrimSuffix(name, config.ExtMarkdown) + name = strings.TrimSuffix(name, file.ExtMarkdown) // Convert SCREAMING_SNAKE to Title Case name = strings.ReplaceAll(name, "_", " ") // Title case each word diff --git a/internal/cli/load/core/sort.go b/internal/cli/load/core/sort.go index f33f682c..a20c0a18 100644 --- a/internal/cli/load/core/sort.go +++ b/internal/cli/load/core/sort.go @@ -9,11 +9,11 @@ package core import ( "sort" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/ctx" "github.com/ActiveMemory/ctx/internal/context" ) -// SortByReadOrder sorts context files according to [config.FileReadOrder]. +// SortByReadOrder sorts context files according to [ctx.ReadOrder]. // // Files not in the read-order list are assigned a low priority (100) and // will appear at the end. The original slice is not modified; a new sorted @@ -27,7 +27,7 @@ import ( func SortByReadOrder(files []context.FileInfo) []context.FileInfo { // Create a map for a quick priority lookup priority := make(map[string]int) - for i, name := range config.FileReadOrder { + for i, name := range ctx.ReadOrder { priority[name] = i } diff --git a/internal/cli/load/load.go b/internal/cli/load/load.go index 67d82dd2..ec3c81cd 100644 --- a/internal/cli/load/load.go +++ b/internal/cli/load/load.go @@ -9,48 +9,10 @@ package load import ( "github.com/spf13/cobra" - "github.com/ActiveMemory/ctx/internal/assets" loadroot "github.com/ActiveMemory/ctx/internal/cli/load/cmd/root" - "github.com/ActiveMemory/ctx/internal/rc" ) // Cmd returns the "ctx load" command for outputting assembled context. -// -// The command loads context files from .context/ and outputs them in the -// recommended read order, suitable for providing to an AI assistant.
-// -// Flags: -// - --budget: Token budget for assembly (default 8000) -// - --raw: Output raw file contents without headers or assembly -// -// Returns: -// - *cobra.Command: Configured load command with flags registered func Cmd() *cobra.Command { - var ( - budget int - raw bool - ) - - short, long := assets.CommandDesc("load") - cmd := &cobra.Command{ - Use: "load", - Short: short, - Long: long, - RunE: func(cmd *cobra.Command, args []string) error { - // Use configured budget if flag not explicitly set - if !cmd.Flags().Changed("budget") { - budget = rc.TokenBudget() - } - return loadroot.Run(cmd, budget, raw) - }, - } - - cmd.Flags().IntVar( - &budget, "budget", rc.DefaultTokenBudget, assets.FlagDesc("load.budget"), - ) - cmd.Flags().BoolVar( - &raw, "raw", false, assets.FlagDesc("load.raw"), - ) - - return cmd + return loadroot.Cmd() } diff --git a/internal/cli/loop/cmd/root/cmd.go b/internal/cli/loop/cmd/root/cmd.go index d37451a9..710aa618 100644 --- a/internal/cli/loop/cmd/root/cmd.go +++ b/internal/cli/loop/cmd/root/cmd.go @@ -5,3 +5,73 @@ // SPDX-License-Identifier: Apache-2.0 package root + +import ( + "github.com/ActiveMemory/ctx/internal/config/loop" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the "ctx loop" command for generating Ralph loop scripts. +// +// The command generates a shell script that runs an AI assistant in a loop +// until a completion signal is detected, enabling iterative development +// where the AI builds on previous work. 
+// +// Flags: +// - --prompt, -p: Prompt file to use (default "PROMPT.md") +// - --tool, -t: AI tool - claude, aider, or generic (default "claude") +// - --max-iterations, -n: Maximum iterations, 0 for unlimited (default 0) +// - --completion, -c: Completion signal to detect +// (default "SYSTEM_CONVERGED") +// - --output, -o: Output script filename (default "loop.sh") +// +// Returns: +// - *cobra.Command: Configured loop command with flags registered +func Cmd() *cobra.Command { + var ( + promptFile string + tool string + maxIterations int + completionMsg string + outputFile string + ) + + short, long := assets.CommandDesc(assets.CmdDescKeyLoop) + cmd := &cobra.Command{ + Use: "loop", + Short: short, + Long: long, + RunE: func(cmd *cobra.Command, args []string) error { + return Run( + cmd, promptFile, tool, maxIterations, completionMsg, outputFile, + ) + }, + } + + cmd.Flags().StringVarP(&promptFile, + "prompt", "p", + loop.PromptMd, assets.FlagDesc(assets.FlagDescKeyLoopPrompt), + ) + cmd.Flags().StringVarP( + &tool, "tool", "t", "claude", assets.FlagDesc(assets.FlagDescKeyLoopTool), + ) + cmd.Flags().IntVarP( + &maxIterations, + "max-iterations", "n", + 0, assets.FlagDesc(assets.FlagDescKeyLoopMaxIterations), + ) + cmd.Flags().StringVarP( + &completionMsg, + "completion", "c", loop.DefaultCompletionSignal, + assets.FlagDesc(assets.FlagDescKeyLoopCompletion), + ) + cmd.Flags().StringVarP( + &outputFile, + "output", "o", + "loop.sh", assets.FlagDesc(assets.FlagDescKeyLoopOutput), + ) + + return cmd +} diff --git a/internal/cli/loop/cmd/root/doc.go b/internal/cli/loop/cmd/root/doc.go new file mode 100644 index 00000000..6947e581 --- /dev/null +++ b/internal/cli/loop/cmd/root/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package root implements the ctx loop command. 
+// +// It generates a Ralph loop shell script that runs an AI assistant +// iteratively until a completion signal is detected. +package root diff --git a/internal/cli/loop/cmd/root/run.go b/internal/cli/loop/cmd/root/run.go index a37e07cc..8f63d97b 100644 --- a/internal/cli/loop/cmd/root/run.go +++ b/internal/cli/loop/cmd/root/run.go @@ -7,12 +7,14 @@ package root import ( - "fmt" "os" + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/config/fs" "github.com/spf13/cobra" - "github.com/ActiveMemory/ctx/internal/config" + ctxerr "github.com/ActiveMemory/ctx/internal/err" + "github.com/ActiveMemory/ctx/internal/write" ) // Run executes the loop command logic. @@ -36,37 +38,23 @@ func Run( maxIterations int, completionMsg, outputFile string, ) error { - // Validate tool validTools := map[string]bool{"claude": true, "aider": true, "generic": true} if !validTools[tool] { - return fmt.Errorf( - "invalid tool %q: must be claude, aider, or generic", tool, - ) + return ctxerr.InvalidTool(tool) } - // Generate the script script := GenerateLoopScript(promptFile, tool, maxIterations, completionMsg) - // Write to the file - if err := os.WriteFile( - outputFile, []byte(script), config.PermExec, - ); err != nil { - return fmt.Errorf("failed to write %s: %w", outputFile, err) + if writeErr := os.WriteFile( + outputFile, []byte(script), fs.PermExec, + ); writeErr != nil { + return ctxerr.FileWrite(outputFile, writeErr) } - cmd.Println(fmt.Sprintf("✓ Generated %s", outputFile)) - cmd.Println() - cmd.Println(config.LoopHeadingStart) - cmd.Println(fmt.Sprintf(" ./%s", outputFile)) - cmd.Println() - cmd.Println(fmt.Sprintf("Tool: %s", tool)) - cmd.Println(fmt.Sprintf("Prompt: %s", promptFile)) - if maxIterations > 0 { - cmd.Println(fmt.Sprintf("Max iterations: %d", maxIterations)) - } else { - cmd.Println("Max iterations: unlimited") - } - cmd.Println(fmt.Sprintf("Completion signal: %s", completionMsg)) + write.InfoLoopGenerated( + cmd, outputFile, 
assets.LoopHeadingStart, + tool, promptFile, maxIterations, completionMsg, + ) return nil } diff --git a/internal/cli/loop/cmd/root/script.go b/internal/cli/loop/cmd/root/script.go index a78f45d4..0ae3aa14 100644 --- a/internal/cli/loop/cmd/root/script.go +++ b/internal/cli/loop/cmd/root/script.go @@ -10,7 +10,7 @@ import ( "fmt" "path/filepath" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/assets" ) // GenerateLoopScript creates a bash script for running a Ralph loop. @@ -46,11 +46,11 @@ func GenerateLoopScript( maxIterCheck := "" if maxIterations > 0 { maxIterCheck = fmt.Sprintf( - config.TplLoopMaxIter, maxIterations, maxIterations, config.TplLoopNotify) + assets.TplLoopMaxIter, maxIterations, maxIterations, assets.TplLoopNotify) } - script := fmt.Sprintf(config.TplLoopScript, - absPrompt, completionMsg, maxIterCheck, aiCommand, config.LoopComplete, config.TplLoopNotify) + script := fmt.Sprintf(assets.TplLoopScript, + absPrompt, completionMsg, maxIterCheck, aiCommand, assets.LoopComplete, assets.TplLoopNotify) return script } diff --git a/internal/cli/loop/loop.go b/internal/cli/loop/loop.go index 66ff41d9..eb5669f5 100644 --- a/internal/cli/loop/loop.go +++ b/internal/cli/loop/loop.go @@ -9,68 +9,10 @@ package loop import ( "github.com/spf13/cobra" - "github.com/ActiveMemory/ctx/internal/assets" looproot "github.com/ActiveMemory/ctx/internal/cli/loop/cmd/root" - "github.com/ActiveMemory/ctx/internal/config" ) // Cmd returns the "ctx loop" command for generating Ralph loop scripts. -// -// The command generates a shell script that runs an AI assistant in a loop -// until a completion signal is detected, enabling iterative development -// where the AI builds on previous work. 
-// -// Flags: -// - --prompt, -p: Prompt file to use (default "PROMPT.md") -// - --tool, -t: AI tool - claude, aider, or generic (default "claude") -// - --max-iterations, -n: Maximum iterations, 0 for unlimited (default 0) -// - --completion, -c: Completion signal to detect -// (default "SYSTEM_CONVERGED") -// - --output, -o: Output script filename (default "loop.sh") -// -// Returns: -// - *cobra.Command: Configured loop command with flags registered func Cmd() *cobra.Command { - var ( - promptFile string - tool string - maxIterations int - completionMsg string - outputFile string - ) - - short, long := assets.CommandDesc("loop") - cmd := &cobra.Command{ - Use: "loop", - Short: short, - Long: long, - RunE: func(cmd *cobra.Command, args []string) error { - return looproot.Run( - cmd, promptFile, tool, maxIterations, completionMsg, outputFile, - ) - }, - } - - cmd.Flags().StringVarP(&promptFile, - "prompt", "p", config.FilePromptMd, assets.FlagDesc("loop.prompt"), - ) - cmd.Flags().StringVarP( - &tool, "tool", "t", "claude", assets.FlagDesc("loop.tool"), - ) - cmd.Flags().IntVarP( - &maxIterations, - "max-iterations", "n", - 0, assets.FlagDesc("loop.max-iterations"), - ) - cmd.Flags().StringVarP( - &completionMsg, - "completion", "c", "SYSTEM_CONVERGED", assets.FlagDesc("loop.completion"), - ) - cmd.Flags().StringVarP( - &outputFile, - "output", "o", - "loop.sh", assets.FlagDesc("loop.output"), - ) - - return cmd + return looproot.Cmd() } diff --git a/internal/cli/mcp/cmd/root/cmd.go b/internal/cli/mcp/cmd/root/cmd.go new file mode 100644 index 00000000..76e1d861 --- /dev/null +++ b/internal/cli/mcp/cmd/root/cmd.go @@ -0,0 +1,18 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +package root + +import ( + internalmcp "github.com/ActiveMemory/ctx/internal/mcp" + "github.com/ActiveMemory/ctx/internal/rc" + "github.com/spf13/cobra" +) + +func Run(cmd *cobra.Command, _ []string) error { + srv := internalmcp.NewServer(rc.ContextDir(), cmd.Root().Version) + return srv.Serve() +} diff --git a/internal/cli/mcp/mcp.go b/internal/cli/mcp/mcp.go index a31ef68e..9efa90ec 100644 --- a/internal/cli/mcp/mcp.go +++ b/internal/cli/mcp/mcp.go @@ -8,17 +8,16 @@ package mcp import ( + "github.com/ActiveMemory/ctx/internal/config/cli" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/assets" - "github.com/ActiveMemory/ctx/internal/config" - internalmcp "github.com/ActiveMemory/ctx/internal/mcp" - "github.com/ActiveMemory/ctx/internal/rc" + "github.com/ActiveMemory/ctx/internal/cli/mcp/cmd/root" ) // Cmd returns the mcp command group. func Cmd() *cobra.Command { - short, long := assets.CommandDesc("mcp") + short, long := assets.CommandDesc(assets.CmdDescKeyMcp) cmd := &cobra.Command{ Use: "mcp", Short: short, @@ -32,15 +31,13 @@ func Cmd() *cobra.Command { // serveCmd returns the mcp serve subcommand. func serveCmd() *cobra.Command { + serveShort, serveLong := assets.CommandDesc(assets.CmdDescKeyMcpServe) return &cobra.Command{ Use: "serve", - Short: "Start the MCP server (stdin/stdout)", - Long: "Start the MCP server, communicating via JSON-RPC 2.0 over stdin/stdout.\n\nThis command is intended to be invoked by MCP clients (AI tools), not\nrun directly by users. Configure your AI tool to run 'ctx mcp serve'\nas an MCP server.", - Annotations: map[string]string{config.AnnotationSkipInit: "true"}, + Short: serveShort, + Long: serveLong, + Annotations: map[string]string{cli.AnnotationSkipInit: cli.AnnotationTrue}, SilenceUsage: true, - RunE: func(_ *cobra.Command, _ []string) error { - srv := internalmcp.NewServer(rc.ContextDir()) - return srv.Serve() - }, + RunE: root.Run, } } diff --git a/internal/cli/memory/cmd/diff/cmd.go b/internal/cli/memory/cmd/diff/cmd.go new file mode 100644 index 00000000..b42e6151 --- /dev/null +++ b/internal/cli/memory/cmd/diff/cmd.go @@ -0,0 +1,29 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package diff + +import ( + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the memory diff subcommand. +// +// Returns: +// - *cobra.Command: command for showing memory diff. +func Cmd() *cobra.Command { + short, long := assets.CommandDesc(assets.CmdDescKeyMemoryDiff) + return &cobra.Command{ + Use: "diff", + Short: short, + Long: long, + RunE: func(cmd *cobra.Command, _ []string) error { + return Run(cmd) + }, + } +} diff --git a/internal/cli/memory/cmd/diff/doc.go b/internal/cli/memory/cmd/diff/doc.go new file mode 100644 index 00000000..ef9669a4 --- /dev/null +++ b/internal/cli/memory/cmd/diff/doc.go @@ -0,0 +1,10 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package diff implements the ctx memory diff subcommand. +// +// It shows line-level differences between the mirror and current MEMORY.md.
+package diff diff --git a/internal/cli/memory/diff.go b/internal/cli/memory/cmd/diff/run.go similarity index 54% rename from internal/cli/memory/diff.go rename to internal/cli/memory/cmd/diff/run.go index 1dfa8bd3..08e9f7a1 100644 --- a/internal/cli/memory/diff.go +++ b/internal/cli/memory/cmd/diff/run.go @@ -4,46 +4,43 @@ // \ Copyright 2026-present Context contributors. // SPDX-License-Identifier: Apache-2.0 -package memory +package diff import ( - "fmt" "path/filepath" "github.com/spf13/cobra" + ctxerr "github.com/ActiveMemory/ctx/internal/err" mem "github.com/ActiveMemory/ctx/internal/memory" "github.com/ActiveMemory/ctx/internal/rc" + "github.com/ActiveMemory/ctx/internal/write" ) -func diffCmd() *cobra.Command { - return &cobra.Command{ - Use: "diff", - Short: "Show what changed since last sync", - Long: `Show a line-based diff between .context/memory/mirror.md and the -current MEMORY.md. No output when files are identical.`, - RunE: func(cmd *cobra.Command, _ []string) error { - return runDiff(cmd) - }, - } -} - -func runDiff(cmd *cobra.Command) error { +// Run computes and prints a line-based diff between the mirror and +// current MEMORY.md. +// +// Parameters: +// - cmd: Cobra command for output routing. +// +// Returns: +// - error: on discovery or diff failure. 
+func Run(cmd *cobra.Command) error { contextDir := rc.ContextDir() projectRoot := filepath.Dir(contextDir) sourcePath, discoverErr := mem.DiscoverMemoryPath(projectRoot) if discoverErr != nil { - return fmt.Errorf("MEMORY.md not found: %w", discoverErr) + return ctxerr.MemoryDiscoverFailed(discoverErr) } diff, diffErr := mem.Diff(contextDir, sourcePath) if diffErr != nil { - return fmt.Errorf("computing diff: %w", diffErr) + return ctxerr.MemoryDiffFailed(diffErr) } if diff == "" { - cmd.Println("No changes since last sync.") + write.MemoryNoChanges(cmd) return nil } diff --git a/internal/cli/memory/cmd/importer/cmd.go b/internal/cli/memory/cmd/importer/cmd.go new file mode 100644 index 00000000..78ea15c0 --- /dev/null +++ b/internal/cli/memory/cmd/importer/cmd.go @@ -0,0 +1,38 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package importer + +import ( + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the memory import subcommand. +// +// Returns: +// - *cobra.Command: command for importing MEMORY.md entries into .context/ files. +func Cmd() *cobra.Command { + var dryRun bool + + short, long := assets.CommandDesc(assets.CmdDescKeyMemoryImport) + cmd := &cobra.Command{ + Use: "import", + Short: short, + Long: long, + RunE: func(cmd *cobra.Command, _ []string) error { + return Run(cmd, dryRun) + }, + } + + cmd.Flags().BoolVar( + &dryRun, "dry-run", false, + assets.FlagDesc(assets.FlagDescKeyMemoryImportDryRun), + ) + + return cmd +} diff --git a/internal/cli/memory/cmd/importer/doc.go b/internal/cli/memory/cmd/importer/doc.go new file mode 100644 index 00000000..09a1f902 --- /dev/null +++ b/internal/cli/memory/cmd/importer/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +// Package importer implements the ctx memory import subcommand. +// +// It classifies and promotes MEMORY.md entries into structured +// .context/ files using heuristic keyword matching. +package importer diff --git a/internal/cli/memory/cmd/importer/run.go b/internal/cli/memory/cmd/importer/run.go new file mode 100644 index 00000000..ae3b7ee1 --- /dev/null +++ b/internal/cli/memory/cmd/importer/run.go @@ -0,0 +1,127 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package importer + +import ( + "path/filepath" + + "github.com/ActiveMemory/ctx/internal/config/entry" + memcfg "github.com/ActiveMemory/ctx/internal/config/memory" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/cli/memory/core" + ctxerr "github.com/ActiveMemory/ctx/internal/err" + "github.com/ActiveMemory/ctx/internal/memory" + "github.com/ActiveMemory/ctx/internal/rc" + "github.com/ActiveMemory/ctx/internal/validation" + "github.com/ActiveMemory/ctx/internal/write" +) + +// Run parses MEMORY.md entries, classifies them by heuristic keyword +// matching, deduplicates against prior imports, and promotes new entries +// into the appropriate .context/ files. +// +// Parameters: +// - cmd: Cobra command for output routing. +// - dryRun: when true, show the classification plan without writing. +// +// Returns: +// - error: on discovery, read, state, or promotion failure. +func Run(cmd *cobra.Command, dryRun bool) error { + contextDir := rc.ContextDir() + projectRoot := filepath.Dir(contextDir) + + sourcePath, discoverErr := memory.DiscoverMemoryPath(projectRoot) + if discoverErr != nil { + write.ErrAutoMemoryNotActive(cmd, discoverErr) + return ctxerr.MemoryNotFound() + } + + sourceData, readErr := validation.SafeReadFile( + filepath.Dir(sourcePath), filepath.Base(sourcePath), + ) + if readErr != nil { + return ctxerr.ReadMemory(readErr) + } + + entries := memory.ParseEntries(string(sourceData)) + if len(entries) == 0 { + write.ImportNoEntries(cmd, memcfg.MemorySource) + return nil + } + + state, loadErr := memory.LoadState(contextDir) + if loadErr != nil { + return ctxerr.LoadState(loadErr) + } + + write.ImportScanHeader(cmd, memcfg.MemorySource, len(entries)) + + var result core.ImportResult + + for _, e := range entries { + hash := memory.EntryHash(e.Text) + + if state.Imported(hash) { + result.Dupes++ + continue + } + + classification := memory.Classify(e) + title := core.Truncate(e.Text, 60) + + if classification.Target == memory.TargetSkip { + result.Skipped++ + if dryRun { + write.ImportEntrySkipped(cmd, title) + } + continue + } + + targetFile := entry.ToCtxFile[classification.Target] + + if dryRun { + write.ImportEntryClassified(cmd, title, targetFile, classification.Keywords) + } else { + if promoteErr := memory.Promote(e, classification); promoteErr != nil { + write.ErrImportPromote(cmd, targetFile, promoteErr) + continue + } + state.MarkImported(hash, classification.Target) + write.ImportEntryAdded(cmd, title, targetFile) + } + + switch classification.Target { + case entry.Convention: + result.Conventions++ + case entry.Decision: + result.Decisions++ + case entry.Learning: + result.Learnings++ + case entry.Task: + result.Tasks++ + } + } + + write.ImportSummary(cmd, write.ImportCounts{ + Conventions: result.Conventions, + Decisions: result.Decisions, + Learnings: result.Learnings, + Tasks: result.Tasks, + Skipped:
result.Skipped, + Dupes: result.Dupes, + }, dryRun) + + if !dryRun && result.Total() > 0 { + state.MarkImportedDone() + if saveErr := memory.SaveState(contextDir, state); saveErr != nil { + return ctxerr.SaveState(saveErr) + } + } + + return nil +} diff --git a/internal/cli/memory/cmd/publish/cmd.go b/internal/cli/memory/cmd/publish/cmd.go new file mode 100644 index 00000000..089faecc --- /dev/null +++ b/internal/cli/memory/cmd/publish/cmd.go @@ -0,0 +1,44 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package publish + +import ( + "github.com/ActiveMemory/ctx/internal/config/memory" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the memory publish subcommand. +// +// Returns: +// - *cobra.Command: command for publishing curated context to MEMORY.md. +func Cmd() *cobra.Command { + var budget int + var dryRun bool + + short, long := assets.CommandDesc(assets.CmdDescKeyMemoryPublish) + cmd := &cobra.Command{ + Use: "publish", + Short: short, + Long: long, + RunE: func(cmd *cobra.Command, _ []string) error { + return Run(cmd, budget, dryRun) + }, + } + + cmd.Flags().IntVar(&budget, + "budget", memory.DefaultPublishBudget, + assets.FlagDesc(assets.FlagDescKeyMemoryPublishBudget), + ) + cmd.Flags().BoolVar(&dryRun, + "dry-run", false, + assets.FlagDesc(assets.FlagDescKeyMemoryPublishDryRun), + ) + + return cmd +} diff --git a/internal/cli/memory/cmd/publish/doc.go b/internal/cli/memory/cmd/publish/doc.go new file mode 100644 index 00000000..af71aa23 --- /dev/null +++ b/internal/cli/memory/cmd/publish/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package publish implements the ctx memory publish subcommand. 
+// +// It pushes curated .context/ content into MEMORY.md so the agent +// sees structured project context on session start. +package publish diff --git a/internal/cli/memory/cmd/publish/run.go b/internal/cli/memory/cmd/publish/run.go new file mode 100644 index 00000000..24c6ed99 --- /dev/null +++ b/internal/cli/memory/cmd/publish/run.go @@ -0,0 +1,63 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package publish + +import ( + "path/filepath" + + "github.com/spf13/cobra" + + ctxerr "github.com/ActiveMemory/ctx/internal/err" + mem "github.com/ActiveMemory/ctx/internal/memory" + "github.com/ActiveMemory/ctx/internal/rc" + "github.com/ActiveMemory/ctx/internal/write" +) + +// Run selects high-value context, formats it, and writes a marked block +// into MEMORY.md. In dry-run mode it reports what would be published. +// +// Parameters: +// - cmd: Cobra command for output routing. +// - budget: maximum line count for the published block. +// - dryRun: when true, show the plan without writing. +// +// Returns: +// - error: on discovery, selection, or publish failure. 
+func Run(cmd *cobra.Command, budget int, dryRun bool) error { + contextDir := rc.ContextDir() + projectRoot := filepath.Dir(contextDir) + + memoryPath, discoverErr := mem.DiscoverMemoryPath(projectRoot) + if discoverErr != nil { + write.ErrAutoMemoryNotActive(cmd, discoverErr) + return ctxerr.MemoryNotFound() + } + + result, selectErr := mem.SelectContent(contextDir, budget) + if selectErr != nil { + return ctxerr.SelectContentFailed(selectErr) + } + + write.PublishPlan(cmd, budget, + len(result.Tasks), len(result.Decisions), + len(result.Conventions), len(result.Learnings), + result.TotalLines, + ) + + if dryRun { + write.PublishDryRun(cmd) + return nil + } + + if _, publishErr := mem.Publish(contextDir, memoryPath, budget); publishErr != nil { + return ctxerr.PublishFailed(publishErr) + } + + write.PublishDone(cmd) + + return nil +} diff --git a/internal/cli/memory/cmd/status/cmd.go b/internal/cli/memory/cmd/status/cmd.go index b3df96d1..71cd8f44 100644 --- a/internal/cli/memory/cmd/status/cmd.go +++ b/internal/cli/memory/cmd/status/cmd.go @@ -18,13 +18,13 @@ import ( // Returns: // - *cobra.Command: command for showing memory bridge status. func Cmd() *cobra.Command { - short, long := assets.CommandDesc("memory.status") + short, long := assets.CommandDesc(assets.CmdDescKeyMemoryStatus) return &cobra.Command{ Use: "status", Short: short, Long: long, RunE: func(cmd *cobra.Command, _ []string) error { - return runStatus(cmd) + return Run(cmd) }, } } diff --git a/internal/cli/memory/cmd/status/doc.go b/internal/cli/memory/cmd/status/doc.go new file mode 100644 index 00000000..c23a2ed2 --- /dev/null +++ b/internal/cli/memory/cmd/status/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package status implements the ctx memory status subcommand. 
+// +// It displays the current state of the memory bridge, showing sync +// status between MEMORY.md and its mirror. +package status diff --git a/internal/cli/memory/cmd/status/run.go b/internal/cli/memory/cmd/status/run.go index 18e32aef..319139aa 100644 --- a/internal/cli/memory/cmd/status/run.go +++ b/internal/cli/memory/cmd/status/run.go @@ -7,20 +7,24 @@ package status import ( - "fmt" "os" "path/filepath" "time" + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/config/memory" + timecfg "github.com/ActiveMemory/ctx/internal/config/time" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/memory/core" - "github.com/ActiveMemory/ctx/internal/config" + ctxerr "github.com/ActiveMemory/ctx/internal/err" mem "github.com/ActiveMemory/ctx/internal/memory" "github.com/ActiveMemory/ctx/internal/rc" + "github.com/ActiveMemory/ctx/internal/validation" + "github.com/ActiveMemory/ctx/internal/write" ) -// runStatus prints memory bridge status including source location, +// Run prints memory bridge status including source location, // last sync time, line counts, drift indicator, and archive count. // // Parameters: @@ -28,62 +32,62 @@ import ( // // Returns: // - error: on discovery failure. -func runStatus(cmd *cobra.Command) error { +func Run(cmd *cobra.Command) error { contextDir := rc.ContextDir() projectRoot := filepath.Dir(contextDir) sourcePath, discoverErr := mem.DiscoverMemoryPath(projectRoot) if discoverErr != nil { - cmd.Println("Memory Bridge Status") - cmd.Println(" Source: auto memory not active (MEMORY.md not found)") - return fmt.Errorf("MEMORY.md not found") + write.MemoryBridgeHeader(cmd) + write.MemorySourceNotActive(cmd) + return ctxerr.MemoryNotFound() } - mirrorPath := filepath.Join(contextDir, config.DirMemory, config.FileMemoryMirror) - - cmd.Println("Memory Bridge Status") - cmd.Println(fmt.Sprintf(" Source: %s", sourcePath)) - cmd.Println(fmt.Sprintf(" Mirror: .context/%s/%s", config.DirMemory, config.FileMemoryMirror)) + write.MemoryBridgeHeader(cmd) + write.MemorySource(cmd, sourcePath) + write.MemoryMirror(cmd, memory.PathMemoryMirror) // Last sync time state, _ := mem.LoadState(contextDir) if state.LastSync != nil { ago := time.Since(*state.LastSync).Truncate(time.Minute) - cmd.Println(fmt.Sprintf(" Last sync: %s (%s ago)", - state.LastSync.Local().Format("2006-01-02 15:04"), core.FormatDuration(ago))) + write.MemoryLastSync(cmd, + state.LastSync.Local().Format(timecfg.DateTimeFormat), + core.FormatDuration(ago)) } else { - cmd.Println(" Last sync: never") + write.MemoryLastSyncNever(cmd) } cmd.Println() // Source line count - if sourceData, readErr := os.ReadFile(sourcePath); readErr == nil { //nolint:gosec // discovered path - line := fmt.Sprintf(" MEMORY.md: %d lines", core.CountFileLines(sourceData)) - if mem.HasDrift(contextDir, sourcePath) { - line += " (modified since last sync)" - } - cmd.Println(line) + hasDrift := mem.HasDrift(contextDir, sourcePath) + if sourceData, readErr := validation.SafeReadFile( + filepath.Dir(sourcePath), filepath.Base(sourcePath), + ); readErr == nil { + write.MemorySourceLines(cmd, core.CountFileLines(sourceData), hasDrift) } // Mirror line count - if mirrorData, readErr :=
os.ReadFile(mirrorPath); readErr == nil { //nolint:gosec // project-local path - cmd.Println(fmt.Sprintf(" Mirror: %d lines", core.CountFileLines(mirrorData))) + memoryDir := filepath.Join(contextDir, dir.Memory) + if mirrorData, readErr := validation.SafeReadFile( + memoryDir, memory.MemoryMirror, + ); readErr == nil { + write.MemoryMirrorLines(cmd, core.CountFileLines(mirrorData)) } else { - cmd.Println(" Mirror: not yet synced") + write.MemoryMirrorNotSynced(cmd) } // Drift - hasDrift := mem.HasDrift(contextDir, sourcePath) if hasDrift { - cmd.Println(" Drift: detected (source is newer)") + write.MemoryDriftDetected(cmd) } else { - cmd.Println(" Drift: none") + write.MemoryDriftNone(cmd) } // Archives count := mem.ArchiveCount(contextDir) - cmd.Println(fmt.Sprintf(" Archives: %d snapshots in .context/%s/", count, config.DirMemoryArchive)) + write.MemoryArchives(cmd, count, dir.MemoryArchive) if hasDrift { // Exit code 2 for drift diff --git a/internal/cli/memory/cmd/sync/cmd.go b/internal/cli/memory/cmd/sync/cmd.go index aca620b6..1d993576 100644 --- a/internal/cli/memory/cmd/sync/cmd.go +++ b/internal/cli/memory/cmd/sync/cmd.go @@ -20,18 +20,18 @@ import ( func Cmd() *cobra.Command { var dryRun bool - short, long := assets.CommandDesc("memory.sync") + short, long := assets.CommandDesc(assets.CmdDescKeyMemorySync) cmd := &cobra.Command{ Use: "sync", Short: short, Long: long, RunE: func(cmd *cobra.Command, _ []string) error { - return runSync(cmd, dryRun) + return Run(cmd, dryRun) }, } cmd.Flags().BoolVar( - &dryRun, "dry-run", false, assets.FlagDesc("memory.sync.dry-run"), + &dryRun, "dry-run", false, assets.FlagDesc(assets.FlagDescKeyMemorySyncDryRun), ) return cmd diff --git a/internal/cli/memory/cmd/sync/doc.go b/internal/cli/memory/cmd/sync/doc.go new file mode 100644 index 00000000..e35254b1 --- /dev/null +++ b/internal/cli/memory/cmd/sync/doc.go @@ -0,0 +1,10 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? 
+// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package sync implements the ctx memory sync subcommand. +// +// It synchronizes MEMORY.md content to its configured mirror location. +package sync diff --git a/internal/cli/memory/cmd/sync/run.go b/internal/cli/memory/cmd/sync/run.go index 7f5eedbf..77dd02c0 100644 --- a/internal/cli/memory/cmd/sync/run.go +++ b/internal/cli/memory/cmd/sync/run.go @@ -9,16 +9,16 @@ package sync import ( "path/filepath" + memory2 "github.com/ActiveMemory/ctx/internal/config/memory" "github.com/spf13/cobra" - "github.com/ActiveMemory/ctx/internal/config" ctxerr "github.com/ActiveMemory/ctx/internal/err" "github.com/ActiveMemory/ctx/internal/memory" "github.com/ActiveMemory/ctx/internal/rc" "github.com/ActiveMemory/ctx/internal/write" ) -// runSync discovers MEMORY.md, mirrors it into .context/memory/, and +// Run discovers MEMORY.md, mirrors it into .context/memory/, and // updates the sync state. In dry-run mode it reports what would happen // without writing any files. // @@ -28,7 +28,7 @@ import ( // // Returns: // - error: on discovery failure, sync failure, or state persistence failure. 
-func runSync(cmd *cobra.Command, dryRun bool) error { +func Run(cmd *cobra.Command, dryRun bool) error { contextDir := rc.ContextDir() projectRoot := filepath.Dir(contextDir) @@ -39,14 +39,8 @@ func runSync(cmd *cobra.Command, dryRun bool) error { } if dryRun { - write.DryRun(cmd) - write.Source(cmd, sourcePath) - write.Mirror(cmd, config.PathMemoryMirror) - if memory.HasDrift(contextDir, sourcePath) { - write.StatusDrift(cmd) - } else { - write.StatusNoDrift(cmd) - } + write.SyncDryRun(cmd, sourcePath, memory2.PathMemoryMirror, + memory.HasDrift(contextDir, sourcePath)) return nil } @@ -55,17 +49,11 @@ func runSync(cmd *cobra.Command, dryRun bool) error { return ctxerr.SyncFailed(syncErr) } - if result.ArchivedTo != "" { - write.Archived(cmd, filepath.Base(result.ArchivedTo)) - } - - write.Synced(cmd, config.FileMemorySource, config.PathMemoryMirror) - write.Source(cmd, result.SourcePath) - write.Lines(cmd, result.SourceLines, result.MirrorLines) - - if result.SourceLines > result.MirrorLines { - write.NewContent(cmd, result.SourceLines-result.MirrorLines) - } + write.SyncResult(cmd, + memory2.MemorySource, memory2.PathMemoryMirror, + result.SourcePath, filepath.Base(result.ArchivedTo), + result.SourceLines, result.MirrorLines, + ) // Update sync state state, loadErr := memory.LoadState(contextDir) diff --git a/internal/cli/memory/unpublish.go b/internal/cli/memory/cmd/unpublish/cmd.go similarity index 59% rename from internal/cli/memory/unpublish.go rename to internal/cli/memory/cmd/unpublish/cmd.go index e9c23ddb..df8c0b77 100644 --- a/internal/cli/memory/unpublish.go +++ b/internal/cli/memory/cmd/unpublish/cmd.go @@ -4,24 +4,26 @@ // \ Copyright 2026-present Context contributors. // SPDX-License-Identifier: Apache-2.0 -package memory +package unpublish import ( "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" ) -// unpublishCmd returns the memory unpublish subcommand. +// Cmd returns the memory unpublish subcommand. 
// // Returns: // - *cobra.Command: command for removing published context from MEMORY.md. -func unpublishCmd() *cobra.Command { +func Cmd() *cobra.Command { + short, long := assets.CommandDesc(assets.CmdDescKeyMemoryUnpublish) return &cobra.Command{ Use: "unpublish", - Short: "Remove published context from MEMORY.md", - Long: `Remove the ctx-managed marker block from MEMORY.md, -preserving all Claude-owned content outside the markers.`, + Short: short, + Long: long, RunE: func(cmd *cobra.Command, _ []string) error { - return runUnpublish(cmd) + return Run(cmd) }, } } diff --git a/internal/cli/memory/cmd/unpublish/doc.go b/internal/cli/memory/cmd/unpublish/doc.go new file mode 100644 index 00000000..9fc0fac0 --- /dev/null +++ b/internal/cli/memory/cmd/unpublish/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package unpublish implements the ctx memory unpublish subcommand. +// +// It removes the ctx-managed marker block from MEMORY.md, preserving +// all Claude-owned content outside the markers. +package unpublish diff --git a/internal/cli/memory/cmd/unpublish/run.go b/internal/cli/memory/cmd/unpublish/run.go new file mode 100644 index 00000000..1507cd25 --- /dev/null +++ b/internal/cli/memory/cmd/unpublish/run.go @@ -0,0 +1,63 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +package unpublish + +import ( + "os" + "path/filepath" + + "github.com/ActiveMemory/ctx/internal/config/fs" + memory2 "github.com/ActiveMemory/ctx/internal/config/memory" + "github.com/spf13/cobra" + + ctxerr "github.com/ActiveMemory/ctx/internal/err" + "github.com/ActiveMemory/ctx/internal/memory" + "github.com/ActiveMemory/ctx/internal/rc" + "github.com/ActiveMemory/ctx/internal/validation" + "github.com/ActiveMemory/ctx/internal/write" +) + +// Run removes the ctx-managed marker block from MEMORY.md, +// preserving all Claude-owned content outside the markers. +// +// Parameters: +// - cmd: Cobra command for output routing. +// +// Returns: +// - error: on discovery, read, or write failure. +func Run(cmd *cobra.Command) error { + contextDir := rc.ContextDir() + projectRoot := filepath.Dir(contextDir) + + memoryPath, discoverErr := memory.DiscoverMemoryPath(projectRoot) + if discoverErr != nil { + write.ErrAutoMemoryNotActive(cmd, discoverErr) + return ctxerr.MemoryNotFound() + } + + data, readErr := validation.SafeReadFile( + filepath.Dir(memoryPath), filepath.Base(memoryPath), + ) + if readErr != nil { + return ctxerr.ReadMemory(readErr) + } + + cleaned, found := memory.RemovePublished(string(data)) + if !found { + write.UnpublishNotFound(cmd, memory2.MemorySource) + return nil + } + + if writeErr := os.WriteFile( + memoryPath, []byte(cleaned), fs.PermFile, + ); writeErr != nil { + return ctxerr.WriteMemory(writeErr) + } + + write.UnpublishDone(cmd, memory2.MemorySource) + return nil +} diff --git a/internal/cli/memory/core/count.go b/internal/cli/memory/core/count.go new file mode 100644 index 00000000..f820dc0f --- /dev/null +++ b/internal/cli/memory/core/count.go @@ -0,0 +1,24 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +package core + +import ( + "bytes" + + "github.com/ActiveMemory/ctx/internal/config/token" +) + +// CountFileLines counts the number of newline characters in data. +// +// Parameters: +// - data: raw file bytes. +// +// Returns: +// - int: number of newline characters. +func CountFileLines(data []byte) int { + return bytes.Count(data, []byte(token.NewlineLF)) +} diff --git a/internal/cli/memory/core/duration.go b/internal/cli/memory/core/duration.go new file mode 100644 index 00000000..8479f324 --- /dev/null +++ b/internal/cli/memory/core/duration.go @@ -0,0 +1,46 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package core + +import ( + "strconv" + "time" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// FormatDuration returns a human-readable duration string. +// +// Parameters: +// - d: duration to format. +// +// Returns: +// - string: human-readable representation (e.g. "3 hours", "1 day"). +func FormatDuration(d time.Duration) string { + if d < time.Minute { + return assets.TextDesc(assets.TextDescKeyTimeJustNow) + } + if d < time.Hour { + return pluralize(int(d.Minutes()), + assets.TextDesc(assets.TextDescKeyTimeMinute)) + } + h := int(d.Hours()) + if h < 24 { + return pluralize(h, + assets.TextDesc(assets.TextDescKeyTimeHour)) + } + return pluralize(h/24, + assets.TextDesc(assets.TextDescKeyTimeDay)) +} + +// pluralize returns "1 unit" or "N units". +func pluralize(n int, unit string) string { + if n == 1 { + return "1 " + unit + } + return strconv.Itoa(n) + " " + unit + "s" +} diff --git a/internal/cli/memory/core/helpers.go b/internal/cli/memory/core/helpers.go deleted file mode 100644 index 72c9363b..00000000 --- a/internal/cli/memory/core/helpers.go +++ /dev/null @@ -1,102 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors.
-// SPDX-License-Identifier: Apache-2.0 - -package core - -import ( - "fmt" - "strings" - "time" - - "github.com/ActiveMemory/ctx/internal/config" -) - -// ImportResult tracks counts per target for import reporting. -type ImportResult struct { - Conventions int - Decisions int - Learnings int - Tasks int - Skipped int - Dupes int -} - -// Total returns the number of entries actually imported (excludes skips -// and duplicates). -// -// Returns: -// - int: count of imported entries. -func (r ImportResult) Total() int { - return r.Conventions + r.Decisions + r.Learnings + r.Tasks -} - -// CountFileLines counts the number of newline characters in data. -// -// Parameters: -// - data: raw file bytes. -// -// Returns: -// - int: number of newline characters. -func CountFileLines(data []byte) int { - if len(data) == 0 { - return 0 - } - count := 0 - for _, b := range data { - if b == '\n' { - count++ - } - } - return count -} - -// FormatDuration returns a human-readable duration string. -// -// Parameters: -// - d: duration to format. -// -// Returns: -// - string: human-readable representation (e.g. "3 hours", "1 day"). -func FormatDuration(d time.Duration) string { - if d < time.Minute { - return "just now" - } - if d < time.Hour { - m := int(d.Minutes()) - if m == 1 { - return "1 minute" - } - return fmt.Sprintf("%d minutes", m) - } - h := int(d.Hours()) - if h == 1 { - return "1 hour" - } - if h < 24 { - return fmt.Sprintf("%d hours", h) - } - days := h / 24 - if days == 1 { - return "1 day" - } - return fmt.Sprintf("%d days", days) -} - -// Truncate returns the first line of s, capped at max characters. -// -// Parameters: -// - s: input string (may be multi-line). -// - max: maximum length including ellipsis. -// -// Returns: -// - string: truncated first line. -func Truncate(s string, max int) string { - line := strings.SplitN(s, config.NewlineLF, 2)[0] - if len(line) <= max { - return line - } - return line[:max-3] + "..." 
-} diff --git a/internal/cli/memory/core/truncate.go b/internal/cli/memory/core/truncate.go new file mode 100644 index 00000000..bbbaeb9e --- /dev/null +++ b/internal/cli/memory/core/truncate.go @@ -0,0 +1,29 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package core + +import ( + "strings" + + "github.com/ActiveMemory/ctx/internal/config/token" +) + +// Truncate returns the first line of s, capped at max characters. +// +// Parameters: +// - s: input string (may be multi-line). +// - max: maximum length including ellipsis. +// +// Returns: +// - string: truncated first line. +func Truncate(s string, max int) string { + line, _, _ := strings.Cut(s, token.NewlineLF) + if len(line) <= max { + return line + } + return line[:max-len(token.Ellipsis)] + token.Ellipsis +} diff --git a/internal/cli/memory/core/types.go b/internal/cli/memory/core/types.go new file mode 100644 index 00000000..b170a09f --- /dev/null +++ b/internal/cli/memory/core/types.go @@ -0,0 +1,34 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package core + +// ImportResult tracks counts per target for import reporting. +// +// Fields: +// - Conventions: number of convention entries imported. +// - Decisions: number of decision entries imported. +// - Learnings: number of learning entries imported. +// - Tasks: number of task entries imported. +// - Skipped: number of entries skipped (unclassified). +// - Dupes: number of duplicate entries skipped. +type ImportResult struct { + Conventions int + Decisions int + Learnings int + Tasks int + Skipped int + Dupes int +} + +// Total returns the number of entries actually imported (excludes skips +// and duplicates). +// +// Returns: +// - int: sum of conventions, decisions, learnings, and tasks. 
+func (r ImportResult) Total() int { + return r.Conventions + r.Decisions + r.Learnings + r.Tasks +} diff --git a/internal/cli/memory/import.go b/internal/cli/memory/import.go deleted file mode 100644 index ee9c9f68..00000000 --- a/internal/cli/memory/import.go +++ /dev/null @@ -1,46 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package memory - -import ( - "github.com/spf13/cobra" - - "github.com/ActiveMemory/ctx/internal/assets" -) - -// importCmd returns the memory import subcommand. -// -// Returns: -// - *cobra.Command: command for importing MEMORY.md entries into .context/ files. -func importCmd() *cobra.Command { - var dryRun bool - - cmd := &cobra.Command{ - Use: "import", - Short: "Import entries from MEMORY.md into .context/ files", - Long: `Classify and promote entries from Claude Code's MEMORY.md into -structured .context/ files using heuristic keyword matching. - -Each entry is classified as a convention, decision, learning, task, -or skipped (session notes, generic text). Deduplication prevents -re-importing the same entry. - -Exit codes: - 0 Imported successfully (or nothing new to import) - 1 MEMORY.md not found`, - RunE: func(cmd *cobra.Command, _ []string) error { - return runImport(cmd, dryRun) - }, - } - - cmd.Flags().BoolVar( - &dryRun, "dry-run", false, - assets.FlagDesc("memory.import.dry-run"), - ) - - return cmd -} diff --git a/internal/cli/memory/memory.go b/internal/cli/memory/memory.go index ff702fd3..389332fb 100644 --- a/internal/cli/memory/memory.go +++ b/internal/cli/memory/memory.go @@ -4,19 +4,23 @@ // \ Copyright 2026-present Context contributors. // SPDX-License-Identifier: Apache-2.0 -// Package memory implements the "ctx memory" CLI command group for -// bridging Claude Code's auto memory into .context/. 
package memory import ( "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/cli/memory/cmd/diff" + "github.com/ActiveMemory/ctx/internal/cli/memory/cmd/importer" + "github.com/ActiveMemory/ctx/internal/cli/memory/cmd/publish" + "github.com/ActiveMemory/ctx/internal/cli/memory/cmd/status" + "github.com/ActiveMemory/ctx/internal/cli/memory/cmd/sync" + "github.com/ActiveMemory/ctx/internal/cli/memory/cmd/unpublish" ) // Cmd returns the "ctx memory" parent command. func Cmd() *cobra.Command { - short, long := assets.CommandDesc("memory") + short, long := assets.CommandDesc(assets.CmdDescKeyMemory) cmd := &cobra.Command{ Use: "memory", Short: short, @@ -24,12 +28,12 @@ func Cmd() *cobra.Command { } cmd.AddCommand( - syncCmd(), - statusCmd(), - diffCmd(), - importCmd(), - publishCmd(), - unpublishCmd(), + sync.Cmd(), + status.Cmd(), + diff.Cmd(), + importer.Cmd(), + publish.Cmd(), + unpublish.Cmd(), ) return cmd diff --git a/internal/cli/memory/publish.go b/internal/cli/memory/publish.go deleted file mode 100644 index 7cce3e6f..00000000 --- a/internal/cli/memory/publish.go +++ /dev/null @@ -1,97 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package memory - -import ( - "fmt" - "path/filepath" - - "github.com/spf13/cobra" - - "github.com/ActiveMemory/ctx/internal/assets" - mem "github.com/ActiveMemory/ctx/internal/memory" - "github.com/ActiveMemory/ctx/internal/rc" -) - -func publishCmd() *cobra.Command { - var budget int - var dryRun bool - - cmd := &cobra.Command{ - Use: "publish", - Short: "Push curated context to MEMORY.md", - Long: `Push curated .context/ content into Claude Code's MEMORY.md -so the agent sees structured project context on session start. - -Content is wrapped in markers ( / ). -Claude-owned content outside the markers is preserved. 
- -Exit codes: - 0 Published successfully - 1 MEMORY.md not found`, - RunE: func(cmd *cobra.Command, _ []string) error { - return runPublish(cmd, budget, dryRun) - }, - } - - cmd.Flags().IntVar(&budget, "budget", mem.DefaultPublishBudget, assets.FlagDesc("memory.publish.budget")) - cmd.Flags().BoolVar(&dryRun, "dry-run", false, assets.FlagDesc("memory.publish.dry-run")) - - return cmd -} - -func runPublish(cmd *cobra.Command, budget int, dryRun bool) error { - contextDir := rc.ContextDir() - projectRoot := filepath.Dir(contextDir) - - memoryPath, discoverErr := mem.DiscoverMemoryPath(projectRoot) - if discoverErr != nil { - cmd.PrintErrln("Auto memory not active:", discoverErr) - return fmt.Errorf("MEMORY.md not found") - } - - result, selectErr := mem.SelectContent(contextDir, budget) - if selectErr != nil { - return fmt.Errorf("selecting content: %w", selectErr) - } - - cmd.Println("Publishing .context/ -> MEMORY.md...") - cmd.Println() - cmd.Println(" Source files: TASKS.md, DECISIONS.md, CONVENTIONS.md, LEARNINGS.md") - cmd.Println(fmt.Sprintf(" Budget: %d lines", budget)) - cmd.Println() - cmd.Println(" Published block:") - if len(result.Tasks) > 0 { - cmd.Println(fmt.Sprintf(" %d pending tasks (from TASKS.md)", len(result.Tasks))) - } - if len(result.Decisions) > 0 { - cmd.Println(fmt.Sprintf(" %d recent decisions (from DECISIONS.md)", len(result.Decisions))) - } - if len(result.Conventions) > 0 { - cmd.Println(fmt.Sprintf(" %d key conventions (from CONVENTIONS.md)", len(result.Conventions))) - } - if len(result.Learnings) > 0 { - cmd.Println(fmt.Sprintf(" %d recent learnings (from LEARNINGS.md)", len(result.Learnings))) - } - cmd.Println() - cmd.Println(fmt.Sprintf(" Total: %d lines (within %d-line budget)", result.TotalLines, budget)) - - if dryRun { - cmd.Println() - cmd.Println("Dry run — no files written.") - return nil - } - - if _, publishErr := mem.Publish(contextDir, memoryPath, budget); publishErr != nil { - return fmt.Errorf("publishing: %w", 
publishErr) - } - - cmd.Println() - cmd.Println("Published to MEMORY.md (markers: ... )") - - return nil -} diff --git a/internal/cli/memory/run.go b/internal/cli/memory/run.go deleted file mode 100644 index d4a6be8f..00000000 --- a/internal/cli/memory/run.go +++ /dev/null @@ -1,273 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package memory - -import ( - "fmt" - "os" - "path/filepath" - "strings" - - "github.com/spf13/cobra" - - "github.com/ActiveMemory/ctx/internal/config" - ctxerr "github.com/ActiveMemory/ctx/internal/err" - "github.com/ActiveMemory/ctx/internal/memory" - "github.com/ActiveMemory/ctx/internal/rc" - "github.com/ActiveMemory/ctx/internal/write" -) - -// runSync discovers MEMORY.md, mirrors it into .context/memory/, and -// updates the sync state. In dry-run mode it reports what would happen -// without writing any files. -// -// Parameters: -// - cmd: Cobra command for output routing. -// - dryRun: when true, report the plan without writing. -// -// Returns: -// - error: on discovery failure, sync failure, or state persistence failure. 
-func runSync(cmd *cobra.Command, dryRun bool) error { - contextDir := rc.ContextDir() - projectRoot := filepath.Dir(contextDir) - - sourcePath, discoverErr := memory.DiscoverMemoryPath(projectRoot) - if discoverErr != nil { - write.ErrAutoMemoryNotActive(cmd, discoverErr) - return ctxerr.MemoryNotFound() - } - - if dryRun { - write.DryRun(cmd) - write.Source(cmd, sourcePath) - write.Mirror(cmd, config.PathMemoryMirror) - if memory.HasDrift(contextDir, sourcePath) { - write.StatusDrift(cmd) - } else { - write.StatusNoDrift(cmd) - } - return nil - } - - result, syncErr := memory.Sync(contextDir, sourcePath) - if syncErr != nil { - return ctxerr.SyncFailed(syncErr) - } - - if result.ArchivedTo != "" { - write.Archived(cmd, filepath.Base(result.ArchivedTo)) - } - - write.Synced(cmd, config.FileMemorySource, config.PathMemoryMirror) - write.Source(cmd, result.SourcePath) - write.Lines(cmd, result.SourceLines, result.MirrorLines) - - if result.SourceLines > result.MirrorLines { - write.NewContent(cmd, result.SourceLines-result.MirrorLines) - } - - // Update sync state - state, loadErr := memory.LoadState(contextDir) - if loadErr != nil { - return ctxerr.LoadState(loadErr) - } - state.MarkSynced() - if saveErr := memory.SaveState(contextDir, state); saveErr != nil { - return ctxerr.SaveState(saveErr) - } - - return nil -} - -// runUnpublish removes the ctx-managed marker block from MEMORY.md, -// preserving all Claude-owned content outside the markers. -// -// Parameters: -// - cmd: Cobra command for output routing. -// -// Returns: -// - error: on discovery, read, or write failure. 
-func runUnpublish(cmd *cobra.Command) error { - contextDir := rc.ContextDir() - projectRoot := filepath.Dir(contextDir) - - memoryPath, discoverErr := memory.DiscoverMemoryPath(projectRoot) - if discoverErr != nil { - write.ErrAutoMemoryNotActive(cmd, discoverErr) - return ctxerr.MemoryNotFound() - } - - data, readErr := os.ReadFile(memoryPath) //nolint:gosec // discovered path - if readErr != nil { - return ctxerr.ReadMemory(readErr) - } - - cleaned, found := memory.RemovePublished(string(data)) - if !found { - cmd.Println("No published block found in " + config.FileMemorySource + ".") - return nil - } - - if writeErr := os.WriteFile(memoryPath, []byte(cleaned), config.PermFile); writeErr != nil { - return ctxerr.WriteMemory(writeErr) - } - - cmd.Println("Removed published block from " + config.FileMemorySource + ".") - return nil -} - -// runImport parses MEMORY.md entries, classifies them by heuristic keyword -// matching, deduplicates against prior imports, and promotes new entries -// into the appropriate .context/ files. -// -// Parameters: -// - cmd: Cobra command for output routing. -// - dryRun: when true, show the classification plan without writing. -// -// Returns: -// - error: on discovery, read, state, or promotion failure. 
-func runImport(cmd *cobra.Command, dryRun bool) error { - contextDir := rc.ContextDir() - projectRoot := filepath.Dir(contextDir) - - sourcePath, discoverErr := memory.DiscoverMemoryPath(projectRoot) - if discoverErr != nil { - write.ErrAutoMemoryNotActive(cmd, discoverErr) - return ctxerr.MemoryNotFound() - } - - sourceData, readErr := os.ReadFile(sourcePath) //nolint:gosec // discovered path - if readErr != nil { - return ctxerr.ReadMemory(readErr) - } - - entries := memory.ParseEntries(string(sourceData)) - if len(entries) == 0 { - cmd.Println("No entries found in " + config.FileMemorySource + ".") - return nil - } - - state, loadErr := memory.LoadState(contextDir) - if loadErr != nil { - return ctxerr.LoadState(loadErr) - } - - cmd.Println("Scanning " + config.FileMemorySource + " for new entries...") - cmd.Println(fmt.Sprintf(" Found %d entries", len(entries))) - cmd.Println() - - var result importResult - - for _, entry := range entries { - hash := memory.EntryHash(entry.Text) - - if state.Imported(hash) { - result.dupes++ - continue - } - - classification := memory.Classify(entry) - title := truncate(entry.Text, 60) - - if classification.Target == memory.TargetSkip { - result.skipped++ - if dryRun { - cmd.Println(fmt.Sprintf(" -> %q", title)) - cmd.Println(" Classified: skip") - cmd.Println() - } - continue - } - - targetFile := config.FileType[classification.Target] - - if dryRun { - cmd.Println(fmt.Sprintf(" -> %q", title)) - cmd.Println(fmt.Sprintf(" Classified: %s (keywords: %s)", - targetFile, strings.Join(classification.Keywords, ", "))) - cmd.Println() - } else { - if promoteErr := memory.Promote(entry, classification); promoteErr != nil { - cmd.PrintErrln(fmt.Sprintf(" Error promoting to %s: %v", targetFile, promoteErr)) - continue - } - state.MarkImported(hash, classification.Target) - cmd.Println(fmt.Sprintf(" -> %q", title)) - cmd.Println(fmt.Sprintf(" Added to %s", targetFile)) - cmd.Println() - } - - switch classification.Target { - case 
config.EntryConvention: - result.conventions++ - case config.EntryDecision: - result.decisions++ - case config.EntryLearning: - result.learnings++ - case config.EntryTask: - result.tasks++ - } - } - - // Summary - var summary string - if dryRun { - summary = fmt.Sprintf("Dry run — would import: %d entries", result.total()) - } else { - summary = fmt.Sprintf("Imported: %d entries", result.total()) - } - - var parts []string - if result.conventions > 0 { - parts = append(parts, fmt.Sprintf("%d convention", result.conventions)) - } - if result.decisions > 0 { - parts = append(parts, fmt.Sprintf("%d decision", result.decisions)) - } - if result.learnings > 0 { - parts = append(parts, fmt.Sprintf("%d learning", result.learnings)) - } - if result.tasks > 0 { - parts = append(parts, fmt.Sprintf("%d task", result.tasks)) - } - if len(parts) > 0 { - summary += fmt.Sprintf(" (%s)", strings.Join(parts, ", ")) - } - cmd.Println(summary) - - if result.skipped > 0 { - cmd.Println(fmt.Sprintf("Skipped: %d entries (session notes/unclassified)", result.skipped)) - } - if result.dupes > 0 { - cmd.Println(fmt.Sprintf("Duplicates: %d entries (already imported)", result.dupes)) - } - - if !dryRun && result.total() > 0 { - state.MarkImportedDone() - if saveErr := memory.SaveState(contextDir, state); saveErr != nil { - return ctxerr.SaveState(saveErr) - } - } - - return nil -} - -// truncate returns the first line of s, capped at max characters. -// -// Parameters: -// - s: input string (may be multi-line). -// - max: maximum length including ellipsis. -// -// Returns: -// - string: truncated first line. -func truncate(s string, max int) string { - line := strings.SplitN(s, config.NewlineLF, 2)[0] - if len(line) <= max { - return line - } - return line[:max-3] + "..." 
-} diff --git a/internal/cli/memory/status.go b/internal/cli/memory/status.go deleted file mode 100644 index 57aea424..00000000 --- a/internal/cli/memory/status.go +++ /dev/null @@ -1,142 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package memory - -import ( - "fmt" - "os" - "path/filepath" - "time" - - "github.com/spf13/cobra" - - "github.com/ActiveMemory/ctx/internal/config" - mem "github.com/ActiveMemory/ctx/internal/memory" - "github.com/ActiveMemory/ctx/internal/rc" -) - -func statusCmd() *cobra.Command { - return &cobra.Command{ - Use: "status", - Short: "Show drift, timestamps, and entry counts", - Long: `Show memory bridge status: source location, last sync time, -line counts, drift indicator, and archive count. - -Exit codes: - 0 No drift - 1 MEMORY.md not found - 2 Drift detected (MEMORY.md changed since last sync)`, - RunE: func(cmd *cobra.Command, _ []string) error { - return runStatus(cmd) - }, - } -} - -func runStatus(cmd *cobra.Command) error { - contextDir := rc.ContextDir() - projectRoot := filepath.Dir(contextDir) - - sourcePath, discoverErr := mem.DiscoverMemoryPath(projectRoot) - if discoverErr != nil { - cmd.Println("Memory Bridge Status") - cmd.Println(" Source: auto memory not active (MEMORY.md not found)") - return fmt.Errorf("MEMORY.md not found") - } - - mirrorPath := filepath.Join(contextDir, config.DirMemory, config.FileMemoryMirror) - - cmd.Println("Memory Bridge Status") - cmd.Println(fmt.Sprintf(" Source: %s", sourcePath)) - cmd.Println(fmt.Sprintf(" Mirror: .context/%s/%s", config.DirMemory, config.FileMemoryMirror)) - - // Last sync time - state, _ := mem.LoadState(contextDir) - if state.LastSync != nil { - ago := time.Since(*state.LastSync).Truncate(time.Minute) - cmd.Println(fmt.Sprintf(" Last sync: %s (%s ago)", - state.LastSync.Local().Format("2006-01-02 15:04"), formatDuration(ago))) - } else { - 
cmd.Println(" Last sync: never") - } - - cmd.Println() - - // Source line count - if sourceData, readErr := os.ReadFile(sourcePath); readErr == nil { //nolint:gosec // discovered path - line := fmt.Sprintf(" MEMORY.md: %d lines", countFileLines(sourceData)) - if mem.HasDrift(contextDir, sourcePath) { - line += " (modified since last sync)" - } - cmd.Println(line) - } - - // Mirror line count - if mirrorData, readErr := os.ReadFile(mirrorPath); readErr == nil { //nolint:gosec // project-local path - cmd.Println(fmt.Sprintf(" Mirror: %d lines", countFileLines(mirrorData))) - } else { - cmd.Println(" Mirror: not yet synced") - } - - // Drift - hasDrift := mem.HasDrift(contextDir, sourcePath) - if hasDrift { - cmd.Println(" Drift: detected (source is newer)") - } else { - cmd.Println(" Drift: none") - } - - // Archives - count := mem.ArchiveCount(contextDir) - cmd.Println(fmt.Sprintf(" Archives: %d snapshots in .context/%s/", count, config.DirMemoryArchive)) - - if hasDrift { - // Exit code 2 for drift - cmd.SilenceUsage = true - cmd.SilenceErrors = true - os.Exit(2) //nolint:revive // spec-defined exit code - } - - return nil -} - -func countFileLines(data []byte) int { - if len(data) == 0 { - return 0 - } - count := 0 - for _, b := range data { - if b == '\n' { - count++ - } - } - return count -} - -func formatDuration(d time.Duration) string { - if d < time.Minute { - return "just now" - } - if d < time.Hour { - m := int(d.Minutes()) - if m == 1 { - return "1 minute" - } - return fmt.Sprintf("%d minutes", m) - } - h := int(d.Hours()) - if h == 1 { - return "1 hour" - } - if h < 24 { - return fmt.Sprintf("%d hours", h) - } - days := h / 24 - if days == 1 { - return "1 day" - } - return fmt.Sprintf("%d days", days) -} diff --git a/internal/cli/memory/sync.go b/internal/cli/memory/sync.go deleted file mode 100644 index 19addf90..00000000 --- a/internal/cli/memory/sync.go +++ /dev/null @@ -1,39 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? 
-// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package memory - -import ( - "github.com/spf13/cobra" - - "github.com/ActiveMemory/ctx/internal/assets" -) - -func syncCmd() *cobra.Command { - var dryRun bool - - cmd := &cobra.Command{ - Use: "sync", - Short: "Copy MEMORY.md to mirror, archive previous version", - Long: `Copy Claude Code's MEMORY.md to .context/memory/mirror.md. - -Archives the previous mirror before overwriting. Reports line counts -and drift since last sync. - -Exit codes: - 0 Synced successfully - 1 MEMORY.md not found (auto memory not active)`, - RunE: func(cmd *cobra.Command, _ []string) error { - return runSync(cmd, dryRun) - }, - } - - cmd.Flags().BoolVar( - &dryRun, "dry-run", false, assets.FlagDesc("memory.sync.dry-run"), - ) - - return cmd -} diff --git a/internal/cli/memory/types.go b/internal/cli/memory/types.go deleted file mode 100644 index 9905f35f..00000000 --- a/internal/cli/memory/types.go +++ /dev/null @@ -1,23 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package memory - -// importResult tracks counts per target for import reporting. -type importResult struct { - conventions int - decisions int - learnings int - tasks int - skipped int - dupes int -} - -// total returns the number of entries actually imported (excludes skips -// and duplicates). 
-func (r importResult) total() int { - return r.conventions + r.decisions + r.learnings + r.tasks -} diff --git a/internal/cli/notify/cmd/setup/cmd.go b/internal/cli/notify/cmd/setup/cmd.go index f660cb35..322d648c 100644 --- a/internal/cli/notify/cmd/setup/cmd.go +++ b/internal/cli/notify/cmd/setup/cmd.go @@ -19,13 +19,13 @@ import ( // Returns: // - *cobra.Command: Configured setup subcommand func Cmd() *cobra.Command { - short, long := assets.CommandDesc("notify.setup") + short, long := assets.CommandDesc(assets.CmdDescKeyNotifySetup) return &cobra.Command{ Use: "setup", Short: short, Long: long, RunE: func(cmd *cobra.Command, _ []string) error { - return RunSetup(cmd, os.Stdin) + return Run(cmd, os.Stdin) }, } } diff --git a/internal/cli/system/core/block.go b/internal/cli/notify/cmd/setup/doc.go similarity index 54% rename from internal/cli/system/core/block.go rename to internal/cli/notify/cmd/setup/doc.go index fd12d78b..81fc1a15 100644 --- a/internal/cli/system/core/block.go +++ b/internal/cli/notify/cmd/setup/doc.go @@ -4,10 +4,7 @@ // \ Copyright 2026-present Context contributors. // SPDX-License-Identifier: Apache-2.0 -package core - -// BlockResponse is the JSON output for blocked commands. -type BlockResponse struct { - Decision string `json:"decision"` - Reason string `json:"reason"` -} +// Package setup implements the ctx notify setup subcommand. +// +// It configures webhook notification endpoints for hook events. 
+package setup diff --git a/internal/cli/notify/cmd/setup/run.go b/internal/cli/notify/cmd/setup/run.go index eacc67e7..6793d240 100644 --- a/internal/cli/notify/cmd/setup/run.go +++ b/internal/cli/notify/cmd/setup/run.go @@ -8,16 +8,18 @@ package setup import ( "bufio" - "fmt" "os" "strings" + "github.com/ActiveMemory/ctx/internal/config/crypto" "github.com/spf13/cobra" + ctxerr "github.com/ActiveMemory/ctx/internal/err" notifylib "github.com/ActiveMemory/ctx/internal/notify" + "github.com/ActiveMemory/ctx/internal/write" ) -// RunSetup prompts for a webhook URL and saves it encrypted. +// Run prompts for a webhook URL and saves it encrypted. // // Exported for testability (tests inject a mock stdin). // @@ -27,52 +29,23 @@ import ( // // Returns: // - error: Non-nil on empty input or save failure -func RunSetup(cmd *cobra.Command, stdin *os.File) error { - cmd.Print("Enter webhook URL: ") +func Run(cmd *cobra.Command, stdin *os.File) error { + write.SetupPrompt(cmd) scanner := bufio.NewScanner(stdin) if !scanner.Scan() { - return fmt.Errorf("no input received") + return ctxerr.NoInput() } url := strings.TrimSpace(scanner.Text()) if url == "" { - return fmt.Errorf("webhook URL cannot be empty") + return ctxerr.WebhookEmpty() } if saveErr := notifylib.SaveWebhook(url); saveErr != nil { - return fmt.Errorf("save webhook: %w", saveErr) + return ctxerr.SaveWebhook(saveErr) } - masked := MaskURL(url) - cmd.Println("Webhook configured: " + masked) - cmd.Println("Encrypted at: .context/.notify.enc") + write.SetupDone(cmd, notifylib.MaskURL(url), crypto.NotifyEnc) return nil } - -// MaskURL shows the scheme + host and masks everything after. -// -// Exported for testability. 
-// -// Parameters: -// - url: Full webhook URL -// -// Returns: -// - string: Masked URL safe for display -func MaskURL(url string) string { - // Find the third slash (end of scheme://host) - count := 0 - for i, c := range url { - if c == '/' { - count++ - if count == 3 { - return url[:i] + "/***" - } - } - } - // No path — show as-is but with masked end - if len(url) > 20 { - return url[:20] + "***" - } - return url -} diff --git a/internal/cli/notify/cmd/test/cmd.go b/internal/cli/notify/cmd/test/cmd.go index 8f59ac61..38b143ad 100644 --- a/internal/cli/notify/cmd/test/cmd.go +++ b/internal/cli/notify/cmd/test/cmd.go @@ -17,7 +17,7 @@ import ( // Returns: // - *cobra.Command: Configured test subcommand func Cmd() *cobra.Command { - short, long := assets.CommandDesc("notify.test") + short, long := assets.CommandDesc(assets.CmdDescKeyNotifyTest) return &cobra.Command{ Use: "test", Short: short, diff --git a/internal/cli/notify/cmd/test/doc.go b/internal/cli/notify/cmd/test/doc.go new file mode 100644 index 00000000..af145c5e --- /dev/null +++ b/internal/cli/notify/cmd/test/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package test implements the ctx notify test subcommand. +// +// It sends a test notification to the configured webhook endpoint to +// verify connectivity. 
+package test diff --git a/internal/cli/notify/cmd/test/run.go b/internal/cli/notify/cmd/test/run.go index 2013b06a..3a7f339b 100644 --- a/internal/cli/notify/cmd/test/run.go +++ b/internal/cli/notify/cmd/test/run.go @@ -7,19 +7,18 @@ package test import ( - "bytes" "encoding/json" - "fmt" - "net/http" "os" "path/filepath" "time" + "github.com/ActiveMemory/ctx/internal/config/crypto" "github.com/spf13/cobra" - "github.com/ActiveMemory/ctx/internal/config" + ctxerr "github.com/ActiveMemory/ctx/internal/err" "github.com/ActiveMemory/ctx/internal/notify" "github.com/ActiveMemory/ctx/internal/rc" + "github.com/ActiveMemory/ctx/internal/write" ) // runTest sends a test notification to the configured webhook. @@ -30,12 +29,12 @@ import ( // Returns: // - error: Non-nil on webhook load or HTTP failure func runTest(cmd *cobra.Command) error { - url, err := notify.LoadWebhook() - if err != nil { - return fmt.Errorf("load webhook: %w", err) + url, loadErr := notify.LoadWebhook() + if loadErr != nil { + return ctxerr.LoadWebhook(loadErr) } if url == "" { - cmd.Println("No webhook configured. 
Run: ctx notify setup") + write.TestNoWebhook(cmd) return nil } @@ -53,26 +52,20 @@ func runTest(cmd *cobra.Command) error { body, marshalErr := json.Marshal(payload) if marshalErr != nil { - return fmt.Errorf("marshal payload: %w", marshalErr) + return ctxerr.MarshalPayload(marshalErr) } - // Check event filter — but for test we bypass and send directly if !notify.EventAllowed("test", rc.NotifyEvents()) { - cmd.Println("Note: event \"test\" is filtered by your .ctxrc notify.events config.") - cmd.Println("Sending anyway for testing purposes.") + write.TestFiltered(cmd) } - client := &http.Client{Timeout: 5 * time.Second} - resp, postErr := client.Post(url, "application/json", bytes.NewReader(body)) //nolint:gosec // URL is user-configured via encrypted storage + resp, postErr := notify.PostJSON(url, body) if postErr != nil { - return fmt.Errorf("send test notification: %w", postErr) + return ctxerr.SendNotification(postErr) } defer func() { _ = resp.Body.Close() }() - cmd.Println(fmt.Sprintf("Webhook responded: HTTP %d %s", resp.StatusCode, http.StatusText(resp.StatusCode))) - if resp.StatusCode >= 200 && resp.StatusCode < 300 { - cmd.Println("Webhook is working " + config.FileNotifyEnc) - } + write.TestResult(cmd, resp.StatusCode, crypto.NotifyEnc) return nil } diff --git a/internal/cli/notify/notify.go b/internal/cli/notify/notify.go index 70bfef29..c38c95fb 100644 --- a/internal/cli/notify/notify.go +++ b/internal/cli/notify/notify.go @@ -7,7 +7,6 @@ package notify import ( - "fmt" "strings" "github.com/spf13/cobra" @@ -15,6 +14,7 @@ import ( "github.com/ActiveMemory/ctx/internal/assets" "github.com/ActiveMemory/ctx/internal/cli/notify/cmd/setup" "github.com/ActiveMemory/ctx/internal/cli/notify/cmd/test" + ctxerr "github.com/ActiveMemory/ctx/internal/err" notifylib "github.com/ActiveMemory/ctx/internal/notify" ) @@ -28,7 +28,7 @@ func Cmd() *cobra.Command { var hook string var variant string - short, long := assets.CommandDesc("notify") + short, long := 
assets.CommandDesc(assets.CmdDescKeyNotify) cmd := &cobra.Command{ Use: "notify [message]", Short: short, @@ -36,10 +36,10 @@ func Cmd() *cobra.Command { Args: cobra.MinimumNArgs(0), RunE: func(cmd *cobra.Command, args []string) error { if event == "" { - return fmt.Errorf("required flag \"event\" not set") + return ctxerr.FlagRequired("event") } if len(args) == 0 { - return fmt.Errorf("message argument is required") + return ctxerr.ArgRequired("message") } message := strings.Join(args, " ") var ref *notifylib.TemplateRef @@ -50,10 +50,19 @@ func Cmd() *cobra.Command { }, } - cmd.Flags().StringVarP(&event, "event", "e", "", assets.FlagDesc("notify.event")) - cmd.Flags().StringVarP(&sessionID, "session-id", "s", "", assets.FlagDesc("notify.session-id")) - cmd.Flags().StringVar(&hook, "hook", "", assets.FlagDesc("notify.hook")) - cmd.Flags().StringVar(&variant, "variant", "", assets.FlagDesc("notify.variant")) + cmd.Flags().StringVarP(&event, + "event", "e", "", + assets.FlagDesc(assets.FlagDescKeyNotifyEvent), + ) + cmd.Flags().StringVarP(&sessionID, + "session-id", "s", "", assets.FlagDesc(assets.FlagDescKeyNotifySessionId), + ) + cmd.Flags().StringVar(&hook, + "hook", "", assets.FlagDesc(assets.FlagDescKeyNotifyHook), + ) + cmd.Flags().StringVar(&variant, + "variant", "", assets.FlagDesc(assets.FlagDescKeyNotifyVariant), + ) cmd.AddCommand(setup.Cmd()) cmd.AddCommand(test.Cmd()) diff --git a/internal/cli/notify/notify_test.go b/internal/cli/notify/notify_test.go index 34b8b69b..92ef8058 100644 --- a/internal/cli/notify/notify_test.go +++ b/internal/cli/notify/notify_test.go @@ -16,7 +16,7 @@ import ( "testing" "github.com/ActiveMemory/ctx/internal/cli/notify/cmd/setup" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/ctx" notifylib "github.com/ActiveMemory/ctx/internal/notify" "github.com/ActiveMemory/ctx/internal/rc" ) @@ -28,7 +28,7 @@ func setupCLITest(t *testing.T) (string, func()) { _ = os.Chdir(tempDir) _ = 
os.MkdirAll(filepath.Join(tempDir, ".context"), 0o750) // Create required files so isInitialized returns true - for _, f := range config.FilesRequired { + for _, f := range ctx.FilesRequired { _ = os.WriteFile(filepath.Join(tempDir, ".context", f), []byte("# "+f+"\n"), 0o600) } rc.Reset() @@ -114,9 +114,9 @@ func TestSetup_WithMockStdin(t *testing.T) { cmd.SetOut(&buf) cmd.SetErr(&buf) - err = setup.RunSetup(cmd, tmpFile) + err = setup.Run(cmd, tmpFile) if err != nil { - t.Fatalf("setup.RunSetup() error = %v", err) + t.Fatalf("setup.Run() error = %v", err) } output := buf.String() @@ -139,9 +139,9 @@ func TestMaskURL(t *testing.T) { } for _, tc := range tests { - got := setup.MaskURL(tc.input) + got := notifylib.MaskURL(tc.input) if got != tc.want { - t.Errorf("setup.MaskURL(%q) = %q, want %q", tc.input, got, tc.want) + t.Errorf("notifylib.MaskURL(%q) = %q, want %q", tc.input, got, tc.want) } } } @@ -165,7 +165,7 @@ func TestSetup_EmptyInput(t *testing.T) { cmd.SetOut(&buf) cmd.SetErr(&buf) - setupErr := setup.RunSetup(cmd, tmpFile) + setupErr := setup.Run(cmd, tmpFile) if setupErr == nil { t.Fatal("expected error for empty webhook URL input") } diff --git a/internal/cli/pad/cmd/add/cmd.go b/internal/cli/pad/cmd/add/cmd.go index 83a2ba2c..a8ae9497 100644 --- a/internal/cli/pad/cmd/add/cmd.go +++ b/internal/cli/pad/cmd/add/cmd.go @@ -19,7 +19,7 @@ import ( func Cmd() *cobra.Command { var filePath string - short, _ := assets.CommandDesc("pad.add") + short, _ := assets.CommandDesc(assets.CmdDescKeyPadAdd) cmd := &cobra.Command{ Use: "add TEXT", Short: short, @@ -32,7 +32,10 @@ func Cmd() *cobra.Command { }, } - cmd.Flags().StringVarP(&filePath, "file", "f", "", assets.FlagDesc("pad.add.file")) + cmd.Flags().StringVarP(&filePath, + "file", "f", "", + assets.FlagDesc(assets.FlagDescKeyPadAddFile), + ) return cmd } diff --git a/internal/cli/pad/cmd/add/doc.go b/internal/cli/pad/cmd/add/doc.go new file mode 100644 index 00000000..99e9d6f3 --- /dev/null +++ 
b/internal/cli/pad/cmd/add/doc.go @@ -0,0 +1,10 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package add implements the ctx pad add subcommand. +// +// It appends a new text entry to the scratch pad. +package add diff --git a/internal/cli/pad/cmd/add/run.go b/internal/cli/pad/cmd/add/run.go index 999dc82c..1691a2c6 100644 --- a/internal/cli/pad/cmd/add/run.go +++ b/internal/cli/pad/cmd/add/run.go @@ -7,12 +7,13 @@ package add import ( - "fmt" - "os" - + "github.com/ActiveMemory/ctx/internal/config/pad" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/pad/core" + ctxerr "github.com/ActiveMemory/ctx/internal/err" + "github.com/ActiveMemory/ctx/internal/validation" + "github.com/ActiveMemory/ctx/internal/write" ) // runAdd appends a new entry and prints confirmation. @@ -35,7 +36,7 @@ func runAdd(cmd *cobra.Command, text string) error { return writeErr } - cmd.Println(fmt.Sprintf("Added entry %d.", len(entries))) + write.PadEntryAdded(cmd, len(entries)) return nil } @@ -49,13 +50,13 @@ func runAdd(cmd *cobra.Command, text string) error { // Returns: // - error: Non-nil on read/write failure or file too large func runAddBlob(cmd *cobra.Command, label, filePath string) error { - data, err := os.ReadFile(filePath) //nolint:gosec // user-provided path is intentional + data, err := validation.ReadUserFile(filePath) if err != nil { - return fmt.Errorf("read file: %w", err) + return ctxerr.ReadFile(err) } - if len(data) > core.MaxBlobSize { - return fmt.Errorf("file too large: %d bytes (max %d)", len(data), core.MaxBlobSize) + if len(data) > pad.MaxBlobSize { + return ctxerr.FileTooLarge(len(data), pad.MaxBlobSize) } entries, readErr := core.ReadEntries() @@ -69,6 +70,6 @@ func runAddBlob(cmd *cobra.Command, label, filePath string) error { return writeErr } - cmd.Println(fmt.Sprintf("Added entry %d.", len(entries))) + 
write.PadEntryAdded(cmd, len(entries)) return nil } diff --git a/internal/cli/pad/cmd/edit/cmd.go b/internal/cli/pad/cmd/edit/cmd.go index b77652e1..091ccdd4 100644 --- a/internal/cli/pad/cmd/edit/cmd.go +++ b/internal/cli/pad/cmd/edit/cmd.go @@ -7,12 +7,12 @@ package edit import ( - "fmt" "strconv" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/assets" + ctxerr "github.com/ActiveMemory/ctx/internal/err" ) // Cmd returns the pad edit subcommand. @@ -36,7 +36,7 @@ func Cmd() *cobra.Command { var filePath string var labelText string - short, long := assets.CommandDesc("pad.edit") + short, long := assets.CommandDesc(assets.CmdDescKeyPadEdit) cmd := &cobra.Command{ Use: "edit N [TEXT]", Short: short, @@ -45,7 +45,7 @@ func Cmd() *cobra.Command { RunE: func(cmd *cobra.Command, args []string) error { n, err := strconv.Atoi(args[0]) if err != nil { - return fmt.Errorf("invalid index: %s", args[0]) + return ctxerr.InvalidIndex(args[0]) } hasPositional := len(args) == 2 @@ -56,7 +56,7 @@ func Cmd() *cobra.Command { // --file/--label conflict with positional/--append/--prepend. if (hasFile || hasLabel) && (hasPositional || hasAppend || hasPrepend) { - return fmt.Errorf("--file/--label and positional text/--append/--prepend are mutually exclusive") + return ctxerr.EditBlobTextConflict() } // Blob edit mode. 
@@ -77,10 +77,10 @@ func Cmd() *cobra.Command { } if flagCount == 0 { - return fmt.Errorf("provide replacement text, --append, or --prepend") + return ctxerr.EditNoMode() } if flagCount > 1 { - return fmt.Errorf("--append, --prepend, and positional text are mutually exclusive") + return ctxerr.EditTextConflict() } switch { @@ -94,10 +94,18 @@ func Cmd() *cobra.Command { }, } - cmd.Flags().StringVar(&appendText, "append", "", assets.FlagDesc("pad.edit.append")) - cmd.Flags().StringVar(&prependText, "prepend", "", assets.FlagDesc("pad.edit.prepend")) - cmd.Flags().StringVarP(&filePath, "file", "f", "", assets.FlagDesc("pad.edit.file")) - cmd.Flags().StringVar(&labelText, "label", "", assets.FlagDesc("pad.edit.label")) + cmd.Flags().StringVar(&appendText, + "append", "", assets.FlagDesc(assets.FlagDescKeyPadEditAppend), + ) + cmd.Flags().StringVar(&prependText, + "prepend", "", assets.FlagDesc(assets.FlagDescKeyPadEditPrepend), + ) + cmd.Flags().StringVarP(&filePath, + "file", "f", "", assets.FlagDesc(assets.FlagDescKeyPadEditFile), + ) + cmd.Flags().StringVar(&labelText, + "label", "", assets.FlagDesc(assets.FlagDescKeyPadEditLabel), + ) return cmd } diff --git a/internal/cli/pad/cmd/edit/doc.go b/internal/cli/pad/cmd/edit/doc.go new file mode 100644 index 00000000..d761f5d7 --- /dev/null +++ b/internal/cli/pad/cmd/edit/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package edit implements the ctx pad edit subcommand. +// +// It modifies an existing pad entry by index, supporting replace, append, +// and prepend modes as well as blob file and label updates. 
+package edit diff --git a/internal/cli/pad/cmd/edit/run.go b/internal/cli/pad/cmd/edit/run.go index 1b6c23c6..4c0ebb99 100644 --- a/internal/cli/pad/cmd/edit/run.go +++ b/internal/cli/pad/cmd/edit/run.go @@ -7,12 +7,13 @@ package edit import ( - "fmt" - "os" - + "github.com/ActiveMemory/ctx/internal/config/pad" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/pad/core" + ctxerr "github.com/ActiveMemory/ctx/internal/err" + "github.com/ActiveMemory/ctx/internal/validation" + "github.com/ActiveMemory/ctx/internal/write" ) // runEdit replaces entry at 1-based position n with new text. @@ -40,7 +41,7 @@ func runEdit(cmd *cobra.Command, n int, text string) error { return writeErr } - cmd.Println(fmt.Sprintf("Updated entry %d.", n)) + write.PadEntryUpdated(cmd, n) return nil } @@ -64,7 +65,7 @@ func runEditAppend(cmd *cobra.Command, n int, text string) error { } if core.IsBlob(entries[n-1]) { - return fmt.Errorf("cannot append to a blob entry") + return ctxerr.BlobAppendNotAllowed() } entries[n-1] = entries[n-1] + " " + text @@ -73,7 +74,7 @@ func runEditAppend(cmd *cobra.Command, n int, text string) error { return writeErr } - cmd.Println(fmt.Sprintf("Updated entry %d.", n)) + write.PadEntryUpdated(cmd, n) return nil } @@ -97,7 +98,7 @@ func runEditPrepend(cmd *cobra.Command, n int, text string) error { } if core.IsBlob(entries[n-1]) { - return fmt.Errorf("cannot prepend to a blob entry") + return ctxerr.BlobPrependNotAllowed() } entries[n-1] = text + " " + entries[n-1] @@ -106,7 +107,7 @@ func runEditPrepend(cmd *cobra.Command, n int, text string) error { return writeErr } - cmd.Println(fmt.Sprintf("Updated entry %d.", n)) + write.PadEntryUpdated(cmd, n) return nil } @@ -132,7 +133,7 @@ func runEditBlob(cmd *cobra.Command, n int, filePath, labelText string) error { oldLabel, oldData, ok := core.SplitBlob(entries[n-1]) if !ok { - return fmt.Errorf("entry %d is not a blob entry", n) + return ctxerr.NotBlobEntry(n) } newLabel := oldLabel @@ -143,12 
+144,12 @@ func runEditBlob(cmd *cobra.Command, n int, filePath, labelText string) error { } if filePath != "" { - data, readErr := os.ReadFile(filePath) //nolint:gosec // user-provided path is intentional + data, readErr := validation.ReadUserFile(filePath) if readErr != nil { - return fmt.Errorf("read file: %w", readErr) + return ctxerr.ReadFile(readErr) } - if len(data) > core.MaxBlobSize { - return fmt.Errorf("file too large: %d bytes (max %d)", len(data), core.MaxBlobSize) + if len(data) > pad.MaxBlobSize { + return ctxerr.FileTooLarge(len(data), pad.MaxBlobSize) } newData = data } @@ -159,6 +160,6 @@ func runEditBlob(cmd *cobra.Command, n int, filePath, labelText string) error { return writeErr } - cmd.Println(fmt.Sprintf("Updated entry %d.", n)) + write.PadEntryUpdated(cmd, n) return nil } diff --git a/internal/cli/pad/cmd/export/cmd.go b/internal/cli/pad/cmd/export/cmd.go index 43ce75b8..f2eeaa76 100644 --- a/internal/cli/pad/cmd/export/cmd.go +++ b/internal/cli/pad/cmd/export/cmd.go @@ -19,7 +19,7 @@ import ( func Cmd() *cobra.Command { var force, dryRun bool - short, long := assets.CommandDesc("pad.export") + short, long := assets.CommandDesc(assets.CmdDescKeyPadExport) cmd := &cobra.Command{ Use: "export [DIR]", Short: short, @@ -36,11 +36,11 @@ func Cmd() *cobra.Command { cmd.Flags().BoolVarP( &force, "force", "f", false, - assets.FlagDesc("pad.export.force"), + assets.FlagDesc(assets.FlagDescKeyPadExportForce), ) cmd.Flags().BoolVar( &dryRun, "dry-run", false, - assets.FlagDesc("pad.export.dry-run"), + assets.FlagDesc(assets.FlagDescKeyPadExportDryRun), ) return cmd diff --git a/internal/cli/pad/cmd/export/doc.go b/internal/cli/pad/cmd/export/doc.go new file mode 100644 index 00000000..f071c094 --- /dev/null +++ b/internal/cli/pad/cmd/export/doc.go @@ -0,0 +1,10 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +// Package export implements the ctx pad export subcommand. +// +// It exports pad entries to a specified directory on disk. +package export diff --git a/internal/cli/pad/cmd/export/run.go b/internal/cli/pad/cmd/export/run.go index 3923a2bb..649140e4 100644 --- a/internal/cli/pad/cmd/export/run.go +++ b/internal/cli/pad/cmd/export/run.go @@ -12,9 +12,11 @@ import ( "path/filepath" "time" + "github.com/ActiveMemory/ctx/internal/config/fs" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/pad/core" + ctxerr "github.com/ActiveMemory/ctx/internal/err" "github.com/ActiveMemory/ctx/internal/write" ) @@ -35,8 +37,8 @@ func runExport(cmd *cobra.Command, dir string, force, dryRun bool) error { } if !dryRun { - if mkErr := os.MkdirAll(dir, 0o750); mkErr != nil { - return fmt.Errorf("mkdir %s: %w", dir, mkErr) + if mkErr := os.MkdirAll(dir, fs.PermExec); mkErr != nil { + return ctxerr.Mkdir(dir, mkErr) } } @@ -64,29 +66,20 @@ func runExport(cmd *cobra.Command, dir string, force, dryRun bool) error { } if dryRun { - cmd.Println(fmt.Sprintf(" %s → %s", label, outPath)) + write.PadExportPlan(cmd, label, outPath) count++ continue } - if writeErr := os.WriteFile(outPath, data, 0o600); writeErr != nil { - cmd.PrintErrln(fmt.Sprintf(" ! 
failed to write %s: %v", label, writeErr)) + if writeErr := os.WriteFile(outPath, data, fs.PermSecret); writeErr != nil { + write.ErrPadExportWrite(cmd, label, writeErr) continue } - cmd.Println(fmt.Sprintf(" + %s", label)) + write.PadExportDone(cmd, label) count++ } - if count == 0 { - cmd.Println("No blob entries to export.") - return nil - } - - verb := "Exported" - if dryRun { - verb = "Would export" - } - cmd.Println(fmt.Sprintf("%s %d blobs.", verb, count)) + write.PadExportSummary(cmd, count, dryRun) return nil } diff --git a/internal/cli/pad/cmd/imp/cmd.go b/internal/cli/pad/cmd/imp/cmd.go index d935e208..87de191b 100644 --- a/internal/cli/pad/cmd/imp/cmd.go +++ b/internal/cli/pad/cmd/imp/cmd.go @@ -19,7 +19,7 @@ import ( func Cmd() *cobra.Command { var blobs bool - short, long := assets.CommandDesc("pad.imp") + short, long := assets.CommandDesc(assets.CmdDescKeyPadImp) cmd := &cobra.Command{ Use: "import FILE", Short: short, @@ -34,7 +34,7 @@ func Cmd() *cobra.Command { } cmd.Flags().BoolVar(&blobs, "blobs", false, - assets.FlagDesc("pad.imp.blobs")) + assets.FlagDesc(assets.FlagDescKeyPadImpBlobs)) return cmd } diff --git a/internal/cli/pad/cmd/imp/doc.go b/internal/cli/pad/cmd/imp/doc.go new file mode 100644 index 00000000..73fbf9bf --- /dev/null +++ b/internal/cli/pad/cmd/imp/doc.go @@ -0,0 +1,10 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package imp implements the ctx pad import subcommand. +// +// It imports entries into the scratch pad from an external file. 
+package imp diff --git a/internal/cli/pad/cmd/imp/run.go b/internal/cli/pad/cmd/imp/run.go index 13d76a4a..e71de6fa 100644 --- a/internal/cli/pad/cmd/imp/run.go +++ b/internal/cli/pad/cmd/imp/run.go @@ -8,15 +8,17 @@ package imp import ( "bufio" - "fmt" "io" "os" - "path/filepath" "strings" + "github.com/ActiveMemory/ctx/internal/config/pad" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/pad/core" + ctxerr "github.com/ActiveMemory/ctx/internal/err" + "github.com/ActiveMemory/ctx/internal/validation" + "github.com/ActiveMemory/ctx/internal/write" ) // runImport reads lines from a file (or stdin) and appends them as entries. @@ -32,13 +34,13 @@ func runImport(cmd *cobra.Command, file string) error { if file == "-" { r = os.Stdin } else { - f, err := os.Open(file) //nolint:gosec // user-provided path is intentional + f, err := validation.OpenUserFile(file) if err != nil { - return fmt.Errorf("open %s: %w", file, err) + return ctxerr.OpenFile(file, err) } defer func() { if cerr := f.Close(); cerr != nil { - fmt.Fprintf(os.Stderr, "warning: close %s: %v\n", file, cerr) + write.ErrPadImportCloseWarning(cmd, file, cerr) } }() r = f @@ -60,11 +62,11 @@ func runImport(cmd *cobra.Command, file string) error { count++ } if scanErr := scanner.Err(); scanErr != nil { - return fmt.Errorf("read input: %w", scanErr) + return ctxerr.ReadInput(scanErr) } if count == 0 { - cmd.Println("No entries to import.") + write.PadImportNone(cmd) return nil } @@ -72,7 +74,7 @@ func runImport(cmd *cobra.Command, file string) error { return writeErr } - cmd.Println(fmt.Sprintf("Imported %d entries.", count)) + write.PadImportDone(cmd, count) return nil } @@ -88,15 +90,15 @@ func runImport(cmd *cobra.Command, file string) error { func runImportBlobs(cmd *cobra.Command, path string) error { info, statErr := os.Stat(path) if statErr != nil { - return fmt.Errorf("stat %s: %w", path, statErr) + return ctxerr.StatPath(path, statErr) } if !info.IsDir() { - return fmt.Errorf("%s is 
not a directory", path) + return ctxerr.NotDirectory(path) } dirEntries, readErr := os.ReadDir(path) if readErr != nil { - return fmt.Errorf("read directory %s: %w", path, readErr) + return ctxerr.ReadDirectory(path, readErr) } entries, loadErr := core.ReadEntries() @@ -111,38 +113,31 @@ func runImportBlobs(cmd *cobra.Command, path string) error { } name := de.Name() - filePath := filepath.Join(path, name) - data, fileErr := os.ReadFile(filePath) //nolint:gosec // user-provided path is intentional + data, fileErr := validation.SafeReadFile(path, name) if fileErr != nil { - cmd.PrintErrln(fmt.Sprintf(" ! skipped: %s (%v)", name, fileErr)) + write.ErrPadImportBlobSkipped(cmd, name, fileErr) skipped++ continue } - if len(data) > core.MaxBlobSize { - cmd.PrintErrln(fmt.Sprintf(" ! skipped: %s (exceeds %d byte limit)", - name, core.MaxBlobSize)) + if len(data) > pad.MaxBlobSize { + write.ErrPadImportBlobTooLarge(cmd, name, pad.MaxBlobSize) skipped++ continue } entries = append(entries, core.MakeBlob(name, data)) - cmd.Println(fmt.Sprintf(" + %s", name)) + write.PadImportBlobAdded(cmd, name) added++ } - if added == 0 && skipped == 0 { - cmd.Println("No files to import.") - return nil - } - if added > 0 { if writeErr := core.WriteEntries(entries); writeErr != nil { return writeErr } } - cmd.Println(fmt.Sprintf("Done. 
Added %d, skipped %d.", added, skipped)) + write.PadImportBlobSummary(cmd, added, skipped) return nil } diff --git a/internal/cli/pad/cmd/merge/cmd.go b/internal/cli/pad/cmd/merge/cmd.go index f0e6d752..34c4107c 100644 --- a/internal/cli/pad/cmd/merge/cmd.go +++ b/internal/cli/pad/cmd/merge/cmd.go @@ -20,21 +20,21 @@ func Cmd() *cobra.Command { var keyFile string var dryRun bool - short, long := assets.CommandDesc("pad.merge") + short, long := assets.CommandDesc(assets.CmdDescKeyPadMerge) cmd := &cobra.Command{ Use: "merge FILE...", Short: short, Long: long, Args: cobra.MinimumNArgs(1), RunE: func(cmd *cobra.Command, args []string) error { - return runMerge(cmd, args, keyFile, dryRun) + return Run(cmd, args, keyFile, dryRun) }, } cmd.Flags().StringVarP(&keyFile, "key", "k", "", - assets.FlagDesc("pad.merge.key")) + assets.FlagDesc(assets.FlagDescKeyPadMergeKey)) cmd.Flags().BoolVar(&dryRun, "dry-run", false, - assets.FlagDesc("pad.merge.dry-run")) + assets.FlagDesc(assets.FlagDescKeyPadMergeDryRun)) return cmd } diff --git a/internal/cli/pad/cmd/merge/doc.go b/internal/cli/pad/cmd/merge/doc.go new file mode 100644 index 00000000..3c7f6c23 --- /dev/null +++ b/internal/cli/pad/cmd/merge/doc.go @@ -0,0 +1,10 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package merge implements the ctx pad merge subcommand. +// +// It merges entries from one or more external pad files into the current pad. 
+package merge diff --git a/internal/cli/pad/cmd/merge/run.go b/internal/cli/pad/cmd/merge/run.go index 7c2e1fa0..784d968f 100644 --- a/internal/cli/pad/cmd/merge/run.go +++ b/internal/cli/pad/cmd/merge/run.go @@ -7,18 +7,14 @@ package merge import ( - "fmt" - "os" - "strings" - "unicode/utf8" - "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/pad/core" - "github.com/ActiveMemory/ctx/internal/crypto" + ctxerr "github.com/ActiveMemory/ctx/internal/err" + "github.com/ActiveMemory/ctx/internal/write" ) -// runMerge reads entries from input files, deduplicates against the current +// Run reads entries from input files, deduplicates against the current // pad, and writes the merged result. // // Parameters: @@ -29,7 +25,7 @@ import ( // // Returns: // - error: Non-nil on read/write failures -func runMerge( +func Run( cmd *cobra.Command, files []string, keyFile string, @@ -40,69 +36,53 @@ func runMerge( return readErr } - key := loadMergeKey(keyFile) + key := core.LoadMergeKey(keyFile) seen := make(map[string]bool, len(current)) for _, e := range current { seen[e] = true } - blobLabels := buildBlobLabelMap(current) + blobLabels := core.BuildBlobLabelMap(current) var added, dupes int var newEntries []string for _, file := range files { - entries, fileErr := readFileEntries(file, key) + entries, fileErr := core.ReadFileEntries(file, key) if fileErr != nil { - return fmt.Errorf("open %s: %w", file, fileErr) + return ctxerr.OpenFile(file, fileErr) } - warnIfBinary(cmd, file, entries) + if core.HasBinaryEntries(entries) { + write.PadMergeBinaryWarning(cmd, file) + } for _, entry := range entries { if seen[entry] { dupes++ - cmd.Println(fmt.Sprintf( - " = %-40s (duplicate, skipped)\n", - core.DisplayEntry(entry), - )) + write.PadMergeDupe(cmd, core.DisplayEntry(entry)) continue } seen[entry] = true - checkBlobConflict(cmd, entry, blobLabels) + + if conflict, label := core.HasBlobConflict(entry, blobLabels); conflict { + write.PadMergeBlobConflict(cmd, label) 
+ } + newEntries = append(newEntries, entry) added++ - cmd.Println(fmt.Sprintf( - " + %-40s (from %s)\n", - core.DisplayEntry(entry), - file, - )) + write.PadMergeAdded(cmd, core.DisplayEntry(entry), file) } } - if added == 0 && dupes == 0 { - cmd.Println("No entries to merge.") - return nil - } - if added == 0 { - cmd.Println(fmt.Sprintf( - "No new entries to merge (%d %s skipped).\n", - dupes, - pluralize("duplicate", dupes), - )) + write.PadMergeSummary(cmd, added, dupes, dryRun) return nil } if dryRun { - cmd.Println(fmt.Sprintf( - "Would merge %d new %s (%d %s skipped).\n", - added, - pluralize("entry", added), - dupes, - pluralize("duplicate", dupes), - )) + write.PadMergeSummary(cmd, added, dupes, dryRun) return nil } @@ -113,146 +93,6 @@ func runMerge( return writeErr } - cmd.Println(fmt.Sprintf( - "Merged %d new %s (%d %s skipped).\n", - added, - pluralize("entry", added), - dupes, - pluralize("duplicate", dupes), - )) + write.PadMergeSummary(cmd, added, dupes, false) return nil } - -// readFileEntries reads a scratchpad file, attempting decryption first. -// -// Parameters: -// - path: Path to the scratchpad file -// - key: Encryption key (nil to skip decryption attempt) -// -// Returns: -// - []string: Parsed entries -// - error: Non-nil if the file cannot be read -func readFileEntries(path string, key []byte) ([]string, error) { - data, readErr := os.ReadFile(path) //nolint:gosec // user-provided path is intentional - if readErr != nil { - return nil, readErr - } - - if len(data) == 0 { - return nil, nil - } - - if key != nil { - plaintext, decErr := crypto.Decrypt(key, data) - if decErr == nil { - return core.ParseEntries(plaintext), nil - } - } - - return core.ParseEntries(data), nil -} - -// loadMergeKey loads the encryption key for merge input decryption. 
-// -// Parameters: -// - keyFile: Explicit key file path (empty string = use project key) -// -// Returns: -// - []byte: The loaded key, or nil if no key is available -func loadMergeKey(keyFile string) []byte { - if keyFile != "" { - key, loadErr := crypto.LoadKey(keyFile) - if loadErr != nil { - return nil - } - return key - } - - key, loadErr := crypto.LoadKey(core.KeyPath()) - if loadErr != nil { - return nil - } - return key -} - -// buildBlobLabelMap creates a map of blob labels to their full entry strings. -// -// Parameters: -// - entries: Scratchpad entries to scan -// -// Returns: -// - map[string]string: Blob label to full entry string -func buildBlobLabelMap(entries []string) map[string]string { - labels := make(map[string]string) - for _, entry := range entries { - if label, _, ok := core.SplitBlob(entry); ok { - labels[label] = entry - } - } - return labels -} - -// checkBlobConflict warns if a blob entry has the same label as an existing -// blob but different content. -// -// Parameters: -// - cmd: Cobra command for output -// - entry: The new entry to check -// - blobLabels: Map of existing blob labels to their full entry strings -func checkBlobConflict( - cmd *cobra.Command, - entry string, - blobLabels map[string]string, -) { - label, _, ok := core.SplitBlob(entry) - if !ok { - return - } - - existing, found := blobLabels[label] - if found && existing != entry { - cmd.Println(fmt.Sprintf( - " ! blob %q has different content across sources; both kept\n", - label, - )) - } - - blobLabels[label] = entry -} - -// warnIfBinary prints a warning if any entries contain non-UTF-8 bytes. -// -// Parameters: -// - cmd: Cobra command for output -// - file: The source file path (for the warning message) -// - entries: The parsed entries to check -func warnIfBinary(cmd *cobra.Command, file string, entries []string) { - for _, entry := range entries { - if !utf8.ValidString(entry) { - cmd.Println(fmt.Sprintf( - " ! 
%s appears to contain binary data;"+ - " it may be encrypted (use --key)\n", - file, - )) - return - } - } -} - -// pluralize returns the singular or plural form of a word. -// -// Parameters: -// - word: The singular form -// - count: The count to check -// -// Returns: -// - string: Singular form if count == 1, otherwise plural -func pluralize(word string, count int) string { - if count == 1 { - return word - } - if strings.HasSuffix(word, "y") { - return word[:len(word)-1] + "ies" - } - return word + "s" -} diff --git a/internal/cli/pad/cmd/mv/cmd.go b/internal/cli/pad/cmd/mv/cmd.go index 5f05b31a..35469dc7 100644 --- a/internal/cli/pad/cmd/mv/cmd.go +++ b/internal/cli/pad/cmd/mv/cmd.go @@ -19,7 +19,7 @@ import ( // Returns: // - *cobra.Command: Configured mv subcommand func Cmd() *cobra.Command { - short, _ := assets.CommandDesc("pad.mv") + short, _ := assets.CommandDesc(assets.CmdDescKeyPadMv) return &cobra.Command{ Use: "mv N M", Short: short, @@ -33,7 +33,7 @@ func Cmd() *cobra.Command { if err != nil { return err } - return runMv(cmd, n, m) + return Run(cmd, n, m) }, } } diff --git a/internal/cli/pad/cmd/mv/doc.go b/internal/cli/pad/cmd/mv/doc.go new file mode 100644 index 00000000..45ad804f --- /dev/null +++ b/internal/cli/pad/cmd/mv/doc.go @@ -0,0 +1,10 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package mv implements the ctx pad mv subcommand. +// +// It moves a pad entry from one position to another by index. 
+package mv diff --git a/internal/cli/pad/cmd/mv/run.go b/internal/cli/pad/cmd/mv/run.go index 6a5a0dbb..d4bca1ce 100644 --- a/internal/cli/pad/cmd/mv/run.go +++ b/internal/cli/pad/cmd/mv/run.go @@ -7,14 +7,13 @@ package mv import ( - "fmt" - "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/pad/core" + "github.com/ActiveMemory/ctx/internal/write" ) -// runMv moves entry from 1-based position n to 1-based position m. +// Run moves entry from 1-based position n to 1-based position m. // // Parameters: // - cmd: Cobra command for output @@ -23,7 +22,7 @@ import ( // // Returns: // - error: Non-nil on invalid index or read/write failure -func runMv(cmd *cobra.Command, n, m int) error { +func Run(cmd *cobra.Command, n, m int) error { entries, err := core.ReadEntries() if err != nil { return err @@ -48,6 +47,6 @@ func runMv(cmd *cobra.Command, n, m int) error { return writeErr } - cmd.Println(fmt.Sprintf("Moved entry %d to %d.", n, m)) + write.PadEntryMoved(cmd, n, m) return nil } diff --git a/internal/cli/pad/cmd/resolve/cmd.go b/internal/cli/pad/cmd/resolve/cmd.go index c6dd2479..5f4395eb 100644 --- a/internal/cli/pad/cmd/resolve/cmd.go +++ b/internal/cli/pad/cmd/resolve/cmd.go @@ -17,13 +17,13 @@ import ( // Returns: // - *cobra.Command: Configured resolve subcommand func Cmd() *cobra.Command { - short, long := assets.CommandDesc("pad.resolve") + short, long := assets.CommandDesc(assets.CmdDescKeyPadResolve) return &cobra.Command{ Use: "resolve", Short: short, Long: long, RunE: func(cmd *cobra.Command, _ []string) error { - return runResolve(cmd) + return Run(cmd) }, } } diff --git a/internal/cli/pad/cmd/resolve/doc.go b/internal/cli/pad/cmd/resolve/doc.go new file mode 100644 index 00000000..2d3066f0 --- /dev/null +++ b/internal/cli/pad/cmd/resolve/doc.go @@ -0,0 +1,10 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +// Package resolve implements the ctx pad resolve subcommand. +// +// It resolves merge conflicts in the scratch pad file. +package resolve diff --git a/internal/cli/pad/cmd/resolve/run.go b/internal/cli/pad/cmd/resolve/run.go index 870981c0..c355bebb 100644 --- a/internal/cli/pad/cmd/resolve/run.go +++ b/internal/cli/pad/cmd/resolve/run.go @@ -7,28 +7,26 @@ package resolve import ( - "errors" - "fmt" - + "github.com/ActiveMemory/ctx/internal/config/pad" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/pad/core" - "github.com/ActiveMemory/ctx/internal/config" "github.com/ActiveMemory/ctx/internal/crypto" ctxerr "github.com/ActiveMemory/ctx/internal/err" "github.com/ActiveMemory/ctx/internal/rc" + "github.com/ActiveMemory/ctx/internal/write" ) -// runResolve reads and prints both sides of a merge conflict. +// Run reads and prints both sides of a merge conflict. // // Parameters: // - cmd: Cobra command for output // // Returns: // - error: Non-nil if no conflict files found or decryption fails -func runResolve(cmd *cobra.Command) error { +func Run(cmd *cobra.Command) error { if !rc.ScratchpadEncrypt() { - return errors.New("resolve is only needed for encrypted scratchpads") + return ctxerr.ResolveNotEncrypted() } kp := core.KeyPath() @@ -39,27 +37,29 @@ func runResolve(cmd *cobra.Command) error { dir := rc.ContextDir() - ours, errOurs := core.DecryptFile(key, dir, config.FileScratchpadEnc+".ours") - theirs, errTheirs := core.DecryptFile(key, dir, config.FileScratchpadEnc+".theirs") + ours, errOurs := core.DecryptFile(key, dir, pad.Enc+".ours") + theirs, errTheirs := core.DecryptFile(key, dir, pad.Enc+".theirs") if errOurs != nil && errTheirs != nil { - return fmt.Errorf("no conflict files found (%s.ours / %s.theirs)", - config.FileScratchpadEnc, config.FileScratchpadEnc) + return ctxerr.NoConflictFiles(pad.Enc) } if errOurs == nil { - cmd.Println("=== OURS ===") - for i, entry := range ours { - 
cmd.Println(fmt.Sprintf(" %d. %s", i+1, core.DisplayEntry(entry))) - } + write.PadResolveSide(cmd, "OURS", displayAll(ours)) } if errTheirs == nil { - cmd.Println("=== THEIRS ===") - for i, entry := range theirs { - cmd.Println(fmt.Sprintf(" %d. %s", i+1, core.DisplayEntry(entry))) - } + write.PadResolveSide(cmd, "THEIRS", displayAll(theirs)) } return nil } + +// displayAll converts entries to their display form. +func displayAll(entries []string) []string { + out := make([]string, len(entries)) + for i, e := range entries { + out[i] = core.DisplayEntry(e) + } + return out +} diff --git a/internal/cli/pad/cmd/rm/cmd.go b/internal/cli/pad/cmd/rm/cmd.go index 0d705275..a14d4d8d 100644 --- a/internal/cli/pad/cmd/rm/cmd.go +++ b/internal/cli/pad/cmd/rm/cmd.go @@ -19,7 +19,7 @@ import ( // Returns: // - *cobra.Command: Configured rm subcommand func Cmd() *cobra.Command { - short, _ := assets.CommandDesc("pad.rm") + short, _ := assets.CommandDesc(assets.CmdDescKeyPadRm) return &cobra.Command{ Use: "rm N", Short: short, @@ -29,7 +29,7 @@ func Cmd() *cobra.Command { if err != nil { return err } - return runRm(cmd, n) + return Run(cmd, n) }, } } diff --git a/internal/cli/pad/cmd/rm/doc.go b/internal/cli/pad/cmd/rm/doc.go new file mode 100644 index 00000000..dc1c8e9e --- /dev/null +++ b/internal/cli/pad/cmd/rm/doc.go @@ -0,0 +1,10 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package rm implements the ctx pad rm subcommand. +// +// It removes a pad entry by its 1-based index. 
+package rm diff --git a/internal/cli/pad/cmd/rm/run.go b/internal/cli/pad/cmd/rm/run.go index cb3adc8b..a82fb848 100644 --- a/internal/cli/pad/cmd/rm/run.go +++ b/internal/cli/pad/cmd/rm/run.go @@ -7,14 +7,13 @@ package rm import ( - "fmt" - "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/pad/core" + "github.com/ActiveMemory/ctx/internal/write" ) -// runRm removes entry at 1-based position n. +// Run removes entry at 1-based position n. // // Parameters: // - cmd: Cobra command for output @@ -22,7 +21,7 @@ import ( // // Returns: // - error: Non-nil on invalid index or read/write failure -func runRm(cmd *cobra.Command, n int) error { +func Run(cmd *cobra.Command, n int) error { entries, err := core.ReadEntries() if err != nil { return err @@ -38,6 +37,6 @@ func runRm(cmd *cobra.Command, n int) error { return writeErr } - cmd.Println(fmt.Sprintf("Removed entry %d.", n)) + write.PadEntryRemoved(cmd, n) return nil } diff --git a/internal/cli/pad/cmd/show/cmd.go b/internal/cli/pad/cmd/show/cmd.go index 943bb489..7f6dc86f 100644 --- a/internal/cli/pad/cmd/show/cmd.go +++ b/internal/cli/pad/cmd/show/cmd.go @@ -7,12 +7,12 @@ package show import ( - "fmt" "strconv" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/assets" + ctxerr "github.com/ActiveMemory/ctx/internal/err" ) // Cmd returns the pad show subcommand. 
@@ -27,7 +27,7 @@ import ( func Cmd() *cobra.Command { var outPath string - short, long := assets.CommandDesc("pad.show") + short, long := assets.CommandDesc(assets.CmdDescKeyPadShow) cmd := &cobra.Command{ Use: "show N", Short: short, @@ -36,13 +36,15 @@ func Cmd() *cobra.Command { RunE: func(cmd *cobra.Command, args []string) error { n, err := strconv.Atoi(args[0]) if err != nil { - return fmt.Errorf("invalid index: %s", args[0]) + return ctxerr.InvalidIndex(args[0]) } - return runShow(cmd, n, outPath) + return Run(cmd, n, outPath) }, } - cmd.Flags().StringVar(&outPath, "out", "", assets.FlagDesc("pad.show.out")) + cmd.Flags().StringVar(&outPath, + "out", "", assets.FlagDesc(assets.FlagDescKeyPadShowOut), + ) return cmd } diff --git a/internal/cli/pad/cmd/show/doc.go b/internal/cli/pad/cmd/show/doc.go new file mode 100644 index 00000000..63988af2 --- /dev/null +++ b/internal/cli/pad/cmd/show/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package show implements the ctx pad show subcommand. +// +// It outputs the raw text of a pad entry by index, designed for pipe +// composability with other pad commands. +package show diff --git a/internal/cli/pad/cmd/show/run.go b/internal/cli/pad/cmd/show/run.go index e550742d..d9474436 100644 --- a/internal/cli/pad/cmd/show/run.go +++ b/internal/cli/pad/cmd/show/run.go @@ -7,16 +7,17 @@ package show import ( - "fmt" "os" + "github.com/ActiveMemory/ctx/internal/config/fs" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/pad/core" ctxerr "github.com/ActiveMemory/ctx/internal/err" + "github.com/ActiveMemory/ctx/internal/write" ) -// runShow prints the raw text of entry at 1-based position n. +// Run prints the raw text of entry at 1-based position n. 
// // Parameters: // - cmd: Cobra command for output @@ -25,7 +26,7 @@ import ( // // Returns: // - error: Non-nil on invalid index, read failure, or write failure -func runShow(cmd *cobra.Command, n int, outPath string) error { +func Run(cmd *cobra.Command, n int, outPath string) error { entries, err := core.ReadEntries() if err != nil { return err @@ -35,28 +36,28 @@ func runShow(cmd *cobra.Command, n int, outPath string) error { return ctxerr.EntryRange(n, 0) } - if err := core.ValidateIndex(n, entries); err != nil { - return err + if validErr := core.ValidateIndex(n, entries); validErr != nil { + return validErr } entry := entries[n-1] - if label, data, ok := core.SplitBlob(entry); ok { - _ = label + if _, data, ok := core.SplitBlob(entry); ok { if outPath != "" { - if writeErr := os.WriteFile(outPath, data, 0600); writeErr != nil { - return fmt.Errorf("write file: %w", writeErr) + if writeErr := os.WriteFile( + outPath, data, fs.PermSecret, + ); writeErr != nil { + return ctxerr.WriteFileFailed(writeErr) } - cmd.Println(fmt.Sprintf("Wrote %d bytes to %s", len(data), outPath)) + write.PadBlobWritten(cmd, len(data), outPath) return nil } cmd.Print(string(data)) return nil } - // Non-blob entry. if outPath != "" { - return fmt.Errorf("--out can only be used with blob entries") + return ctxerr.OutFlagRequiresBlob() } cmd.Println(entry) diff --git a/internal/cli/pad/core/blob.go b/internal/cli/pad/core/blob.go index 0900bdd5..c23c7cb3 100644 --- a/internal/cli/pad/core/blob.go +++ b/internal/cli/pad/core/blob.go @@ -9,13 +9,9 @@ package core import ( "encoding/base64" "strings" -) - -// BlobSep separates the label from the base64-encoded file content. -const BlobSep = ":::" -// MaxBlobSize is the maximum file size (pre-encoding) allowed for blob entries. -const MaxBlobSize = 64 * 1024 + "github.com/ActiveMemory/ctx/internal/config/pad" +) // IsBlob returns true if the entry contains the blob separator. 
// @@ -25,7 +21,7 @@ const MaxBlobSize = 64 * 1024 // Returns: // - bool: True if entry is a blob func IsBlob(entry string) bool { - return strings.Contains(entry, BlobSep) + return strings.Contains(entry, pad.BlobSep) } // SplitBlob parses a blob entry into its label and decoded data. @@ -38,13 +34,13 @@ func IsBlob(entry string) bool { // - data: Decoded file content // - ok: False for non-blob entries or malformed base64 func SplitBlob(entry string) (label string, data []byte, ok bool) { - idx := strings.Index(entry, BlobSep) + idx := strings.Index(entry, pad.BlobSep) if idx < 0 { return "", nil, false } label = entry[:idx] - encoded := entry[idx+len(BlobSep):] + encoded := entry[idx+len(pad.BlobSep):] data, err := base64.StdEncoding.DecodeString(encoded) if err != nil { @@ -63,7 +59,7 @@ func SplitBlob(entry string) (label string, data []byte, ok bool) { // Returns: // - string: Formatted blob entry func MakeBlob(label string, data []byte) string { - return label + BlobSep + base64.StdEncoding.EncodeToString(data) + return label + pad.BlobSep + base64.StdEncoding.EncodeToString(data) } // DisplayEntry returns a display-friendly version of an entry. @@ -77,7 +73,7 @@ func MakeBlob(label string, data []byte) string { // - string: Human-readable entry representation func DisplayEntry(entry string) string { if label, _, ok := SplitBlob(entry); ok { - return label + " [BLOB]" + return label + pad.BlobTag } return entry } diff --git a/internal/cli/pad/core/merge.go b/internal/cli/pad/core/merge.go new file mode 100644 index 00000000..0b63953a --- /dev/null +++ b/internal/cli/pad/core/merge.go @@ -0,0 +1,118 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +package core + +import ( + "unicode/utf8" + + "github.com/ActiveMemory/ctx/internal/crypto" + "github.com/ActiveMemory/ctx/internal/validation" +) + +// ReadFileEntries reads a scratchpad file, attempting decryption first. +// +// Parameters: +// - path: path to the scratchpad file. +// - key: encryption key (nil to skip the decryption attempt). +// +// Returns: +// - []string: parsed entries. +// - error: non-nil if the file cannot be read. +func ReadFileEntries(path string, key []byte) ([]string, error) { + data, readErr := validation.ReadUserFile(path) + if readErr != nil { + return nil, readErr + } + + if len(data) == 0 { + return nil, nil + } + + if key != nil { + plaintext, decErr := crypto.Decrypt(key, data) + if decErr == nil { + return ParseEntries(plaintext), nil + } + } + + return ParseEntries(data), nil +} + +// LoadMergeKey loads the encryption key for merge input decryption. +// +// Parameters: +// - keyFile: explicit key file path (empty string = use project key). +// +// Returns: +// - []byte: the loaded key, or nil if no key is available. +func LoadMergeKey(keyFile string) []byte { + path := keyFile + if path == "" { + path = KeyPath() + } + + key, loadErr := crypto.LoadKey(path) + if loadErr != nil { + return nil + } + return key +} + +// BuildBlobLabelMap creates a map of blob labels to their full entry strings. +// +// Parameters: +// - entries: scratchpad entries to scan. +// +// Returns: +// - map[string]string: blob label to full entry string. +func BuildBlobLabelMap(entries []string) map[string]string { + labels := make(map[string]string) + for _, entry := range entries { + if label, _, ok := SplitBlob(entry); ok { + labels[label] = entry + } + } + return labels +} + +// HasBlobConflict checks if a blob entry has the same label as an existing +// blob but different content. Updates the label map with the new entry. +// +// Parameters: +// - entry: the new entry to check. 
+// - blobLabels: map of existing blob labels to their full entry strings. +// +// Returns: +// - bool: true if a conflict was detected. +// - string: the conflicting label (empty if no conflict). +func HasBlobConflict(entry string, blobLabels map[string]string) (bool, string) { + label, _, ok := SplitBlob(entry) + if !ok { + return false, "" + } + + existing, found := blobLabels[label] + conflict := found && existing != entry + blobLabels[label] = entry + return conflict, label +} + +// HasBinaryEntries checks if any entries contain non-UTF-8 bytes. +// +// Parameters: +// - entries: the parsed entries to check. +// +// Returns: +// - bool: true if any entry contains non-UTF-8 data. +func HasBinaryEntries(entries []string) bool { + for _, entry := range entries { + if !utf8.ValidString(entry) { + return true + } + } + return false +} diff --git a/internal/cli/pad/core/parse.go b/internal/cli/pad/core/parse.go index eb23ec58..c892ae6b 100644 --- a/internal/cli/pad/core/parse.go +++ b/internal/cli/pad/core/parse.go @@ -9,7 +9,7 @@ package core import ( "strings" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/token" ) // ParseEntries splits raw bytes into entry lines, filtering empty lines. 
@@ -23,7 +23,7 @@ func ParseEntries(data []byte) []string { if len(data) == 0 { return nil } - lines := strings.Split(string(data), config.NewlineLF) + lines := strings.Split(string(data), token.NewlineLF) var entries []string for _, line := range lines { if line != "" { @@ -44,5 +44,5 @@ func FormatEntries(entries []string) []byte { if len(entries) == 0 { return nil } - return []byte(strings.Join(entries, config.NewlineLF) + config.NewlineLF) + return []byte(strings.Join(entries, token.NewlineLF) + token.NewlineLF) } diff --git a/internal/cli/pad/core/store.go b/internal/cli/pad/core/store.go index 4e20b8e3..e59b2348 100644 --- a/internal/cli/pad/core/store.go +++ b/internal/cli/pad/core/store.go @@ -13,28 +13,25 @@ import ( "path/filepath" "strings" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/config/fs" + "github.com/ActiveMemory/ctx/internal/config/pad" + "github.com/ActiveMemory/ctx/internal/config/token" "github.com/ActiveMemory/ctx/internal/crypto" ctxerr "github.com/ActiveMemory/ctx/internal/err" "github.com/ActiveMemory/ctx/internal/rc" "github.com/ActiveMemory/ctx/internal/validation" ) -// Output messages matching the spec. -const ( - MsgEmpty = "Scratchpad is empty." - MsgKeyCreated = "Scratchpad key created at %s\n" -) - // ScratchpadPath returns the full path to the scratchpad file. // // Returns: // - string: Encrypted or plaintext path based on rc.ScratchpadEncrypt() func ScratchpadPath() string { if rc.ScratchpadEncrypt() { - return filepath.Join(rc.ContextDir(), config.FileScratchpadEnc) + return filepath.Join(rc.ContextDir(), pad.Enc) } - return filepath.Join(rc.ContextDir(), config.FileScratchpadMd) + return filepath.Join(rc.ContextDir(), pad.Md) } // KeyPath returns the full path to the encryption key file. 
@@ -45,7 +42,7 @@ func ScratchpadPath() string { // Returns: // - string: Resolved key file path func KeyPath() string { - config.MigrateKeyFile(rc.ContextDir()) + crypto.MigrateKeyFile(rc.ContextDir()) return rc.KeyPath() } @@ -74,19 +71,18 @@ func EnsureKey() error { // First use: generate key. key, genErr := crypto.GenerateKey() if genErr != nil { - return fmt.Errorf("generate scratchpad key: %w", genErr) + return ctxerr.GenerateKey(genErr) } - // Ensure parent directory exists (user-level or project-local). - if mkErr := os.MkdirAll(filepath.Dir(kp), config.PermKeyDir); mkErr != nil { - return fmt.Errorf("create key dir: %w", mkErr) + if mkErr := os.MkdirAll(filepath.Dir(kp), fs.PermKeyDir); mkErr != nil { + return ctxerr.MkdirKeyDir(mkErr) } if saveErr := crypto.SaveKey(kp, key); saveErr != nil { - return fmt.Errorf("save scratchpad key: %w", saveErr) + return ctxerr.SaveKey(saveErr) } - fmt.Fprintf(os.Stderr, MsgKeyCreated, kp) + fmt.Fprintln(os.Stderr, fmt.Sprintf(assets.TextDesc(assets.TextDescKeyPadKeyCreated), kp)) //nolint:errcheck // best-effort notice return nil } @@ -107,17 +103,17 @@ func EnsureGitignore(contextDir, filename string) error { return err } - for _, line := range strings.Split(string(content), config.NewlineLF) { + for _, line := range strings.Split(string(content), token.NewlineLF) { if strings.TrimSpace(line) == entry { return nil } } sep := "" - if len(content) > 0 && !strings.HasSuffix(string(content), config.NewlineLF) { - sep = config.NewlineLF + if len(content) > 0 && !strings.HasSuffix(string(content), token.NewlineLF) { + sep = token.NewlineLF } - return os.WriteFile(gitignorePath, []byte(string(content)+sep+entry+config.NewlineLF), config.PermFile) + return os.WriteFile(gitignorePath, []byte(string(content)+sep+entry+token.NewlineLF), fs.PermFile) } // ReadEntries reads the scratchpad and returns its entries. 
@@ -138,14 +134,13 @@ func ReadEntries() ([]string, error) { if errors.Is(err, os.ErrNotExist) { return nil, nil } - return nil, fmt.Errorf("read scratchpad: %w", err) + return nil, ctxerr.ReadScratchpad(err) } if !rc.ScratchpadEncrypt() { return ParseEntries(data), nil } - // Encrypted mode: load key and decrypt kp := KeyPath() key, loadErr := crypto.LoadKey(kp) if loadErr != nil { @@ -175,10 +170,9 @@ func WriteEntries(entries []string) error { plaintext := FormatEntries(entries) if !rc.ScratchpadEncrypt() { - return os.WriteFile(path, plaintext, config.PermFile) + return os.WriteFile(path, plaintext, fs.PermFile) } - // Encrypted mode: ensure key exists (auto-generate on first use). if err := EnsureKey(); err != nil { return err } @@ -194,5 +188,5 @@ func WriteEntries(entries []string) error { return ctxerr.EncryptFailed(encErr) } - return os.WriteFile(path, ciphertext, config.PermFile) + return os.WriteFile(path, ciphertext, fs.PermFile) } diff --git a/internal/cli/pad/pad.go b/internal/cli/pad/pad.go index d697314d..0a61f0b0 100644 --- a/internal/cli/pad/pad.go +++ b/internal/cli/pad/pad.go @@ -22,6 +22,7 @@ import ( "github.com/ActiveMemory/ctx/internal/cli/pad/cmd/rm" "github.com/ActiveMemory/ctx/internal/cli/pad/cmd/show" "github.com/ActiveMemory/ctx/internal/cli/pad/core" + "github.com/ActiveMemory/ctx/internal/write" ) // Cmd returns the pad command with subcommands. 
@@ -31,7 +32,7 @@ import ( // Returns: // - *cobra.Command: Configured pad command with subcommands func Cmd() *cobra.Command { - short, long := assets.CommandDesc("pad") + short, long := assets.CommandDesc(assets.CmdDescKeyPad) cmd := &cobra.Command{ Use: "pad", Short: short, @@ -69,7 +70,7 @@ func runList(cmd *cobra.Command) error { } if len(entries) == 0 { - cmd.Println(core.MsgEmpty) + write.PadEmpty(cmd) return nil } diff --git a/internal/cli/pad/pad_test.go b/internal/cli/pad/pad_test.go index 36e201bf..821cc09e 100644 --- a/internal/cli/pad/pad_test.go +++ b/internal/cli/pad/pad_test.go @@ -15,10 +15,12 @@ import ( "strings" "testing" + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/config/pad" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/pad/core" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/fs" "github.com/ActiveMemory/ctx/internal/crypto" ctxerr "github.com/ActiveMemory/ctx/internal/err" "github.com/ActiveMemory/ctx/internal/rc" @@ -29,28 +31,28 @@ import ( // sets the RC context dir override, and returns the temp dir path. func setupEncrypted(t *testing.T) string { t.Helper() - dir := t.TempDir() + tmpDir := t.TempDir() origDir, _ := os.Getwd() - if err := os.Chdir(dir); err != nil { + if err := os.Chdir(tmpDir); err != nil { t.Fatal(err) } - t.Setenv("HOME", dir) + t.Setenv("HOME", tmpDir) t.Cleanup(func() { _ = os.Chdir(origDir) rc.Reset() }) rc.Reset() - rc.OverrideContextDir(config.DirContext) + rc.OverrideContextDir(dir.Context) - ctxDir := filepath.Join(dir, config.DirContext) + ctxDir := filepath.Join(tmpDir, dir.Context) if err := os.MkdirAll(ctxDir, 0750); err != nil { t.Fatal(err) } // Write key to the global path (where rc.KeyPath resolves). 
- userKeyPath := config.GlobalKeyPath() - if err := os.MkdirAll(filepath.Dir(userKeyPath), config.PermKeyDir); err != nil { + userKeyPath := crypto.GlobalKeyPath() + if err := os.MkdirAll(filepath.Dir(userKeyPath), fs.PermKeyDir); err != nil { t.Fatal(err) } key, err := crypto.GenerateKey() @@ -61,19 +63,19 @@ func setupEncrypted(t *testing.T) string { t.Fatal(err) } - return dir + return tmpDir } // setupPlaintext creates a temp dir with a .context/ directory and // scratchpad_encrypt: false in .ctxrc. func setupPlaintext(t *testing.T) string { t.Helper() - dir := t.TempDir() + tmpDir := t.TempDir() origDir, _ := os.Getwd() - if err := os.Chdir(dir); err != nil { + if err := os.Chdir(tmpDir); err != nil { t.Fatal(err) } - t.Setenv("HOME", dir) + t.Setenv("HOME", tmpDir) t.Cleanup(func() { _ = os.Chdir(origDir) rc.Reset() @@ -81,18 +83,18 @@ func setupPlaintext(t *testing.T) string { // Write .ctxrc with encryption disabled rcContent := "scratchpad_encrypt: false\n" - if err := os.WriteFile(filepath.Join(dir, ".ctxrc"), []byte(rcContent), 0600); err != nil { + if err := os.WriteFile(filepath.Join(tmpDir, ".ctxrc"), []byte(rcContent), 0600); err != nil { t.Fatal(err) } rc.Reset() - ctxDir := filepath.Join(dir, config.DirContext) + ctxDir := filepath.Join(tmpDir, dir.Context) if err := os.MkdirAll(ctxDir, 0750); err != nil { t.Fatal(err) } - return dir + return tmpDir } // runCmd executes a cobra command and captures its output. 
@@ -118,8 +120,8 @@ func TestList_Empty(t *testing.T) { if err != nil { t.Fatalf("unexpected error: %v", err) } - if !strings.Contains(out, core.MsgEmpty) { - t.Errorf("output = %q, want %q", out, core.MsgEmpty) + if !strings.Contains(out, "Scratchpad is empty.") { + t.Errorf("output = %q, want %q", out, "Scratchpad is empty.") } } @@ -156,7 +158,7 @@ func TestAdd_Plaintext(t *testing.T) { } // Verify the file is plain text - path := filepath.Join(config.DirContext, config.FileScratchpadMd) + path := filepath.Join(dir.Context, pad.Md) data, err := os.ReadFile(path) //nolint:gosec // test reads a known test file path if err != nil { t.Fatalf("ReadFile() error: %v", err) @@ -477,10 +479,10 @@ func TestMv_OutOfRange(t *testing.T) { } func TestNoKey_EncryptedFileExists(t *testing.T) { - dir := t.TempDir() - t.Setenv("HOME", dir) + tmpDir := t.TempDir() + t.Setenv("HOME", tmpDir) origDir, _ := os.Getwd() - if err := os.Chdir(dir); err != nil { + if err := os.Chdir(tmpDir); err != nil { t.Fatal(err) } t.Cleanup(func() { @@ -489,16 +491,16 @@ func TestNoKey_EncryptedFileExists(t *testing.T) { }) rc.Reset() - rc.OverrideContextDir(config.DirContext) + rc.OverrideContextDir(dir.Context) - ctxDir := filepath.Join(dir, config.DirContext) + ctxDir := filepath.Join(tmpDir, dir.Context) if err := os.MkdirAll(ctxDir, 0750); err != nil { t.Fatal(err) } // Create an encrypted file but no key if err := os.WriteFile( - filepath.Join(ctxDir, config.FileScratchpadEnc), + filepath.Join(ctxDir, pad.Enc), []byte("encrypted data here but dummy"), 0600, ); err != nil { @@ -703,9 +705,9 @@ func TestEdit_InvalidIndex(t *testing.T) { } func TestEnsureGitignore_NewFile(t *testing.T) { - dir := t.TempDir() + tmpDir := t.TempDir() origDir, _ := os.Getwd() - if err := os.Chdir(dir); err != nil { + if err := os.Chdir(tmpDir); err != nil { t.Fatal(err) } t.Cleanup(func() { _ = os.Chdir(origDir) }) @@ -725,9 +727,9 @@ func TestEnsureGitignore_NewFile(t *testing.T) { } func 
TestEnsureGitignore_AlreadyPresent(t *testing.T) {
-	dir := t.TempDir()
+	tmpDir := t.TempDir()
 	origDir, _ := os.Getwd()
-	if err := os.Chdir(dir); err != nil {
+	if err := os.Chdir(tmpDir); err != nil {
 		t.Fatal(err)
 	}
 	t.Cleanup(func() { _ = os.Chdir(origDir) })
@@ -751,9 +753,9 @@ func TestEnsureGitignore_AlreadyPresent(t *testing.T) {
 }
 
 func TestEnsureGitignore_AppendToExisting(t *testing.T) {
-	dir := t.TempDir()
+	tmpDir := t.TempDir()
 	origDir, _ := os.Getwd()
-	if err := os.Chdir(dir); err != nil {
+	if err := os.Chdir(tmpDir); err != nil {
 		t.Fatal(err)
 	}
 	t.Cleanup(func() { _ = os.Chdir(origDir) })
@@ -781,8 +783,8 @@ func TestScratchpadPath_Plaintext(t *testing.T) {
 	setupPlaintext(t)
 
 	path := core.ScratchpadPath()
-	if !strings.HasSuffix(path, config.FileScratchpadMd) {
-		t.Errorf("core.ScratchpadPath() = %q, want suffix %q", path, config.FileScratchpadMd)
+	if !strings.HasSuffix(path, pad.Md) {
+		t.Errorf("core.ScratchpadPath() = %q, want suffix %q", path, pad.Md)
 	}
 }
 
@@ -790,8 +792,8 @@ func TestScratchpadPath_Encrypted(t *testing.T) {
 	setupEncrypted(t)
 
 	path := core.ScratchpadPath()
-	if !strings.HasSuffix(path, config.FileScratchpadEnc) {
-		t.Errorf("core.ScratchpadPath() = %q, want suffix %q", path, config.FileScratchpadEnc)
+	if !strings.HasSuffix(path, pad.Enc) {
+		t.Errorf("core.ScratchpadPath() = %q, want suffix %q", path, pad.Enc)
 	}
 }
 
@@ -818,27 +820,27 @@ func TestEnsureKey_KeyAlreadyExists(t *testing.T) {
 }
 
 func TestEnsureKey_EncFileExistsNoKey(t *testing.T) {
-	dir := t.TempDir()
+	tmpDir := t.TempDir()
 	origDir, _ := os.Getwd()
-	if err := os.Chdir(dir); err != nil {
+	if err := os.Chdir(tmpDir); err != nil {
 		t.Fatal(err)
 	}
-	t.Setenv("HOME", dir)
+	t.Setenv("HOME", tmpDir)
 	t.Cleanup(func() {
 		_ = os.Chdir(origDir)
 		rc.Reset()
 	})
 	rc.Reset()
-	rc.OverrideContextDir(config.DirContext)
+	rc.OverrideContextDir(dir.Context)
 
-	ctxDir := filepath.Join(dir, config.DirContext)
+	ctxDir := filepath.Join(tmpDir, dir.Context)
 	if err := os.MkdirAll(ctxDir, 0750); err != nil {
 		t.Fatal(err)
 	}
 
 	// Create enc file but no key
-	encPath := filepath.Join(ctxDir, config.FileScratchpadEnc)
+	encPath := filepath.Join(ctxDir, pad.Enc)
 	if err := os.WriteFile(encPath, []byte("data"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -853,21 +855,21 @@ func TestEnsureKey_EncFileExistsNoKey(t *testing.T) {
 }
 
 func TestEnsureKey_GeneratesNewKey(t *testing.T) {
-	dir := t.TempDir()
+	tmpDir := t.TempDir()
 	origDir, _ := os.Getwd()
-	if err := os.Chdir(dir); err != nil {
+	if err := os.Chdir(tmpDir); err != nil {
 		t.Fatal(err)
 	}
-	t.Setenv("HOME", dir)
+	t.Setenv("HOME", tmpDir)
 	t.Cleanup(func() {
 		_ = os.Chdir(origDir)
 		rc.Reset()
 	})
 	rc.Reset()
-	rc.OverrideContextDir(config.DirContext)
+	rc.OverrideContextDir(dir.Context)
 
-	ctxDir := filepath.Join(dir, config.DirContext)
+	ctxDir := filepath.Join(tmpDir, dir.Context)
 	if err := os.MkdirAll(ctxDir, 0750); err != nil {
 		t.Fatal(err)
 	}
@@ -970,7 +972,7 @@ func TestResolve_WithConflictFiles(t *testing.T) {
 	if err != nil {
 		t.Fatal(err)
 	}
-	oursPath := filepath.Join(config.DirContext, config.FileScratchpadEnc+".ours")
+	oursPath := filepath.Join(dir.Context, pad.Enc+".ours")
 	err = os.WriteFile(oursPath, oursCipher, 0600)
 	if err != nil {
 		t.Fatal(err)
@@ -982,7 +984,7 @@ func TestResolve_WithConflictFiles(t *testing.T) {
 	if err != nil {
 		t.Fatal(err)
 	}
-	theirsPath := filepath.Join(config.DirContext, config.FileScratchpadEnc+".theirs")
+	theirsPath := filepath.Join(dir.Context, pad.Enc+".theirs")
 	err = os.WriteFile(theirsPath, theirsCipher, 0600)
 	if err != nil {
 		t.Fatal(err)
@@ -1019,7 +1021,7 @@ func TestResolve_OnlyOursFile(t *testing.T) {
 	if err != nil {
 		t.Fatal(err)
 	}
-	oursPath := filepath.Join(config.DirContext, config.FileScratchpadEnc+".ours")
+	oursPath := filepath.Join(dir.Context, pad.Enc+".ours")
 	err = os.WriteFile(oursPath, oursCipher, 0600)
 	if err != nil {
 		t.Fatal(err)
@@ -1063,7 +1065,7 @@ func TestList_PlaintextEmpty(t *testing.T) {
 	if err != nil {
 		t.Fatalf("list error: %v", err)
 	}
-	if !strings.Contains(out, core.MsgEmpty) {
+	if !strings.Contains(out, "Scratchpad is empty.") {
 		t.Errorf("output = %q, want empty message", out)
 	}
 }
@@ -1112,12 +1114,12 @@ func TestEdit_PrependOutOfRange(t *testing.T) {
 
 func TestDecryptFile_BadData(t *testing.T) {
 	key, _ := crypto.GenerateKey()
-	dir := t.TempDir()
-	if writeErr := os.WriteFile(filepath.Join(dir, "bad.enc"), []byte("not-encrypted"), 0600); writeErr != nil {
+	tmpDir := t.TempDir()
+	if writeErr := os.WriteFile(filepath.Join(tmpDir, "bad.enc"), []byte("not-encrypted"), 0600); writeErr != nil {
 		t.Fatal(writeErr)
 	}
 
-	_, err := core.DecryptFile(key, dir, "bad.enc")
+	_, err := core.DecryptFile(key, tmpDir, "bad.enc")
 	if err == nil {
 		t.Fatal("expected decryption error for bad data")
 	}
@@ -1128,9 +1130,9 @@ func TestDecryptFile_BadData(t *testing.T) {
 
 func TestDecryptFile_MissingFile(t *testing.T) {
 	key, _ := crypto.GenerateKey()
-	dir := t.TempDir()
+	tmpDir := t.TempDir()
 
-	_, err := core.DecryptFile(key, dir, "nonexistent.enc")
+	_, err := core.DecryptFile(key, tmpDir, "nonexistent.enc")
 	if err == nil {
 		t.Fatal("expected error for missing file")
 	}
@@ -1138,18 +1140,18 @@ func TestDecryptFile_MissingFile(t *testing.T) {
 
 func TestDecryptFile_ValidData(t *testing.T) {
 	key, _ := crypto.GenerateKey()
-	dir := t.TempDir()
+	tmpDir := t.TempDir()
 
 	plaintext := []byte("entry1\nentry2\n")
 	ciphertext, encErr := crypto.Encrypt(key, plaintext)
 	if encErr != nil {
 		t.Fatal(encErr)
 	}
-	if writeErr := os.WriteFile(filepath.Join(dir, "good.enc"), ciphertext, 0600); writeErr != nil {
+	if writeErr := os.WriteFile(filepath.Join(tmpDir, "good.enc"), ciphertext, 0600); writeErr != nil {
 		t.Fatal(writeErr)
 	}
 
-	entries, err := core.DecryptFile(key, dir, "good.enc")
+	entries, err := core.DecryptFile(key, tmpDir, "good.enc")
 	if err != nil {
 		t.Fatalf("decryptFile error: %v", err)
 	}
@@ -1261,7 +1263,7 @@ func TestIsBlob(t *testing.T) {
 
 func TestSplitBlob_Valid(t *testing.T) {
 	data := []byte("hello world")
 	encoded := base64.StdEncoding.EncodeToString(data)
-	entry := "my label" + core.BlobSep + encoded
+	entry := "my label" + pad.BlobSep + encoded
 
 	label, decoded, ok := core.SplitBlob(entry)
 	if !ok {
@@ -1324,10 +1326,10 @@ func TestDisplayEntry_Plain(t *testing.T) {
 
 // --- Blob add tests ---
 
 func TestAdd_BlobEncrypted(t *testing.T) {
-	dir := setupEncrypted(t)
+	tmpDir := setupEncrypted(t)
 
 	// Create a test file.
-	testFile := filepath.Join(dir, "test-blob.md")
+	testFile := filepath.Join(tmpDir, "test-blob.md")
 	content := "secret plan content\n"
 	if err := os.WriteFile(testFile, []byte(content), 0600); err != nil {
 		t.Fatal(err)
@@ -1352,17 +1354,17 @@ func TestAdd_BlobEncrypted(t *testing.T) {
 }
 
 func TestAdd_BlobTooLarge(t *testing.T) {
-	dir := setupEncrypted(t)
+	tmpDir := setupEncrypted(t)
 
-	testFile := filepath.Join(dir, "big.bin")
-	data := make([]byte, core.MaxBlobSize+1)
+	testFile := filepath.Join(tmpDir, "big.bin")
+	data := make([]byte, pad.MaxBlobSize+1)
 	if err := os.WriteFile(testFile, data, 0600); err != nil {
 		t.Fatal(err)
 	}
 
 	_, err := runCmd(newPadCmd("add", "--file", testFile, "big blob"))
 	if err == nil {
-		t.Fatal("expected error for file exceeding core.MaxBlobSize")
+		t.Fatal("expected error for file exceeding pad.MaxBlobSize")
 	}
 	if !strings.Contains(err.Error(), "file too large") {
 		t.Errorf("error = %q, want 'file too large'", err.Error())
@@ -1384,7 +1386,7 @@ func TestAdd_BlobFileNotFound(t *testing.T) {
 
 // --- Blob list tests ---
 
 func TestList_BlobDisplay(t *testing.T) {
-	dir := setupEncrypted(t)
+	tmpDir := setupEncrypted(t)
 
 	// Add a plain entry.
 	if _, err := runCmd(newPadCmd("add", "plain note")); err != nil {
 		t.Fatal(err)
 	}
 
 	// Add a blob entry.
-	testFile := filepath.Join(dir, "blob.txt")
+	testFile := filepath.Join(tmpDir, "blob.txt")
 	if err := os.WriteFile(testFile, []byte("file content"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -1415,10 +1417,10 @@ func TestList_BlobDisplay(t *testing.T) {
 
 // --- Blob show tests ---
 
 func TestShow_BlobAutoDecodes(t *testing.T) {
-	dir := setupEncrypted(t)
+	tmpDir := setupEncrypted(t)
 
 	content := "decoded file content\n"
-	testFile := filepath.Join(dir, "blob.txt")
+	testFile := filepath.Join(tmpDir, "blob.txt")
 	if err := os.WriteFile(testFile, []byte(content), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -1436,10 +1438,10 @@ func TestShow_BlobAutoDecodes(t *testing.T) {
 }
 
 func TestShow_BlobOutFlag(t *testing.T) {
-	dir := setupEncrypted(t)
+	tmpDir := setupEncrypted(t)
 
 	content := "file to recover\n"
-	testFile := filepath.Join(dir, "blob.txt")
+	testFile := filepath.Join(tmpDir, "blob.txt")
 	if err := os.WriteFile(testFile, []byte(content), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -1447,7 +1449,7 @@ func TestShow_BlobOutFlag(t *testing.T) {
 		t.Fatal(err)
 	}
 
-	outFile := filepath.Join(dir, "recovered.txt")
+	outFile := filepath.Join(tmpDir, "recovered.txt")
 	out, err := runCmd(newPadCmd("show", "1", "--out", outFile))
 	if err != nil {
 		t.Fatalf("show --out error: %v", err)
@@ -1466,13 +1468,13 @@ func TestShow_BlobOutFlag(t *testing.T) {
 }
 
 func TestShow_OutFlagOnPlainEntry(t *testing.T) {
-	dir := setupEncrypted(t)
+	tmpDir := setupEncrypted(t)
 
 	if _, err := runCmd(newPadCmd("add", "plain note")); err != nil {
 		t.Fatal(err)
 	}
 
-	outFile := filepath.Join(dir, "out.txt")
+	outFile := filepath.Join(tmpDir, "out.txt")
 	_, err := runCmd(newPadCmd("show", "1", "--out", outFile))
 	if err == nil {
 		t.Fatal("expected error for --out on plain entry")
@@ -1485,10 +1487,10 @@ func TestShow_OutFlagOnPlainEntry(t *testing.T) {
 
 // --- Blob edit tests ---
 
 func TestEdit_BlobReplaceFile(t *testing.T) {
-	dir := setupEncrypted(t)
+	tmpDir := setupEncrypted(t)
 
 	// Add a blob entry.
-	v1 := filepath.Join(dir, "v1.txt")
+	v1 := filepath.Join(tmpDir, "v1.txt")
 	if err := os.WriteFile(v1, []byte("version 1"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -1497,7 +1499,7 @@ func TestEdit_BlobReplaceFile(t *testing.T) {
 	}
 
 	// Replace the file content.
-	v2 := filepath.Join(dir, "v2.txt")
+	v2 := filepath.Join(tmpDir, "v2.txt")
 	if err := os.WriteFile(v2, []byte("version 2"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -1528,9 +1530,9 @@ func TestEdit_BlobReplaceFile(t *testing.T) {
 }
 
 func TestEdit_BlobReplaceLabel(t *testing.T) {
-	dir := setupEncrypted(t)
+	tmpDir := setupEncrypted(t)
 
-	v1 := filepath.Join(dir, "v1.txt")
+	v1 := filepath.Join(tmpDir, "v1.txt")
 	if err := os.WriteFile(v1, []byte("content"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -1566,9 +1568,9 @@ func TestEdit_BlobReplaceLabel(t *testing.T) {
 }
 
 func TestEdit_BlobReplaceBoth(t *testing.T) {
-	dir := setupEncrypted(t)
+	tmpDir := setupEncrypted(t)
 
-	v1 := filepath.Join(dir, "v1.txt")
+	v1 := filepath.Join(tmpDir, "v1.txt")
 	if err := os.WriteFile(v1, []byte("old content"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -1576,7 +1578,7 @@ func TestEdit_BlobReplaceBoth(t *testing.T) {
 		t.Fatal(err)
 	}
 
-	v2 := filepath.Join(dir, "v2.txt")
+	v2 := filepath.Join(tmpDir, "v2.txt")
 	if err := os.WriteFile(v2, []byte("new content"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -1607,9 +1609,9 @@ func TestEdit_BlobReplaceBoth(t *testing.T) {
 }
 
 func TestEdit_AppendOnBlobErrors(t *testing.T) {
-	dir := setupEncrypted(t)
+	tmpDir := setupEncrypted(t)
 
-	testFile := filepath.Join(dir, "blob.txt")
+	testFile := filepath.Join(tmpDir, "blob.txt")
 	if err := os.WriteFile(testFile, []byte("content"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -1627,9 +1629,9 @@ func TestEdit_AppendOnBlobErrors(t *testing.T) {
 }
 
 func TestEdit_PrependOnBlobErrors(t *testing.T) {
-	dir := setupEncrypted(t)
+	tmpDir := setupEncrypted(t)
 
-	testFile := filepath.Join(dir, "blob.txt")
+	testFile := filepath.Join(tmpDir, "blob.txt")
 	if err := os.WriteFile(testFile, []byte("content"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -1663,9 +1665,9 @@ func TestEdit_LabelOnNonBlobErrors(t *testing.T) {
 }
 
 func TestEdit_FileAndPositionalMutuallyExclusive(t *testing.T) {
-	dir := setupEncrypted(t)
+	tmpDir := setupEncrypted(t)
 
-	testFile := filepath.Join(dir, "blob.txt")
+	testFile := filepath.Join(tmpDir, "blob.txt")
 	if err := os.WriteFile(testFile, []byte("content"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -1685,9 +1687,9 @@ func TestEdit_FileAndPositionalMutuallyExclusive(t *testing.T) {
 
 // --- Import tests ---
 
 func TestImport_FromFile(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
-	importFile := filepath.Join(dir, "notes.txt")
+	importFile := filepath.Join(tmpDir, "notes.txt")
 	if err := os.WriteFile(importFile, []byte("alpha\nbeta\ngamma\n"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -1713,9 +1715,9 @@ func TestImport_FromFile(t *testing.T) {
 }
 
 func TestImport_SkipsEmpty(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
-	importFile := filepath.Join(dir, "notes.txt")
+	importFile := filepath.Join(tmpDir, "notes.txt")
 	if err := os.WriteFile(importFile, []byte("alpha\n\n\nbeta\n\n"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -1730,9 +1732,9 @@ func TestImport_SkipsEmpty(t *testing.T) {
 }
 
 func TestImport_EmptyFile(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
-	importFile := filepath.Join(dir, "empty.txt")
+	importFile := filepath.Join(tmpDir, "empty.txt")
 	if err := os.WriteFile(importFile, []byte(""), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -1747,7 +1749,7 @@ func TestImport_EmptyFile(t *testing.T) {
 }
 
 func TestImport_AppendsToExisting(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
 	// Add 2 entries first
 	for _, e := range []string{"existing1", "existing2"} {
@@ -1756,7 +1758,7 @@ func TestImport_AppendsToExisting(t *testing.T) {
 		}
 	}
 
-	importFile := filepath.Join(dir, "notes.txt")
+	importFile := filepath.Join(tmpDir, "notes.txt")
 	if err := os.WriteFile(importFile, []byte("new1\nnew2\nnew3\n"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -1822,9 +1824,9 @@ func TestImport_FileNotFound(t *testing.T) {
 }
 
 func TestImport_Encrypted(t *testing.T) {
-	dir := setupEncrypted(t)
+	tmpDir := setupEncrypted(t)
 
-	importFile := filepath.Join(dir, "notes.txt")
+	importFile := filepath.Join(tmpDir, "notes.txt")
 	if err := os.WriteFile(importFile, []byte("secret1\nsecret2\n"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -1848,9 +1850,9 @@ func TestImport_Encrypted(t *testing.T) {
 }
 
 func TestImport_WhitespaceOnly(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
-	importFile := filepath.Join(dir, "blanks.txt")
+	importFile := filepath.Join(tmpDir, "blanks.txt")
 	if err := os.WriteFile(importFile, []byte(" \n\t\n \t \n"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -1867,9 +1869,9 @@ func TestImport_WhitespaceOnly(t *testing.T) {
 
 // --- Import --blobs tests ---
 
 func TestImportBlobs_Basic(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
-	blobDir := filepath.Join(dir, "blobs")
+	blobDir := filepath.Join(tmpDir, "blobs")
 	if err := os.MkdirAll(blobDir, 0750); err != nil {
 		t.Fatal(err)
 	}
@@ -1907,9 +1909,9 @@ func TestImportBlobs_Basic(t *testing.T) {
 }
 
 func TestImportBlobs_SkipsDirectories(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
-	blobDir := filepath.Join(dir, "blobs")
+	blobDir := filepath.Join(tmpDir, "blobs")
 	if err := os.MkdirAll(filepath.Join(blobDir, "subdir"), 0750); err != nil {
 		t.Fatal(err)
 	}
@@ -1931,9 +1933,9 @@ func TestImportBlobs_SkipsDirectories(t *testing.T) {
 }
 
 func TestImportBlobs_SkipsTooLarge(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
-	blobDir := filepath.Join(dir, "blobs")
+	blobDir := filepath.Join(tmpDir, "blobs")
 	if err := os.MkdirAll(blobDir, 0750); err != nil {
 		t.Fatal(err)
 	}
@@ -1943,7 +1945,7 @@ func TestImportBlobs_SkipsTooLarge(t *testing.T) {
 		t.Fatal(err)
 	}
 	// Oversized file
-	big := make([]byte, core.MaxBlobSize+1)
+	big := make([]byte, pad.MaxBlobSize+1)
 	if err := os.WriteFile(filepath.Join(blobDir, "huge.bin"), big, 0600); err != nil {
 		t.Fatal(err)
@@ -1962,9 +1964,9 @@ func TestImportBlobs_SkipsTooLarge(t *testing.T) {
 }
 
 func TestImportBlobs_EmptyDir(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
-	blobDir := filepath.Join(dir, "empty")
+	blobDir := filepath.Join(tmpDir, "empty")
 	if err := os.MkdirAll(blobDir, 0750); err != nil {
 		t.Fatal(err)
 	}
@@ -1979,9 +1981,9 @@ func TestImportBlobs_EmptyDir(t *testing.T) {
 }
 
 func TestImportBlobs_NotADirectory(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
-	regularFile := filepath.Join(dir, "file.txt")
+	regularFile := filepath.Join(tmpDir, "file.txt")
 	if err := os.WriteFile(regularFile, []byte("data"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -1996,14 +1998,14 @@ func TestImportBlobs_NotADirectory(t *testing.T) {
 }
 
 func TestImportBlobs_AppendsToExisting(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
 	// Add a pre-existing entry
 	if _, err := runCmd(newPadCmd("add", "existing note")); err != nil {
 		t.Fatal(err)
 	}
 
-	blobDir := filepath.Join(dir, "blobs")
+	blobDir := filepath.Join(tmpDir, "blobs")
 	if err := os.MkdirAll(blobDir, 0750); err != nil {
 		t.Fatal(err)
 	}
@@ -2034,9 +2036,9 @@ func TestImportBlobs_AppendsToExisting(t *testing.T) {
 }
 
 func TestImportBlobs_Encrypted(t *testing.T) {
-	dir := setupEncrypted(t)
+	tmpDir := setupEncrypted(t)
 
-	blobDir := filepath.Join(dir, "blobs")
+	blobDir := filepath.Join(tmpDir, "blobs")
 	if err := os.MkdirAll(blobDir, 0750); err != nil {
 		t.Fatal(err)
 	}
@@ -2064,9 +2066,9 @@ func TestImportBlobs_Encrypted(t *testing.T) {
 }
 
 func TestImportBlobs_BlobContent(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
-	blobDir := filepath.Join(dir, "blobs")
+	blobDir := filepath.Join(tmpDir, "blobs")
 	if err := os.MkdirAll(blobDir, 0750); err != nil {
 		t.Fatal(err)
 	}
@@ -2104,20 +2106,20 @@ func TestImportBlobs_BlobContent(t *testing.T) {
 
 // --- Export tests ---
 
 func TestExport_Basic(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
 	// Add a plain entry and two blobs
 	if _, err := runCmd(newPadCmd("add", "plain note")); err != nil {
 		t.Fatal(err)
 	}
-	f1 := filepath.Join(dir, "file1.txt")
+	f1 := filepath.Join(tmpDir, "file1.txt")
 	if err := os.WriteFile(f1, []byte("content one"), 0600); err != nil {
 		t.Fatal(err)
 	}
 	if _, err := runCmd(newPadCmd("add", "--file", f1, "blob1.txt")); err != nil {
 		t.Fatal(err)
 	}
-	f2 := filepath.Join(dir, "file2.md")
+	f2 := filepath.Join(tmpDir, "file2.md")
 	if err := os.WriteFile(f2, []byte("content two"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -2125,7 +2127,7 @@ func TestExport_Basic(t *testing.T) {
 		t.Fatal(err)
 	}
 
-	exportDir := filepath.Join(dir, "export")
+	exportDir := filepath.Join(tmpDir, "export")
 	out, err := runCmd(newPadCmd("export", exportDir))
 	if err != nil {
 		t.Fatalf("export error: %v", err)
@@ -2184,10 +2186,10 @@ func TestExport_NoBlobsOnly(t *testing.T) {
 }
 
 func TestExport_CollisionTimestamp(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
 	// Add a blob
-	f := filepath.Join(dir, "file.txt")
+	f := filepath.Join(tmpDir, "file.txt")
 	if err := os.WriteFile(f, []byte("blob data"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -2195,7 +2197,7 @@ func TestExport_CollisionTimestamp(t *testing.T) {
 		t.Fatal(err)
 	}
 
-	exportDir := filepath.Join(dir, "export")
+	exportDir := filepath.Join(tmpDir, "export")
 	if err := os.MkdirAll(exportDir, 0o750); err != nil {
 		t.Fatal(err)
 	}
@@ -2224,9 +2226,9 @@ func TestExport_CollisionTimestamp(t *testing.T) {
 }
 
 func TestExport_Force(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
-	f := filepath.Join(dir, "file.txt")
+	f := filepath.Join(tmpDir, "file.txt")
 	if err := os.WriteFile(f, []byte("new data"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -2234,7 +2236,7 @@ func TestExport_Force(t *testing.T) {
 		t.Fatal(err)
 	}
 
-	exportDir := filepath.Join(dir, "export")
+	exportDir := filepath.Join(tmpDir, "export")
 	if err := os.MkdirAll(exportDir, 0o750); err != nil {
 		t.Fatal(err)
 	}
@@ -2260,9 +2262,9 @@ func TestExport_Force(t *testing.T) {
 }
 
 func TestExport_DryRun(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
-	f := filepath.Join(dir, "file.txt")
+	f := filepath.Join(tmpDir, "file.txt")
 	if err := os.WriteFile(f, []byte("content"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -2270,7 +2272,7 @@ func TestExport_DryRun(t *testing.T) {
 		t.Fatal(err)
 	}
 
-	exportDir := filepath.Join(dir, "export")
+	exportDir := filepath.Join(tmpDir, "export")
 
 	out, err := runCmd(newPadCmd("export", "--dry-run", exportDir))
 	if err != nil {
@@ -2287,9 +2289,9 @@ func TestExport_DryRun(t *testing.T) {
 }
 
 func TestExport_DirCreated(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
-	f := filepath.Join(dir, "file.txt")
+	f := filepath.Join(tmpDir, "file.txt")
 	if err := os.WriteFile(f, []byte("data"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -2297,7 +2299,7 @@ func TestExport_DirCreated(t *testing.T) {
 		t.Fatal(err)
 	}
 
-	exportDir := filepath.Join(dir, "nested", "export", "dir")
+	exportDir := filepath.Join(tmpDir, "nested", "export", "dir")
 	out, err := runCmd(newPadCmd("export", exportDir))
 	if err != nil {
 		t.Fatalf("export error: %v", err)
@@ -2313,9 +2315,9 @@ func TestExport_DirCreated(t *testing.T) {
 }
 
 func TestExport_Encrypted(t *testing.T) {
-	dir := setupEncrypted(t)
+	tmpDir := setupEncrypted(t)
 
-	f := filepath.Join(dir, "secret.txt")
+	f := filepath.Join(tmpDir, "secret.txt")
 	if err := os.WriteFile(f, []byte("secret content"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -2323,7 +2325,7 @@ func TestExport_Encrypted(t *testing.T) {
 		t.Fatal(err)
 	}
 
-	exportDir := filepath.Join(dir, "export")
+	exportDir := filepath.Join(tmpDir, "export")
 	out, err := runCmd(newPadCmd("export", exportDir))
 	if err != nil {
 		t.Fatalf("export error: %v", err)
@@ -2342,9 +2344,9 @@ func TestExport_Encrypted(t *testing.T) {
 }
 
 func TestExport_FilePermissions(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
-	f := filepath.Join(dir, "file.txt")
+	f := filepath.Join(tmpDir, "file.txt")
 	if err := os.WriteFile(f, []byte("data"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -2352,7 +2354,7 @@ func TestExport_FilePermissions(t *testing.T) {
 		t.Fatal(err)
 	}
 
-	exportDir := filepath.Join(dir, "export")
+	exportDir := filepath.Join(tmpDir, "export")
 	if _, err := runCmd(newPadCmd("export", exportDir)); err != nil {
 		t.Fatal(err)
 	}
@@ -2393,7 +2395,7 @@ func writeEncryptedPad(t *testing.T, path string, key []byte, entries []string)
 }
 
 func TestMerge_Basic(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
 	// Add existing entries to current pad.
 	for _, e := range []string{"existing1", "existing2"} {
@@ -2403,7 +2405,7 @@ func TestMerge_Basic(t *testing.T) {
 	}
 
 	// Create a plaintext file with 3 entries (1 duplicate).
-	mergeFile := filepath.Join(dir, "other.md")
+	mergeFile := filepath.Join(tmpDir, "other.md")
 	writePlaintextPad(t, mergeFile, []string{"existing1", "new1", "new2"})
 
 	out, err := runCmd(newPadCmd("merge", mergeFile))
@@ -2427,7 +2429,7 @@ func TestMerge_Basic(t *testing.T) {
 }
 
 func TestMerge_AllDuplicates(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
 	for _, e := range []string{"alpha", "beta"} {
 		if _, err := runCmd(newPadCmd("add", e)); err != nil {
@@ -2435,7 +2437,7 @@ func TestMerge_AllDuplicates(t *testing.T) {
 		}
 	}
 
-	mergeFile := filepath.Join(dir, "dupes.md")
+	mergeFile := filepath.Join(tmpDir, "dupes.md")
 	writePlaintextPad(t, mergeFile, []string{"alpha", "beta"})
 
 	out, err := runCmd(newPadCmd("merge", mergeFile))
@@ -2448,9 +2450,9 @@ func TestMerge_AllDuplicates(t *testing.T) {
 }
 
 func TestMerge_EmptyFile(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
-	mergeFile := filepath.Join(dir, "empty.md")
+	mergeFile := filepath.Join(tmpDir, "empty.md")
 	writePlaintextPad(t, mergeFile, []string{})
 
 	out, err := runCmd(newPadCmd("merge", mergeFile))
@@ -2463,16 +2465,16 @@ func TestMerge_EmptyFile(t *testing.T) {
 }
 
 func TestMerge_MultipleFiles(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
 	if _, err := runCmd(newPadCmd("add", "existing")); err != nil {
 		t.Fatal(err)
 	}
 
-	fileA := filepath.Join(dir, "a.md")
+	fileA := filepath.Join(tmpDir, "a.md")
 	writePlaintextPad(t, fileA, []string{"from-a", "shared"})
 
-	fileB := filepath.Join(dir, "b.md")
+	fileB := filepath.Join(tmpDir, "b.md")
 	writePlaintextPad(t, fileB, []string{"from-b", "shared"})
 
 	out, err := runCmd(newPadCmd("merge", fileA, fileB))
@@ -2496,7 +2498,7 @@ func TestMerge_MultipleFiles(t *testing.T) {
 }
 
 func TestMerge_EncryptedInput(t *testing.T) {
-	dir := setupEncrypted(t)
+	tmpDir := setupEncrypted(t)
 
 	if _, err := runCmd(newPadCmd("add", "current")); err != nil {
 		t.Fatal(err)
@@ -2508,7 +2510,7 @@ func TestMerge_EncryptedInput(t *testing.T) {
 		t.Fatal(loadErr)
 	}
 
-	encFile := filepath.Join(dir, "other.enc")
+	encFile := filepath.Join(tmpDir, "other.enc")
 	writeEncryptedPad(t, encFile, key, []string{"encrypted-entry"})
 
 	out, err := runCmd(newPadCmd("merge", encFile))
@@ -2529,10 +2531,10 @@ func TestMerge_EncryptedInput(t *testing.T) {
 }
 
 func TestMerge_PlaintextFallback(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
 	// Plaintext file will fail decryption (no key) and fall back.
-	mergeFile := filepath.Join(dir, "notes.md")
+	mergeFile := filepath.Join(tmpDir, "notes.md")
 	writePlaintextPad(t, mergeFile, []string{"fallback-entry"})
 
 	out, err := runCmd(newPadCmd("merge", mergeFile))
@@ -2545,17 +2547,17 @@ func TestMerge_PlaintextFallback(t *testing.T) {
 }
 
 func TestMerge_MixedEncPlain(t *testing.T) {
-	dir := setupEncrypted(t)
+	tmpDir := setupEncrypted(t)
 
 	key, loadErr := crypto.LoadKey(rc.KeyPath())
 	if loadErr != nil {
 		t.Fatal(loadErr)
 	}
 
-	encFile := filepath.Join(dir, "enc.enc")
+	encFile := filepath.Join(tmpDir, "enc.enc")
 	writeEncryptedPad(t, encFile, key, []string{"from-enc"})
 
-	plainFile := filepath.Join(dir, "plain.md")
+	plainFile := filepath.Join(tmpDir, "plain.md")
 	writePlaintextPad(t, plainFile, []string{"from-plain"})
 
 	out, err := runCmd(newPadCmd("merge", encFile, plainFile))
@@ -2568,13 +2570,13 @@ func TestMerge_MixedEncPlain(t *testing.T) {
 }
 
 func TestMerge_DryRun(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
 	if _, err := runCmd(newPadCmd("add", "existing")); err != nil {
 		t.Fatal(err)
 	}
 
-	mergeFile := filepath.Join(dir, "notes.md")
+	mergeFile := filepath.Join(tmpDir, "notes.md")
 	writePlaintextPad(t, mergeFile, []string{"existing", "new-entry"})
 
 	out, err := runCmd(newPadCmd("merge", "--dry-run", mergeFile))
@@ -2596,20 +2598,20 @@ func TestMerge_DryRun(t *testing.T) {
 }
 
 func TestMerge_CustomKey(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
 	// Generate a foreign key.
 	foreignKey, genErr := crypto.GenerateKey()
 	if genErr != nil {
 		t.Fatal(genErr)
 	}
-	foreignKeyFile := filepath.Join(dir, "foreign.key")
+	foreignKeyFile := filepath.Join(tmpDir, "foreign.key")
 	if err := crypto.SaveKey(foreignKeyFile, foreignKey); err != nil {
 		t.Fatal(err)
 	}
 
 	// Create encrypted file with the foreign key.
-	encFile := filepath.Join(dir, "foreign.enc")
+	encFile := filepath.Join(tmpDir, "foreign.enc")
 	writeEncryptedPad(t, encFile, foreignKey, []string{"foreign-secret"})
 
 	out, err := runCmd(newPadCmd("merge", "--key", foreignKeyFile, encFile))
@@ -2630,7 +2632,7 @@ func TestMerge_CustomKey(t *testing.T) {
 }
 
 func TestMerge_BlobEntries(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
 	blobEntry := core.MakeBlob("test.txt", []byte("hello world"))
 	if _, err := runCmd(newPadCmd("add", "text-entry")); err != nil {
@@ -2639,7 +2641,7 @@ func TestMerge_BlobEntries(t *testing.T) {
 
 	// Create file with same blob + a new blob.
 	newBlob := core.MakeBlob("new.txt", []byte("new content"))
-	mergeFile := filepath.Join(dir, "blobs.md")
+	mergeFile := filepath.Join(tmpDir, "blobs.md")
 	writePlaintextPad(t, mergeFile, []string{blobEntry, newBlob})
 
 	out, err := runCmd(newPadCmd("merge", mergeFile))
@@ -2655,11 +2657,11 @@ func TestMerge_BlobEntries(t *testing.T) {
 }
 
 func TestMerge_BlobConflict(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
 	// Add a blob with label "config.json".
 	blob1 := core.MakeBlob("config.json", []byte(`{"v":1}`))
-	mergeFile1 := filepath.Join(dir, "first.md")
+	mergeFile1 := filepath.Join(tmpDir, "first.md")
 	writePlaintextPad(t, mergeFile1, []string{blob1})
 	if _, err := runCmd(newPadCmd("merge", mergeFile1)); err != nil {
 		t.Fatal(err)
@@ -2667,7 +2669,7 @@ func TestMerge_BlobConflict(t *testing.T) {
 
 	// Merge a different blob with the same label.
 	blob2 := core.MakeBlob("config.json", []byte(`{"v":2}`))
-	mergeFile2 := filepath.Join(dir, "second.md")
+	mergeFile2 := filepath.Join(tmpDir, "second.md")
 	writePlaintextPad(t, mergeFile2, []string{blob2})
 
 	out, err := runCmd(newPadCmd("merge", mergeFile2))
@@ -2683,10 +2685,10 @@ func TestMerge_BlobConflict(t *testing.T) {
 }
 
 func TestMerge_BinaryWarning(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
 	// Write raw binary data (not valid UTF-8).
-	binFile := filepath.Join(dir, "garbage.bin")
+	binFile := filepath.Join(tmpDir, "garbage.bin")
 	if err := os.WriteFile(binFile, []byte{0xff, 0xfe, 0x00, 0x01, 0x80, 0x90}, 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -2710,10 +2712,10 @@ func TestMerge_FileNotFound(t *testing.T) {
 }
 
 func TestMerge_EmptyPadMerge(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
 	// Current pad is empty; merge entries into it.
-	mergeFile := filepath.Join(dir, "fresh.md")
+	mergeFile := filepath.Join(tmpDir, "fresh.md")
 	writePlaintextPad(t, mergeFile, []string{"alpha", "beta", "gamma"})
 
 	out, err := runCmd(newPadCmd("merge", mergeFile))
@@ -2726,13 +2728,13 @@ func TestMerge_EmptyPadMerge(t *testing.T) {
 }
 
 func TestMerge_PlaintextMode(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
 	if _, err := runCmd(newPadCmd("add", "plaintext-existing")); err != nil {
 		t.Fatal(err)
 	}
 
-	mergeFile := filepath.Join(dir, "notes.md")
+	mergeFile := filepath.Join(tmpDir, "notes.md")
 	writePlaintextPad(t, mergeFile, []string{"plaintext-new"})
 
 	out, err := runCmd(newPadCmd("merge", mergeFile))
@@ -2744,7 +2746,7 @@ func TestMerge_PlaintextMode(t *testing.T) {
 	}
 
 	// Verify the scratchpad.md file is plaintext.
-	padPath := filepath.Join(dir, config.DirContext, config.FileScratchpadMd)
+	padPath := filepath.Join(tmpDir, dir.Context, pad.Md)
 	data, readErr := os.ReadFile(padPath)
 	if readErr != nil {
 		t.Fatal(readErr)
@@ -2757,7 +2759,7 @@ func TestMerge_PlaintextMode(t *testing.T) {
 }
 
 func TestMerge_PreservesOrder(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
 	// Add entries in specific order.
 	for _, e := range []string{"first", "second", "third"} {
@@ -2766,7 +2768,7 @@ func TestMerge_PreservesOrder(t *testing.T) {
 		}
 	}
 
-	mergeFile := filepath.Join(dir, "new.md")
+	mergeFile := filepath.Join(tmpDir, "new.md")
 	writePlaintextPad(t, mergeFile, []string{"fourth", "fifth"})
 
 	if _, err := runCmd(newPadCmd("merge", mergeFile)); err != nil {
@@ -2774,7 +2776,7 @@ func TestMerge_PreservesOrder(t *testing.T) {
 	}
 
 	// Read the raw pad and verify order.
-	padPath := filepath.Join(dir, config.DirContext, config.FileScratchpadMd)
+	padPath := filepath.Join(tmpDir, dir.Context, pad.Md)
 	data, readErr := os.ReadFile(padPath)
 	if readErr != nil {
 		t.Fatal(readErr)
@@ -2792,17 +2794,17 @@ func TestMerge_PreservesOrder(t *testing.T) {
 }
 
 func TestMerge_CrossFileDedup(t *testing.T) {
-	dir := setupPlaintext(t)
+	tmpDir := setupPlaintext(t)
 
 	// Merge two files where entries overlap across files AND with current pad.
 	if _, err := runCmd(newPadCmd("add", "base")); err != nil {
 		t.Fatal(err)
 	}
 
-	fileA := filepath.Join(dir, "a.md")
+	fileA := filepath.Join(tmpDir, "a.md")
 	writePlaintextPad(t, fileA, []string{"base", "unique-a", "shared-ab"})
 
-	fileB := filepath.Join(dir, "b.md")
+	fileB := filepath.Join(tmpDir, "b.md")
 	writePlaintextPad(t, fileB, []string{"shared-ab", "unique-b"})
 
 	out, err := runCmd(newPadCmd("merge", fileA, fileB))
@@ -2817,11 +2819,11 @@ func TestMerge_CrossFileDedup(t *testing.T) {
 }
 
 func TestMerge_EncryptedWithBlobDedup(t *testing.T) {
-	dir := setupEncrypted(t)
+	tmpDir := setupEncrypted(t)
 
 	// Add a blob to the current pad.
 	blob := core.MakeBlob("readme.md", []byte("# README"))
-	f := filepath.Join(dir, "tmp-readme.md")
+	f := filepath.Join(tmpDir, "tmp-readme.md")
 	if err := os.WriteFile(f, []byte("# README"), 0600); err != nil {
 		t.Fatal(err)
 	}
@@ -2836,7 +2838,7 @@ func TestMerge_EncryptedWithBlobDedup(t *testing.T) {
 	}
 
 	// Create encrypted file with the same blob.
- encFile := filepath.Join(dir, "merge.enc") + encFile := filepath.Join(tmpDir, "merge.enc") writeEncryptedPad(t, encFile, key, []string{blob, "new-text"}) out, err := runCmd(newPadCmd("merge", encFile)) diff --git a/internal/cli/pause/cmd/root/cmd.go b/internal/cli/pause/cmd/root/cmd.go index d37451a9..83ec5f29 100644 --- a/internal/cli/pause/cmd/root/cmd.go +++ b/internal/cli/pause/cmd/root/cmd.go @@ -5,3 +5,30 @@ // SPDX-License-Identifier: Apache-2.0 package root + +import ( + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the top-level "ctx pause" command. +// +// Returns: +// - *cobra.Command: Configured pause command +func Cmd() *cobra.Command { + short, long := assets.CommandDesc(assets.CmdDescKeyPause) + cmd := &cobra.Command{ + Use: "pause", + Short: short, + Long: long, + RunE: func(cmd *cobra.Command, _ []string) error { + sessionID, _ := cmd.Flags().GetString("session-id") + return Run(cmd, sessionID) + }, + } + cmd.Flags().String("session-id", "", + assets.FlagDesc(assets.FlagDescKeyPauseSessionId), + ) + return cmd +} diff --git a/internal/cli/pause/cmd/root/doc.go b/internal/cli/pause/cmd/root/doc.go new file mode 100644 index 00000000..6b0eb183 --- /dev/null +++ b/internal/cli/pause/cmd/root/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package root implements the ctx pause command. +// +// It pauses context hooks for the current session so that ctx hook +// callbacks are temporarily suppressed. 
+package root diff --git a/internal/cli/pause/cmd/root/run.go b/internal/cli/pause/cmd/root/run.go index e6930f21..8027bb67 100644 --- a/internal/cli/pause/cmd/root/run.go +++ b/internal/cli/pause/cmd/root/run.go @@ -7,12 +7,12 @@ package root import ( - "fmt" "os" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/system/core" + "github.com/ActiveMemory/ctx/internal/write" ) // Run executes the pause command. @@ -28,6 +28,6 @@ func Run(cmd *cobra.Command, sessionID string) error { sessionID = core.ReadSessionID(os.Stdin) } core.Pause(sessionID) - cmd.Println(fmt.Sprintf("Context hooks paused for session %s", sessionID)) + write.SessionPaused(cmd, sessionID) return nil } diff --git a/internal/cli/pause/pause.go b/internal/cli/pause/pause.go index 360e04a7..1f92da8d 100644 --- a/internal/cli/pause/pause.go +++ b/internal/cli/pause/pause.go @@ -9,25 +9,10 @@ package pause import ( "github.com/spf13/cobra" - "github.com/ActiveMemory/ctx/internal/assets" pauseroot "github.com/ActiveMemory/ctx/internal/cli/pause/cmd/root" ) // Cmd returns the top-level "ctx pause" command. 
-// -// Returns: -// - *cobra.Command: Configured pause command func Cmd() *cobra.Command { - short, long := assets.CommandDesc("pause") - cmd := &cobra.Command{ - Use: "pause", - Short: short, - Long: long, - RunE: func(cmd *cobra.Command, _ []string) error { - sessionID, _ := cmd.Flags().GetString("session-id") - return pauseroot.Run(cmd, sessionID) - }, - } - cmd.Flags().String("session-id", "", assets.FlagDesc("pause.session-id")) - return cmd + return pauseroot.Cmd() } diff --git a/internal/cli/pause/pause_test.go b/internal/cli/pause/pause_test.go index 05fccb1b..4cd25449 100644 --- a/internal/cli/pause/pause_test.go +++ b/internal/cli/pause/pause_test.go @@ -13,19 +13,19 @@ import ( "strings" "testing" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/dir" "github.com/ActiveMemory/ctx/internal/rc" ) func setupStateDir(t *testing.T) string { t.Helper() - dir := t.TempDir() - t.Setenv("CTX_DIR", dir) + tmpDir := t.TempDir() + t.Setenv("CTX_DIR", tmpDir) rc.Reset() - if mkErr := os.MkdirAll(filepath.Join(dir, config.DirState), 0o750); mkErr != nil { + if mkErr := os.MkdirAll(filepath.Join(tmpDir, dir.State), 0o750); mkErr != nil { t.Fatal(mkErr) } - return dir + return tmpDir } func TestCmd_WithSessionIDFlag(t *testing.T) { diff --git a/internal/cli/permissions/cmd/restore/cmd.go b/internal/cli/permissions/cmd/restore/cmd.go index abfe8cee..ceb84617 100644 --- a/internal/cli/permissions/cmd/restore/cmd.go +++ b/internal/cli/permissions/cmd/restore/cmd.go @@ -17,14 +17,14 @@ import ( // Returns: // - *cobra.Command: Configured restore subcommand func Cmd() *cobra.Command { - short, long := assets.CommandDesc("permissions.restore") + short, long := assets.CommandDesc(assets.CmdDescKeyPermissionsRestore) return &cobra.Command{ Use: "restore", Short: short, Long: long, RunE: func(cmd *cobra.Command, _ []string) error { - return RunRestore(cmd) + return Run(cmd) }, } } diff --git 
a/internal/cli/permissions/cmd/restore/doc.go b/internal/cli/permissions/cmd/restore/doc.go new file mode 100644 index 00000000..d374df5d --- /dev/null +++ b/internal/cli/permissions/cmd/restore/doc.go @@ -0,0 +1,10 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package restore implements the ctx permissions restore subcommand. +// +// It restores .claude/settings.local.json from the saved golden image. +package restore diff --git a/internal/cli/permissions/cmd/restore/run.go b/internal/cli/permissions/cmd/restore/run.go index 6b0d44d6..3f929b3d 100644 --- a/internal/cli/permissions/cmd/restore/run.go +++ b/internal/cli/permissions/cmd/restore/run.go @@ -9,97 +9,76 @@ package restore import ( "bytes" "encoding/json" - "fmt" "os" + claude2 "github.com/ActiveMemory/ctx/internal/config/claude" + "github.com/ActiveMemory/ctx/internal/config/fs" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/claude" "github.com/ActiveMemory/ctx/internal/cli/permissions/core" - "github.com/ActiveMemory/ctx/internal/config" + ctxerr "github.com/ActiveMemory/ctx/internal/err" + "github.com/ActiveMemory/ctx/internal/write" ) -// RunRestore resets settings.local.json from the golden image. +// Run resets settings.local.json from the golden image.
// // Parameters: // - cmd: Cobra command for output // // Returns: // - error: Non-nil on read/write/parse failure or missing golden file -func RunRestore(cmd *cobra.Command) error { - goldenBytes, goldenReadErr := os.ReadFile(config.FileSettingsGolden) +func Run(cmd *cobra.Command) error { + goldenBytes, goldenReadErr := os.ReadFile(claude2.SettingsGolden) if goldenReadErr != nil { if os.IsNotExist(goldenReadErr) { - return core.ErrGoldenNotFound() + return ctxerr.GoldenNotFound() } - return core.ErrReadFile(config.FileSettingsGolden, goldenReadErr) + return ctxerr.FileRead(claude2.SettingsGolden, goldenReadErr) } - localBytes, localReadErr := os.ReadFile(config.FileSettings) + localBytes, localReadErr := os.ReadFile(claude2.Settings) if localReadErr != nil { if os.IsNotExist(localReadErr) { - // No local file — just copy golden. - if writeErr := os.WriteFile(config.FileSettings, goldenBytes, config.PermFile); writeErr != nil { - return core.ErrWriteFile(config.FileSettings, writeErr) + if writeErr := os.WriteFile( + claude2.Settings, goldenBytes, fs.PermFile, + ); writeErr != nil { + return ctxerr.FileWrite(claude2.Settings, writeErr) } - cmd.Println("Restored golden image (no local settings existed).") + write.RestoreNoLocal(cmd) return nil } - return core.ErrReadFile(config.FileSettings, localReadErr) + return ctxerr.FileRead(claude2.Settings, localReadErr) } - // Fast path: files are identical. if bytes.Equal(goldenBytes, localBytes) { - cmd.Println("Settings already match golden image.") + write.RestoreMatch(cmd) return nil } - // Parse both to compute permission diff. 
var golden, local claude.Settings if goldenParseErr := json.Unmarshal(goldenBytes, &golden); goldenParseErr != nil { - return core.ErrParseSettings(config.FileSettingsGolden, goldenParseErr) + return ctxerr.ParseFile(claude2.SettingsGolden, goldenParseErr) } if localParseErr := json.Unmarshal(localBytes, &local); localParseErr != nil { - return core.ErrParseSettings(config.FileSettings, localParseErr) + return ctxerr.ParseFile(claude2.Settings, localParseErr) } - restored, dropped := core.DiffStringSlices(golden.Permissions.Allow, local.Permissions.Allow) - denyRestored, denyDropped := core.DiffStringSlices(golden.Permissions.Deny, local.Permissions.Deny) + restored, dropped := core.DiffStringSlices( + golden.Permissions.Allow, local.Permissions.Allow, + ) + denyRestored, denyDropped := core.DiffStringSlices( + golden.Permissions.Deny, local.Permissions.Deny, + ) - if len(dropped) > 0 { - cmd.Println(fmt.Sprintf("Dropped %d session allow permission(s):", len(dropped))) - for _, p := range dropped { - cmd.Println(fmt.Sprintf(" - %s", p)) - } - } - if len(restored) > 0 { - cmd.Println(fmt.Sprintf("Restored %d allow permission(s):", len(restored))) - for _, p := range restored { - cmd.Println(fmt.Sprintf(" + %s", p)) - } - } - if len(denyDropped) > 0 { - cmd.Println(fmt.Sprintf("Dropped %d session deny rule(s):", len(denyDropped))) - for _, p := range denyDropped { - cmd.Println(fmt.Sprintf(" - %s", p)) - } - } - if len(denyRestored) > 0 { - cmd.Println(fmt.Sprintf("Restored %d deny rule(s):", len(denyRestored))) - for _, p := range denyRestored { - cmd.Println(fmt.Sprintf(" + %s", p)) - } - } - allEmpty := len(dropped) == 0 && len(restored) == 0 && len(denyDropped) == 0 && len(denyRestored) == 0 - if allEmpty { - cmd.Println("Permission lists match; other settings differ.") - } + write.RestoreDiff(cmd, dropped, restored, denyDropped, denyRestored) - // Write golden bytes (byte-for-byte copy). 
- if writeErr := os.WriteFile(config.FileSettings, goldenBytes, config.PermFile); writeErr != nil { - return core.ErrWriteFile(config.FileSettings, writeErr) + if writeErr := os.WriteFile( + claude2.Settings, goldenBytes, fs.PermFile, + ); writeErr != nil { + return ctxerr.FileWrite(claude2.Settings, writeErr) } - cmd.Println("Restored from golden image.") + write.RestoreDone(cmd) return nil } diff --git a/internal/cli/permissions/cmd/snapshot/cmd.go b/internal/cli/permissions/cmd/snapshot/cmd.go index ae722c3c..7be8ccc7 100644 --- a/internal/cli/permissions/cmd/snapshot/cmd.go +++ b/internal/cli/permissions/cmd/snapshot/cmd.go @@ -17,14 +17,14 @@ import ( // Returns: // - *cobra.Command: Configured snapshot subcommand func Cmd() *cobra.Command { - short, long := assets.CommandDesc("permissions.snapshot") + short, long := assets.CommandDesc(assets.CmdDescKeyPermissionsSnapshot) return &cobra.Command{ Use: "snapshot", Short: short, Long: long, RunE: func(cmd *cobra.Command, _ []string) error { - return RunSnapshot(cmd) + return Run(cmd) }, } } diff --git a/internal/cli/permissions/cmd/snapshot/doc.go b/internal/cli/permissions/cmd/snapshot/doc.go new file mode 100644 index 00000000..1043ac9a --- /dev/null +++ b/internal/cli/permissions/cmd/snapshot/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package snapshot implements the ctx permissions snapshot subcommand. +// +// It captures a point-in-time snapshot of settings.local.json for +// later restoration.
+package snapshot diff --git a/internal/cli/permissions/cmd/snapshot/run.go b/internal/cli/permissions/cmd/snapshot/run.go index 9a241dd2..aeb308da 100644 --- a/internal/cli/permissions/cmd/snapshot/run.go +++ b/internal/cli/permissions/cmd/snapshot/run.go @@ -7,41 +7,43 @@ package snapshot import ( - "fmt" "os" + "github.com/ActiveMemory/ctx/internal/config/claude" + "github.com/ActiveMemory/ctx/internal/config/fs" "github.com/spf13/cobra" - "github.com/ActiveMemory/ctx/internal/cli/permissions/core" - "github.com/ActiveMemory/ctx/internal/config" + ctxerr "github.com/ActiveMemory/ctx/internal/err" + "github.com/ActiveMemory/ctx/internal/write" ) -// RunSnapshot saves settings.local.json as the golden image. +// Run saves settings.local.json as the golden image. // // Parameters: // - cmd: Cobra command for output // // Returns: // - error: Non-nil on read/write failure or missing settings file -func RunSnapshot(cmd *cobra.Command) error { - content, readErr := os.ReadFile(config.FileSettings) +func Run(cmd *cobra.Command) error { + content, readErr := os.ReadFile(claude.Settings) if readErr != nil { if os.IsNotExist(readErr) { - return core.ErrSettingsNotFound() + return ctxerr.SettingsNotFound() } - return core.ErrReadFile(config.FileSettings, readErr) + return ctxerr.FileRead(claude.Settings, readErr) } - // Determine message based on whether golden already exists. 
- verb := "Saved" - if _, statErr := os.Stat(config.FileSettingsGolden); statErr == nil { - verb = "Updated" + updated := false + if _, statErr := os.Stat(claude.SettingsGolden); statErr == nil { + updated = true } - if writeErr := os.WriteFile(config.FileSettingsGolden, content, config.PermFile); writeErr != nil { - return core.ErrWriteFile(config.FileSettingsGolden, writeErr) + if writeErr := os.WriteFile( + claude.SettingsGolden, content, fs.PermFile, + ); writeErr != nil { + return ctxerr.FileWrite(claude.SettingsGolden, writeErr) } - cmd.Println(fmt.Sprintf("%s golden image: %s", verb, config.FileSettingsGolden)) + write.SnapshotDone(cmd, updated, claude.SettingsGolden) return nil } diff --git a/internal/cli/permissions/core/err.go b/internal/cli/permissions/core/err.go deleted file mode 100644 index af16a85b..00000000 --- a/internal/cli/permissions/core/err.go +++ /dev/null @@ -1,61 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package core - -import "fmt" - -// ErrSettingsNotFound returns an error when settings.local.json is missing. -// -// Returns: -// - error: Descriptive error for missing settings file -func ErrSettingsNotFound() error { - return fmt.Errorf("no .claude/settings.local.json found") -} - -// ErrGoldenNotFound returns an error when settings.golden.json is missing. -// -// Returns: -// - error: Descriptive error advising the user to run snapshot first -func ErrGoldenNotFound() error { - return fmt.Errorf("no .claude/settings.golden.json found — run 'ctx permissions snapshot' first") -} - -// ErrReadFile wraps a file read failure. 
-// -// Parameters: -// - path: File path that failed to read -// - err: Underlying read error -// -// Returns: -// - error: Wrapped error with file path context -func ErrReadFile(path string, err error) error { - return fmt.Errorf("failed to read %s: %w", path, err) -} - -// ErrWriteFile wraps a file write failure. -// -// Parameters: -// - path: File path that failed to write -// - err: Underlying write error -// -// Returns: -// - error: Wrapped error with file path context -func ErrWriteFile(path string, err error) error { - return fmt.Errorf("failed to write %s: %w", path, err) -} - -// ErrParseSettings wraps a JSON parse failure for a settings file. -// -// Parameters: -// - path: File path that failed to parse -// - err: Underlying parse error -// -// Returns: -// - error: Wrapped error with file path context -func ErrParseSettings(path string, err error) error { - return fmt.Errorf("failed to parse %s: %w", path, err) -} diff --git a/internal/cli/permissions/permissions.go b/internal/cli/permissions/permissions.go index 190e3cf3..112f1b4e 100644 --- a/internal/cli/permissions/permissions.go +++ b/internal/cli/permissions/permissions.go @@ -24,7 +24,7 @@ import ( // Returns: // - *cobra.Command: Configured permissions command with subcommands func Cmd() *cobra.Command { - short, long := assets.CommandDesc("permissions") + short, long := assets.CommandDesc(assets.CmdDescKeyPermissions) cmd := &cobra.Command{ Use: "permissions", diff --git a/internal/cli/permissions/permissions_test.go b/internal/cli/permissions/permissions_test.go index f032c8ef..981994b5 100644 --- a/internal/cli/permissions/permissions_test.go +++ b/internal/cli/permissions/permissions_test.go @@ -15,7 +15,9 @@ import ( "testing" "github.com/ActiveMemory/ctx/internal/cli/permissions/core" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/claude" + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/config/fs" 
) // setupDir creates a temp dir with .claude/, chdirs into it, and returns cleanup. @@ -28,7 +30,7 @@ func setupDir(t *testing.T) { } t.Cleanup(func() { _ = os.Chdir(origDir) }) - if err := os.MkdirAll(config.DirClaude, config.PermExec); err != nil { + if err := os.MkdirAll(dir.Claude, fs.PermExec); err != nil { t.Fatal(err) } } @@ -36,7 +38,7 @@ func setupDir(t *testing.T) { // writeSettings writes JSON content to settings.local.json. func writeSettings(t *testing.T, content string) { t.Helper() - if err := os.WriteFile(config.FileSettings, []byte(content), config.PermFile); err != nil { + if err := os.WriteFile(claude.Settings, []byte(content), fs.PermFile); err != nil { t.Fatal(err) } } @@ -44,7 +46,7 @@ func writeSettings(t *testing.T, content string) { // writeGolden writes JSON content to settings.golden.json. func writeGolden(t *testing.T, content string) { t.Helper() - if err := os.WriteFile(config.FileSettingsGolden, []byte(content), config.PermFile); err != nil { + if err := os.WriteFile(claude.SettingsGolden, []byte(content), fs.PermFile); err != nil { t.Fatal(err) } } @@ -75,7 +77,7 @@ func TestSnapshotCreatesGoldenFile(t *testing.T) { } // Verify golden file is a byte-for-byte copy. - golden, readErr := os.ReadFile(config.FileSettingsGolden) + golden, readErr := os.ReadFile(claude.SettingsGolden) if readErr != nil { t.Fatal(readErr) } @@ -98,7 +100,7 @@ func TestSnapshotOverwritesExisting(t *testing.T) { t.Errorf("output = %q, want 'Updated'", out) } - golden, readErr := os.ReadFile(config.FileSettingsGolden) + golden, readErr := os.ReadFile(claude.SettingsGolden) if readErr != nil { t.Fatal(readErr) } @@ -139,7 +141,7 @@ func TestRestoreFromGolden(t *testing.T) { } // Verify settings file now matches golden. 
- data, readErr := os.ReadFile(config.FileSettings) + data, readErr := os.ReadFile(claude.Settings) if readErr != nil { t.Fatal(readErr) } @@ -226,7 +228,7 @@ func TestRestoreNoLocalFile(t *testing.T) { } // Verify settings file was created from golden. - data, readErr := os.ReadFile(config.FileSettings) + data, readErr := os.ReadFile(claude.Settings) if readErr != nil { t.Fatal(readErr) } @@ -381,7 +383,7 @@ func TestSnapshotPreservesExactBytes(t *testing.T) { t.Fatal(err) } - golden, readErr := os.ReadFile(config.FileSettingsGolden) + golden, readErr := os.ReadFile(claude.SettingsGolden) if readErr != nil { t.Fatal(readErr) } @@ -402,7 +404,7 @@ func TestRestorePreservesExactBytes(t *testing.T) { t.Fatal(err) } - data, readErr := os.ReadFile(config.FileSettings) + data, readErr := os.ReadFile(claude.Settings) if readErr != nil { t.Fatal(readErr) } @@ -431,7 +433,7 @@ func TestCmdHasSubcommands(t *testing.T) { // Verify golden file path is under .claude/ (not .context/). func TestGoldenFilePath(t *testing.T) { - if !strings.HasPrefix(config.FileSettingsGolden, config.DirClaude+"/") { - t.Errorf("FileSettingsGolden = %q, want prefix %q", config.FileSettingsGolden, config.DirClaude+"/") + if !strings.HasPrefix(claude.SettingsGolden, dir.Claude+"/") { + t.Errorf("SettingsGolden = %q, want prefix %q", claude.SettingsGolden, dir.Claude+"/") } } diff --git a/internal/cli/prompt/cmd/add/cmd.go b/internal/cli/prompt/cmd/add/cmd.go index f966a948..c27e4359 100644 --- a/internal/cli/prompt/cmd/add/cmd.go +++ b/internal/cli/prompt/cmd/add/cmd.go @@ -19,7 +19,7 @@ import ( func Cmd() *cobra.Command { var fromStdin bool - short, long := assets.CommandDesc("prompt.add") + short, long := assets.CommandDesc(assets.CmdDescKeyPromptAdd) cmd := &cobra.Command{ Use: "add NAME", @@ -31,7 +31,9 @@ func Cmd() *cobra.Command { }, } - cmd.Flags().BoolVar(&fromStdin, "stdin", false, assets.FlagDesc("prompt.add.stdin")) + cmd.Flags().BoolVar(&fromStdin, + "stdin", false, 
assets.FlagDesc(assets.FlagDescKeyPromptAddStdin), + ) return cmd } diff --git a/internal/cli/prompt/cmd/add/doc.go b/internal/cli/prompt/cmd/add/doc.go new file mode 100644 index 00000000..0ccab381 --- /dev/null +++ b/internal/cli/prompt/cmd/add/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package add implements the ctx prompt add subcommand. +// +// It registers a new named prompt template, taking content from an +// embedded starter template or stdin. +package add diff --git a/internal/cli/prompt/cmd/add/run.go b/internal/cli/prompt/cmd/add/run.go index 75bc5033..3ecf88ed 100644 --- a/internal/cli/prompt/cmd/add/run.go +++ b/internal/cli/prompt/cmd/add/run.go @@ -7,16 +7,18 @@ package add import ( - "fmt" "io" "os" "path/filepath" + "github.com/ActiveMemory/ctx/internal/config/file" + "github.com/ActiveMemory/ctx/internal/config/fs" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/assets" "github.com/ActiveMemory/ctx/internal/cli/prompt/core" - "github.com/ActiveMemory/ctx/internal/config" + ctxerr "github.com/ActiveMemory/ctx/internal/err" + "github.com/ActiveMemory/ctx/internal/write" ) // runAdd creates a new prompt template file. @@ -30,15 +32,15 @@ import ( // - error: Non-nil on write failure, duplicate name, or missing template func runAdd(cmd *cobra.Command, name string, fromStdin bool) error { dir := core.PromptsDir() - if mkdirErr := os.MkdirAll(dir, config.PermExec); mkdirErr != nil { - return fmt.Errorf("create prompts directory: %w", mkdirErr) + if mkdirErr := os.MkdirAll(dir, fs.PermExec); mkdirErr != nil { + return ctxerr.Mkdir("prompts directory", mkdirErr) } - path := filepath.Join(dir, name+config.ExtMarkdown) + path := filepath.Join(dir, name+file.ExtMarkdown) // Check if file already exists.
if _, statErr := os.Stat(path); statErr == nil { - return fmt.Errorf("prompt %q already exists", name) + return ctxerr.PromptExists(name) } var content []byte @@ -47,21 +49,21 @@ func runAdd(cmd *cobra.Command, name string, fromStdin bool) error { var readErr error content, readErr = io.ReadAll(cmd.InOrStdin()) if readErr != nil { - return fmt.Errorf("read stdin: %w", readErr) + return ctxerr.ReadInput(readErr) } } else { // Try to load from embedded starter templates. var templateErr error - content, templateErr = assets.PromptTemplate(name + config.ExtMarkdown) + content, templateErr = assets.PromptTemplate(name + file.ExtMarkdown) if templateErr != nil { - return fmt.Errorf("no embedded template %q — use --stdin to provide content", name) + return ctxerr.NoPromptTemplate(name) } } - if writeErr := os.WriteFile(path, content, config.PermFile); writeErr != nil { - return fmt.Errorf("write prompt: %w", writeErr) + if writeErr := os.WriteFile(path, content, fs.PermFile); writeErr != nil { + return ctxerr.WriteFileFailed(writeErr) } - cmd.Println(fmt.Sprintf("Created prompt %q.", name)) + write.PromptCreated(cmd, name) return nil } diff --git a/internal/cli/prompt/cmd/list/cmd.go b/internal/cli/prompt/cmd/list/cmd.go index 8202f2c3..15bd6387 100644 --- a/internal/cli/prompt/cmd/list/cmd.go +++ b/internal/cli/prompt/cmd/list/cmd.go @@ -17,14 +17,14 @@ import ( // Returns: // - *cobra.Command: Configured list subcommand func Cmd() *cobra.Command { - short, _ := assets.CommandDesc("prompt.list") + short, _ := assets.CommandDesc(assets.CmdDescKeyPromptList) return &cobra.Command{ Use: "list", Short: short, Args: cobra.NoArgs, RunE: func(cmd *cobra.Command, _ []string) error { - return RunList(cmd) + return Run(cmd) }, } } diff --git a/internal/cli/prompt/cmd/list/doc.go b/internal/cli/prompt/cmd/list/doc.go new file mode 100644 index 00000000..834def0b --- /dev/null +++ b/internal/cli/prompt/cmd/list/doc.go @@ -0,0 +1,10 @@ +// / ctx: https://ctx.ist +// ,'`./ do you 
remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package list implements the ctx prompt list subcommand. +// +// It displays all registered prompt templates. +package list diff --git a/internal/cli/prompt/cmd/list/run.go b/internal/cli/prompt/cmd/list/run.go index 4f9597d8..7320b40b 100644 --- a/internal/cli/prompt/cmd/list/run.go +++ b/internal/cli/prompt/cmd/list/run.go @@ -7,47 +7,48 @@ package list import ( - "fmt" "os" "strings" + "github.com/ActiveMemory/ctx/internal/config/file" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/prompt/core" - "github.com/ActiveMemory/ctx/internal/config" + ctxerr "github.com/ActiveMemory/ctx/internal/err" + "github.com/ActiveMemory/ctx/internal/write" ) -// RunList prints all available prompt template names. +// Run prints all available prompt template names. // // Parameters: // - cmd: Cobra command for output // // Returns: // - error: Non-nil on read failure -func RunList(cmd *cobra.Command) error { +func Run(cmd *cobra.Command) error { dir := core.PromptsDir() entries, readErr := os.ReadDir(dir) if readErr != nil { if os.IsNotExist(readErr) { - cmd.Println("No prompts found. Run 'ctx init' or 'ctx prompt add' to create prompts.") + write.PromptNone(cmd) return nil } - return fmt.Errorf("read prompts directory: %w", readErr) + return ctxerr.ReadDirectory(dir, readErr) } var found bool for _, entry := range entries { name := entry.Name() - if entry.IsDir() || !strings.HasSuffix(name, config.ExtMarkdown) { + if entry.IsDir() || !strings.HasSuffix(name, file.ExtMarkdown) { continue } - cmd.Println(fmt.Sprintf(" %s", strings.TrimSuffix(name, config.ExtMarkdown))) + write.PromptItem(cmd, strings.TrimSuffix(name, file.ExtMarkdown)) found = true } if !found { - cmd.Println("No prompts found. 
Run 'ctx init' or 'ctx prompt add' to create prompts.") + write.PromptNone(cmd) } return nil diff --git a/internal/cli/prompt/cmd/rm/cmd.go b/internal/cli/prompt/cmd/rm/cmd.go index 3905742b..b841d05d 100644 --- a/internal/cli/prompt/cmd/rm/cmd.go +++ b/internal/cli/prompt/cmd/rm/cmd.go @@ -17,14 +17,14 @@ import ( // Returns: // - *cobra.Command: Configured rm subcommand func Cmd() *cobra.Command { - short, _ := assets.CommandDesc("prompt.rm") + short, _ := assets.CommandDesc(assets.CmdDescKeyPromptRm) return &cobra.Command{ Use: "rm NAME", Short: short, Args: cobra.ExactArgs(1), RunE: func(cmd *cobra.Command, args []string) error { - return runRm(cmd, args[0]) + return Run(cmd, args[0]) }, } } diff --git a/internal/cli/prompt/cmd/rm/doc.go b/internal/cli/prompt/cmd/rm/doc.go new file mode 100644 index 00000000..651aed6c --- /dev/null +++ b/internal/cli/prompt/cmd/rm/doc.go @@ -0,0 +1,10 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package rm implements the ctx prompt rm subcommand. +// +// It removes a named prompt template by name. +package rm diff --git a/internal/cli/prompt/cmd/rm/run.go b/internal/cli/prompt/cmd/rm/run.go index 48e7319f..0d6ba544 100644 --- a/internal/cli/prompt/cmd/rm/run.go +++ b/internal/cli/prompt/cmd/rm/run.go @@ -7,17 +7,18 @@ package rm import ( - "fmt" "os" "path/filepath" + "github.com/ActiveMemory/ctx/internal/config/file" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/prompt/core" - "github.com/ActiveMemory/ctx/internal/config" + ctxerr "github.com/ActiveMemory/ctx/internal/err" + "github.com/ActiveMemory/ctx/internal/write" ) -// runRm deletes a prompt template by name. +// Run deletes a prompt template by name. 
// // Parameters: // - cmd: Cobra command for output @@ -25,17 +26,17 @@ import ( // // Returns: // - error: Non-nil on missing template or remove failure -func runRm(cmd *cobra.Command, name string) error { - path := filepath.Join(core.PromptsDir(), name+config.ExtMarkdown) +func Run(cmd *cobra.Command, name string) error { + path := filepath.Join(core.PromptsDir(), name+file.ExtMarkdown) if _, statErr := os.Stat(path); os.IsNotExist(statErr) { - return fmt.Errorf("prompt %q not found", name) + return ctxerr.PromptNotFound(name) } if removeErr := os.Remove(path); removeErr != nil { - return fmt.Errorf("remove prompt: %w", removeErr) + return ctxerr.RemovePrompt(removeErr) } - cmd.Println(fmt.Sprintf("Removed prompt %q.", name)) + write.PromptRemoved(cmd, name) return nil } diff --git a/internal/cli/prompt/cmd/show/cmd.go b/internal/cli/prompt/cmd/show/cmd.go index a6f449f7..f7e4cec0 100644 --- a/internal/cli/prompt/cmd/show/cmd.go +++ b/internal/cli/prompt/cmd/show/cmd.go @@ -17,14 +17,14 @@ import ( // Returns: // - *cobra.Command: Configured show subcommand func Cmd() *cobra.Command { - short, _ := assets.CommandDesc("prompt.show") + short, _ := assets.CommandDesc(assets.CmdDescKeyPromptShow) return &cobra.Command{ Use: "show NAME", Short: short, Args: cobra.ExactArgs(1), RunE: func(cmd *cobra.Command, args []string) error { - return runShow(cmd, args[0]) + return Run(cmd, args[0]) }, } } diff --git a/internal/cli/prompt/cmd/show/doc.go b/internal/cli/prompt/cmd/show/doc.go new file mode 100644 index 00000000..21d20a8b --- /dev/null +++ b/internal/cli/prompt/cmd/show/doc.go @@ -0,0 +1,10 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package show implements the ctx prompt show subcommand. +// +// It prints the content of a named prompt template. 
+package show diff --git a/internal/cli/prompt/cmd/show/run.go b/internal/cli/prompt/cmd/show/run.go index b8bb0edd..4155e3d8 100644 --- a/internal/cli/prompt/cmd/show/run.go +++ b/internal/cli/prompt/cmd/show/run.go @@ -7,17 +7,17 @@ package show import ( - "fmt" "os" - "path/filepath" + "github.com/ActiveMemory/ctx/internal/config/file" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/prompt/core" - "github.com/ActiveMemory/ctx/internal/config" + ctxerr "github.com/ActiveMemory/ctx/internal/err" + "github.com/ActiveMemory/ctx/internal/validation" ) -// runShow reads and prints a prompt template by name. +// Run reads and prints a prompt template by name. // // Parameters: // - cmd: Cobra command for output @@ -25,15 +25,15 @@ import ( // // Returns: // - error: Non-nil on read failure or missing template -func runShow(cmd *cobra.Command, name string) error { - path := filepath.Join(core.PromptsDir(), name+config.ExtMarkdown) - - content, readErr := os.ReadFile(path) //nolint:gosec // user-provided name is intentional +func Run(cmd *cobra.Command, name string) error { + content, readErr := validation.SafeReadFile( + core.PromptsDir(), name+file.ExtMarkdown, + ) if readErr != nil { if os.IsNotExist(readErr) { - return fmt.Errorf("prompt %q not found", name) + return ctxerr.PromptNotFound(name) } - return fmt.Errorf("read prompt: %w", readErr) + return ctxerr.ReadFile(readErr) } cmd.Print(string(content)) diff --git a/internal/cli/prompt/core/paths.go b/internal/cli/prompt/core/path.go similarity index 80% rename from internal/cli/prompt/core/paths.go rename to internal/cli/prompt/core/path.go index a49d65e2..ada26ac1 100644 --- a/internal/cli/prompt/core/paths.go +++ b/internal/cli/prompt/core/path.go @@ -9,7 +9,7 @@ package core import ( "path/filepath" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/dir" "github.com/ActiveMemory/ctx/internal/rc" ) @@ -18,5 +18,5 @@ import ( // Returns: // - string: 
Absolute path to .context/prompts/ func PromptsDir() string { - return filepath.Join(rc.ContextDir(), config.DirPrompts) + return filepath.Join(rc.ContextDir(), dir.Prompts) } diff --git a/internal/cli/prompt/prompt.go b/internal/cli/prompt/prompt.go index a9300d90..2aae08da 100644 --- a/internal/cli/prompt/prompt.go +++ b/internal/cli/prompt/prompt.go @@ -23,14 +23,14 @@ import ( // Returns: // - *cobra.Command: Configured prompt command with subcommands func Cmd() *cobra.Command { - short, long := assets.CommandDesc("prompt") + short, long := assets.CommandDesc(assets.CmdDescKeyPrompt) cmd := &cobra.Command{ Use: "prompt", Short: short, Long: long, RunE: func(cmd *cobra.Command, _ []string) error { - return list.RunList(cmd) + return list.Run(cmd) }, } diff --git a/internal/cli/prompt/prompt_test.go b/internal/cli/prompt/prompt_test.go index 82de80df..ae591e31 100644 --- a/internal/cli/prompt/prompt_test.go +++ b/internal/cli/prompt/prompt_test.go @@ -13,9 +13,10 @@ import ( "strings" "testing" + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/config/fs" "github.com/spf13/cobra" - "github.com/ActiveMemory/ctx/internal/config" "github.com/ActiveMemory/ctx/internal/rc" ) @@ -23,9 +24,9 @@ import ( // dir override, and returns the temp dir path. 
 func setup(t *testing.T) string {
 	t.Helper()
-	dir := t.TempDir()
+	tmpDir := t.TempDir()
 	origDir, _ := os.Getwd()
-	if err := os.Chdir(dir); err != nil {
+	if err := os.Chdir(tmpDir); err != nil {
 		t.Fatal(err)
 	}
 	t.Cleanup(func() {
@@ -34,14 +35,14 @@ func setup(t *testing.T) string {
 	})
 	rc.Reset()
-	rc.OverrideContextDir(config.DirContext)
+	rc.OverrideContextDir(dir.Context)
 
-	ctxDir := filepath.Join(dir, config.DirContext)
-	if err := os.MkdirAll(ctxDir, config.PermExec); err != nil {
+	ctxDir := filepath.Join(tmpDir, dir.Context)
+	if err := os.MkdirAll(ctxDir, fs.PermExec); err != nil {
 		t.Fatal(err)
 	}
 
-	return dir
+	return tmpDir
 }
 
 // newPromptCmd builds a fresh command with the given args.
@@ -86,17 +87,17 @@ func TestList_NoDir(t *testing.T) {
 }
 
 func TestList_WithPrompts(t *testing.T) {
-	dir := setup(t)
+	tmpDir := setup(t)
 
 	// Create prompts directory with files
-	promptDir := filepath.Join(dir, config.DirContext, config.DirPrompts)
-	if err := os.MkdirAll(promptDir, config.PermExec); err != nil {
+	promptDir := filepath.Join(tmpDir, dir.Context, dir.Prompts)
+	if err := os.MkdirAll(promptDir, fs.PermExec); err != nil {
 		t.Fatal(err)
 	}
-	if err := os.WriteFile(filepath.Join(promptDir, "review.md"), []byte("# Review"), config.PermFile); err != nil {
+	if err := os.WriteFile(filepath.Join(promptDir, "review.md"), []byte("# Review"), fs.PermFile); err != nil {
 		t.Fatal(err)
 	}
-	if err := os.WriteFile(filepath.Join(promptDir, "debug.md"), []byte("# Debug"), config.PermFile); err != nil {
+	if err := os.WriteFile(filepath.Join(promptDir, "debug.md"), []byte("# Debug"), fs.PermFile); err != nil {
 		t.Fatal(err)
 	}
 
@@ -113,14 +114,14 @@ func TestList_WithPrompts(t *testing.T) {
 }
 
 func TestShow(t *testing.T) {
-	dir := setup(t)
+	tmpDir := setup(t)
 
-	promptDir := filepath.Join(dir, config.DirContext, config.DirPrompts)
-	if err := os.MkdirAll(promptDir, config.PermExec); err != nil {
+	promptDir := filepath.Join(tmpDir, dir.Context, dir.Prompts)
+	if err := os.MkdirAll(promptDir, fs.PermExec); err != nil {
 		t.Fatal(err)
 	}
 
 	content := "# Code Review\n\nReview this code.\n"
-	if err := os.WriteFile(filepath.Join(promptDir, "review.md"), []byte(content), config.PermFile); err != nil {
+	if err := os.WriteFile(filepath.Join(promptDir, "review.md"), []byte(content), fs.PermFile); err != nil {
 		t.Fatal(err)
 	}
 
@@ -146,11 +147,11 @@ func TestShow_Missing(t *testing.T) {
 }
 
 func TestAdd_FromTemplate(t *testing.T) {
-	dir := setup(t)
+	tmpDir := setup(t)
 
 	// Create prompts dir
-	promptDir := filepath.Join(dir, config.DirContext, config.DirPrompts)
-	if err := os.MkdirAll(promptDir, config.PermExec); err != nil {
+	promptDir := filepath.Join(tmpDir, dir.Context, dir.Prompts)
+	if err := os.MkdirAll(promptDir, fs.PermExec); err != nil {
 		t.Fatal(err)
 	}
 
@@ -173,10 +174,10 @@ func TestAdd_FromTemplate(t *testing.T) {
 }
 
 func TestAdd_FromStdin(t *testing.T) {
-	dir := setup(t)
+	tmpDir := setup(t)
 
-	promptDir := filepath.Join(dir, config.DirContext, config.DirPrompts)
-	if err := os.MkdirAll(promptDir, config.PermExec); err != nil {
+	promptDir := filepath.Join(tmpDir, dir.Context, dir.Prompts)
+	if err := os.MkdirAll(promptDir, fs.PermExec); err != nil {
 		t.Fatal(err)
 	}
 
@@ -201,13 +202,13 @@ func TestAdd_FromStdin(t *testing.T) {
 }
 
 func TestAdd_AlreadyExists(t *testing.T) {
-	dir := setup(t)
+	tmpDir := setup(t)
 
-	promptDir := filepath.Join(dir, config.DirContext, config.DirPrompts)
-	if err := os.MkdirAll(promptDir, config.PermExec); err != nil {
+	promptDir := filepath.Join(tmpDir, dir.Context, dir.Prompts)
+	if err := os.MkdirAll(promptDir, fs.PermExec); err != nil {
 		t.Fatal(err)
 	}
-	if err := os.WriteFile(filepath.Join(promptDir, "review.md"), []byte("existing"), config.PermFile); err != nil {
+	if err := os.WriteFile(filepath.Join(promptDir, "review.md"), []byte("existing"), fs.PermFile); err != nil {
 		t.Fatal(err)
 	}
 
@@ -233,13 +234,13 @@ func TestAdd_NoTemplate(t *testing.T) {
 }
 
 func TestRm(t *testing.T) {
-	dir := setup(t)
+	tmpDir := setup(t)
 
-	promptDir := filepath.Join(dir, config.DirContext, config.DirPrompts)
-	if err := os.MkdirAll(promptDir, config.PermExec); err != nil {
+	promptDir := filepath.Join(tmpDir, dir.Context, dir.Prompts)
+	if err := os.MkdirAll(promptDir, fs.PermExec); err != nil {
 		t.Fatal(err)
 	}
-	if err := os.WriteFile(filepath.Join(promptDir, "review.md"), []byte("# Review"), config.PermFile); err != nil {
+	if err := os.WriteFile(filepath.Join(promptDir, "review.md"), []byte("# Review"), fs.PermFile); err != nil {
 		t.Fatal(err)
 	}
diff --git a/internal/cli/recall/cmd/export/cmd.go b/internal/cli/recall/cmd/export/cmd.go
index d03597bb..24056371 100644
--- a/internal/cli/recall/cmd/export/cmd.go
+++ b/internal/cli/recall/cmd/export/cmd.go
@@ -20,55 +20,49 @@ import (
 func Cmd() *cobra.Command {
 	var opts core.ExportOpts
 
-	short, long := assets.CommandDesc("recall.export")
+	short, long := assets.CommandDesc(assets.CmdDescKeyRecallExport)
 
 	cmd := &cobra.Command{
 		Use:   "export [session-id]",
 		Short: short,
 		Long:  long,
 		RunE: func(cmd *cobra.Command, args []string) error {
-			return runExport(cmd, args, opts)
+			return Run(cmd, args, opts)
 		},
 	}
 
 	cmd.Flags().BoolVar(
-		&opts.All, "all", false, assets.FlagDesc("recall.export.all"),
+		&opts.All, "all", false, assets.FlagDesc(assets.FlagDescKeyRecallExportAll),
 	)
 	cmd.Flags().BoolVar(
-		&opts.AllProjects, "all-projects", false, assets.FlagDesc("recall.export.all-projects"),
+		&opts.AllProjects, "all-projects", false,
+		assets.FlagDesc(assets.FlagDescKeyRecallExportAllProjects),
 	)
 	cmd.Flags().BoolVar(
 		&opts.Regenerate, "regenerate", false,
-		assets.FlagDesc("recall.export.regenerate"),
+		assets.FlagDesc(assets.FlagDescKeyRecallExportRegenerate),
 	)
 	cmd.Flags().BoolVar(
 		&opts.KeepFrontmatter, "keep-frontmatter", true,
-		assets.FlagDesc("recall.export.keep-frontmatter"),
+		assets.FlagDesc(assets.FlagDescKeyRecallExportKeepFrontmatter),
 	)
-	// Deprecated: --force is replaced by --keep-frontmatter=false.
-	cmd.Flags().BoolVar(
-		&opts.Force,
-		"force", false,
-		assets.FlagDesc("recall.export.force"),
-	)
-	_ = cmd.Flags().MarkDeprecated("force", "use --keep-frontmatter=false instead")
 	cmd.Flags().BoolVarP(
 		&opts.Yes, "yes", "y", false,
-		assets.FlagDesc("recall.export.yes"),
+		assets.FlagDesc(assets.FlagDescKeyRecallExportYes),
 	)
 	cmd.Flags().BoolVar(
 		&opts.DryRun, "dry-run", false,
-		assets.FlagDesc("recall.export.dry-run"),
+		assets.FlagDesc(assets.FlagDescKeyRecallExportDryRun),
 	)
 
 	// Deprecated: --skip-existing is now the default behavior for --all.
 	var skipExisting bool
-	cmd.Flags().BoolVar(&skipExisting, "skip-existing", false, assets.FlagDesc("recall.export.skip-existing"))
+	cmd.Flags().BoolVar(&skipExisting, "skip-existing", false, assets.FlagDesc(assets.FlagDescKeyRecallExportSkipExisting))
 	_ = cmd.Flags().MarkDeprecated("skip-existing", "this is now the default behavior for --all")
 
 	return cmd
diff --git a/internal/cli/recall/cmd/export/doc.go b/internal/cli/recall/cmd/export/doc.go
new file mode 100644
index 00000000..0bf58507
--- /dev/null
+++ b/internal/cli/recall/cmd/export/doc.go
@@ -0,0 +1,10 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package export implements the ctx recall export subcommand.
+//
+// It exports parsed sessions to journal markdown files.
+package export
diff --git a/internal/cli/recall/cmd/export/run.go b/internal/cli/recall/cmd/export/run.go
index e1df34c7..08dd6290 100644
--- a/internal/cli/recall/cmd/export/run.go
+++ b/internal/cli/recall/cmd/export/run.go
@@ -11,10 +11,13 @@ import (
 	"path/filepath"
 	"strings"
 
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	"github.com/ActiveMemory/ctx/internal/config/journal"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/cli/recall/core"
-	"github.com/ActiveMemory/ctx/internal/config"
 	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 	"github.com/ActiveMemory/ctx/internal/journal/state"
 	"github.com/ActiveMemory/ctx/internal/rc"
@@ -22,87 +25,7 @@ import (
 	"github.com/ActiveMemory/ctx/internal/write"
 )
 
-// executeExport writes files according to the plan.
-//
-// Parameters:
-//   - cmd: Cobra command for output.
-//   - plan: the export plan with file actions.
-//   - jstate: journal state to update as files are exported.
-//   - opts: export flag values.
-//
-// Returns:
-//   - exported: number of new files written.
-//   - updated: number of existing files updated (frontmatter preserved).
-//   - skipped: number of files skipped (existing or locked).
-func executeExport(
-	cmd *cobra.Command,
-	plan core.ExportPlan,
-	jstate *state.JournalState,
-	opts core.ExportOpts,
-) (exported, updated, skipped int) {
-	for _, fa := range plan.Actions {
-		if fa.Action == core.ActionLocked {
-			skipped++
-			write.SkipFile(cmd, fa.Filename, config.FrontmatterLocked)
-			continue
-		}
-		if fa.Action == core.ActionSkip {
-			skipped++
-			write.SkipFile(cmd, fa.Filename, config.ReasonExists)
-			continue
-		}
-
-		// Generate content, sanitizing any invalid UTF-8.
-		content := strings.ToValidUTF8(
-			core.FormatJournalEntryPart(
-				fa.Session, fa.Messages[fa.StartIdx:fa.EndIdx],
-				fa.StartIdx, fa.Part, fa.TotalParts, fa.BaseName, fa.Title,
-			),
-			config.Ellipsis,
-		)
-
-		fileExists := fa.Action == core.ActionRegenerate
-
-		// Preserve enriched YAML frontmatter from the existing file.
-		discard := opts.DiscardFrontmatter()
-		if fileExists && !discard {
-			existing, readErr := os.ReadFile(filepath.Clean(fa.Path))
-			if readErr == nil {
-				if fm := core.ExtractFrontmatter(string(existing)); fm != "" {
-					content = fm + config.NewlineLF + core.StripFrontmatter(content)
-				}
-			}
-		}
-		if fileExists && discard {
-			jstate.ClearEnriched(fa.Filename)
-		}
-		if fileExists && !discard {
-			updated++
-		} else {
-			exported++
-		}
-
-		// Write file.
-		if writeErr := os.WriteFile(
-			fa.Path, []byte(content), config.PermFile,
-		); writeErr != nil {
-			write.WarnFileErr(cmd, fa.Filename, writeErr)
-			continue
-		}
-
-		jstate.MarkExported(fa.Filename)
-
-		if fileExists && !discard {
-			write.ExportedFile(cmd, fa.Filename, config.ReasonUpdated)
-		} else {
-			write.ExportedFile(cmd, fa.Filename, "")
-		}
-	}
-
-	return exported, updated, skipped
-}
-
-// runExport handles the recall export command.
+// Run handles the recall export command.
 //
 // Parameters:
 //   - cmd: Cobra command for output.
@@ -111,7 +34,7 @@ func executeExport(
 //
 // Returns:
 //   - error: non-nil on validation, scan, or write failures.
-func runExport(cmd *cobra.Command, args []string, opts core.ExportOpts) error {
+func Run(cmd *cobra.Command, args []string, opts core.ExportOpts) error {
 	// --keep-frontmatter=false implies --regenerate
 	// (can't discard without regenerating).
 	if !opts.KeepFrontmatter {
@@ -163,9 +86,9 @@ func runExport(cmd *cobra.Command, args []string, opts core.ExportOpts) error {
 	}
 
 	// 4. Ensure journal directory exists.
-	journalDir := filepath.Join(rc.ContextDir(), config.DirJournal)
-	if mkErr := os.MkdirAll(journalDir, config.PermExec); mkErr != nil {
-		return ctxerr.Mkdir(config.DirJournal, mkErr)
+	journalDir := filepath.Join(rc.ContextDir(), dir.Journal)
+	if mkErr := os.MkdirAll(journalDir, fs.PermExec); mkErr != nil {
+		return ctxerr.Mkdir(dir.Journal, mkErr)
 	}
 
 	// 5. Load state + build index.
@@ -183,14 +106,14 @@ func runExport(cmd *cobra.Command, args []string, opts core.ExportOpts) error {
 	for _, rop := range plan.RenameOps {
 		core.RenameJournalFiles(journalDir, rop.OldBase, rop.NewBase, rop.NumParts)
 		jstate.Rename(
-			rop.OldBase+config.ExtMarkdown, rop.NewBase+config.ExtMarkdown,
+			rop.OldBase+file.ExtMarkdown, rop.NewBase+file.ExtMarkdown,
 		)
 		renamed++
 	}
 
 	// 8. Dry-run → print summary and return.
 	if opts.DryRun {
-		write.ExportSummary(cmd, core.PlanCounts(plan), true)
+		write.ExportSummary(cmd, plan.NewCount, plan.RegenCount, plan.SkipCount, plan.LockedCount, true)
 		return nil
 	}
 
@@ -207,11 +130,11 @@ func runExport(cmd *cobra.Command, args []string, opts core.ExportOpts) error {
 	}
 
 	// 10. Execute the export.
-	exported, updated, skipped := executeExport(cmd, plan, jstate, opts)
+	exported, updated, skipped := core.ExecuteExport(cmd, plan, jstate, opts)
 
 	// 11. Persist journal state.
 	if saveErr := jstate.Save(journalDir); saveErr != nil {
-		write.WarnFileErr(cmd, config.FileJournalState, saveErr)
+		write.WarnFileErr(cmd, journal.FileState, saveErr)
 	}
 
 	// 12. Print final summary.
diff --git a/internal/cli/recall/cmd/list/cmd.go b/internal/cli/recall/cmd/list/cmd.go
index 743bcfb2..a7091f0e 100644
--- a/internal/cli/recall/cmd/list/cmd.go
+++ b/internal/cli/recall/cmd/list/cmd.go
@@ -7,6 +7,7 @@
 package list
 
 import (
+	"github.com/ActiveMemory/ctx/internal/config/journal"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/assets"
@@ -26,23 +27,35 @@ func Cmd() *cobra.Command {
 		allProjects bool
 	)
 
-	short, long := assets.CommandDesc("recall.list")
+	short, long := assets.CommandDesc(assets.CmdDescKeyRecallList)
 
 	cmd := &cobra.Command{
 		Use:   "list",
 		Short: short,
 		Long:  long,
 		RunE: func(cmd *cobra.Command, args []string) error {
-			return runList(cmd, limit, project, tool, since, until, allProjects)
+			return Run(cmd, limit, project, tool, since, until, allProjects)
 		},
 	}
 
-	cmd.Flags().IntVarP(&limit, "limit", "n", 20, assets.FlagDesc("recall.list.limit"))
-	cmd.Flags().StringVarP(&project, "project", "p", "", assets.FlagDesc("recall.list.project"))
-	cmd.Flags().StringVarP(&tool, "tool", "t", "", assets.FlagDesc("recall.list.tool"))
-	cmd.Flags().StringVar(&since, "since", "", assets.FlagDesc("recall.list.since"))
-	cmd.Flags().StringVar(&until, "until", "", assets.FlagDesc("recall.list.until"))
-	cmd.Flags().BoolVar(&allProjects, "all-projects", false, assets.FlagDesc("recall.list.all-projects"))
+	cmd.Flags().IntVarP(&limit, "limit", "n", journal.DefaultRecallListLimit,
+		assets.FlagDesc(assets.FlagDescKeyRecallListLimit),
+	)
+	cmd.Flags().StringVarP(&project, "project", "p", "",
+		assets.FlagDesc(assets.FlagDescKeyRecallListProject),
+	)
+	cmd.Flags().StringVarP(&tool, "tool", "t", "",
+		assets.FlagDesc(assets.FlagDescKeyRecallListTool),
+	)
+	cmd.Flags().StringVar(&since, "since", "",
+		assets.FlagDesc(assets.FlagDescKeyRecallListSince),
+	)
+	cmd.Flags().StringVar(&until, "until", "",
+		assets.FlagDesc(assets.FlagDescKeyRecallListUntil),
+	)
+	cmd.Flags().BoolVar(&allProjects, "all-projects", false,
+		assets.FlagDesc(assets.FlagDescKeyRecallListAllProjects),
+	)
 
 	return cmd
 }
diff --git a/internal/cli/recall/cmd/list/doc.go b/internal/cli/recall/cmd/list/doc.go
new file mode 100644
index 00000000..3629d607
--- /dev/null
+++ b/internal/cli/recall/cmd/list/doc.go
@@ -0,0 +1,11 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package list implements the ctx recall list subcommand.
+//
+// It lists parsed sessions with optional filtering by project, tool,
+// and date range.
+package list
diff --git a/internal/cli/recall/cmd/list/run.go b/internal/cli/recall/cmd/list/run.go
index 00128530..46c62190 100644
--- a/internal/cli/recall/cmd/list/run.go
+++ b/internal/cli/recall/cmd/list/run.go
@@ -10,17 +10,19 @@ import (
 	"fmt"
 	"strings"
 
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/journal"
+	"github.com/ActiveMemory/ctx/internal/config/time"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/cli/recall/core"
-	"github.com/ActiveMemory/ctx/internal/config"
 	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 	"github.com/ActiveMemory/ctx/internal/parse"
 	"github.com/ActiveMemory/ctx/internal/recall/parser"
 	"github.com/ActiveMemory/ctx/internal/write"
 )
 
-// runList handles the recall list command.
+// Run handles the recall list command.
 //
 // Finds all sessions, applies optional filters, and displays them in a
 // formatted list with project, time, turn count, and preview.
@@ -36,7 +38,7 @@ import (
 //
 // Returns:
 //   - error: non-nil if date parsing or session scanning fails
-func runList(
+func Run(
 	cmd *cobra.Command,
 	limit int,
 	project, tool, since, until string,
 	allProjects bool,
@@ -44,15 +46,15 @@ func runList(
 	// Parse date filters
 	sinceTime, sinceErr := parse.Date(since)
 	if since != "" && sinceErr != nil {
-		return ctxerr.InvalidDate(config.FlagSince, since, sinceErr)
+		return ctxerr.InvalidDate(assets.FlagSince, since, sinceErr)
 	}
 	untilTime, untilErr := parse.Date(until)
 	if until != "" && untilErr != nil {
-		return ctxerr.InvalidDate(config.FlagUntil, until, untilErr)
+		return ctxerr.InvalidDate(assets.FlagUntil, until, untilErr)
 	}
 	// --until is inclusive: advance to the end of the day
 	if until != "" {
-		untilTime = untilTime.Add(config.InclusiveUntilOffset)
+		untilTime = untilTime.Add(time.InclusiveUntilOffset)
 	}
 
 	sessions, scanErr := core.FindSessions(allProjects)
@@ -102,9 +104,9 @@ func runList(
 	write.SessionListHeader(cmd, len(sessions), shown)
 
 	// Compute dynamic column widths from data.
-	slugW, projW := len(config.ColSlug), len(config.ColProject)
+	slugW, projW := len(assets.ColSlug), len(assets.ColProject)
 	for _, s := range filtered {
-		slug := core.Truncate(s.Slug, config.SlugMaxLen)
+		slug := core.Truncate(s.Slug, journal.SlugMaxLen)
 		if len(slug) > slugW {
 			slugW = len(slug)
 		}
@@ -114,15 +116,15 @@ func runList(
 	}
 
 	// Print column header.
-	rowFmt := fmt.Sprintf(config.TplRecallListRow, slugW, projW)
+	rowFmt := fmt.Sprintf(assets.TplRecallListRow, slugW, projW)
 	write.SessionListRow(cmd, rowFmt,
-		config.ColSlug, config.ColProject, config.ColDate,
-		config.ColDuration, config.ColTurns, config.ColTokens)
+		assets.ColSlug, assets.ColProject, assets.ColDate,
+		assets.ColDuration, assets.ColTurns, assets.ColTokens)
 
 	// Print sessions.
 	for _, s := range filtered {
-		slug := core.Truncate(s.Slug, config.SlugMaxLen)
-		dateStr := s.StartTime.Local().Format(config.DateTimeFormat)
+		slug := core.Truncate(s.Slug, journal.SlugMaxLen)
+		dateStr := s.StartTime.Local().Format(time.DateTimeFormat)
 		dur := core.FormatDuration(s.Duration)
 		turns := fmt.Sprintf("%d", s.TurnCount)
 		tokens := ""
diff --git a/internal/cli/recall/cmd/lock/cmd.go b/internal/cli/recall/cmd/lock/cmd.go
index 57c11995..22d8237e 100644
--- a/internal/cli/recall/cmd/lock/cmd.go
+++ b/internal/cli/recall/cmd/lock/cmd.go
@@ -22,7 +22,7 @@ import (
 func Cmd() *cobra.Command {
 	var all bool
 
-	short, long := assets.CommandDesc("recall.lock")
+	short, long := assets.CommandDesc(assets.CmdDescKeyRecallLock)
 
 	cmd := &cobra.Command{
 		Use: "lock ",
@@ -33,7 +33,9 @@ func Cmd() *cobra.Command {
 		},
 	}
 
-	cmd.Flags().BoolVar(&all, "all", false, assets.FlagDesc("recall.lock.all"))
+	cmd.Flags().BoolVar(&all, "all", false,
+		assets.FlagDesc(assets.FlagDescKeyRecallLockAll),
+	)
 
 	return cmd
 }
diff --git a/internal/cli/recall/cmd/lock/doc.go b/internal/cli/recall/cmd/lock/doc.go
new file mode 100644
index 00000000..bf60c66c
--- /dev/null
+++ b/internal/cli/recall/cmd/lock/doc.go
@@ -0,0 +1,11 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package lock implements the ctx recall lock subcommand.
+//
+// It protects journal entries from being overwritten by export
+// --regenerate, marking them as locked in the state file.
+package lock
diff --git a/internal/cli/recall/cmd/show/cmd.go b/internal/cli/recall/cmd/show/cmd.go
index 3d90a8d8..5c602729 100644
--- a/internal/cli/recall/cmd/show/cmd.go
+++ b/internal/cli/recall/cmd/show/cmd.go
@@ -23,20 +23,26 @@ func Cmd() *cobra.Command {
 		allProjects bool
 	)
 
-	short, long := assets.CommandDesc("recall.show")
+	short, long := assets.CommandDesc(assets.CmdDescKeyRecallShow)
 
 	cmd := &cobra.Command{
 		Use:   "show [session-id]",
 		Short: short,
 		Long:  long,
 		RunE: func(cmd *cobra.Command, args []string) error {
-			return runShow(cmd, args, latest, full, allProjects)
+			return Run(cmd, args, latest, full, allProjects)
 		},
 	}
 
-	cmd.Flags().BoolVar(&latest, "latest", false, assets.FlagDesc("recall.show.latest"))
-	cmd.Flags().BoolVar(&full, "full", false, assets.FlagDesc("recall.show.full"))
-	cmd.Flags().BoolVar(&allProjects, "all-projects", false, assets.FlagDesc("recall.show.all-projects"))
+	cmd.Flags().BoolVar(&latest, "latest", false,
+		assets.FlagDesc(assets.FlagDescKeyRecallShowLatest),
+	)
+	cmd.Flags().BoolVar(&full, "full", false,
+		assets.FlagDesc(assets.FlagDescKeyRecallShowFull),
+	)
+	cmd.Flags().BoolVar(&allProjects, "all-projects", false,
+		assets.FlagDesc(assets.FlagDescKeyRecallShowAllProjects),
+	)
 
 	return cmd
 }
diff --git a/internal/cli/recall/cmd/show/doc.go b/internal/cli/recall/cmd/show/doc.go
new file mode 100644
index 00000000..e9b784ca
--- /dev/null
+++ b/internal/cli/recall/cmd/show/doc.go
@@ -0,0 +1,10 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package show implements the ctx recall show subcommand.
+//
+// It displays detailed information about a specific parsed session.
+package show
diff --git a/internal/cli/recall/cmd/show/run.go b/internal/cli/recall/cmd/show/run.go
index 9f115030..d7b69f0d 100644
--- a/internal/cli/recall/cmd/show/run.go
+++ b/internal/cli/recall/cmd/show/run.go
@@ -9,16 +9,19 @@ package show
 import (
 	"strings"
 
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/journal"
+	"github.com/ActiveMemory/ctx/internal/config/time"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/cli/recall/core"
-	"github.com/ActiveMemory/ctx/internal/config"
 	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 	"github.com/ActiveMemory/ctx/internal/recall/parser"
 	"github.com/ActiveMemory/ctx/internal/write"
 )
 
-// runShow handles the recall show command.
+// Run handles the recall show command.
 //
 // Displays detailed information about a session including metadata, token
 // usage, tool usage summary, and optionally the full conversation.
@@ -32,7 +35,7 @@ import (
 //
 // Returns:
 //   - error: non-nil if session not found or scanning fails
-func runShow(
+func Run(
 	cmd *cobra.Command, args []string, latest, full, allProjects bool,
 ) error {
 	sessions, scanErr := core.FindSessions(allProjects)
@@ -44,7 +47,7 @@ func runShow(
 		if allProjects {
 			return ctxerr.NoSessionsFound("")
 		}
-		return ctxerr.NoSessionsFound(config.HintUseAllProjects)
+		return ctxerr.NoSessionsFound(assets.HintUseAllProjects)
 	}
 
 	var session *parser.Session
@@ -69,48 +72,29 @@ func runShow(
 	if len(matches) > 1 {
 		lines := core.FormatSessionMatchLines(matches)
 		write.AmbiguousSessionMatchWithHint(
-			cmd, args[0], lines, matches[0].ID[:config.SessionIDHintLen],
+			cmd, args[0], lines, matches[0].ID[:journal.SessionIDHintLen],
 		)
 		return ctxerr.AmbiguousQuery()
 		}
 		session = matches[0]
 	}
 
-	// Print session details
-	write.SectionHeader(cmd, 1, session.Slug)
-
-	write.SessionDetail(cmd, config.MetadataID, session.ID)
-	write.SessionDetail(cmd, config.MetadataTool, session.Tool)
-	write.SessionDetail(cmd, config.MetadataProject, session.Project)
-	if session.GitBranch != "" {
-		write.SessionDetail(cmd, config.MetadataBranch, session.GitBranch)
-	}
-	if session.Model != "" {
-		write.SessionDetail(cmd, config.MetadataModel, session.Model)
-	}
-	write.BlankLine(cmd)
-
-	write.SessionDetail(
-		cmd, config.MetadataStarted,
-		session.StartTime.Format(config.DateTimePreciseFormat),
-	)
-	write.SessionDetail(
-		cmd, config.MetadataDuration, core.FormatDuration(session.Duration),
-	)
-	write.SessionDetailInt(cmd, config.MetadataTurns, session.TurnCount)
-	write.SessionDetailInt(cmd, config.MetadataMessages, len(session.Messages))
-	write.BlankLine(cmd)
-
-	write.SessionDetail(
-		cmd, config.MetadataInputUsage, core.FormatTokens(session.TotalTokensIn),
-	)
-	write.SessionDetail(
-		cmd, config.MetadataOutputUsage, core.FormatTokens(session.TotalTokensOut),
-	)
-	write.SessionDetail(
-		cmd, config.MetadataTotal, core.FormatTokens(session.TotalTokens),
-	)
-	write.BlankLine(cmd)
+	// Print session details.
+	write.SessionMetadata(cmd, write.SessionInfo{
+		Slug:      session.Slug,
+		ID:        session.ID,
+		Tool:      session.Tool,
+		Project:   session.Project,
+		Branch:    session.GitBranch,
+		Model:     session.Model,
+		Started:   session.StartTime.Format(time.DateTimePreciseFormat),
+		Duration:  core.FormatDuration(session.Duration),
+		Turns:     session.TurnCount,
+		Messages:  len(session.Messages),
+		TokensIn:  core.FormatTokens(session.TotalTokensIn),
+		TokensOut: core.FormatTokens(session.TotalTokensOut),
+		TokensAll: core.FormatTokens(session.TotalTokens),
+	})
 
 	// Tool usage summary
 	tools := session.AllToolUses()
@@ -120,7 +104,7 @@ func runShow(
 		toolCounts[t.Name]++
 	}
 
-	write.SectionHeader(cmd, 2, config.SectionToolUsage)
+	write.SectionHeader(cmd, 2, assets.SectionToolUsage)
 	for name, count := range toolCounts {
 		write.ListItem(cmd, "%s: %d", name, count)
 	}
@@ -129,18 +113,18 @@ func runShow(
 
 	// Messages
 	if full {
-		write.SectionHeader(cmd, 2, config.SectionConversation)
+		write.SectionHeader(cmd, 2, assets.SectionConversation)
 
 		for i, msg := range session.Messages {
-			role := config.LabelRoleUser
+			role := assets.RoleUser
 			if msg.BelongsToAssistant() {
-				role = config.LabelRoleAssistant
+				role = assets.LabelRoleAssistant
 			} else if len(msg.ToolResults) > 0 && msg.Text == "" {
-				role = config.LabelToolOutput
+				role = assets.ToolOutput
 			}
 
 			write.ConversationTurn(
-				cmd, i+1, role, msg.Timestamp.Format(config.TimeFormat),
+				cmd, i+1, role, msg.Timestamp.Format(time.Format),
 			)
 
 			if msg.Text != "" {
@@ -149,12 +133,12 @@ func runShow(
 
 			for _, t := range msg.ToolUses {
 				toolInfo := core.FormatToolUse(t)
-				write.SessionDetail(cmd, config.LabelTool, toolInfo)
+				write.SessionDetail(cmd, assets.LabelTool, toolInfo)
 			}
 
 			for _, tr := range msg.ToolResults {
 				if tr.IsError {
-					write.Hint(cmd, config.LabelError)
+					write.Hint(cmd, assets.LabelError)
 				}
 				if tr.Content != "" {
 					content := core.StripLineNumbers(tr.Content)
@@ -167,25 +151,25 @@ func runShow(
 			}
 		}
 	} else {
-		write.SectionHeader(cmd, 2, config.SectionConversationPreview)
+		write.SectionHeader(cmd, 2, assets.SectionConversationPreview)
 
 		count := 0
 		for _, msg := range session.Messages {
 			if msg.BelongsToUser() && msg.Text != "" {
 				count++
-				if count > config.PreviewMaxTurns {
-					write.MoreTurns(cmd, session.TurnCount-config.PreviewMaxTurns)
+				if count > journal.PreviewMaxTurns {
+					write.MoreTurns(cmd, session.TurnCount-journal.PreviewMaxTurns)
 					break
 				}
 				text := msg.Text
-				if len(text) > config.PreviewMaxTextLen {
-					text = text[:config.PreviewMaxTextLen] + config.Ellipsis
+				if len(text) > journal.PreviewMaxTextLen {
+					text = text[:journal.PreviewMaxTextLen] + token.Ellipsis
 				}
 				write.NumberedItem(cmd, count, text)
 			}
 		}
 		write.BlankLine(cmd)
-		write.Hint(cmd, config.HintUseFullFlag)
+		write.Hint(cmd, assets.HintUseFullFlag)
 	}
 
 	return nil
diff --git a/internal/cli/recall/cmd/sync/cmd.go b/internal/cli/recall/cmd/sync/cmd.go
index 00eb7ef5..815c6003 100644
--- a/internal/cli/recall/cmd/sync/cmd.go
+++ b/internal/cli/recall/cmd/sync/cmd.go
@@ -21,14 +21,14 @@ import (
 // Returns:
 //   - *cobra.Command: Command for syncing lock state from frontmatter
 func Cmd() *cobra.Command {
-	short, long := assets.CommandDesc("recall.sync")
+	short, long := assets.CommandDesc(assets.CmdDescKeyRecallSync)
 
 	cmd := &cobra.Command{
 		Use:   "sync",
 		Short: short,
 		Long:  long,
 		RunE: func(cmd *cobra.Command, _ []string) error {
-			return runSync(cmd)
+			return Run(cmd)
 		},
 	}
diff --git a/internal/cli/recall/cmd/sync/doc.go b/internal/cli/recall/cmd/sync/doc.go
new file mode 100644
index 00000000..bb08d59b
--- /dev/null
+++ b/internal/cli/recall/cmd/sync/doc.go
@@ -0,0 +1,11 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package sync implements the ctx recall sync subcommand.
+//
+// It scans journal markdowns and syncs their frontmatter lock state
+// into the state file, treating frontmatter as the source of truth.
+package sync
diff --git a/internal/cli/recall/cmd/sync/run.go b/internal/cli/recall/cmd/sync/run.go
index ff9ad92c..6b9fc138 100644
--- a/internal/cli/recall/cmd/sync/run.go
+++ b/internal/cli/recall/cmd/sync/run.go
@@ -7,18 +7,20 @@
 package sync
 
 import (
-	"fmt"
 	"path/filepath"
 
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+	"github.com/ActiveMemory/ctx/internal/config/journal"
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/cli/recall/core"
-	"github.com/ActiveMemory/ctx/internal/config"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 	"github.com/ActiveMemory/ctx/internal/journal/state"
 	"github.com/ActiveMemory/ctx/internal/rc"
+	"github.com/ActiveMemory/ctx/internal/write"
 )
 
-// runSync scans all journal markdowns and syncs frontmatter lock state
+// Run scans all journal markdowns and syncs frontmatter lock state
 // to .state.json.
 //
 // Parameters:
@@ -26,12 +28,12 @@ import (
 //
 // Returns:
 //   - error: Non-nil on I/O failure
-func runSync(cmd *cobra.Command) error {
-	journalDir := filepath.Join(rc.ContextDir(), config.DirJournal)
+func Run(cmd *cobra.Command) error {
+	journalDir := filepath.Join(rc.ContextDir(), dir.Journal)
 
 	jstate, loadErr := state.Load(journalDir)
 	if loadErr != nil {
-		return fmt.Errorf("load journal state: %w", loadErr)
+		return ctxerr.LoadJournalState(loadErr)
 	}
 
 	files, matchErr := core.MatchJournalFiles(journalDir, nil, true)
@@ -39,7 +41,7 @@ func runSync(cmd *cobra.Command) error {
 		return matchErr
 	}
 	if len(files) == 0 {
-		cmd.Println("No journal entries found.")
+		write.JournalSyncNone(cmd)
 		return nil
 	}
 
@@ -52,30 +54,21 @@ func runSync(cmd *cobra.Command) error {
 		switch {
 		case fmLocked && !stateLocked:
-			jstate.Mark(filename, "locked")
-			cmd.Println(fmt.Sprintf(" ✓ %s (locked)", filename))
+			jstate.Mark(filename, journal.StageLocked)
+			write.JournalSyncLocked(cmd, filename)
 			locked++
 		case !fmLocked && stateLocked:
-			jstate.Clear(filename, "locked")
-			cmd.Println(fmt.Sprintf(" ✓ %s (unlocked)", filename))
+			jstate.Clear(filename, journal.StageLocked)
+			write.JournalSyncUnlocked(cmd, filename)
 			unlocked++
 		}
 	}
 
 	if saveErr := jstate.Save(journalDir); saveErr != nil {
-		return fmt.Errorf("save journal state: %w", saveErr)
+		return ctxerr.SaveJournalState(saveErr)
 	}
 
-	if locked == 0 && unlocked == 0 {
-		cmd.Println("No changes — state already matches frontmatter.")
-	} else {
-		if locked > 0 {
-			cmd.Println(fmt.Sprintf("\nLocked %d entry(s).", locked))
-		}
-		if unlocked > 0 {
-			cmd.Println(fmt.Sprintf("\nUnlocked %d entry(s).", unlocked))
-		}
-	}
+	write.JournalSyncSummary(cmd, locked, unlocked)
 
 	return nil
 }
diff --git a/internal/cli/recall/cmd/unlock/cmd.go b/internal/cli/recall/cmd/unlock/cmd.go
index ec3f7820..1b8e5ce0 100644
--- a/internal/cli/recall/cmd/unlock/cmd.go
+++ b/internal/cli/recall/cmd/unlock/cmd.go
@@ -22,18 +22,20 @@ import (
 func Cmd() *cobra.Command {
 	var all bool
 
-	short, long := assets.CommandDesc("recall.unlock")
+	short, long := assets.CommandDesc(assets.CmdDescKeyRecallUnlock)
 
 	cmd := &cobra.Command{
 		Use:   "unlock ",
 		Short: short,
 		Long:  long,
 		RunE: func(cmd *cobra.Command, args []string) error {
-			return runUnlock(cmd, args, all)
+			return Run(cmd, args, all)
 		},
 	}
 
-	cmd.Flags().BoolVar(&all, "all", false, assets.FlagDesc("recall.unlock.all"))
+	cmd.Flags().BoolVar(&all, "all", false,
+		assets.FlagDesc(assets.FlagDescKeyRecallUnlockAll),
+	)
 
 	return cmd
 }
diff --git a/internal/cli/recall/cmd/unlock/doc.go b/internal/cli/recall/cmd/unlock/doc.go
new file mode 100644
index 00000000..fadf982b
--- /dev/null
+++ b/internal/cli/recall/cmd/unlock/doc.go
@@ -0,0 +1,11 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package unlock implements the ctx recall unlock subcommand.
+//
+// It removes lock protection from journal entries, allowing export
+// --regenerate to overwrite them again.
+package unlock
diff --git a/internal/cli/recall/cmd/unlock/run.go b/internal/cli/recall/cmd/unlock/run.go
index d40697cf..25916cd2 100644
--- a/internal/cli/recall/cmd/unlock/run.go
+++ b/internal/cli/recall/cmd/unlock/run.go
@@ -12,7 +12,7 @@ import (
 	"github.com/ActiveMemory/ctx/internal/cli/recall/core"
 )
 
-// runUnlock delegates to core.RunLockUnlock with lock=false.
+// Run delegates to core.RunLockUnlock with lock=false.
 //
 // Parameters:
 //   - cmd: Cobra command for output
@@ -21,7 +21,7 @@ import (
 //
 // Returns:
 //   - error: Non-nil on validation or I/O failure
-func runUnlock(
+func Run(
 	cmd *cobra.Command,
 	args []string,
 	all bool,
diff --git a/internal/cli/recall/core/confirm.go b/internal/cli/recall/core/confirm.go
index 70d84789..abdb31fc 100644
--- a/internal/cli/recall/core/confirm.go
+++ b/internal/cli/recall/core/confirm.go
@@ -11,10 +11,13 @@ import (
 	"os"
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/cli"
+	"github.com/ActiveMemory/ctx/internal/config/token"
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
 	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 	"github.com/ActiveMemory/ctx/internal/write"
-	"github.com/spf13/cobra"
 )
 
 // ConfirmExport prints the plan summary and prompts for confirmation.
@@ -27,13 +30,13 @@ import (
 //
 // Returns:
 //   - bool: true if the user confirms.
 //   - error: non-nil if reading input fails.
 func ConfirmExport(cmd *cobra.Command, plan ExportPlan) (bool, error) {
-	write.ExportSummary(cmd, PlanCounts(plan), false)
-	cmd.Print("Proceed? [y/N] ")
+	write.ExportSummary(cmd, plan.NewCount, plan.RegenCount, plan.SkipCount, plan.LockedCount, false)
+	cmd.Print(assets.TextDesc(assets.TextDescKeyConfirmProceed))
 
 	reader := bufio.NewReader(os.Stdin)
-	response, readErr := reader.ReadString('\n')
+	response, readErr := reader.ReadString(token.NewlineLF[0])
 	if readErr != nil {
 		return false, ctxerr.ReadInput(readErr)
 	}
 
 	response = strings.TrimSpace(strings.ToLower(response))
-	return response == config.ConfirmShort || response == config.ConfirmLong, nil
+	return response == cli.ConfirmShort || response == cli.ConfirmLong, nil
 }
diff --git a/internal/cli/recall/core/execute.go b/internal/cli/recall/core/execute.go
new file mode 100644
index 00000000..51e479f8
--- /dev/null
+++ b/internal/cli/recall/core/execute.go
@@ -0,0 +1,101 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package core
+
+import (
+	"os"
+	"path/filepath"
+	"strings"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	"github.com/ActiveMemory/ctx/internal/config/token"
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/journal/state"
+	"github.com/ActiveMemory/ctx/internal/write"
+)
+
+// ExecuteExport writes files according to the plan.
+//
+// Parameters:
+//   - cmd: Cobra command for output.
+//   - plan: the export plan with file actions.
+//   - jstate: journal state to update as files are exported.
+//   - opts: export flag values.
+//
+// Returns:
+//   - exported: number of new files written.
+//   - updated: number of existing files updated (frontmatter preserved).
+//   - skipped: number of files skipped (existing or locked).
+func ExecuteExport(
+	cmd *cobra.Command,
+	plan ExportPlan,
+	jstate *state.JournalState,
+	opts ExportOpts,
+) (exported, updated, skipped int) {
+	for _, fa := range plan.Actions {
+		if fa.Action == ActionLocked {
+			skipped++
+			write.SkipFile(cmd, fa.Filename, assets.FrontmatterLocked)
+			continue
+		}
+		if fa.Action == ActionSkip {
+			skipped++
+			write.SkipFile(cmd, fa.Filename, assets.ReasonExists)
+			continue
+		}
+
+		// Generate content, sanitizing any invalid UTF-8.
+		content := strings.ToValidUTF8(
+			FormatJournalEntryPart(
+				fa.Session, fa.Messages[fa.StartIdx:fa.EndIdx],
+				fa.StartIdx, fa.Part, fa.TotalParts, fa.BaseName, fa.Title,
+			),
+			token.Ellipsis,
+		)
+
+		fileExists := fa.Action == ActionRegenerate
+
+		// Preserve enriched YAML frontmatter from the existing file.
+		discard := opts.DiscardFrontmatter()
+		if fileExists && !discard {
+			existing, readErr := os.ReadFile(filepath.Clean(fa.Path))
+			if readErr == nil {
+				if fm := ExtractFrontmatter(string(existing)); fm != "" {
+					content = fm + token.NewlineLF + StripFrontmatter(content)
+				}
+			}
+		}
+		if fileExists && discard {
+			jstate.ClearEnriched(fa.Filename)
+		}
+		if fileExists && !discard {
+			updated++
+		} else {
+			exported++
+		}
+
+		// Write file.
+		if writeErr := os.WriteFile(
+			fa.Path, []byte(content), fs.PermFile,
+		); writeErr != nil {
+			write.WarnFileErr(cmd, fa.Filename, writeErr)
+			continue
+		}
+
+		jstate.MarkExported(fa.Filename)
+
+		if fileExists && !discard {
+			write.ExportedFile(cmd, fa.Filename, assets.ReasonUpdated)
+		} else {
+			write.ExportedFile(cmd, fa.Filename, "")
+		}
+	}
+
+	return exported, updated, skipped
+}
diff --git a/internal/cli/recall/core/extract.go b/internal/cli/recall/core/extract.go
index 0e952eab..05f06471 100644
--- a/internal/cli/recall/core/extract.go
+++ b/internal/cli/recall/core/extract.go
@@ -9,7 +9,7 @@ package core
 import (
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )
 
 // ExtractFrontmatter returns the YAML frontmatter block from content,
@@ -21,9 +21,9 @@ import (
 // Returns:
 //   - string: The frontmatter block including delimiters, or "" if not found
 func ExtractFrontmatter(content string) string {
-	nl := config.NewlineLF
-	fmOpen := config.Separator + nl
-	fmClose := nl + config.Separator + nl
+	nl := token.NewlineLF
+	fmOpen := token.Separator + nl
+	fmClose := nl + token.Separator + nl
 
 	if !strings.HasPrefix(content, fmOpen) {
 		return ""
@@ -49,5 +49,5 @@ func StripFrontmatter(content string) string {
 	if fm == "" {
 		return content
 	}
-	return strings.TrimLeft(content[len(fm):], config.NewlineLF)
+	return strings.TrimLeft(content[len(fm):], token.NewlineLF)
 }
diff --git a/internal/cli/recall/core/fmt.go b/internal/cli/recall/core/fmt.go
index 34209dc9..7c078e73 100644
--- a/internal/cli/recall/core/fmt.go
+++ b/internal/cli/recall/core/fmt.go
@@ -9,7 +9,9 @@ package core
 import (
 	"fmt"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/journal"
+	"github.com/ActiveMemory/ctx/internal/config/time"
 	"github.com/ActiveMemory/ctx/internal/recall/parser"
 )
 
@@ -24,10 +26,10 @@ func FormatSessionMatchLines(matches []*parser.Session) []string {
 	lines := make([]string, 0, len(matches))
 	for _, m := range matches {
 		lines = append(lines, fmt.Sprintf(
-			config.TplSessionMatch,
+			assets.TplSessionMatch,
 			m.Slug,
-			m.ID[:config.SessionIDShortLen],
-			m.StartTime.Format(config.DateTimeFormat)),
+			m.ID[:journal.SessionIDShortLen],
+			m.StartTime.Format(time.DateTimeFormat)),
 		)
 	}
 	return lines
diff --git a/internal/cli/recall/core/format.go b/internal/cli/recall/core/format.go
index 633babeb..d7df95fd 100644
--- a/internal/cli/recall/core/format.go
+++ b/internal/cli/recall/core/format.go
@@ -12,23 +12,16 @@ import (
 	"html"
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/box"
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/journal"
+	"github.com/ActiveMemory/ctx/internal/config/regex"
+	"github.com/ActiveMemory/ctx/internal/config/time"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"github.com/ActiveMemory/ctx/internal/recall/parser"
 )
 
-// Claude Code tool names used in session transcripts.
-const (
-	ToolRead      = "Read"
-	ToolWrite     = "Write"
-	ToolEdit      = "Edit"
-	ToolBash      = "Bash"
-	ToolGrep      = "Grep"
-	ToolGlob      = "Glob"
-	ToolWebFetch  = "WebFetch"
-	ToolWebSearch = "WebSearch"
-	ToolTask      = "Task"
-)
-
 // FenceForContent returns the appropriate code fence for content.
// // Uses longer fences when content contains backticks to avoid @@ -41,9 +34,9 @@ const ( // Returns: // - string: A fence string (e.g., "```", "````", "`````") func FenceForContent(content string) string { - fence := config.CodeFence + fence := token.CodeFence for strings.Contains(content, fence) { - fence += config.Backtick + fence += token.Backtick } return fence } @@ -64,16 +57,16 @@ func FenceForContent(content string) string { // Returns: // - string: Filename like "2026-01-15-fix-auth-bug-abc12345.md" func FormatJournalFilename(s *parser.Session, slugOverride string) string { - date := s.StartTime.Local().Format("2006-01-02") + date := s.StartTime.Local().Format(time.DateFormat) shortID := s.ID - if len(shortID) > config.RecallShortIDLen { - shortID = shortID[:config.RecallShortIDLen] + if len(shortID) > journal.ShortIDLen { + shortID = shortID[:journal.ShortIDLen] } slug := s.Slug if slugOverride != "" { slug = slugOverride } - return fmt.Sprintf(config.TplRecallFilename, date, slug, shortID) + return fmt.Sprintf(assets.TplRecallFilename, date, slug, shortID) } // FormatJournalEntryPart generates Markdown content for a part of a journal entry.
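The fence-growing loop in FenceForContent is worth pinning down: it lengthens the fence one backtick at a time until the fence string no longer occurs in the content, so the emitted block can never be terminated early by backticks inside it. A standalone sketch of the same loop (function name `fenceFor` and the literal `"```"` starting fence are illustrative stand-ins for `token.CodeFence`/`token.Backtick`):

```go
package main

import (
	"fmt"
	"strings"
)

// fenceFor returns the shortest backtick fence (at least three) that
// does not occur in content, so the fenced block cannot be closed
// prematurely by backticks in the content itself.
func fenceFor(content string) string {
	fence := "```"
	for strings.Contains(content, fence) {
		fence += "`"
	}
	return fence
}

func main() {
	fmt.Println(len(fenceFor("plain text")))        // 3
	fmt.Println(len(fenceFor("has ```go inside")))  // 4
	fmt.Println(len(fenceFor("nested ```` run")))   // 5
}
```

Note the loop re-checks after every growth step, which is what makes a four-backtick run inside the content push the fence to five.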
@@ -99,48 +92,42 @@ func FormatJournalEntryPart( baseName, title string, ) string { var sb strings.Builder - nl := config.NewlineLF - sep := config.Separator + nl := token.NewlineLF + sep := token.Separator // Metadata (YAML frontmatter + HTML details) - only on part 1 if part == 1 { localStart := s.StartTime.Local() - dateStr := localStart.Format("2006-01-02") - timeStr := localStart.Format("15:04:05") + dateStr := localStart.Format(time.DateFormat) + timeStr := localStart.Format(time.Format) durationStr := FormatDuration(s.Duration) // Basic YAML frontmatter sb.WriteString(sep + nl) - sb.WriteString(fmt.Sprintf("date: %q"+nl, dateStr)) - sb.WriteString(fmt.Sprintf("time: %q"+nl, timeStr)) - sb.WriteString(fmt.Sprintf("project: %s"+nl, s.Project)) + writeFmQuoted(&sb, assets.FmKeyDate, dateStr) + writeFmQuoted(&sb, assets.FmKeyTime, timeStr) + writeFmString(&sb, assets.FmKeyProject, s.Project) if s.GitBranch != "" { - sb.WriteString(fmt.Sprintf("branch: %s"+nl, s.GitBranch)) + writeFmString(&sb, assets.FmKeyBranch, s.GitBranch) } if s.Model != "" { - sb.WriteString(fmt.Sprintf("model: %s"+nl, s.Model)) + writeFmString(&sb, assets.FmKeyModel, s.Model) } if s.TotalTokensIn > 0 { - sb.WriteString(fmt.Sprintf("tokens_in: %d"+nl, s.TotalTokensIn)) + writeFmInt(&sb, assets.FmKeyTokensIn, s.TotalTokensIn) } if s.TotalTokensOut > 0 { - sb.WriteString(fmt.Sprintf("tokens_out: %d"+nl, s.TotalTokensOut)) + writeFmInt(&sb, assets.FmKeyTokensOut, s.TotalTokensOut) } - sb.WriteString(fmt.Sprintf("session_id: %q"+nl, s.ID)) + writeFmQuoted(&sb, assets.FmKeySessionID, s.ID) if title != "" { - sb.WriteString(fmt.Sprintf("title: %q"+nl, title)) + writeFmQuoted(&sb, assets.FmKeyTitle, title) } sb.WriteString(sep + nl + nl) // Header — prefer title, fall back to slug, then baseName. 
- heading := title - if heading == "" { - heading = s.Slug - } - if heading == "" { - heading = baseName - } - sb.WriteString(fmt.Sprintf(config.TplJournalPageHeading+nl+nl, heading)) + heading := resolveHeading(title, s.Slug, baseName) + sb.WriteString(fmt.Sprintf(assets.TplJournalPageHeading+nl+nl, heading)) // Navigation header for multipart sessions if totalParts > 1 { @@ -151,63 +138,57 @@ func FormatJournalEntryPart( // Session metadata as collapsible HTML table // (Markdown tables don't render inside
<details> in Zensical) summaryText := fmt.Sprintf("%s · %s · %s", dateStr, durationStr, s.Model) - sb.WriteString(fmt.Sprintf(config.TplMetaDetailsOpen, summaryText)) - sb.WriteString(fmt.Sprintf(config.TplMetaRow+nl, "ID", s.ID)) - sb.WriteString(fmt.Sprintf(config.TplMetaRow+nl, "Date", dateStr)) - sb.WriteString(fmt.Sprintf(config.TplMetaRow+nl, "Time", timeStr)) - sb.WriteString(fmt.Sprintf(config.TplMetaRow+nl, "Duration", durationStr)) - sb.WriteString(fmt.Sprintf(config.TplMetaRow+nl, "Tool", s.Tool)) - sb.WriteString(fmt.Sprintf(config.TplMetaRow+nl, "Project", s.Project)) + sb.WriteString(fmt.Sprintf(assets.TplMetaDetailsOpen, summaryText)) + sb.WriteString(fmt.Sprintf(assets.TplMetaRow+nl, assets.MetaLabelID, s.ID)) + sb.WriteString(fmt.Sprintf(assets.TplMetaRow+nl, assets.MetaLabelDate, dateStr)) + sb.WriteString(fmt.Sprintf(assets.TplMetaRow+nl, assets.MetaLabelTime, timeStr)) + sb.WriteString(fmt.Sprintf(assets.TplMetaRow+nl, assets.MetaLabelDuration, durationStr)) + sb.WriteString(fmt.Sprintf(assets.TplMetaRow+nl, assets.MetaLabelTool, s.Tool)) + sb.WriteString(fmt.Sprintf(assets.TplMetaRow+nl, assets.MetaLabelProject, s.Project)) if s.GitBranch != "" { - sb.WriteString(fmt.Sprintf(config.TplMetaRow+nl, "Branch", s.GitBranch)) + sb.WriteString(fmt.Sprintf(assets.TplMetaRow+nl, assets.MetaLabelBranch, s.GitBranch)) } if s.Model != "" { - sb.WriteString(fmt.Sprintf(config.TplMetaRow+nl, "Model", s.Model)) + sb.WriteString(fmt.Sprintf(assets.TplMetaRow+nl, assets.MetaLabelModel, s.Model)) } - sb.WriteString(config.TplMetaDetailsClose + nl + nl) + sb.WriteString(assets.TplMetaDetailsClose + nl + nl) // Token stats as collapsible HTML table turnStr := fmt.Sprintf("%d", s.TurnCount) - sb.WriteString(fmt.Sprintf(config.TplMetaDetailsOpen, turnStr)) - sb.WriteString(fmt.Sprintf(config.TplMetaRow+nl, "Turns", turnStr)) + sb.WriteString(fmt.Sprintf(assets.TplMetaDetailsOpen, turnStr)) + sb.WriteString(fmt.Sprintf(assets.TplMetaRow+nl, assets.MetaLabelTurns, turnStr))
tokenSummary := fmt.Sprintf("%s (in: %s, out: %s)", FormatTokens(s.TotalTokens), FormatTokens(s.TotalTokensIn), FormatTokens(s.TotalTokensOut)) - sb.WriteString(fmt.Sprintf(config.TplMetaRow+nl, "Tokens", tokenSummary)) + sb.WriteString(fmt.Sprintf(assets.TplMetaRow+nl, assets.MetaLabelTokens, tokenSummary)) if totalParts > 1 { - sb.WriteString(fmt.Sprintf(config.TplMetaRow+nl, "Parts", + sb.WriteString(fmt.Sprintf(assets.TplMetaRow+nl, assets.MetaLabelParts, fmt.Sprintf("%d", totalParts))) } - sb.WriteString(config.TplMetaDetailsClose + nl + nl) + sb.WriteString(assets.TplMetaDetailsClose + nl + nl) sb.WriteString(sep + nl + nl) // Tool usage summary tools := s.AllToolUses() if len(tools) > 0 { - sb.WriteString(config.RecallHeadingToolUsage + nl + nl) + sb.WriteString(assets.RecallHeadingToolUsage + nl + nl) toolCounts := make(map[string]int) for _, t := range tools { toolCounts[t.Name]++ } for name, count := range toolCounts { sb.WriteString(fmt.Sprintf( - config.TplRecallToolCount+nl, name, count), + assets.TplRecallToolCount+nl, name, count), ) } sb.WriteString(nl + sep + nl + nl) } } else { // Header (non-part-1) — same fallback as part 1. 
- heading := title - if heading == "" { - heading = s.Slug - } - if heading == "" { - heading = baseName - } - sb.WriteString(fmt.Sprintf(config.TplJournalPageHeading+nl+nl, heading)) + heading := resolveHeading(title, s.Slug, baseName) + sb.WriteString(fmt.Sprintf(assets.TplJournalPageHeading+nl+nl, heading)) // Navigation header for multipart sessions if totalParts > 1 { @@ -218,25 +199,25 @@ func FormatJournalEntryPart( // Conversation section if part == 1 { - sb.WriteString(config.RecallHeadingConversation + nl + nl) + sb.WriteString(assets.RecallHeadingConversation + nl + nl) } else { sb.WriteString(fmt.Sprintf( - config.TplRecallConversationContinued+nl+nl, part-1), + assets.TplRecallConversationContinued+nl+nl, part-1), ) } for i, msg := range messages { msgNum := startMsgIdx + i + 1 - role := config.LabelRoleUser + role := assets.RoleUser if msg.BelongsToAssistant() { - role = config.LabelRoleAssistant + role = assets.LabelRoleAssistant } else if len(msg.ToolResults) > 0 && msg.Text == "" { - role = config.LabelToolOutput + role = assets.ToolOutput } localTime := msg.Timestamp.Local() - sb.WriteString(fmt.Sprintf(config.TplRecallTurnHeader+nl+nl, - msgNum, role, localTime.Format("15:04:05"))) + sb.WriteString(fmt.Sprintf(assets.TplRecallTurnHeader+nl+nl, + msgNum, role, localTime.Format(time.Format))) if msg.Text != "" { text := msg.Text @@ -250,13 +231,13 @@ func FormatJournalEntryPart( // Tool uses for _, t := range msg.ToolUses { - sb.WriteString(fmt.Sprintf(config.TplRecallToolUse+nl, FormatToolUse(t))) + sb.WriteString(fmt.Sprintf(assets.TplRecallToolUse+nl, FormatToolUse(t))) } // Tool results for _, tr := range msg.ToolResults { if tr.IsError { - sb.WriteString(config.TplRecallErrorMarker + nl) + sb.WriteString(assets.TplRecallErrorMarker + nl) } if tr.Content != "" { content := StripLineNumbers(tr.Content) @@ -264,21 +245,21 @@ func FormatJournalEntryPart( fence := FenceForContent(content) lines := strings.Count(content, nl) - if lines > 
config.RecallDetailsThreshold { - summary := fmt.Sprintf(config.TplRecallDetailsSummary, lines) - sb.WriteString(fmt.Sprintf(config.TplRecallDetailsOpen+nl+nl, summary)) + if lines > journal.DetailsThreshold { + summary := fmt.Sprintf(assets.TplRecallDetailsSummary, lines) + sb.WriteString(fmt.Sprintf(assets.TplRecallDetailsOpen+nl+nl, summary)) sb.WriteString("<pre>" + nl + html.EscapeString(content) + nl + "</pre>
" + nl) - sb.WriteString(config.TplRecallDetailsClose + nl) + sb.WriteString(assets.TplRecallDetailsClose + nl) } else { sb.WriteString(fmt.Sprintf( - config.TplRecallFencedBlock+nl, fence, content, fence), + assets.TplRecallFencedBlock+nl, fence, content, fence), ) } // Render system reminders as Markdown outside the code fence for _, reminder := range reminders { sb.WriteString( - fmt.Sprintf(nl+config.LabelBoldReminder+" %s"+nl, reminder), + fmt.Sprintf(nl+assets.BoldReminder+" %s"+nl, reminder), ) } } @@ -298,6 +279,32 @@ func FormatJournalEntryPart( return sb.String() } +// resolveHeading returns the first non-empty value among title, slug, baseName. +func resolveHeading(title, slug, baseName string) string { + if title != "" { + return title + } + if slug != "" { + return slug + } + return baseName +} + +// writeFmQuoted writes a YAML frontmatter quoted string field. +func writeFmQuoted(sb *strings.Builder, key, value string) { + fmt.Fprintf(sb, assets.TplFmQuoted+token.NewlineLF, key, value) +} + +// writeFmString writes a YAML frontmatter bare string field. +func writeFmString(sb *strings.Builder, key, value string) { + fmt.Fprintf(sb, assets.TplFmString+token.NewlineLF, key, value) +} + +// writeFmInt writes a YAML frontmatter integer field. +func writeFmInt(sb *strings.Builder, key string, value int) { + fmt.Fprintf(sb, assets.TplFmInt+token.NewlineLF, key, value) +} + // FormatPartNavigation generates previous/next navigation links for // multipart sessions. // @@ -311,32 +318,32 @@ func FormatJournalEntryPart( // (e.g., "**Part 2 of 3** | [← Previous](...) 
| [Next →](...)") func FormatPartNavigation(part, totalParts int, baseName string) string { var sb strings.Builder - nl := config.NewlineLF + nl := token.NewlineLF - sb.WriteString(fmt.Sprintf(config.TplRecallPartOf, part, totalParts)) + sb.WriteString(fmt.Sprintf(assets.TplRecallPartOf, part, totalParts)) if part > 1 || part < totalParts { - sb.WriteString(config.PipeSeparator) + sb.WriteString(box.PipeSeparator) } // Previous link if part > 1 { - prevFile := baseName + config.ExtMarkdown + prevFile := baseName + file.ExtMarkdown if part > 2 { - prevFile = fmt.Sprintf(config.TplRecallPartFilename, baseName, part-1) + prevFile = fmt.Sprintf(assets.TplRecallPartFilename, baseName, part-1) } - sb.WriteString(fmt.Sprintf(config.TplRecallNavPrev, prevFile)) + sb.WriteString(fmt.Sprintf(assets.TplRecallNavPrev, prevFile)) } // Separator between prev and next if part > 1 && part < totalParts { - sb.WriteString(config.PipeSeparator) + sb.WriteString(box.PipeSeparator) } // Next link if part < totalParts { - nextFile := fmt.Sprintf(config.TplRecallPartFilename, baseName, part+1) - sb.WriteString(fmt.Sprintf(config.TplRecallNavNext, nextFile)) + nextFile := fmt.Sprintf(assets.TplRecallPartFilename, baseName, part+1) + sb.WriteString(fmt.Sprintf(assets.TplRecallNavNext, nextFile)) } sb.WriteString(nl) @@ -406,7 +413,7 @@ func Truncate(s string, max int) string { // Returns: // - string: Content with line number prefixes removed func StripLineNumbers(content string) string { - return config.RegExLineNumber.ReplaceAllString(content, "") + return regex.LineNumber.ReplaceAllString(content, "") } // ExtractSystemReminders separates system-reminder content from tool output. 
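FormatPartNavigation's prev/next links hinge on one filename rule: part 1 lives at the plain base name, while every later part gets a "-pN" suffix before the extension. A minimal sketch of that rule (function name `partFilename` and the literal ".md"/"%s-p%d.md" formats are illustrative stand-ins for file.ExtMarkdown and assets.TplRecallPartFilename):

```go
package main

import "fmt"

// partFilename returns the on-disk name for a given part of a
// multipart journal entry: the first part keeps the plain base name,
// later parts get a -pN suffix.
func partFilename(base string, part int) string {
	if part <= 1 {
		return base + ".md"
	}
	return fmt.Sprintf("%s-p%d.md", base, part)
}

func main() {
	base := "2026-01-15-fix-auth-bug-abc12345"
	for part := 1; part <= 3; part++ {
		fmt.Println(partFilename(base, part))
	}
}
```

This is why the "Previous" link in the diff special-cases `part > 2`: the neighbor of part 2 is the bare base file, not `base-p1.md`.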
@@ -421,14 +428,14 @@ func StripLineNumbers(content string) string { // - string: Content with system-reminder tags removed // - []string: Extracted reminder texts (may be empty) func ExtractSystemReminders(content string) (string, []string) { - matches := config.RegExSystemReminder.FindAllStringSubmatch(content, -1) + matches := regex.SystemReminder.FindAllStringSubmatch(content, -1) var reminders []string for _, m := range matches { if len(m) > 1 && m[1] != "" { reminders = append(reminders, m[1]) } } - cleaned := config.RegExSystemReminder.ReplaceAllString(content, "") + cleaned := regex.SystemReminder.ReplaceAllString(content, "") return cleaned, reminders } @@ -445,35 +452,24 @@ func ExtractSystemReminders(content string) (string, []string) { // - string: Content with code fences properly separated by blank lines func NormalizeCodeFences(content string) string { // Add newlines before code fences that follow text on the same line - result := config.RegExCodeFenceInline.ReplaceAllString(content, "$1\n\n$2") + result := regex.CodeFenceInline.ReplaceAllString(content, "$1\n\n$2") // Add newlines after code fences that are followed by text on the same line - result = config.RegExCodeFenceClose.ReplaceAllString(result, "$1\n\n$2") + result = regex.CodeFenceClose.ReplaceAllString(result, "$1\n\n$2") return result } -// FormatToolUse formats a tool invocation with its key parameters. -// -// Extracts the most relevant parameter based on tool type (e.g., file path -// for Read/Write, command for Bash, pattern for Grep). -// -// Parameters: -// - t: Tool use to format -// -// Returns: -// - string: Formatted string like "Read: /path/to/file" or just tool name -// // toolDisplayKey maps tool names to the JSON input key that best // describes each invocation. 
var toolDisplayKey = map[string]string{ - ToolRead: "file_path", - ToolWrite: "file_path", - ToolEdit: "file_path", - ToolBash: "command", - ToolGrep: "pattern", - ToolGlob: "pattern", - ToolWebFetch: "url", - ToolWebSearch: "query", - ToolTask: "description", + assets.ToolRead: assets.ToolInputFilePath, + assets.ToolWrite: assets.ToolInputFilePath, + assets.ToolEdit: assets.ToolInputFilePath, + assets.ToolBash: assets.ToolInputCommand, + assets.ToolGrep: assets.ToolInputPattern, + assets.ToolGlob: assets.ToolInputPattern, + assets.ToolWebFetch: assets.ToolInputURL, + assets.ToolWebSearch: assets.ToolInputQuery, + assets.ToolTask: assets.ToolInputDescription, } // FormatToolUse formats a tool invocation with its key parameters. @@ -496,8 +492,8 @@ func FormatToolUse(t parser.ToolUse) string { if !ok { return t.Name } - if t.Name == ToolBash && len(val) > 100 { - val = val[:100] + "..." + if t.Name == assets.ToolBash && len(val) > assets.ToolDisplayMaxLen { + val = val[:assets.ToolDisplayMaxLen] + token.Ellipsis } - return fmt.Sprintf("%s: %s", t.Name, val) + return fmt.Sprintf(assets.TplToolDisplay, t.Name, val) } diff --git a/internal/cli/recall/core/index.go b/internal/cli/recall/core/index.go index 3e18181f..c448fa03 100644 --- a/internal/cli/recall/core/index.go +++ b/internal/cli/recall/core/index.go @@ -12,7 +12,12 @@ import ( "path/filepath" "strings" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/config/file" + "github.com/ActiveMemory/ctx/internal/config/fs" + "github.com/ActiveMemory/ctx/internal/config/journal" + "github.com/ActiveMemory/ctx/internal/config/token" + "github.com/ActiveMemory/ctx/internal/validation" ) // BuildSessionIndex scans journal .md files in journalDir and returns a @@ -38,7 +43,7 @@ func BuildSessionIndex(journalDir string) map[string]string { } for _, e := range entries { - if e.IsDir() || !strings.HasSuffix(e.Name(), config.ExtMarkdown) { + 
if e.IsDir() || !strings.HasSuffix(e.Name(), file.ExtMarkdown) { continue } @@ -63,7 +68,7 @@ func BuildSessionIndex(journalDir string) map[string]string { // Filename format: YYYY-MM-DD-slug-SHORTID.md or ...-pN.md name := e.Name() // Strip multipart suffix (e.g., "-p2.md" → config.ExtMarkdown). - baseName := strings.TrimSuffix(name, config.ExtMarkdown) + baseName := strings.TrimSuffix(name, file.ExtMarkdown) if idx := strings.LastIndex(baseName, "-p"); idx > 0 { suffix := baseName[idx+2:] allDigits := true @@ -81,8 +86,8 @@ func BuildSessionIndex(journalDir string) map[string]string { } // Extract the last 8 chars before .md as candidate short ID. - if len(baseName) >= config.RecallShortIDLen { - shortID := baseName[len(baseName)-config.RecallShortIDLen:] + if len(baseName) >= journal.ShortIDLen { + shortID := baseName[len(baseName)-journal.ShortIDLen:] // Store with the short ID as key (caller matches against // session.ID[:8]). if _, exists := index[shortID]; !exists { @@ -105,13 +110,13 @@ func BuildSessionIndex(journalDir string) map[string]string { // Returns: // - string: The session ID, or "" if not found func ExtractSessionID(content string) string { - nl := config.NewlineLF - fmOpen := config.Separator + nl + nl := token.NewlineLF + fmOpen := token.Separator + nl if !strings.HasPrefix(content, fmOpen) { return "" } - end := strings.Index(content[len(fmOpen):], nl+config.Separator+nl) + end := strings.Index(content[len(fmOpen):], nl+token.Separator+nl) if end < 0 { return "" } @@ -119,8 +124,9 @@ func ExtractSessionID(content string) string { for _, line := range strings.Split(fmBlock, nl) { line = strings.TrimSpace(line) - if strings.HasPrefix(line, "session_id:") { - val := strings.TrimSpace(strings.TrimPrefix(line, "session_id:")) + prefix := assets.FmKeySessionID + token.Colon + if strings.HasPrefix(line, prefix) { + val := strings.TrimSpace(strings.TrimPrefix(line, prefix)) // Strip surrounding quotes. 
val = strings.Trim(val, `"'`) return val @@ -145,8 +151,8 @@ func LookupSessionFile(index map[string]string, sessionID string) string { return name } short := sessionID - if len(short) > config.RecallShortIDLen { - short = short[:config.RecallShortIDLen] + if len(short) > journal.ShortIDLen { + short = short[:journal.ShortIDLen] } if name, ok := index[short]; ok { return name @@ -163,19 +169,19 @@ func LookupSessionFile(index map[string]string, sessionID string) string { // Returns: // - string: The field value (unquoted), or "" if not found func ExtractFrontmatterField(content, field string) string { - nl := config.NewlineLF - fmOpen := config.Separator + nl + nl := token.NewlineLF + fmOpen := token.Separator + nl if !strings.HasPrefix(content, fmOpen) { return "" } - end := strings.Index(content[len(fmOpen):], nl+config.Separator+nl) + end := strings.Index(content[len(fmOpen):], nl+token.Separator+nl) if end < 0 { return "" } fmBlock := content[len(fmOpen) : len(fmOpen)+end] - prefix := field + ":" + prefix := field + token.Colon for _, line := range strings.Split(fmBlock, nl) { line = strings.TrimSpace(line) if strings.HasPrefix(line, prefix) { @@ -201,16 +207,16 @@ func ExtractFrontmatterField(content, field string) string { // - numParts: Expected number of parts (used for nav link updates) func RenameJournalFiles(journalDir, oldBase, newBase string, numParts int) { // Rename base file. - oldPath := filepath.Join(journalDir, oldBase+config.ExtMarkdown) - newPath := filepath.Join(journalDir, newBase+config.ExtMarkdown) + oldPath := filepath.Join(journalDir, oldBase+file.ExtMarkdown) + newPath := filepath.Join(journalDir, newBase+file.ExtMarkdown) if _, statErr := os.Stat(oldPath); statErr == nil { _ = os.Rename(oldPath, newPath) } // Rename multipart files and update nav links. 
for p := 2; p <= numParts; p++ { - oldPart := filepath.Join(journalDir, fmt.Sprintf("%s-p%d%s", oldBase, p, config.ExtMarkdown)) - newPart := filepath.Join(journalDir, fmt.Sprintf("%s-p%d%s", newBase, p, config.ExtMarkdown)) + oldPart := filepath.Join(journalDir, fmt.Sprintf(assets.TplRecallPartFilename, oldBase, p)) + newPart := filepath.Join(journalDir, fmt.Sprintf(assets.TplRecallPartFilename, newBase, p)) if _, statErr := os.Stat(oldPart); statErr == nil { _ = os.Rename(oldPart, newPart) } @@ -233,10 +239,10 @@ func UpdateNavLinks(journalDir, newBase, oldBase string, numParts int) { return } - files := []string{filepath.Join(journalDir, newBase+config.ExtMarkdown)} + files := []string{filepath.Join(journalDir, newBase+file.ExtMarkdown)} for p := 2; p <= numParts; p++ { files = append(files, filepath.Join(journalDir, - fmt.Sprintf("%s-p%d%s", newBase, p, config.ExtMarkdown))) + fmt.Sprintf(assets.TplRecallPartFilename, newBase, p))) } for _, f := range files { @@ -246,7 +252,7 @@ func UpdateNavLinks(journalDir, newBase, oldBase string, numParts int) { } updated := strings.ReplaceAll(string(data), oldBase, newBase) if updated != string(data) { - _ = os.WriteFile(f, []byte(updated), config.PermFile) //nolint:gosec // same permissions + _ = validation.WriteFile(f, []byte(updated), fs.PermFile) } } } diff --git a/internal/cli/recall/core/lock.go b/internal/cli/recall/core/lock.go index 956f79aa..553565a3 100644 --- a/internal/cli/recall/core/lock.go +++ b/internal/cli/recall/core/lock.go @@ -7,22 +7,29 @@ package core import ( - "fmt" "os" "path/filepath" "strings" + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/config/cli" + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/config/file" + "github.com/ActiveMemory/ctx/internal/config/fs" + "github.com/ActiveMemory/ctx/internal/config/token" "github.com/spf13/cobra" - "github.com/ActiveMemory/ctx/internal/config" ctxerr 
"github.com/ActiveMemory/ctx/internal/err" "github.com/ActiveMemory/ctx/internal/journal/state" "github.com/ActiveMemory/ctx/internal/rc" + "github.com/ActiveMemory/ctx/internal/validation" + "github.com/ActiveMemory/ctx/internal/write" ) // LockedFrontmatterLine is the YAML line inserted into frontmatter when // a journal entry is locked. -const LockedFrontmatterLine = "locked: true # managed by ctx" +var LockedFrontmatterLine = assets.FrontmatterLocked + token.Colon + " " + + cli.AnnotationTrue + " # managed by ctx" // MatchJournalFiles returns journal .md filenames matching the given // patterns. If all is true, returns every .md file in the directory. @@ -46,13 +53,13 @@ func MatchJournalFiles( if os.IsNotExist(readErr) { return nil, nil } - return nil, ctxerr.ReadDir("journal directory", readErr) + return nil, ctxerr.ReadJournalDir(readErr) } // Collect all .md filenames. var mdFiles []string for _, e := range entries { - if !e.IsDir() && strings.HasSuffix(e.Name(), config.ExtMarkdown) { + if !e.IsDir() && strings.HasSuffix(e.Name(), file.ExtMarkdown) { mdFiles = append(mdFiles, e.Name()) } } @@ -96,7 +103,7 @@ func MatchJournalFiles( // Returns: // - string: Base filename (without -pN suffix) func MultipartBase(filename string) string { - base := strings.TrimSuffix(filename, config.ExtMarkdown) + base := strings.TrimSuffix(filename, file.ExtMarkdown) if idx := strings.LastIndex(base, "-p"); idx > 0 { suffix := base[idx+2:] allDigits := true @@ -107,12 +114,15 @@ func MultipartBase(filename string) string { } } if allDigits && len(suffix) > 0 { - return base[:idx] + config.ExtMarkdown + return base[:idx] + file.ExtMarkdown } } return filename } +// lockedPrefix is the frontmatter key prefix for locked lines. +var lockedPrefix = assets.FrontmatterLocked + token.Colon + // UpdateLockFrontmatter inserts or removes the "locked: true" line in // a journal file's YAML frontmatter. The state file is the source of // truth; this is for human visibility only. 
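MultipartBase only strips a trailing "-pN" when everything after the "-p" is digits, so a slug that legitimately ends in "-p..." (say "notes-part.md") is left untouched. A standalone sketch of that check (the hard-coded ".md" stands in for file.ExtMarkdown):

```go
package main

import (
	"fmt"
	"strings"
)

// multipartBase maps "name-p2.md" back to "name.md", but leaves
// filenames whose "-p" suffix is not all digits unchanged.
func multipartBase(filename string) string {
	base := strings.TrimSuffix(filename, ".md")
	if idx := strings.LastIndex(base, "-p"); idx > 0 {
		suffix := base[idx+2:]
		allDigits := len(suffix) > 0
		for _, c := range suffix {
			if c < '0' || c > '9' {
				allDigits = false
				break
			}
		}
		if allDigits {
			return base[:idx] + ".md"
		}
	}
	return filename
}

func main() {
	fmt.Println(multipartBase("2026-01-15-slug-abc12345-p2.md"))
	fmt.Println(multipartBase("notes-part.md"))
}
```

Using LastIndex rather than Index matters for slugs that contain an earlier "-p" (e.g. "fix-parser-p3.md" should strip only the final "-p3").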
@@ -127,15 +137,15 @@ func UpdateLockFrontmatter(path string, lock bool) { } content := string(data) - nl := config.NewlineLF - fmOpen := config.Separator + nl + nl := token.NewlineLF + fmOpen := token.Separator + nl if !strings.HasPrefix(content, fmOpen) { // No frontmatter — nothing to modify. return } - closeIdx := strings.Index(content[len(fmOpen):], nl+config.Separator+nl) + closeIdx := strings.Index(content[len(fmOpen):], nl+token.Separator+nl) if closeIdx < 0 { return } @@ -145,27 +155,27 @@ func UpdateLockFrontmatter(path string, lock bool) { if lock { // Already has locked line? - if strings.Contains(fmBlock, config.FrontmatterLocked+":") { + if strings.Contains(fmBlock, lockedPrefix) { return } // Insert before closing ---. updated := content[:fmEnd] + nl + LockedFrontmatterLine + content[fmEnd:] - _ = os.WriteFile(path, []byte(updated), config.PermFile) + _ = validation.WriteFile(path, []byte(updated), fs.PermFile) } else { // Remove the locked line. lines := strings.Split(fmBlock, nl) var filtered []string for _, line := range lines { trimmed := strings.TrimSpace(line) - if strings.HasPrefix(trimmed, config.FrontmatterLocked+":") { + if strings.HasPrefix(trimmed, lockedPrefix) { continue } filtered = append(filtered, line) } newFM := strings.Join(filtered, nl) updated := content[:len(fmOpen)] + newFM + content[fmEnd:] - _ = os.WriteFile(path, []byte(updated), config.PermFile) + _ = validation.WriteFile(path, []byte(updated), fs.PermFile) } } @@ -184,14 +194,14 @@ func FrontmatterHasLocked(path string) bool { } content := string(data) - nl := config.NewlineLF - fmOpen := config.Separator + nl + nl := token.NewlineLF + fmOpen := token.Separator + nl if !strings.HasPrefix(content, fmOpen) { return false } - closeIdx := strings.Index(content[len(fmOpen):], nl+config.Separator+nl) + closeIdx := strings.Index(content[len(fmOpen):], nl+token.Separator+nl) if closeIdx < 0 { return false } @@ -200,15 +210,15 @@ func FrontmatterHasLocked(path string) bool { for 
_, line := range strings.Split(fmBlock, nl) {
 		trimmed := strings.TrimSpace(line)
-		if !strings.HasPrefix(trimmed, "locked:") {
+		if !strings.HasPrefix(trimmed, lockedPrefix) {
 			continue
 		}
-		val := strings.TrimSpace(strings.TrimPrefix(trimmed, "locked:"))
+		val := strings.TrimSpace(strings.TrimPrefix(trimmed, lockedPrefix))
 		// Strip inline comment (e.g. "true # managed by ctx").
 		if idx := strings.Index(val, "#"); idx >= 0 {
 			val = strings.TrimSpace(val[:idx])
 		}
-		return val == "true"
+		return val == cli.AnnotationTrue
 	}
 	return false
@@ -233,10 +243,10 @@ func RunLockUnlock(
 		return cmd.Help()
 	}
 	if len(args) > 0 && all {
-		return ctxerr.AllWithArgument("a pattern")
+		return ctxerr.AllWithPattern()
 	}

-	journalDir := filepath.Join(rc.ContextDir(), config.DirJournal)
+	journalDir := filepath.Join(rc.ContextDir(), dir.Journal)

 	jstate, loadErr := state.Load(journalDir)
 	if loadErr != nil {
@@ -250,16 +260,16 @@
 	}
 	if len(files) == 0 {
 		if all {
-			cmd.Println("No journal entries found.")
+			write.LockUnlockNone(cmd)
 		} else {
 			return ctxerr.NoEntriesMatch(strings.Join(args, ", "))
 		}
 		return nil
 	}

-	verb := config.FrontmatterLocked
+	verb := assets.FrontmatterLocked
 	if !lock {
-		verb = "unlocked"
+		verb = assets.Unlocked
 	}

 	count := 0
@@ -274,16 +284,16 @@
 		// Update state.
 		if lock {
-			jstate.Mark(filename, config.FrontmatterLocked)
+			jstate.Mark(filename, assets.FrontmatterLocked)
 		} else {
-			jstate.Clear(filename, config.FrontmatterLocked)
+			jstate.Clear(filename, assets.FrontmatterLocked)
 		}

 		// Update frontmatter for human visibility.
 		path := filepath.Join(journalDir, filename)
 		UpdateLockFrontmatter(path, lock)
-		cmd.Println(fmt.Sprintf("  ok %s (%s)", filename, verb))
+		write.LockUnlockEntry(cmd, filename, verb)
 		count++
 	}
@@ -291,11 +301,7 @@
 		return ctxerr.SaveJournalState(saveErr)
 	}

-	if count == 0 {
-		cmd.Println(fmt.Sprintf("No changes — all matched entries already %s.", verb))
-	} else {
-		cmd.Println(fmt.Sprintf("\n%s %d entry(s).", strings.Title(verb), count)) //nolint:staticcheck // strings.Title is fine for single words
-	}
+	write.LockUnlockSummary(cmd, verb, count)

 	return nil
 }
diff --git a/internal/cli/recall/core/lock_test.go b/internal/cli/recall/core/lock_test.go
index a54fb613..85d4d8f0 100644
--- a/internal/cli/recall/core/lock_test.go
+++ b/internal/cli/recall/core/lock_test.go
@@ -12,7 +12,7 @@ import (
 	"strings"
 	"testing"

-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
 )

 func TestMultipartBase(t *testing.T) {
@@ -57,7 +57,7 @@ func TestMatchJournalFiles_All(t *testing.T) {
 	dir := t.TempDir()
 	for _, name := range []string{"a.md", "b.md", "c.md", "state.json"} {
 		if writeErr := os.WriteFile(
-			filepath.Join(dir, name), []byte("x"), config.PermFile,
+			filepath.Join(dir, name), []byte("x"), fs.PermFile,
 		); writeErr != nil {
 			t.Fatal(writeErr)
 		}
@@ -80,7 +80,7 @@ func TestMatchJournalFiles_Pattern(t *testing.T) {
 	}
 	for _, name := range names {
 		if writeErr := os.WriteFile(
-			filepath.Join(dir, name), []byte("x"), config.PermFile,
+			filepath.Join(dir, name), []byte("x"), fs.PermFile,
 		); writeErr != nil {
 			t.Fatal(writeErr)
 		}
@@ -108,7 +108,7 @@ func TestMatchJournalFiles_MultipartExpands(t *testing.T) {
 	}
 	for _, name := range names {
 		if writeErr := os.WriteFile(
-			filepath.Join(dir, name), []byte("x"), config.PermFile,
+			filepath.Join(dir, name), []byte("x"), fs.PermFile,
 		); writeErr != nil {
 			t.Fatal(writeErr)
 		}
@@ -138,7 +138,7 @@ func TestUpdateLockFrontmatter_Lock(t *testing.T) {
 	dir := t.TempDir()
 	path := filepath.Join(dir, "test.md")
 	content := "---\ndate: \"2026-01-21\"\ntitle: \"Test\"\n---\n\n# Body\n"
-	if writeErr := os.WriteFile(path, []byte(content), config.PermFile); writeErr != nil {
+	if writeErr := os.WriteFile(path, []byte(content), fs.PermFile); writeErr != nil {
 		t.Fatal(writeErr)
 	}
@@ -161,7 +161,7 @@ func TestUpdateLockFrontmatter_Unlock(t *testing.T) {
 	path := filepath.Join(dir, "test.md")
 	content := "---\ndate: \"2026-01-21\"\n" + LockedFrontmatterLine + "\ntitle: \"Test\"\n---\n\n# Body\n"
-	if writeErr := os.WriteFile(path, []byte(content), config.PermFile); writeErr != nil {
+	if writeErr := os.WriteFile(path, []byte(content), fs.PermFile); writeErr != nil {
 		t.Fatal(writeErr)
 	}
@@ -183,7 +183,7 @@ func TestUpdateLockFrontmatter_NoFrontmatter(t *testing.T) {
 	dir := t.TempDir()
 	path := filepath.Join(dir, "test.md")
 	content := "# No frontmatter here\n\nJust a body.\n"
-	if writeErr := os.WriteFile(path, []byte(content), config.PermFile); writeErr != nil {
+	if writeErr := os.WriteFile(path, []byte(content), fs.PermFile); writeErr != nil {
 		t.Fatal(writeErr)
 	}
@@ -203,7 +203,7 @@ func TestUpdateLockFrontmatter_IdempotentLock(t *testing.T) {
 	path := filepath.Join(dir, "test.md")
 	content := "---\ndate: \"2026-01-21\"\n" + LockedFrontmatterLine + "\n---\n\n# Body\n"
-	if writeErr := os.WriteFile(path, []byte(content), config.PermFile); writeErr != nil {
+	if writeErr := os.WriteFile(path, []byte(content), fs.PermFile); writeErr != nil {
 		t.Fatal(writeErr)
 	}
@@ -267,7 +267,7 @@ func TestFrontmatterHasLocked(t *testing.T) {
 		t.Run(tt.name, func(t *testing.T) {
 			dir := t.TempDir()
 			path := filepath.Join(dir, "test.md")
-			if writeErr := os.WriteFile(path, []byte(tt.content), config.PermFile); writeErr != nil {
+			if writeErr := os.WriteFile(path, []byte(tt.content), fs.PermFile); writeErr != nil {
 				t.Fatal(writeErr)
 			}
diff --git a/internal/cli/recall/core/plan.go b/internal/cli/recall/core/plan.go
index 231498d1..e0bf3a3b 100644
--- a/internal/cli/recall/core/plan.go
+++ b/internal/cli/recall/core/plan.go
@@ -12,7 +12,9 @@ import (
 	"path/filepath"
 	"strings"

-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/journal"
 	"github.com/ActiveMemory/ctx/internal/journal/state"
 	"github.com/ActiveMemory/ctx/internal/recall/parser"
 )
@@ -49,7 +51,7 @@ func PlanExport(
 	}
 	totalMsgs := len(nonEmptyMsgs)
-	numParts := (totalMsgs + config.MaxMessagesPerPart - 1) / config.MaxMessagesPerPart
+	numParts := (totalMsgs + journal.MaxMessagesPerPart - 1) / journal.MaxMessagesPerPart
 	if numParts < 1 {
 		numParts = 1
 	}
@@ -60,18 +62,18 @@
 		oldPath := filepath.Join(journalDir, oldFile)
 		if data, readErr := os.ReadFile(filepath.Clean(oldPath)); readErr == nil {
 			existingTitle = ExtractFrontmatterField(
-				string(data), config.FrontmatterTitle,
+				string(data), assets.FrontmatterTitle,
 			)
 		}
 	}

 	slug, title := TitleSlug(s, existingTitle)
 	baseFilename := FormatJournalFilename(s, slug)
-	baseName := strings.TrimSuffix(baseFilename, config.ExtMarkdown)
+	baseName := strings.TrimSuffix(baseFilename, file.ExtMarkdown)

 	// Detect renames (dedup: old slug → new slug).
 	if oldFile := LookupSessionFile(sessionIndex, s.ID); oldFile != "" {
-		oldBase := strings.TrimSuffix(oldFile, config.ExtMarkdown)
+		oldBase := strings.TrimSuffix(oldFile, file.ExtMarkdown)
 		if oldBase != baseName {
 			plan.RenameOps = append(plan.RenameOps, RenameOp{
 				OldBase: oldBase,
@@ -85,12 +87,12 @@
 	for part := 1; part <= numParts; part++ {
 		filename := baseFilename
 		if numParts > 1 && part > 1 {
-			filename = fmt.Sprintf(config.TplRecallPartFilename, baseName, part)
+			filename = fmt.Sprintf(assets.TplRecallPartFilename, baseName, part)
 		}
 		path := filepath.Join(journalDir, filename)

-		startIdx := (part - 1) * config.MaxMessagesPerPart
-		endIdx := startIdx + config.MaxMessagesPerPart
+		startIdx := (part - 1) * journal.MaxMessagesPerPart
+		endIdx := startIdx + journal.MaxMessagesPerPart
 		if endIdx > totalMsgs {
 			endIdx = totalMsgs
 		}
@@ -109,7 +111,7 @@
 		case FrontmatterHasLocked(path):
 			// Frontmatter says locked — promote to state so future
 			// operations skip the file without reparsing.
-			jstate.Mark(filename, config.FrontmatterLocked)
+			jstate.Mark(filename, assets.FrontmatterLocked)
 			action = ActionLocked
 			plan.LockedCount++
 		case singleSession || opts.Regenerate || opts.DiscardFrontmatter():
diff --git a/internal/cli/recall/core/slug.go b/internal/cli/recall/core/slug.go
index ba7e1f50..f6c9c4ad 100644
--- a/internal/cli/recall/core/slug.go
+++ b/internal/cli/recall/core/slug.go
@@ -10,7 +10,9 @@ import (
 	"strings"
 	"unicode/utf8"

-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/journal"
+	"github.com/ActiveMemory/ctx/internal/config/regex"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"github.com/ActiveMemory/ctx/internal/recall/parser"
 )
@@ -30,7 +32,7 @@ const SlugMaxLen = 50
 // - string: Slugified string (may be empty if input is empty or all punctuation)
 func SlugifyTitle(title string) string {
 	// Strip the "..." truncation suffix from FirstUserMsg if present.
-	title = strings.TrimSuffix(title, "...")
+	title = strings.TrimSuffix(title, token.Ellipsis)

 	var sb strings.Builder
 	prevHyphen := false
@@ -75,12 +77,12 @@
 // Returns:
 // - string: Cleaned title string
 func CleanTitle(s string) string {
-	s = strings.TrimSuffix(s, "...")
-	s = config.RegExClaudeTag.ReplaceAllString(s, "")
+	s = strings.TrimSuffix(s, token.Ellipsis)
+	s = regex.SystemClaudeTag.ReplaceAllString(s, "")
 	var sb strings.Builder
 	prevSpace := false
 	for _, r := range s {
-		if r == '\n' || r == '\r' || r == '\t' {
+		if r == rune(token.NewlineLF[0]) || r == rune(token.NewlineCRLF[0]) || r == rune(token.Tab[0]) {
 			r = ' '
 		}
 		if r == ' ' {
@@ -95,10 +97,10 @@
 	}
 	out := strings.TrimSpace(sb.String())

-	// Truncate to RecallMaxTitleLen on a word boundary.
-	if utf8.RuneCountInString(out) > config.RecallMaxTitleLen {
+	// Truncate to MaxTitleLen on a word boundary.
+	if utf8.RuneCountInString(out) > journal.MaxTitleLen {
 		runes := []rune(out)
-		truncated := string(runes[:config.RecallMaxTitleLen])
+		truncated := string(runes[:journal.MaxTitleLen])
 		if idx := strings.LastIndex(truncated, " "); idx > 0 {
 			truncated = truncated[:idx]
 		}
@@ -149,8 +151,8 @@
 	}
 	short := s.ID
-	if len(short) > config.RecallShortIDLen {
-		short = short[:config.RecallShortIDLen]
+	if len(short) > journal.ShortIDLen {
+		short = short[:journal.ShortIDLen]
 	}
 	return short, ""
 }
diff --git a/internal/cli/recall/core/types.go b/internal/cli/recall/core/types.go
index 806bab0b..384c19ea 100644
--- a/internal/cli/recall/core/types.go
+++ b/internal/cli/recall/core/types.go
@@ -8,7 +8,6 @@ package core
 import (
 	"github.com/ActiveMemory/ctx/internal/recall/parser"
-	"github.com/ActiveMemory/ctx/internal/write"
 )

 // ExportAction describes what will happen to a given file.
@@ -23,18 +22,17 @@ const (
 // ExportOpts holds all flag values for the export command.
 type ExportOpts struct {
-	All, AllProjects, Force, Regenerate, Yes, DryRun bool
-	KeepFrontmatter bool
+	All, AllProjects, Regenerate, Yes, DryRun bool
+	KeepFrontmatter bool
 }

 // DiscardFrontmatter reports whether frontmatter should be discarded
-// during regeneration, based on the combination of --keep-frontmatter
-// and the deprecated --force flag.
+// during regeneration.
 //
 // Returns:
 // - bool: True if frontmatter should be discarded
 func (o ExportOpts) DiscardFrontmatter() bool {
-	return !o.KeepFrontmatter || o.Force
+	return !o.KeepFrontmatter
 }

 // FileAction describes the planned action for a single export file (one part
@@ -71,19 +69,3 @@ type RenameOp struct {
 	NewBase string
 	NumParts int
 }
-
-// PlanCounts converts an ExportPlan's counters to write.ExportCounts.
-//
-// Parameters:
-// - p: Export plan with counters
-//
-// Returns:
-// - write.ExportCounts: Formatted counters for output
-func PlanCounts(p ExportPlan) write.ExportCounts {
-	return write.ExportCounts{
-		New:    p.NewCount,
-		Regen:  p.RegenCount,
-		Skip:   p.SkipCount,
-		Locked: p.LockedCount,
-	}
-}
diff --git a/internal/cli/recall/core/validate.go b/internal/cli/recall/core/validate.go
index 0a65a635..307e599a 100644
--- a/internal/cli/recall/core/validate.go
+++ b/internal/cli/recall/core/validate.go
@@ -33,7 +33,7 @@ func EmptyMessage(msg parser.Message) bool {
 // - error: non-nil if flags conflict.
 func ValidateExportFlags(args []string, opts ExportOpts) error {
 	if len(args) > 0 && opts.All {
-		return ctxerr.AllWithArgument("a session ID")
+		return ctxerr.AllWithSessionID()
 	}
 	if opts.Regenerate && !opts.All {
 		return ctxerr.RegenerateRequiresAll()
diff --git a/internal/cli/recall/lock_test.go b/internal/cli/recall/lock_test.go
index af0abc07..869aba8d 100644
--- a/internal/cli/recall/lock_test.go
+++ b/internal/cli/recall/lock_test.go
@@ -13,14 +13,14 @@ import (
 	"testing"

 	"github.com/ActiveMemory/ctx/internal/cli/recall/core"
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
 	"github.com/ActiveMemory/ctx/internal/journal/state"
 )

 func TestRunLockUnlock_LockSingle(t *testing.T) {
 	dir := t.TempDir()
 	journalDir := filepath.Join(dir, ".context", "journal")
-	if err := os.MkdirAll(journalDir, config.PermExec); err != nil {
+	if err := os.MkdirAll(journalDir, fs.PermExec); err != nil {
 		t.Fatal(err)
 	}
@@ -28,7 +28,7 @@ func TestRunLockUnlock_LockSingle(t *testing.T) {
 	filename := "2026-01-21-test-abc12345.md"
 	content := "---\ndate: \"2026-01-21\"\n---\n\n# Test\n"
 	if err := os.WriteFile(
-		filepath.Join(journalDir, filename), []byte(content), config.PermFile,
+		filepath.Join(journalDir, filename), []byte(content), fs.PermFile,
 	); err != nil {
 		t.Fatal(err)
 	}
@@ -72,7 +72,7 @@ func TestRunLockUnlock_UnlockSingle(t *testing.T) {
 	dir := t.TempDir()
 	journalDir := filepath.Join(dir, ".context", "journal")
-	if err := os.MkdirAll(journalDir, config.PermExec); err != nil {
+	if err := os.MkdirAll(journalDir, fs.PermExec); err != nil {
 		t.Fatal(err)
 	}
@@ -80,7 +80,7 @@ func TestRunLockUnlock_UnlockSingle(t *testing.T) {
 	content := "---\ndate: \"2026-01-21\"\n" + core.LockedFrontmatterLine + "\n---\n\n# Test\n"
 	if err := os.WriteFile(
-		filepath.Join(journalDir, filename), []byte(content), config.PermFile,
+		filepath.Join(journalDir, filename), []byte(content), fs.PermFile,
 	); err != nil {
 		t.Fatal(err)
 	}
@@ -133,7 +133,7 @@ func TestRunLockUnlock_LockAll(t *testing.T) {
 	dir := t.TempDir()
 	journalDir := filepath.Join(dir, ".context", "journal")
-	if err := os.MkdirAll(journalDir, config.PermExec); err != nil {
+	if err := os.MkdirAll(journalDir, fs.PermExec); err != nil {
 		t.Fatal(err)
 	}
@@ -141,7 +141,7 @@ func TestRunLockUnlock_LockAll(t *testing.T) {
 	for _, f := range files {
 		content := "---\ndate: \"2026-01-21\"\n---\n\n# " + f + "\n"
 		if err := os.WriteFile(
-			filepath.Join(journalDir, f), []byte(content), config.PermFile,
+			filepath.Join(journalDir, f), []byte(content), fs.PermFile,
 		); err != nil {
 			t.Fatal(err)
 		}
@@ -176,7 +176,7 @@ func TestRunLockUnlock_AlreadyLocked(t *testing.T) {
 	dir := t.TempDir()
 	journalDir := filepath.Join(dir, ".context", "journal")
-	if err := os.MkdirAll(journalDir, config.PermExec); err != nil {
+	if err := os.MkdirAll(journalDir, fs.PermExec); err != nil {
 		t.Fatal(err)
 	}
@@ -184,7 +184,7 @@ func TestRunLockUnlock_AlreadyLocked(t *testing.T) {
 	content := "---\ndate: \"2026-01-21\"\n" + core.LockedFrontmatterLine + "\n---\n\n# Test\n"
 	if err := os.WriteFile(
-		filepath.Join(journalDir, filename), []byte(content), config.PermFile,
+		filepath.Join(journalDir, filename), []byte(content), fs.PermFile,
 	); err != nil {
 		t.Fatal(err)
 	}
@@ -253,7 +253,7 @@ func TestRunLockUnlock_AllWithPattern(t *testing.T) {
 func TestRunLockUnlock_LockMultipart(t *testing.T) {
 	dir := t.TempDir()
 	journalDir := filepath.Join(dir, ".context", "journal")
-	if err := os.MkdirAll(journalDir, config.PermExec); err != nil {
+	if err := os.MkdirAll(journalDir, fs.PermExec); err != nil {
 		t.Fatal(err)
 	}
@@ -262,7 +262,7 @@ func TestRunLockUnlock_LockMultipart(t *testing.T) {
 	for _, f := range []string{base, part2} {
 		content := "---\ndate: \"2026-01-21\"\n---\n\n# " + f + "\n"
 		if err := os.WriteFile(
-			filepath.Join(journalDir, f), []byte(content), config.PermFile,
+			filepath.Join(journalDir, f), []byte(content), fs.PermFile,
 		); err != nil {
 			t.Fatal(err)
 		}
diff --git a/internal/cli/recall/recall.go b/internal/cli/recall/recall.go
index c78ec1f3..8160a750 100644
--- a/internal/cli/recall/recall.go
+++ b/internal/cli/recall/recall.go
@@ -26,7 +26,7 @@ import (
 // Returns:
 // - *cobra.Command: The recall command with list, show, and serve subcommands
 func Cmd() *cobra.Command {
-	short, long := assets.CommandDesc("recall")
+	short, long := assets.CommandDesc(assets.CmdDescKeyRecall)

 	cmd := &cobra.Command{
 		Use: "recall",
diff --git a/internal/cli/recall/recall_test.go b/internal/cli/recall/recall_test.go
index 949b91f0..959e993e 100644
--- a/internal/cli/recall/recall_test.go
+++ b/internal/cli/recall/recall_test.go
@@ -120,7 +120,7 @@ func TestRecallExportCmd_Flags(t *testing.T) {
 	// Check flags (includes deprecated flags for backward compatibility).
 	flags := []string{
 		"all", "all-projects", "regenerate", "keep-frontmatter",
-		"yes", "dry-run", "force", "skip-existing",
+		"yes", "dry-run", "skip-existing",
 	}
 	for _, f := range flags {
 		if exportCmd.Flags().Lookup(f) == nil {
diff --git a/internal/cli/recall/run_test.go b/internal/cli/recall/run_test.go
index b93d5456..a431556c 100644
--- a/internal/cli/recall/run_test.go
+++ b/internal/cli/recall/run_test.go
@@ -15,6 +15,7 @@ import (
 	"testing"

 	"github.com/ActiveMemory/ctx/internal/cli/recall/core"
+	"github.com/ActiveMemory/ctx/internal/config/journal"
 	"github.com/ActiveMemory/ctx/internal/journal/state"
 )
@@ -400,7 +401,7 @@ func TestRunRecallExport_PreservesFrontmatter(t *testing.T) {
 	}
 }

-func TestRunRecallExport_ForceDiscardsFrontmatter(t *testing.T) {
+func TestRunRecallExport_KeepFrontmatterFalseDiscards(t *testing.T) {
 	tmpDir := t.TempDir()
 	t.Setenv("HOME", tmpDir)
@@ -436,8 +437,8 @@
 		t.Fatal(writeErr)
 	}

-	// Re-export with --force — should discard enriched frontmatter
-	exportHelper(t, tmpDir, "--force", "--yes")
+	// Re-export with --keep-frontmatter=false — should discard enriched frontmatter
+	exportHelper(t, tmpDir, "--keep-frontmatter=false", "--yes")

 	data, err := os.ReadFile(filepath.Clean(path))
 	if err != nil {
@@ -446,10 +447,10 @@
 	content := string(data)
 	if strings.Contains(content, "A curated summary") {
-		t.Error("--force should discard enriched frontmatter summary")
+		t.Error("--keep-frontmatter=false should discard enriched frontmatter summary")
 	}
 	if strings.Contains(content, "tags:") {
-		t.Error("--force should discard enriched frontmatter tags")
+		t.Error("--keep-frontmatter=false should discard enriched frontmatter tags")
 	}
 	// File should still have session content
 	if !strings.Contains(content, "session_id:") {
@@ -457,7 +458,7 @@
 	}
 }

-func TestRunRecallExport_ForceResetsEnrichmentState(t *testing.T) {
+func TestRunRecallExport_KeepFrontmatterFalseResetsEnrichmentState(t *testing.T) {
 	tmpDir := t.TempDir()
 	t.Setenv("HOME", tmpDir)
@@ -490,24 +491,24 @@
 	// Verify it's marked enriched
 	jstate, _ = state.Load(journalDir)
-	if !jstate.IsEnriched(mdFile) {
-		t.Fatal("file should be marked enriched before --force re-export")
+	if !jstate.Enriched(mdFile) {
+		t.Fatal("file should be marked enriched before re-export")
 	}

-	// Re-export with --force
-	exportHelper(t, tmpDir, "--force", "--yes")
+	// Re-export with --keep-frontmatter=false
+	exportHelper(t, tmpDir, "--keep-frontmatter=false", "--yes")

 	// Load state again and verify enriched was cleared
 	jstate, err = state.Load(journalDir)
 	if err != nil {
-		t.Fatalf("load state after force: %v", err)
+		t.Fatalf("load state after re-export: %v", err)
 	}
-	if jstate.IsEnriched(mdFile) {
-		t.Error("--force re-export should clear enriched state")
+	if jstate.Enriched(mdFile) {
+		t.Error("re-export with --keep-frontmatter=false should clear enriched state")
 	}

 	// Exported state should still be set
-	if !jstate.IsExported(mdFile) {
-		t.Error("file should still be marked exported after --force re-export")
+	if !jstate.Exported(mdFile) {
+		t.Error("file should still be marked exported after re-export")
 	}
 }
@@ -890,7 +891,7 @@ func TestRunRecallExport_LockedSkippedByDefault(t *testing.T) {
 	if loadErr != nil {
 		t.Fatalf("load state: %v", loadErr)
 	}
-	jstate.Mark(mdFile, "locked")
+	jstate.Mark(mdFile, journal.StageLocked)
 	if saveErr := jstate.Save(journalDir); saveErr != nil {
 		t.Fatalf("save state: %v", saveErr)
 	}
@@ -913,7 +914,7 @@
 	}
 }

-func TestRunRecallExport_LockedSkippedByForce(t *testing.T) {
+func TestRunRecallExport_LockedSkippedByKeepFrontmatterFalse(t *testing.T) {
 	tmpDir := t.TempDir()
 	t.Setenv("HOME", tmpDir)
@@ -940,26 +941,26 @@
 	if loadErr != nil {
 		t.Fatalf("load state: %v", loadErr)
 	}
-	jstate.Mark(mdFile, "locked")
+	jstate.Mark(mdFile, journal.StageLocked)
 	if saveErr := jstate.Save(journalDir); saveErr != nil {
 		t.Fatalf("save state: %v", saveErr)
 	}

 	// Overwrite.
-	custom := "locked content — force cannot override\n"
+	custom := "locked content — cannot override\n"
 	if writeErr := os.WriteFile(path, []byte(custom), 0600); writeErr != nil {
 		t.Fatal(writeErr)
 	}

-	// Even --force --yes should not overwrite a locked file.
-	exportHelper(t, tmpDir, "--force", "--yes")
+	// Even --keep-frontmatter=false --yes should not overwrite a locked file.
+	exportHelper(t, tmpDir, "--keep-frontmatter=false", "--yes")

 	data, readErr := os.ReadFile(filepath.Clean(path))
 	if readErr != nil {
 		t.Fatalf("read: %v", readErr)
 	}
 	if string(data) != custom {
-		t.Error("locked file should not be overwritten even with --force")
+		t.Error("locked file should not be overwritten even with --keep-frontmatter=false")
 	}
 }
@@ -1102,7 +1103,7 @@ func TestRunRecallExport_DryRunShowsLocked(t *testing.T) {
 	if loadErr != nil {
 		t.Fatalf("load state: %v", loadErr)
 	}
-	jstate.Mark(mdFile, "locked")
+	jstate.Mark(mdFile, journal.StageLocked)
 	if saveErr := jstate.Save(journalDir); saveErr != nil {
 		t.Fatalf("save state: %v", saveErr)
 	}
@@ -1143,16 +1144,6 @@ func TestDiscardFrontmatter(t *testing.T) {
 			opts: core.ExportOpts{KeepFrontmatter: false},
 			want: true,
 		},
-		{
-			name: "force overrides keep-frontmatter",
-			opts: core.ExportOpts{KeepFrontmatter: true, Force: true},
-			want: true,
-		},
-		{
-			name: "both false and force",
-			opts: core.ExportOpts{KeepFrontmatter: false, Force: true},
-			want: true,
-		},
 	}
 	for _, tt := range tests {
 		t.Run(tt.name, func(t *testing.T) {
diff --git a/internal/cli/recall/sync_test.go b/internal/cli/recall/sync_test.go
index d748ddbf..367176ae 100644
--- a/internal/cli/recall/sync_test.go
+++ b/internal/cli/recall/sync_test.go
@@ -12,14 +12,14 @@ import (
 	"strings"
 	"testing"

-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
 	"github.com/ActiveMemory/ctx/internal/journal/state"
 )

 func TestRunSync_LocksFromFrontmatter(t *testing.T) {
 	dir := t.TempDir()
 	journalDir := filepath.Join(dir, ".context", "journal")
-	if err := os.MkdirAll(journalDir, config.PermExec); err != nil {
+	if err := os.MkdirAll(journalDir, fs.PermExec); err != nil {
 		t.Fatal(err)
 	}
@@ -27,7 +27,7 @@ func TestRunSync_LocksFromFrontmatter(t *testing.T) {
 	filename := "2026-01-21-test-abc12345.md"
 	content := "---\ndate: \"2026-01-21\"\nlocked: true # managed by ctx\n---\n\n# Test\n"
 	if err := os.WriteFile(
-		filepath.Join(journalDir, filename), []byte(content), config.PermFile,
+		filepath.Join(journalDir, filename), []byte(content), fs.PermFile,
 	); err != nil {
 		t.Fatal(err)
 	}
@@ -65,7 +65,7 @@ func TestRunSync_LocksFromFrontmatter(t *testing.T) {
 func TestRunSync_UnlocksFromFrontmatter(t *testing.T) {
 	dir := t.TempDir()
 	journalDir := filepath.Join(dir, ".context", "journal")
-	if err := os.MkdirAll(journalDir, config.PermExec); err != nil {
+	if err := os.MkdirAll(journalDir, fs.PermExec); err != nil {
 		t.Fatal(err)
 	}
@@ -73,7 +73,7 @@ func TestRunSync_UnlocksFromFrontmatter(t *testing.T) {
 	filename := "2026-01-21-test-abc12345.md"
 	content := "---\ndate: \"2026-01-21\"\ntitle: \"Test\"\n---\n\n# Test\n"
 	if err := os.WriteFile(
-		filepath.Join(journalDir, filename), []byte(content), config.PermFile,
+		filepath.Join(journalDir, filename), []byte(content), fs.PermFile,
 	); err != nil {
 		t.Fatal(err)
 	}
@@ -122,7 +122,7 @@ func TestRunSync_NoChanges(t *testing.T) {
 	dir := t.TempDir()
 	journalDir := filepath.Join(dir, ".context", "journal")
-	if err := os.MkdirAll(journalDir, config.PermExec); err != nil {
+	if err := os.MkdirAll(journalDir, fs.PermExec); err != nil {
 		t.Fatal(err)
 	}
@@ -130,7 +130,7 @@ func TestRunSync_NoChanges(t *testing.T) {
 	filename := "2026-01-21-test-abc12345.md"
 	content := "---\ndate: \"2026-01-21\"\nlocked: true\n---\n\n# Test\n"
 	if err := os.WriteFile(
-		filepath.Join(journalDir, filename), []byte(content), config.PermFile,
+		filepath.Join(journalDir, filename), []byte(content), fs.PermFile,
 	); err != nil {
 		t.Fatal(err)
 	}
@@ -169,7 +169,7 @@ func TestRunSync_EmptyDir(t *testing.T) {
 	dir := t.TempDir()
 	journalDir := filepath.Join(dir, ".context", "journal")
-	if err := os.MkdirAll(journalDir, config.PermExec); err != nil {
+	if err := os.MkdirAll(journalDir, fs.PermExec); err != nil {
 		t.Fatal(err)
 	}
@@ -197,7 +197,7 @@ func TestRunSync_EmptyDir(t *testing.T) {
 func TestRunSync_MixedFiles(t *testing.T) {
 	dir := t.TempDir()
 	journalDir := filepath.Join(dir, ".context", "journal")
-	if err := os.MkdirAll(journalDir, config.PermExec); err != nil {
+	if err := os.MkdirAll(journalDir, fs.PermExec); err != nil {
 		t.Fatal(err)
 	}
@@ -205,7 +205,7 @@ func TestRunSync_MixedFiles(t *testing.T) {
 	fileA := "2026-01-21-test-aaa11111.md"
 	contentA := "---\ndate: \"2026-01-21\"\nlocked: true\n---\n\n# A\n"
 	if err := os.WriteFile(
-		filepath.Join(journalDir, fileA), []byte(contentA), config.PermFile,
+		filepath.Join(journalDir, fileA), []byte(contentA), fs.PermFile,
 	); err != nil {
 		t.Fatal(err)
 	}
@@ -214,7 +214,7 @@ func TestRunSync_MixedFiles(t *testing.T) {
 	fileB := "2026-01-22-test-bbb22222.md"
 	contentB := "---\ndate: \"2026-01-22\"\n---\n\n# B\n"
 	if err := os.WriteFile(
-		filepath.Join(journalDir, fileB), []byte(contentB), config.PermFile,
+		filepath.Join(journalDir, fileB), []byte(contentB), fs.PermFile,
 	); err != nil {
 		t.Fatal(err)
 	}
@@ -223,7 +223,7 @@ func TestRunSync_MixedFiles(t *testing.T) {
 	fileC := "2026-01-23-test-ccc33333.md"
 	contentC := "# C\n\nNo frontmatter.\n"
 	if err := os.WriteFile(
-		filepath.Join(journalDir, fileC), []byte(contentC), config.PermFile,
+		filepath.Join(journalDir, fileC), []byte(contentC), fs.PermFile,
 	); err != nil {
 		t.Fatal(err)
 	}
diff --git a/internal/cli/reindex/cmd/root/cmd.go b/internal/cli/reindex/cmd/root/cmd.go
index d37451a9..42c65804 100644
--- a/internal/cli/reindex/cmd/root/cmd.go
+++ b/internal/cli/reindex/cmd/root/cmd.go
@@ -5,3 +5,27 @@
 // SPDX-License-Identifier: Apache-2.0

 package root
+
+import (
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+)
+
+// Cmd returns the reindex convenience command.
+//
+// The reindex command regenerates the quick-reference index at the top of
+// both DECISIONS.md and LEARNINGS.md in a single invocation.
+//
+// Returns:
+// - *cobra.Command: The reindex command
+func Cmd() *cobra.Command {
+	short, long := assets.CommandDesc(assets.CmdDescKeyReindex)
+
+	return &cobra.Command{
+		Use:   "reindex",
+		Short: short,
+		Long:  long,
+		RunE:  Run,
+	}
+}
diff --git a/internal/cli/reindex/cmd/root/doc.go b/internal/cli/reindex/cmd/root/doc.go
new file mode 100644
index 00000000..7b5d79d1
--- /dev/null
+++ b/internal/cli/reindex/cmd/root/doc.go
@@ -0,0 +1,11 @@
+//    /     ctx: https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//    \     Copyright 2026-present Context contributors.
+//          SPDX-License-Identifier: Apache-2.0
+
+// Package root implements the ctx reindex command.
+//
+// It regenerates the quick-reference indices at the top of DECISIONS.md
+// and LEARNINGS.md.
+package root
diff --git a/internal/cli/reindex/cmd/root/run.go b/internal/cli/reindex/cmd/root/run.go
index d6603f54..9f7d914a 100644
--- a/internal/cli/reindex/cmd/root/run.go
+++ b/internal/cli/reindex/cmd/root/run.go
@@ -9,9 +9,9 @@ package root
 import (
 	"path/filepath"

+	"github.com/ActiveMemory/ctx/internal/config/ctx"
 	"github.com/spf13/cobra"

-	"github.com/ActiveMemory/ctx/internal/config"
 	"github.com/ActiveMemory/ctx/internal/index"
 	"github.com/ActiveMemory/ctx/internal/rc"
 )
@@ -28,24 +28,24 @@ func Run(cmd *cobra.Command, _ []string) error {
 	w := cmd.OutOrStdout()
 	ctxDir := rc.ContextDir()

-	decisionsPath := filepath.Join(ctxDir, config.FileDecision)
+	decisionsPath := filepath.Join(ctxDir, ctx.Decision)
 	decisionsErr := index.ReindexFile(
 		w,
 		decisionsPath,
-		config.FileDecision,
+		ctx.Decision,
 		index.UpdateDecisions,
-		config.EntryPlural[config.EntryDecision],
+		"decisions",
 	)
 	if decisionsErr != nil {
 		return decisionsErr
 	}

-	learningsPath := filepath.Join(ctxDir, config.FileLearning)
+	learningsPath := filepath.Join(ctxDir, ctx.Learning)
 	return index.ReindexFile(
 		w,
 		learningsPath,
-		config.FileLearning,
+		ctx.Learning,
 		index.UpdateLearnings,
-		config.EntryPlural[config.EntryLearning],
+		"learnings",
 	)
 }
diff --git a/internal/cli/reindex/reindex.go b/internal/cli/reindex/reindex.go
index a7bb94d4..b9b63f8e 100644
--- a/internal/cli/reindex/reindex.go
+++ b/internal/cli/reindex/reindex.go
@@ -9,24 +9,10 @@ package reindex
 import (
 	"github.com/spf13/cobra"

-	"github.com/ActiveMemory/ctx/internal/assets"
 	reindexroot "github.com/ActiveMemory/ctx/internal/cli/reindex/cmd/root"
 )

 // Cmd returns the reindex convenience command.
-//
-// The reindex command regenerates the quick-reference index at the top of
-// both DECISIONS.md and LEARNINGS.md in a single invocation.
-//
-// Returns:
-// - *cobra.Command: The reindex command
 func Cmd() *cobra.Command {
-	short, long := assets.CommandDesc("reindex")
-
-	return &cobra.Command{
-		Use:   "reindex",
-		Short: short,
-		Long:  long,
-		RunE:  reindexroot.Run,
-	}
+	return reindexroot.Cmd()
 }
diff --git a/internal/cli/reindex/reindex_test.go b/internal/cli/reindex/reindex_test.go
index b8e9bee6..64b06be4 100644
--- a/internal/cli/reindex/reindex_test.go
+++ b/internal/cli/reindex/reindex_test.go
@@ -11,7 +11,8 @@ import (
 	"path/filepath"
 	"testing"

-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/ctx"
+	"github.com/ActiveMemory/ctx/internal/config/dir"
 	"github.com/ActiveMemory/ctx/internal/rc"
 )
@@ -65,7 +66,7 @@ func TestRunReindex_BothFiles(t *testing.T) {
 	rc.Reset()
 	defer rc.Reset()

-	ctxDir := filepath.Join(tempDir, config.DirContext)
+	ctxDir := filepath.Join(tempDir, dir.Context)
 	_ = os.MkdirAll(ctxDir, 0750)

 	decisions := `# Decisions
@@ -77,7 +78,7 @@ func TestRunReindex_BothFiles(t *testing.T) {
 **Consequences:** Added yaml dependency
 `
 	_ = os.WriteFile(
-		filepath.Join(ctxDir, config.FileDecision),
+		filepath.Join(ctxDir, ctx.Decision),
 		[]byte(decisions), 0600,
 	)
@@ -90,7 +91,7 @@ func TestRunReindex_BothFiles(t *testing.T) {
 **Application:** Add validation to all handlers
 `
 	_ = os.WriteFile(
-		filepath.Join(ctxDir, config.FileLearning),
+		filepath.Join(ctxDir, ctx.Learning),
 		[]byte(learnings), 0600,
 	)
@@ -102,7 +103,7 @@ func TestRunReindex_BothFiles(t *testing.T) {
 	}

 	// Verify both files were updated
-	updatedDecisions, readErr := os.ReadFile(filepath.Join(ctxDir, config.FileDecision)) //nolint:gosec // test temp path
+	updatedDecisions, readErr := os.ReadFile(filepath.Join(ctxDir, ctx.Decision)) //nolint:gosec // test temp path
 	if readErr != nil {
 		t.Fatalf("failed to read updated DECISIONS.md: %v", readErr)
 	}
@@ -110,7 +111,7 @@
 		t.Error("updated DECISIONS.md is empty")
 	}

-	updatedLearnings, readErr := os.ReadFile(filepath.Join(ctxDir, config.FileLearning)) //nolint:gosec // test temp path
+	updatedLearnings, readErr := os.ReadFile(filepath.Join(ctxDir, ctx.Learning)) //nolint:gosec // test temp path
 	if readErr != nil {
 		t.Fatalf("failed to read updated LEARNINGS.md: %v", readErr)
 	}
@@ -128,12 +129,12 @@ func TestRunReindex_DecisionsMissingLearningsPresent(t *testing.T) {
 	rc.Reset()
 	defer rc.Reset()

-	ctxDir := filepath.Join(tempDir, config.DirContext)
+	ctxDir := filepath.Join(tempDir, dir.Context)
 	_ = os.MkdirAll(ctxDir, 0750)

 	// Only create LEARNINGS.md, not DECISIONS.md
 	_ = os.WriteFile(
-		filepath.Join(ctxDir, config.FileLearning),
+		filepath.Join(ctxDir, ctx.Learning),
 		[]byte("# Learnings\n"), 0600,
 	)
@@ -154,15 +155,15 @@ func TestRunReindex_EmptyFiles(t *testing.T) {
 	rc.Reset()
 	defer rc.Reset()

-	ctxDir := filepath.Join(tempDir, config.DirContext)
+	ctxDir := filepath.Join(tempDir, dir.Context)
 	_ = os.MkdirAll(ctxDir, 0750)

 	_ = os.WriteFile(
-		filepath.Join(ctxDir, config.FileDecision),
+		filepath.Join(ctxDir, ctx.Decision),
 		[]byte("# Decisions\n"), 0600,
 	)
 	_ = os.WriteFile(
-		filepath.Join(ctxDir, config.FileLearning),
+		filepath.Join(ctxDir, ctx.Learning),
 		[]byte("# Learnings\n"), 0600,
 	)
diff --git a/internal/cli/remind/cmd/add/cmd.go b/internal/cli/remind/cmd/add/cmd.go
index 6e4ad16a..064ecfc1 100644
--- a/internal/cli/remind/cmd/add/cmd.go
+++ b/internal/cli/remind/cmd/add/cmd.go
@@ -19,18 +19,20 @@ import (
 func Cmd() *cobra.Command {
 	var afterFlag string

-	short, _ := assets.CommandDesc("remind.add")
+	short, _ := assets.CommandDesc(assets.CmdDescKeyRemindAdd)

 	cmd := &cobra.Command{
 		Use:   "add TEXT",
 		Short: short,
 		Args:  cobra.ExactArgs(1),
 		RunE: func(cmd *cobra.Command, args []string) error {
-			return RunAdd(cmd, args[0], afterFlag)
+			return Run(cmd, args[0], afterFlag)
 		},
 	}

-	cmd.Flags().StringVarP(&afterFlag, "after", "a", "", assets.FlagDesc("remind.add.after"))
+	cmd.Flags().StringVarP(&afterFlag, "after", "a", "",
+		assets.FlagDesc(assets.FlagDescKeyRemindAddAfter),
+	)

 	return cmd
 }
diff --git a/internal/cli/remind/cmd/add/doc.go b/internal/cli/remind/cmd/add/doc.go
new file mode 100644
index 00000000..6a1d7e44
--- /dev/null
+++ b/internal/cli/remind/cmd/add/doc.go
@@ -0,0 +1,11 @@
+//    /     ctx: https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//    \     Copyright 2026-present Context contributors.
+//          SPDX-License-Identifier: Apache-2.0
+
+// Package add implements the ctx remind add subcommand.
+//
+// It creates a new reminder with optional deferred scheduling via the
+// --after flag.
+package add
diff --git a/internal/cli/remind/cmd/add/run.go b/internal/cli/remind/cmd/add/run.go
index 2f36da06..115c89dc 100644
--- a/internal/cli/remind/cmd/add/run.go
+++ b/internal/cli/remind/cmd/add/run.go
@@ -7,15 +7,17 @@
 package add

 import (
-	"fmt"
 	"time"

+	time2 "github.com/ActiveMemory/ctx/internal/config/time"
 	"github.com/spf13/cobra"

 	"github.com/ActiveMemory/ctx/internal/cli/remind/core"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
+	"github.com/ActiveMemory/ctx/internal/write"
 )

-// RunAdd creates a new reminder and prints confirmation.
+// Run creates a new reminder and prints confirmation.
 //
 // Exported for reuse by the parent command's default action.
 //
@@ -26,7 +28,7 @@
 //
 // Returns:
 // - error: Non-nil on read/write failure or invalid date
-func RunAdd(cmd *cobra.Command, message, after string) error {
+func Run(cmd *cobra.Command, message, after string) error {
 	reminders, readErr := core.ReadReminders()
 	if readErr != nil {
 		return readErr
@@ -38,8 +40,8 @@
 		Created: time.Now().UTC().Format(time.RFC3339),
 	}
 	if after != "" {
-		if _, parseErr := time.Parse("2006-01-02", after); parseErr != nil {
-			return fmt.Errorf("invalid date %q (expected YYYY-MM-DD)", after)
+		if _, parseErr := time.Parse(time2.DateFormat, after); parseErr != nil {
+			return ctxerr.InvalidDateValue(after)
 		}
 		r.After = &after
 	}
@@ -49,10 +51,6 @@
 		return writeErr
 	}

-	suffix := ""
-	if r.After != nil {
-		suffix = fmt.Sprintf(" (after %s)", *r.After)
-	}
-	cmd.Println(fmt.Sprintf("  + [%d] %s%s", r.ID, r.Message, suffix))
+	write.ReminderAdded(cmd, r.ID, r.Message, r.After)

 	return nil
 }
diff --git a/internal/cli/remind/cmd/dismiss/cmd.go b/internal/cli/remind/cmd/dismiss/cmd.go
index 080291e3..9d4bdd79 100644
--- a/internal/cli/remind/cmd/dismiss/cmd.go
+++ b/internal/cli/remind/cmd/dismiss/cmd.go
@@ -7,11 +7,10 @@
 package dismiss

 import (
-	"fmt"
-
 	"github.com/spf13/cobra"

 	"github.com/ActiveMemory/ctx/internal/assets"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 )

 // Cmd returns the remind dismiss subcommand.
@@ -21,7 +20,7 @@ import (
 func Cmd() *cobra.Command {
 	var allFlag bool

-	short, _ := assets.CommandDesc("remind.dismiss")
+	short, _ := assets.CommandDesc(assets.CmdDescKeyRemindDismiss)

 	cmd := &cobra.Command{
 		Use: "dismiss [ID]",
@@ -29,16 +28,18 @@
 		Short: short,
 		RunE: func(cmd *cobra.Command, args []string) error {
 			if allFlag {
-				return runDismissAll(cmd)
+				return RunDismissAll(cmd)
 			}
 			if len(args) == 0 {
-				return fmt.Errorf("provide a reminder ID or use --all")
+				return ctxerr.ReminderIDRequired()
 			}
-			return runDismiss(cmd, args[0])
+			return RunDismiss(cmd, args[0])
 		},
 	}

-	cmd.Flags().BoolVar(&allFlag, "all", false, assets.FlagDesc("remind.dismiss.all"))
+	cmd.Flags().BoolVar(&allFlag, "all", false,
+		assets.FlagDesc(assets.FlagDescKeyRemindDismissAll),
+	)

 	return cmd
 }
diff --git a/internal/cli/remind/cmd/dismiss/doc.go b/internal/cli/remind/cmd/dismiss/doc.go
new file mode 100644
index 00000000..17a095e7
--- /dev/null
+++ b/internal/cli/remind/cmd/dismiss/doc.go
@@ -0,0 +1,10 @@
+//    /     ctx: https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//    \     Copyright 2026-present Context contributors.
+//          SPDX-License-Identifier: Apache-2.0
+
+// Package dismiss implements the ctx remind dismiss subcommand.
+//
+// It marks one or all pending reminders as dismissed.
+package dismiss
diff --git a/internal/cli/remind/cmd/dismiss/run.go b/internal/cli/remind/cmd/dismiss/run.go
index 79115eb7..3b23fe22 100644
--- a/internal/cli/remind/cmd/dismiss/run.go
+++ b/internal/cli/remind/cmd/dismiss/run.go
@@ -7,15 +7,16 @@
 package dismiss

 import (
-	"fmt"
 	"strconv"

 	"github.com/spf13/cobra"

 	"github.com/ActiveMemory/ctx/internal/cli/remind/core"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
+	"github.com/ActiveMemory/ctx/internal/write"
 )

-// runDismiss removes a single reminder by ID and prints confirmation.
+// RunDismiss removes a single reminder by ID and prints confirmation.
 //
 // Parameters:
 // - cmd: Cobra command for output
@@ -23,10 +24,10 @@
 //
 // Returns:
 // - error: Non-nil on invalid ID, missing reminder, or write failure
-func runDismiss(cmd *cobra.Command, idStr string) error {
+func RunDismiss(cmd *cobra.Command, idStr string) error {
 	id, parseErr := strconv.Atoi(idStr)
 	if parseErr != nil {
-		return fmt.Errorf("invalid ID %q", idStr)
+		return ctxerr.InvalidID(idStr)
 	}

 	reminders, readErr := core.ReadReminders()
@@ -43,36 +44,36 @@
 	}

 	if found < 0 {
-		return fmt.Errorf("no reminder with ID %d", id)
+		return ctxerr.ReminderNotFound(id)
 	}

-	cmd.Println(fmt.Sprintf("  - [%d] %s", reminders[found].ID, reminders[found].Message))
+	write.ReminderDismissed(cmd, reminders[found].ID, reminders[found].Message)
 	reminders = append(reminders[:found], reminders[found+1:]...)

 	return core.WriteReminders(reminders)
}

-// runDismissAll removes all reminders and prints confirmation.
+// RunDismissAll removes all reminders and prints confirmation.
// // Parameters: // - cmd: Cobra command for output // // Returns: // - error: Non-nil on read or write failure -func runDismissAll(cmd *cobra.Command) error { +func RunDismissAll(cmd *cobra.Command) error { reminders, readErr := core.ReadReminders() if readErr != nil { return readErr } if len(reminders) == 0 { - cmd.Println("No reminders.") + write.ReminderNone(cmd) return nil } for _, r := range reminders { - cmd.Println(fmt.Sprintf(" - [%d] %s", r.ID, r.Message)) + write.ReminderDismissed(cmd, r.ID, r.Message) } - cmd.Println(fmt.Sprintf("Dismissed %d reminders.", len(reminders))) + write.ReminderDismissedAll(cmd, len(reminders)) return core.WriteReminders([]core.Reminder{}) } diff --git a/internal/cli/remind/cmd/list/cmd.go b/internal/cli/remind/cmd/list/cmd.go index 8be3b124..eb717fa6 100644 --- a/internal/cli/remind/cmd/list/cmd.go +++ b/internal/cli/remind/cmd/list/cmd.go @@ -17,14 +17,14 @@ import ( // Returns: // - *cobra.Command: Configured list subcommand func Cmd() *cobra.Command { - short, _ := assets.CommandDesc("remind.list") + short, _ := assets.CommandDesc(assets.CmdDescKeyRemindList) return &cobra.Command{ Use: "list", Aliases: []string{"ls"}, Short: short, RunE: func(cmd *cobra.Command, _ []string) error { - return RunList(cmd) + return Run(cmd) }, } } diff --git a/internal/cli/remind/cmd/list/doc.go b/internal/cli/remind/cmd/list/doc.go new file mode 100644 index 00000000..b37171a5 --- /dev/null +++ b/internal/cli/remind/cmd/list/doc.go @@ -0,0 +1,10 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package list implements the ctx remind list subcommand. +// +// It displays all pending reminders. 
+package list diff --git a/internal/cli/remind/cmd/list/run.go b/internal/cli/remind/cmd/list/run.go index 245756b1..c86bbb1a 100644 --- a/internal/cli/remind/cmd/list/run.go +++ b/internal/cli/remind/cmd/list/run.go @@ -7,15 +7,16 @@ package list import ( - "fmt" "time" + time2 "github.com/ActiveMemory/ctx/internal/config/time" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/remind/core" + "github.com/ActiveMemory/ctx/internal/write" ) -// RunList prints all pending reminders with date annotations. +// Run prints all pending reminders with date annotations. // // Exported for reuse by the parent command's default action. // @@ -24,26 +25,20 @@ import ( // // Returns: // - error: Non-nil on read failure -func RunList(cmd *cobra.Command) error { +func Run(cmd *cobra.Command) error { reminders, readErr := core.ReadReminders() if readErr != nil { return readErr } if len(reminders) == 0 { - cmd.Println("No reminders.") + write.ReminderNone(cmd) return nil } - today := time.Now().Format("2006-01-02") + today := time.Now().Format(time2.DateFormat) for _, r := range reminders { - annotation := "" - if r.After != nil { - if *r.After > today { - annotation = fmt.Sprintf(" (after %s, not yet due)", *r.After) - } - } - cmd.Println(fmt.Sprintf(" [%d] %s%s", r.ID, r.Message, annotation)) + write.ReminderItem(cmd, r.ID, r.Message, r.After, today) } return nil diff --git a/internal/cli/remind/core/store.go b/internal/cli/remind/core/store.go index b98ad737..bc20f413 100644 --- a/internal/cli/remind/core/store.go +++ b/internal/cli/remind/core/store.go @@ -9,12 +9,14 @@ package core import ( "encoding/json" "errors" - "fmt" "os" "path/filepath" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/fs" + "github.com/ActiveMemory/ctx/internal/config/reminder" + ctxerr "github.com/ActiveMemory/ctx/internal/err" "github.com/ActiveMemory/ctx/internal/rc" + "github.com/ActiveMemory/ctx/internal/validation" ) // Reminder 
represents a single session-scoped reminder. @@ -31,17 +33,16 @@ type Reminder struct { // - []Reminder: The parsed reminders (nil when file absent) // - error: Non-nil on read or parse failure func ReadReminders() ([]Reminder, error) { - path := RemindersPath() - data, readErr := os.ReadFile(path) //nolint:gosec // project-local path + data, readErr := validation.ReadUserFile(RemindersPath()) if readErr != nil { if errors.Is(readErr, os.ErrNotExist) { return nil, nil } - return nil, fmt.Errorf("read reminders: %w", readErr) + return nil, ctxerr.ReadReminders(readErr) } var reminders []Reminder if parseErr := json.Unmarshal(data, &reminders); parseErr != nil { - return nil, fmt.Errorf("parse reminders: %w", parseErr) + return nil, ctxerr.ParseReminders(parseErr) } return reminders, nil } @@ -58,7 +59,7 @@ func WriteReminders(reminders []Reminder) error { if marshalErr != nil { return marshalErr } - return os.WriteFile(RemindersPath(), data, config.PermFile) + return validation.WriteFile(RemindersPath(), data, fs.PermFile) } // NextID returns the next available reminder ID (max existing + 1). 
@@ -83,5 +84,5 @@ func NextID(reminders []Reminder) int { // Returns: // - string: Absolute path to reminders.json func RemindersPath() string { - return filepath.Join(rc.ContextDir(), config.FileReminders) + return filepath.Join(rc.ContextDir(), reminder.Reminders) } diff --git a/internal/cli/remind/remind.go b/internal/cli/remind/remind.go index 826a204d..dbe079f3 100644 --- a/internal/cli/remind/remind.go +++ b/internal/cli/remind/remind.go @@ -25,7 +25,7 @@ import ( func Cmd() *cobra.Command { var afterFlag string - short, long := assets.CommandDesc("remind") + short, long := assets.CommandDesc(assets.CmdDescKeyRemind) cmd := &cobra.Command{ Use: "remind [TEXT]", @@ -34,13 +34,15 @@ func Cmd() *cobra.Command { Args: cobra.ArbitraryArgs, RunE: func(cmd *cobra.Command, args []string) error { if len(args) > 0 { - return add.RunAdd(cmd, args[0], afterFlag) + return add.Run(cmd, args[0], afterFlag) } - return list.RunList(cmd) + return list.Run(cmd) }, } - cmd.Flags().StringVarP(&afterFlag, "after", "a", "", assets.FlagDesc("remind.after")) + cmd.Flags().StringVarP(&afterFlag, "after", "a", "", + assets.FlagDesc(assets.FlagDescKeyRemindAfter), + ) cmd.AddCommand(add.Cmd()) cmd.AddCommand(list.Cmd()) diff --git a/internal/cli/remind/remind_test.go b/internal/cli/remind/remind_test.go index e1d2a824..f570fea0 100644 --- a/internal/cli/remind/remind_test.go +++ b/internal/cli/remind/remind_test.go @@ -14,19 +14,19 @@ import ( "strings" "testing" + "github.com/ActiveMemory/ctx/internal/config/dir" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/remind/core" - "github.com/ActiveMemory/ctx/internal/config" "github.com/ActiveMemory/ctx/internal/rc" ) // setup creates a temp dir with a .context/ directory and sets the RC override. 
func setup(t *testing.T) string { t.Helper() - dir := t.TempDir() + tmpDir := t.TempDir() origDir, _ := os.Getwd() - if err := os.Chdir(dir); err != nil { + if err := os.Chdir(tmpDir); err != nil { t.Fatal(err) } t.Cleanup(func() { @@ -35,14 +35,14 @@ func setup(t *testing.T) string { }) rc.Reset() - rc.OverrideContextDir(config.DirContext) + rc.OverrideContextDir(dir.Context) - ctxDir := filepath.Join(dir, config.DirContext) + ctxDir := filepath.Join(tmpDir, dir.Context) if err := os.MkdirAll(ctxDir, 0750); err != nil { t.Fatal(err) } - return dir + return tmpDir } // runCmd executes a cobra command and captures its output. diff --git a/internal/cli/resume/cmd/root/cmd.go b/internal/cli/resume/cmd/root/cmd.go index d37451a9..c8e14504 100644 --- a/internal/cli/resume/cmd/root/cmd.go +++ b/internal/cli/resume/cmd/root/cmd.go @@ -5,3 +5,33 @@ // SPDX-License-Identifier: Apache-2.0 package root + +import ( + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the top-level "ctx resume" command. +// +// Returns: +// - *cobra.Command: Configured resume command +func Cmd() *cobra.Command { + short, long := assets.CommandDesc(assets.CmdDescKeyResume) + + cmd := &cobra.Command{ + Use: "resume", + Short: short, + Long: long, + RunE: func(cmd *cobra.Command, _ []string) error { + sessionID, _ := cmd.Flags().GetString("session-id") + return Run(cmd, sessionID) + }, + } + + cmd.Flags().String("session-id", "", + assets.FlagDesc(assets.FlagDescKeyResumeSessionId), + ) + + return cmd +} diff --git a/internal/cli/resume/cmd/root/doc.go b/internal/cli/resume/cmd/root/doc.go new file mode 100644 index 00000000..28d8e4f4 --- /dev/null +++ b/internal/cli/resume/cmd/root/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package root implements the ctx resume command. 
+// +// It resumes context hooks for the current session after they have been +// paused. +package root diff --git a/internal/cli/resume/cmd/root/run.go b/internal/cli/resume/cmd/root/run.go index dea2d2dd..9ca8668c 100644 --- a/internal/cli/resume/cmd/root/run.go +++ b/internal/cli/resume/cmd/root/run.go @@ -7,12 +7,12 @@ package root import ( - "fmt" "os" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/system/core" + "github.com/ActiveMemory/ctx/internal/write" ) // Run executes the resume command. @@ -28,6 +28,6 @@ func Run(cmd *cobra.Command, sessionID string) error { sessionID = core.ReadSessionID(os.Stdin) } core.Resume(sessionID) - cmd.Println(fmt.Sprintf("Context hooks resumed for session %s", sessionID)) + write.SessionResumed(cmd, sessionID) return nil } diff --git a/internal/cli/resume/resume.go b/internal/cli/resume/resume.go index 3331fee2..1bdfb0e2 100644 --- a/internal/cli/resume/resume.go +++ b/internal/cli/resume/resume.go @@ -9,26 +9,10 @@ package resume import ( "github.com/spf13/cobra" - "github.com/ActiveMemory/ctx/internal/assets" resumeroot "github.com/ActiveMemory/ctx/internal/cli/resume/cmd/root" ) // Cmd returns the top-level "ctx resume" command. 
-// -// Returns: -// - *cobra.Command: Configured resume command func Cmd() *cobra.Command { - short, long := assets.CommandDesc("resume") - - cmd := &cobra.Command{ - Use: "resume", - Short: short, - Long: long, - RunE: func(cmd *cobra.Command, _ []string) error { - sessionID, _ := cmd.Flags().GetString("session-id") - return resumeroot.Run(cmd, sessionID) - }, - } - cmd.Flags().String("session-id", "", assets.FlagDesc("resume.session-id")) - return cmd + return resumeroot.Cmd() } diff --git a/internal/cli/resume/resume_test.go b/internal/cli/resume/resume_test.go index eb5e5cb1..fd5ad8b4 100644 --- a/internal/cli/resume/resume_test.go +++ b/internal/cli/resume/resume_test.go @@ -14,19 +14,19 @@ import ( "testing" "github.com/ActiveMemory/ctx/internal/cli/system/core" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/dir" "github.com/ActiveMemory/ctx/internal/rc" ) func setupStateDir(t *testing.T) string { t.Helper() - dir := t.TempDir() - t.Setenv("CTX_DIR", dir) + tmpDir := t.TempDir() + t.Setenv("CTX_DIR", tmpDir) rc.Reset() - if mkErr := os.MkdirAll(filepath.Join(dir, config.DirState), 0o750); mkErr != nil { + if mkErr := os.MkdirAll(filepath.Join(tmpDir, dir.State), 0o750); mkErr != nil { t.Fatal(mkErr) } - return dir + return tmpDir } func TestCmd_WithSessionIDFlag(t *testing.T) { @@ -49,13 +49,13 @@ func TestCmd_WithSessionIDFlag(t *testing.T) { } func TestCmd_PauseResume_Roundtrip(t *testing.T) { - dir := setupStateDir(t) + tmpDir := setupStateDir(t) sessionID := "test-roundtrip" // Pause first — creates the marker file. 
core.Pause(sessionID) - markerPath := filepath.Join(dir, config.DirState, "ctx-paused-"+sessionID) + markerPath := filepath.Join(tmpDir, dir.State, "ctx-paused-"+sessionID) if _, statErr := os.Stat(markerPath); statErr != nil { t.Fatalf("pause marker should exist after Pause(): %v", statErr) } diff --git a/internal/cli/serve/cmd/root/cmd.go b/internal/cli/serve/cmd/root/cmd.go index d37451a9..620563da 100644 --- a/internal/cli/serve/cmd/root/cmd.go +++ b/internal/cli/serve/cmd/root/cmd.go @@ -5,3 +5,31 @@ // SPDX-License-Identifier: Apache-2.0 package root + +import ( + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the serve command. +// +// Serves a static site by invoking zensical serve on the specified directory. +// +// Returns: +// - *cobra.Command: The serve command +func Cmd() *cobra.Command { + short, long := assets.CommandDesc(assets.CmdDescKeyServe) + + cmd := &cobra.Command{ + Use: "serve [directory]", + Short: short, + Long: long, + Args: cobra.MaximumNArgs(1), + RunE: func(_ *cobra.Command, args []string) error { + return Run(args) + }, + } + + return cmd +} diff --git a/internal/cli/serve/cmd/root/doc.go b/internal/cli/serve/cmd/root/doc.go new file mode 100644 index 00000000..081a47ab --- /dev/null +++ b/internal/cli/serve/cmd/root/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package root implements the ctx serve command. +// +// It serves a static site locally by invoking zensical on the specified +// directory. +package root diff --git a/internal/cli/serve/cmd/root/err.go b/internal/cli/serve/cmd/root/err.go deleted file mode 100644 index b792cc33..00000000 --- a/internal/cli/serve/cmd/root/err.go +++ /dev/null @@ -1,50 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. 
-// SPDX-License-Identifier: Apache-2.0 - -package root - -import "fmt" - -// ErrDirNotFound returns an error when the serve directory does not exist. -// -// Parameters: -// - dir: Directory path that was not found -// -// Returns: -// - error: Formatted error with the missing path -func ErrDirNotFound(dir string) error { - return fmt.Errorf("directory not found: %s", dir) -} - -// ErrNotDir returns an error when the path exists but is not a directory. -// -// Parameters: -// - path: Path that is not a directory -// -// Returns: -// - error: Formatted error with the path -func ErrNotDir(path string) error { - return fmt.Errorf("not a directory: %s", path) -} - -// ErrNoSiteConfig returns an error when the zensical config file is missing. -// -// Parameters: -// - dir: Directory where the config was expected -// -// Returns: -// - error: Formatted error with the directory path -func ErrNoSiteConfig(dir string) error { - return fmt.Errorf("no zensical.toml found in %s", dir) -} - -// ErrZensicalNotFound returns an error when zensical is not installed. -// -// Returns: -// - error: Formatted error with install instructions -func ErrZensicalNotFound() error { - return fmt.Errorf("zensical not found. 
Install with: pipx install zensical (requires Python >= 3.10)") -} diff --git a/internal/cli/serve/cmd/root/run.go b/internal/cli/serve/cmd/root/run.go index 4658adaa..c14826a2 100644 --- a/internal/cli/serve/cmd/root/run.go +++ b/internal/cli/serve/cmd/root/run.go @@ -11,7 +11,9 @@ import ( "os/exec" "path/filepath" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/config/zensical" + ctxerr "github.com/ActiveMemory/ctx/internal/err" "github.com/ActiveMemory/ctx/internal/rc" ) @@ -24,41 +26,51 @@ import ( // - error: Non-nil if directory is invalid, config is missing, // or zensical is not found func Run(args []string) error { - var dir string + var d string if len(args) > 0 { - dir = args[0] + d = args[0] } else { - dir = filepath.Join(rc.ContextDir(), config.DirJournalSite) + d = filepath.Join(rc.ContextDir(), dir.JournalSite) } // Verify directory exists - info, statErr := os.Stat(dir) + info, statErr := os.Stat(d) if statErr != nil { - return ErrDirNotFound(dir) + return ctxerr.DirNotFound(d) } if !info.IsDir() { - return ErrNotDir(dir) + return ctxerr.NotDirectory(d) } // Check zensical.toml exists - tomlPath := filepath.Join(dir, config.FileZensicalToml) + tomlPath := filepath.Join(d, zensical.Toml) if _, statErr = os.Stat(tomlPath); os.IsNotExist(statErr) { - return ErrNoSiteConfig(dir) + return ctxerr.NoSiteConfig(d) } // Check if zensical is available - _, lookErr := exec.LookPath(config.BinZensical) + _, lookErr := exec.LookPath(zensical.Bin) if lookErr != nil { - return ErrZensicalNotFound() + return ctxerr.ZensicalNotFound() } - // Run zensical serve - zensical := exec.Command(config.BinZensical, "serve") //nolint:gosec // G204: args are constants - zensical.Dir = dir - zensical.Stdout = os.Stdout - zensical.Stderr = os.Stderr - zensical.Stdin = os.Stdin + return runZensical(d) +} + +// runZensical launches zensical serve in the given directory. 
+// +// Parameters: +// - dir: Working directory for the zensical process +// +// Returns: +// - error: Non-nil if the process fails +func runZensical(dir string) error { + z := exec.Command(zensical.Bin, "serve") //nolint:gosec // G204: args are constants + z.Dir = dir + z.Stdout = os.Stdout + z.Stderr = os.Stderr + z.Stdin = os.Stdin - return zensical.Run() + return z.Run() } diff --git a/internal/cli/serve/serve.go b/internal/cli/serve/serve.go index 97a95e21..9c4f0857 100644 --- a/internal/cli/serve/serve.go +++ b/internal/cli/serve/serve.go @@ -9,28 +9,10 @@ package serve import ( "github.com/spf13/cobra" - "github.com/ActiveMemory/ctx/internal/assets" serveroot "github.com/ActiveMemory/ctx/internal/cli/serve/cmd/root" ) // Cmd returns the serve command. -// -// Serves a static site by invoking zensical serve on the specified directory. -// -// Returns: -// - *cobra.Command: The serve command func Cmd() *cobra.Command { - short, long := assets.CommandDesc("serve") - - cmd := &cobra.Command{ - Use: "serve [directory]", - Short: short, - Long: long, - Args: cobra.MaximumNArgs(1), - RunE: func(_ *cobra.Command, args []string) error { - return serveroot.Run(args) - }, - } - - return cmd + return serveroot.Cmd() } diff --git a/internal/cli/serve/serve_test.go b/internal/cli/serve/serve_test.go index 90e306e5..2fe3d86b 100644 --- a/internal/cli/serve/serve_test.go +++ b/internal/cli/serve/serve_test.go @@ -13,7 +13,8 @@ import ( "testing" serveroot "github.com/ActiveMemory/ctx/internal/cli/serve/cmd/root" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/zensical" + ctxerr "github.com/ActiveMemory/ctx/internal/err" ) func TestCmd(t *testing.T) { @@ -108,7 +109,7 @@ func TestRunServe_ZensicalNotFound(t *testing.T) { defer func() { _ = os.RemoveAll(tmpDir) }() // Create a zensical.toml so we pass the config check - tomlPath := filepath.Join(tmpDir, config.FileZensicalToml) + tomlPath := filepath.Join(tmpDir, zensical.Toml) 
if err := os.WriteFile(tomlPath, []byte("[site]\n"), 0600); err != nil { t.Fatalf("failed to create zensical.toml: %v", err) } @@ -140,7 +141,7 @@ func TestRunServe_DefaultDir(t *testing.T) { } func TestErrDirNotFound(t *testing.T) { - err := serveroot.ErrDirNotFound("/some/path") + err := ctxerr.DirNotFound("/some/path") if err == nil { t.Fatal("expected non-nil error") } @@ -153,7 +154,7 @@ func TestErrDirNotFound(t *testing.T) { } func TestErrNotDir(t *testing.T) { - err := serveroot.ErrNotDir("/some/file") + err := ctxerr.NotDirectory("/some/file") if err == nil { t.Fatal("expected non-nil error") } @@ -166,7 +167,7 @@ func TestErrNotDir(t *testing.T) { } func TestErrNoSiteConfig(t *testing.T) { - err := serveroot.ErrNoSiteConfig("/some/dir") + err := ctxerr.NoSiteConfig("/some/dir") if err == nil { t.Fatal("expected non-nil error") } @@ -186,7 +187,7 @@ func TestRunServe_WithMockZensical(t *testing.T) { } defer func() { _ = os.RemoveAll(tmpDir) }() - tomlPath := filepath.Join(tmpDir, config.FileZensicalToml) + tomlPath := filepath.Join(tmpDir, zensical.Toml) if err := os.WriteFile(tomlPath, []byte("[site]\n"), 0600); err != nil { t.Fatalf("failed to create zensical.toml: %v", err) } @@ -232,7 +233,7 @@ func TestCmd_RunE(t *testing.T) { } func TestErrZensicalNotFound(t *testing.T) { - err := serveroot.ErrZensicalNotFound() + err := ctxerr.ZensicalNotFound() if err == nil { t.Fatal("expected non-nil error") } diff --git a/internal/cli/site/cmd/feed/cmd.go b/internal/cli/site/cmd/feed/cmd.go index b92068e0..d1d347cf 100644 --- a/internal/cli/site/cmd/feed/cmd.go +++ b/internal/cli/site/cmd/feed/cmd.go @@ -8,8 +8,7 @@ package feed import ( - "path/filepath" - + "github.com/ActiveMemory/ctx/internal/config/rss" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/assets" @@ -25,24 +24,24 @@ func Cmd() *cobra.Command { baseURL string ) - short, long := assets.CommandDesc("site.feed") + short, long := assets.CommandDesc(assets.CmdDescKeySiteFeed) cmd := 
&cobra.Command{ Use: "feed", Short: short, Long: long, RunE: func(cmd *cobra.Command, _ []string) error { - return runFeed(cmd, "docs/blog", out, baseURL) + return runFeed(cmd, rss.DefaultFeedInputDir, out, baseURL) }, } cmd.Flags().StringVarP( - &out, "out", "o", filepath.Join("site", "feed.xml"), - assets.FlagDesc("site.feed.out"), + &out, "out", "o", rss.DefaultFeedOutPath, + assets.FlagDesc(assets.FlagDescKeySiteFeedOut), ) cmd.Flags().StringVar( - &baseURL, "base-url", "https://ctx.ist", - assets.FlagDesc("site.feed.base-url"), + &baseURL, "base-url", rss.DefaultFeedBaseURL, + assets.FlagDesc(assets.FlagDescKeySiteFeedBaseUrl), ) return cmd diff --git a/internal/cli/site/cmd/feed/doc.go b/internal/cli/site/cmd/feed/doc.go new file mode 100644 index 00000000..11926c1b --- /dev/null +++ b/internal/cli/site/cmd/feed/doc.go @@ -0,0 +1,10 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package feed implements the ctx site feed subcommand. +// +// It generates an Atom/RSS feed from journal entries for site publishing. +package feed diff --git a/internal/cli/site/cmd/feed/run.go b/internal/cli/site/cmd/feed/run.go index bb04b7f8..9143152e 100644 --- a/internal/cli/site/cmd/feed/run.go +++ b/internal/cli/site/cmd/feed/run.go @@ -15,18 +15,12 @@ import ( "sort" "strings" + "github.com/ActiveMemory/ctx/internal/config/rss" + "github.com/ActiveMemory/ctx/internal/config/token" "github.com/spf13/cobra" "gopkg.in/yaml.v3" "github.com/ActiveMemory/ctx/internal/cli/site/core" - "github.com/ActiveMemory/ctx/internal/config" -) - -const ( - atomNS = "http://www.w3.org/2005/Atom" - feedTitle = "ctx blog" - defaultAuthor = "Context contributors" - xmlHeader = `<?xml version="1.0" encoding="UTF-8"?>` + "\n" ) // blogDatePattern matches filenames like 2026-02-25-slug.md. 
@@ -195,8 +189,8 @@ func parsePost(path, filename string) (blogPost, postStatus) { } content := string(data) - nl := config.NewlineLF - sep := config.Separator + nl := token.NewlineLF + sep := token.Separator if !strings.HasPrefix(content, sep+nl) { return blogPost{ @@ -286,13 +280,13 @@ func parsePost(path, filename string) (blogPost, postStatus) { // Returns: // - string: First paragraph text, or empty if none found func extractSummary(body string) string { - lines := strings.Split(body, config.NewlineLF) + lines := strings.Split(body, token.NewlineLF) foundHeading := false for _, line := range lines { trimmed := strings.TrimSpace(line) - if strings.HasPrefix(trimmed, config.PrefixHeading) { + if strings.HasPrefix(trimmed, token.PrefixHeading) { foundHeading = true continue } @@ -305,7 +299,7 @@ func extractSummary(body string) string { if trimmed == "" || strings.HasPrefix(trimmed, "!") || strings.HasPrefix(trimmed, "*") || - strings.HasPrefix(trimmed, config.PrefixHeading) { + strings.HasPrefix(trimmed, token.PrefixHeading) { continue } @@ -338,8 +332,8 @@ func generateAtom( } feed := core.AtomFeed{ - NS: atomNS, - Title: feedTitle, + NS: rss.FeedAtomNS, + Title: rss.FeedTitle, Links: []core.AtomLink{ {Href: blogURL}, {Href: feedURL, Rel: "self"}, @@ -369,7 +363,7 @@ func generateAtom( author := p.author if author == "" { - author = defaultAuthor + author = rss.FeedDefaultAuthor } entry.Author = &core.AtomAuthor{Name: author} @@ -393,7 +387,7 @@ func generateAtom( return fmt.Errorf("cannot marshal feed: %w", marshalErr) } - output := []byte(xmlHeader) + output := []byte(rss.FeedXMLHeader) output = append(output, xmlData...) 
output = append(output, '\n') diff --git a/internal/cli/site/site.go b/internal/cli/site/site.go index f128d6b8..7c3857a8 100644 --- a/internal/cli/site/site.go +++ b/internal/cli/site/site.go @@ -21,7 +21,7 @@ import ( // Returns: // - *cobra.Command: Parent command with site management subcommands func Cmd() *cobra.Command { - short, long := assets.CommandDesc("site") + short, long := assets.CommandDesc(assets.CmdDescKeySite) cmd := &cobra.Command{ Use: "site", diff --git a/internal/cli/status/cmd/root/cmd.go b/internal/cli/status/cmd/root/cmd.go index d37451a9..5556cff6 100644 --- a/internal/cli/status/cmd/root/cmd.go +++ b/internal/cli/status/cmd/root/cmd.go @@ -5,3 +5,46 @@ // SPDX-License-Identifier: Apache-2.0 package root + +import ( + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the status command. +// +// Flags: +// - --json: Output as JSON for machine parsing +// - --verbose, -v: Include file content previews +// +// Returns: +// - *cobra.Command: Configured status command with flags registered +func Cmd() *cobra.Command { + var ( + jsonOutput bool + verbose bool + ) + + short, long := assets.CommandDesc(assets.CmdDescKeyStatus) + + cmd := &cobra.Command{ + Use: "status", + Short: short, + Long: long, + RunE: func(cmd *cobra.Command, _ []string) error { + return Run(cmd, jsonOutput, verbose) + }, + } + + cmd.Flags().BoolVar( + &jsonOutput, + "json", false, assets.FlagDesc(assets.FlagDescKeyStatusJson), + ) + cmd.Flags().BoolVarP( + &verbose, "verbose", "v", false, + assets.FlagDesc(assets.FlagDescKeyStatusVerbose), + ) + + return cmd +} diff --git a/internal/cli/status/cmd/root/doc.go b/internal/cli/status/cmd/root/doc.go new file mode 100644 index 00000000..78c039a6 --- /dev/null +++ b/internal/cli/status/cmd/root/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +// Package root implements the ctx status command. +// +// It shows a context summary including file presence, token estimates, +// and task counts. +package root diff --git a/internal/cli/status/cmd/root/run.go b/internal/cli/status/cmd/root/run.go index 51f1518b..2781016b 100644 --- a/internal/cli/status/cmd/root/run.go +++ b/internal/cli/status/cmd/root/run.go @@ -8,12 +8,12 @@ package root import ( "errors" - "fmt" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/status/core" "github.com/ActiveMemory/ctx/internal/context" + ctxerr "github.com/ActiveMemory/ctx/internal/err" ) // Run executes the status command logic. @@ -30,7 +30,7 @@ func Run(cmd *cobra.Command, jsonOutput, verbose bool) error { if err != nil { var notFoundError *context.NotFoundError if errors.As(err, ¬FoundError) { - return fmt.Errorf("no .context/ directory found. Run 'ctx init' first") + return ctxerr.ContextNotInitialized() } return err } diff --git a/internal/cli/status/core/fmt.go b/internal/cli/status/core/fmt.go index 0a6cf82c..5a6569eb 100644 --- a/internal/cli/status/core/fmt.go +++ b/internal/cli/status/core/fmt.go @@ -7,8 +7,10 @@ package core import ( - "fmt" "time" + + ctxtime "github.com/ActiveMemory/ctx/internal/config/time" + "github.com/ActiveMemory/ctx/internal/write" ) // FormatTimeAgo returns a human-readable relative time string. 
@@ -23,69 +25,29 @@ import ( // - string: Human-readable relative time func FormatTimeAgo(t time.Time) string { d := time.Since(t) - - switch { - case d < time.Minute: - return "just now" - case d < time.Hour: - mins := int(d.Minutes()) - if mins == 1 { - return "1 minute ago" - } - return fmt.Sprintf("%d minutes ago", mins) - case d < 24*time.Hour: - hours := int(d.Hours()) - if hours == 1 { - return "1 hour ago" - } - return fmt.Sprintf("%d hours ago", hours) - case d < 7*24*time.Hour: - days := int(d.Hours() / 24) - if days == 1 { - return "1 day ago" - } - return fmt.Sprintf("%d days ago", days) - default: - return t.Format("Jan 2, 2006") - } + return write.FormatTimeAgo( + d.Hours(), int(d.Minutes()), t.Format(ctxtime.OlderFormat), + ) } // FormatNumber returns a number with thousand separators. // -// Examples: 500 -> "500", 1500 -> "1,500", 12345 -> "12,345" -// // Parameters: // - n: The number to format // // Returns: // - string: Formatted number with commas func FormatNumber(n int) string { - if n < 1000 { - return fmt.Sprintf("%d", n) - } - return fmt.Sprintf("%d,%03d", n/1000, n%1000) + return write.FormatNumber(n) } // FormatBytes returns a human-readable byte-size string. // -// Uses binary units (1024-based): B, KB, MB, GB, etc. 
-// -// Examples: 500 -> "500 B", 1536 -> "1.5 KB", 1048576 -> "1.0 MB" -// // Parameters: // - b: The byte count to format // // Returns: // - string: Human-readable size with unit func FormatBytes(b int64) string { - const unit = 1024 - if b < unit { - return fmt.Sprintf("%d B", b) - } - div, exp := int64(unit), 0 - for n := b / unit; n >= unit; n /= unit { - div *= unit - exp++ - } - return fmt.Sprintf("%.1f %cB", float64(b)/float64(div), "KMGTPE"[exp]) + return write.FormatBytes(b) } diff --git a/internal/cli/status/core/out.go b/internal/cli/status/core/out.go index df47afc6..757b0d76 100644 --- a/internal/cli/status/core/out.go +++ b/internal/cli/status/core/out.go @@ -8,12 +8,13 @@ package core import ( "encoding/json" - "fmt" "time" + ctxtime "github.com/ActiveMemory/ctx/internal/config/time" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/context" + "github.com/ActiveMemory/ctx/internal/write" ) // OutputStatusJSON writes context status as JSON to the command output. @@ -60,10 +61,6 @@ func OutputStatusJSON( // OutputStatusText writes context status as formatted text to the command output. // -// Displays a summary including file count, token estimate, file list with -// status indicators, and recent activity. When verbose is true, includes -// token counts, file sizes, and content previews for each file. 
-// // Parameters: // - cmd: Cobra command for output stream // - ctx: Loaded context to display @@ -74,61 +71,41 @@ func OutputStatusJSON( func OutputStatusText( cmd *cobra.Command, ctx *context.Context, verbose bool, ) error { - cmd.Println("Context Status") - cmd.Println("====================") - cmd.Println() - - cmd.Println(fmt.Sprintf("Context Directory: %s", ctx.Dir)) - cmd.Println(fmt.Sprintf("Total Files: %d", len(ctx.Files))) - cmd.Println(fmt.Sprintf( - "Token Estimate: %s tokens", FormatNumber(ctx.TotalTokens)), - ) - cmd.Println() - - cmd.Println("Files:") + write.StatusHeader(cmd, ctx.Dir, len(ctx.Files), ctx.TotalTokens) - // Sort files in a logical order sortedFiles := make([]context.FileInfo, len(ctx.Files)) copy(sortedFiles, ctx.Files) SortFilesByPriority(sortedFiles) for _, f := range sortedFiles { - var status string - var indicator string + fi := write.StatusFileInfo{ + Name: f.Name, + Tokens: f.Tokens, + Size: f.Size, + } if f.IsEmpty { - indicator = "○" - status = "empty" + fi.Indicator = "\u25cb" + fi.Status = "empty" } else { - indicator = "✓" - status = f.Summary + fi.Indicator = "\u2713" + fi.Status = f.Summary } - - if verbose { - // Verbose: show tokens and size - cmd.Println(fmt.Sprintf(" %s %s (%s) [%s tokens, %s]", - indicator, f.Name, status, - FormatNumber(f.Tokens), FormatBytes(f.Size))) - - // Show content preview for non-empty files - if !f.IsEmpty { - preview := ContentPreview(string(f.Content), 3) - for _, line := range preview { - cmd.Println(fmt.Sprintf(" (%s)", line)) - } - } - } else { - cmd.Println(fmt.Sprintf(" %s %s (%s)", indicator, f.Name, status)) + if verbose && !f.IsEmpty { + fi.Preview = ContentPreview(string(f.Content), 3) } + write.StatusFileItem(cmd, fi, verbose) } - // Recent activity - cmd.Println() - cmd.Println("Recent Activity:") recentFiles := GetRecentFiles(ctx.Files, 3) - for _, f := range recentFiles { - ago := FormatTimeAgo(f.ModTime) - cmd.Println(fmt.Sprintf(" - %s modified %s", f.Name, ago)) + 
entries := make([]write.StatusActivityInfo, len(recentFiles)) + for i, f := range recentFiles { + d := time.Since(f.ModTime) + entries[i] = write.StatusActivityInfo{ + Name: f.Name, + Ago: write.FormatTimeAgo(d.Hours(), int(d.Minutes()), f.ModTime.Format(ctxtime.OlderFormat)), + } } + write.StatusActivity(cmd, entries) return nil } diff --git a/internal/cli/status/core/preview.go b/internal/cli/status/core/preview.go index 554a19e9..0d8a9b3e 100644 --- a/internal/cli/status/core/preview.go +++ b/internal/cli/status/core/preview.go @@ -10,7 +10,9 @@ import ( "strings" "unicode/utf8" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/marker" + "github.com/ActiveMemory/ctx/internal/config/stats" + "github.com/ActiveMemory/ctx/internal/config/token" ) // ContentPreview returns the first n non-empty, meaningful lines @@ -26,7 +28,7 @@ import ( // Returns: // - []string: Up to n meaningful lines from the content func ContentPreview(content string, n int) []string { - lines := strings.Split(content, config.NewlineLF) + lines := strings.Split(content, token.NewlineLF) var preview []string inFrontmatter := false @@ -39,7 +41,7 @@ func ContentPreview(content string, n int) []string { } // Skip YAML frontmatter - if trimmed == config.Separator { + if trimmed == token.Separator { inFrontmatter = !inFrontmatter continue } @@ -48,15 +50,17 @@ func ContentPreview(content string, n int) []string { } // Skip HTML comments - if strings.HasPrefix(trimmed, config.CommentOpen) { + if strings.HasPrefix(trimmed, marker.CommentOpen) { continue } // Truncate long lines - if utf8.RuneCountInString(trimmed) > config.MaxPreviewLen { + if utf8.RuneCountInString(trimmed) > stats.MaxPreviewLen { runes := []rune(trimmed) - truncateAt := config.MaxPreviewLen - utf8.RuneCountInString(config.Ellipsis) - trimmed = string(runes[:truncateAt]) + config.Ellipsis + truncateAt := stats.MaxPreviewLen - utf8.RuneCountInString( + token.Ellipsis, + ) + trimmed = 
string(runes[:truncateAt]) + token.Ellipsis } preview = append(preview, trimmed) diff --git a/internal/cli/status/status.go b/internal/cli/status/status.go index fd3d6dee..46bb182e 100644 --- a/internal/cli/status/status.go +++ b/internal/cli/status/status.go @@ -9,42 +9,10 @@ package status import ( "github.com/spf13/cobra" - "github.com/ActiveMemory/ctx/internal/assets" statusroot "github.com/ActiveMemory/ctx/internal/cli/status/cmd/root" ) // Cmd returns the status command. -// -// Flags: -// - --json: Output as JSON for machine parsing -// - --verbose, -v: Include file content previews -// -// Returns: -// - *cobra.Command: Configured status command with flags registered func Cmd() *cobra.Command { - var ( - jsonOutput bool - verbose bool - ) - - short, long := assets.CommandDesc("status") - - cmd := &cobra.Command{ - Use: "status", - Short: short, - Long: long, - RunE: func(cmd *cobra.Command, _ []string) error { - return statusroot.Run(cmd, jsonOutput, verbose) - }, - } - - cmd.Flags().BoolVar( - &jsonOutput, - "json", false, assets.FlagDesc("status.json"), - ) - cmd.Flags().BoolVarP( - &verbose, "verbose", "v", false, assets.FlagDesc("status.verbose"), - ) - - return cmd + return statusroot.Cmd() } diff --git a/internal/cli/sync/cmd/root/cmd.go b/internal/cli/sync/cmd/root/cmd.go index d37451a9..ab2a2e99 100644 --- a/internal/cli/sync/cmd/root/cmd.go +++ b/internal/cli/sync/cmd/root/cmd.go @@ -5,3 +5,42 @@ // SPDX-License-Identifier: Apache-2.0 package root + +import ( + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the "ctx sync" command for reconciling context with codebase. +// +// The command scans the codebase for changes that should be reflected in +// context files, such as new directories, package manager files, and +// configuration files. 
+// +// Flags: +// - --dry-run: Show what would change without modifying files +// +// Returns: +// - *cobra.Command: Configured sync command with flags registered +func Cmd() *cobra.Command { + var dryRun bool + + short, long := assets.CommandDesc(assets.CmdDescKeySync) + + cmd := &cobra.Command{ + Use: "sync", + Short: short, + Long: long, + RunE: func(cmd *cobra.Command, _ []string) error { + return Run(cmd, dryRun) + }, + } + + cmd.Flags().BoolVar( + &dryRun, + "dry-run", false, assets.FlagDesc(assets.FlagDescKeySyncDryRun), + ) + + return cmd +} diff --git a/internal/cli/sync/cmd/root/doc.go b/internal/cli/sync/cmd/root/doc.go new file mode 100644 index 00000000..dbb4e1df --- /dev/null +++ b/internal/cli/sync/cmd/root/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package root implements the ctx sync command. +// +// It reconciles context with the codebase by scanning for changes that +// should be reflected in context files. +package root diff --git a/internal/cli/sync/cmd/root/run.go b/internal/cli/sync/cmd/root/run.go index 1e0a828e..6e8f62ce 100644 --- a/internal/cli/sync/cmd/root/run.go +++ b/internal/cli/sync/cmd/root/run.go @@ -8,12 +8,13 @@ package root import ( "errors" - "fmt" + "github.com/ActiveMemory/ctx/internal/write/sync" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/sync/core" "github.com/ActiveMemory/ctx/internal/context" + ctxerr "github.com/ActiveMemory/ctx/internal/err" ) // Run executes the sync command logic. @@ -33,7 +34,7 @@ func Run(cmd *cobra.Command, dryRun bool) error { if err != nil { var notFoundError *context.NotFoundError if errors.As(err, ¬FoundError) { - return fmt.Errorf("no .context/ directory found. 
Run 'ctx init' first") + return ctxerr.ContextNotInitialized() } return err } @@ -41,38 +42,17 @@ func Run(cmd *cobra.Command, dryRun bool) error { actions := core.DetectSyncActions(ctx) if len(actions) == 0 { - cmd.Println("✓ Context is in sync with codebase") + sync.CtxSyncInSync(cmd) return nil } - cmd.Println("Sync Analysis") - cmd.Println("=============") - cmd.Println() - - if dryRun { - cmd.Println("DRY RUN — No changes will be made") - cmd.Println() - } + sync.CtxSyncHeader(cmd, dryRun) for i, action := range actions { - cmd.Println(fmt.Sprintf("%d. [%s] %s", i+1, action.Type, action.Description)) - if action.Suggestion != "" { - cmd.Println(fmt.Sprintf(" Suggestion: %s", action.Suggestion)) - } - cmd.Println() + sync.CtxSyncAction(cmd, i+1, action.Type, action.Description, action.Suggestion) } - if dryRun { - cmd.Println(fmt.Sprintf( - "Found %d items to sync. Run without --dry-run to apply suggestions.", - len(actions), - )) - } else { - cmd.Println(fmt.Sprintf( - "Found %d items. 
Review and update context files manually.", - len(actions), - )) - } + sync.CtxSyncSummary(cmd, len(actions), dryRun) return nil } diff --git a/internal/cli/sync/core/core_test.go b/internal/cli/sync/core/core_test.go index ee40eb8a..0a1e23c4 100644 --- a/internal/cli/sync/core/core_test.go +++ b/internal/cli/sync/core/core_test.go @@ -13,7 +13,8 @@ import ( "testing" "github.com/ActiveMemory/ctx/internal/cli/initialize" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/ctx" + "github.com/ActiveMemory/ctx/internal/config/dir" "github.com/ActiveMemory/ctx/internal/context" ) @@ -36,21 +37,21 @@ func setupSyncDir(t *testing.T) string { } func TestDetectSyncActions_NoActions(t *testing.T) { - dir := setupSyncDir(t) + tmpDir := setupSyncDir(t) ctx, err := context.Load("") if err != nil { t.Fatalf("failed to load context: %v", err) } - _ = dir + _ = tmpDir actions := DetectSyncActions(ctx) // Just verify it runs without error _ = actions } func TestCheckNewDirectories_ImportantDirs(t *testing.T) { - dir := setupSyncDir(t) + tmpDir := setupSyncDir(t) ctx, err := context.Load("") if err != nil { @@ -59,7 +60,7 @@ func TestCheckNewDirectories_ImportantDirs(t *testing.T) { // Create important directories for _, d := range []string{"src", "lib", "pkg", "internal", "cmd", "api"} { - if mkErr := os.Mkdir(filepath.Join(dir, d), 0750); mkErr != nil { + if mkErr := os.Mkdir(filepath.Join(tmpDir, d), 0750); mkErr != nil { t.Fatal(mkErr) } } @@ -76,7 +77,7 @@ func TestCheckNewDirectories_ImportantDirs(t *testing.T) { } func TestCheckNewDirectories_SkipsHiddenAndVendor(t *testing.T) { - dir := setupSyncDir(t) + tmpDir := setupSyncDir(t) ctx, err := context.Load("") if err != nil { @@ -85,7 +86,7 @@ func TestCheckNewDirectories_SkipsHiddenAndVendor(t *testing.T) { // Create directories that should be skipped for _, d := range []string{".git", "node_modules", "vendor", "dist", "build"} { - if mkErr := os.Mkdir(filepath.Join(dir, d), 0750); 
mkErr != nil { + if mkErr := os.Mkdir(filepath.Join(tmpDir, d), 0750); mkErr != nil { t.Fatal(mkErr) } } @@ -101,10 +102,10 @@ func TestCheckNewDirectories_SkipsHiddenAndVendor(t *testing.T) { } func TestCheckNewDirectories_DocumentedDirsIgnored(t *testing.T) { - dir := setupSyncDir(t) + tmpDir := setupSyncDir(t) // Write ARCHITECTURE.md that mentions "src" - archPath := filepath.Join(dir, config.DirContext, config.FileArchitecture) + archPath := filepath.Join(tmpDir, dir.Context, ctx.Architecture) if err := os.WriteFile(archPath, []byte("# Architecture\n\nThe src directory contains...\n"), 0600); err != nil { t.Fatal(err) } @@ -114,7 +115,7 @@ func TestCheckNewDirectories_DocumentedDirsIgnored(t *testing.T) { t.Fatal(err) } - if mkErr := os.Mkdir(filepath.Join(dir, "src"), 0750); mkErr != nil { + if mkErr := os.Mkdir(filepath.Join(tmpDir, "src"), 0750); mkErr != nil { t.Fatal(mkErr) } @@ -141,16 +142,16 @@ func TestCheckPackageFiles_NoPackages(t *testing.T) { } func TestCheckPackageFiles_WithPackageFile(t *testing.T) { - dir := setupSyncDir(t) + tmpDir := setupSyncDir(t) // Remove any existing dependency docs so the check triggers - archPath := filepath.Join(dir, config.DirContext, config.FileArchitecture) + archPath := filepath.Join(tmpDir, dir.Context, ctx.Architecture) _ = os.WriteFile(archPath, []byte("# Architecture\n\nSimple app.\n"), 0600) - depsPath := filepath.Join(dir, config.DirContext, config.FileDependency) + depsPath := filepath.Join(tmpDir, dir.Context, ctx.Dependency) _ = os.Remove(depsPath) // Create a package.json - if err := os.WriteFile(filepath.Join(dir, "package.json"), []byte(`{"name":"test"}`), 0600); err != nil { + if err := os.WriteFile(filepath.Join(tmpDir, "package.json"), []byte(`{"name":"test"}`), 0600); err != nil { t.Fatal(err) } @@ -173,13 +174,13 @@ func TestCheckPackageFiles_WithPackageFile(t *testing.T) { } func TestCheckPackageFiles_WithDepsDoc(t *testing.T) { - dir := setupSyncDir(t) + tmpDir := setupSyncDir(t) // Create a 
package.json and DEPENDENCIES.md - if err := os.WriteFile(filepath.Join(dir, "package.json"), []byte(`{"name":"test"}`), 0600); err != nil { + if err := os.WriteFile(filepath.Join(tmpDir, "package.json"), []byte(`{"name":"test"}`), 0600); err != nil { t.Fatal(err) } - depsPath := filepath.Join(dir, config.DirContext, config.FileDependency) + depsPath := filepath.Join(tmpDir, dir.Context, ctx.Dependency) if err := os.WriteFile(depsPath, []byte("# Dependencies\n\nAll documented.\n"), 0600); err != nil { t.Fatal(err) } @@ -198,11 +199,11 @@ func TestCheckPackageFiles_WithDepsDoc(t *testing.T) { } func TestCheckConfigFiles_NoConfigs(t *testing.T) { - dir := setupSyncDir(t) + tmpDir := setupSyncDir(t) // Remove Makefile created by init (it matches the Makefile config pattern) - _ = os.Remove(filepath.Join(dir, "Makefile")) - _ = os.Remove(filepath.Join(dir, "Makefile.ctx")) + _ = os.Remove(filepath.Join(tmpDir, "Makefile")) + _ = os.Remove(filepath.Join(tmpDir, "Makefile.ctx")) ctx, err := context.Load("") if err != nil { @@ -217,10 +218,10 @@ func TestCheckConfigFiles_NoConfigs(t *testing.T) { } func TestCheckConfigFiles_WithConfigFile(t *testing.T) { - dir := setupSyncDir(t) + tmpDir := setupSyncDir(t) // Create a tsconfig.json - if err := os.WriteFile(filepath.Join(dir, "tsconfig.json"), []byte(`{}`), 0600); err != nil { + if err := os.WriteFile(filepath.Join(tmpDir, "tsconfig.json"), []byte(`{}`), 0600); err != nil { t.Fatal(err) } @@ -243,15 +244,15 @@ func TestCheckConfigFiles_WithConfigFile(t *testing.T) { } func TestCheckConfigFiles_DocumentedInConventions(t *testing.T) { - dir := setupSyncDir(t) + tmpDir := setupSyncDir(t) // Create tsconfig.json - if err := os.WriteFile(filepath.Join(dir, "tsconfig.json"), []byte(`{}`), 0600); err != nil { + if err := os.WriteFile(filepath.Join(tmpDir, "tsconfig.json"), []byte(`{}`), 0600); err != nil { t.Fatal(err) } // Write CONVENTIONS.md mentioning tsconfig - convPath := filepath.Join(dir, config.DirContext, 
config.FileConvention) + convPath := filepath.Join(tmpDir, dir.Context, ctx.Convention) if err := os.WriteFile(convPath, []byte("# Conventions\n\ntsconfig.json is configured for strict mode.\n"), 0600); err != nil { t.Fatal(err) } @@ -270,15 +271,15 @@ func TestCheckConfigFiles_DocumentedInConventions(t *testing.T) { } func TestCheckPackageFiles_ArchContainsDependencies(t *testing.T) { - dir := setupSyncDir(t) + tmpDir := setupSyncDir(t) // Create a go.mod - if err := os.WriteFile(filepath.Join(dir, "go.mod"), []byte("module test\n"), 0600); err != nil { + if err := os.WriteFile(filepath.Join(tmpDir, "go.mod"), []byte("module test\n"), 0600); err != nil { t.Fatal(err) } // Write ARCHITECTURE.md that mentions "dependencies" - archPath := filepath.Join(dir, config.DirContext, config.FileArchitecture) + archPath := filepath.Join(tmpDir, dir.Context, ctx.Architecture) if err := os.WriteFile(archPath, []byte("# Architecture\n\nProject dependencies are managed via go.mod.\n"), 0600); err != nil { t.Fatal(err) } @@ -299,20 +300,20 @@ func TestCheckPackageFiles_ArchContainsDependencies(t *testing.T) { func TestAction_Fields(t *testing.T) { a := Action{ Type: "NEW_DIR", - File: config.FileArchitecture, + File: ctx.Architecture, Description: "test description", Suggestion: "test suggestion", } - if a.Type != "NEW_DIR" || a.File != config.FileArchitecture { + if a.Type != "NEW_DIR" || a.File != ctx.Architecture { t.Error("action fields should be set correctly") } } func TestRunSync_ActionWithEmptySuggestion(t *testing.T) { - dir := setupSyncDir(t) + tmpDir := setupSyncDir(t) // Create important dir to trigger actions - if err := os.Mkdir(filepath.Join(dir, "services"), 0750); err != nil { + if err := os.Mkdir(filepath.Join(tmpDir, "services"), 0750); err != nil { t.Fatal(err) } diff --git a/internal/cli/sync/core/validate.go b/internal/cli/sync/core/validate.go index 065777b7..b7ee1914 100644 --- a/internal/cli/sync/core/validate.go +++ b/internal/cli/sync/core/validate.go @@ 
-12,7 +12,9 @@ import ( "path/filepath" "strings" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/assets" + ctxCfg "github.com/ActiveMemory/ctx/internal/config/ctx" + "github.com/ActiveMemory/ctx/internal/config/dep" "github.com/ActiveMemory/ctx/internal/context" ) @@ -31,11 +33,11 @@ import ( func CheckPackageFiles(ctx *context.Context) []Action { var actions []Action - for file, desc := range config.Packages { - if _, err := os.Stat(file); err == nil { + for f, desc := range dep.Packages { + if _, err := os.Stat(f); err == nil { // File exists, check if we have DEPENDENCIES.md or similar hasDepsDoc := false - if f := ctx.File(config.FileDependency); f != nil { + if f := ctx.File(ctxCfg.Dependency); f != nil { hasDepsDoc = true } else { for _, f := range ctx.Files { @@ -51,13 +53,15 @@ func CheckPackageFiles(ctx *context.Context) []Action { if !hasDepsDoc { actions = append(actions, Action{ Type: "DEPS", - File: config.FileArchitecture, + File: ctxCfg.Architecture, Description: fmt.Sprintf( - "Found %s (%s) but no dependency documentation", file, desc, + assets.TextDesc(assets.TextDescKeySyncDepsDescription), + f, desc, + ), + Suggestion: fmt.Sprintf( + assets.TextDesc(assets.TextDescKeySyncDepsSuggestion), + ctxCfg.Architecture, ctxCfg.Dependency, ), - Suggestion: "Consider documenting key dependencies " + - "in " + config.FileArchitecture + " or create " + - config.FileDependency, }) } } @@ -80,12 +84,12 @@ func CheckPackageFiles(ctx *context.Context) []Action { func CheckConfigFiles(ctx *context.Context) []Action { var actions []Action - for _, cfg := range config.Patterns { + for _, cfg := range assets.Patterns { matches, _ := filepath.Glob(cfg.Pattern) if len(matches) > 0 { // Check if CONVENTIONS.md mentions this var convContent string - if f := ctx.File(config.FileConvention); f != nil { + if f := ctx.File(ctxCfg.Convention); f != nil { convContent = strings.ToLower(string(f.Content)) } @@ -94,13 +98,14 @@ func 
CheckConfigFiles(ctx *context.Context) []Action { if convContent == "" || !strings.Contains(convContent, keyword) { actions = append(actions, Action{ Type: "CONFIG", - File: config.FileConvention, + File: ctxCfg.Convention, Description: fmt.Sprintf( - "Found %s but %s not documented", matches[0], cfg.Topic, + assets.TextDesc(assets.TextDescKeySyncConfigDescription), + matches[0], cfg.Topic, ), - Suggestion: fmt.Sprintf("Document %s in %s", - cfg.Topic, - config.FileConvention, + Suggestion: fmt.Sprintf( + assets.TextDesc(assets.TextDescKeySyncConfigSuggestion), + cfg.Topic, ctxCfg.Convention, ), }) } @@ -127,7 +132,7 @@ func CheckNewDirectories(ctx *context.Context) []Action { // Get ARCHITECTURE.md content var archContent string - if f := ctx.File(config.FileArchitecture); f != nil { + if f := ctx.File(ctxCfg.Architecture); f != nil { archContent = strings.ToLower(string(f.Content)) } @@ -168,12 +173,14 @@ func CheckNewDirectories(ctx *context.Context) []Action { if isImportant && !strings.Contains(archContent, name) { actions = append(actions, Action{ Type: "NEW_DIR", - File: config.FileArchitecture, + File: ctxCfg.Architecture, Description: fmt.Sprintf( - "Directory '%s/' exists but not documented", name, + assets.TextDesc(assets.TextDescKeySyncDirDescription), + name, ), Suggestion: fmt.Sprintf( - "Add '%s/' to %s with description", name, config.FileArchitecture, + assets.TextDesc(assets.TextDescKeySyncDirSuggestion), + name, ctxCfg.Architecture, ), }) } diff --git a/internal/cli/sync/sync.go b/internal/cli/sync/sync.go index ae134cd4..2e2de5c4 100644 --- a/internal/cli/sync/sync.go +++ b/internal/cli/sync/sync.go @@ -9,39 +9,10 @@ package sync import ( "github.com/spf13/cobra" - "github.com/ActiveMemory/ctx/internal/assets" syncroot "github.com/ActiveMemory/ctx/internal/cli/sync/cmd/root" ) // Cmd returns the "ctx sync" command for reconciling context with codebase. 
-// -// The command scans the codebase for changes that should be reflected in -// context files, such as new directories, package manager files, and -// configuration files. -// -// Flags: -// - --dry-run: Show what would change without modifying files -// -// Returns: -// - *cobra.Command: Configured sync command with flags registered func Cmd() *cobra.Command { - var dryRun bool - - short, long := assets.CommandDesc("sync") - - cmd := &cobra.Command{ - Use: "sync", - Short: short, - Long: long, - RunE: func(cmd *cobra.Command, _ []string) error { - return syncroot.Run(cmd, dryRun) - }, - } - - cmd.Flags().BoolVar( - &dryRun, - "dry-run", false, assets.FlagDesc("sync.dry-run"), - ) - - return cmd + return syncroot.Cmd() } diff --git a/internal/cli/system/cmd/backup/cmd.go b/internal/cli/system/cmd/backup/cmd.go index 68bd121e..2245bebb 100644 --- a/internal/cli/system/cmd/backup/cmd.go +++ b/internal/cli/system/cmd/backup/cmd.go @@ -7,6 +7,7 @@ package backup import ( + "github.com/ActiveMemory/ctx/internal/config/archive" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/assets" @@ -17,25 +18,23 @@ import ( // Returns: // - *cobra.Command: Configured backup subcommand func Cmd() *cobra.Command { + short, long := assets.CommandDesc(assets.CmdDescKeySystemBackup) + cmd := &cobra.Command{ Use: "backup", - Short: "Backup context and Claude data", - Long: `Create timestamped tar.gz archives of project context and/or global -Claude Code data. Optionally copies archives to an SMB share. - -Scopes: - project .context/, .claude/, ideas/, ~/.bashrc - global ~/.claude/ (excludes todos/) - all Both project and global (default) - -Environment: - CTX_BACKUP_SMB_URL - SMB share URL (e.g. 
smb://host/share) - CTX_BACKUP_SMB_SUBDIR - Subdirectory on share (default: ctx-sessions)`, + Short: short, + Long: long, RunE: func(cmd *cobra.Command, _ []string) error { - return runBackup(cmd) + return Run(cmd) }, } - cmd.Flags().String("scope", scopeAll, assets.FlagDesc("system.backup.scope")) - cmd.Flags().Bool("json", false, assets.FlagDesc("system.backup.json")) + + cmd.Flags().String("scope", archive.BackupScopeAll, + assets.FlagDesc(assets.FlagDescKeySystemBackupScope), + ) + cmd.Flags().Bool("json", false, + assets.FlagDesc(assets.FlagDescKeySystemBackupJson), + ) + return cmd } diff --git a/internal/cli/system/cmd/backup/doc.go b/internal/cli/system/cmd/backup/doc.go new file mode 100644 index 00000000..a602c029 --- /dev/null +++ b/internal/cli/system/cmd/backup/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package backup implements the ctx system backup subcommand. +// +// It creates timestamped tar.gz archives of project context and global +// Claude Code data, with optional SMB share copying. +package backup diff --git a/internal/cli/system/cmd/backup/run.go b/internal/cli/system/cmd/backup/run.go index 5e00c13d..9a1a08b2 100644 --- a/internal/cli/system/cmd/backup/run.go +++ b/internal/cli/system/cmd/backup/run.go @@ -7,91 +7,71 @@ package backup import ( - "archive/tar" - "compress/gzip" "encoding/json" - "fmt" - "io" - "io/fs" "os" - "path/filepath" "time" + "github.com/ActiveMemory/ctx/internal/config/archive" + "github.com/ActiveMemory/ctx/internal/config/env" + "github.com/ActiveMemory/ctx/internal/write/backup" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/system/core" - "github.com/ActiveMemory/ctx/internal/config" + ctxerr "github.com/ActiveMemory/ctx/internal/err" ) -// backupScope enumerates the valid --scope values. 
-const ( - scopeProject = "project" - scopeGlobal = "global" - scopeAll = "all" -) - -// archiveEntry describes a directory or file to include in a backup archive. -type archiveEntry struct { - // SourcePath is the absolute path to the directory or file. - SourcePath string - // Prefix is the path prefix inside the tar archive. - Prefix string - // ExcludeDir is a directory name to skip (e.g. "journal-site"). - ExcludeDir string - // Optional means a missing source is not an error. - Optional bool -} - -// backupResult holds the outcome of a single archive creation. -type backupResult struct { - Scope string `json:"scope"` - Archive string `json:"archive"` - Size int64 `json:"size"` - SMBDest string `json:"smb_dest,omitempty"` -} - -func runBackup(cmd *cobra.Command) error { +// Run executes the backup command logic. +// +// Creates timestamped tar.gz archives of project context and/or global +// Claude Code data. Optionally copies archives to an SMB share. +// +// Parameters: +// - cmd: Cobra command for output and flag access +// +// Returns: +// - error: Non-nil on invalid scope, home directory lookup failure, +// SMB parse error, or archive creation failure +func Run(cmd *cobra.Command) error { scope, _ := cmd.Flags().GetString("scope") jsonOut, _ := cmd.Flags().GetBool("json") switch scope { - case scopeProject, scopeGlobal, scopeAll: + case archive.BackupScopeProject, archive.BackupScopeGlobal, archive.BackupScopeAll: default: - return fmt.Errorf("invalid scope %q: must be project, global, or all", scope) + return ctxerr.InvalidBackupScope(scope) } home, homeErr := os.UserHomeDir() if homeErr != nil { - return fmt.Errorf("determine home directory: %w", homeErr) + return ctxerr.HomeDir(homeErr) } - // Parse SMB config if configured. 
- smbURL := os.Getenv(config.EnvBackupSMBURL) - smbSubdir := os.Getenv(config.EnvBackupSMBSubdir) + smbURL := os.Getenv(env.BackupSMBURL) + smbSubdir := os.Getenv(env.BackupSMBSubdir) var smb *core.SMBConfig if smbURL != "" { var smbErr error smb, smbErr = core.ParseSMBConfig(smbURL, smbSubdir) if smbErr != nil { - return fmt.Errorf("parse SMB config: %w", smbErr) + return ctxerr.BackupSMBConfig(smbErr) } } - timestamp := time.Now().Format("20060102-150405") - var results []backupResult + timestamp := time.Now().Format(archive.BackupTimestampFormat) + var results []core.BackupResult - if scope == scopeProject || scope == scopeAll { - result, projErr := backupProject(cmd, home, timestamp, smb) + if scope == archive.BackupScopeProject || scope == archive.BackupScopeAll { + result, projErr := core.BackupProject(cmd, home, timestamp, smb) if projErr != nil { - return fmt.Errorf("project backup: %w", projErr) + return ctxerr.BackupProject(projErr) } results = append(results, result) } - if scope == scopeGlobal || scope == scopeAll { - result, globalErr := backupGlobal(cmd, home, timestamp, smb) + if scope == archive.BackupScopeGlobal || scope == archive.BackupScopeAll { + result, globalErr := core.BackupGlobal(cmd, home, timestamp, smb) if globalErr != nil { - return fmt.Errorf("global backup: %w", globalErr) + return ctxerr.BackupGlobal(globalErr) } results = append(results, result) } @@ -103,228 +83,7 @@ func runBackup(cmd *cobra.Command) error { } for _, r := range results { - cmd.Print(fmt.Sprintf("%s: %s (%s)", - r.Scope, r.Archive, formatSize(r.Size))) - if r.SMBDest != "" { - cmd.Print(fmt.Sprintf(" → %s", r.SMBDest)) - } - cmd.Println() - } - return nil -} - -func backupProject( - cmd *cobra.Command, home, timestamp string, smb *core.SMBConfig, -) (backupResult, error) { - cwd, cwdErr := os.Getwd() - if cwdErr != nil { - return backupResult{}, cwdErr - } - - archiveName := fmt.Sprintf("ctx-backup-%s.tar.gz", timestamp) - archivePath := 
filepath.Join(os.TempDir(), archiveName) - - entries := []archiveEntry{ - {SourcePath: filepath.Join(cwd, ".context"), Prefix: ".context", ExcludeDir: "journal-site"}, - {SourcePath: filepath.Join(cwd, ".claude"), Prefix: ".claude"}, - {SourcePath: filepath.Join(cwd, "ideas"), Prefix: "ideas", Optional: true}, - {SourcePath: filepath.Join(home, ".bashrc"), Prefix: ".bashrc"}, - } - - archiveErr := createArchive(archivePath, entries, cmd) - if archiveErr != nil { - return backupResult{}, archiveErr - } - - result := backupResult{Scope: scopeProject, Archive: archivePath} - info, statErr := os.Stat(archivePath) - if statErr == nil { - result.Size = info.Size() - } - - if smb != nil { - if mountErr := core.EnsureSMBMount(smb); mountErr != nil { - return result, mountErr - } - if copyErr := core.CopyToSMB(smb, archivePath); copyErr != nil { - return result, copyErr - } - result.SMBDest = filepath.Join(smb.GVFSPath, smb.Subdir, archiveName) - } - - // Touch marker file for check-backup-age hook. 
- markerDir := filepath.Join(home, ".local", "state") - _ = os.MkdirAll(markerDir, config.PermExec) - markerPath := filepath.Join(markerDir, config.BackupMarkerFile) - core.TouchFile(markerPath) - - return result, nil -} - -func backupGlobal( - cmd *cobra.Command, home, timestamp string, smb *core.SMBConfig, -) (backupResult, error) { - archiveName := fmt.Sprintf("claude-global-backup-%s.tar.gz", timestamp) - archivePath := filepath.Join(os.TempDir(), archiveName) - - entries := []archiveEntry{ - {SourcePath: filepath.Join(home, ".claude"), Prefix: ".claude", ExcludeDir: "todos"}, - } - - archiveErr := createArchive(archivePath, entries, cmd) - if archiveErr != nil { - return backupResult{}, archiveErr - } - - result := backupResult{Scope: scopeGlobal, Archive: archivePath} - info, statErr := os.Stat(archivePath) - if statErr == nil { - result.Size = info.Size() - } - - if smb != nil { - if mountErr := core.EnsureSMBMount(smb); mountErr != nil { - return result, mountErr - } - if copyErr := core.CopyToSMB(smb, archivePath); copyErr != nil { - return result, copyErr - } - result.SMBDest = filepath.Join(smb.GVFSPath, smb.Subdir, archiveName) - } - - return result, nil -} - -// createArchive builds a tar.gz archive from the given entries. -func createArchive( - archivePath string, entries []archiveEntry, cmd *cobra.Command, -) error { - outFile, createErr := os.Create(archivePath) //nolint:gosec // tmp path - if createErr != nil { - return fmt.Errorf("create archive file: %w", createErr) - } - defer func() { _ = outFile.Close() }() - - gzw := gzip.NewWriter(outFile) - defer func() { _ = gzw.Close() }() - - tw := tar.NewWriter(gzw) - defer func() { _ = tw.Close() }() - - for _, entry := range entries { - addErr := addEntry(tw, entry, cmd) - if addErr != nil { - return addErr - } + backup.BackupResultLine(cmd, r.Scope, r.Archive, r.Size, r.SMBDest) } return nil } - -// addEntry adds a single archiveEntry (file or directory) to the tar writer. 
-func addEntry(tw *tar.Writer, entry archiveEntry, cmd *cobra.Command) error { - info, statErr := os.Stat(entry.SourcePath) - if os.IsNotExist(statErr) { - if entry.Optional { - cmd.PrintErrln(fmt.Sprintf("skipping %s (not found)", entry.Prefix)) - return nil - } - return fmt.Errorf("source not found: %s", entry.SourcePath) - } - if statErr != nil { - return statErr - } - - // Single file (e.g. ~/.bashrc). - if !info.IsDir() { - return addSingleFile(tw, entry.SourcePath, entry.Prefix, info) - } - - // Directory walk. - return filepath.WalkDir(entry.SourcePath, - func(path string, d fs.DirEntry, walkErr error) error { - if walkErr != nil { - return walkErr - } - - // Skip excluded directories. - if d.IsDir() && entry.ExcludeDir != "" && d.Name() == entry.ExcludeDir { - return filepath.SkipDir - } - - // Skip symlinks. - if d.Type()&os.ModeSymlink != 0 { - return nil - } - - rel, relErr := filepath.Rel(entry.SourcePath, path) - if relErr != nil { - return relErr - } - - name := filepath.Join(entry.Prefix, rel) - // Normalize to forward slashes in tar. - name = filepath.ToSlash(name) - - fileInfo, infoErr := d.Info() - if infoErr != nil { - return infoErr - } - - header, headerErr := tar.FileInfoHeader(fileInfo, "") - if headerErr != nil { - return headerErr - } - header.Name = name - - if writeErr := tw.WriteHeader(header); writeErr != nil { - return writeErr - } - - if d.IsDir() { - return nil - } - - return copyFileToTar(tw, path) - }) -} - -// addSingleFile writes a single file entry into the tar. -func addSingleFile( - tw *tar.Writer, path, name string, info fs.FileInfo, -) error { - header, headerErr := tar.FileInfoHeader(info, "") - if headerErr != nil { - return headerErr - } - header.Name = name - - if writeErr := tw.WriteHeader(header); writeErr != nil { - return writeErr - } - return copyFileToTar(tw, path) -} - -// copyFileToTar reads a file and writes its contents to the tar writer. 
-func copyFileToTar(tw *tar.Writer, path string) error {
-	f, openErr := os.Open(path) //nolint:gosec // paths are from our own entries
-	if openErr != nil {
-		return openErr
-	}
-	defer func() { _ = f.Close() }()
-	_, copyErr := io.Copy(tw, f)
-	return copyErr
-}
-
-// formatSize returns a human-readable file size string.
-func formatSize(bytes int64) string {
-	const unit = 1024
-	if bytes < unit {
-		return fmt.Sprintf("%d B", bytes)
-	}
-	div, exp := int64(unit), 0
-	for n := bytes / unit; n >= unit; n /= unit {
-		div *= unit
-		exp++
-	}
-	return fmt.Sprintf("%.1f %cB", float64(bytes)/float64(div), "KMG"[exp])
-}
diff --git a/internal/cli/system/cmd/block_dangerous_commands/cmd.go b/internal/cli/system/cmd/block_dangerous_commands/cmd.go
new file mode 100644
index 00000000..7e1f1611
--- /dev/null
+++ b/internal/cli/system/cmd/block_dangerous_commands/cmd.go
@@ -0,0 +1,33 @@
+//    /    ctx: https://ctx.ist
+// ,'`./   do you remember?
+// `.,'\
+//    \    Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package block_dangerous_commands
+
+import (
+	"os"
+
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+)
+
+// Cmd returns the "ctx system block-dangerous-commands" subcommand.
+//
+// Returns:
+//   - *cobra.Command: Configured block-dangerous-commands subcommand
+func Cmd() *cobra.Command {
+	short, long := assets.CommandDesc(assets.CmdDescKeySystemBlockDangerousCommands)
+
+	return &cobra.Command{
+		Use:    "block-dangerous-commands",
+		Short:  short,
+		Long:   long,
+		Hidden: true,
+		RunE: func(cmd *cobra.Command, _ []string) error {
+			return Run(cmd, os.Stdin)
+		},
+	}
+}
diff --git a/internal/cli/system/cmd/block_dangerous_commands/doc.go b/internal/cli/system/cmd/block_dangerous_commands/doc.go
new file mode 100644
index 00000000..b7d52c7c
--- /dev/null
+++ b/internal/cli/system/cmd/block_dangerous_commands/doc.go
@@ -0,0 +1,13 @@
+//    /    ctx: https://ctx.ist
+// ,'`./   do you remember?
+// `.,'\
+//    \    Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package block_dangerous_commands implements the ctx system
+// block-dangerous-commands subcommand.
+//
+// It provides a regex safety net that catches dangerous command patterns
+// such as mid-command sudo, git push, and binary installs that the
+// deny-list cannot express.
+package block_dangerous_commands
diff --git a/internal/cli/system/cmd/block_dangerous_commands/run.go b/internal/cli/system/cmd/block_dangerous_commands/run.go
new file mode 100644
index 00000000..00c9848e
--- /dev/null
+++ b/internal/cli/system/cmd/block_dangerous_commands/run.go
@@ -0,0 +1,83 @@
+//    /    ctx: https://ctx.ist
+// ,'`./   do you remember?
+// `.,'\
+//    \    Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package block_dangerous_commands
+
+import (
+	"encoding/json"
+	"os"
+
+	"github.com/ActiveMemory/ctx/internal/config/hook"
+	"github.com/ActiveMemory/ctx/internal/config/regex"
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/cli/system/core"
+	"github.com/ActiveMemory/ctx/internal/notify"
+)
+
+// Run executes the block-dangerous-commands hook logic.
+//
+// Reads a hook input from stdin, checks the command against dangerous
+// patterns (mid-command sudo, git push, cp/mv to bin), and emits a
+// block response if matched.
+//
+// Parameters:
+//   - cmd: Cobra command for output
+//   - stdin: standard input for hook JSON
+//
+// Returns:
+//   - error: Always nil (hook errors are non-fatal)
+func Run(cmd *cobra.Command, stdin *os.File) error {
+	input := core.ReadInput(stdin)
+	command := input.ToolInput.Command
+
+	if command == "" {
+		return nil
+	}
+
+	var variant, fallback string
+
+	if regex.MidSudo.MatchString(command) {
+		variant = hook.VariantMidSudo
+		fallback = assets.TextDesc(assets.TextDescKeyBlockMidSudo)
+	}
+
+	if variant == "" && regex.MidGitPush.MatchString(command) {
+		variant = hook.VariantMidGitPush
+		fallback = assets.TextDesc(assets.TextDescKeyBlockMidGitPush)
+	}
+
+	if variant == "" && regex.CpMvToBin.MatchString(command) {
+		variant = hook.VariantCpToBin
+		fallback = assets.TextDesc(assets.TextDescKeyBlockCpToBin)
+	}
+
+	if variant == "" && regex.InstallToLocalBin.MatchString(command) {
+		variant = hook.VariantInstallToLocalBin
+		fallback = assets.TextDesc(assets.TextDescKeyBlockInstallToLocalBin)
+	}
+
+	var reason string
+	if variant != "" {
+		reason = core.LoadMessage(
+			hook.BlockDangerousCommands, variant, nil, fallback,
+		)
+	}
+
+	if reason != "" {
+		resp := core.BlockResponse{
+			Decision: hook.HookDecisionBlock,
+			Reason:   reason,
+		}
+		data, _ := json.Marshal(resp)
+		cmd.Println(string(data))
+		ref := notify.NewTemplateRef(hook.BlockDangerousCommands, variant, nil)
+		core.Relay(hook.BlockDangerousCommands+": "+reason, input.SessionID, ref)
+	}
+
+	return nil
+}
diff --git a/internal/cli/system/cmd/block_non_path_ctx/cmd.go b/internal/cli/system/cmd/block_non_path_ctx/cmd.go
new file mode 100644
index 00000000..ffd6c37a
--- /dev/null
+++ b/internal/cli/system/cmd/block_non_path_ctx/cmd.go
@@ -0,0 +1,33 @@
+//    /    ctx: https://ctx.ist
+// ,'`./   do you remember?
+// `.,'\
+//    \    Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package block_non_path_ctx
+
+import (
+	"os"
+
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+)
+
+// Cmd returns the "ctx system block-non-path-ctx" subcommand.
+//
+// Returns:
+//   - *cobra.Command: Configured block-non-path-ctx subcommand
+func Cmd() *cobra.Command {
+	short, long := assets.CommandDesc(assets.CmdDescKeySystemBlockNonPathCtx)
+
+	return &cobra.Command{
+		Use:    "block-non-path-ctx",
+		Short:  short,
+		Long:   long,
+		Hidden: true,
+		RunE: func(cmd *cobra.Command, _ []string) error {
+			return Run(cmd, os.Stdin)
+		},
+	}
+}
diff --git a/internal/cli/system/cmd/block_non_path_ctx/doc.go b/internal/cli/system/cmd/block_non_path_ctx/doc.go
new file mode 100644
index 00000000..36f0f7ae
--- /dev/null
+++ b/internal/cli/system/cmd/block_non_path_ctx/doc.go
@@ -0,0 +1,12 @@
+//    /    ctx: https://ctx.ist
+// ,'`./   do you remember?
+// `.,'\
+//    \    Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package block_non_path_ctx implements the ctx system block-non-path-ctx
+// subcommand.
+//
+// It blocks non-PATH ctx invocations such as ./ctx, go run ./cmd/ctx,
+// and absolute-path ctx calls to enforce consistent binary usage.
+package block_non_path_ctx
diff --git a/internal/cli/system/cmd/block_non_path_ctx/run.go b/internal/cli/system/cmd/block_non_path_ctx/run.go
new file mode 100644
index 00000000..1ce3c354
--- /dev/null
+++ b/internal/cli/system/cmd/block_non_path_ctx/run.go
@@ -0,0 +1,86 @@
+//    /    ctx: https://ctx.ist
+// ,'`./   do you remember?
+// `.,'\
+//    \    Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package block_non_path_ctx
+
+import (
+	"encoding/json"
+	"os"
+
+	"github.com/ActiveMemory/ctx/internal/config/hook"
+	"github.com/ActiveMemory/ctx/internal/config/regex"
+	"github.com/ActiveMemory/ctx/internal/config/token"
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/cli/system/core"
+	"github.com/ActiveMemory/ctx/internal/notify"
+)
+
+// Run executes the block-non-path-ctx hook logic.
+//
+// Reads a hook input from stdin, checks the command against patterns
+// that invoke ctx via relative paths, go run, or absolute paths
+// instead of the PATH-installed binary, and emits a block response
+// if matched.
+//
+// Parameters:
+//   - cmd: Cobra command for output
+//   - stdin: standard input for hook JSON
+//
+// Returns:
+//   - error: Always nil (hook errors are non-fatal)
+func Run(cmd *cobra.Command, stdin *os.File) error {
+	input := core.ReadInput(stdin)
+	command := input.ToolInput.Command
+
+	if command == "" {
+		return nil
+	}
+
+	var variant, fallback string
+
+	if regex.CtxRelativeStart.MatchString(command) ||
+		regex.CtxRelativeSep.MatchString(command) {
+		variant = hook.VariantDotSlash
+		fallback = assets.TextDesc(assets.TextDescKeyBlockDotSlash)
+	}
+
+	if regex.CtxGoRun.MatchString(command) {
+		variant = hook.VariantGoRun
+		fallback = assets.TextDesc(assets.TextDescKeyBlockGoRun)
+	}
+
+	if variant == "" && (regex.CtxAbsoluteStart.MatchString(command) ||
+		regex.AbsoluteSep.MatchString(command)) {
+		if !regex.CtxTestException.MatchString(command) {
+			variant = hook.VariantAbsolutePath
+			fallback = assets.TextDesc(assets.TextDescKeyBlockAbsolutePath)
+		}
+	}
+
+	var reason string
+	if variant != "" {
+		reason = core.LoadMessage(hook.BlockNonPathCtx, variant, nil, fallback)
+	}
+
+	if reason != "" {
+		resp := core.BlockResponse{
+			Decision: hook.HookDecisionBlock,
+			Reason: reason + token.NewlineLF + token.NewlineLF +
+				assets.TextDesc(assets.TextDescKeyBlockConstitutionSuffix),
+		}
+		data, _ := json.Marshal(resp)
+		cmd.Println(string(data))
+		blockRef := notify.NewTemplateRef(hook.BlockNonPathCtx, variant, nil)
+		core.Relay(hook.BlockNonPathCtx+": "+
+			assets.TextDesc(assets.TextDescKeyBlockNonPathRelayMessage),
+			input.SessionID, blockRef,
+		)
+	}
+
+	return nil
+}
diff --git a/internal/cli/system/cmd/blockdangerouscommands/cmd.go b/internal/cli/system/cmd/blockdangerouscommands/cmd.go
deleted file mode 100644
index 6da62ff4..00000000
--- a/internal/cli/system/cmd/blockdangerouscommands/cmd.go
+++ /dev/null
@@ -1,107 +0,0 @@
-//    /    ctx: https://ctx.ist
-// ,'`./   do you remember?
-// `.,'\
-//    \    Copyright 2026-present Context contributors.
-// SPDX-License-Identifier: Apache-2.0
-
-package blockdangerouscommands
-
-import (
-	"encoding/json"
-	"os"
-	"regexp"
-
-	"github.com/spf13/cobra"
-
-	"github.com/ActiveMemory/ctx/internal/cli/system/core"
-	"github.com/ActiveMemory/ctx/internal/eventlog"
-	"github.com/ActiveMemory/ctx/internal/notify"
-)
-
-// Cmd returns the "ctx system block-dangerous-commands" subcommand.
-//
-// Returns:
-//   - *cobra.Command: Configured block-dangerous-commands subcommand
-func Cmd() *cobra.Command {
-	return &cobra.Command{
-		Use:   "block-dangerous-commands",
-		Short: "Block dangerous command patterns (regex safety net)",
-		Long: `Regex safety net for commands that the deny-list cannot express.
-Catches mid-command sudo, mid-command git push, and binary installs
-to bin directories.
-
-Hook event: PreToolUse (Bash)
-Output: {"decision":"block","reason":"..."} or silent
-Silent when: command doesn't match any dangerous pattern`,
-		Hidden: true,
-		RunE: func(cmd *cobra.Command, _ []string) error {
-			return runBlockDangerousCommands(cmd, os.Stdin)
-		},
-	}
-}
-
-// Compiled regex patterns for dangerous command detection.
-var (
-	// Mid-command sudo after && || ;
-	reMidSudo = regexp.MustCompile(`(;|&&|\|\|)\s*sudo\s`)
-	// Mid-command git push after && || ;
-	reMidGitPush = regexp.MustCompile(`(;|&&|\|\|)\s*git\s+push`)
-	// cp/mv to bin directories
-	reCpMvToBin = regexp.MustCompile(`(cp|mv)\s+\S+\s+(/usr/local/bin|/usr/bin|~/go/bin|~/.local/bin|/home/\S+/go/bin|/home/\S+/.local/bin)`)
-	// cp/install to ~/.local/bin
-	reInstallToLocalBin = regexp.MustCompile(`(cp|install)\s.*~/\.local/bin`)
-)
-
-func runBlockDangerousCommands(cmd *cobra.Command, stdin *os.File) error {
-	input := core.ReadInput(stdin)
-	command := input.ToolInput.Command
-
-	if command == "" {
-		return nil
-	}
-
-	var variant, fallback string
-
-	// Mid-command sudo — after && || ; (prefix sudo caught by deny rule)
-	if reMidSudo.MatchString(command) {
-		variant = "mid-sudo"
-		fallback = "Cannot use sudo (no password access). Use 'make build && sudo make install' manually if needed."
-	}
-
-	// Mid-command git push — after && || ; (prefix git push caught by deny rule)
-	if variant == "" && reMidGitPush.MatchString(command) {
-		variant = "mid-git-push"
-		fallback = "git push requires explicit user approval."
-	}
-
-	// cp/mv to bin directories — agent must never install binaries
-	if variant == "" && reCpMvToBin.MatchString(command) {
-		variant = "cp-to-bin"
-		fallback = "Agent must not copy binaries to bin directories. Ask the user to run 'sudo make install' instead."
-	}
-
-	// cp/install to ~/.local/bin — breaks PATH ctx rules
-	if variant == "" && reInstallToLocalBin.MatchString(command) {
-		variant = "install-to-local-bin"
-		fallback = "Do not copy binaries to ~/.local/bin — this overrides the system ctx in /usr/local/bin. Use 'ctx' from PATH."
-	}
-
-	var reason string
-	if variant != "" {
-		reason = core.LoadMessage("block-dangerous-commands", variant, nil, fallback)
-	}
-
-	if reason != "" {
-		resp := core.BlockResponse{
-			Decision: "block",
-			Reason:   reason,
-		}
-		data, _ := json.Marshal(resp)
-		cmd.Println(string(data))
-		ref := notify.NewTemplateRef("block-dangerous-commands", variant, nil)
-		_ = notify.Send("relay", "block-dangerous-commands: "+reason, input.SessionID, ref)
-		eventlog.Append("relay", "block-dangerous-commands: "+reason, input.SessionID, ref)
-	}
-
-	return nil
-}
diff --git a/internal/cli/system/cmd/blocknonpathctx/cmd.go b/internal/cli/system/cmd/blocknonpathctx/cmd.go
deleted file mode 100644
index 29fd5e63..00000000
--- a/internal/cli/system/cmd/blocknonpathctx/cmd.go
+++ /dev/null
@@ -1,108 +0,0 @@
-//    /    ctx: https://ctx.ist
-// ,'`./   do you remember?
-// `.,'\
-//    \    Copyright 2026-present Context contributors.
-// SPDX-License-Identifier: Apache-2.0
-
-package blocknonpathctx
-
-import (
-	"encoding/json"
-	"fmt"
-	"os"
-	"regexp"
-
-	"github.com/spf13/cobra"
-
-	"github.com/ActiveMemory/ctx/internal/cli/system/core"
-	"github.com/ActiveMemory/ctx/internal/eventlog"
-	"github.com/ActiveMemory/ctx/internal/notify"
-)
-
-// Cmd returns the "ctx system block-non-path-ctx" subcommand.
-//
-// Returns:
-//   - *cobra.Command: Configured block-non-path-ctx subcommand
-func Cmd() *cobra.Command {
-	return &cobra.Command{
-		Use:   "block-non-path-ctx",
-		Short: "Block non-PATH ctx invocations",
-		Long: `Blocks ./ctx, go run ./cmd/ctx, and absolute-path ctx invocations.
-Enforces the CONSTITUTION.md rule: always use ctx from PATH.
-Outputs a JSON block decision that prevents the tool call.
-
-Hook event: PreToolUse (Bash)
-Output: {"decision":"block","reason":"..."} or silent
-Silent when: command doesn't invoke ctx via a non-PATH route`,
-		Hidden: true,
-		RunE: func(cmd *cobra.Command, _ []string) error {
-			return runBlockNonPathCtx(cmd, os.Stdin)
-		},
-	}
-}
-
-// Compiled regex patterns for command-position matching.
-var (
-	// Pattern 1: ./ctx or ./dist/ctx at start of command
-	reRelativeStart = regexp.MustCompile(`^\s*(\./ctx(\s|$)|\./dist/ctx)`)
-	// Pattern 1b: ./ctx or ./dist/ctx after command separator
-	reRelativeSep = regexp.MustCompile(`(&&|;|\|\||\|)\s*(\./ctx(\s|$)|\./dist/ctx)`)
-	// Pattern 2: go run ./cmd/ctx
-	reGoRun = regexp.MustCompile(`go run \./cmd/ctx`)
-	// Pattern 3: Absolute paths at start of command
-	reAbsoluteStart = regexp.MustCompile(`^\s*(/home/|/tmp/|/var/)\S*/ctx(\s|$)`)
-	// Pattern 3b: Absolute paths after command separator
-	reAbsoluteSep = regexp.MustCompile(`(&&|;|\|\||\|)\s*(/home/|/tmp/|/var/)\S*/ctx(\s|$)`)
-	// Exception: /tmp/ctx-test for integration tests
-	reTestException = regexp.MustCompile(`/tmp/ctx-test`)
-)
-
-func runBlockNonPathCtx(cmd *cobra.Command, stdin *os.File) error {
-	input := core.ReadInput(stdin)
-	command := input.ToolInput.Command
-
-	if command == "" {
-		return nil
-	}
-
-	var variant, fallback string
-
-	// Pattern 1: ./ctx or ./dist/ctx at command position
-	if reRelativeStart.MatchString(command) || reRelativeSep.MatchString(command) {
-		variant = "dot-slash"
-		fallback = "Use 'ctx' from PATH, not './ctx' or './dist/ctx'. Ask the user to run: make build && sudo make install"
-	}
-
-	// Pattern 2: go run ./cmd/ctx
-	if reGoRun.MatchString(command) {
-		variant = "go-run"
-		fallback = "Use 'ctx' from PATH, not 'go run ./cmd/ctx'. Ask the user to run: make build && sudo make install"
-	}
-
-	// Pattern 3: Absolute paths to ctx binary at command position
-	if variant == "" && (reAbsoluteStart.MatchString(command) || reAbsoluteSep.MatchString(command)) {
-		if !reTestException.MatchString(command) {
-			variant = "absolute-path"
-			fallback = "Use 'ctx' from PATH, not absolute paths. Ask the user to run: make build && sudo make install"
-		}
-	}
-
-	var reason string
-	if variant != "" {
-		reason = core.LoadMessage("block-non-path-ctx", variant, nil, fallback)
-	}
-
-	if reason != "" {
-		resp := core.BlockResponse{
-			Decision: "block",
-			Reason:   fmt.Sprintf("%s\n\nSee CONSTITUTION.md: ctx Invocation Invariants", reason),
-		}
-		data, _ := json.Marshal(resp)
-		cmd.Println(string(data))
-		blockRef := notify.NewTemplateRef("block-non-path-ctx", variant, nil)
-		_ = notify.Send("relay", "block-non-path-ctx: Blocked non-PATH ctx invocation", input.SessionID, blockRef)
-		eventlog.Append("relay", "block-non-path-ctx: Blocked non-PATH ctx invocation", input.SessionID, blockRef)
-	}
-
-	return nil
-}
diff --git a/internal/cli/system/cmd/bootstrap/cmd.go b/internal/cli/system/cmd/bootstrap/cmd.go
index 025d413c..4bd87238 100644
--- a/internal/cli/system/cmd/bootstrap/cmd.go
+++ b/internal/cli/system/cmd/bootstrap/cmd.go
@@ -17,14 +17,23 @@ import (
 // Returns:
 //   - *cobra.Command: Configured bootstrap subcommand
 func Cmd() *cobra.Command {
+	short, long := assets.CommandDesc(assets.CmdDescKeySystemBootstrap)
+
 	cmd := &cobra.Command{
 		Use:   "bootstrap",
-		Short: "Print context location for AI agents",
+		Short: short,
+		Long:  long,
 		RunE: func(cmd *cobra.Command, _ []string) error {
-			return runBootstrap(cmd)
+			return Run(cmd)
 		},
 	}
-	cmd.Flags().Bool("json", false, assets.FlagDesc("system.bootstrap.json"))
-	cmd.Flags().BoolP("quiet", "q", false, assets.FlagDesc("system.bootstrap.quiet"))
+
+	cmd.Flags().Bool("json", false,
+		assets.FlagDesc(assets.FlagDescKeySystemBootstrapJson),
+	)
+	cmd.Flags().BoolP("quiet", "q", false,
+		assets.FlagDesc(assets.FlagDescKeySystemBootstrapQuiet),
+	)
+
 	return cmd
 }
diff --git a/internal/cli/system/cmd/bootstrap/doc.go b/internal/cli/system/cmd/bootstrap/doc.go
new file mode 100644
index 00000000..fdb58a16
--- /dev/null
+++ b/internal/cli/system/cmd/bootstrap/doc.go
@@ -0,0 +1,11 @@
+//    /    ctx: https://ctx.ist
+// ,'`./   do you remember?
+// `.,'\
+//    \    Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package bootstrap implements the ctx system bootstrap subcommand.
+//
+// It prints the context directory location for AI agents to discover
+// and load project context on session start.
+package bootstrap
diff --git a/internal/cli/system/cmd/bootstrap/run.go b/internal/cli/system/cmd/bootstrap/run.go
index 556fc98e..1b4a00e8 100644
--- a/internal/cli/system/cmd/bootstrap/run.go
+++ b/internal/cli/system/cmd/bootstrap/run.go
@@ -7,41 +7,32 @@
 package bootstrap
 
 import (
-	"encoding/json"
-	"fmt"
 	"os"
-	"path/filepath"
-	"sort"
-	"strings"
 
+	bootstrap2 "github.com/ActiveMemory/ctx/internal/config/bootstrap"
+	"github.com/ActiveMemory/ctx/internal/write/bootstrap"
 	"github.com/spf13/cobra"
 
-	"github.com/ActiveMemory/ctx/internal/cli/initialize"
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/cli/system/core"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 	"github.com/ActiveMemory/ctx/internal/rc"
 )
 
-// bootstrapRules are the standard rules emitted by the bootstrap command.
-var bootstrapRules = []string{
-	"Use context_dir above for ALL file reads/writes",
-	"Never say \"I don't have memory\" — context IS your memory",
-	"Read files silently, present as recall (not search)",
-	"Persist learnings/decisions before session ends",
-	"Run `ctx agent` for content summaries",
-	"Run `ctx status` for context health",
-}
-
-// bootstrapNextSteps tells the agent what to do immediately after bootstrap.
-var bootstrapNextSteps = []string{
-	"Read AGENT_PLAYBOOK.md from the context directory",
-	"Run `ctx agent --budget 4000` for a content summary",
-}
-
-func runBootstrap(cmd *cobra.Command) error {
+// Run executes the bootstrap command, emitting context directory info,
+// rules, and next steps for the calling agent.
+//
+// Parameters:
+//   - cmd: Cobra command providing flags and output streams.
+//
+// Returns:
+//   - error: non-nil if the context directory does not exist or JSON
+//     encoding fails.
+func Run(cmd *cobra.Command) error {
 	dir := rc.ContextDir()
 	if _, statErr := os.Stat(dir); os.IsNotExist(statErr) {
-		return fmt.Errorf("context directory not found: %s — run 'ctx init'", dir)
+		return ctxerr.ContextDirNotFound(dir)
 	}
 
 	quiet, _ := cmd.Flags().GetBool("quiet")
@@ -50,131 +41,23 @@ func runBootstrap(cmd *cobra.Command) error {
 		return nil
 	}
 
-	files := listContextFiles(dir)
+	files := core.ListContextFiles(dir)
+	rules := core.ParseNumberedLines(
+		assets.TextDesc(assets.TextDescKeyBootstrapRules),
+	)
+	nextSteps := core.ParseNumberedLines(
+		assets.TextDesc(assets.TextDescKeyBootstrapNextSteps),
+	)
+	warning := core.PluginWarning()
 
 	jsonFlag, _ := cmd.Flags().GetBool("json")
 	if jsonFlag {
-		return outputBootstrapJSON(cmd, dir, files)
-	}
-	outputBootstrapText(cmd, dir, files)
-	return nil
-}
-
-func outputBootstrapText(cmd *cobra.Command, dir string, files []string) {
-	cmd.Println("ctx bootstrap")
-	cmd.Println("=============")
-	cmd.Println()
-	cmd.Println("context_dir: " + dir)
-	cmd.Println()
-	cmd.Println("Files:")
-	cmd.Println(wrapFileList(files, 55, " "))
-	cmd.Println()
-	cmd.Println("Rules:")
-	for i, r := range bootstrapRules {
-		cmd.Println(fmt.Sprintf(" %d. %s", i+1, r))
-	}
-	cmd.Println()
-	cmd.Println("Next steps:")
-	for i, s := range bootstrapNextSteps {
-		cmd.Println(fmt.Sprintf(" %d. %s", i+1, s))
-	}
-
-	if w := pluginWarning(); w != "" {
-		cmd.Println()
-		cmd.Println("Warning: " + w)
-	}
-}
-
-func outputBootstrapJSON(cmd *cobra.Command, dir string, files []string) error {
-	type jsonOutput struct {
-		ContextDir string   `json:"context_dir"`
-		Files      []string `json:"files"`
-		Rules      []string `json:"rules"`
-		NextSteps  []string `json:"next_steps"`
-		Warnings   []string `json:"warnings,omitempty"`
-	}
-
-	out := jsonOutput{
-		ContextDir: dir,
-		Files:      files,
-		Rules:      bootstrapRules,
-		NextSteps:  bootstrapNextSteps,
-	}
-
-	if w := pluginWarning(); w != "" {
-		out.Warnings = []string{w}
+		return bootstrap.BootstrapJSON(cmd, dir, files, rules, nextSteps, warning)
 	}
 
-	enc := json.NewEncoder(cmd.OutOrStdout())
-	enc.SetIndent("", " ")
-	return enc.Encode(out)
-}
-
-// pluginWarning returns a warning string if the ctx plugin is installed
-// but not enabled in either global or local settings. Returns empty string
-// if no warning is needed.
-func pluginWarning() string {
-	if !initialize.PluginInstalled() {
-		return ""
-	}
-	if initialize.PluginEnabledGlobally() || initialize.PluginEnabledLocally() {
-		return ""
-	}
-	return "ctx plugin is installed but not enabled. " +
-		"Run 'ctx init' to auto-enable, or add " +
-		"{\"enabledPlugins\": {\"ctx@activememory-ctx\": true}} " +
-		"to ~/.claude/settings.json"
-}
-
-// listContextFiles reads the given directory and returns sorted .md filenames.
-func listContextFiles(dir string) []string {
-	entries, readErr := os.ReadDir(dir)
-	if readErr != nil {
-		return nil
-	}
-
-	var files []string
-	for _, e := range entries {
-		if e.IsDir() {
-			continue
-		}
-		if strings.EqualFold(filepath.Ext(e.Name()), config.ExtMarkdown) {
-			files = append(files, e.Name())
-		}
-	}
-	sort.Strings(files)
-	return files
-}
-
-// wrapFileList formats file names as a comma-separated list, wrapping lines
-// at approximately maxWidth characters. Continuation lines are prefixed with
-// the given indent string.
-func wrapFileList(files []string, maxWidth int, indent string) string {
-	if len(files) == 0 {
-		return indent + "(none)"
-	}
-
-	var lines []string
-	current := indent
-
-	for i, f := range files {
-		entry := f
-		if i < len(files)-1 {
-			entry += ","
-		}
-
-		switch {
-		case current == indent:
-			// First entry on this line — always add it.
-			current += entry
-		case len(current)+1+len(entry) > maxWidth:
-			// Would exceed width — start a new line.
-			lines = append(lines, current)
-			current = indent + entry
-		default:
-			current += " " + entry
-		}
-	}
-	lines = append(lines, current)
-	return strings.Join(lines, config.NewlineLF)
+	fileList := core.WrapFileList(
+		files, bootstrap2.BootstrapFileListWidth, bootstrap2.BootstrapFileListIndent,
+	)
+	bootstrap.BootstrapText(cmd, dir, fileList, rules, nextSteps, warning)
+	return nil
 }
diff --git a/internal/cli/system/cmd/check_backup_age/cmd.go b/internal/cli/system/cmd/check_backup_age/cmd.go
new file mode 100644
index 00000000..1e10c208
--- /dev/null
+++ b/internal/cli/system/cmd/check_backup_age/cmd.go
@@ -0,0 +1,33 @@
+//    /    ctx: https://ctx.ist
+// ,'`./   do you remember?
+// `.,'\
+//    \    Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package check_backup_age
+
+import (
+	"os"
+
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+)
+
+// Cmd returns the "ctx system check-backup-age" subcommand.
+//
+// Returns:
+//   - *cobra.Command: Configured check-backup-age subcommand
+func Cmd() *cobra.Command {
+	short, long := assets.CommandDesc(assets.CmdDescKeySystemCheckBackupAge)
+
+	return &cobra.Command{
+		Use:    "check-backup-age",
+		Short:  short,
+		Long:   long,
+		Hidden: true,
+		RunE: func(cmd *cobra.Command, _ []string) error {
+			return Run(cmd, os.Stdin)
+		},
+	}
+}
diff --git a/internal/cli/system/cmd/check_backup_age/doc.go b/internal/cli/system/cmd/check_backup_age/doc.go
new file mode 100644
index 00000000..4ba3ef77
--- /dev/null
+++ b/internal/cli/system/cmd/check_backup_age/doc.go
@@ -0,0 +1,12 @@
+//    /    ctx: https://ctx.ist
+// ,'`./   do you remember?
+// `.,'\
+//    \    Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package check_backup_age implements the ctx system check-backup-age
+// subcommand.
+//
+// It checks how recently a backup was created and nudges the user when
+// backups exceed the configured maximum age.
+package check_backup_age
diff --git a/internal/cli/system/cmd/check_backup_age/run.go b/internal/cli/system/cmd/check_backup_age/run.go
new file mode 100644
index 00000000..fdf67ac2
--- /dev/null
+++ b/internal/cli/system/cmd/check_backup_age/run.go
@@ -0,0 +1,97 @@
+//    /    ctx: https://ctx.ist
+// ,'`./   do you remember?
+// `.,'\
+//    \    Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package check_backup_age
+
+import (
+	"os"
+	"path/filepath"
+
+	"github.com/ActiveMemory/ctx/internal/config/archive"
+	"github.com/ActiveMemory/ctx/internal/config/env"
+	"github.com/ActiveMemory/ctx/internal/config/hook"
+	"github.com/ActiveMemory/ctx/internal/config/token"
+	"github.com/ActiveMemory/ctx/internal/config/tpl"
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/cli/system/core"
+	"github.com/ActiveMemory/ctx/internal/notify"
+)
+
+// Run executes the check-backup-age hook logic.
+//
+// Reads a hook input from stdin, checks whether the SMB share is mounted
+// and whether the backup marker file is fresh, then emits a relay warning
+// if any issue is detected. Throttled to once per day.
+//
+// Parameters:
+//   - cmd: Cobra command for output
+//   - stdin: standard input for hook JSON
+//
+// Returns:
+//   - error: Always nil (hook errors are non-fatal)
+func Run(cmd *cobra.Command, stdin *os.File) error {
+	input, _, paused := core.HookPreamble(stdin)
+	if paused {
+		return nil
+	}
+
+	tmpDir := core.StateDir()
+	throttleFile := filepath.Join(tmpDir, archive.BackupThrottleID)
+
+	if core.IsDailyThrottled(throttleFile) {
+		return nil
+	}
+
+	home, homeErr := os.UserHomeDir()
+	if homeErr != nil {
+		return nil
+	}
+
+	var warnings []string
+
+	// Check 1: Is the SMB share mounted?
+	if smbURL := os.Getenv(env.BackupSMBURL); smbURL != "" {
+		warnings = core.CheckSMBMountWarnings(smbURL, warnings)
+	}
+
+	// Check 2: Is the backup stale?
+	markerPath := filepath.Join(home, archive.BackupMarkerDir, archive.BackupMarkerFile)
+	warnings = core.CheckBackupMarker(markerPath, warnings)
+
+	if len(warnings) == 0 {
+		return nil
+	}
+
+	// Build pre-formatted warnings for the template variable
+	var warningText string
+	for _, w := range warnings {
+		warningText += w + token.NewlineLF
+	}
+
+	vars := map[string]any{tpl.VarWarnings: warningText}
+	content := core.LoadMessage(hook.CheckBackupAge, hook.VariantWarning, vars, warningText)
+	if content == "" {
+		return nil
+	}
+
+	// Emit VERBATIM relay
+	cmd.Println(core.NudgeBox(
+		assets.TextDesc(assets.TextDescKeyBackupRelayPrefix),
+		assets.TextDesc(assets.TextDescKeyBackupBoxTitle),
+		content))
+
+	ref := notify.NewTemplateRef(hook.CheckBackupAge, hook.VariantWarning, vars)
+	core.NudgeAndRelay(hook.CheckBackupAge+": "+
+		assets.TextDesc(assets.TextDescKeyBackupRelayMessage),
+		input.SessionID, ref,
+	)
+
+	core.TouchFile(throttleFile)
+
+	return nil
+}
diff --git a/internal/cli/system/cmd/checkceremonies/cmd.go b/internal/cli/system/cmd/check_ceremonies/cmd.go
similarity index 52%
rename from internal/cli/system/cmd/checkceremonies/cmd.go
rename to internal/cli/system/cmd/check_ceremonies/cmd.go
index b8d727ba..853970e1 100644
--- a/internal/cli/system/cmd/checkceremonies/cmd.go
+++ b/internal/cli/system/cmd/check_ceremonies/cmd.go
@@ -4,12 +4,14 @@
 //    \    Copyright 2026-present Context contributors.
 // SPDX-License-Identifier: Apache-2.0
 
-package checkceremonies
+package check_ceremonies
 
 import (
 	"os"
 
 	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
 )
 
 // Cmd returns the "ctx system check-ceremonies" subcommand.
@@ -17,19 +19,15 @@ import (
 // Returns:
 //   - *cobra.Command: Configured check-ceremonies subcommand
 func Cmd() *cobra.Command {
-	return &cobra.Command{
-		Use:   "check-ceremonies",
-		Short: "Session ceremony nudge hook",
-		Long: `Scans the last 3 journal entries for /ctx-remember and /ctx-wrap-up
-usage. If either is missing, emits a VERBATIM relay nudge encouraging
-adoption. Throttled to once per day.
+	short, long := assets.CommandDesc(assets.CmdDescKeySystemCheckCeremonies)
 
-Hook event: UserPromptSubmit
-Output: VERBATIM relay (when ceremonies missing), silent otherwise
-Silent when: both ceremonies found in recent sessions`,
+	return &cobra.Command{
+		Use:    "check-ceremonies",
+		Short:  short,
+		Long:   long,
 		Hidden: true,
 		RunE: func(cmd *cobra.Command, _ []string) error {
-			return runCheckCeremonies(cmd, os.Stdin)
+			return Run(cmd, os.Stdin)
 		},
 	}
 }
diff --git a/internal/cli/system/cmd/check_ceremonies/doc.go b/internal/cli/system/cmd/check_ceremonies/doc.go
new file mode 100644
index 00000000..8d8d7492
--- /dev/null
+++ b/internal/cli/system/cmd/check_ceremonies/doc.go
@@ -0,0 +1,12 @@
+//    /    ctx: https://ctx.ist
+// ,'`./   do you remember?
+// `.,'\
+//    \    Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package check_ceremonies implements the ctx system check-ceremonies
+// subcommand.
+//
+// It scans recent journal entries for session ceremony usage and nudges
+// adoption of /ctx-remember and /ctx-wrap-up when missing.
+package check_ceremonies
diff --git a/internal/cli/system/cmd/check_ceremonies/run.go b/internal/cli/system/cmd/check_ceremonies/run.go
new file mode 100644
index 00000000..6d584a9f
--- /dev/null
+++ b/internal/cli/system/cmd/check_ceremonies/run.go
@@ -0,0 +1,76 @@
+//    /    ctx: https://ctx.ist
+// ,'`./   do you remember?
+// `.,'\
+//    \    Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package check_ceremonies
+
+import (
+	"os"
+	"path/filepath"
+
+	"github.com/ActiveMemory/ctx/internal/config/ceremony"
+	"github.com/ActiveMemory/ctx/internal/config/hook"
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/cli/system/core"
+	ctxcontext "github.com/ActiveMemory/ctx/internal/context"
+	"github.com/ActiveMemory/ctx/internal/notify"
+)
+
+// Run executes the check-ceremonies hook logic.
+//
+// Scans recent journal files for /ctx-remember and /ctx-wrap-up usage. When
+// either ceremony is missing, emits a nudge message and sends relay/nudge
+// notifications. The check is daily-throttled and skipped when paused.
+// +// Parameters: +// - cmd: Cobra command for output +// - stdin: standard input for hook JSON +// +// Returns: +// - error: Always nil (hook errors are non-fatal) +func Run(cmd *cobra.Command, stdin *os.File) error { + if !core.Initialized() { + return nil + } + + input, _, paused := core.HookPreamble(stdin) + if paused { + return nil + } + + remindedFile := filepath.Join(core.StateDir(), ceremony.CeremonyThrottleID) + + if core.IsDailyThrottled(remindedFile) { + return nil + } + + files := core.RecentJournalFiles( + ctxcontext.ResolvedJournalDir(), ceremony.CeremonyJournalLookback, + ) + + if len(files) == 0 { + return nil + } + + remember, wrapup := core.ScanJournalsForCeremonies(files) + + if remember && wrapup { + return nil + } + + msg, variant := core.EmitCeremonyNudge(cmd, remember, wrapup) + if msg == "" { + return nil + } + ref := notify.NewTemplateRef(hook.CheckCeremonies, variant, nil) + core.NudgeAndRelay(hook.CheckCeremonies+": "+ + assets.TextDesc(assets.TextDescKeyCeremonyRelayMessage), + input.SessionID, ref, + ) + core.TouchFile(remindedFile) + return nil +} diff --git a/internal/cli/system/cmd/check_context_size/cmd.go b/internal/cli/system/cmd/check_context_size/cmd.go new file mode 100644 index 00000000..443536ef --- /dev/null +++ b/internal/cli/system/cmd/check_context_size/cmd.go @@ -0,0 +1,33 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package check_context_size + +import ( + "os" + + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the "ctx system check-context-size" subcommand. 
+// +// Returns: +// - *cobra.Command: Configured check-context-size subcommand +func Cmd() *cobra.Command { + short, long := assets.CommandDesc(assets.CmdDescKeySystemCheckContextSize) + + return &cobra.Command{ + Use: "check-context-size", + Short: short, + Long: long, + Hidden: true, + RunE: func(cmd *cobra.Command, _ []string) error { + return Run(cmd, os.Stdin) + }, + } +} diff --git a/internal/cli/system/cmd/check_context_size/doc.go b/internal/cli/system/cmd/check_context_size/doc.go new file mode 100644 index 00000000..0282575f --- /dev/null +++ b/internal/cli/system/cmd/check_context_size/doc.go @@ -0,0 +1,12 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package check_context_size implements the ctx system check-context-size +// subcommand. +// +// It counts prompts per session and emits adaptive checkpoint reminders +// prompting the user to consider wrapping up. +package check_context_size diff --git a/internal/cli/system/cmd/check_context_size/run.go b/internal/cli/system/cmd/check_context_size/run.go new file mode 100644 index 00000000..7e1fb447 --- /dev/null +++ b/internal/cli/system/cmd/check_context_size/run.go @@ -0,0 +1,134 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package check_context_size + +import ( + "fmt" + "os" + "path/filepath" + "time" + + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/config/event" + "github.com/ActiveMemory/ctx/internal/config/session" + "github.com/ActiveMemory/ctx/internal/config/stats" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/cli/system/core" + "github.com/ActiveMemory/ctx/internal/rc" +) + +// Run executes the check-context-size hook logic. 
+// +// Reads hook input from stdin, tracks per-session prompt counts, and emits +// context checkpoint or window warning messages at adaptive intervals. +// Also fires a one-shot billing warning when token usage exceeds the +// user-configured threshold. +// +// Parameters: +// - cmd: Cobra command for output +// - stdin: standard input for hook JSON +// +// Returns: +// - error: Always nil (hook errors are non-fatal) +func Run(cmd *cobra.Command, stdin *os.File) error { + if !core.Initialized() { + return nil + } + input := core.ReadInput(stdin) + sessionID := input.SessionID + if sessionID == "" { + sessionID = session.IDUnknown + } + + // Pause check — this hook is the designated single emitter + if turns := core.Paused(sessionID); turns > 0 { + cmd.Println(core.PausedMessage(turns)) + return nil + } + + tmpDir := core.StateDir() + counterFile := filepath.Join(tmpDir, stats.ContextSizeCounterPrefix+sessionID) + logFile := filepath.Join(rc.ContextDir(), dir.Logs, stats.ContextSizeLogFile) + + // Increment counter + count := core.ReadCounter(counterFile) + 1 + core.WriteCounter(counterFile, count) + + // Read actual context window usage from session JSONL + info, _ := core.ReadSessionTokenInfo(sessionID) + tokens := info.Tokens + windowSize := core.EffectiveContextWindow(info.Model) + pct := 0 + if windowSize > 0 && tokens > 0 { + pct = tokens * stats.PercentMultiplier / windowSize + } + + // Billing threshold: one-shot warning when tokens exceed the + // user-configured billing_token_warn. Independent of all other + // triggers — fires even during wrap-up suppression because cost + // guards are never convenience nudges. 
+ if billingThreshold := rc.BillingTokenWarn(); billingThreshold > 0 && tokens >= billingThreshold { + core.EmitBillingWarning(cmd, logFile, sessionID, count, tokens, billingThreshold) + } + + // Wrap-up suppression: if the user recently ran /ctx-wrap-up, + // suppress checkpoint and window nudges to avoid noise during/after + // the wrap-up ceremony. The marker expires after 2 hours. + // Stats are still recorded so token usage tracking is continuous. + if core.WrappedUpRecently() { + core.LogMessage(logFile, sessionID, fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckContextSizeSuppressedLogFormat), count)) + core.WriteSessionStats(sessionID, core.SessionStats{ + Timestamp: time.Now().Format(time.RFC3339), + Prompt: count, + Tokens: tokens, + Pct: pct, + WindowSize: windowSize, + Model: info.Model, + Event: event.EventSuppressed, + }) + return nil + } + + // Adaptive frequency (prompt counter) + counterTriggered := false + if count > 30 { + counterTriggered = count%3 == 0 + } else if count > 15 { + counterTriggered = count%5 == 0 + } + + windowTrigger := pct >= stats.ContextWindowThresholdPct + + evt := event.EventSilent + switch { + case counterTriggered: + evt = event.EventCheckpoint + core.EmitCheckpoint(cmd, logFile, sessionID, count, tokens, pct, windowSize) + case windowTrigger: + evt = event.EventWindowWarning + core.EmitWindowWarning(cmd, logFile, sessionID, count, tokens, pct) + default: + core.LogMessage(logFile, sessionID, + fmt.Sprintf(assets.TextDesc( + assets.TextDescKeyCheckContextSizeSilentLogFormat), count), + ) + } + + core.WriteSessionStats(sessionID, core.SessionStats{ + Timestamp: time.Now().Format(time.RFC3339), + Prompt: count, + Tokens: tokens, + Pct: pct, + WindowSize: windowSize, + Model: info.Model, + Event: evt, + }) + + return nil +} diff --git a/internal/cli/system/cmd/checkjournal/cmd.go b/internal/cli/system/cmd/check_journal/cmd.go similarity index 53% rename from internal/cli/system/cmd/checkjournal/cmd.go rename to 
internal/cli/system/cmd/check_journal/cmd.go index d6d7d4c3..30dfe026 100644 --- a/internal/cli/system/cmd/checkjournal/cmd.go +++ b/internal/cli/system/cmd/check_journal/cmd.go @@ -4,12 +4,14 @@ // \ Copyright 2026-present Context contributors. // SPDX-License-Identifier: Apache-2.0 -package checkjournal +package check_journal import ( "os" "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" ) // Cmd returns the "ctx system check-journal" subcommand. @@ -17,18 +19,15 @@ import ( // Returns: // - *cobra.Command: Configured check-journal subcommand func Cmd() *cobra.Command { - return &cobra.Command{ - Use: "check-journal", - Short: "Journal export/enrich reminder hook", - Long: `Detects unexported Claude Code sessions and unenriched journal entries, -then prints actionable commands. Throttled to once per day. + short, long := assets.CommandDesc(assets.CmdDescKeySystemCheckJournal) -Hook event: UserPromptSubmit -Output: VERBATIM relay with export/enrich commands, silent otherwise -Silent when: no unexported sessions and no unenriched entries`, + return &cobra.Command{ + Use: "check-journal", + Short: short, + Long: long, Hidden: true, RunE: func(cmd *cobra.Command, _ []string) error { - return runCheckJournal(cmd, os.Stdin) + return Run(cmd, os.Stdin) }, } } diff --git a/internal/cli/system/cmd/check_journal/doc.go b/internal/cli/system/cmd/check_journal/doc.go new file mode 100644 index 00000000..62404487 --- /dev/null +++ b/internal/cli/system/cmd/check_journal/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package check_journal implements the ctx system check-journal subcommand. +// +// It detects unexported sessions and unenriched journal entries, then +// prints actionable commands to address them. 
+package check_journal diff --git a/internal/cli/system/cmd/check_journal/run.go b/internal/cli/system/cmd/check_journal/run.go new file mode 100644 index 00000000..48a90b64 --- /dev/null +++ b/internal/cli/system/cmd/check_journal/run.go @@ -0,0 +1,124 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package check_journal + +import ( + "fmt" + "os" + "path/filepath" + + "github.com/ActiveMemory/ctx/internal/config/env" + "github.com/ActiveMemory/ctx/internal/config/file" + "github.com/ActiveMemory/ctx/internal/config/hook" + "github.com/ActiveMemory/ctx/internal/config/journal" + "github.com/ActiveMemory/ctx/internal/config/tpl" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/cli/system/core" + ctxcontext "github.com/ActiveMemory/ctx/internal/context" + "github.com/ActiveMemory/ctx/internal/notify" +) + +// Run executes the check-journal hook logic. +// +// Checks for unexported Claude Code sessions and unenriched journal +// entries, then emits a journal reminder nudge if either is found. +// Throttled to once per day. 
+// +// Parameters: +// - cmd: Cobra command for output +// - stdin: standard input for hook JSON +// +// Returns: +// - error: Always nil (hook errors are non-fatal) +func Run(cmd *cobra.Command, stdin *os.File) error { + if !core.Initialized() { + return nil + } + input, _, paused := core.HookPreamble(stdin) + if paused { + return nil + } + + tmpDir := core.StateDir() + remindedFile := filepath.Join(tmpDir, journal.CheckJournalThrottleID) + claudeProjectsDir := filepath.Join( + os.Getenv(env.Home), journal.CheckJournalClaudeProjectsSubdir, + ) + + // Only remind once per day + if core.IsDailyThrottled(remindedFile) { + return nil + } + + // Bail out if journal or Claude projects directories don't exist + jDir := ctxcontext.ResolvedJournalDir() + if _, statErr := os.Stat(jDir); os.IsNotExist(statErr) { + return nil + } + if _, statErr := os.Stat(claudeProjectsDir); os.IsNotExist(statErr) { + return nil + } + + // Stage 1: Unexported sessions + newestJournal := core.NewestMtime(jDir, file.ExtMarkdown) + unexported := core.CountNewerFiles( + claudeProjectsDir, file.ExtJSONL, newestJournal, + ) + + // Stage 2: Unenriched entries + unenriched := core.CountUnenriched(jDir) + + if unexported == 0 && unenriched == 0 { + return nil + } + + vars := map[string]any{ + tpl.VarUnexportedCount: unexported, + tpl.VarUnenrichedCount: unenriched, + } + + var variant, fallback string + switch { + case unexported > 0 && unenriched > 0: + variant = hook.VariantBoth + fallback = fmt.Sprintf(assets.TextDesc( + assets.TextDescKeyCheckJournalFallbackBoth), unexported, unenriched, + ) + case unexported > 0: + variant = hook.VariantUnexported + fallback = fmt.Sprintf(assets.TextDesc( + assets.TextDescKeyCheckJournalFallbackUnexported), unexported, + ) + default: + variant = hook.VariantUnenriched + fallback = fmt.Sprintf(assets.TextDesc( + assets.TextDescKeyCheckJournalFallbackUnenriched), unenriched, + ) + } + + content := core.LoadMessage(hook.CheckJournal, variant, vars, fallback) + if 
content == "" { + return nil + } + + boxTitle := assets.TextDesc(assets.TextDescKeyCheckJournalBoxTitle) + relayPrefix := assets.TextDesc(assets.TextDescKeyCheckJournalRelayPrefix) + + cmd.Println(core.NudgeBox(relayPrefix, boxTitle, content)) + + ref := notify.NewTemplateRef(hook.CheckJournal, variant, vars) + journalMsg := hook.CheckJournal + ": " + fmt.Sprintf( + assets.TextDesc(assets.TextDescKeyCheckJournalRelayFormat), + unexported, unenriched, + ) + core.NudgeAndRelay(journalMsg, input.SessionID, ref) + + core.TouchFile(remindedFile) + return nil +} diff --git a/internal/cli/system/cmd/check_knowledge/cmd.go b/internal/cli/system/cmd/check_knowledge/cmd.go new file mode 100644 index 00000000..c1af401d --- /dev/null +++ b/internal/cli/system/cmd/check_knowledge/cmd.go @@ -0,0 +1,33 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package check_knowledge + +import ( + "os" + + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the "ctx system check-knowledge" subcommand. +// +// Returns: +// - *cobra.Command: Configured check-knowledge subcommand +func Cmd() *cobra.Command { + short, long := assets.CommandDesc(assets.CmdDescKeySystemCheckKnowledge) + + return &cobra.Command{ + Use: "check-knowledge", + Short: short, + Long: long, + Hidden: true, + RunE: func(cmd *cobra.Command, _ []string) error { + return Run(cmd, os.Stdin) + }, + } +} diff --git a/internal/cli/system/cmd/check_knowledge/doc.go b/internal/cli/system/cmd/check_knowledge/doc.go new file mode 100644 index 00000000..7c343399 --- /dev/null +++ b/internal/cli/system/cmd/check_knowledge/doc.go @@ -0,0 +1,12 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +// Package check_knowledge implements the ctx system check-knowledge +// subcommand. +// +// It counts entries in DECISIONS.md, LEARNINGS.md, and CONVENTIONS.md +// and nudges when any file exceeds its configured growth threshold. +package check_knowledge diff --git a/internal/cli/system/cmd/check_knowledge/run.go b/internal/cli/system/cmd/check_knowledge/run.go new file mode 100644 index 00000000..9a262702 --- /dev/null +++ b/internal/cli/system/cmd/check_knowledge/run.go @@ -0,0 +1,52 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package check_knowledge + +import ( + "os" + "path/filepath" + + "github.com/ActiveMemory/ctx/internal/config/knowledge" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/cli/system/core" +) + +// Run executes the check-knowledge hook logic. +// +// Reads hook input from stdin, checks knowledge file sizes against +// configured thresholds (entry counts for DECISIONS.md and LEARNINGS.md, +// line count for CONVENTIONS.md), and emits a relay warning if any +// file exceeds its limit. Throttled to once per day. 
+// +// Parameters: +// - cmd: Cobra command for output +// - stdin: standard input for hook JSON +// +// Returns: +// - error: Always nil (hook errors are non-fatal) +func Run(cmd *cobra.Command, stdin *os.File) error { + if !core.Initialized() { + return nil + } + + _, sessionID, paused := core.HookPreamble(stdin) + if paused { + return nil + } + + markerPath := filepath.Join(core.StateDir(), knowledge.KnowledgeThrottleID) + if core.IsDailyThrottled(markerPath) { + return nil + } + + if core.CheckKnowledgeHealth(cmd, sessionID) { + core.TouchFile(markerPath) + } + + return nil +} diff --git a/internal/cli/system/cmd/check_map_staleness/cmd.go b/internal/cli/system/cmd/check_map_staleness/cmd.go new file mode 100644 index 00000000..57e0e9b3 --- /dev/null +++ b/internal/cli/system/cmd/check_map_staleness/cmd.go @@ -0,0 +1,33 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package check_map_staleness + +import ( + "os" + + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the "ctx system check-map-staleness" subcommand. +// +// Returns: +// - *cobra.Command: Configured check-map-staleness subcommand +func Cmd() *cobra.Command { + short, long := assets.CommandDesc(assets.CmdDescKeySystemCheckMapStaleness) + + return &cobra.Command{ + Use: "check-map-staleness", + Short: short, + Long: long, + Hidden: true, + RunE: func(cmd *cobra.Command, _ []string) error { + return Run(cmd, os.Stdin) + }, + } +} diff --git a/internal/cli/system/cmd/check_map_staleness/doc.go b/internal/cli/system/cmd/check_map_staleness/doc.go new file mode 100644 index 00000000..fc278b9c --- /dev/null +++ b/internal/cli/system/cmd/check_map_staleness/doc.go @@ -0,0 +1,12 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +// Package check_map_staleness implements the ctx system check-map-staleness +// subcommand. +// +// It detects when context map files have not been updated within the +// configured staleness window and nudges regeneration. +package check_map_staleness diff --git a/internal/cli/system/cmd/check_map_staleness/run.go b/internal/cli/system/cmd/check_map_staleness/run.go new file mode 100644 index 00000000..a27021e7 --- /dev/null +++ b/internal/cli/system/cmd/check_map_staleness/run.go @@ -0,0 +1,74 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package check_map_staleness + +import ( + "os" + "path/filepath" + "time" + + "github.com/ActiveMemory/ctx/internal/config/architecture" + time2 "github.com/ActiveMemory/ctx/internal/config/time" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/cli/system/core" +) + +// Run executes the check-map-staleness hook logic. +// +// Reads hook input from stdin, checks the map-tracking.json file for +// stale architecture mapping, counts commits touching internal/ since +// the last refresh, and emits a relay nudge if the map is stale and +// there are relevant commits. Throttled to once per day. 
+// +// Parameters: +// - cmd: Cobra command for output +// - stdin: standard input for hook JSON +// +// Returns: +// - error: Always nil (hook errors are non-fatal) +func Run(cmd *cobra.Command, stdin *os.File) error { + if !core.Initialized() { + return nil + } + + input, _, paused := core.HookPreamble(stdin) + if paused { + return nil + } + markerPath := filepath.Join(core.StateDir(), architecture.MapStalenessThrottleID) + if core.IsDailyThrottled(markerPath) { + return nil + } + + info := core.ReadMapTracking() + if info == nil || info.OptedOut { + return nil + } + + lastRun, parseErr := time.Parse(time2.DateFormat, info.LastRun) + if parseErr != nil { + return nil + } + + if time.Since(lastRun) < time.Duration(architecture.MapStaleDays)*24*time.Hour { + return nil + } + + // Count commits touching internal/ since last run + moduleCommits := core.CountModuleCommits(info.LastRun) + if moduleCommits == 0 { + return nil + } + + dateStr := lastRun.Format(time2.DateFormat) + core.EmitMapStalenessWarning(cmd, input.SessionID, dateStr, moduleCommits) + + core.TouchFile(markerPath) + + return nil +} diff --git a/internal/cli/system/cmd/check_memory_drift/cmd.go b/internal/cli/system/cmd/check_memory_drift/cmd.go new file mode 100644 index 00000000..07680da3 --- /dev/null +++ b/internal/cli/system/cmd/check_memory_drift/cmd.go @@ -0,0 +1,32 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package check_memory_drift + +import ( + "os" + + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the "ctx system check-memory-drift" subcommand. 
+// +// Returns: +// - *cobra.Command: Configured check-memory-drift subcommand +func Cmd() *cobra.Command { + short, _ := assets.CommandDesc(assets.CmdDescKeySystemCheckMemoryDrift) + + return &cobra.Command{ + Use: "check-memory-drift", + Short: short, + Hidden: true, + RunE: func(cmd *cobra.Command, _ []string) error { + return Run(cmd, os.Stdin) + }, + } +} diff --git a/internal/cli/system/cmd/check_memory_drift/doc.go b/internal/cli/system/cmd/check_memory_drift/doc.go new file mode 100644 index 00000000..df58465d --- /dev/null +++ b/internal/cli/system/cmd/check_memory_drift/doc.go @@ -0,0 +1,12 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package check_memory_drift implements the ctx system check-memory-drift +// subcommand. +// +// It detects drift between the agent's working memory and the persisted +// MEMORY.md file and nudges synchronization. +package check_memory_drift diff --git a/internal/cli/system/cmd/check_memory_drift/run.go b/internal/cli/system/cmd/check_memory_drift/run.go new file mode 100644 index 00000000..9abf41c5 --- /dev/null +++ b/internal/cli/system/cmd/check_memory_drift/run.go @@ -0,0 +1,72 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package check_memory_drift + +import ( + "os" + "path/filepath" + + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/cli/system/core" + "github.com/ActiveMemory/ctx/internal/config/hook" + "github.com/ActiveMemory/ctx/internal/memory" + "github.com/ActiveMemory/ctx/internal/notify" + "github.com/ActiveMemory/ctx/internal/rc" + "github.com/spf13/cobra" +) + +// Run executes the check-memory-drift hook logic. 
+func Run(cmd *cobra.Command, stdin *os.File) error { + if !core.Initialized() { + return nil + } + + input, sessionID, paused := core.HookPreamble(stdin) + if paused { + return nil + } + + // Session tombstone: nudge once per session, per session ID + tombstone := filepath.Join(core.StateDir(), hook.PrefixMemoryDriftThrottle+sessionID) + if _, statErr := os.Stat(tombstone); statErr == nil { + return nil + } + + contextDir := rc.ContextDir() + projectRoot := filepath.Dir(contextDir) + + sourcePath, discoverErr := memory.DiscoverMemoryPath(projectRoot) + if discoverErr != nil { + // Auto memory not active — skip silently + return nil + } + + if !memory.HasDrift(contextDir, sourcePath) { + return nil + } + + fallback := assets.TextDesc(assets.TextDescKeyCheckMemoryDriftContent) + content := core.LoadMessage(hook.CheckMemoryDrift, hook.VariantNudge, nil, fallback) + if content == "" { + return nil + } + + cmd.Println(core.NudgeBox( + assets.TextDesc(assets.TextDescKeyCheckMemoryDriftRelayPrefix), + assets.TextDesc(assets.TextDescKeyCheckMemoryDriftBoxTitle), + content)) + + ref := notify.NewTemplateRef(hook.CheckMemoryDrift, hook.VariantNudge, nil) + core.NudgeAndRelay( + hook.CheckMemoryDrift+": "+assets.TextDesc(assets.TextDescKeyCheckMemoryDriftRelayMessage), + input.SessionID, ref, + ) + + core.TouchFile(tombstone) + + return nil +} diff --git a/internal/cli/system/cmd/check_persistence/cmd.go b/internal/cli/system/cmd/check_persistence/cmd.go new file mode 100644 index 00000000..be20208f --- /dev/null +++ b/internal/cli/system/cmd/check_persistence/cmd.go @@ -0,0 +1,33 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package check_persistence + +import ( + "os" + + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the "ctx system check-persistence" subcommand. 
+// +// Returns: +// - *cobra.Command: Configured check-persistence subcommand +func Cmd() *cobra.Command { + short, long := assets.CommandDesc(assets.CmdDescKeySystemCheckPersistence) + + return &cobra.Command{ + Use: "check-persistence", + Short: short, + Long: long, + Hidden: true, + RunE: func(cmd *cobra.Command, _ []string) error { + return Run(cmd, os.Stdin) + }, + } +} diff --git a/internal/cli/system/cmd/check_persistence/doc.go b/internal/cli/system/cmd/check_persistence/doc.go new file mode 100644 index 00000000..c938a9ac --- /dev/null +++ b/internal/cli/system/cmd/check_persistence/doc.go @@ -0,0 +1,12 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package check_persistence implements the ctx system check-persistence +// subcommand. +// +// It tracks prompts since the last .context/ file modification and nudges +// the agent to persist learnings, decisions, or task updates. +package check_persistence diff --git a/internal/cli/system/cmd/check_persistence/run.go b/internal/cli/system/cmd/check_persistence/run.go new file mode 100644 index 00000000..f31f82df --- /dev/null +++ b/internal/cli/system/cmd/check_persistence/run.go @@ -0,0 +1,110 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +package check_persistence + +import ( + "fmt" + "os" + "path/filepath" + + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/config/hook" + "github.com/ActiveMemory/ctx/internal/config/nudge" + "github.com/ActiveMemory/ctx/internal/config/tpl" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/cli/system/core" + "github.com/ActiveMemory/ctx/internal/notify" + "github.com/ActiveMemory/ctx/internal/rc" +) + +// Run executes the check-persistence hook logic. +// +// Tracks how many prompts have passed without any context file updates +// and emits a persistence nudge when the threshold is reached. State is +// stored per-session in the .context/state/ directory. +// +// Parameters: +// - cmd: Cobra command for output +// - stdin: standard input for hook JSON +// +// Returns: +// - error: Always nil (hook errors are non-fatal) +func Run(cmd *cobra.Command, stdin *os.File) error { + if !core.Initialized() { + return nil + } + _, sessionID, paused := core.HookPreamble(stdin) + if paused { + return nil + } + + tmpDir := core.StateDir() + stateFile := filepath.Join(tmpDir, nudge.PersistenceNudgePrefix+sessionID) + contextDir := rc.ContextDir() + logFile := filepath.Join(contextDir, dir.Logs, nudge.PersistenceLogFile) + + // Initialize state if needed + ps, exists := core.ReadPersistenceState(stateFile) + if !exists { + initialMtime := core.GetLatestContextMtime(contextDir) + ps = core.PersistenceState{ + Count: 1, + LastNudge: 0, + LastMtime: initialMtime, + } + core.WritePersistenceState(stateFile, ps) + core.LogMessage(logFile, sessionID, fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckPersistenceInitLogFormat), initialMtime)) + return nil + } + + ps.Count++ + currentMtime := core.GetLatestContextMtime(contextDir) + + // If context files were modified since last check, reset the nudge counter + if currentMtime > 
ps.LastMtime { + ps.LastNudge = ps.Count + ps.LastMtime = currentMtime + core.WritePersistenceState(stateFile, ps) + core.LogMessage(logFile, sessionID, fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckPersistenceModifiedLogFormat), ps.Count)) + return nil + } + + sinceNudge := ps.Count - ps.LastNudge + + if core.PersistenceNudgeNeeded(ps.Count, sinceNudge) { + fallback := fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckPersistenceFallback), sinceNudge) + content := core.LoadMessage(hook.CheckPersistence, hook.VariantNudge, + map[string]any{ + tpl.VarPromptCount: ps.Count, + tpl.VarPromptsSinceNudge: sinceNudge, + }, fallback) + if content == "" { + core.LogMessage(logFile, sessionID, fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckPersistenceSilencedLogFormat), ps.Count)) + core.WritePersistenceState(stateFile, ps) + return nil + } + + boxTitle := assets.TextDesc(assets.TextDescKeyCheckPersistenceBoxTitle) + relayPrefix := assets.TextDesc(assets.TextDescKeyCheckPersistenceRelayPrefix) + + cmd.Println(core.NudgeBox(relayPrefix, fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckPersistenceBoxTitleFormat), boxTitle, ps.Count), content)) + cmd.Println() + core.LogMessage(logFile, sessionID, fmt.Sprintf("prompt#%d NUDGE since_nudge=%d", ps.Count, sinceNudge)) + ref := notify.NewTemplateRef(hook.CheckPersistence, hook.VariantNudge, + map[string]any{tpl.VarPromptCount: ps.Count, tpl.VarPromptsSinceNudge: sinceNudge}) + _ = notify.Send(hook.NotifyChannelNudge, hook.CheckPersistence+": "+fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckPersistenceCheckpointFormat), ps.Count), sessionID, ref) + core.Relay(hook.CheckPersistence+": "+fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckPersistenceRelayFormat), sinceNudge), sessionID, ref) + ps.LastNudge = ps.Count + } else { + core.LogMessage(logFile, sessionID, fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckPersistenceSilentLogFormat), ps.Count, sinceNudge)) + } + + core.WritePersistenceState(stateFile, ps) + 
return nil +} diff --git a/internal/cli/system/cmd/check_reminders/cmd.go b/internal/cli/system/cmd/check_reminders/cmd.go new file mode 100644 index 00000000..c7eb94c7 --- /dev/null +++ b/internal/cli/system/cmd/check_reminders/cmd.go @@ -0,0 +1,32 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package check_reminders + +import ( + "os" + + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the "ctx system check-reminders" subcommand. +// +// Returns: +// - *cobra.Command: Configured check-reminders subcommand +func Cmd() *cobra.Command { + short, _ := assets.CommandDesc(assets.CmdDescKeySystemCheckReminders) + + return &cobra.Command{ + Use: "check-reminders", + Short: short, + Hidden: true, + RunE: func(cmd *cobra.Command, _ []string) error { + return Run(cmd, os.Stdin) + }, + } +} diff --git a/internal/cli/system/cmd/check_reminders/doc.go b/internal/cli/system/cmd/check_reminders/doc.go new file mode 100644 index 00000000..cc4ba0a7 --- /dev/null +++ b/internal/cli/system/cmd/check_reminders/doc.go @@ -0,0 +1,12 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package check_reminders implements the ctx system check-reminders +// subcommand. +// +// It surfaces pending reminders at session start so the agent can act +// on deferred tasks. +package check_reminders diff --git a/internal/cli/system/cmd/check_reminders/run.go b/internal/cli/system/cmd/check_reminders/run.go new file mode 100644 index 00000000..92296d83 --- /dev/null +++ b/internal/cli/system/cmd/check_reminders/run.go @@ -0,0 +1,90 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +package check_reminders + +import ( + "fmt" + "os" + "time" + + "github.com/ActiveMemory/ctx/internal/config/hook" + time2 "github.com/ActiveMemory/ctx/internal/config/time" + "github.com/ActiveMemory/ctx/internal/config/token" + "github.com/ActiveMemory/ctx/internal/config/tpl" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" + remindcore "github.com/ActiveMemory/ctx/internal/cli/remind/core" + "github.com/ActiveMemory/ctx/internal/cli/system/core" + "github.com/ActiveMemory/ctx/internal/notify" +) + +// Run executes the check-reminders hook logic. +// +// Reads hook input from stdin, loads pending reminders, filters to those +// that are due today or earlier, then emits a relay box with the reminder +// list if any are due. Non-fatal on all errors. +// +// Parameters: +// - cmd: Cobra command for output +// - stdin: standard input for hook JSON +// +// Returns: +// - error: Always nil (hook errors are non-fatal) +func Run(cmd *cobra.Command, stdin *os.File) error { + if !core.Initialized() { + return nil + } + + input, _, paused := core.HookPreamble(stdin) + if paused { + return nil + } + + reminders, readErr := remindcore.ReadReminders() + if readErr != nil { + return nil // non-fatal: don't break session start + } + + today := time.Now().Format(time2.DateFormat) + var due []remindcore.Reminder + for _, r := range reminders { + if r.After == nil || *r.After <= today { + due = append(due, r) + } + } + + if len(due) == 0 { + return nil + } + + // Build a pre-formatted reminder list for the template variable + var reminderList string + for _, r := range due { + reminderList += fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckRemindersItemFormat)+token.NewlineLF, r.ID, r.Message) + } + + fallback := reminderList + + token.NewlineLF + assets.TextDesc(assets.TextDescKeyCheckRemindersDismissHint) + token.NewlineLF + + assets.TextDesc(assets.TextDescKeyCheckRemindersDismissAllHint) + vars := 
map[string]any{tpl.VarReminderList: reminderList} + content := core.LoadMessage(hook.CheckReminders, hook.VariantReminders, vars, fallback) + if content == "" { + return nil + } + + cmd.Println(core.NudgeBox( + assets.TextDesc(assets.TextDescKeyCheckRemindersRelayPrefix), + assets.TextDesc(assets.TextDescKeyCheckRemindersBoxTitle), + content)) + + ref := notify.NewTemplateRef(hook.CheckReminders, hook.VariantReminders, vars) + nudgeMsg := hook.CheckReminders + ": " + fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckRemindersNudgeFormat), len(due)) + core.NudgeAndRelay(nudgeMsg, input.SessionID, ref) + + return nil +} diff --git a/internal/cli/system/cmd/check_resources/cmd.go b/internal/cli/system/cmd/check_resources/cmd.go new file mode 100644 index 00000000..a4133cbc --- /dev/null +++ b/internal/cli/system/cmd/check_resources/cmd.go @@ -0,0 +1,33 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package check_resources + +import ( + "os" + + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the "ctx system check-resources" subcommand. +// +// Returns: +// - *cobra.Command: Configured check-resources subcommand +func Cmd() *cobra.Command { + short, long := assets.CommandDesc(assets.CmdDescKeySystemCheckResources) + + return &cobra.Command{ + Use: "check-resources", + Short: short, + Long: long, + Hidden: true, + RunE: func(cmd *cobra.Command, _ []string) error { + return Run(cmd, os.Stdin) + }, + } +} diff --git a/internal/cli/system/cmd/check_resources/doc.go b/internal/cli/system/cmd/check_resources/doc.go new file mode 100644 index 00000000..2d3a8126 --- /dev/null +++ b/internal/cli/system/cmd/check_resources/doc.go @@ -0,0 +1,12 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +// Package check_resources implements the ctx system check-resources +// subcommand. +// +// It collects system resource metrics (memory, swap, disk, load) and +// emits a warning when any resource hits danger severity. +package check_resources diff --git a/internal/cli/system/cmd/check_resources/run.go b/internal/cli/system/cmd/check_resources/run.go new file mode 100644 index 00000000..f5b44a5f --- /dev/null +++ b/internal/cli/system/cmd/check_resources/run.go @@ -0,0 +1,87 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package check_resources + +import ( + "os" + + "github.com/ActiveMemory/ctx/internal/config/hook" + "github.com/ActiveMemory/ctx/internal/config/token" + "github.com/ActiveMemory/ctx/internal/config/tpl" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/cli/system/core" + "github.com/ActiveMemory/ctx/internal/notify" + "github.com/ActiveMemory/ctx/internal/sysinfo" + "github.com/ActiveMemory/ctx/internal/write" +) + +// Run executes the check-resources hook logic. +// +// Collects system resource snapshots, evaluates alert thresholds, and +// emits a relay warning box when any resource is at danger level. +// Throttled by session pause state. 
+// +// Parameters: +// - cmd: Cobra command for output +// - stdin: standard input for hook JSON +// +// Returns: +// - error: Always nil (hook errors are non-fatal) +func Run(cmd *cobra.Command, stdin *os.File) error { + input, _, paused := core.HookPreamble(stdin) + if paused { + return nil + } + + snap := sysinfo.Collect(".") + alerts := sysinfo.Evaluate(snap) + + if sysinfo.MaxSeverity(alerts) < sysinfo.SeverityDanger { + return nil + } + + // Build pre-formatted alert messages for the template variable + var alertMessages string + for _, a := range alerts { + if a.Severity == sysinfo.SeverityDanger { + alertMessages += "✖ " + + a.Message + token.NewlineLF + } + } + + fallback := alertMessages + + token.NewlineLF + assets.TextDesc( + assets.TextDescKeyCheckResourcesFallbackLow) + token.NewlineLF + + assets.TextDesc( + assets.TextDescKeyCheckResourcesFallbackPersist) + token.NewlineLF + + assets.TextDesc( + assets.TextDescKeyCheckResourcesFallbackEnd) + vars := map[string]any{tpl.VarAlertMessages: alertMessages} + content := core.LoadMessage( + hook.CheckResources, hook.VariantAlert, vars, fallback, + ) + if content == "" { + return nil + } + + write.HookNudge(cmd, core.NudgeBox( + assets.TextDesc(assets.TextDescKeyCheckResourcesRelayPrefix), + assets.TextDesc(assets.TextDescKeyCheckResourcesBoxTitle), + content)) + + ref := notify.NewTemplateRef( + hook.CheckResources, hook.VariantAlert, vars, + ) + core.NudgeAndRelay(hook.CheckResources+": "+ + assets.TextDesc(assets.TextDescKeyCheckResourcesRelayMessage), + input.SessionID, ref, + ) + + return nil +} diff --git a/internal/cli/system/cmd/check_task_completion/cmd.go b/internal/cli/system/cmd/check_task_completion/cmd.go new file mode 100644 index 00000000..1f8e2d08 --- /dev/null +++ b/internal/cli/system/cmd/check_task_completion/cmd.go @@ -0,0 +1,33 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +package check_task_completion + +import ( + "os" + + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the "ctx system check-task-completion" subcommand. +// +// Returns: +// - *cobra.Command: Configured check-task-completion subcommand +func Cmd() *cobra.Command { + short, long := assets.CommandDesc(assets.CmdDescKeySystemCheckTaskCompletion) + + return &cobra.Command{ + Use: "check-task-completion", + Short: short, + Long: long, + Hidden: true, + RunE: func(cmd *cobra.Command, _ []string) error { + return Run(cmd, os.Stdin) + }, + } +} diff --git a/internal/cli/system/cmd/check_task_completion/doc.go b/internal/cli/system/cmd/check_task_completion/doc.go new file mode 100644 index 00000000..9d5fdd68 --- /dev/null +++ b/internal/cli/system/cmd/check_task_completion/doc.go @@ -0,0 +1,12 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package check_task_completion implements the ctx system +// check-task-completion subcommand. +// +// It counts Edit/Write tool calls and periodically nudges the agent to +// check whether any tasks should be marked done in TASKS.md. +package check_task_completion diff --git a/internal/cli/system/cmd/check_task_completion/run.go b/internal/cli/system/cmd/check_task_completion/run.go new file mode 100644 index 00000000..3835df20 --- /dev/null +++ b/internal/cli/system/cmd/check_task_completion/run.go @@ -0,0 +1,79 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +package check_task_completion + +import ( + "os" + "path/filepath" + + "github.com/ActiveMemory/ctx/internal/config/nudge" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/cli/system/core" + "github.com/ActiveMemory/ctx/internal/config/hook" + "github.com/ActiveMemory/ctx/internal/notify" + "github.com/ActiveMemory/ctx/internal/rc" +) + +// Run executes the check-task-completion hook logic. +// +// Tracks a per-session prompt counter and emits a task completion nudge +// when the counter reaches the configured interval. The counter resets +// after each nudge. Disabled when the nudge interval is zero or negative. +// +// Parameters: +// - cmd: Cobra command for output +// - stdin: standard input for hook JSON +// +// Returns: +// - error: Always nil (hook errors are non-fatal) +func Run(cmd *cobra.Command, stdin *os.File) error { + if !core.Initialized() { + return nil + } + input, sessionID, paused := core.HookPreamble(stdin) + if paused { + return nil + } + + interval := rc.TaskNudgeInterval() + if interval <= 0 { + return nil + } + + counterPath := filepath.Join(core.StateDir(), nudge.PrefixTask+sessionID) + count := core.ReadCounter(counterPath) + count++ + + if count < interval { + core.WriteCounter(counterPath, count) + return nil + } + + // Threshold reached — reset and nudge. 
+ core.WriteCounter(counterPath, 0) + + fallback := assets.TextDesc(assets.TextDescKeyCheckTaskCompletionFallback) + msg := core.LoadMessage( + hook.CheckTaskCompletion, hook.VariantNudge, nil, fallback, + ) + if msg == "" { + return nil + } + core.PrintHookContext(cmd, hook.EventPostToolUse, msg) + + nudgeMsg := assets.TextDesc(assets.TextDescKeyCheckTaskCompletionNudgeMessage) + ref := notify.NewTemplateRef( + hook.CheckTaskCompletion, hook.VariantNudge, nil, + ) + core.Relay( + hook.CheckTaskCompletion+": "+nudgeMsg, input.SessionID, ref, + ) + + return nil +} diff --git a/internal/cli/system/cmd/check_version/cmd.go b/internal/cli/system/cmd/check_version/cmd.go new file mode 100644 index 00000000..26fc7682 --- /dev/null +++ b/internal/cli/system/cmd/check_version/cmd.go @@ -0,0 +1,33 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package check_version + +import ( + "os" + + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the "ctx system check-version" subcommand. +// +// Returns: +// - *cobra.Command: Configured check-version subcommand +func Cmd() *cobra.Command { + short, long := assets.CommandDesc(assets.CmdDescKeySystemCheckVersion) + + return &cobra.Command{ + Use: "check-version", + Short: short, + Long: long, + Hidden: true, + RunE: func(cmd *cobra.Command, _ []string) error { + return Run(cmd, os.Stdin) + }, + } +} diff --git a/internal/cli/system/cmd/check_version/doc.go b/internal/cli/system/cmd/check_version/doc.go new file mode 100644 index 00000000..d0c3ed7c --- /dev/null +++ b/internal/cli/system/cmd/check_version/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package check_version implements the ctx system check-version subcommand. 
+// +// It compares the ctx binary version against the embedded plugin version +// and warns when the binary is older than the plugin expects. +package check_version diff --git a/internal/cli/system/cmd/check_version/run.go b/internal/cli/system/cmd/check_version/run.go new file mode 100644 index 00000000..7737d9c3 --- /dev/null +++ b/internal/cli/system/cmd/check_version/run.go @@ -0,0 +1,117 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package check_version + +import ( + "fmt" + "os" + "path/filepath" + + "github.com/ActiveMemory/ctx/internal/config/hook" + "github.com/ActiveMemory/ctx/internal/config/tpl" + "github.com/ActiveMemory/ctx/internal/config/version" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/cli/system/core" + "github.com/ActiveMemory/ctx/internal/notify" +) + +// Run executes the check-version hook logic. +// +// Compares the binary version against the embedded plugin version and +// emits a version mismatch warning if they differ. Also piggybacks +// a key rotation age check. Throttled to once per day. 
+// +// Parameters: +// - cmd: Cobra command for output +// - stdin: standard input for hook JSON +// +// Returns: +// - error: Always nil (hook errors are non-fatal) +func Run(cmd *cobra.Command, stdin *os.File) error { + if !core.Initialized() { + return nil + } + + input, _, paused := core.HookPreamble(stdin) + if paused { + return nil + } + + tmpDir := core.StateDir() + markerFile := filepath.Join(tmpDir, version.ThrottleID) + + if core.IsDailyThrottled(markerFile) { + return nil + } + + binaryVer := cmd.Root().Version + + // Skip check for dev builds + if binaryVer == version.DevBuild { + core.TouchFile(markerFile) + return nil + } + + pluginVer, pluginErr := assets.PluginVersion() + if pluginErr != nil { + return nil // embedded plugin.json missing — nothing to compare + } + + bMajor, bMinor, bOK := core.ParseMajorMinor(binaryVer) + pMajor, pMinor, pOK := core.ParseMajorMinor(pluginVer) + + if !bOK || !pOK { + core.TouchFile(markerFile) + return nil + } + + if bMajor == pMajor && bMinor == pMinor { + core.TouchFile(markerFile) + return nil + } + + // Version mismatch — emit warning + fallback := fmt.Sprintf(assets.TextDesc( + assets.TextDescKeyCheckVersionFallback), binaryVer, pluginVer, + ) + content := core.LoadMessage(hook.CheckVersion, hook.VariantMismatch, + map[string]any{ + tpl.VarBinaryVersion: binaryVer, + tpl.VarPluginVersion: pluginVer, + }, fallback) + if content == "" { + core.TouchFile(markerFile) + return nil + } + + boxTitle := assets.TextDesc(assets.TextDescKeyCheckVersionBoxTitle) + relayPrefix := assets.TextDesc(assets.TextDescKeyCheckVersionRelayPrefix) + + cmd.Println(core.NudgeBox(relayPrefix, boxTitle, content)) + + ref := notify.NewTemplateRef(hook.CheckVersion, hook.VariantMismatch, + map[string]any{ + tpl.VarBinaryVersion: binaryVer, + tpl.VarPluginVersion: pluginVer, + }) + versionMsg := hook.CheckVersion + ": " + + fmt.Sprintf( + assets.TextDesc( + assets.TextDescKeyCheckVersionMismatchRelayFormat, + ), binaryVer, pluginVer, + ) + 
core.NudgeAndRelay(versionMsg, input.SessionID, ref) + + core.TouchFile(markerFile) + + // Key age check — piggyback on the daily version check + core.CheckKeyAge(cmd, input.SessionID) + + return nil +} diff --git a/internal/cli/system/cmd/checkbackupage/cmd.go b/internal/cli/system/cmd/checkbackupage/cmd.go deleted file mode 100644 index e8381995..00000000 --- a/internal/cli/system/cmd/checkbackupage/cmd.go +++ /dev/null @@ -1,162 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package checkbackupage - -import ( - "fmt" - "os" - "path/filepath" - "time" - - "github.com/spf13/cobra" - - "github.com/ActiveMemory/ctx/internal/cli/system/core" - "github.com/ActiveMemory/ctx/internal/config" - "github.com/ActiveMemory/ctx/internal/notify" -) - -const ( - backupMaxAgeDays = 2 - backupThrottleID = "backup-reminded" -) - -// Cmd returns the "ctx system check-backup-age" subcommand. -// -// Returns: -// - *cobra.Command: Configured check-backup-age subcommand -func Cmd() *cobra.Command { - return &cobra.Command{ - Use: "check-backup-age", - Short: "Backup staleness check hook", - Long: `Checks if the .context backup is stale (>2 days old) or the SMB share -is unmounted. Outputs a VERBATIM relay warning when issues are found. -Throttled to once per day. - -Environment: - CTX_BACKUP_SMB_URL - SMB share URL (e.g. smb://myhost/myshare). - If unset, the SMB mount check is skipped. 
- -Hook event: UserPromptSubmit -Output: VERBATIM relay with warning box, silent otherwise -Silent when: backup is fresh, or already checked today`, - Hidden: true, - RunE: func(cmd *cobra.Command, _ []string) error { - return runCheckBackupAge(cmd, os.Stdin) - }, - } -} - -func runCheckBackupAge(cmd *cobra.Command, stdin *os.File) error { - input := core.ReadInput(stdin) - - sessionID := input.SessionID - if sessionID == "" { - sessionID = core.SessionUnknown - } - if core.Paused(sessionID) > 0 { - return nil - } - - tmpDir := core.StateDir() - throttleFile := filepath.Join(tmpDir, backupThrottleID) - - if core.IsDailyThrottled(throttleFile) { - return nil - } - - home, homeErr := os.UserHomeDir() - if homeErr != nil { - return nil - } - - var warnings []string - - // Check 1: Is the SMB share mounted? - if smbURL := os.Getenv(config.EnvBackupSMBURL); smbURL != "" { - warnings = checkSMBMountWarnings(smbURL, warnings) - } - - // Check 2: Is the backup stale? - markerPath := filepath.Join(home, ".local", "state", config.BackupMarkerFile) - warnings = checkBackupMarker(markerPath, warnings) - - if len(warnings) == 0 { - return nil - } - - // Build pre-formatted warnings for the template variable - var warningText string - for _, w := range warnings { - warningText += w + config.NewlineLF - } - - content := core.LoadMessage("check-backup-age", "warning", - map[string]any{"Warnings": warningText}, warningText) - if content == "" { - return nil - } - - // Emit VERBATIM relay - msg := "IMPORTANT: Relay this backup warning to the user VERBATIM before answering their question.\n\n" + - "┌─ Backup Warning ──────────────────────────────────\n" - msg += core.BoxLines(content) - if line := core.ContextDirLine(); line != "" { - msg += "│ " + line + config.NewlineLF - } - msg += config.NudgeBoxBottom - cmd.Println(msg) - - ref := notify.NewTemplateRef("check-backup-age", "warning", - map[string]any{"Warnings": warningText}) - _ = notify.Send("nudge", "check-backup-age: Backup 
warning", input.SessionID, ref) - _ = notify.Send("relay", "check-backup-age: Backup warning", input.SessionID, ref) - - core.TouchFile(throttleFile) - - return nil -} - -// checkSMBMountWarnings checks if the GVFS mount for the given SMB URL exists. -func checkSMBMountWarnings(smbURL string, warnings []string) []string { - cfg, cfgErr := core.ParseSMBConfig(smbURL, "") - if cfgErr != nil { - return warnings - } - - if _, statErr := os.Stat(cfg.GVFSPath); os.IsNotExist(statErr) { - warnings = append(warnings, - fmt.Sprintf("SMB share (%s) is not mounted.", cfg.Host), - "Backups cannot run until it's available.", - ) - } - - return warnings -} - -// checkBackupMarker checks the backup marker file age. -func checkBackupMarker(markerPath string, warnings []string) []string { - info, statErr := os.Stat(markerPath) - if os.IsNotExist(statErr) { - return append(warnings, - "No backup marker found — backup may have never run.", - "Run: ctx system backup", - ) - } - if statErr != nil { - return warnings - } - - ageDays := int(time.Since(info.ModTime()).Hours() / 24) - if ageDays >= backupMaxAgeDays { - return append(warnings, - fmt.Sprintf("Last .context backup is %d days old.", ageDays), - "Run: ctx system backup", - ) - } - - return warnings -} diff --git a/internal/cli/system/cmd/checkceremonies/run.go b/internal/cli/system/cmd/checkceremonies/run.go deleted file mode 100644 index 9433eb81..00000000 --- a/internal/cli/system/cmd/checkceremonies/run.go +++ /dev/null @@ -1,179 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. 
-// SPDX-License-Identifier: Apache-2.0 - -package checkceremonies - -import ( - "os" - "path/filepath" - "sort" - "strings" - - "github.com/spf13/cobra" - - "github.com/ActiveMemory/ctx/internal/cli/system/core" - "github.com/ActiveMemory/ctx/internal/config" - "github.com/ActiveMemory/ctx/internal/eventlog" - "github.com/ActiveMemory/ctx/internal/notify" -) - -func runCheckCeremonies(cmd *cobra.Command, stdin *os.File) error { - if !core.IsInitialized() { - return nil - } - - input := core.ReadInput(stdin) - - sessionID := input.SessionID - if sessionID == "" { - sessionID = core.SessionUnknown - } - if core.Paused(sessionID) > 0 { - return nil - } - - tmpDir := core.StateDir() - remindedFile := filepath.Join(tmpDir, "ceremony-reminded") - - if core.IsDailyThrottled(remindedFile) { - return nil - } - - files := recentJournalFiles(core.ResolvedJournalDir(), 3) - - if len(files) == 0 { - return nil - } - - remember, wrapup := scanJournalsForCeremonies(files) - - if remember && wrapup { - return nil - } - - msg := emitCeremonyNudge(cmd, remember, wrapup) - if msg == "" { - return nil - } - var variant string - switch { - case !remember && !wrapup: - variant = core.VariantBoth - case !remember: - variant = "remember" - default: - variant = "wrapup" - } - ref := notify.NewTemplateRef("check-ceremonies", variant, nil) - _ = notify.Send("nudge", "check-ceremonies: Session ceremony nudge", input.SessionID, ref) - _ = notify.Send("relay", "check-ceremonies: Session ceremony nudge", input.SessionID, ref) - eventlog.Append("relay", "check-ceremonies: Session ceremony nudge", input.SessionID, ref) - core.TouchFile(remindedFile) - return nil -} - -// recentJournalFiles returns the n most recent .md files in the journal -// directory, sorted by filename descending. 
-func recentJournalFiles(dir string, n int) []string { - entries, readErr := os.ReadDir(dir) - if readErr != nil { - return nil - } - - var names []string - for _, e := range entries { - if e.IsDir() || !strings.HasSuffix(e.Name(), config.ExtMarkdown) { - continue - } - names = append(names, e.Name()) - } - - sort.Sort(sort.Reverse(sort.StringSlice(names))) - - if len(names) > n { - names = names[:n] - } - - paths := make([]string, len(names)) - for i, name := range names { - paths[i] = filepath.Join(dir, name) - } - return paths -} - -// scanJournalsForCeremonies checks whether the given journal files contain -// references to /ctx-remember and /ctx-wrap-up. -func scanJournalsForCeremonies(files []string) (remember, wrapup bool) { - for _, path := range files { - data, readErr := os.ReadFile(path) //nolint:gosec // journal file path - if readErr != nil { - continue - } - content := string(data) - if !remember && strings.Contains(content, "ctx-remember") { - remember = true - } - if !wrapup && strings.Contains(content, "ctx-wrap-up") { - wrapup = true - } - if remember && wrapup { - return - } - } - return -} - -func emitCeremonyNudge(cmd *cobra.Command, remember, wrapup bool) string { - var variant, boxTitle, fallback string - - switch { - case !remember && !wrapup: - variant = core.VariantBoth - boxTitle = "Session Ceremonies" - fallback = "Your last 3 sessions didn't use /ctx-remember or\n" + - "/ctx-wrap-up.\n" + - config.NewlineLF + - "Start sessions with: /ctx-remember\n" + - " → Loads context, shows active tasks, picks up\n" + - " where you left off. No re-explaining needed.\n" + - config.NewlineLF + - "End sessions with: /ctx-wrap-up\n" + - " → Captures learnings and decisions so the next\n" + - " session starts informed, not from scratch.\n" + - config.NewlineLF + - "These take seconds and save minutes." 
- case !remember: - variant = "remember" - boxTitle = "Session Start" - fallback = "Try starting this session with /ctx-remember\n" + - config.NewlineLF + - "It loads your context, shows active tasks, and\n" + - "picks up where you left off — no re-explaining." - case !wrapup: - variant = "wrapup" - boxTitle = "Session End" - fallback = "Your last 3 sessions didn't end with /ctx-wrap-up\n" + - config.NewlineLF + - "It captures learnings and decisions so the next\n" + - "session starts informed, not from scratch." - } - - content := core.LoadMessage("check-ceremonies", variant, nil, fallback) - if content == "" { - return "" - } - - msg := "IMPORTANT: Relay this session tip to the user VERBATIM before answering their question.\n\n" + - "┌─ " + boxTitle + " " + strings.Repeat("─", 51-len(boxTitle)) + config.NewlineLF - msg += core.BoxLines(content) - if dirFooter := core.ContextDirLine(); dirFooter != "" { - msg += "│ " + dirFooter + config.NewlineLF - } - msg += "└───────────────────────────────────────────────────" - - cmd.Println(msg) - return msg -} diff --git a/internal/cli/system/cmd/checkcontextsize/cmd.go b/internal/cli/system/cmd/checkcontextsize/cmd.go deleted file mode 100644 index fb0eb5eb..00000000 --- a/internal/cli/system/cmd/checkcontextsize/cmd.go +++ /dev/null @@ -1,42 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package checkcontextsize - -import ( - "os" - - "github.com/spf13/cobra" -) - -// Cmd returns the "ctx system check-context-size" subcommand. -// -// Returns: -// - *cobra.Command: Configured check-context-size subcommand -func Cmd() *cobra.Command { - return &cobra.Command{ - Use: "check-context-size", - Short: "Context size checkpoint hook", - Long: `Counts prompts per session and emits VERBATIM relay reminders at -adaptive intervals, prompting the user to consider wrapping up. 
- - Prompts 1-15: silent - Prompts 16-30: every 5th prompt - Prompts 30+: every 3rd prompt - -Also monitors actual context window token usage from session JSONL data. -Fires an independent warning when context window exceeds 80%, regardless -of prompt count. - -Hook event: UserPromptSubmit -Output: VERBATIM relay (when triggered), silent otherwise -Silent when: early in session or between checkpoints`, - Hidden: true, - RunE: func(cmd *cobra.Command, _ []string) error { - return runCheckContextSize(cmd, os.Stdin) - }, - } -} diff --git a/internal/cli/system/cmd/checkcontextsize/run.go b/internal/cli/system/cmd/checkcontextsize/run.go deleted file mode 100644 index 5ac8f33d..00000000 --- a/internal/cli/system/cmd/checkcontextsize/run.go +++ /dev/null @@ -1,275 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package checkcontextsize - -import ( - "fmt" - "os" - "path/filepath" - "regexp" - "strconv" - "time" - - "github.com/spf13/cobra" - - "github.com/ActiveMemory/ctx/internal/cli/system/core" - "github.com/ActiveMemory/ctx/internal/config" - "github.com/ActiveMemory/ctx/internal/eventlog" - "github.com/ActiveMemory/ctx/internal/notify" - "github.com/ActiveMemory/ctx/internal/rc" -) - -// contextWindowThresholdPct is the percentage of context window usage that -// triggers an independent warning, regardless of prompt count. 
-const contextWindowThresholdPct = 80 - -func runCheckContextSize(cmd *cobra.Command, stdin *os.File) error { - if !core.IsInitialized() { - return nil - } - input := core.ReadInput(stdin) - sessionID := input.SessionID - if sessionID == "" { - sessionID = core.SessionUnknown - } - - // Pause check — this hook is the designated single emitter - if turns := core.Paused(sessionID); turns > 0 { - cmd.Println(core.PausedMessage(turns)) - return nil - } - - tmpDir := core.StateDir() - counterFile := filepath.Join(tmpDir, "context-check-"+sessionID) - logFile := filepath.Join(rc.ContextDir(), "logs", "check-context-size.log") - - // Increment counter - count := core.ReadCounter(counterFile) + 1 - core.WriteCounter(counterFile, count) - - // Read actual context window usage from session JSONL - info, _ := core.ReadSessionTokenInfo(sessionID) - tokens := info.Tokens - windowSize := core.EffectiveContextWindow(info.Model) - pct := 0 - if windowSize > 0 && tokens > 0 { - pct = tokens * 100 / windowSize - } - - // Billing threshold: one-shot warning when tokens exceed the - // user-configured billing_token_warn. Independent of all other - // triggers — fires even during wrap-up suppression because cost - // guards are never convenience nudges. - if billingThreshold := rc.BillingTokenWarn(); billingThreshold > 0 && tokens >= billingThreshold { - emitBillingWarning(cmd, logFile, sessionID, count, tokens, billingThreshold) - } - - // Wrap-up suppression: if the user recently ran /ctx-wrap-up, - // suppress checkpoint and window nudges to avoid noise during/after - // the wrap-up ceremony. The marker expires after 2 hours. - // Stats are still recorded so token usage tracking is continuous. 
- if core.WrappedUpRecently() { - core.LogMessage(logFile, sessionID, fmt.Sprintf("prompt#%d suppressed (wrapped up)", count)) - core.WriteSessionStats(sessionID, core.SessionStats{ - Timestamp: time.Now().Format(time.RFC3339), - Prompt: count, - Tokens: tokens, - Pct: pct, - WindowSize: windowSize, - Model: info.Model, - Event: "suppressed", - }) - return nil - } - - // Adaptive frequency (prompt counter) - counterTriggered := false - if count > 30 { - counterTriggered = count%3 == 0 - } else if count > 15 { - counterTriggered = count%5 == 0 - } - - windowTrigger := pct >= contextWindowThresholdPct - - event := "silent" - switch { - case counterTriggered: - event = "checkpoint" - emitCheckpoint(cmd, logFile, sessionID, count, tokens, pct, windowSize) - case windowTrigger: - event = "window-warning" - emitWindowWarning(cmd, logFile, sessionID, count, tokens, pct) - default: - core.LogMessage(logFile, sessionID, fmt.Sprintf("prompt#%d silent", count)) - } - - core.WriteSessionStats(sessionID, core.SessionStats{ - Timestamp: time.Now().Format(time.RFC3339), - Prompt: count, - Tokens: tokens, - Pct: pct, - WindowSize: windowSize, - Model: info.Model, - Event: event, - }) - - return nil -} - -// emitCheckpoint emits the standard checkpoint box with optional token usage. -func emitCheckpoint(cmd *cobra.Command, logFile, sessionID string, count, tokens, pct, windowSize int) { - fallback := "This session is getting deep. Consider wrapping up\n" + - "soon. If there are unsaved learnings, decisions, or\n" + - "conventions, now is a good time to persist them." 
- content := core.LoadMessage("check-context-size", "checkpoint", nil, fallback) - if content == "" { - core.LogMessage(logFile, sessionID, fmt.Sprintf("prompt#%d silenced-by-template", count)) - return - } - msg := fmt.Sprintf("IMPORTANT: Relay this context checkpoint to the user VERBATIM before answering their question.\n\n"+ - "┌─ Context Checkpoint (prompt #%d) ────────────────\n", count) - msg += core.BoxLines(content) - if tokens > 0 { - msg += "│ " + tokenUsageLine(tokens, pct, windowSize) + config.NewlineLF - } - if line := core.ContextDirLine(); line != "" { - msg += "│ " + line + config.NewlineLF - } - msg += appendOversizeNudge() - msg += core.BoxBottom - cmd.Println(msg) - cmd.Println() - core.LogMessage(logFile, sessionID, fmt.Sprintf("prompt#%d CHECKPOINT tokens=%d pct=%d%%", count, tokens, pct)) - ref := notify.NewTemplateRef("check-context-size", "checkpoint", nil) - checkpointMsg := fmt.Sprintf("check-context-size: Context Checkpoint at prompt #%d", count) - _ = notify.Send("nudge", checkpointMsg, sessionID, ref) - _ = notify.Send("relay", checkpointMsg, sessionID, ref) - eventlog.Append("relay", checkpointMsg, sessionID, ref) -} - -// emitWindowWarning emits an independent context window warning (>80%). -func emitWindowWarning(cmd *cobra.Command, logFile, sessionID string, count, tokens, pct int) { - fallback := fmt.Sprintf("⚠ Context window is %d%% full (~%s tokens).\n"+ - "The session will lose older context soon. 
Consider wrapping up\n"+ - "or starting a fresh session with /ctx-wrap-up.", pct, core.FormatTokenCount(tokens)) - content := core.LoadMessage("check-context-size", "window", - map[string]any{"Percentage": pct, "TokenCount": core.FormatTokenCount(tokens)}, fallback) - if content == "" { - core.LogMessage(logFile, sessionID, fmt.Sprintf("prompt#%d window-silenced pct=%d%%", count, pct)) - return - } - msg := "IMPORTANT: Relay this context window warning to the user VERBATIM before answering their question.\n\n" + - "┌─ Context Window Warning ─────────────────────────\n" - msg += core.BoxLines(content) - if line := core.ContextDirLine(); line != "" { - msg += "│ " + line + config.NewlineLF - } - msg += core.BoxBottom - cmd.Println(msg) - cmd.Println() - core.LogMessage(logFile, sessionID, fmt.Sprintf("prompt#%d WINDOW-WARNING tokens=%d pct=%d%%", count, tokens, pct)) - ref := notify.NewTemplateRef("check-context-size", "window", - map[string]any{"Percentage": pct, "TokenCount": core.FormatTokenCount(tokens)}) - windowMsg := fmt.Sprintf("check-context-size: Context window at %d%%", pct) - _ = notify.Send("nudge", windowMsg, sessionID, ref) - _ = notify.Send("relay", windowMsg, sessionID, ref) - eventlog.Append("relay", windowMsg, sessionID, ref) -} - -// tokenUsageLine formats a context window usage line for display. -func tokenUsageLine(tokens, pct, windowSize int) string { - icon := "⏱" - suffix := "" - if pct >= contextWindowThresholdPct { - icon = "⚠" - suffix = " — running low" - } - return fmt.Sprintf("%s Context window: ~%s tokens (~%d%% of %s)%s", - icon, core.FormatTokenCount(tokens), pct, core.FormatWindowSize(windowSize), suffix) -} - -// appendOversizeNudge checks for an injection-oversize flag file and returns -// box-formatted nudge lines if present. Deletes the flag after reading (one-shot). 
-func appendOversizeNudge() string { - flagPath := filepath.Join(rc.ContextDir(), config.DirState, "injection-oversize") - data, readErr := os.ReadFile(flagPath) //nolint:gosec // project-local state path - if readErr != nil { - return "" - } - - tokenCount := extractOversizeTokens(data) - fallback := fmt.Sprintf("⚠ Context injection is large (~%d tokens).\n"+ - "Run /ctx-consolidate to distill your context files.", tokenCount) - content := core.LoadMessage("check-context-size", "oversize", - map[string]any{"TokenCount": tokenCount}, fallback) - if content == "" { - _ = os.Remove(flagPath) // silenced, still consume the flag - return "" - } - - _ = os.Remove(flagPath) // one-shot: consumed - return core.BoxLines(content) -} - -// emitBillingWarning emits a one-shot warning when token usage crosses the -// billing_token_warn threshold. -func emitBillingWarning(cmd *cobra.Command, logFile, sessionID string, count, tokens, threshold int) { - // One-shot guard: skip if already warned this session. 
- warnedFile := filepath.Join(core.StateDir(), "billing-warned-"+sessionID) - if _, statErr := os.Stat(warnedFile); statErr == nil { - return // already fired - } - - fallback := fmt.Sprintf("⚠ Token usage (~%s) has exceeded your\n"+ - "billing_token_warn threshold (%s).\n"+ - "Additional tokens may incur extra cost.", - core.FormatTokenCount(tokens), core.FormatTokenCount(threshold)) - content := core.LoadMessage("check-context-size", "billing", - map[string]any{"TokenCount": core.FormatTokenCount(tokens), "Threshold": core.FormatTokenCount(threshold)}, fallback) - if content == "" { - core.LogMessage(logFile, sessionID, fmt.Sprintf("prompt#%d billing-silenced tokens=%d threshold=%d", count, tokens, threshold)) - core.TouchFile(warnedFile) // silenced counts as fired - return - } - - msg := "IMPORTANT: Relay this billing warning to the user VERBATIM before answering their question.\n\n" + - "┌─ Billing Threshold ──────────────────────────────\n" - msg += core.BoxLines(content) - if line := core.ContextDirLine(); line != "" { - msg += "│ " + line + config.NewlineLF - } - msg += core.BoxBottom - cmd.Println(msg) - cmd.Println() - - core.TouchFile(warnedFile) // one-shot: mark as fired - core.LogMessage(logFile, sessionID, fmt.Sprintf("prompt#%d BILLING-WARNING tokens=%d threshold=%d", count, tokens, threshold)) - ref := notify.NewTemplateRef("check-context-size", "billing", - map[string]any{"TokenCount": core.FormatTokenCount(tokens), "Threshold": core.FormatTokenCount(threshold)}) - billingMsg := fmt.Sprintf("check-context-size: Billing threshold exceeded (%s tokens > %s)", - core.FormatTokenCount(tokens), core.FormatTokenCount(threshold)) - _ = notify.Send("nudge", billingMsg, sessionID, ref) - _ = notify.Send("relay", billingMsg, sessionID, ref) - eventlog.Append("relay", billingMsg, sessionID, ref) -} - -// oversizeTokenRe matches "Injected: NNNNN tokens" in the flag file. 
-var oversizeTokenRe = regexp.MustCompile(`Injected:\s+(\d+)\s+tokens`) - -// extractOversizeTokens parses the token count from an injection-oversize flag file. -func extractOversizeTokens(data []byte) int { - m := oversizeTokenRe.FindSubmatch(data) - if m == nil { - return 0 - } - n, parseErr := strconv.Atoi(string(m[1])) - if parseErr != nil { - return 0 - } - return n -} diff --git a/internal/cli/system/cmd/checkjournal/run.go b/internal/cli/system/cmd/checkjournal/run.go deleted file mode 100644 index ec5262ae..00000000 --- a/internal/cli/system/cmd/checkjournal/run.go +++ /dev/null @@ -1,168 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package checkjournal - -import ( - "fmt" - "os" - "path/filepath" - "strings" - - "github.com/spf13/cobra" - - "github.com/ActiveMemory/ctx/internal/cli/system/core" - "github.com/ActiveMemory/ctx/internal/config" - "github.com/ActiveMemory/ctx/internal/eventlog" - "github.com/ActiveMemory/ctx/internal/journal/state" - "github.com/ActiveMemory/ctx/internal/notify" -) - -func runCheckJournal(cmd *cobra.Command, stdin *os.File) error { - if !core.IsInitialized() { - return nil - } - input := core.ReadInput(stdin) - - sessionID := input.SessionID - if sessionID == "" { - sessionID = core.SessionUnknown - } - if core.Paused(sessionID) > 0 { - return nil - } - - tmpDir := core.StateDir() - remindedFile := filepath.Join(tmpDir, "journal-reminded") - claudeProjectsDir := filepath.Join(os.Getenv("HOME"), ".claude", "projects") - - // Only remind once per day - if core.IsDailyThrottled(remindedFile) { - return nil - } - - // Bail out if journal or Claude projects directories don't exist - jDir := core.ResolvedJournalDir() - if _, statErr := os.Stat(jDir); os.IsNotExist(statErr) { - return nil - } - if _, statErr := os.Stat(claudeProjectsDir); os.IsNotExist(statErr) { - return nil - } - - // Stage 1: Unexported sessions 
- newestJournal := newestMtime(jDir, config.ExtMarkdown) - unexported := countNewerFiles(claudeProjectsDir, ".jsonl", newestJournal) - - // Stage 2: Unenriched entries - unenriched := countUnenriched(jDir) - - if unexported == 0 && unenriched == 0 { - return nil - } - - vars := map[string]any{ - "UnexportedCount": unexported, - "UnenrichedCount": unenriched, - } - - var variant, fallback string - switch { - case unexported > 0 && unenriched > 0: - variant = core.VariantBoth - fallback = fmt.Sprintf("You have %d new session(s) not yet exported.\n", unexported) + - fmt.Sprintf("%d existing entries need enrichment.\n", unenriched) + - "\nProcess journal (exports and enriches):\n /ctx-journal-enrich-all" - case unexported > 0: - variant = "unexported" - fallback = fmt.Sprintf("You have %d new session(s) not yet exported.\n", unexported) + - "\nProcess journal (exports and enriches):\n /ctx-journal-enrich-all" - default: - variant = "unenriched" - fallback = fmt.Sprintf("%d journal entries need enrichment.\n", unenriched) + - "\nEnrich:\n /ctx-journal-enrich-all" - } - - content := core.LoadMessage("check-journal", variant, vars, fallback) - if content == "" { - return nil - } - - msg := "IMPORTANT: Relay this journal reminder to the user VERBATIM before answering their question.\n\n" + - "┌─ Journal Reminder ─────────────────────────────\n" - msg += core.BoxLines(content) - if line := core.ContextDirLine(); line != "" { - msg += "│ " + line + config.NewlineLF - } - msg += "└────────────────────────────────────────────────" - cmd.Println(msg) - - ref := notify.NewTemplateRef("check-journal", variant, vars) - journalMsg := fmt.Sprintf("check-journal: %d unexported, %d unenriched", unexported, unenriched) - _ = notify.Send("nudge", journalMsg, input.SessionID, ref) - _ = notify.Send("relay", journalMsg, input.SessionID, ref) - eventlog.Append("relay", journalMsg, input.SessionID, ref) - - core.TouchFile(remindedFile) - return nil -} - -// newestMtime returns the most 
recent mtime (as Unix timestamp) of files -// with the given extension in the directory. Returns 0 if none found. -func newestMtime(dir, ext string) int64 { - entries, readErr := os.ReadDir(dir) - if readErr != nil { - return 0 - } - - var latest int64 - for _, entry := range entries { - if entry.IsDir() || !strings.HasSuffix(entry.Name(), ext) { - continue - } - info, infoErr := entry.Info() - if infoErr != nil { - continue - } - mtime := info.ModTime().Unix() - if mtime > latest { - latest = mtime - } - } - return latest -} - -// countNewerFiles recursively counts files with the given extension that -// are newer than the reference timestamp. -func countNewerFiles(dir, ext string, refTime int64) int { - count := 0 - _ = filepath.Walk(dir, func(_ string, info os.FileInfo, walkErr error) error { - if walkErr != nil { - return nil // skip errors - } - if info.IsDir() { - return nil - } - if !strings.HasSuffix(info.Name(), ext) { - return nil - } - if info.ModTime().Unix() > refTime { - count++ - } - return nil - }) - return count -} - -// countUnenriched counts journal .md files that lack an enriched date -// in the journal state file. -func countUnenriched(dir string) int { - jstate, loadErr := state.Load(dir) - if loadErr != nil { - return 0 - } - return jstate.CountUnenriched(dir) -} diff --git a/internal/cli/system/cmd/checkknowledge/cmd.go b/internal/cli/system/cmd/checkknowledge/cmd.go deleted file mode 100644 index cc6ce006..00000000 --- a/internal/cli/system/cmd/checkknowledge/cmd.go +++ /dev/null @@ -1,166 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. 
-// SPDX-License-Identifier: Apache-2.0 - -package checkknowledge - -import ( - "bytes" - "fmt" - "os" - "path/filepath" - - "github.com/spf13/cobra" - - "github.com/ActiveMemory/ctx/internal/cli/system/core" - "github.com/ActiveMemory/ctx/internal/config" - "github.com/ActiveMemory/ctx/internal/eventlog" - "github.com/ActiveMemory/ctx/internal/index" - "github.com/ActiveMemory/ctx/internal/notify" - "github.com/ActiveMemory/ctx/internal/rc" -) - -// Cmd returns the "ctx system check-knowledge" subcommand. -// -// Returns: -// - *cobra.Command: Configured check-knowledge subcommand -func Cmd() *cobra.Command { - return &cobra.Command{ - Use: "check-knowledge", - Short: "Knowledge file growth nudge", - Long: `Counts entries in DECISIONS.md and LEARNINGS.md and lines in -CONVENTIONS.md, and outputs a VERBATIM relay nudge when any file exceeds -the configured threshold. Throttled to once per day. - - Learnings threshold: entry_count_learnings (default 30) - Decisions threshold: entry_count_decisions (default 20) - Conventions threshold: convention_line_count (default 200) - -Hook event: UserPromptSubmit -Output: VERBATIM relay (when thresholds exceeded), silent otherwise -Silent when: below thresholds, already nudged today, or uninitialized`, - Hidden: true, - RunE: func(cmd *cobra.Command, _ []string) error { - return runCheckKnowledge(cmd, os.Stdin) - }, - } -} - -func runCheckKnowledge(cmd *cobra.Command, stdin *os.File) error { - if !core.IsInitialized() { - return nil - } - - input := core.ReadInput(stdin) - - sessionID := input.SessionID - if sessionID == "" { - sessionID = core.SessionUnknown - } - if core.Paused(sessionID) > 0 { - return nil - } - - markerPath := filepath.Join(core.StateDir(), "check-knowledge") - if core.IsDailyThrottled(markerPath) { - return nil - } - - lrnThreshold := rc.EntryCountLearnings() - decThreshold := rc.EntryCountDecisions() - convThreshold := rc.ConventionLineCount() - - // All disabled — nothing to check - if lrnThreshold == 0 
&& decThreshold == 0 && convThreshold == 0 { - return nil - } - - contextDir := rc.ContextDir() - - type finding struct { - file string - count int - threshold int - unit string - } - var findings []finding - - if decThreshold > 0 { - decPath := filepath.Join(contextDir, config.FileDecision) - if data, readErr := os.ReadFile(decPath); readErr == nil { //nolint:gosec // project-local path - count := len(index.ParseEntryBlocks(string(data))) - if count > decThreshold { - findings = append(findings, finding{ - file: config.FileDecision, count: count, threshold: decThreshold, unit: "entries", - }) - } - } - } - - if lrnThreshold > 0 { - lrnPath := filepath.Join(contextDir, config.FileLearning) - if data, readErr := os.ReadFile(lrnPath); readErr == nil { //nolint:gosec // project-local path - count := len(index.ParseEntryBlocks(string(data))) - if count > lrnThreshold { - findings = append(findings, finding{ - file: config.FileLearning, count: count, threshold: lrnThreshold, unit: "entries", - }) - } - } - } - - if convThreshold > 0 { - convPath := filepath.Join(contextDir, config.FileConvention) - if data, readErr := os.ReadFile(convPath); readErr == nil { //nolint:gosec // project-local path - lineCount := bytes.Count(data, []byte(config.NewlineLF)) - if lineCount > convThreshold { - findings = append(findings, finding{ - file: config.FileConvention, count: lineCount, threshold: convThreshold, unit: "lines", - }) - } - } - } - - if len(findings) == 0 { - return nil - } - - // Build pre-formatted findings list for the template variable - var fileWarnings string - for _, f := range findings { - fileWarnings += fmt.Sprintf("%s has %d %s (recommended: \u2264%d).\n", f.file, f.count, f.unit, f.threshold) - } - - fallback := fileWarnings + - "\nLarge knowledge files dilute agent context. 
Consider:\n" + - " \u2022 Review and remove outdated entries\n" + - " \u2022 Use /ctx-consolidate to merge overlapping entries\n" + - " \u2022 Use /ctx-drift for semantic drift (stale patterns)\n" + - " \u2022 Move stale entries to .context/archive/ manually" - content := core.LoadMessage("check-knowledge", "warning", - map[string]any{"FileWarnings": fileWarnings}, fallback) - if content == "" { - return nil - } - - msg := "IMPORTANT: Relay this knowledge health notice to the user VERBATIM before answering their question.\n\n" + - "\u250c\u2500 Knowledge File Growth \u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\n" - msg += core.BoxLines(content) - if line := core.ContextDirLine(); line != "" { - msg += "\u2502 " + line + config.NewlineLF - } - msg += "\u2514\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500" - cmd.Println(msg) - - ref := notify.NewTemplateRef("check-knowledge", "warning", - map[string]any{"FileWarnings": fileWarnings}) - _ = notify.Send("nudge", "check-knowledge: Knowledge file growth detected", input.SessionID, ref) - _ = notify.Send("relay", "check-knowledge: Knowledge file growth detected", input.SessionID, ref) - eventlog.Append("relay", "check-knowledge: Knowledge file growth detected", input.SessionID, ref) - - core.TouchFile(markerPath) - - return nil -} diff --git a/internal/cli/system/cmd/checkmapstaleness/cmd.go b/internal/cli/system/cmd/checkmapstaleness/cmd.go deleted file mode 100644 index 271314e4..00000000 --- 
a/internal/cli/system/cmd/checkmapstaleness/cmd.go +++ /dev/null @@ -1,156 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package checkmapstaleness - -import ( - "encoding/json" - "fmt" - "os" - "os/exec" - "path/filepath" - "strings" - "time" - - "github.com/spf13/cobra" - - "github.com/ActiveMemory/ctx/internal/cli/system/core" - "github.com/ActiveMemory/ctx/internal/config" - "github.com/ActiveMemory/ctx/internal/eventlog" - "github.com/ActiveMemory/ctx/internal/notify" - "github.com/ActiveMemory/ctx/internal/rc" -) - -const mapStaleDays = 30 - -// Cmd returns the "ctx system check-map-staleness" subcommand. -// -// Returns: -// - *cobra.Command: Configured check-map-staleness subcommand -func Cmd() *cobra.Command { - return &cobra.Command{ - Use: "check-map-staleness", - Short: "Architecture map staleness nudge", - Long: `Checks whether map-tracking.json is stale (>30 days) and there are -commits touching internal/ since the last map refresh. Outputs a VERBATIM -relay nudge suggesting /ctx-map when both conditions are met. - -Hook event: UserPromptSubmit -Output: VERBATIM relay (when stale and modules changed), silent otherwise -Silent when: map-tracking.json missing or fresh, opted out, no module -commits, already nudged today, or uninitialized`, - Hidden: true, - RunE: func(cmd *cobra.Command, _ []string) error { - return runCheckMapStaleness(cmd, os.Stdin) - }, - } -} - -// mapTrackingInfo holds the minimal fields needed from map-tracking.json. 
-type mapTrackingInfo struct { - OptedOut bool `json:"opted_out"` - LastRun string `json:"last_run"` -} - -func runCheckMapStaleness(cmd *cobra.Command, stdin *os.File) error { - if !core.IsInitialized() { - return nil - } - - input := core.ReadInput(stdin) - - sessionID := input.SessionID - if sessionID == "" { - sessionID = core.SessionUnknown - } - if core.Paused(sessionID) > 0 { - return nil - } - markerPath := filepath.Join(core.StateDir(), "check-map-staleness") - if core.IsDailyThrottled(markerPath) { - return nil - } - - contextDir := rc.ContextDir() - trackingPath := filepath.Join(contextDir, config.FileMapTracking) - - data, readErr := os.ReadFile(trackingPath) //nolint:gosec // project-local path - if readErr != nil { - return nil // no tracking file — nothing to nudge about - } - - var info mapTrackingInfo - if jsonErr := json.Unmarshal(data, &info); jsonErr != nil { - return nil - } - - if info.OptedOut { - return nil - } - - lastRun, parseErr := time.Parse("2006-01-02", info.LastRun) - if parseErr != nil { - return nil - } - - if time.Since(lastRun) < time.Duration(mapStaleDays)*24*time.Hour { - return nil - } - - // Count commits touching internal/ since last run - moduleCommits := countModuleCommits(info.LastRun) - if moduleCommits == 0 { - return nil - } - - // Emit VERBATIM nudge - dateStr := lastRun.Format("2006-01-02") - fallback := fmt.Sprintf("ARCHITECTURE.md hasn't been refreshed since %s\n", dateStr) + - fmt.Sprintf("and there are commits touching %d modules.\n", moduleCommits) + - "/ctx-map keeps architecture docs drift-free.\n" + - config.NewlineLF + - "Want me to run /ctx-map to refresh?" 
- content := core.LoadMessage("check-map-staleness", "stale", - map[string]any{ - "LastRefreshDate": dateStr, - "ModuleCount": moduleCommits, - }, fallback) - if content == "" { - return nil - } - - msg := "IMPORTANT: Relay this architecture map notice to the user VERBATIM before answering their question.\n\n" + - "\u250c\u2500 Architecture Map Stale \u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\n" - msg += core.BoxLines(content) - if line := core.ContextDirLine(); line != "" { - msg += "\u2502 " + line + config.NewlineLF - } - msg += "\u2514\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500" - cmd.Println(msg) - - ref := notify.NewTemplateRef("check-map-staleness", "stale", - map[string]any{"LastRefreshDate": dateStr, "ModuleCount": moduleCommits}) - _ = notify.Send("nudge", "check-map-staleness: Architecture map stale", input.SessionID, ref) - _ = notify.Send("relay", "check-map-staleness: Architecture map stale", input.SessionID, ref) - eventlog.Append("relay", "check-map-staleness: Architecture map stale", input.SessionID, ref) - - core.TouchFile(markerPath) - - return nil -} - -// countModuleCommits counts git commits touching internal/ since the given date. 
-func countModuleCommits(since string) int { - out, gitErr := exec.Command("git", "log", "--oneline", "--since="+since, "--", "internal/").Output() //nolint:gosec // date string from JSON - if gitErr != nil { - return 0 - } - lines := strings.TrimSpace(string(out)) - if lines == "" { - return 0 - } - return len(strings.Split(lines, config.NewlineLF)) -} diff --git a/internal/cli/system/cmd/checkmemorydrift/cmd.go b/internal/cli/system/cmd/checkmemorydrift/cmd.go deleted file mode 100644 index abf84fd6..00000000 --- a/internal/cli/system/cmd/checkmemorydrift/cmd.go +++ /dev/null @@ -1,83 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package checkmemorydrift - -import ( - "os" - "path/filepath" - - "github.com/spf13/cobra" - - "github.com/ActiveMemory/ctx/internal/cli/system/core" - "github.com/ActiveMemory/ctx/internal/config" - "github.com/ActiveMemory/ctx/internal/memory" - "github.com/ActiveMemory/ctx/internal/rc" -) - -// Cmd returns the "ctx system check-memory-drift" subcommand. 
-// -// Returns: -// - *cobra.Command: Configured check-memory-drift subcommand -func Cmd() *cobra.Command { - return &cobra.Command{ - Use: "check-memory-drift", - Short: "Memory drift nudge", - Hidden: true, - RunE: func(cmd *cobra.Command, _ []string) error { - return runCheckMemoryDrift(cmd, os.Stdin) - }, - } -} - -func runCheckMemoryDrift(cmd *cobra.Command, stdin *os.File) error { - if !core.IsInitialized() { - return nil - } - - input := core.ReadInput(stdin) - - sessionID := input.SessionID - if sessionID == "" { - sessionID = core.SessionUnknown - } - if core.Paused(sessionID) > 0 { - return nil - } - - // Session tombstone — nudge once per session per session ID - tombstone := filepath.Join(core.StateDir(), "memory-drift-nudged-"+sessionID) - if _, statErr := os.Stat(tombstone); statErr == nil { - return nil - } - - contextDir := rc.ContextDir() - projectRoot := filepath.Dir(contextDir) - - sourcePath, discoverErr := memory.DiscoverMemoryPath(projectRoot) - if discoverErr != nil { - // Auto memory not active — skip silently - return nil - } - - if !memory.HasDrift(contextDir, sourcePath) { - return nil - } - - msg := "IMPORTANT: Relay this memory drift notice to the user VERBATIM before answering their question.\n\n" + - "\u250c\u2500 Memory Drift \u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\n" + - "\u2502 MEMORY.md has changed since last sync.\n" + - "\u2502 Run: ctx memory sync\n" - if line := core.ContextDirLine(); line != "" { - msg += "\u2502 " + line + config.NewlineLF - } - msg += 
"\u2514\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500" - cmd.Println(msg) - - core.TouchFile(tombstone) - - return nil -} diff --git a/internal/cli/system/cmd/checkpersistence/cmd.go b/internal/cli/system/cmd/checkpersistence/cmd.go deleted file mode 100644 index f18273ba..00000000 --- a/internal/cli/system/cmd/checkpersistence/cmd.go +++ /dev/null @@ -1,38 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package checkpersistence - -import ( - "os" - - "github.com/spf13/cobra" -) - -// Cmd returns the "ctx system check-persistence" subcommand. -// -// Returns: -// - *cobra.Command: Configured check-persistence subcommand -func Cmd() *cobra.Command { - return &cobra.Command{ - Use: "check-persistence", - Short: "Persistence nudge hook", - Long: `Tracks prompts since the last .context/ file modification and nudges -the agent to persist learnings, decisions, or task updates. 
- - Prompts 1-10: silent (too early) - Prompts 11-25: nudge once at prompt 20 since last modification - Prompts 25+: every 15th prompt since last modification - -Hook event: UserPromptSubmit -Output: agent directive (when triggered), silent otherwise -Silent when: context files were recently modified`, - Hidden: true, - RunE: func(cmd *cobra.Command, _ []string) error { - return runCheckPersistence(cmd, os.Stdin) - }, - } -} diff --git a/internal/cli/system/cmd/checkpersistence/run.go b/internal/cli/system/cmd/checkpersistence/run.go deleted file mode 100644 index 87d9955b..00000000 --- a/internal/cli/system/cmd/checkpersistence/run.go +++ /dev/null @@ -1,164 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package checkpersistence - -import ( - "fmt" - "os" - "path/filepath" - "strconv" - "strings" - - "github.com/spf13/cobra" - - "github.com/ActiveMemory/ctx/internal/cli/system/core" - "github.com/ActiveMemory/ctx/internal/config" - "github.com/ActiveMemory/ctx/internal/eventlog" - "github.com/ActiveMemory/ctx/internal/notify" - "github.com/ActiveMemory/ctx/internal/rc" -) - -// persistenceState holds the counter state for persistence nudging. 
-type persistenceState struct { - Count int - LastNudge int - LastMtime int64 -} - -func readPersistenceState(path string) (persistenceState, bool) { - data, readErr := os.ReadFile(path) //nolint:gosec // state dir path - if readErr != nil { - return persistenceState{}, false - } - - var ps persistenceState - for _, line := range strings.Split(strings.TrimSpace(string(data)), config.NewlineLF) { - parts := strings.SplitN(line, "=", 2) - if len(parts) != 2 { - continue - } - switch parts[0] { - case "count": - n, parseErr := strconv.Atoi(parts[1]) - if parseErr == nil { - ps.Count = n - } - case "last_nudge": - n, parseErr := strconv.Atoi(parts[1]) - if parseErr == nil { - ps.LastNudge = n - } - case "last_mtime": - n, parseErr := strconv.ParseInt(parts[1], 10, 64) - if parseErr == nil { - ps.LastMtime = n - } - } - } - return ps, true -} - -func writePersistenceState(path string, s persistenceState) { - content := fmt.Sprintf("count=%d\nlast_nudge=%d\nlast_mtime=%d\n", - s.Count, s.LastNudge, s.LastMtime) - _ = os.WriteFile(path, []byte(content), 0o600) -} - -func runCheckPersistence(cmd *cobra.Command, stdin *os.File) error { - if !core.IsInitialized() { - return nil - } - input := core.ReadInput(stdin) - sessionID := input.SessionID - if sessionID == "" { - sessionID = core.SessionUnknown - } - if core.Paused(sessionID) > 0 { - return nil - } - - tmpDir := core.StateDir() - stateFile := filepath.Join(tmpDir, "persistence-nudge-"+sessionID) - contextDir := rc.ContextDir() - logFile := filepath.Join(contextDir, "logs", "check-persistence.log") - - // Initialize state if needed - ps, exists := readPersistenceState(stateFile) - if !exists { - initialMtime := core.GetLatestContextMtime(contextDir) - ps = persistenceState{ - Count: 1, - LastNudge: 0, - LastMtime: initialMtime, - } - writePersistenceState(stateFile, ps) - core.LogMessage(logFile, sessionID, fmt.Sprintf("init count=1 mtime=%d", initialMtime)) - return nil - } - - ps.Count++ - currentMtime := 
core.GetLatestContextMtime(contextDir) - - // If context files were modified since last check, reset the nudge counter - if currentMtime > ps.LastMtime { - ps.LastNudge = ps.Count - ps.LastMtime = currentMtime - writePersistenceState(stateFile, ps) - core.LogMessage(logFile, sessionID, fmt.Sprintf("prompt#%d context-modified, reset nudge counter", ps.Count)) - return nil - } - - sinceNudge := ps.Count - ps.LastNudge - - // Determine if we should nudge - shouldNudge := false - if ps.Count >= 11 && ps.Count <= 25 && sinceNudge >= 20 { - shouldNudge = true - } else if ps.Count > 25 && sinceNudge >= 15 { - shouldNudge = true - } - - if shouldNudge { - fallback := fmt.Sprintf("No context files updated in %d+ prompts.\n", sinceNudge) + - "Have you discovered learnings, made decisions,\n" + - "established conventions, or completed tasks\n" + - "worth persisting?\n" + - config.NewlineLF + - "Run /ctx-wrap-up to capture session context." - content := core.LoadMessage("check-persistence", "nudge", - map[string]any{ - "PromptCount": ps.Count, - "PromptsSinceNudge": sinceNudge, - }, fallback) - if content == "" { - core.LogMessage(logFile, sessionID, fmt.Sprintf("prompt#%d silenced-by-template", ps.Count)) - writePersistenceState(stateFile, ps) - return nil - } - msg := fmt.Sprintf("IMPORTANT: Relay this persistence checkpoint to the user VERBATIM before answering their question.\n\n"+ - "┌─ Persistence Checkpoint (prompt #%d) ───────────\n", ps.Count) - msg += core.BoxLines(content) - if line := core.ContextDirLine(); line != "" { - msg += "│ " + line + config.NewlineLF - } - msg += config.NudgeBoxBottom - cmd.Println(msg) - cmd.Println() - core.LogMessage(logFile, sessionID, fmt.Sprintf("prompt#%d NUDGE since_nudge=%d", ps.Count, sinceNudge)) - ref := notify.NewTemplateRef("check-persistence", "nudge", - map[string]any{"PromptCount": ps.Count, "PromptsSinceNudge": sinceNudge}) - _ = notify.Send("nudge", fmt.Sprintf("check-persistence: Persistence Checkpoint at prompt #%d", 
ps.Count), sessionID, ref) - _ = notify.Send("relay", fmt.Sprintf("check-persistence: No context updated in %d+ prompts", sinceNudge), sessionID, ref) - eventlog.Append("relay", fmt.Sprintf("check-persistence: No context updated in %d+ prompts", sinceNudge), sessionID, ref) - ps.LastNudge = ps.Count - } else { - core.LogMessage(logFile, sessionID, fmt.Sprintf("prompt#%d silent since_nudge=%d", ps.Count, sinceNudge)) - } - - writePersistenceState(stateFile, ps) - return nil -} diff --git a/internal/cli/system/cmd/checkreminders/cmd.go b/internal/cli/system/cmd/checkreminders/cmd.go deleted file mode 100644 index 1546a558..00000000 --- a/internal/cli/system/cmd/checkreminders/cmd.go +++ /dev/null @@ -1,98 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package checkreminders - -import ( - "fmt" - "os" - "time" - - "github.com/spf13/cobra" - - remindcore "github.com/ActiveMemory/ctx/internal/cli/remind/core" - "github.com/ActiveMemory/ctx/internal/cli/system/core" - "github.com/ActiveMemory/ctx/internal/config" - "github.com/ActiveMemory/ctx/internal/eventlog" - "github.com/ActiveMemory/ctx/internal/notify" -) - -// Cmd returns the "ctx system check-reminders" subcommand. 
-// -// Returns: -// - *cobra.Command: Configured check-reminders subcommand -func Cmd() *cobra.Command { - return &cobra.Command{ - Use: "check-reminders", - Short: "Surface pending reminders at session start", - Hidden: true, - RunE: func(cmd *cobra.Command, _ []string) error { - return runCheckReminders(cmd, os.Stdin) - }, - } -} - -func runCheckReminders(cmd *cobra.Command, stdin *os.File) error { - if !core.IsInitialized() { - return nil - } - - input := core.ReadInput(stdin) - - sessionID := input.SessionID - if sessionID == "" { - sessionID = core.SessionUnknown - } - if core.Paused(sessionID) > 0 { - return nil - } - - reminders, readErr := remindcore.ReadReminders() - if readErr != nil { - return nil // non-fatal: don't break session start - } - - today := time.Now().Format("2006-01-02") - var due []remindcore.Reminder - for _, r := range reminders { - if r.After == nil || *r.After <= today { - due = append(due, r) - } - } - - if len(due) == 0 { - return nil - } - - // Build pre-formatted reminder list for the template variable - var reminderList string - for _, r := range due { - reminderList += fmt.Sprintf(" [%d] %s\n", r.ID, r.Message) - } - - fallback := reminderList + - "\nDismiss: ctx remind dismiss \n" + - "Dismiss all: ctx remind dismiss --all" - content := core.LoadMessage("check-reminders", "reminders", - map[string]any{"ReminderList": reminderList}, fallback) - if content == "" { - return nil - } - - msg := "IMPORTANT: Relay these reminders to the user VERBATIM before answering their question.\n\n" + - "┌─ Reminders ──────────────────────────────────────\n" - msg += core.BoxLines(content) - msg += config.NudgeBoxBottom - cmd.Println(msg) - - ref := notify.NewTemplateRef("check-reminders", "reminders", - map[string]any{"ReminderList": reminderList}) - nudgeMsg := fmt.Sprintf("You have %d pending reminders", len(due)) - _ = notify.Send("nudge", nudgeMsg, input.SessionID, ref) - eventlog.Append("nudge", nudgeMsg, input.SessionID, ref) - - return 
nil -} diff --git a/internal/cli/system/cmd/checkresources/cmd.go b/internal/cli/system/cmd/checkresources/cmd.go deleted file mode 100644 index a310b89f..00000000 --- a/internal/cli/system/cmd/checkresources/cmd.go +++ /dev/null @@ -1,100 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package checkresources - -import ( - "os" - - "github.com/spf13/cobra" - - "github.com/ActiveMemory/ctx/internal/cli/system/core" - "github.com/ActiveMemory/ctx/internal/config" - "github.com/ActiveMemory/ctx/internal/eventlog" - "github.com/ActiveMemory/ctx/internal/notify" - "github.com/ActiveMemory/ctx/internal/sysinfo" -) - -// Cmd returns the "ctx system check-resources" subcommand. -// -// Returns: -// - *cobra.Command: Configured check-resources subcommand -func Cmd() *cobra.Command { - return &cobra.Command{ - Use: "check-resources", - Short: "Resource pressure hook", - Long: `Collects system resource metrics (memory, swap, disk, load) and outputs -a VERBATIM relay warning when any resource hits DANGER severity. -Silent at WARNING level and below. 
- - Memory DANGER: >= 90% used Swap DANGER: >= 75% used - Disk DANGER: >= 95% full Load DANGER: >= 1.5x CPUs - -For full resource stats at any severity, use: ctx system resources - -Hook event: UserPromptSubmit -Output: VERBATIM relay (DANGER only), silent otherwise -Silent when: all resources below DANGER thresholds`, - Hidden: true, - RunE: func(cmd *cobra.Command, _ []string) error { - return runCheckResources(cmd, os.Stdin) - }, - } -} - -func runCheckResources(cmd *cobra.Command, stdin *os.File) error { - input := core.ReadInput(stdin) - - sessionID := input.SessionID - if sessionID == "" { - sessionID = core.SessionUnknown - } - if core.Paused(sessionID) > 0 { - return nil - } - - snap := sysinfo.Collect(".") - alerts := sysinfo.Evaluate(snap) - - if sysinfo.MaxSeverity(alerts) < sysinfo.SeverityDanger { - return nil - } - - // Build pre-formatted alert messages for the template variable - var alertMessages string - for _, a := range alerts { - if a.Severity == sysinfo.SeverityDanger { - alertMessages += "\u2716 " + a.Message + config.NewlineLF - } - } - - fallback := alertMessages + - "\nSystem resources are critically low.\n" + - "Persist unsaved context NOW with /ctx-wrap-up\n" + - "and consider ending this session." 
- content := core.LoadMessage("check-resources", "alert", - map[string]any{"AlertMessages": alertMessages}, fallback) - if content == "" { - return nil - } - - msg := "IMPORTANT: Relay this resource warning to the user VERBATIM.\n\n" + - "\u250c\u2500 Resource Alert \u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\n" - msg += core.BoxLines(content) - if line := core.ContextDirLine(); line != "" { - msg += "\u2502 " + line + config.NewlineLF - } - msg += "\u2514\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500" - cmd.Println(msg) - - ref := notify.NewTemplateRef("check-resources", "alert", - map[string]any{"AlertMessages": alertMessages}) - _ = notify.Send("nudge", "check-resources: System resources critically low", input.SessionID, ref) - _ = notify.Send("relay", "check-resources: System resources critically low", input.SessionID, ref) - eventlog.Append("relay", "check-resources: System resources critically low", input.SessionID, ref) - - return nil -} diff --git a/internal/cli/system/cmd/checktaskcompletion/cmd.go b/internal/cli/system/cmd/checktaskcompletion/cmd.go deleted file mode 100644 index 26461c13..00000000 --- a/internal/cli/system/cmd/checktaskcompletion/cmd.go +++ /dev/null @@ -1,85 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. 
-// SPDX-License-Identifier: Apache-2.0 - -package checktaskcompletion - -import ( - "os" - "path/filepath" - - "github.com/spf13/cobra" - - "github.com/ActiveMemory/ctx/internal/cli/system/core" - "github.com/ActiveMemory/ctx/internal/eventlog" - "github.com/ActiveMemory/ctx/internal/notify" - "github.com/ActiveMemory/ctx/internal/rc" -) - -// Cmd returns the "ctx system check-task-completion" subcommand. -// -// Returns: -// - *cobra.Command: Configured check-task-completion subcommand -func Cmd() *cobra.Command { - return &cobra.Command{ - Use: "check-task-completion", - Short: "Task completion nudge after edits", - Long: `Counts Edit/Write tool calls and periodically nudges the agent -to check whether any tasks should be marked done in TASKS.md. - -Hook event: PostToolUse (Edit, Write) -Output: agent directive every N edits, silent otherwise -Silent when: counter below threshold, interval is 0, or session is paused`, - Hidden: true, - RunE: func(cmd *cobra.Command, _ []string) error { - return runCheckTaskCompletion(cmd, os.Stdin) - }, - } -} - -func runCheckTaskCompletion(cmd *cobra.Command, stdin *os.File) error { - if !core.IsInitialized() { - return nil - } - input := core.ReadInput(stdin) - - sessionID := input.SessionID - if sessionID == "" { - sessionID = core.SessionUnknown - } - if core.Paused(sessionID) > 0 { - return nil - } - - interval := rc.TaskNudgeInterval() - if interval <= 0 { - return nil - } - - counterPath := filepath.Join(core.StateDir(), "task-nudge-"+sessionID) - count := core.ReadCounter(counterPath) - count++ - - if count < interval { - core.WriteCounter(counterPath, count) - return nil - } - - // Threshold reached — reset and nudge. - core.WriteCounter(counterPath, 0) - - fallback := "If you completed a task, mark it [x] in TASKS.md." 
- msg := core.LoadMessage("check-task-completion", "nudge", nil, fallback) - if msg == "" { - return nil - } - core.PrintHookContext(cmd, "PostToolUse", msg) - - ref := notify.NewTemplateRef("check-task-completion", "nudge", nil) - _ = notify.Send("relay", "check-task-completion: task completion nudge", input.SessionID, ref) - eventlog.Append("relay", "check-task-completion: task completion nudge", input.SessionID, ref) - - return nil -} diff --git a/internal/cli/system/cmd/checkversion/cmd.go b/internal/cli/system/cmd/checkversion/cmd.go deleted file mode 100644 index 74c48f65..00000000 --- a/internal/cli/system/cmd/checkversion/cmd.go +++ /dev/null @@ -1,36 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package checkversion - -import ( - "os" - - "github.com/spf13/cobra" -) - -// Cmd returns the "ctx system check-version" subcommand. -// -// Returns: -// - *cobra.Command: Configured check-version subcommand -func Cmd() *cobra.Command { - return &cobra.Command{ - Use: "check-version", - Short: "Binary/plugin version drift detection hook", - Long: `Compares the ctx binary version against the embedded plugin version. -Warns when the binary is older than the plugin expects, which happens -when the marketplace plugin updates but the binary hasn't been -reinstalled. Throttled to once per day. Skipped for dev builds. 
- -Hook event: UserPromptSubmit -Output: VERBATIM relay with reinstall command, silent otherwise -Silent when: versions match, dev build, or already checked today`, - Hidden: true, - RunE: func(cmd *cobra.Command, _ []string) error { - return runCheckVersion(cmd, os.Stdin) - }, - } -} diff --git a/internal/cli/system/cmd/checkversion/run.go b/internal/cli/system/cmd/checkversion/run.go deleted file mode 100644 index 469d157f..00000000 --- a/internal/cli/system/cmd/checkversion/run.go +++ /dev/null @@ -1,169 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package checkversion - -import ( - "fmt" - "os" - "path/filepath" - "strings" - "time" - - "github.com/spf13/cobra" - - "github.com/ActiveMemory/ctx/internal/assets" - "github.com/ActiveMemory/ctx/internal/cli/system/core" - "github.com/ActiveMemory/ctx/internal/config" - "github.com/ActiveMemory/ctx/internal/eventlog" - "github.com/ActiveMemory/ctx/internal/notify" - "github.com/ActiveMemory/ctx/internal/rc" -) - -func runCheckVersion(cmd *cobra.Command, stdin *os.File) error { - if !core.IsInitialized() { - return nil - } - - input := core.ReadInput(stdin) - - sessionID := input.SessionID - if sessionID == "" { - sessionID = core.SessionUnknown - } - if core.Paused(sessionID) > 0 { - return nil - } - - tmpDir := core.StateDir() - markerFile := filepath.Join(tmpDir, "version-checked") - - if core.IsDailyThrottled(markerFile) { - return nil - } - - binaryVer := config.BinaryVersion - - // Skip check for dev builds - if binaryVer == "dev" { - core.TouchFile(markerFile) - return nil - } - - pluginVer, pluginErr := assets.PluginVersion() - if pluginErr != nil { - return nil // embedded plugin.json missing — nothing to compare - } - - bMajor, bMinor, bOK := parseMajorMinor(binaryVer) - pMajor, pMinor, pOK := parseMajorMinor(pluginVer) - - if !bOK || !pOK { - core.TouchFile(markerFile) - return nil - } - 
- if bMajor == pMajor && bMinor == pMinor { - core.TouchFile(markerFile) - return nil - } - - // Version mismatch — emit warning - fallback := fmt.Sprintf("Your ctx binary is v%s but the plugin expects v%s.\n", binaryVer, pluginVer) + - "\nReinstall the binary to get the best out of ctx:\n" + - " go install github.com/ActiveMemory/ctx/cmd/ctx@latest" - content := core.LoadMessage("check-version", "mismatch", - map[string]any{ - "BinaryVersion": binaryVer, - "PluginVersion": pluginVer, - }, fallback) - if content == "" { - core.TouchFile(markerFile) - return nil - } - - msg := "IMPORTANT: Relay this version warning to the user VERBATIM before answering their question.\n\n" + - "┌─ Version Mismatch ─────────────────────────────\n" - msg += core.BoxLines(content) - if line := core.ContextDirLine(); line != "" { - msg += "│ " + line + config.NewlineLF - } - msg += "└────────────────────────────────────────────────" - cmd.Println(msg) - - ref := notify.NewTemplateRef("check-version", "mismatch", - map[string]any{"BinaryVersion": binaryVer, "PluginVersion": pluginVer}) - versionMsg := fmt.Sprintf("check-version: Binary v%s vs plugin v%s", binaryVer, pluginVer) - _ = notify.Send("nudge", versionMsg, input.SessionID, ref) - _ = notify.Send("relay", versionMsg, input.SessionID, ref) - eventlog.Append("relay", versionMsg, input.SessionID, ref) - - core.TouchFile(markerFile) - - // Key age check — piggyback on the daily version check - checkKeyAge(cmd, input.SessionID) - - return nil -} - -// checkKeyAge emits a nudge when the encryption key is older than the -// configured rotation threshold. 
-func checkKeyAge(cmd *cobra.Command, sessionID string) { - config.MigrateKeyFile(rc.ContextDir()) - kp := rc.KeyPath() - info, statErr := os.Stat(kp) - if statErr != nil { - return // no key — nothing to check - } - - ageDays := int(time.Since(info.ModTime()).Hours() / 24) - threshold := rc.KeyRotationDays() - - if ageDays < threshold { - return - } - - keyFallback := fmt.Sprintf("Your encryption key is %d days old.\n", ageDays) + - "Consider rotating: ctx pad rotate-key" - keyContent := core.LoadMessage("check-version", "key-rotation", - map[string]any{"KeyAgeDays": ageDays}, keyFallback) - if keyContent == "" { - return - } - - keyMsg := "\nIMPORTANT: Relay this security reminder to the user VERBATIM.\n\n" + - "┌─ Key Rotation ──────────────────────────────────┐\n" - keyMsg += core.BoxLines(keyContent) - if line := core.ContextDirLine(); line != "" { - keyMsg += "│ " + line + config.NewlineLF - } - keyMsg += "└──────────────────────────────────────────────────┘" - cmd.Println(keyMsg) - - keyRef := notify.NewTemplateRef("check-version", "key-rotation", - map[string]any{"KeyAgeDays": ageDays}) - keyNotifyMsg := fmt.Sprintf("check-version: Encryption key is %d days old", ageDays) - _ = notify.Send("nudge", keyNotifyMsg, sessionID, keyRef) - _ = notify.Send("relay", keyNotifyMsg, sessionID, keyRef) - eventlog.Append("relay", keyNotifyMsg, sessionID, keyRef) -} - -// parseMajorMinor extracts major and minor version numbers from a semver -// string like "1.2.3". Returns ok=false for unparseable versions. 
-func parseMajorMinor(ver string) (major, minor int, ok bool) { - parts := strings.SplitN(ver, ".", 3) - if len(parts) < 2 { - return 0, 0, false - } - var m, n int - if _, scanErr := fmt.Sscanf(parts[0], "%d", &m); scanErr != nil { - return 0, 0, false - } - if _, scanErr := fmt.Sscanf(parts[1], "%d", &n); scanErr != nil { - return 0, 0, false - } - return m, n, true -} diff --git a/internal/cli/system/cmd/context_load_gate/cmd.go b/internal/cli/system/cmd/context_load_gate/cmd.go new file mode 100644 index 00000000..f1d21597 --- /dev/null +++ b/internal/cli/system/cmd/context_load_gate/cmd.go @@ -0,0 +1,33 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package context_load_gate + +import ( + "os" + + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the "ctx system context-load-gate" subcommand. +// +// Returns: +// - *cobra.Command: Configured context-load-gate subcommand +func Cmd() *cobra.Command { + short, long := assets.CommandDesc(assets.CmdDescKeySystemContextLoadGate) + + return &cobra.Command{ + Use: "context-load-gate", + Short: short, + Long: long, + Hidden: true, + RunE: func(cmd *cobra.Command, _ []string) error { + return Run(cmd, os.Stdin) + }, + } +} diff --git a/internal/cli/system/cmd/context_load_gate/doc.go b/internal/cli/system/cmd/context_load_gate/doc.go new file mode 100644 index 00000000..7bece3f6 --- /dev/null +++ b/internal/cli/system/cmd/context_load_gate/doc.go @@ -0,0 +1,12 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package context_load_gate implements the ctx system context-load-gate +// subcommand. +// +// It auto-injects project context into the agent's context window on the +// first tool use per session, with subsequent calls silently skipped. 
+package context_load_gate diff --git a/internal/cli/system/cmd/context_load_gate/run.go b/internal/cli/system/cmd/context_load_gate/run.go new file mode 100644 index 00000000..3ccea724 --- /dev/null +++ b/internal/cli/system/cmd/context_load_gate/run.go @@ -0,0 +1,158 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package context_load_gate + +import ( + "fmt" + "os" + "path/filepath" + "strings" + + "github.com/ActiveMemory/ctx/internal/config/ctx" + "github.com/ActiveMemory/ctx/internal/config/hook" + "github.com/ActiveMemory/ctx/internal/config/load_gate" + "github.com/ActiveMemory/ctx/internal/config/token" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" + changescore "github.com/ActiveMemory/ctx/internal/cli/changes/core" + "github.com/ActiveMemory/ctx/internal/cli/system/core" + "github.com/ActiveMemory/ctx/internal/context" + "github.com/ActiveMemory/ctx/internal/rc" + "github.com/ActiveMemory/ctx/internal/validation" +) + +// Run executes the context-load-gate hook logic. +// +// Injects project context files into the agent's context window on the +// first tool call of each session. Reads context files in priority order, +// extracts indexes for large files, appends a changes summary, and emits +// a webhook notification with token counts. Writes an oversize flag when +// total injected tokens exceed the configured threshold. 
+// +// Parameters: +// - cmd: Cobra command for output +// - stdin: standard input for hook JSON +// +// Returns: +// - error: Always nil (hook errors are non-fatal) +func Run(cmd *cobra.Command, stdin *os.File) error { + if !core.Initialized() { + return nil + } + + input := core.ReadInput(stdin) + if input.SessionID == "" { + return nil + } + + if core.Paused(input.SessionID) > 0 { + return nil + } + + tmpDir := core.StateDir() + marker := filepath.Join(tmpDir, load_gate.PrefixCtxLoaded+input.SessionID) + + if _, statErr := os.Stat(marker); statErr == nil { + return nil // already fired this session + } + + // Create the marker before emitting — ensures one-shot even if + // the agent makes multiple parallel tool calls. + core.TouchFile(marker) + + // Auto-prune stale session state files (best-effort, silent). + // Runs once per session at startup — fast directory scan. + core.AutoPrune(load_gate.AutoPruneStaleDays) + + dir := rc.ContextDir() + var content strings.Builder + var totalTokens int + var filesLoaded int + var perFile []core.FileTokenEntry + + content.WriteString( + assets.TextDesc(assets.TextDescKeyContextLoadGateHeader) + + strings.Repeat( + load_gate.ContextLoadSeparatorChar, load_gate.ContextLoadSeparatorWidth, + ) + + token.NewlineLF + token.NewlineLF, + ) + + for _, f := range ctx.ReadOrder { + if f == ctx.Glossary { + continue + } + + data, readErr := validation.SafeReadFile(dir, f) + if readErr != nil { + continue // file missing — skip gracefully + } + + switch f { + case ctx.Task: + // One-liner mention in footer, don't inject content + continue + + case ctx.Decision, ctx.Learning: + idx := core.ExtractIndex(string(data)) + if idx == "" { + idx = assets.TextDesc(assets.TextDescKeyContextLoadGateIndexFallback) + } + content.WriteString(fmt.Sprintf( + assets.TextDesc(assets.TextDescKeyContextLoadGateIndexHeader), f, idx)) + tokens := context.EstimateTokensString(idx) + totalTokens += tokens + perFile = append(perFile, core.FileTokenEntry{ + 
Name: f + load_gate.ContextLoadIndexSuffix, + Tokens: tokens, + }) + filesLoaded++ + + default: + content.WriteString(fmt.Sprintf( + assets.TextDesc( + assets.TextDescKeyContextLoadGateFileHeader, + ), f, string(data))) + tokens := context.EstimateTokens(data) + totalTokens += tokens + perFile = append(perFile, core.FileTokenEntry{Name: f, Tokens: tokens}) + filesLoaded++ + } + } + + // Best-effort changes summary — never blocks injection + if refTime, refLabel, refErr := changescore.DetectReferenceTime(""); refErr == nil { + ctxChanges, _ := changescore.FindContextChanges(refTime) + codeChanges, _ := changescore.SummarizeCodeChanges(refTime) + if len(ctxChanges) > 0 || codeChanges.CommitCount > 0 { + content.WriteString(token.NewlineLF + changescore.RenderChangesForHook( + refLabel, ctxChanges, codeChanges)) + } + } + + content.WriteString( + strings.Repeat( + load_gate.ContextLoadSeparatorChar, load_gate.ContextLoadSeparatorWidth, + ) + token.NewlineLF) + content.WriteString(fmt.Sprintf( + assets.TextDesc(assets.TextDescKeyContextLoadGateFooter), + filesLoaded, totalTokens)) + + core.PrintHookContext(cmd, hook.EventPreToolUse, content.String()) + + // Webhook: metadata only — never send file content externally + webhookMsg := fmt.Sprintf( + assets.TextDesc(assets.TextDescKeyContextLoadGateWebhook), + filesLoaded, totalTokens) + core.Relay(webhookMsg, input.SessionID, nil) + + // Oversize nudge: write the flag for check-context-size to pick up + core.WriteOversizeFlag(dir, totalTokens, perFile) + + return nil +} diff --git a/internal/cli/system/cmd/contextloadgate/cmd.go b/internal/cli/system/cmd/contextloadgate/cmd.go deleted file mode 100644 index 009e2ca0..00000000 --- a/internal/cli/system/cmd/contextloadgate/cmd.go +++ /dev/null @@ -1,39 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. 
-// SPDX-License-Identifier: Apache-2.0 - -package contextloadgate - -import ( - "os" - - "github.com/spf13/cobra" -) - -// Cmd returns the "ctx system context-load-gate" subcommand. -// -// Returns: -// - *cobra.Command: Configured context-load-gate subcommand -func Cmd() *cobra.Command { - return &cobra.Command{ - Use: "context-load-gate", - Short: "Auto-inject project context on first tool use", - Long: `Auto-injects project context into the agent's context window. -Fires on the first tool use per session via PreToolUse hook. Subsequent -tool calls in the same session are silent (tracked by session marker file). - -Reads context files directly and injects content — no delegation to -bootstrap command, no agent compliance required. -See specs/context-load-gate-v2.md for design rationale. - -Hook event: PreToolUse (.*) -Output: JSON HookResponse (additionalContext) on first tool use, silent otherwise -Silent when: marker exists for session_id, or context not initialized`, - Hidden: true, - RunE: func(cmd *cobra.Command, _ []string) error { - return runContextLoadGate(cmd, os.Stdin) - }, - } -} diff --git a/internal/cli/system/cmd/contextloadgate/run.go b/internal/cli/system/cmd/contextloadgate/run.go deleted file mode 100644 index 2f3ee653..00000000 --- a/internal/cli/system/cmd/contextloadgate/run.go +++ /dev/null @@ -1,186 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. 
-// SPDX-License-Identifier: Apache-2.0 - -package contextloadgate - -import ( - "fmt" - "os" - "path/filepath" - "strings" - "time" - - "github.com/spf13/cobra" - - changescore "github.com/ActiveMemory/ctx/internal/cli/changes/core" - "github.com/ActiveMemory/ctx/internal/cli/system/core" - "github.com/ActiveMemory/ctx/internal/config" - "github.com/ActiveMemory/ctx/internal/context" - "github.com/ActiveMemory/ctx/internal/eventlog" - "github.com/ActiveMemory/ctx/internal/notify" - "github.com/ActiveMemory/ctx/internal/rc" -) - -// fileTokenEntry tracks per-file token counts during injection. -type fileTokenEntry struct { - name string - tokens int -} - -func runContextLoadGate(cmd *cobra.Command, stdin *os.File) error { - if !core.IsInitialized() { - return nil - } - - input := core.ReadInput(stdin) - if input.SessionID == "" { - return nil - } - - if core.Paused(input.SessionID) > 0 { - return nil - } - - tmpDir := core.StateDir() - marker := filepath.Join(tmpDir, "ctx-loaded-"+input.SessionID) - - if _, statErr := os.Stat(marker); statErr == nil { - return nil // already fired this session - } - - // Create marker before emitting — ensures one-shot even if - // the agent makes multiple parallel tool calls. - core.TouchFile(marker) - - // Auto-prune stale session state files (best-effort, silent). - // Runs once per session at startup — fast directory scan. 
- core.AutoPrune(7) - - dir := rc.ContextDir() - var content strings.Builder - var totalTokens int - var filesLoaded int - var perFile []fileTokenEntry - - content.WriteString( - "PROJECT CONTEXT (auto-loaded by system hook" + - " — already in your context window)\n" + - strings.Repeat("=", 80) + "\n\n") - - for _, f := range config.FileReadOrder { - if f == config.FileGlossary { - continue - } - - path := filepath.Join(dir, f) - data, readErr := os.ReadFile(path) //#nosec G304 — path is within .context/ - if readErr != nil { - continue // file missing — skip gracefully - } - - switch f { - case config.FileTask: - // One-liner mention in footer, don't inject content - continue - - case config.FileDecision, config.FileLearning: - idx := extractIndex(string(data)) - if idx == "" { - idx = "(no index entries)" - } - content.WriteString(fmt.Sprintf( - "--- %s (index — read full entries by date "+ - "when relevant) ---\n%s\n\n", f, idx)) - tokens := context.EstimateTokensString(idx) - totalTokens += tokens - perFile = append(perFile, fileTokenEntry{name: f + " (idx)", tokens: tokens}) - filesLoaded++ - - default: - content.WriteString(fmt.Sprintf( - "--- %s ---\n%s\n\n", f, string(data))) - tokens := context.EstimateTokens(data) - totalTokens += tokens - perFile = append(perFile, fileTokenEntry{name: f, tokens: tokens}) - filesLoaded++ - } - } - - // Best-effort changes summary — never blocks injection - if refTime, refLabel, refErr := changescore.DetectReferenceTime(""); refErr == nil { - ctxChanges, _ := changescore.FindContextChanges(refTime) - codeChanges, _ := changescore.SummarizeCodeChanges(refTime) - if len(ctxChanges) > 0 || codeChanges.CommitCount > 0 { - content.WriteString(config.NewlineLF + changescore.RenderChangesForHook( - refLabel, ctxChanges, codeChanges)) - } - } - - content.WriteString(strings.Repeat("=", 80) + config.NewlineLF) - content.WriteString(fmt.Sprintf( - "Context: %d files loaded (~%d tokens). 
"+ - "Order follows config.FileReadOrder.\n\n"+ - "TASKS.md contains the project's prioritized work items. "+ - "Read it when discussing priorities, picking up work, "+ - "or when the user asks about tasks.\n\n"+ - "For full decision or learning details, read the entry "+ - "in DECISIONS.md or LEARNINGS.md by timestamp.\n", - filesLoaded, totalTokens)) - - core.PrintHookContext(cmd, "PreToolUse", content.String()) - - // Webhook: metadata only — never send file content externally - webhookMsg := fmt.Sprintf( - "context-load-gate: injected %d files (~%d tokens)", - filesLoaded, totalTokens) - _ = notify.Send("relay", webhookMsg, input.SessionID, nil) - eventlog.Append("relay", webhookMsg, input.SessionID, nil) - - // Oversize nudge: write flag for check-context-size to pick up - writeOversizeFlag(dir, totalTokens, perFile) - - return nil -} - -// writeOversizeFlag writes an injection-oversize flag file when the total -// injected tokens exceed the configured threshold. -func writeOversizeFlag(contextDir string, totalTokens int, perFile []fileTokenEntry) { - threshold := rc.InjectionTokenWarn() - if threshold == 0 || totalTokens <= threshold { - return - } - - sd := filepath.Join(contextDir, config.DirState) - _ = os.MkdirAll(sd, 0o750) - - var flag strings.Builder - flag.WriteString("Context injection oversize warning\n") - flag.WriteString(strings.Repeat("=", 35) + config.NewlineLF) - flag.WriteString(fmt.Sprintf("Timestamp: %s\n", time.Now().UTC().Format(time.RFC3339))) - flag.WriteString(fmt.Sprintf("Injected: %d tokens (threshold: %d)\n\n", totalTokens, threshold)) - flag.WriteString("Per-file breakdown:\n") - for _, entry := range perFile { - flag.WriteString(fmt.Sprintf(" %-22s %5d tokens\n", entry.name, entry.tokens)) - } - flag.WriteString("\nAction: Run /ctx-consolidate to distill context files.\n") - flag.WriteString("Files with the most growth are the best candidates.\n") - - _ = os.WriteFile( - filepath.Join(sd, "injection-oversize"), - 
[]byte(flag.String()), 0o600) -} - -// extractIndex returns the content between INDEX:START and INDEX:END -// markers, or empty string if markers are not found. -func extractIndex(content string) string { - start := strings.Index(content, config.IndexStart) - end := strings.Index(content, config.IndexEnd) - if start < 0 || end < 0 || end <= start { - return "" - } - startPos := start + len(config.IndexStart) - return strings.TrimSpace(content[startPos:end]) -} diff --git a/internal/cli/system/cmd/events/cmd.go b/internal/cli/system/cmd/events/cmd.go index 9df23c69..0bf0c416 100644 --- a/internal/cli/system/cmd/events/cmd.go +++ b/internal/cli/system/cmd/events/cmd.go @@ -17,33 +17,35 @@ import ( // Returns: // - *cobra.Command: Configured events subcommand func Cmd() *cobra.Command { + short, long := assets.CommandDesc(assets.CmdDescKeySystemEvents) + cmd := &cobra.Command{ Use: "events", - Short: "Query the local hook event log", - Long: `Query the local event log (requires event_log: true in .ctxrc). - -Reads events from .context/state/events.jsonl and outputs them in -human-readable or raw JSONL format. All filter flags use intersection -(AND) logic. 
- -Flags: - --hook Filter by hook name - --session Filter by session ID - --event Filter by event type (relay, nudge) - --last Show last N events (default 50) - --json Output raw JSONL (for piping to jq) - --all Include rotated log file`, + Short: short, + Long: long, RunE: func(cmd *cobra.Command, _ []string) error { - return runEvents(cmd) + return Run(cmd) }, } - cmd.Flags().StringP("hook", "k", "", assets.FlagDesc("system.events.hook")) - cmd.Flags().StringP("session", "s", "", assets.FlagDesc("system.events.session")) - cmd.Flags().StringP("event", "e", "", assets.FlagDesc("system.events.event")) - cmd.Flags().IntP("last", "n", 50, assets.FlagDesc("system.events.last")) - cmd.Flags().BoolP("json", "j", false, assets.FlagDesc("system.events.json")) - cmd.Flags().BoolP("all", "a", false, assets.FlagDesc("system.events.all")) + cmd.Flags().StringP( + "hook", "k", "", assets.FlagDesc(assets.FlagDescKeySystemEventsHook), + ) + cmd.Flags().StringP( + "session", "s", "", assets.FlagDesc(assets.FlagDescKeySystemEventsSession), + ) + cmd.Flags().StringP( + "event", "e", "", assets.FlagDesc(assets.FlagDescKeySystemEventsEvent), + ) + cmd.Flags().IntP( + "last", "n", 50, assets.FlagDesc(assets.FlagDescKeySystemEventsLast), + ) + cmd.Flags().BoolP( + "json", "j", false, assets.FlagDesc(assets.FlagDescKeySystemEventsJson), + ) + cmd.Flags().BoolP( + "all", "a", false, assets.FlagDesc(assets.FlagDescKeySystemEventsAll), + ) return cmd } diff --git a/internal/cli/system/cmd/events/doc.go b/internal/cli/system/cmd/events/doc.go new file mode 100644 index 00000000..ff128919 --- /dev/null +++ b/internal/cli/system/cmd/events/doc.go @@ -0,0 +1,10 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package events implements the ctx system events subcommand. +// +// It queries and displays entries from the local hook event log. 
+package events diff --git a/internal/cli/system/cmd/events/run.go b/internal/cli/system/cmd/events/run.go index 55ef0b71..84e09c79 100644 --- a/internal/cli/system/cmd/events/run.go +++ b/internal/cli/system/cmd/events/run.go @@ -7,18 +7,23 @@ package events import ( - "encoding/json" - "fmt" - "strings" - "time" - "github.com/spf13/cobra" + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/cli/system/core" + ctxerr "github.com/ActiveMemory/ctx/internal/err" "github.com/ActiveMemory/ctx/internal/eventlog" - "github.com/ActiveMemory/ctx/internal/notify" ) -func runEvents(cmd *cobra.Command) error { +// Run executes the events subcommand, querying and displaying event log +// entries filtered by hook, session, event type, and count. +// +// Parameters: +// - cmd: Cobra command for flag access and output +// +// Returns: +// - error: Non-nil on event log read failure +func Run(cmd *cobra.Command) error { hook, _ := cmd.Flags().GetString("hook") session, _ := cmd.Flags().GetString("session") event, _ := cmd.Flags().GetString("event") @@ -36,68 +41,16 @@ func runEvents(cmd *cobra.Command) error { evts, queryErr := eventlog.Query(opts) if queryErr != nil { - return fmt.Errorf("reading event log: %w", queryErr) + return ctxerr.EventLogRead(queryErr) } if len(evts) == 0 { - cmd.Println("No events logged.") + cmd.Println(assets.TextDesc(assets.TextDescKeyEventsEmpty)) return nil } if jsonOut { - return outputEventsJSON(cmd, evts) - } - return outputEventsHuman(cmd, evts) -} - -// outputEventsJSON writes events as raw JSONL. -func outputEventsJSON(cmd *cobra.Command, evts []notify.Payload) error { - for _, e := range evts { - line, marshalErr := json.Marshal(e) - if marshalErr != nil { - continue - } - cmd.Println(string(line)) - } - return nil -} - -// outputEventsHuman writes events in aligned columns. 
-func outputEventsHuman(cmd *cobra.Command, evts []notify.Payload) error { - for _, e := range evts { - ts := formatEventTimestamp(e.Timestamp) - hookName := extractHookName(e) - msg := truncateMessage(e.Message, 60) - cmd.Println(fmt.Sprintf("%-19s %-5s %-24s %s", ts, e.Event, hookName, msg)) - } - return nil -} - -// formatEventTimestamp converts an RFC3339 timestamp to local time display. -func formatEventTimestamp(ts string) string { - t, parseErr := time.Parse(time.RFC3339, ts) - if parseErr != nil { - return ts - } - return t.Local().Format("2006-01-02 15:04:05") -} - -// extractHookName gets the hook name from the event payload detail. -func extractHookName(e notify.Payload) string { - if e.Detail != nil && e.Detail.Hook != "" { - return e.Detail.Hook - } - // Fall back to extracting from message prefix (e.g., "qa-reminder: ...") - if idx := strings.Index(e.Message, ":"); idx > 0 { - return e.Message[:idx] - } - return "-" -} - -// truncateMessage limits message length for display. -func truncateMessage(msg string, maxLen int) string { - if len(msg) <= maxLen { - return msg + return core.OutputEventsJSON(cmd, evts) } - return msg[:maxLen-3] + "..." + return core.OutputEventsHuman(cmd, evts) } diff --git a/internal/cli/system/cmd/heartbeat/cmd.go b/internal/cli/system/cmd/heartbeat/cmd.go index 93356ef9..73d5f419 100644 --- a/internal/cli/system/cmd/heartbeat/cmd.go +++ b/internal/cli/system/cmd/heartbeat/cmd.go @@ -7,18 +7,11 @@ package heartbeat import ( - "fmt" "os" - "path/filepath" - "strconv" - "strings" "github.com/spf13/cobra" - "github.com/ActiveMemory/ctx/internal/cli/system/core" - "github.com/ActiveMemory/ctx/internal/eventlog" - "github.com/ActiveMemory/ctx/internal/notify" - "github.com/ActiveMemory/ctx/internal/rc" + "github.com/ActiveMemory/ctx/internal/assets" ) // Cmd returns the "ctx system heartbeat" subcommand. 
@@ -26,114 +19,15 @@ import (
 // Returns:
 //   - *cobra.Command: Configured heartbeat subcommand
 func Cmd() *cobra.Command {
-	return &cobra.Command{
-		Use:   "heartbeat",
-		Short: "Session heartbeat webhook",
-		Long: `Sends a heartbeat webhook notification on every prompt, providing
-continuous session-alive visibility with metadata (prompt count, session ID,
-context modification status).
-
-Unlike other hooks, the heartbeat never produces stdout — the agent never
-sees it. It only fires a webhook and writes to the event log.
+	short, long := assets.CommandDesc(assets.CmdDescKeySystemHeartbeat)
 
-Hook event: UserPromptSubmit
-Output: none (webhook + event log only)
-Silent when: not initialized, paused, or no webhook configured`,
+	return &cobra.Command{
+		Use:    "heartbeat",
+		Short:  short,
+		Long:   long,
 		Hidden: true,
 		RunE: func(cmd *cobra.Command, _ []string) error {
-			return runHeartbeat(cmd, os.Stdin)
+			return Run(cmd, os.Stdin)
 		},
 	}
 }
-
-func runHeartbeat(_ *cobra.Command, stdin *os.File) error {
-	if !core.IsInitialized() {
-		return nil
-	}
-	input := core.ReadInput(stdin)
-	sessionID := input.SessionID
-	if sessionID == "" {
-		sessionID = core.SessionUnknown
-	}
-	if core.Paused(sessionID) > 0 {
-		return nil
-	}
-
-	tmpDir := core.StateDir()
-	counterFile := filepath.Join(tmpDir, "heartbeat-"+sessionID)
-	mtimeFile := filepath.Join(tmpDir, "heartbeat-mtime-"+sessionID)
-	contextDir := rc.ContextDir()
-	logFile := filepath.Join(contextDir, "logs", "heartbeat.log")
-
-	// Increment prompt counter.
-	count := core.ReadCounter(counterFile) + 1
-	core.WriteCounter(counterFile, count)
-
-	// Detect context modification since last heartbeat.
-	currentMtime := core.GetLatestContextMtime(contextDir)
-	lastMtime := readMtime(mtimeFile)
-	contextModified := currentMtime > lastMtime
-	writeMtime(mtimeFile, currentMtime)
-
-	// Read token usage for this session.
-	info, _ := core.ReadSessionTokenInfo(sessionID)
-	tokens := info.Tokens
-	window := core.EffectiveContextWindow(info.Model)
-
-	// Build and send notification.
-	vars := map[string]any{
-		"prompt_count":     count,
-		"session_id":       sessionID,
-		"context_modified": contextModified,
-	}
-	if tokens > 0 {
-		pct := tokens * 100 / window
-		vars["tokens"] = tokens
-		vars["context_window"] = window
-		vars["usage_pct"] = pct
-	}
-	ref := notify.NewTemplateRef("heartbeat", "pulse", vars)
-
-	var msg string
-	if tokens > 0 {
-		pct := tokens * 100 / window
-		msg = fmt.Sprintf("heartbeat: prompt #%d (context_modified=%t tokens=%s pct=%d%%)",
-			count, contextModified, core.FormatTokenCount(tokens), pct)
-	} else {
-		msg = fmt.Sprintf("heartbeat: prompt #%d (context_modified=%t)", count, contextModified)
-	}
-	_ = notify.Send("heartbeat", msg, sessionID, ref)
-	eventlog.Append("heartbeat", msg, sessionID, ref)
-
-	var logLine string
-	if tokens > 0 {
-		pct := tokens * 100 / window
-		logLine = fmt.Sprintf("prompt#%d context_modified=%t tokens=%s pct=%d%%",
-			count, contextModified, core.FormatTokenCount(tokens), pct)
-	} else {
-		logLine = fmt.Sprintf("prompt#%d context_modified=%t", count, contextModified)
-	}
-	core.LogMessage(logFile, sessionID, logLine)
-
-	// No stdout — agent never sees this hook.
-	return nil
-}
-
-// readMtime reads a stored mtime value from a file. Returns 0 if the
-// file does not exist or cannot be parsed.
-func readMtime(path string) int64 {
-	data, readErr := os.ReadFile(path) //nolint:gosec // temp file path
-	if readErr != nil {
-		return 0
-	}
-	n, parseErr := strconv.ParseInt(strings.TrimSpace(string(data)), 10, 64)
-	if parseErr != nil {
-		return 0
-	}
-	return n
-}
-
-// writeMtime writes a mtime value to a file.
-func writeMtime(path string, mtime int64) {
-	_ = os.WriteFile(path, []byte(strconv.FormatInt(mtime, 10)), 0o600)
-}
diff --git a/internal/cli/system/cmd/heartbeat/doc.go b/internal/cli/system/cmd/heartbeat/doc.go
new file mode 100644
index 00000000..73eda2fd
--- /dev/null
+++ b/internal/cli/system/cmd/heartbeat/doc.go
@@ -0,0 +1,11 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package heartbeat implements the ctx system heartbeat subcommand.
+//
+// It sends a heartbeat webhook on every prompt for continuous
+// session-alive visibility, without producing any agent-visible output.
+package heartbeat
diff --git a/internal/cli/system/cmd/heartbeat/run.go b/internal/cli/system/cmd/heartbeat/run.go
new file mode 100644
index 00000000..648a74e8
--- /dev/null
+++ b/internal/cli/system/cmd/heartbeat/run.go
@@ -0,0 +1,110 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package heartbeat
+
+import (
+	"fmt"
+	"os"
+	"path/filepath"
+
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+	"github.com/ActiveMemory/ctx/internal/config/heartbeat"
+	"github.com/ActiveMemory/ctx/internal/config/hook"
+	"github.com/ActiveMemory/ctx/internal/config/stats"
+	"github.com/ActiveMemory/ctx/internal/config/tpl"
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/cli/system/core"
+	"github.com/ActiveMemory/ctx/internal/eventlog"
+	"github.com/ActiveMemory/ctx/internal/notify"
+	"github.com/ActiveMemory/ctx/internal/rc"
+)
+
+// Run executes the heartbeat hook logic.
+//
+// Increments a per-session prompt counter, detects context file
+// modifications since the last heartbeat, reads token usage, and
+// emits a notification plus event log entry. Produces no stdout
+// output — the agent never sees this hook.
+//
+// Parameters:
+//   - cmd: Cobra command (unused, heartbeat produces no output)
+//   - stdin: standard input for hook JSON
+//
+// Returns:
+//   - error: Always nil (hook errors are non-fatal)
+func Run(_ *cobra.Command, stdin *os.File) error {
+	if !core.Initialized() {
+		return nil
+	}
+	_, sessionID, paused := core.HookPreamble(stdin)
+	if paused {
+		return nil
+	}
+
+	tmpDir := core.StateDir()
+	counterFile := filepath.Join(tmpDir, heartbeat.HeartbeatCounterPrefix+sessionID)
+	mtimeFile := filepath.Join(tmpDir, heartbeat.HeartbeatMtimePrefix+sessionID)
+	contextDir := rc.ContextDir()
+	logFile := filepath.Join(contextDir, dir.Logs, heartbeat.HeartbeatLogFile)
+
+	// Increment prompt counter.
+	count := core.ReadCounter(counterFile) + 1
+	core.WriteCounter(counterFile, count)
+
+	// Detect context modification since the last heartbeat.
+	currentMtime := core.GetLatestContextMtime(contextDir)
+	lastMtime := core.ReadMtime(mtimeFile)
+	contextModified := currentMtime > lastMtime
+	core.WriteMtime(mtimeFile, currentMtime)
+
+	// Read token usage for this session.
+	info, _ := core.ReadSessionTokenInfo(sessionID)
+	tokens := info.Tokens
+	window := core.EffectiveContextWindow(info.Model)
+
+	// Build and send notification.
+	vars := map[string]any{
+		tpl.VarHeartbeatPromptCount:     count,
+		tpl.VarHeartbeatSessionID:       sessionID,
+		tpl.VarHeartbeatContextModified: contextModified,
+	}
+	if tokens > 0 {
+		pct := tokens * stats.PercentMultiplier / window
+		vars[tpl.VarHeartbeatTokens] = tokens
+		vars[tpl.VarHeartbeatContextWindow] = window
+		vars[tpl.VarHeartbeatUsagePct] = pct
+	}
+	ref := notify.NewTemplateRef(hook.Heartbeat, hook.VariantPulse, vars)
+
+	var msg string
+	if tokens > 0 {
+		pct := tokens * stats.PercentMultiplier / window
+		msg = fmt.Sprintf(assets.TextDesc(assets.TextDescKeyHeartbeatNotifyTokens),
+			count, contextModified, core.FormatTokenCount(tokens), pct)
+	} else {
+		msg = fmt.Sprintf(assets.TextDesc(assets.TextDescKeyHeartbeatNotifyPlain),
+			count, contextModified)
+	}
+	_ = notify.Send(hook.NotifyChannelHeartbeat, msg, sessionID, ref)
+	eventlog.Append(hook.NotifyChannelHeartbeat, msg, sessionID, ref)
+
+	var logLine string
+	if tokens > 0 {
+		pct := tokens * stats.PercentMultiplier / window
+		logLine = fmt.Sprintf(assets.TextDesc(assets.TextDescKeyHeartbeatLogTokens),
+			count, contextModified, core.FormatTokenCount(tokens), pct)
+	} else {
+		logLine = fmt.Sprintf(assets.TextDesc(assets.TextDescKeyHeartbeatLogPlain),
+			count, contextModified)
+	}
+	core.LogMessage(logFile, sessionID, logLine)
+
+	// No stdout — agent never sees this hook.
+	return nil
+}
diff --git a/internal/cli/system/cmd/markjournal/cmd.go b/internal/cli/system/cmd/mark_journal/cmd.go
similarity index 52%
rename from internal/cli/system/cmd/markjournal/cmd.go
rename to internal/cli/system/cmd/mark_journal/cmd.go
index c6409878..c05722f7 100644
--- a/internal/cli/system/cmd/markjournal/cmd.go
+++ b/internal/cli/system/cmd/mark_journal/cmd.go
@@ -4,7 +4,7 @@
 // \ Copyright 2026-present Context contributors.
 // SPDX-License-Identifier: Apache-2.0
 
-package markjournal
+package mark_journal
 
 import (
 	"fmt"
@@ -21,20 +21,12 @@ import (
 // Returns:
 //   - *cobra.Command: Configured mark-journal subcommand
 func Cmd() *cobra.Command {
-	cmd := &cobra.Command{
-		Use:   "mark-journal <filename> <stage>",
-		Short: "Update journal processing state",
-		Long: fmt.Sprintf(`Mark a journal entry as having completed a processing stage.
-
-Valid stages: %s
+	short, long := assets.CommandDesc(assets.CmdDescKeySystemMarkJournal)
 
-The state is recorded in .context/journal/.state.json with today's date.
-
-Examples:
-  ctx system mark-journal 2026-01-21-session-abc12345.md exported
-  ctx system mark-journal 2026-01-21-session-abc12345.md enriched
-  ctx system mark-journal 2026-01-21-session-abc12345.md normalized
-  ctx system mark-journal 2026-01-21-session-abc12345.md fences_verified`, strings.Join(state.ValidStages, ", ")),
+	cmd := &cobra.Command{
+		Use:   "mark-journal <filename> <stage>",
+		Short: short,
+		Long:  fmt.Sprintf(long, strings.Join(state.ValidStages, ", ")),
 		Hidden: true,
 		Args:   cobra.ExactArgs(2), //nolint:mnd // 2 positional args: filename, stage
 		RunE: func(cmd *cobra.Command, args []string) error {
@@ -42,7 +34,7 @@ Examples:
 		},
 	}
 
-	cmd.Flags().Bool("check", false, assets.FlagDesc("system.markjournal.check"))
+	cmd.Flags().Bool("check", false, assets.FlagDesc(assets.FlagDescKeySystemMarkjournalCheck))
 
 	return cmd
 }
diff --git a/internal/cli/system/cmd/mark_journal/doc.go b/internal/cli/system/cmd/mark_journal/doc.go
new file mode 100644
index 00000000..4f726f56
--- /dev/null
+++ b/internal/cli/system/cmd/mark_journal/doc.go
@@ -0,0 +1,10 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package mark_journal implements the ctx system mark-journal subcommand.
+//
+// It updates the processing stage of a journal entry in the state file.
+package mark_journal
diff --git a/internal/cli/system/cmd/mark_journal/run.go b/internal/cli/system/cmd/mark_journal/run.go
new file mode 100644
index 00000000..e9670328
--- /dev/null
+++ b/internal/cli/system/cmd/mark_journal/run.go
@@ -0,0 +1,77 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package mark_journal
+
+import (
+	"fmt"
+	"strings"
+
+	"github.com/ActiveMemory/ctx/internal/config/journal"
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+	ctxcontext "github.com/ActiveMemory/ctx/internal/context"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
+	"github.com/ActiveMemory/ctx/internal/journal/state"
+)
+
+// runMarkJournal handles the mark-journal command.
+//
+// Marks a journal file as having reached a given processing stage, or
+// checks the current stage value when --check is set.
+//
+// Parameters:
+//   - cmd: Cobra command for output and flag access
+//   - filename: journal filename to mark or check
+//   - stage: processing stage name (exported, enriched, normalized, etc.)
+//
+// Returns:
+//   - error: Non-nil on state load/save failure or unknown stage
+func runMarkJournal(cmd *cobra.Command, filename, stage string) error {
+	journalDir := ctxcontext.ResolvedJournalDir()
+
+	jstate, loadErr := state.Load(journalDir)
+	if loadErr != nil {
+		return ctxerr.LoadJournalStateFailed(loadErr)
+	}
+
+	check, _ := cmd.Flags().GetBool("check")
+	if check {
+		fs := jstate.Entries[filename]
+		var val string
+		switch stage {
+		case journal.StageExported:
+			val = fs.Exported
+		case journal.StageEnriched:
+			val = fs.Enriched
+		case journal.StageNormalized:
+			val = fs.Normalized
+		case journal.StageFencesVerified:
+			val = fs.FencesVerified
+		case journal.StageLocked:
+			val = fs.Locked
+		default:
+			return ctxerr.UnknownStage(stage, strings.Join(state.ValidStages, ", "))
+		}
+		if val == "" {
+			return ctxerr.StageNotSet(filename, stage)
+		}
+		cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyMarkJournalChecked), filename, stage, val))
+		return nil
+	}
+
+	if ok := jstate.Mark(filename, stage); !ok {
+		return ctxerr.UnknownStage(stage, strings.Join(state.ValidStages, ", "))
+	}
+
+	if saveErr := jstate.Save(journalDir); saveErr != nil {
+		return ctxerr.SaveJournalStateFailed(saveErr)
+	}
+
+	cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyMarkJournalMarked), filename, stage))
+	return nil
+}
diff --git a/internal/cli/system/cmd/mark_wrapped_up/cmd.go b/internal/cli/system/cmd/mark_wrapped_up/cmd.go
new file mode 100644
index 00000000..f5bf73d4
--- /dev/null
+++ b/internal/cli/system/cmd/mark_wrapped_up/cmd.go
@@ -0,0 +1,31 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package mark_wrapped_up
+
+import (
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+)
+
+// Cmd returns the "ctx system mark-wrapped-up" subcommand.
+//
+// Returns:
+//   - *cobra.Command: Configured mark-wrapped-up subcommand
+func Cmd() *cobra.Command {
+	short, long := assets.CommandDesc(assets.CmdDescKeySystemMarkWrappedUp)
+
+	return &cobra.Command{
+		Use:    "mark-wrapped-up",
+		Short:  short,
+		Long:   long,
+		Hidden: true,
+		RunE: func(cmd *cobra.Command, _ []string) error {
+			return Run(cmd)
+		},
+	}
+}
diff --git a/internal/cli/system/cmd/mark_wrapped_up/doc.go b/internal/cli/system/cmd/mark_wrapped_up/doc.go
new file mode 100644
index 00000000..5a7efc08
--- /dev/null
+++ b/internal/cli/system/cmd/mark_wrapped_up/doc.go
@@ -0,0 +1,12 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package mark_wrapped_up implements the ctx system mark-wrapped-up
+// subcommand.
+//
+// It writes a marker file that suppresses context checkpoint nudges for
+// two hours after a wrap-up ceremony.
+package mark_wrapped_up
diff --git a/internal/cli/system/cmd/mark_wrapped_up/run.go b/internal/cli/system/cmd/mark_wrapped_up/run.go
new file mode 100644
index 00000000..8d7410d1
--- /dev/null
+++ b/internal/cli/system/cmd/mark_wrapped_up/run.go
@@ -0,0 +1,42 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package mark_wrapped_up
+
+import (
+	"os"
+	"path/filepath"
+
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	"github.com/ActiveMemory/ctx/internal/config/wrap"
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/cli/system/core"
+	"github.com/ActiveMemory/ctx/internal/write"
+)
+
+// Run creates or updates the wrap-up marker file.
+//
+// Writes the marker so that nudge hooks (ceremonies, persistence, etc.)
+// are suppressed for WrappedUpExpiry after a wrap-up ceremony completes.
+//
+// Parameters:
+//   - cmd: Cobra command for output
+//
+// Returns:
+//   - error: Non-nil if the marker file cannot be written
+func Run(cmd *cobra.Command) error {
+	markerPath := filepath.Join(core.StateDir(), wrap.WrappedUpMarker)
+
+	if writeErr := os.WriteFile(
+		markerPath, []byte(wrap.WrappedUpContent), fs.PermSecret,
+	); writeErr != nil {
+		return writeErr
+	}
+
+	write.SessionWrappedUp(cmd)
+	return nil
+}
diff --git a/internal/cli/system/cmd/markjournal/run.go b/internal/cli/system/cmd/markjournal/run.go
deleted file mode 100644
index c5979afe..00000000
--- a/internal/cli/system/cmd/markjournal/run.go
+++ /dev/null
@@ -1,65 +0,0 @@
-// / ctx: https://ctx.ist
-// ,'`./ do you remember?
-// `.,'\
-// \ Copyright 2026-present Context contributors.
-// SPDX-License-Identifier: Apache-2.0
-
-package markjournal
-
-import (
-	"fmt"
-	"path/filepath"
-	"strings"
-
-	"github.com/spf13/cobra"
-
-	"github.com/ActiveMemory/ctx/internal/config"
-	"github.com/ActiveMemory/ctx/internal/journal/state"
-	"github.com/ActiveMemory/ctx/internal/rc"
-)
-
-// runMarkJournal handles the mark-journal command.
-func runMarkJournal(cmd *cobra.Command, filename, stage string) error {
-	journalDir := filepath.Join(rc.ContextDir(), config.DirJournal)
-
-	jstate, loadErr := state.Load(journalDir)
-	if loadErr != nil {
-		return fmt.Errorf("load journal state: %w", loadErr)
-	}
-
-	check, _ := cmd.Flags().GetBool("check")
-	if check {
-		fs := jstate.Entries[filename]
-		var val string
-		switch stage {
-		case "exported":
-			val = fs.Exported
-		case "enriched":
-			val = fs.Enriched
-		case "normalized":
-			val = fs.Normalized
-		case "fences_verified":
-			val = fs.FencesVerified
-		case "locked":
-			val = fs.Locked
-		default:
-			return fmt.Errorf("unknown stage %q; valid: %s", stage, strings.Join(state.ValidStages, ", "))
-		}
-		if val == "" {
-			return fmt.Errorf("%s: %s not set", filename, stage)
-		}
-		cmd.Println(fmt.Sprintf("%s: %s = %s", filename, stage, val))
-		return nil
-	}
-
-	if ok := jstate.Mark(filename, stage); !ok {
-		return fmt.Errorf("unknown stage %q; valid: %s", stage, strings.Join(state.ValidStages, ", "))
-	}
-
-	if saveErr := jstate.Save(journalDir); saveErr != nil {
-		return fmt.Errorf("save journal state: %w", saveErr)
-	}
-
-	cmd.Println(fmt.Sprintf("%s: marked %s", filename, stage))
-	return nil
-}
diff --git a/internal/cli/system/cmd/markwrappedup/cmd.go b/internal/cli/system/cmd/markwrappedup/cmd.go
deleted file mode 100644
index 63e3d03d..00000000
--- a/internal/cli/system/cmd/markwrappedup/cmd.go
+++ /dev/null
@@ -1,53 +0,0 @@
-// / ctx: https://ctx.ist
-// ,'`./ do you remember?
-// `.,'\
-// \ Copyright 2026-present Context contributors.
-// SPDX-License-Identifier: Apache-2.0
-
-package markwrappedup
-
-import (
-	"os"
-	"path/filepath"
-
-	"github.com/spf13/cobra"
-
-	"github.com/ActiveMemory/ctx/internal/cli/system/core"
-)
-
-// Cmd returns the "ctx system mark-wrapped-up" subcommand.
-//
-// Returns:
-//   - *cobra.Command: Configured mark-wrapped-up subcommand
-func Cmd() *cobra.Command {
-	return &cobra.Command{
-		Use:   "mark-wrapped-up",
-		Short: "Suppress checkpoint nudges after wrap-up",
-		Long: `Write a marker file that suppresses context checkpoint nudges
-for 2 hours. Called by /ctx-wrap-up after persisting context.
-
-The check-context-size hook checks this marker before emitting
-a checkpoint. If the marker exists and is less than 2 hours old,
-the nudge is suppressed.
-
-This is a plumbing command — use /ctx-wrap-up instead.`,
-		Hidden: true,
-		RunE: func(cmd *cobra.Command, _ []string) error {
-			return runMarkWrappedUp(cmd)
-		},
-	}
-}
-
-// runMarkWrappedUp creates or updates the wrap-up marker file.
-func runMarkWrappedUp(cmd *cobra.Command) error {
-	markerPath := filepath.Join(core.StateDir(), core.WrappedUpMarker)
-
-	if writeErr := os.WriteFile(
-		markerPath, []byte("wrapped-up"), 0o600,
-	); writeErr != nil {
-		return writeErr
-	}
-
-	cmd.Println("marked wrapped-up")
-	return nil
-}
diff --git a/internal/cli/system/cmd/message/cmd.go b/internal/cli/system/cmd/message/cmd.go
index 61c7043f..ce3e0db1 100644
--- a/internal/cli/system/cmd/message/cmd.go
+++ b/internal/cli/system/cmd/message/cmd.go
@@ -8,6 +8,8 @@
 package message
 
 import (
 	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
 )
 
 // Cmd returns the "ctx system message" subcommand.
@@ -15,20 +17,12 @@
 // Returns:
 //   - *cobra.Command: Configured message subcommand with sub-subcommands
 func Cmd() *cobra.Command {
+	short, long := assets.CommandDesc(assets.CmdDescKeySystemMessage)
+
 	cmd := &cobra.Command{
 		Use: "message",
-		Short: "Manage hook message templates",
-		Long: `Manage hook message templates.
-
-Hook messages control what text hooks emit. The hook logic (when to
-fire, counting, state tracking) is universal. The messages are opinions
-that can be customized per-project.
-
-Subcommands:
-  list   Show all hook messages with category and override status
-  show   Print the effective message template for a hook/variant
-  edit   Copy the embedded default to .context/ for editing
-  reset  Delete a user override and revert to embedded default`,
+		Short: short,
+		Long:  long,
 	}
 
 	cmd.AddCommand(
@@ -40,3 +34,60 @@ Subcommands:
 	return cmd
 }
+
+// messageListCmd returns the "ctx system message list" subcommand.
+func messageListCmd() *cobra.Command {
+	short, _ := assets.CommandDesc(assets.CmdDescKeySystemMessageList)
+
+	cmd := &cobra.Command{
+		Use:   "list",
+		Short: short,
+		RunE: func(cmd *cobra.Command, _ []string) error {
+			return RunMessageList(cmd)
+		},
+	}
+	cmd.Flags().Bool("json", false, assets.FlagDesc(assets.FlagDescKeySystemMessageJson))
+	return cmd
+}
+
+// messageShowCmd returns the "ctx system message show" subcommand.
+func messageShowCmd() *cobra.Command {
+	short, _ := assets.CommandDesc(assets.CmdDescKeySystemMessageShow)
+
+	return &cobra.Command{
+		Use:   "show <hook> <variant>",
+		Short: short,
+		Args:  cobra.ExactArgs(2),
+		RunE: func(cmd *cobra.Command, args []string) error {
+			return RunMessageShow(cmd, args[0], args[1])
+		},
+	}
+}
+
+// messageEditCmd returns the "ctx system message edit" subcommand.
+func messageEditCmd() *cobra.Command {
+	short, _ := assets.CommandDesc(assets.CmdDescKeySystemMessageEdit)
+
+	return &cobra.Command{
+		Use:   "edit <hook> <variant>",
+		Short: short,
+		Args:  cobra.ExactArgs(2),
+		RunE: func(cmd *cobra.Command, args []string) error {
+			return RunMessageEdit(cmd, args[0], args[1])
+		},
+	}
+}
+
+// messageResetCmd returns the "ctx system message reset" subcommand.
+func messageResetCmd() *cobra.Command {
+	short, _ := assets.CommandDesc(assets.CmdDescKeySystemMessageReset)
+
+	return &cobra.Command{
+		Use:   "reset <hook> <variant>",
+		Short: short,
+		Args:  cobra.ExactArgs(2),
+		RunE: func(cmd *cobra.Command, args []string) error {
+			return RunMessageReset(cmd, args[0], args[1])
+		},
+	}
+}
diff --git a/internal/cli/system/cmd/message/doc.go b/internal/cli/system/cmd/message/doc.go
new file mode 100644
index 00000000..4aab1a89
--- /dev/null
+++ b/internal/cli/system/cmd/message/doc.go
@@ -0,0 +1,11 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package message implements the ctx system message subcommand.
+//
+// It manages hook message templates, providing list, show, edit, and
+// reset sub-subcommands for customizing hook output per project.
+package message
diff --git a/internal/cli/system/cmd/message/run.go b/internal/cli/system/cmd/message/run.go
index 5b60e04b..2da1963a 100644
--- a/internal/cli/system/cmd/message/run.go
+++ b/internal/cli/system/cmd/message/run.go
@@ -13,46 +13,38 @@ import (
 	"path/filepath"
 	"strings"
 
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/msg"
+	"github.com/spf13/cobra"
+
 	"github.com/ActiveMemory/ctx/internal/assets"
 	"github.com/ActiveMemory/ctx/internal/assets/hooks/messages"
-	"github.com/ActiveMemory/ctx/internal/rc"
-	"github.com/spf13/cobra"
+	"github.com/ActiveMemory/ctx/internal/cli/system/core"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 )
 
-// messageListCmd returns the "ctx system message list" subcommand.
-func messageListCmd() *cobra.Command {
-	cmd := &cobra.Command{
-		Use:   "list",
-		Short: "Show all hook messages with category and override status",
-		RunE: func(cmd *cobra.Command, _ []string) error {
-			return runMessageList(cmd)
-		},
-	}
-	cmd.Flags().Bool("json", false, assets.FlagDesc("system.message.json"))
-	return cmd
-}
-
-type messageListEntry struct {
-	Hook         string   `json:"hook"`
-	Variant      string   `json:"variant"`
-	Category     string   `json:"category"`
-	Description  string   `json:"description"`
-	TemplateVars []string `json:"template_vars"`
-	HasOverride  bool     `json:"has_override"`
-}
-
-func runMessageList(cmd *cobra.Command) error {
+// RunMessageList executes the message list logic.
+//
+// Collects all registered hook messages from the registry and outputs
+// them as either a JSON array or a formatted table.
+//
+// Parameters:
+//   - cmd: Cobra command for output and flag access
+//
+// Returns:
+//   - error: Non-nil on JSON encoding failure
+func RunMessageList(cmd *cobra.Command) error {
 	registry := messages.Registry()
-	entries := make([]messageListEntry, 0, len(registry))
+	entries := make([]core.MessageListEntry, 0, len(registry))
 	for _, info := range registry {
-		entry := messageListEntry{
+		entry := core.MessageListEntry{
 			Hook:         info.Hook,
 			Variant:      info.Variant,
 			Category:     info.Category,
 			Description:  info.Description,
 			TemplateVars: info.TemplateVars,
-			HasOverride:  hasOverride(info.Hook, info.Variant),
+			HasOverride:  core.HasOverride(info.Hook, info.Variant),
 		}
 		if entry.TemplateVars == nil {
 			entry.TemplateVars = []string{}
@@ -68,47 +60,53 @@ func runMessageList(cmd *cobra.Command) error {
 	}
 
 	// Table output
-	cmd.Println(fmt.Sprintf("%-24s %-20s %-16s %s", "Hook", "Variant", "Category", "Override"))
-	cmd.Println(fmt.Sprintf("%-24s %-20s %-16s %s",
-		strings.Repeat("\u2500", 22),
-		strings.Repeat("\u2500", 18),
-		strings.Repeat("\u2500", 14),
-		strings.Repeat("\u2500", 8)))
+	headerFmt := fmt.Sprintf("%%-%ds %%-%ds %%-%ds %%s",
+		msg.MessageColHook, msg.MessageColVariant, msg.MessageColCategory)
+	cmd.Println(fmt.Sprintf(headerFmt,
+		assets.TextDesc(assets.TextDescKeyMessageListHeaderHook),
+		assets.TextDesc(assets.TextDescKeyMessageListHeaderVariant),
+		assets.TextDesc(assets.TextDescKeyMessageListHeaderCategory),
+		assets.TextDesc(assets.TextDescKeyMessageListHeaderOverride)))
+	cmd.Println(fmt.Sprintf(headerFmt,
+		strings.Repeat("\u2500", msg.MessageSepHook),
+		strings.Repeat("\u2500", msg.MessageSepVariant),
+		strings.Repeat("\u2500", msg.MessageSepCategory),
+		strings.Repeat("\u2500", msg.MessageSepOverride)))
 	for _, e := range entries {
 		override := ""
 		if e.HasOverride {
-			override = "override"
+			override = assets.TextDesc(assets.TextDescKeyMessageOverrideLabel)
 		}
-		cmd.Println(fmt.Sprintf("%-24s %-20s %-16s %s", e.Hook, e.Variant, e.Category, override))
+		cmd.Println(fmt.Sprintf(headerFmt, e.Hook, e.Variant, e.Category, override))
 	}
 	return nil
 }
 
-// messageShowCmd returns the "ctx system message show" subcommand.
-func messageShowCmd() *cobra.Command {
-	return &cobra.Command{
-		Use:   "show <hook> <variant>",
-		Short: "Print the effective message template for a hook/variant",
-		Args:  cobra.ExactArgs(2),
-		RunE: func(cmd *cobra.Command, args []string) error {
-			return runMessageShow(cmd, args[0], args[1])
-		},
-	}
-}
-
-func runMessageShow(cmd *cobra.Command, hook, variant string) error {
+// RunMessageShow executes the message show logic.
+//
+// Displays the content of a hook message template, checking for a user
+// override first and falling back to the embedded default.
+//
+// Parameters:
+//   - cmd: Cobra command for output
+//   - hook: hook name
+//   - variant: template variant name
+//
+// Returns:
+//   - error: Non-nil if the hook/variant is unknown or template is missing
+func RunMessageShow(cmd *cobra.Command, hook, variant string) error {
 	info := messages.Lookup(hook, variant)
 	if info == nil {
-		return validationError(hook, variant)
+		return core.ValidationError(hook, variant)
 	}
 
 	// Check user override first
-	oPath := overridePath(hook, variant)
+	oPath := core.OverridePath(hook, variant)
 	if data, readErr := os.ReadFile(oPath); readErr == nil { //nolint:gosec // project-local override path
-		cmd.Println(fmt.Sprintf("Source: user override (%s)", oPath))
-		printTemplateVars(cmd, info)
+		cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyMessageSourceOverride), oPath))
+		core.PrintTemplateVars(cmd, info)
 		cmd.Println()
 		cmd.Print(string(data))
 		if len(data) > 0 && data[len(data)-1] != '\n' {
@@ -118,13 +116,13 @@ func runMessageShow(cmd *cobra.Command, hook, variant string) error {
 	}
 
 	// Embedded default
-	data, readErr := assets.HookMessage(hook, variant+".txt")
+	data, readErr := assets.HookMessage(hook, variant+file.ExtTxt)
 	if readErr != nil {
-		return fmt.Errorf("embedded template not found for %s/%s", hook, variant)
+		return ctxerr.EmbeddedTemplateNotFound(hook, variant)
 	}
 
-	cmd.Println("Source: embedded default")
-	printTemplateVars(cmd, info)
+	cmd.Println(assets.TextDesc(assets.TextDescKeyMessageSourceDefault))
+	core.PrintTemplateVars(cmd, info)
 	cmd.Println()
 	cmd.Print(string(data))
 	if len(data) > 0 && data[len(data)-1] != '\n' {
@@ -133,89 +131,88 @@ func runMessageShow(cmd *cobra.Command, hook, variant string) error {
 	return nil
 }
 
-// messageEditCmd returns the "ctx system message edit" subcommand.
-func messageEditCmd() *cobra.Command {
-	return &cobra.Command{
-		Use:   "edit <hook> <variant>",
-		Short: "Copy the embedded default to .context/ for editing",
-		Args:  cobra.ExactArgs(2),
-		RunE: func(cmd *cobra.Command, args []string) error {
-			return runMessageEdit(cmd, args[0], args[1])
-		},
-	}
-}
-
-func runMessageEdit(cmd *cobra.Command, hook, variant string) error {
+// RunMessageEdit executes the message edit logic.
+//
+// Creates a user override file by copying the embedded default template
+// to the project's .context/hooks/messages/ directory.
+//
+// Parameters:
+//   - cmd: Cobra command for output
+//   - hook: hook name
+//   - variant: template variant name
+//
+// Returns:
+//   - error: Non-nil if the hook/variant is unknown, override exists,
+//     or file operations fail
+func RunMessageEdit(cmd *cobra.Command, hook, variant string) error {
 	info := messages.Lookup(hook, variant)
 	if info == nil {
-		return validationError(hook, variant)
+		return core.ValidationError(hook, variant)
 	}
 
-	oPath := overridePath(hook, variant)
+	oPath := core.OverridePath(hook, variant)
 
 	// Refuse if override already exists
 	if _, statErr := os.Stat(oPath); statErr == nil {
-		return fmt.Errorf("override already exists at %s\nEdit it directly or use `ctx system message reset %s %s` first",
-			oPath, hook, variant)
+		return ctxerr.OverrideExists(oPath, hook, variant)
 	}
 
 	// Warn for ctx-specific messages
 	if info.Category == messages.CategoryCtxSpecific {
-		cmd.Println("Warning: this message is ctx-specific (intended for ctx development).")
-		cmd.Println("Customizing it may produce unexpected results.")
+		cmd.Println(assets.TextDesc(assets.TextDescKeyMessageCtxSpecificWarning))
 		cmd.Println()
 	}
 
 	// Read embedded default
-	data, readErr := assets.HookMessage(hook, variant+".txt")
+	data, readErr := assets.HookMessage(hook, variant+file.ExtTxt)
 	if readErr != nil {
-		return fmt.Errorf("embedded template not found for %s/%s", hook, variant)
+		return ctxerr.EmbeddedTemplateNotFound(hook, variant)
 	}
 
 	// Create directories
 	dir := filepath.Dir(oPath)
 	if mkdirErr := os.MkdirAll(dir, 0o750); mkdirErr != nil {
-		return fmt.Errorf("failed to create directory %s: %w", dir, mkdirErr)
+		return ctxerr.CreateDir(dir, mkdirErr)
 	}
 
 	// Write override file
 	if writeErr := os.WriteFile(oPath, data, 0o600); writeErr != nil {
-		return fmt.Errorf("failed to write override %s: %w", oPath, writeErr)
+		return ctxerr.WriteOverride(oPath, writeErr)
 	}
 
-	cmd.Println(fmt.Sprintf("Override created at %s", oPath))
-	cmd.Println("Edit this file to customize the message.")
-	printTemplateVars(cmd, info)
+	cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyMessageOverrideCreated), oPath))
+	cmd.Println(assets.TextDesc(assets.TextDescKeyMessageEditHint))
+	core.PrintTemplateVars(cmd, info)
 	return nil
 }
 
-// messageResetCmd returns the "ctx system message reset" subcommand.
-func messageResetCmd() *cobra.Command {
-	return &cobra.Command{
-		Use:   "reset <hook> <variant>",
-		Short: "Delete a user override and revert to embedded default",
-		Args:  cobra.ExactArgs(2),
-		RunE: func(cmd *cobra.Command, args []string) error {
-			return runMessageReset(cmd, args[0], args[1])
-		},
-	}
-}
-
-func runMessageReset(cmd *cobra.Command, hook, variant string) error {
+// RunMessageReset executes the message reset logic.
+//
+// Removes a user override file, reverting to the embedded default.
+// Cleans up empty parent directories after removal.
+//
+// Parameters:
+//   - cmd: Cobra command for output
+//   - hook: hook name
+//   - variant: template variant name
+//
+// Returns:
+//   - error: Non-nil if the hook/variant is unknown or removal fails
+func RunMessageReset(cmd *cobra.Command, hook, variant string) error {
 	info := messages.Lookup(hook, variant)
 	if info == nil {
-		return validationError(hook, variant)
+		return core.ValidationError(hook, variant)
 	}
-	oPath := overridePath(hook, variant)
+	oPath := core.OverridePath(hook, variant)
 
 	if removeErr := os.Remove(oPath); removeErr != nil {
 		if os.IsNotExist(removeErr) {
-			cmd.Println(fmt.Sprintf("No override found for %s/%s. Already using embedded default.", hook, variant))
+			cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyMessageNoOverride), hook, variant))
 			return nil
 		}
-		return fmt.Errorf("failed to remove override %s: %w", oPath, removeErr)
+		return ctxerr.RemoveOverride(oPath, removeErr)
 	}
 
 	// Clean up empty parent directories
@@ -224,39 +221,6 @@ func runMessageReset(cmd *cobra.Command, hook, variant string) error {
 	messagesDir := filepath.Dir(hookDir)
 	_ = os.Remove(messagesDir) // only succeeds if empty
 
-	cmd.Println(fmt.Sprintf("Override removed for %s/%s. Using embedded default.", hook, variant))
+	cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyMessageOverrideRemoved), hook, variant))
 
 	return nil
 }
-
-// overridePath returns the user override file path for a hook/variant.
-func overridePath(hook, variant string) string {
-	return filepath.Join(rc.ContextDir(), "hooks", "messages", hook, variant+".txt")
-}
-
-// hasOverride checks whether a user override file exists.
-func hasOverride(hook, variant string) bool {
-	_, statErr := os.Stat(overridePath(hook, variant))
-	return statErr == nil
-}
-
-// validationError returns an error for an unknown hook/variant.
-func validationError(hook, variant string) error {
-	// Check if the hook exists at all
-	if messages.Variants(hook) == nil {
-		return fmt.Errorf("unknown hook: %s\nRun `ctx system message list` to see available hooks", hook)
-	}
-	return fmt.Errorf("unknown variant %q for hook %q\nRun `ctx system message list` to see available variants", variant, hook)
-}
-
-// printTemplateVars prints available template variables if any exist.
-func printTemplateVars(cmd *cobra.Command, info *messages.HookMessageInfo) {
-	if len(info.TemplateVars) == 0 {
-		cmd.Println("Template variables: (none)")
-		return
-	}
-	formatted := make([]string, len(info.TemplateVars))
-	for i, v := range info.TemplateVars {
-		formatted[i] = "{{." + v + "}}"
-	}
-	cmd.Println(fmt.Sprintf("Template variables: %s", strings.Join(formatted, ", ")))
-}
diff --git a/internal/cli/system/cmd/pause/cmd.go b/internal/cli/system/cmd/pause/cmd.go
index 946eaa07..e161639c 100644
--- a/internal/cli/system/cmd/pause/cmd.go
+++ b/internal/cli/system/cmd/pause/cmd.go
@@ -7,13 +7,11 @@
 package pause
 
 import (
-	"fmt"
 	"os"
 
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/assets"
-	"github.com/ActiveMemory/ctx/internal/cli/system/core"
 )
 
 // Cmd returns the "ctx system pause" plumbing command.
@@ -21,34 +19,19 @@ import (
 // Returns:
 //   - *cobra.Command: Configured pause subcommand
 func Cmd() *cobra.Command {
-	cmd := &cobra.Command{
-		Use:   "pause",
-		Short: "Pause context hooks for this session",
-		Long: `Creates a session-scoped pause marker. While paused, all nudge
-and reminder hooks no-op. Security and housekeeping hooks still fire.
+	short, long := assets.CommandDesc(assets.CmdDescKeySystemPause)
 
-The session ID is read from stdin JSON (same as hooks) or --session-id flag.`,
+	cmd := &cobra.Command{
+		Use:    "pause",
+		Short:  short,
+		Long:   long,
 		Hidden: true,
 		RunE: func(cmd *cobra.Command, _ []string) error {
-			return runPause(cmd, os.Stdin)
+			return Run(cmd, os.Stdin)
 		},
 	}
 
-	cmd.Flags().String("session-id", "", assets.FlagDesc("system.pause.session-id"))
+	cmd.Flags().String("session-id", "",
+		assets.FlagDesc(assets.FlagDescKeySystemPauseSessionId),
+	)
 
 	return cmd
 }
-
-func runPause(cmd *cobra.Command, stdin *os.File) error {
-	sessionID, _ := cmd.Flags().GetString("session-id")
-	if sessionID == "" {
-		input := core.ReadInput(stdin)
-		sessionID = input.SessionID
-	}
-	if sessionID == "" {
-		sessionID = core.SessionUnknown
-	}
-
-	path := core.PauseMarkerPath(sessionID)
-	core.WriteCounter(path, 0)
-	cmd.Println(fmt.Sprintf("Context hooks paused for session %s", sessionID))
-	return nil
-}
diff --git a/internal/cli/system/cmd/pause/doc.go b/internal/cli/system/cmd/pause/doc.go
new file mode 100644
index 00000000..be2a2684
--- /dev/null
+++ b/internal/cli/system/cmd/pause/doc.go
@@ -0,0 +1,11 @@
+//   /      ctx: https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//      \   Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package pause implements the ctx system pause subcommand.
+//
+// It creates a session-scoped pause marker that suppresses nudge and
+// reminder hooks while allowing security hooks to continue firing.
+package pause
diff --git a/internal/cli/system/cmd/pause/run.go b/internal/cli/system/cmd/pause/run.go
new file mode 100644
index 00000000..6cf5a84c
--- /dev/null
+++ b/internal/cli/system/cmd/pause/run.go
@@ -0,0 +1,48 @@
+//   /      ctx: https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//      \   Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package pause
+
+import (
+	"fmt"
+	"os"
+
+	"github.com/ActiveMemory/ctx/internal/config/session"
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/cli/system/core"
+)
+
+// Run executes the pause logic.
+//
+// Reads a session ID from the --session-id flag or stdin JSON, then
+// creates a pause marker file so all subsequent hooks for that session
+// are suppressed until resumed.
+//
+// Parameters:
+//   - cmd: Cobra command for output
+//   - stdin: standard input for hook JSON
+//
+// Returns:
+//   - error: Always nil
+func Run(cmd *cobra.Command, stdin *os.File) error {
+	sessionID, _ := cmd.Flags().GetString("session-id")
+	if sessionID == "" {
+		input := core.ReadInput(stdin)
+		sessionID = input.SessionID
+	}
+	if sessionID == "" {
+		sessionID = session.IDUnknown
+	}
+
+	path := core.PauseMarkerPath(sessionID)
+	core.WriteCounter(path, 0)
+	cmd.Println(
+		fmt.Sprintf(assets.TextDesc(assets.TextDescKeyPauseConfirmed), sessionID),
+	)
+	return nil
+}
diff --git a/internal/cli/system/cmd/post_commit/cmd.go b/internal/cli/system/cmd/post_commit/cmd.go
new file mode 100644
index 00000000..e20c0998
--- /dev/null
+++ b/internal/cli/system/cmd/post_commit/cmd.go
@@ -0,0 +1,33 @@
+//   /      ctx: https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//      \   Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package post_commit
+
+import (
+	"os"
+
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+)
+
+// Cmd returns the "ctx system post-commit" subcommand.
+//
+// Returns:
+//   - *cobra.Command: Configured post-commit subcommand
+func Cmd() *cobra.Command {
+	short, long := assets.CommandDesc(assets.CmdDescKeySystemPostCommit)
+
+	return &cobra.Command{
+		Use:    "post-commit",
+		Short:  short,
+		Long:   long,
+		Hidden: true,
+		RunE: func(cmd *cobra.Command, _ []string) error {
+			return Run(cmd, os.Stdin)
+		},
+	}
+}
diff --git a/internal/cli/system/cmd/post_commit/doc.go b/internal/cli/system/cmd/post_commit/doc.go
new file mode 100644
index 00000000..b59b9f1c
--- /dev/null
+++ b/internal/cli/system/cmd/post_commit/doc.go
@@ -0,0 +1,11 @@
+//   /      ctx: https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//      \   Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package post_commit implements the ctx system post-commit subcommand.
+//
+// It detects git commit commands and nudges the agent to capture context
+// (decisions or learnings) and run lints/tests after committing.
+package post_commit
diff --git a/internal/cli/system/cmd/post_commit/run.go b/internal/cli/system/cmd/post_commit/run.go
new file mode 100644
index 00000000..9e04e52b
--- /dev/null
+++ b/internal/cli/system/cmd/post_commit/run.go
@@ -0,0 +1,76 @@
+//   /      ctx: https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//      \   Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package post_commit
+
+import (
+	"os"
+	"regexp"
+
+	"github.com/ActiveMemory/ctx/internal/config/hook"
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/cli/system/core"
+	ctxcontext "github.com/ActiveMemory/ctx/internal/context"
+	"github.com/ActiveMemory/ctx/internal/notify"
+)
+
+var (
+	reGitCommit = regexp.MustCompile(`git\s+commit`)
+	reAmend     = regexp.MustCompile(`--amend`)
+)
+
+// Run executes the post-commit hook logic.
+//
+// After a successful git commit (non-amend), nudges the agent to offer
+// context capture (decision or learning) and to run lints/tests before
+// pushing. Also checks for version drift.
+//
+// Parameters:
+//   - cmd: Cobra command for output
+//   - stdin: standard input for hook JSON
+//
+// Returns:
+//   - error: Always nil (hook errors are non-fatal)
+func Run(cmd *cobra.Command, stdin *os.File) error {
+	if !core.Initialized() {
+		return nil
+	}
+	input, sessionID, paused := core.HookPreamble(stdin)
+	if paused {
+		return nil
+	}
+
+	command := input.ToolInput.Command
+
+	// Only trigger on git commit commands
+	if !reGitCommit.MatchString(command) {
+		return nil
+	}
+
+	// Skip amend commits
+	if reAmend.MatchString(command) {
+		return nil
+	}
+
+	hookName, variant := hook.PostCommit, hook.VariantNudge
+
+	fallback := assets.TextDesc(assets.TextDescKeyPostCommitFallback)
+	msg := core.LoadMessage(hookName, variant, nil, fallback)
+	if msg == "" {
+		return nil
+	}
+	msg = ctxcontext.AppendDir(msg)
+	core.PrintHookContext(cmd, hook.EventPostToolUse, msg)
+
+	ref := notify.NewTemplateRef(hookName, variant, nil)
+	core.Relay(hookName+": "+assets.TextDesc(assets.TextDescKeyPostCommitRelayMessage), input.SessionID, ref)
+
+	core.CheckVersionDrift(cmd, sessionID)
+
+	return nil
+}
diff --git a/internal/cli/system/cmd/postcommit/cmd.go b/internal/cli/system/cmd/postcommit/cmd.go
deleted file mode 100644
index 4c116ff0..00000000
--- a/internal/cli/system/cmd/postcommit/cmd.go
+++ /dev/null
@@ -1,94 +0,0 @@
-//   /      ctx: https://ctx.ist
-// ,'`./    do you remember?
-// `.,'\
-//      \   Copyright 2026-present Context contributors.
-// SPDX-License-Identifier: Apache-2.0
-
-package postcommit
-
-import (
-	"os"
-	"regexp"
-
-	"github.com/spf13/cobra"
-
-	"github.com/ActiveMemory/ctx/internal/cli/system/core"
-	"github.com/ActiveMemory/ctx/internal/eventlog"
-	"github.com/ActiveMemory/ctx/internal/notify"
-)
-
-// Cmd returns the "ctx system post-commit" subcommand.
-//
-// Returns:
-//   - *cobra.Command: Configured post-commit subcommand
-func Cmd() *cobra.Command {
-	return &cobra.Command{
-		Use:   "post-commit",
-		Short: "Post-commit context capture nudge",
-		Long: `Detects git commit commands and nudges the agent to offer context
-capture (decision or learning) and suggest running lints/tests.
-Skips amend commits.
-
-Hook event: PostToolUse (Bash)
-Output: agent directive after git commits, silent otherwise
-Silent when: command is not a git commit, or is an amend`,
-		Hidden: true,
-		RunE: func(cmd *cobra.Command, _ []string) error {
-			return runPostCommit(cmd, os.Stdin)
-		},
-	}
-}
-
-var (
-	reGitCommit = regexp.MustCompile(`git\s+commit`)
-	reAmend     = regexp.MustCompile(`--amend`)
-)
-
-func runPostCommit(cmd *cobra.Command, stdin *os.File) error {
-	if !core.IsInitialized() {
-		return nil
-	}
-	input := core.ReadInput(stdin)
-
-	sessionID := input.SessionID
-	if sessionID == "" {
-		sessionID = core.SessionUnknown
-	}
-	if core.Paused(sessionID) > 0 {
-		return nil
-	}
-
-	command := input.ToolInput.Command
-
-	// Only trigger on git commit commands
-	if !reGitCommit.MatchString(command) {
-		return nil
-	}
-
-	// Skip amend commits
-	if reAmend.MatchString(command) {
-		return nil
-	}
-
-	fallback := "Commit succeeded." +
-		" 1. Offer context capture to the user:" +
-		" Decision (design choice?), Learning (gotcha?), or Neither." +
-		" 2. Ask the user: \"Want me to run lints and tests before you push?\"" +
-		" Do NOT push. The user pushes manually."
-	msg := core.LoadMessage("post-commit", "nudge", nil, fallback)
-	if msg == "" {
-		return nil
-	}
-	if line := core.ContextDirLine(); line != "" {
-		msg += " [" + line + "]"
-	}
-	core.PrintHookContext(cmd, "PostToolUse", msg)
-
-	ref := notify.NewTemplateRef("post-commit", "nudge", nil)
-	_ = notify.Send("relay", "post-commit: Commit succeeded, context capture offered", input.SessionID, ref)
-	eventlog.Append("relay", "post-commit: Commit succeeded, context capture offered", input.SessionID, ref)
-
-	core.CheckVersionDrift(cmd, sessionID)
-
-	return nil
-}
diff --git a/internal/cli/system/cmd/prune/cmd.go b/internal/cli/system/cmd/prune/cmd.go
index cdec54f3..97327342 100644
--- a/internal/cli/system/cmd/prune/cmd.go
+++ b/internal/cli/system/cmd/prune/cmd.go
@@ -7,15 +7,9 @@
 package prune
 
 import (
-	"fmt"
-	"os"
-	"path/filepath"
-	"time"
-
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/assets"
-	"github.com/ActiveMemory/ctx/internal/cli/system/core"
 )
 
 // Cmd returns the "ctx system prune" subcommand.
@@ -26,87 +20,23 @@ func Cmd() *cobra.Command {
 	var days int
 	var dryRun bool
 
+	short, long := assets.CommandDesc(assets.CmdDescKeySystemPrune)
+
 	cmd := &cobra.Command{
 		Use:   "prune",
-		Short: "Clean stale per-session state files",
-		Long: `Remove per-session state files from .context/state/ that are
-older than the specified age. Session state files are identified by
-UUID suffixes (e.g. context-check-<uuid>, heartbeat-<uuid>).
-
-Global files without session IDs (events.jsonl, memory-import.json, etc.)
-are always preserved.
-
-Examples:
-  ctx system prune            # Prune files older than 7 days
-  ctx system prune --days 3   # Prune files older than 3 days
-  ctx system prune --dry-run  # Show what would be pruned`,
+		Short: short,
+		Long:  long,
 		RunE: func(cmd *cobra.Command, _ []string) error {
-			return runPrune(cmd, days, dryRun)
+			return Run(cmd, days, dryRun)
 		},
 	}
 
-	cmd.Flags().IntVar(&days, "days", 7, assets.FlagDesc("system.prune.days"))
-	cmd.Flags().BoolVar(&dryRun, "dry-run", false, assets.FlagDesc("system.prune.dry-run"))
+	cmd.Flags().IntVar(&days, "days", 7,
+		assets.FlagDesc(assets.FlagDescKeySystemPruneDays),
+	)
+	cmd.Flags().BoolVar(&dryRun, "dry-run", false,
+		assets.FlagDesc(assets.FlagDescKeySystemPruneDryRun),
+	)
 
 	return cmd
 }
-
-func runPrune(cmd *cobra.Command, days int, dryRun bool) error {
-	dir := core.StateDir()
-
-	entries, readErr := os.ReadDir(dir)
-	if readErr != nil {
-		return fmt.Errorf("reading state directory: %w", readErr)
-	}
-
-	cutoff := time.Now().Add(-time.Duration(days) * 24 * time.Hour)
-	var pruned, skipped, preserved int
-
-	for _, entry := range entries {
-		if entry.IsDir() {
-			continue
-		}
-
-		name := entry.Name()
-
-		// Only prune files with UUID session IDs
-		if !core.UUIDPattern.MatchString(name) {
-			preserved++
-			continue
-		}
-
-		info, statErr := entry.Info()
-		if statErr != nil {
-			continue
-		}
-
-		if info.ModTime().After(cutoff) {
-			skipped++
-			continue
-		}
-
-		if dryRun {
-			cmd.Println(fmt.Sprintf(" would prune: %s (age: %s)", name, core.FormatAge(info.ModTime())))
-			pruned++
-			continue
-		}
-
-		path := filepath.Join(dir, name)
-		if rmErr := os.Remove(path); rmErr != nil {
-			cmd.PrintErrln(fmt.Sprintf(" error removing %s: %v", name, rmErr))
-			continue
-		}
-		pruned++
-	}
-
-	if dryRun {
-		cmd.Println()
-		cmd.Println(fmt.Sprintf("Dry run — would prune %d files (skip %d recent, preserve %d global)",
-			pruned, skipped, preserved))
-	} else {
-		cmd.Println(fmt.Sprintf("Pruned %d files (skipped %d recent, preserved %d global)",
-			pruned, skipped, preserved))
-	}
-
-	return nil
-}
diff --git a/internal/cli/system/cmd/prune/doc.go b/internal/cli/system/cmd/prune/doc.go
new file mode 100644
index 00000000..e70b3e7b
--- /dev/null
+++ b/internal/cli/system/cmd/prune/doc.go
@@ -0,0 +1,11 @@
+//   /      ctx: https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//      \   Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package prune implements the ctx system prune subcommand.
+//
+// It removes stale per-session state files from .context/state/ that
+// exceed the configured age, while preserving global state files.
+package prune
diff --git a/internal/cli/system/cmd/prune/run.go b/internal/cli/system/cmd/prune/run.go
new file mode 100644
index 00000000..6c96581b
--- /dev/null
+++ b/internal/cli/system/cmd/prune/run.go
@@ -0,0 +1,86 @@
+//   /      ctx: https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//      \   Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package prune
+
+import (
+	"os"
+	"path/filepath"
+	"time"
+
+	time2 "github.com/ActiveMemory/ctx/internal/config/time"
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/cli/system/core"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
+	"github.com/ActiveMemory/ctx/internal/write"
+)
+
+// Run executes the prune logic.
+//
+// Scans the state directory for session-scoped files (identified by UUID
+// patterns) older than the given number of days and removes them. Global
+// state files (non-UUID) are preserved. Supports dry-run mode.
+//
+// Parameters:
+//   - cmd: Cobra command for output
+//   - days: prune files older than this many days
+//   - dryRun: if true, report what would be pruned without removing
+//
+// Returns:
+//   - error: Non-nil on state directory read failure
+func Run(cmd *cobra.Command, days int, dryRun bool) error {
+	dir := core.StateDir()
+
+	entries, readErr := os.ReadDir(dir)
+	if readErr != nil {
+		return ctxerr.ReadingStateDir(readErr)
+	}
+
+	cutoff := time.Now().Add(-time.Duration(days) * time2.HoursPerDay * time.Hour)
+	var pruned, skipped, preserved int
+
+	for _, entry := range entries {
+		if entry.IsDir() {
+			continue
+		}
+
+		name := entry.Name()
+
+		// Only prune files with UUID session IDs
+		if !core.UUIDPattern.MatchString(name) {
+			preserved++
+			continue
+		}
+
+		info, statErr := entry.Info()
+		if statErr != nil {
+			continue
+		}
+
+		if info.ModTime().After(cutoff) {
+			skipped++
+			continue
+		}
+
+		if dryRun {
+			write.PruneDryRunLine(cmd, name, core.FormatAge(info.ModTime()))
+			pruned++
+			continue
+		}
+
+		path := filepath.Join(dir, name)
+		if rmErr := os.Remove(path); rmErr != nil {
+			write.PruneErrorLine(cmd, name, rmErr)
+			continue
+		}
+		pruned++
+	}
+
+	write.PruneSummary(cmd, dryRun, pruned, skipped, preserved)
+
+	return nil
+}
diff --git a/internal/cli/system/cmd/qa_reminder/cmd.go b/internal/cli/system/cmd/qa_reminder/cmd.go
new file mode 100644
index 00000000..37ac6485
--- /dev/null
+++ b/internal/cli/system/cmd/qa_reminder/cmd.go
@@ -0,0 +1,33 @@
+//   /      ctx: https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//      \   Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package qa_reminder
+
+import (
+	"os"
+
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+)
+
+// Cmd returns the "ctx system qa-reminder" subcommand.
+//
+// Returns:
+//   - *cobra.Command: Configured qa-reminder subcommand
+func Cmd() *cobra.Command {
+	short, long := assets.CommandDesc(assets.CmdDescKeySystemQaReminder)
+
+	return &cobra.Command{
+		Use:    "qa-reminder",
+		Short:  short,
+		Long:   long,
+		Hidden: true,
+		RunE: func(cmd *cobra.Command, _ []string) error {
+			return Run(cmd, os.Stdin)
+		},
+	}
+}
diff --git a/internal/cli/system/cmd/qa_reminder/doc.go b/internal/cli/system/cmd/qa_reminder/doc.go
new file mode 100644
index 00000000..dd5ea573
--- /dev/null
+++ b/internal/cli/system/cmd/qa_reminder/doc.go
@@ -0,0 +1,11 @@
+//   /      ctx: https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//      \   Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package qa_reminder implements the ctx system qa-reminder subcommand.
+//
+// It emits a reminder to lint and test the project before committing,
+// firing on Bash tool use when the command contains git operations.
+package qa_reminder
diff --git a/internal/cli/system/cmd/qa_reminder/run.go b/internal/cli/system/cmd/qa_reminder/run.go
new file mode 100644
index 00000000..59cc55c5
--- /dev/null
+++ b/internal/cli/system/cmd/qa_reminder/run.go
@@ -0,0 +1,61 @@
+//   /      ctx: https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//      \   Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package qa_reminder
+
+import (
+	"os"
+	"strings"
+
+	"github.com/ActiveMemory/ctx/internal/config/hook"
+	"github.com/spf13/cobra"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/cli/system/core"
+	ctxcontext "github.com/ActiveMemory/ctx/internal/context"
+	"github.com/ActiveMemory/ctx/internal/notify"
+)
+
+// Run executes the qa-reminder hook logic.
+//
+// Fires before any git command to inject a hard gate reminding the agent
+// to lint, test, and verify a clean working tree before committing.
+//
+// Parameters:
+//   - cmd: Cobra command for output
+//   - stdin: standard input for hook JSON
+//
+// Returns:
+//   - error: Always nil (hook errors are non-fatal)
+func Run(cmd *cobra.Command, stdin *os.File) error {
+	if !core.Initialized() {
+		return nil
+	}
+	input, _, paused := core.HookPreamble(stdin)
+	if paused {
+		return nil
+	}
+	if !strings.Contains(input.ToolInput.Command, "git") {
+		return nil
+	}
+	fallback := assets.TextDesc(assets.TextDescKeyQaReminderFallback)
+	msg := core.LoadMessage(
+		hook.QAReminder, hook.VariantGate, nil, fallback,
+	)
+	if msg == "" {
+		return nil
+	}
+	msg = ctxcontext.AppendDir(msg)
+
+	core.PrintHookContext(cmd, hook.EventPreToolUse, msg)
+
+	ref := notify.NewTemplateRef(hook.QAReminder, hook.VariantGate, nil)
+	core.Relay(hook.QAReminder+": "+
+		assets.TextDesc(assets.TextDescKeyQaReminderRelayMessage),
+		input.SessionID, ref,
+	)
+	return nil
+}
diff --git a/internal/cli/system/cmd/qareminder/cmd.go b/internal/cli/system/cmd/qareminder/cmd.go
deleted file mode 100644
index 4c6c7c26..00000000
--- a/internal/cli/system/cmd/qareminder/cmd.go
+++ /dev/null
@@ -1,75 +0,0 @@
-//   /      ctx: https://ctx.ist
-// ,'`./    do you remember?
-// `.,'\
-//      \   Copyright 2026-present Context contributors.
-// SPDX-License-Identifier: Apache-2.0
-
-package qareminder
-
-import (
-	"os"
-	"strings"
-
-	"github.com/spf13/cobra"
-
-	"github.com/ActiveMemory/ctx/internal/cli/system/core"
-	"github.com/ActiveMemory/ctx/internal/eventlog"
-	"github.com/ActiveMemory/ctx/internal/notify"
-)
-
-// Cmd returns the "ctx system qa-reminder" subcommand.
-//
-// Returns:
-//   - *cobra.Command: Configured qa-reminder subcommand
-func Cmd() *cobra.Command {
-	return &cobra.Command{
-		Use:   "qa-reminder",
-		Short: "QA reminder hook",
-		Long: `Emits a hard reminder to lint and test the entire project before
-committing. Fires on Bash tool use when the command contains "git",
-placing reinforcement at the commit sequence rather than during edits.
-
-Hook event: PreToolUse (Bash)
-Output: agent directive (when command contains "git" and .context/ is initialized)
-Silent when: .context/ not initialized or command does not contain "git"`,
-		Hidden: true,
-		RunE: func(cmd *cobra.Command, _ []string) error {
-			if !core.IsInitialized() {
-				return nil
-			}
-			input := core.ReadInput(os.Stdin)
-			sessionID := input.SessionID
-			if sessionID == "" {
-				sessionID = core.SessionUnknown
-			}
-			if core.Paused(sessionID) > 0 {
-				return nil
-			}
-			if !strings.Contains(input.ToolInput.Command, "git") {
-				return nil
-			}
-			fallback := "HARD GATE — DO NOT COMMIT without completing ALL of these steps first:" +
-				" (1) lint the ENTIRE project," +
-				" (2) test the ENTIRE project," +
-				" (3) verify a clean working tree (no modified or untracked files left behind)." +
-				" Not just the files you changed — the whole branch." +
-				" If unrelated modified files remain," +
-				" offer to commit them separately, stash them," +
-				" or get explicit confirmation to leave them." +
-				" Do NOT say 'I'll do that at the end' or 'I'll handle that after committing.'" +
-				" Run lint and tests BEFORE every git commit, every time, no exceptions."
-			msg := core.LoadMessage("qa-reminder", "gate", nil, fallback)
-			if msg == "" {
-				return nil
-			}
-			if line := core.ContextDirLine(); line != "" {
-				msg += " [" + line + "]"
-			}
-			core.PrintHookContext(cmd, "PreToolUse", msg)
-			ref := notify.NewTemplateRef("qa-reminder", "gate", nil)
-			_ = notify.Send("relay", "qa-reminder: QA gate reminder emitted", input.SessionID, ref)
-			eventlog.Append("relay", "qa-reminder: QA gate reminder emitted", input.SessionID, ref)
-			return nil
-		},
-	}
-}
diff --git a/internal/cli/system/cmd/resources/cmd.go b/internal/cli/system/cmd/resources/cmd.go
index 41a92aac..e3b49463 100644
--- a/internal/cli/system/cmd/resources/cmd.go
+++ b/internal/cli/system/cmd/resources/cmd.go
@@ -17,13 +17,18 @@ import (
 // Returns:
 //   - *cobra.Command: Configured resources subcommand
 func Cmd() *cobra.Command {
+	short, long := assets.CommandDesc(assets.CmdDescKeySystemResources)
+
 	cmd := &cobra.Command{
 		Use:   "resources",
-		Short: "Show system resource usage (memory, swap, disk, load)",
+		Short: short,
+		Long:  long,
 		RunE: func(cmd *cobra.Command, _ []string) error {
 			return runResources(cmd)
 		},
 	}
 
-	cmd.Flags().Bool("json", false, assets.FlagDesc("system.resources.json"))
+	cmd.Flags().Bool("json", false,
+		assets.FlagDesc(assets.FlagDescKeySystemResourcesJson),
+	)
 
 	return cmd
 }
diff --git a/internal/cli/system/cmd/resources/doc.go b/internal/cli/system/cmd/resources/doc.go
new file mode 100644
index 00000000..b27d9b7c
--- /dev/null
+++ b/internal/cli/system/cmd/resources/doc.go
@@ -0,0 +1,11 @@
+//   /      ctx: https://ctx.ist
+// ,'`./    do you remember?
+// `.,'\
+//      \   Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package resources implements the ctx system resources subcommand.
+//
+// It displays current system resource usage including memory, swap,
+// disk, and load averages.
+package resources
diff --git a/internal/cli/system/cmd/resources/run.go b/internal/cli/system/cmd/resources/run.go
index 7d2d55a2..bbf06d96 100644
--- a/internal/cli/system/cmd/resources/run.go
+++ b/internal/cli/system/cmd/resources/run.go
@@ -7,208 +7,32 @@
 package resources
 
 import (
-	"encoding/json"
-	"fmt"
-	"strings"
+	"github.com/spf13/cobra"
 
+	"github.com/ActiveMemory/ctx/internal/cli/system/core"
 	"github.com/ActiveMemory/ctx/internal/sysinfo"
-	"github.com/spf13/cobra"
 )
 
-// statusCol is the column where the status indicator starts.
-const statusCol = 52
-
+// runResources executes the resources display logic.
+//
+// Collects a system resource snapshot, evaluates alerts, and outputs
+// results as either a JSON object or a human-readable table with
+// status indicators.
+//
+// Parameters:
+//   - cmd: Cobra command for output and flag access
+//
+// Returns:
+//   - error: Non-nil on JSON encoding failure
 func runResources(cmd *cobra.Command) error {
 	snap := sysinfo.Collect(".")
 	alerts := sysinfo.Evaluate(snap)
 
 	jsonFlag, _ := cmd.Flags().GetBool("json")
 	if jsonFlag {
-		return outputResourcesJSON(cmd, snap, alerts)
+		return core.OutputResourcesJSON(cmd, snap, alerts)
 	}
 
-	outputResourcesText(cmd, snap, alerts)
-	return nil
-}
-
-func outputResourcesText(cmd *cobra.Command, snap sysinfo.Snapshot, alerts []sysinfo.ResourceAlert) {
-	cmd.Println("System Resources")
-	cmd.Println("====================")
-	cmd.Println()
-
-	// Memory line
-	if snap.Memory.Supported {
-		pct := pctOf(snap.Memory.UsedBytes, snap.Memory.TotalBytes)
-		values := fmt.Sprintf("%5s / %5s GB (%d%%)",
-			sysinfo.FormatGiB(snap.Memory.UsedBytes),
-			sysinfo.FormatGiB(snap.Memory.TotalBytes),
-			pct)
-		sev := severityFor(alerts, "memory")
-		cmd.Println(formatLine("Memory:", values, statusText(sev)))
-	}
-
-	// Swap line
-	if snap.Memory.Supported {
-		pct := pctOf(snap.Memory.SwapUsedBytes, snap.Memory.SwapTotalBytes)
-		values := fmt.Sprintf("%5s / %5s GB (%d%%)",
-			sysinfo.FormatGiB(snap.Memory.SwapUsedBytes),
-			sysinfo.FormatGiB(snap.Memory.SwapTotalBytes),
-			pct)
-		sev := severityFor(alerts, "swap")
-		cmd.Println(formatLine("Swap:", values, statusText(sev)))
-	}
-
-	// Disk line
-	if snap.Disk.Supported {
-		pct := pctOf(snap.Disk.UsedBytes, snap.Disk.TotalBytes)
-		values := fmt.Sprintf("%5s / %5s GB (%d%%)",
-			sysinfo.FormatGiB(snap.Disk.UsedBytes),
-			sysinfo.FormatGiB(snap.Disk.TotalBytes),
-			pct)
-		sev := severityFor(alerts, "disk")
-		cmd.Println(formatLine("Disk:", values, statusText(sev)))
-	}
-
-	// Load line
-	if snap.Load.Supported {
-		ratio := 0.0
-		if snap.Load.NumCPU > 0 {
-			ratio = snap.Load.Load1 / float64(snap.Load.NumCPU)
-		}
-		values := fmt.Sprintf("%5.2f / %5.2f / %5.2f (%d CPUs, ratio %.2f)",
-			snap.Load.Load1, snap.Load.Load5, snap.Load.Load15,
-			snap.Load.NumCPU, ratio)
-		sev := severityFor(alerts, "load")
-		cmd.Println(formatLine("Load:", values, statusText(sev)))
-	}
-
-	// Summary
-	cmd.Println()
-	if len(alerts) == 0 {
-		cmd.Println("All clear \u2014 no resource warnings.")
-	} else {
-		cmd.Println("Alerts:")
-		for _, a := range alerts {
-			icon := "\u26a0"
-			if a.Severity == sysinfo.SeverityDanger {
-				icon = "\u2716"
-			}
-			cmd.Println(fmt.Sprintf(" %s %s", icon, a.Message))
-		}
-	}
-}
-
-func outputResourcesJSON(cmd *cobra.Command, snap sysinfo.Snapshot, alerts []sysinfo.ResourceAlert) error {
-	type jsonAlert struct {
-		Severity string `json:"severity"`
-		Resource string `json:"resource"`
-		Message  string `json:"message"`
-	}
-	type jsonOutput struct {
-		Memory struct {
-			TotalBytes uint64 `json:"total_bytes"`
-			UsedBytes  uint64 `json:"used_bytes"`
-			Percent    int    `json:"percent"`
-			Supported  bool   `json:"supported"`
-		} `json:"memory"`
-		Swap struct {
-			TotalBytes uint64 `json:"total_bytes"`
-			UsedBytes  uint64 `json:"used_bytes"`
-			Percent    int    `json:"percent"`
-			Supported  bool   `json:"supported"`
-		} `json:"swap"`
-		Disk struct {
-			TotalBytes uint64 `json:"total_bytes"`
-			UsedBytes  uint64 `json:"used_bytes"`
-			Percent    int    `json:"percent"`
-			Path       string `json:"path"`
-			Supported  bool   `json:"supported"`
-		} `json:"disk"`
-		Load struct {
-			Load1     float64 `json:"load1"`
-			Load5     float64 `json:"load5"`
-			Load15    float64 `json:"load15"`
-			NumCPU    int     `json:"num_cpu"`
-			Ratio     float64 `json:"ratio"`
-			Supported bool    `json:"supported"`
-		} `json:"load"`
-		Alerts      []jsonAlert `json:"alerts"`
-		MaxSeverity string      `json:"max_severity"`
-	}
-
-	out := jsonOutput{}
-	out.Memory.TotalBytes = snap.Memory.TotalBytes
-	out.Memory.UsedBytes = snap.Memory.UsedBytes
-	out.Memory.Percent = pctOf(snap.Memory.UsedBytes, snap.Memory.TotalBytes)
-	out.Memory.Supported = snap.Memory.Supported
-
-	out.Swap.TotalBytes = snap.Memory.SwapTotalBytes
-	out.Swap.UsedBytes = snap.Memory.SwapUsedBytes
-	out.Swap.Percent = pctOf(snap.Memory.SwapUsedBytes, snap.Memory.SwapTotalBytes)
-	out.Swap.Supported = snap.Memory.Supported
-
-	out.Disk.TotalBytes = snap.Disk.TotalBytes
-	out.Disk.UsedBytes = snap.Disk.UsedBytes
-	out.Disk.Percent = pctOf(snap.Disk.UsedBytes, snap.Disk.TotalBytes)
-	out.Disk.Path = snap.Disk.Path
-	out.Disk.Supported = snap.Disk.Supported
-
-	out.Load.Load1 = snap.Load.Load1
-	out.Load.Load5 = snap.Load.Load5
-	out.Load.Load15 = snap.Load.Load15
-	out.Load.NumCPU = snap.Load.NumCPU
-	if snap.Load.NumCPU > 0 {
-		out.Load.Ratio = snap.Load.Load1 / float64(snap.Load.NumCPU)
-	}
-	out.Load.Supported = snap.Load.Supported
-
-	out.Alerts = make([]jsonAlert, 0, len(alerts))
-	for _, a := range alerts {
-		out.Alerts = append(out.Alerts, jsonAlert{
-			Severity: a.Severity.String(),
-			Resource: a.Resource,
-			Message:  a.Message,
-		})
-	}
-	out.MaxSeverity = sysinfo.MaxSeverity(alerts).String()
-
-	enc := json.NewEncoder(cmd.OutOrStdout())
-	enc.SetIndent("", " ")
-	return enc.Encode(out)
-}
-
-func formatLine(label, values, status string) string {
-	left := fmt.Sprintf("%-7s %s", label, values)
-	pad := statusCol - len(left)
-	if pad < 1 {
-		pad = 1
-	}
-	return left + strings.Repeat(" ", pad) + status
-}
-
-func statusText(sev sysinfo.Severity) string {
-	switch sev {
-	case sysinfo.SeverityWarning:
-		return "\u26a0 WARNING"
-	case sysinfo.SeverityDanger:
-		return "\u2716 DANGER"
-	default:
-		return "\u2713 ok"
-	}
-}
-
-func severityFor(alerts []sysinfo.ResourceAlert, resource string) sysinfo.Severity {
-	for _, a := range alerts {
-		if a.Resource == resource {
-			return a.Severity
-		}
-	}
-	return sysinfo.SeverityOK
-}
-
-func pctOf(used, total uint64) int {
-	if total == 0 {
-		return 0
-	}
-	return int(float64(used) / float64(total) * 100)
+	core.OutputResourcesText(cmd, snap, alerts)
+	return nil
 }
diff --git a/internal/cli/system/cmd/resume/cmd.go b/internal/cli/system/cmd/resume/cmd.go
index 3bfd3bec..1bf1fe5f 100644
--- a/internal/cli/system/cmd/resume/cmd.go
+++ b/internal/cli/system/cmd/resume/cmd.go
@@ -7,13 +7,11 @@
 package resume
 
 import (
-	"fmt"
 	"os"
 
 	"github.com/spf13/cobra"
 
 	"github.com/ActiveMemory/ctx/internal/assets"
-	"github.com/ActiveMemory/ctx/internal/cli/system/core"
 )
 
 // Cmd returns the "ctx system resume" plumbing command.
@@ -21,34 +19,21 @@ import (
 // Returns:
 //   - *cobra.Command: Configured resume subcommand
 func Cmd() *cobra.Command {
-	cmd := &cobra.Command{
-		Use:   "resume",
-		Short: "Resume context hooks for this session",
-		Long: `Removes the session-scoped pause marker. Hooks resume normal
-behavior. Silent no-op if not paused.
+ short, long := assets.CommandDesc(assets.CmdDescKeySystemResume) -The session ID is read from stdin JSON (same as hooks) or --session-id flag.`, + cmd := &cobra.Command{ + Use: "resume", + Short: short, + Long: long, Hidden: true, RunE: func(cmd *cobra.Command, _ []string) error { - return runResume(cmd, os.Stdin) + return Run(cmd, os.Stdin) }, } - cmd.Flags().String("session-id", "", assets.FlagDesc("system.resume.session-id")) - return cmd -} -func runResume(cmd *cobra.Command, stdin *os.File) error { - sessionID, _ := cmd.Flags().GetString("session-id") - if sessionID == "" { - input := core.ReadInput(stdin) - sessionID = input.SessionID - } - if sessionID == "" { - sessionID = core.SessionUnknown - } + cmd.Flags().String("session-id", "", + assets.FlagDesc(assets.FlagDescKeySystemResumeSessionId), + ) - path := core.PauseMarkerPath(sessionID) - _ = os.Remove(path) - cmd.Println(fmt.Sprintf("Context hooks resumed for session %s", sessionID)) - return nil + return cmd } diff --git a/internal/cli/system/cmd/resume/doc.go b/internal/cli/system/cmd/resume/doc.go new file mode 100644 index 00000000..a61e5ea0 --- /dev/null +++ b/internal/cli/system/cmd/resume/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package resume implements the ctx system resume subcommand. +// +// It removes the session-scoped pause marker so that hooks resume +// normal behavior. +package resume diff --git a/internal/cli/system/cmd/resume/run.go b/internal/cli/system/cmd/resume/run.go new file mode 100644 index 00000000..1145b038 --- /dev/null +++ b/internal/cli/system/cmd/resume/run.go @@ -0,0 +1,44 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +package resume + +import ( + "os" + + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/cli/system/core" + "github.com/ActiveMemory/ctx/internal/config/session" + "github.com/ActiveMemory/ctx/internal/write" +) + +// Run executes the resume logic. +// +// Reads a session ID from the --session-id flag or stdin JSON, then +// removes the pause marker file so hooks fire normally again. +// +// Parameters: +// - cmd: Cobra command for output +// - stdin: standard input for hook JSON +// +// Returns: +// - error: Always nil +func Run(cmd *cobra.Command, stdin *os.File) error { + sessionID, _ := cmd.Flags().GetString("session-id") + if sessionID == "" { + input := core.ReadInput(stdin) + sessionID = input.SessionID + } + if sessionID == "" { + sessionID = session.IDUnknown + } + + path := core.PauseMarkerPath(sessionID) + _ = os.Remove(path) + write.SessionResumed(cmd, sessionID) + return nil +} diff --git a/internal/cli/system/cmd/specs_nudge/cmd.go b/internal/cli/system/cmd/specs_nudge/cmd.go new file mode 100644 index 00000000..60963f9c --- /dev/null +++ b/internal/cli/system/cmd/specs_nudge/cmd.go @@ -0,0 +1,33 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package specs_nudge + +import ( + "os" + + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the "ctx system specs-nudge" subcommand. 
+// +// Returns: +// - *cobra.Command: Configured specs-nudge subcommand +func Cmd() *cobra.Command { + short, long := assets.CommandDesc(assets.CmdDescKeySystemSpecsNudge) + + return &cobra.Command{ + Use: "specs-nudge", + Short: short, + Long: long, + Hidden: true, + RunE: func(cmd *cobra.Command, _ []string) error { + return Run(cmd, os.Stdin) + }, + } +} diff --git a/internal/cli/system/cmd/specs_nudge/doc.go b/internal/cli/system/cmd/specs_nudge/doc.go new file mode 100644 index 00000000..3b10b10c --- /dev/null +++ b/internal/cli/system/cmd/specs_nudge/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package specs_nudge implements the ctx system specs-nudge subcommand. +// +// It reminds the agent to save plans to the specs/ directory for release +// tracking when entering plan mode. +package specs_nudge diff --git a/internal/cli/system/cmd/specs_nudge/run.go b/internal/cli/system/cmd/specs_nudge/run.go new file mode 100644 index 00000000..5fe9890f --- /dev/null +++ b/internal/cli/system/cmd/specs_nudge/run.go @@ -0,0 +1,54 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package specs_nudge + +import ( + "os" + + "github.com/ActiveMemory/ctx/internal/config/hook" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/cli/system/core" + ctxcontext "github.com/ActiveMemory/ctx/internal/context" + "github.com/ActiveMemory/ctx/internal/notify" +) + +// Run executes the specs-nudge hook logic. +// +// Emits a PreToolUse nudge reminding the agent to save plans to specs/ +// when a new implementation is detected. Appends a context directory +// footer if available. 
+// +// Parameters: +// - cmd: Cobra command for output +// - stdin: standard input for hook JSON +// +// Returns: +// - error: Always nil (hook errors are non-fatal) +func Run(cmd *cobra.Command, stdin *os.File) error { + if !core.Initialized() { + return nil + } + input, _, paused := core.HookPreamble(stdin) + if paused { + return nil + } + fallback := assets.TextDesc(assets.TextDescKeySpecsNudgeFallback) + msg := core.LoadMessage( + hook.SpecsNudge, hook.VariantNudge, nil, fallback, + ) + if msg == "" { + return nil + } + msg = ctxcontext.AppendDir(msg) + core.PrintHookContext(cmd, hook.EventPreToolUse, msg) + nudgeMsg := assets.TextDesc(assets.TextDescKeySpecsNudgeNudgeMessage) + ref := notify.NewTemplateRef(hook.SpecsNudge, hook.VariantNudge, nil) + core.Relay(hook.SpecsNudge+": "+nudgeMsg, input.SessionID, ref) + return nil +} diff --git a/internal/cli/system/cmd/specsnudge/cmd.go b/internal/cli/system/cmd/specsnudge/cmd.go deleted file mode 100644 index 01a3088f..00000000 --- a/internal/cli/system/cmd/specsnudge/cmd.go +++ /dev/null @@ -1,63 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package specsnudge - -import ( - "os" - - "github.com/spf13/cobra" - - "github.com/ActiveMemory/ctx/internal/cli/system/core" - "github.com/ActiveMemory/ctx/internal/eventlog" - "github.com/ActiveMemory/ctx/internal/notify" -) - -// Cmd returns the "ctx system specs-nudge" subcommand. -// -// Returns: -// - *cobra.Command: Configured specs-nudge subcommand -func Cmd() *cobra.Command { - return &cobra.Command{ - Use: "specs-nudge", - Short: "Plan-to-specs directory nudge", - Long: `Emits a directive reminding the agent to save plans to specs/ -for release tracking. Fires on EnterPlanMode tool use. 
- -Hook event: PreToolUse (EnterPlanMode) -Output: agent directive (always, when .context/ is initialized) -Silent when: .context/ not initialized`, - Hidden: true, - RunE: func(cmd *cobra.Command, _ []string) error { - if !core.IsInitialized() { - return nil - } - input := core.ReadInput(os.Stdin) - sessionID := input.SessionID - if sessionID == "" { - sessionID = core.SessionUnknown - } - if core.Paused(sessionID) > 0 { - return nil - } - fallback := "Save your plan to specs/ — these documents track what was designed" + - " for the current release. Use specs/feature-name.md naming. If this" + - " is a quick fix that doesn't need a spec, proceed without one." - msg := core.LoadMessage("specs-nudge", "nudge", nil, fallback) - if msg == "" { - return nil - } - if line := core.ContextDirLine(); line != "" { - msg += " [" + line + "]" - } - core.PrintHookContext(cmd, "PreToolUse", msg) - ref := notify.NewTemplateRef("specs-nudge", "nudge", nil) - _ = notify.Send("relay", "specs-nudge: plan-to-specs nudge emitted", input.SessionID, ref) - eventlog.Append("relay", "specs-nudge: plan-to-specs nudge emitted", input.SessionID, ref) - return nil - }, - } -} diff --git a/internal/cli/system/cmd/stats/cmd.go b/internal/cli/system/cmd/stats/cmd.go index 03055585..84e1913f 100644 --- a/internal/cli/system/cmd/stats/cmd.go +++ b/internal/cli/system/cmd/stats/cmd.go @@ -17,28 +17,29 @@ import ( // Returns: // - *cobra.Command: Configured stats subcommand func Cmd() *cobra.Command { + short, long := assets.CommandDesc(assets.CmdDescKeySystemStats) + cmd := &cobra.Command{ Use: "stats", - Short: "Show session token usage stats", - Long: `Display per-session token usage statistics from stats JSONL files. - -By default, shows the last 20 entries across all sessions. Use --follow -to stream new entries as they arrive (like tail -f). 
- -Flags: - --follow, -f Stream new entries as they arrive - --session, -s Filter by session ID (prefix match) - --last, -n Show last N entries (default 20) - --json, -j Output raw JSONL`, + Short: short, + Long: long, RunE: func(cmd *cobra.Command, _ []string) error { - return runStats(cmd) + return Run(cmd) }, } - cmd.Flags().BoolP("follow", "f", false, assets.FlagDesc("system.stats.follow")) - cmd.Flags().StringP("session", "s", "", assets.FlagDesc("system.stats.session")) - cmd.Flags().IntP("last", "n", 20, assets.FlagDesc("system.stats.last")) - cmd.Flags().BoolP("json", "j", false, assets.FlagDesc("system.stats.json")) + cmd.Flags().BoolP("follow", "f", false, + assets.FlagDesc(assets.FlagDescKeySystemStatsFollow), + ) + cmd.Flags().StringP("session", "s", "", + assets.FlagDesc(assets.FlagDescKeySystemStatsSession), + ) + cmd.Flags().IntP("last", "n", 20, + assets.FlagDesc(assets.FlagDescKeySystemStatsLast), + ) + cmd.Flags().BoolP("json", "j", false, + assets.FlagDesc(assets.FlagDescKeySystemStatsJson), + ) return cmd } diff --git a/internal/cli/system/cmd/stats/doc.go b/internal/cli/system/cmd/stats/doc.go new file mode 100644 index 00000000..16526c6d --- /dev/null +++ b/internal/cli/system/cmd/stats/doc.go @@ -0,0 +1,10 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package stats implements the ctx system stats subcommand. +// +// It displays per-session token usage statistics from stats JSONL files. 
+package stats diff --git a/internal/cli/system/cmd/stats/run.go b/internal/cli/system/cmd/stats/run.go index 55d67f6e..facdf223 100644 --- a/internal/cli/system/cmd/stats/run.go +++ b/internal/cli/system/cmd/stats/run.go @@ -7,300 +7,45 @@ package stats import ( - "encoding/json" - "fmt" - "os" "path/filepath" - "sort" - "strings" - "time" + "github.com/ActiveMemory/ctx/internal/config/dir" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/cli/system/core" - "github.com/ActiveMemory/ctx/internal/config" "github.com/ActiveMemory/ctx/internal/rc" ) -func runStats(cmd *cobra.Command) error { +// Run executes the stats subcommand, reading and displaying per-session +// token usage statistics from JSONL state files. Supports filtering by +// session, limiting output count, JSON output, and live follow mode. +// +// Parameters: +// - cmd: Cobra command for flag access and output +// +// Returns: +// - error: Non-nil on stats directory read failure +func Run(cmd *cobra.Command) error { follow, _ := cmd.Flags().GetBool("follow") session, _ := cmd.Flags().GetString("session") last, _ := cmd.Flags().GetInt("last") jsonOut, _ := cmd.Flags().GetBool("json") - dir := filepath.Join(rc.ContextDir(), config.DirState) + dir := filepath.Join(rc.ContextDir(), dir.State) - entries, readErr := ReadStatsDir(dir, session) + entries, readErr := core.ReadStatsDir(dir, session) if readErr != nil { return readErr } if !follow { - return DumpStats(cmd, entries, last, jsonOut) + return core.DumpStats(cmd, entries, last, jsonOut) } // Dump existing entries first, then stream. - if dumpErr := DumpStats(cmd, entries, last, jsonOut); dumpErr != nil { + if dumpErr := core.DumpStats(cmd, entries, last, jsonOut); dumpErr != nil { return dumpErr } - return streamStats(cmd, dir, session, jsonOut) -} - -// StatsEntry is a core.SessionStats with the source file for display. 
-type StatsEntry struct { - core.SessionStats - Session string `json:"session"` -} - -// ReadStatsDir reads all stats JSONL files, optionally filtered by session prefix. -// -// Parameters: -// - dir: Path to the state directory -// - sessionFilter: Session ID prefix to filter by (empty for all) -// -// Returns: -// - []StatsEntry: Sorted stats entries -// - error: Non-nil on glob failure -func ReadStatsDir(dir, sessionFilter string) ([]StatsEntry, error) { - pattern := filepath.Join(dir, "stats-*.jsonl") - matches, globErr := filepath.Glob(pattern) - if globErr != nil { - return nil, fmt.Errorf("globbing stats files: %w", globErr) - } - - var entries []StatsEntry - for _, path := range matches { - sid := ExtractSessionID(filepath.Base(path)) - if sessionFilter != "" && !strings.HasPrefix(sid, sessionFilter) { - continue - } - fileEntries, parseErr := ParseStatsFile(path, sid) - if parseErr != nil { - continue - } - entries = append(entries, fileEntries...) - } - - sort.Slice(entries, func(i, j int) bool { - ti, ei := time.Parse(time.RFC3339, entries[i].Timestamp) - tj, ej := time.Parse(time.RFC3339, entries[j].Timestamp) - if ei != nil || ej != nil { - return entries[i].Timestamp < entries[j].Timestamp - } - return ti.Before(tj) - }) - - return entries, nil -} - -// ExtractSessionID gets the session ID from a filename like "stats-abc123.jsonl". -// -// Parameters: -// - basename: File basename -// -// Returns: -// - string: Session ID -func ExtractSessionID(basename string) string { - s := strings.TrimPrefix(basename, "stats-") - return strings.TrimSuffix(s, ".jsonl") -} - -// ParseStatsFile reads all JSONL lines from a stats file. 
-// -// Parameters: -// - path: Absolute path to the stats file -// - sid: Session ID for this file -// -// Returns: -// - []StatsEntry: Parsed entries -// - error: Non-nil on read failure -func ParseStatsFile(path, sid string) ([]StatsEntry, error) { - data, readErr := os.ReadFile(path) //nolint:gosec // project-local state path - if readErr != nil { - return nil, readErr - } - - var entries []StatsEntry - for _, line := range strings.Split(strings.TrimSpace(string(data)), config.NewlineLF) { - if line == "" { - continue - } - var s core.SessionStats - if jsonErr := json.Unmarshal([]byte(line), &s); jsonErr != nil { - continue - } - entries = append(entries, StatsEntry{SessionStats: s, Session: sid}) - } - return entries, nil -} - -// DumpStats outputs the last N entries. -// -// Parameters: -// - cmd: Cobra command for output -// - entries: Stats entries to display -// - last: Number of entries to show (0 for all) -// - jsonOut: Whether to output as JSONL -// -// Returns: -// - error: Non-nil on output failure -func DumpStats(cmd *cobra.Command, entries []StatsEntry, last int, jsonOut bool) error { - if len(entries) == 0 { - cmd.Println("No stats recorded yet.") - return nil - } - - // Tail: take last N entries. - if last > 0 && len(entries) > last { - entries = entries[len(entries)-last:] - } - - if jsonOut { - return outputStatsJSON(cmd, entries) - } - - PrintStatsHeader(cmd) - for i := range entries { - PrintStatsLine(cmd, &entries[i]) - } - return nil -} - -// streamStats polls for new JSONL lines and prints them as they arrive. -func streamStats(cmd *cobra.Command, dir, sessionFilter string, jsonOut bool) error { - // Track file sizes to detect new content. 
- offsets := make(map[string]int64) - matches, _ := filepath.Glob(filepath.Join(dir, "stats-*.jsonl")) - for _, path := range matches { - info, statErr := os.Stat(path) - if statErr == nil { - offsets[path] = info.Size() - } - } - - ticker := time.NewTicker(time.Second) - defer ticker.Stop() - - for range ticker.C { - matches, _ = filepath.Glob(filepath.Join(dir, "stats-*.jsonl")) - for _, path := range matches { - sid := ExtractSessionID(filepath.Base(path)) - if sessionFilter != "" && !strings.HasPrefix(sid, sessionFilter) { - continue - } - - info, statErr := os.Stat(path) - if statErr != nil { - continue - } - prev := offsets[path] - if info.Size() <= prev { - continue - } - - newEntries := ReadNewLines(path, prev, sid) - for i := range newEntries { - if jsonOut { - line, marshalErr := json.Marshal(newEntries[i]) - if marshalErr == nil { - cmd.Println(string(line)) - } - } else { - PrintStatsLine(cmd, &newEntries[i]) - } - } - offsets[path] = info.Size() - } - } - - return nil -} - -// ReadNewLines reads bytes from offset to end and parses JSONL lines. 
-// -// Parameters: -// - path: Absolute path to the stats file -// - offset: Byte offset to start reading from -// - sid: Session ID for this file -// -// Returns: -// - []StatsEntry: Newly parsed entries -func ReadNewLines(path string, offset int64, sid string) []StatsEntry { - f, openErr := os.Open(path) //nolint:gosec // project-local state path - if openErr != nil { - return nil - } - defer func() { _ = f.Close() }() - - if _, seekErr := f.Seek(offset, 0); seekErr != nil { - return nil - } - - buf := make([]byte, 8192) - n, readErr := f.Read(buf) - if readErr != nil || n == 0 { - return nil - } - - var entries []StatsEntry - for _, line := range strings.Split(strings.TrimSpace(string(buf[:n])), config.NewlineLF) { - if line == "" { - continue - } - var s core.SessionStats - if jsonErr := json.Unmarshal([]byte(line), &s); jsonErr != nil { - continue - } - entries = append(entries, StatsEntry{SessionStats: s, Session: sid}) - } - return entries -} - -// outputStatsJSON writes entries as raw JSONL. -func outputStatsJSON(cmd *cobra.Command, entries []StatsEntry) error { - for _, e := range entries { - line, marshalErr := json.Marshal(e) - if marshalErr != nil { - continue - } - cmd.Println(string(line)) - } - return nil -} - -// PrintStatsHeader prints the column header for human output. -// -// Parameters: -// - cmd: Cobra command for output -func PrintStatsHeader(cmd *cobra.Command) { - cmd.Println(fmt.Sprintf("%-19s %-8s %6s %8s %4s %-12s", - "TIME", "SESSION", "PROMPT", "TOKENS", "PCT", "EVENT")) - cmd.Println(fmt.Sprintf("%-19s %-8s %6s %8s %4s %-12s", - "-------------------", "--------", "------", "--------", "----", "------------")) -} - -// PrintStatsLine prints a single stats entry in human-readable format. 
-// -// Parameters: -// - cmd: Cobra command for output -// - e: Stats entry to print -func PrintStatsLine(cmd *cobra.Command, e *StatsEntry) { - ts := formatStatsTimestamp(e.Timestamp) - sid := e.Session - if len(sid) > 8 { - sid = sid[:8] - } - tokens := core.FormatTokenCount(e.Tokens) - cmd.Println(fmt.Sprintf("%-19s %-8s %6d %7s %3d%% %-12s", - ts, sid, e.Prompt, tokens, e.Pct, e.Event)) -} - -// formatStatsTimestamp converts an RFC3339 timestamp to local time display. -func formatStatsTimestamp(ts string) string { - t, parseErr := time.Parse(time.RFC3339, ts) - if parseErr != nil { - return ts - } - return t.Local().Format("2006-01-02 15:04:05") + return core.StreamStats(cmd, dir, session, jsonOut) } diff --git a/internal/cli/system/core/backup.go b/internal/cli/system/core/backup.go new file mode 100644 index 00000000..4a469cff --- /dev/null +++ b/internal/cli/system/core/backup.go @@ -0,0 +1,300 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package core + +import ( + "archive/tar" + "compress/gzip" + "fmt" + "io" + "io/fs" + "os" + "path/filepath" + "time" + + "github.com/ActiveMemory/ctx/internal/config/archive" + "github.com/ActiveMemory/ctx/internal/config/dir" + fs2 "github.com/ActiveMemory/ctx/internal/config/fs" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" + ctxerr "github.com/ActiveMemory/ctx/internal/err" +) + +// BackupProject creates a project-scoped backup archive. 
+// +// Parameters: +// - cmd: Cobra command for diagnostic output +// - home: user home directory +// - timestamp: formatted timestamp for the archive filename +// - smb: optional SMB configuration (nil to skip remote copy) +// +// Returns: +// - BackupResult: archive path, size, and optional SMB destination +// - error: non-nil on archive or SMB failure +func BackupProject( + cmd *cobra.Command, home, timestamp string, smb *SMBConfig, +) (BackupResult, error) { + cwd, cwdErr := os.Getwd() + if cwdErr != nil { + return BackupResult{}, cwdErr + } + + archiveName := fmt.Sprintf(archive.BackupTplProjectArchive, timestamp) + archivePath := filepath.Join(os.TempDir(), archiveName) + + entries := []ArchiveEntry{ + {SourcePath: filepath.Join(cwd, dir.Context), Prefix: dir.Context, ExcludeDir: dir.JournalSite}, + {SourcePath: filepath.Join(cwd, dir.Claude), Prefix: dir.Claude}, + {SourcePath: filepath.Join(cwd, dir.Ideas), Prefix: dir.Ideas, Optional: true}, + {SourcePath: filepath.Join(home, archive.Bashrc), Prefix: archive.Bashrc}, + } + + result, finalizeErr := finalizeArchive( + cmd, archivePath, archiveName, archive.BackupScopeProject, entries, smb, + ) + if finalizeErr != nil { + return result, finalizeErr + } + + // Touch marker file for check-backup-age hook. + markerDir := filepath.Join(home, archive.BackupMarkerDir) + _ = os.MkdirAll(markerDir, fs2.PermExec) + markerPath := filepath.Join(markerDir, archive.BackupMarkerFile) + TouchFile(markerPath) + + return result, nil +} + +// BackupGlobal creates a global-scoped backup archive. 
+// +// Parameters: +// - cmd: Cobra command for diagnostic output +// - home: user home directory +// - timestamp: formatted timestamp for the archive filename +// - smb: optional SMB configuration (nil to skip remote copy) +// +// Returns: +// - BackupResult: archive path, size, and optional SMB destination +// - error: non-nil on archive or SMB failure +func BackupGlobal( + cmd *cobra.Command, home, timestamp string, smb *SMBConfig, +) (BackupResult, error) { + archiveName := fmt.Sprintf(archive.BackupTplGlobalArchive, timestamp) + archivePath := filepath.Join(os.TempDir(), archiveName) + + entries := []ArchiveEntry{ + {SourcePath: filepath.Join(home, dir.Claude), Prefix: dir.Claude, ExcludeDir: archive.BackupExcludeTodos}, + } + + return finalizeArchive( + cmd, archivePath, archiveName, archive.BackupScopeGlobal, entries, smb, + ) +} + +// finalizeArchive creates the archive, populates the result with size, +// and optionally copies to an SMB share. +func finalizeArchive( + cmd *cobra.Command, archivePath, archiveName, scope string, + entries []ArchiveEntry, smb *SMBConfig, +) (BackupResult, error) { + if archiveErr := CreateArchive(archivePath, entries, cmd); archiveErr != nil { + return BackupResult{}, archiveErr + } + + result := BackupResult{Scope: scope, Archive: archivePath} + if info, statErr := os.Stat(archivePath); statErr == nil { + result.Size = info.Size() + } + + if smb != nil { + if mountErr := EnsureSMBMount(smb); mountErr != nil { + return result, mountErr + } + if copyErr := CopyToSMB(smb, archivePath); copyErr != nil { + return result, copyErr + } + result.SMBDest = filepath.Join(smb.GVFSPath, smb.Subdir, archiveName) + } + + return result, nil +} + +// CreateArchive builds a tar.gz archive from the given entries. 
+// +// Parameters: +// - archivePath: output file path for the archive +// - entries: directories and files to include +// - cmd: Cobra command for diagnostic output +// +// Returns: +// - error: non-nil on file creation or tar writing failure +func CreateArchive( + archivePath string, entries []ArchiveEntry, cmd *cobra.Command, +) error { + outFile, createErr := os.Create(archivePath) //nolint:gosec // tmp path from our own code + if createErr != nil { + return ctxerr.CreateArchive(createErr) + } + defer func() { _ = outFile.Close() }() + + gzw := gzip.NewWriter(outFile) + defer func() { _ = gzw.Close() }() + + tw := tar.NewWriter(gzw) + defer func() { _ = tw.Close() }() + + for _, entry := range entries { + if addErr := addEntry(tw, entry, cmd); addErr != nil { + return addErr + } + } + return nil +} + +// addEntry adds a single ArchiveEntry (file or directory) to the tar writer. +func addEntry(tw *tar.Writer, entry ArchiveEntry, cmd *cobra.Command) error { + info, statErr := os.Stat(entry.SourcePath) + if os.IsNotExist(statErr) { + if entry.Optional { + cmd.PrintErrln(fmt.Sprintf("skipping %s (not found)", entry.Prefix)) + return nil + } + return ctxerr.SourceNotFound(entry.SourcePath) + } + if statErr != nil { + return statErr + } + + if !info.IsDir() { + return addSingleFile(tw, entry.SourcePath, entry.Prefix, info) + } + + return filepath.WalkDir(entry.SourcePath, + func(path string, d fs.DirEntry, walkErr error) error { + if walkErr != nil { + return walkErr + } + if d.IsDir() && entry.ExcludeDir != "" && d.Name() == entry.ExcludeDir { + return filepath.SkipDir + } + if d.Type()&os.ModeSymlink != 0 { + return nil + } + + rel, relErr := filepath.Rel(entry.SourcePath, path) + if relErr != nil { + return relErr + } + + name := filepath.ToSlash(filepath.Join(entry.Prefix, rel)) + + fileInfo, infoErr := d.Info() + if infoErr != nil { + return infoErr + } + + header, headerErr := tar.FileInfoHeader(fileInfo, "") + if headerErr != nil { + return headerErr + } + 
header.Name = name + + if writeErr := tw.WriteHeader(header); writeErr != nil { + return writeErr + } + + if d.IsDir() { + return nil + } + return copyFileToTar(tw, path) + }) +} + +// addSingleFile writes a single file entry into the tar. +func addSingleFile( + tw *tar.Writer, path, name string, info fs.FileInfo, +) error { + header, headerErr := tar.FileInfoHeader(info, "") + if headerErr != nil { + return headerErr + } + header.Name = name + + if writeErr := tw.WriteHeader(header); writeErr != nil { + return writeErr + } + return copyFileToTar(tw, path) +} + +// copyFileToTar reads a file and writes its contents to the tar writer. +func copyFileToTar(tw *tar.Writer, path string) error { + f, openErr := os.Open(path) //nolint:gosec // paths are from our own entries + if openErr != nil { + return openErr + } + defer func() { _ = f.Close() }() + _, copyErr := io.Copy(tw, f) + return copyErr +} + +// CheckSMBMountWarnings checks whether the GVFS mount for the given SMB URL +// exists and appends warning strings if the share is not mounted. +// +// Parameters: +// - smbURL: the SMB share URL from the environment +// - warnings: existing warning slice to append to +// +// Returns: +// - []string: the warnings slice, possibly with SMB mount warnings appended +func CheckSMBMountWarnings(smbURL string, warnings []string) []string { + cfg, cfgErr := ParseSMBConfig(smbURL, "") + if cfgErr != nil { + return warnings + } + + if _, statErr := os.Stat(cfg.GVFSPath); os.IsNotExist(statErr) { + warnings = append(warnings, + fmt.Sprintf(assets.TextDesc(assets.TextDescKeyBackupSMBNotMounted), cfg.Host), + assets.TextDesc(assets.TextDescKeyBackupSMBUnavailable), + ) + } + + return warnings +} + +// CheckBackupMarker checks the backup marker file age and appends warnings +// when the marker is missing or older than config.BackupMaxAgeDays. 
+// +// Parameters: +// - markerPath: absolute path to the backup marker file +// - warnings: existing warning slice to append to +// +// Returns: +// - []string: the warnings slice, possibly with staleness warnings appended +func CheckBackupMarker(markerPath string, warnings []string) []string { + info, statErr := os.Stat(markerPath) + if os.IsNotExist(statErr) { + return append(warnings, + assets.TextDesc(assets.TextDescKeyBackupNoMarker), + assets.TextDesc(assets.TextDescKeyBackupRunHint), + ) + } + if statErr != nil { + return warnings + } + + ageDays := int(time.Since(info.ModTime()).Hours() / 24) + if ageDays >= archive.BackupMaxAgeDays { + return append(warnings, + fmt.Sprintf(assets.TextDesc(assets.TextDescKeyBackupStale), ageDays), + assets.TextDesc(assets.TextDescKeyBackupRunHint), + ) + } + + return warnings +} diff --git a/internal/cli/system/core/bootstrap.go b/internal/cli/system/core/bootstrap.go new file mode 100644 index 00000000..de698551 --- /dev/null +++ b/internal/cli/system/core/bootstrap.go @@ -0,0 +1,129 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package core + +import ( + "os" + "path/filepath" + "sort" + "strings" + + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/cli/initialize" + "github.com/ActiveMemory/ctx/internal/config/bootstrap" + "github.com/ActiveMemory/ctx/internal/config/file" + "github.com/ActiveMemory/ctx/internal/config/token" +) + +// PluginWarning returns a warning string if the ctx plugin is installed +// but not enabled in either global or local settings. +// +// Returns: +// - string: warning message, or empty string if no warning is needed. 
+func PluginWarning() string { + if !initialize.PluginInstalled() { + return "" + } + if initialize.PluginEnabledGlobally() || initialize.PluginEnabledLocally() { + return "" + } + return assets.TextDesc(assets.TextDescKeyBootstrapPluginWarning) +} + +// ListContextFiles reads the given directory and returns sorted .md filenames. +// +// Parameters: +// - dir: absolute path to the context directory. +// +// Returns: +// - []string: sorted list of Markdown filenames, or nil on read error. +func ListContextFiles(dir string) []string { + entries, readErr := os.ReadDir(dir) + if readErr != nil { + return nil + } + + var files []string + for _, e := range entries { + if e.IsDir() { + continue + } + if strings.EqualFold(filepath.Ext(e.Name()), file.ExtMarkdown) { + files = append(files, e.Name()) + } + } + sort.Strings(files) + return files +} + +// WrapFileList formats file names as a comma-separated list, wrapping lines +// at approximately maxWidth characters. Continuation lines are prefixed with +// the given indent string. Returns the "none" label from assets when the +// list is empty. +// +// Parameters: +// - files: list of filenames to format. +// - maxWidth: approximate character width before wrapping. +// - indent: prefix string for each line. +// +// Returns: +// - string: formatted, wrapped file list. +func WrapFileList(files []string, maxWidth int, indent string) string { + if len(files) == 0 { + return indent + assets.TextDesc(assets.TextDescKeyBootstrapNone) + } + + var lines []string + current := indent + + for i, f := range files { + entry := f + if i < len(files)-1 { + entry += "," + } + + switch { + case current == indent: + // First entry on this line — always add it. + current += entry + case len(current)+1+len(entry) > maxWidth: + // Would exceed width — start a new line. 
+ lines = append(lines, current) + current = indent + entry + default: + current += " " + entry + } + } + lines = append(lines, current) + return strings.Join(lines, token.NewlineLF) +} + +// ParseNumberedLines splits a numbered multiline string into individual +// items, stripping the leading "N. " prefix from each line. Empty lines +// are skipped. +// +// Parameters: +// - text: multiline string with "1. ...\n2. ..." formatting +// +// Returns: +// - []string: list of items with number prefixes removed +func ParseNumberedLines(text string) []string { + lines := strings.Split(text, token.NewlineLF) + items := make([]string, 0, len(lines)) + for _, line := range lines { + line = strings.TrimSpace(line) + if line == "" { + continue + } + if idx := strings.Index(line, bootstrap.NumberedListSep); idx >= 0 && + idx <= bootstrap.NumberedListMaxDigits { + line = line[idx+len(bootstrap.NumberedListSep):] + } + items = append(items, line) + } + return items +} diff --git a/internal/cli/system/core/ceremony.go b/internal/cli/system/core/ceremony.go new file mode 100644 index 00000000..05e9170d --- /dev/null +++ b/internal/cli/system/core/ceremony.go @@ -0,0 +1,131 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package core + +import ( + "os" + "path/filepath" + "sort" + "strings" + + "github.com/ActiveMemory/ctx/internal/config/ceremony" + "github.com/ActiveMemory/ctx/internal/config/file" + "github.com/ActiveMemory/ctx/internal/config/hook" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// RecentJournalFiles returns the n most recent markdown files in the given +// journal directory, sorted by filename descending (newest first). 
+// +// Parameters: +// - dir: absolute path to the journal directory +// - n: maximum number of files to return +// +// Returns: +// - []string: absolute paths to the most recent journal files, or nil on +// read error or empty directory +func RecentJournalFiles(dir string, n int) []string { + entries, readErr := os.ReadDir(dir) + if readErr != nil { + return nil + } + + var names []string + for _, e := range entries { + if e.IsDir() || !strings.HasSuffix(e.Name(), file.ExtMarkdown) { + continue + } + names = append(names, e.Name()) + } + + sort.Sort(sort.Reverse(sort.StringSlice(names))) + + if len(names) > n { + names = names[:n] + } + + paths := make([]string, len(names)) + for i, name := range names { + paths[i] = filepath.Join(dir, name) + } + return paths +} + +// ScanJournalsForCeremonies checks whether the given journal files contain +// references to /ctx-remember and /ctx-wrap-up ceremony commands. +// +// Parameters: +// - files: absolute paths to journal files to scan +// +// Returns: +// - remember: true if any file contains "ctx-remember" +// - wrapup: true if any file contains "ctx-wrap-up" +func ScanJournalsForCeremonies(files []string) (remember, wrapup bool) { + for _, path := range files { + data, readErr := os.ReadFile(path) //nolint:gosec // journal file path + if readErr != nil { + continue + } + content := string(data) + if !remember && strings.Contains(content, ceremony.CeremonyRememberCmd) { + remember = true + } + if !wrapup && strings.Contains(content, ceremony.CeremonyWrapUpCmd) { + wrapup = true + } + if remember && wrapup { + return + } + } + return +} + +// EmitCeremonyNudge builds and prints a ceremony nudge message box based on +// which ceremonies (remember, wrapup) are missing from recent sessions. 
+// +// Parameters: +// - cmd: Cobra command used for output +// - remember: whether /ctx-remember was found in recent journals +// - wrapup: whether /ctx-wrap-up was found in recent journals +// +// Returns: +// - msg: the formatted nudge message, or empty string if no content +// - variant: the selected variant string for notifications +func EmitCeremonyNudge(cmd *cobra.Command, remember, wrapup bool) (msg, variant string) { + var boxTitleKey, fallbackKey string + + switch { + case !remember && !wrapup: + variant = hook.VariantBoth + boxTitleKey = assets.TextDescKeyCeremonyBoxBoth + fallbackKey = assets.TextDescKeyCeremonyFallbackBoth + case !remember: + variant = hook.VariantRemember + boxTitleKey = assets.TextDescKeyCeremonyBoxRemember + fallbackKey = assets.TextDescKeyCeremonyFallbackRemember + case !wrapup: + variant = hook.VariantWrapup + boxTitleKey = assets.TextDescKeyCeremonyBoxWrapup + fallbackKey = assets.TextDescKeyCeremonyFallbackWrapup + } + + boxTitle := assets.TextDesc(boxTitleKey) + fallback := assets.TextDesc(fallbackKey) + + content := LoadMessage(hook.CheckCeremonies, variant, nil, fallback) + if content == "" { + return "", variant + } + + relayPrefix := assets.TextDesc(assets.TextDescKeyCeremonyRelayPrefix) + + msg = NudgeBox(relayPrefix, boxTitle, content) + cmd.Println(msg) + return msg, variant +} diff --git a/internal/cli/system/core/context_mtime.go b/internal/cli/system/core/context_mtime.go index 6aa818ad..43002964 100644 --- a/internal/cli/system/core/context_mtime.go +++ b/internal/cli/system/core/context_mtime.go @@ -10,7 +10,7 @@ import ( "os" "strings" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/file" ) // GetLatestContextMtime returns the most recent mtime of any .context/*.md file. 
@@ -28,7 +28,7 @@ func GetLatestContextMtime(contextDir string) int64 { var latest int64 for _, entry := range entries { - if entry.IsDir() || !strings.HasSuffix(entry.Name(), config.ExtMarkdown) { + if entry.IsDir() || !strings.HasSuffix(entry.Name(), file.ExtMarkdown) { continue } info, infoErr := entry.Info() diff --git a/internal/cli/system/core/context_size.go b/internal/cli/system/core/context_size.go new file mode 100644 index 00000000..568adb77 --- /dev/null +++ b/internal/cli/system/core/context_size.go @@ -0,0 +1,199 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package core + +import ( + "fmt" + "os" + "path/filepath" + "strconv" + + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/config/hook" + "github.com/ActiveMemory/ctx/internal/config/regex" + "github.com/ActiveMemory/ctx/internal/config/stats" + "github.com/ActiveMemory/ctx/internal/config/token" + "github.com/ActiveMemory/ctx/internal/config/tpl" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/notify" + "github.com/ActiveMemory/ctx/internal/rc" + "github.com/ActiveMemory/ctx/internal/validation" +) + +// TokenUsageLine formats a context window usage line for display. +// Shows an icon (normal or warning), token count, percentage, and window size. 
+// +// Parameters: +// - tokens: number of tokens used +// - pct: percentage of context window used +// - windowSize: total context window size +// +// Returns: +// - string: formatted usage line (e.g., "⏱ Context window: ~12k tokens (~60% of 200k)") +func TokenUsageLine(tokens, pct, windowSize int) string { + icon := assets.TextDesc(assets.TextDescKeyCheckContextSizeTokenNormal) + suffix := "" + if pct >= stats.ContextWindowThresholdPct { + icon = assets.TextDesc(assets.TextDescKeyCheckContextSizeTokenLow) + suffix = assets.TextDesc(assets.TextDescKeyCheckContextSizeRunningLowSuffix) + } + return fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckContextSizeTokenUsage), + icon, FormatTokenCount(tokens), pct, FormatWindowSize(windowSize), suffix) +} + +// OversizeNudgeContent checks for an injection-oversize flag file and returns +// the raw nudge content if present. Deletes the flag after reading (one-shot). +// +// Returns: +// - string: raw oversize nudge content, or empty string if no flag +func OversizeNudgeContent() string { + baseDir := filepath.Join(rc.ContextDir(), dir.State) + flagPath := filepath.Join(baseDir, stats.ContextSizeInjectionOversizeFlag) + data, readErr := validation.SafeReadFile(baseDir, stats.ContextSizeInjectionOversizeFlag) + if readErr != nil { + return "" + } + + tokenCount := ExtractOversizeTokens(data) + fallback := fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckContextSizeOversizeFallback), tokenCount) + content := LoadMessage(hook.CheckContextSize, hook.VariantOversize, + map[string]any{tpl.VarTokenCount: tokenCount}, fallback) + if content == "" { + _ = os.Remove(flagPath) // silenced, still consume the flag + return "" + } + + _ = os.Remove(flagPath) // one-shot: consumed + return content +} + +// ExtractOversizeTokens parses the token count from an injection-oversize flag file. 
+// +// Parameters: +// - data: raw bytes from the flag file +// +// Returns: +// - int: parsed token count, or 0 if not found +func ExtractOversizeTokens(data []byte) int { + m := regex.OversizeTokens.FindSubmatch(data) + if m == nil { + return 0 + } + n, parseErr := strconv.Atoi(string(m[1])) + if parseErr != nil { + return 0 + } + return n +} + +// EmitCheckpoint emits the standard checkpoint box with optional token usage. +// +// Parameters: +// - cmd: Cobra command for output +// - logFile: absolute path to the log file +// - sessionID: session identifier +// - count: current prompt count +// - tokens: token usage count +// - pct: context window usage percentage +// - windowSize: total context window size +func EmitCheckpoint(cmd *cobra.Command, logFile, sessionID string, count, tokens, pct, windowSize int) { + fallback := assets.TextDesc(assets.TextDescKeyCheckContextSizeCheckpointFallback) + content := LoadMessage(hook.CheckContextSize, hook.VariantCheckpoint, nil, fallback) + if content == "" { + LogMessage(logFile, sessionID, fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckContextSizeSilencedCheckpointLog), count)) + return + } + // Append optional token usage and oversize nudge to content + if tokens > 0 { + content += token.NewlineLF + TokenUsageLine(tokens, pct, windowSize) + } + if extra := OversizeNudgeContent(); extra != "" { + content += token.NewlineLF + extra + } + cmd.Println(NudgeBox( + assets.TextDesc(assets.TextDescKeyCheckContextSizeRelayPrefix), + fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckContextSizeCheckpointBoxTitle), count), + content)) + cmd.Println() + LogMessage(logFile, sessionID, fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckContextSizeCheckpointLogFormat), count, tokens, pct)) + ref := notify.NewTemplateRef(hook.CheckContextSize, hook.VariantCheckpoint, nil) + checkpointMsg := hook.CheckContextSize + ": " + fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckContextSizeCheckpointRelayFormat), count) + 
NudgeAndRelay(checkpointMsg, sessionID, ref) +} + +// EmitWindowWarning emits an independent context window warning (>80%). +// +// Parameters: +// - cmd: Cobra command for output +// - logFile: absolute path to the log file +// - sessionID: session identifier +// - count: current prompt count +// - tokens: token usage count +// - pct: context window usage percentage +func EmitWindowWarning(cmd *cobra.Command, logFile, sessionID string, count, tokens, pct int) { + fallback := fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckContextSizeWindowFallback), pct, FormatTokenCount(tokens)) + content := LoadMessage(hook.CheckContextSize, hook.VariantWindow, + map[string]any{tpl.VarPercentage: pct, tpl.VarTokenCount: FormatTokenCount(tokens)}, fallback) + if content == "" { + LogMessage(logFile, sessionID, fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckContextSizeSilencedWindowLog), count, pct)) + return + } + cmd.Println(NudgeBox( + assets.TextDesc(assets.TextDescKeyCheckContextSizeRelayPrefix), + assets.TextDesc(assets.TextDescKeyCheckContextSizeWindowBoxTitle), + content)) + cmd.Println() + LogMessage(logFile, sessionID, fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckContextSizeWindowLogFormat), count, tokens, pct)) + ref := notify.NewTemplateRef(hook.CheckContextSize, hook.VariantWindow, + map[string]any{tpl.VarPercentage: pct, tpl.VarTokenCount: FormatTokenCount(tokens)}) + windowMsg := hook.CheckContextSize + ": " + fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckContextSizeWindowRelayFormat), pct) + NudgeAndRelay(windowMsg, sessionID, ref) +} + +// EmitBillingWarning emits a one-shot warning when token usage crosses the +// billing_token_warn threshold. 
+// +// Parameters: +// - cmd: Cobra command for output +// - logFile: absolute path to the log file +// - sessionID: session identifier +// - count: current prompt count +// - tokens: token usage count +// - threshold: billing token warning threshold +func EmitBillingWarning(cmd *cobra.Command, logFile, sessionID string, count, tokens, threshold int) { + // One-shot guard: skip if already warned this session. + warnedFile := filepath.Join(StateDir(), stats.ContextSizeBillingWarnedPrefix+sessionID) + if _, statErr := os.Stat(warnedFile); statErr == nil { + return // already fired + } + + fallback := fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckContextSizeBillingFallback), + FormatTokenCount(tokens), FormatTokenCount(threshold)) + content := LoadMessage(hook.CheckContextSize, hook.VariantBilling, + map[string]any{tpl.VarTokenCount: FormatTokenCount(tokens), tpl.VarThreshold: FormatTokenCount(threshold)}, fallback) + if content == "" { + LogMessage(logFile, sessionID, fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckContextSizeSilencedBillingLog), count, tokens, threshold)) + TouchFile(warnedFile) // silenced counts as fired + return + } + + cmd.Println(NudgeBox( + assets.TextDesc(assets.TextDescKeyCheckContextSizeBillingRelayPrefix), + assets.TextDesc(assets.TextDescKeyCheckContextSizeBillingBoxTitle), + content)) + cmd.Println() + + TouchFile(warnedFile) // one-shot: mark as fired + LogMessage(logFile, sessionID, fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckContextSizeBillingLogFormat), count, tokens, threshold)) + ref := notify.NewTemplateRef(hook.CheckContextSize, hook.VariantBilling, + map[string]any{tpl.VarTokenCount: FormatTokenCount(tokens), tpl.VarThreshold: FormatTokenCount(threshold)}) + billingMsg := hook.CheckContextSize + ": " + fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckContextSizeBillingRelayFormat), + FormatTokenCount(tokens), FormatTokenCount(threshold)) + NudgeAndRelay(billingMsg, sessionID, ref) +} diff --git 
a/internal/cli/system/core/events.go b/internal/cli/system/core/events.go new file mode 100644 index 00000000..8230668d --- /dev/null +++ b/internal/cli/system/core/events.go @@ -0,0 +1,112 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package core + +import ( + "encoding/json" + "fmt" + "strings" + "time" + + "github.com/ActiveMemory/ctx/internal/config/event" + time2 "github.com/ActiveMemory/ctx/internal/config/time" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/notify" +) + +// FormatEventTimestamp converts an RFC3339 timestamp to local time display +// using the DateTimePreciseFormat layout. +// +// Parameters: +// - ts: RFC3339-formatted timestamp string +// +// Returns: +// - string: local time formatted as "2006-01-02 15:04:05", or the +// original string on parse failure +func FormatEventTimestamp(ts string) string { + t, parseErr := time.Parse(time.RFC3339, ts) + if parseErr != nil { + return ts + } + return t.Local().Format(time2.DateTimePreciseFormat) +} + +// ExtractHookName gets the hook name from an event payload's detail field. +// Falls back to extracting from the message prefix (e.g., "qa-reminder: ..."). +// +// Parameters: +// - e: event payload to inspect +// +// Returns: +// - string: hook name, or EventsHookFallback if undetermined +func ExtractHookName(e notify.Payload) string { + if e.Detail != nil && e.Detail.Hook != "" { + return e.Detail.Hook + } + // Fall back to extracting from message prefix (e.g., "qa-reminder: ...") + if idx := strings.Index(e.Message, ":"); idx > 0 { + return e.Message[:idx] + } + return event.EventsHookFallback +} + +// TruncateMessage limits message length for display, appending a +// truncation suffix when the message exceeds maxLen characters. 
+// +// Parameters: +// - msg: message to potentially truncate +// - maxLen: maximum allowed length including suffix +// +// Returns: +// - string: original or truncated message +func TruncateMessage(msg string, maxLen int) string { + if len(msg) <= maxLen { + return msg + } + return msg[:maxLen-len(event.EventsTruncationSuffix)] + + event.EventsTruncationSuffix +} + +// OutputEventsJSON writes events as raw JSONL to the command output. +// +// Parameters: +// - cmd: Cobra command for output +// - evts: event payloads to serialize +// +// Returns: +// - error: Always nil (marshal errors are silently skipped) +func OutputEventsJSON(cmd *cobra.Command, evts []notify.Payload) error { + for _, e := range evts { + line, marshalErr := json.Marshal(e) + if marshalErr != nil { + continue + } + cmd.Println(string(line)) + } + return nil +} + +// OutputEventsHuman writes events in aligned columns for human reading. +// +// Parameters: +// - cmd: Cobra command for output +// - evts: event payloads to display +// +// Returns: +// - error: Always nil +func OutputEventsHuman(cmd *cobra.Command, evts []notify.Payload) error { + fmtStr := assets.TextDesc(assets.TextDescKeyEventsHumanFormat) + for _, e := range evts { + ts := FormatEventTimestamp(e.Timestamp) + hookName := ExtractHookName(e) + msg := TruncateMessage(e.Message, event.EventsMessageMaxLen) + cmd.Println(fmt.Sprintf(fmtStr, ts, e.Event, hookName, msg)) + } + return nil +} diff --git a/internal/cli/system/core/heartbeat.go b/internal/cli/system/core/heartbeat.go new file mode 100644 index 00000000..a93c4355 --- /dev/null +++ b/internal/cli/system/core/heartbeat.go @@ -0,0 +1,43 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package core + +import ( + "os" + "strconv" + "strings" +) + +// ReadMtime reads a stored mtime value from a file. 
Returns 0 if the +// file does not exist or the content cannot be parsed as an integer. +// +// Parameters: +// - path: absolute path to the mtime state file +// +// Returns: +// - int64: the stored mtime value, or 0 on error +func ReadMtime(path string) int64 { + data, readErr := os.ReadFile(path) //nolint:gosec // temp file path + if readErr != nil { + return 0 + } + n, parseErr := strconv.ParseInt(strings.TrimSpace(string(data)), 10, 64) + if parseErr != nil { + return 0 + } + return n +} + +// WriteMtime writes an mtime value to a file. Errors are silently +// ignored (best-effort state persistence). +// +// Parameters: +// - path: absolute path to the mtime state file +// - mtime: the mtime value to store +func WriteMtime(path string, mtime int64) { + _ = os.WriteFile(path, []byte(strconv.FormatInt(mtime, 10)), 0o600) +} diff --git a/internal/cli/system/core/input.go b/internal/cli/system/core/input.go index ed80fb46..680b7369 100644 --- a/internal/cli/system/core/input.go +++ b/internal/cli/system/core/input.go @@ -12,35 +12,10 @@ import ( "os" "time" + "github.com/ActiveMemory/ctx/internal/config/session" "github.com/spf13/cobra" ) -// HookInput represents the JSON payload that Claude Code sends to hook -// commands via stdin. -type HookInput struct { - SessionID string `json:"session_id"` - ToolInput ToolInput `json:"tool_input"` -} - -// ToolInput contains the tool-specific fields from a Claude Code hook -// invocation. For Bash hooks, Command holds the shell command. -type ToolInput struct { - Command string `json:"command"` -} - -// HookResponse is the JSON output format for Claude Code hooks. -// Using structured JSON ensures the agent processes the output as a directive -// rather than treating it as ignorable plain text. -type HookResponse struct { - HookSpecificOutput *HookSpecificOutput `json:"hookSpecificOutput,omitempty"` -} - -// HookSpecificOutput carries event-specific fields inside a HookResponse. 
-type HookSpecificOutput struct { - HookEventName string `json:"hookEventName"` - AdditionalContext string `json:"additionalContext,omitempty"` -} - // PrintHookContext emits a JSON HookResponse with additionalContext for the // given hook event. This is the standard way for non-blocking hooks to inject // directives that the agent will actually process (plain text gets ignored). @@ -60,6 +35,26 @@ func PrintHookContext(cmd *cobra.Command, event, context string) { cmd.Println(string(data)) } +// HookPreamble reads hook input, resolves the session ID, and checks the +// pause state. Most hooks share this exact preamble sequence. +// +// Parameters: +// - stdin: standard input for hook JSON +// +// Returns: +// - input: parsed hook input +// - sessionID: resolved session identifier (falls back to session.IDUnknown) +// - paused: true if the session is currently paused +func HookPreamble(stdin *os.File) (input HookInput, sessionID string, paused bool) { + input = ReadInput(stdin) + sessionID = input.SessionID + if sessionID == "" { + sessionID = session.IDUnknown + } + paused = Paused(sessionID) > 0 + return +} + // ReadInput reads and parses the JSON hook input from r. // Returns a zero-value HookInput on any error (graceful degradation). // diff --git a/internal/cli/system/core/journal_check.go b/internal/cli/system/core/journal_check.go new file mode 100644 index 00000000..30cc6fc8 --- /dev/null +++ b/internal/cli/system/core/journal_check.go @@ -0,0 +1,93 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package core + +import ( + "os" + "path/filepath" + "strings" + + "github.com/ActiveMemory/ctx/internal/journal/state" +) + +// NewestMtime returns the most recent mtime (as Unix timestamp) of files +// with the given extension in the directory. Returns 0 if none found. 
+// +// Parameters: +// - dir: absolute path to the directory to scan +// - ext: file extension to match (e.g. file.ExtMarkdown) +// +// Returns: +// - int64: Unix timestamp of the newest matching file, or 0 +func NewestMtime(dir, ext string) int64 { + entries, readErr := os.ReadDir(dir) + if readErr != nil { + return 0 + } + + var latest int64 + for _, entry := range entries { + if entry.IsDir() || !strings.HasSuffix(entry.Name(), ext) { + continue + } + info, infoErr := entry.Info() + if infoErr != nil { + continue + } + mtime := info.ModTime().Unix() + if mtime > latest { + latest = mtime + } + } + return latest +} + +// CountNewerFiles recursively counts files with the given extension that +// are newer than the reference timestamp. +// +// Parameters: +// - dir: absolute path to the root directory to walk +// - ext: file extension to match (e.g. ".jsonl") +// - refTime: Unix timestamp threshold; only files newer than this are counted +// +// Returns: +// - int: number of matching files newer than refTime +func CountNewerFiles(dir, ext string, refTime int64) int { + count := 0 + _ = filepath.Walk(dir, func(_ string, info os.FileInfo, walkErr error) error { + if walkErr != nil { + return nil // skip errors + } + if info.IsDir() { + return nil + } + if !strings.HasSuffix(info.Name(), ext) { + return nil + } + if info.ModTime().Unix() > refTime { + count++ + } + return nil + }) + return count +} + +// CountUnenriched counts journal .md files that lack an enriched date +// in the journal state file. 
+// +// Parameters: +// - dir: absolute path to the journal directory +// +// Returns: +// - int: number of unenriched journal entries +func CountUnenriched(dir string) int { + jstate, loadErr := state.Load(dir) + if loadErr != nil { + return 0 + } + return jstate.CountUnenriched(dir) +} diff --git a/internal/cli/system/core/knowledge.go b/internal/cli/system/core/knowledge.go new file mode 100644 index 00000000..5d23ac2c --- /dev/null +++ b/internal/cli/system/core/knowledge.go @@ -0,0 +1,149 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package core + +import ( + "bytes" + "fmt" + "strings" + + "github.com/ActiveMemory/ctx/internal/config/ctx" + "github.com/ActiveMemory/ctx/internal/config/hook" + "github.com/ActiveMemory/ctx/internal/config/token" + "github.com/ActiveMemory/ctx/internal/config/tpl" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/index" + "github.com/ActiveMemory/ctx/internal/notify" + "github.com/ActiveMemory/ctx/internal/rc" + "github.com/ActiveMemory/ctx/internal/validation" +) + +// ScanKnowledgeFiles checks knowledge files against their configured +// thresholds and returns any that exceed the limits. 
+// +// Parameters: +// - contextDir: absolute path to the context directory +// - decThreshold: max decision entries (0 = disabled) +// - lrnThreshold: max learning entries (0 = disabled) +// - convThreshold: max convention lines (0 = disabled) +// +// Returns: +// - []KnowledgeFinding: files exceeding thresholds, or nil if all within limits +func ScanKnowledgeFiles( + contextDir string, decThreshold, lrnThreshold, convThreshold int, +) []KnowledgeFinding { + var findings []KnowledgeFinding + + if decThreshold > 0 { + if data, readErr := validation.SafeReadFile(contextDir, ctx.Decision); readErr == nil { + count := len(index.ParseEntryBlocks(string(data))) + if count > decThreshold { + findings = append(findings, KnowledgeFinding{ + File: ctx.Decision, Count: count, Threshold: decThreshold, Unit: "entries", + }) + } + } + } + + if lrnThreshold > 0 { + if data, readErr := validation.SafeReadFile(contextDir, ctx.Learning); readErr == nil { + count := len(index.ParseEntryBlocks(string(data))) + if count > lrnThreshold { + findings = append(findings, KnowledgeFinding{ + File: ctx.Learning, Count: count, Threshold: lrnThreshold, Unit: "entries", + }) + } + } + } + + if convThreshold > 0 { + if data, readErr := validation.SafeReadFile(contextDir, ctx.Convention); readErr == nil { + lineCount := bytes.Count(data, []byte(token.NewlineLF)) + if lineCount > convThreshold { + findings = append(findings, KnowledgeFinding{ + File: ctx.Convention, Count: lineCount, Threshold: convThreshold, Unit: "lines", + }) + } + } + } + + return findings +} + +// FormatKnowledgeWarnings builds a pre-formatted findings list string +// from the given findings. 
+// +// Parameters: +// - findings: knowledge file threshold violations +// +// Returns: +// - string: formatted warning lines for template injection +func FormatKnowledgeWarnings(findings []KnowledgeFinding) string { + var b strings.Builder + findingFmt := assets.TextDesc(assets.TextDescKeyCheckKnowledgeFindingFormat) + for _, f := range findings { + b.WriteString(fmt.Sprintf(findingFmt, f.File, f.Count, f.Unit, f.Threshold)) + } + return b.String() +} + +// EmitKnowledgeWarning builds and prints the knowledge file growth warning box. +// +// Parameters: +// - cmd: Cobra command for output +// - sessionID: session identifier for notifications +// - fileWarnings: pre-formatted findings text +func EmitKnowledgeWarning(cmd *cobra.Command, sessionID, fileWarnings string) { + fallback := fileWarnings + token.NewlineLF + assets.TextDesc(assets.TextDescKeyCheckKnowledgeFallback) + content := LoadMessage(hook.CheckKnowledge, hook.VariantWarning, + map[string]any{tpl.VarFileWarnings: fileWarnings}, fallback) + if content == "" { + return + } + + cmd.Println(NudgeBox( + assets.TextDesc(assets.TextDescKeyCheckKnowledgeRelayPrefix), + assets.TextDesc(assets.TextDescKeyCheckKnowledgeBoxTitle), + content)) + + ref := notify.NewTemplateRef(hook.CheckKnowledge, hook.VariantWarning, + map[string]any{tpl.VarFileWarnings: fileWarnings}) + notifyMsg := hook.CheckKnowledge + ": " + assets.TextDesc(assets.TextDescKeyCheckKnowledgeRelayMessage) + NudgeAndRelay(notifyMsg, sessionID, ref) +} + +// CheckKnowledgeHealth runs the full knowledge health check: scans files, +// formats warnings, and emits output if any thresholds are exceeded. +// Returns true if warnings were emitted. 
+// +// Parameters: +// - cmd: Cobra command for output +// - sessionID: session identifier for notifications +// +// Returns: +// - bool: true if warnings were emitted +func CheckKnowledgeHealth(cmd *cobra.Command, sessionID string) bool { + lrnThreshold := rc.EntryCountLearnings() + decThreshold := rc.EntryCountDecisions() + convThreshold := rc.ConventionLineCount() + + // All disabled — nothing to check + if lrnThreshold == 0 && decThreshold == 0 && convThreshold == 0 { + return false + } + + findings := ScanKnowledgeFiles(rc.ContextDir(), decThreshold, lrnThreshold, convThreshold) + if len(findings) == 0 { + return false + } + + fileWarnings := FormatKnowledgeWarnings(findings) + EmitKnowledgeWarning(cmd, sessionID, fileWarnings) + return true +} diff --git a/internal/cli/system/core/load_gate.go b/internal/cli/system/core/load_gate.go new file mode 100644 index 00000000..68475ed4 --- /dev/null +++ b/internal/cli/system/core/load_gate.go @@ -0,0 +1,84 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package core + +import ( + "fmt" + "os" + "path/filepath" + "strings" + "time" + + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/config/fs" + "github.com/ActiveMemory/ctx/internal/config/marker" + "github.com/ActiveMemory/ctx/internal/config/stats" + "github.com/ActiveMemory/ctx/internal/config/token" + "github.com/ActiveMemory/ctx/internal/rc" +) + +// ExtractIndex returns the content between INDEX:START and INDEX:END +// markers within a context file. 
+// +// Parameters: +// - content: full file content to search +// +// Returns: +// - string: trimmed index content, or empty string if markers are +// not found or improperly ordered +func ExtractIndex(content string) string { + start := strings.Index(content, marker.IndexStart) + end := strings.Index(content, marker.IndexEnd) + if start < 0 || end < 0 || end <= start { + return "" + } + startPos := start + len(marker.IndexStart) + return strings.TrimSpace(content[startPos:end]) +} + +// WriteOversizeFlag writes an injection-oversize flag file when the total +// injected tokens exceed the configured threshold. The flag file is read +// by check-context-size to emit an oversize warning. +// +// Parameters: +// - contextDir: absolute path to the .context/ directory +// - totalTokens: total injected token count +// - perFile: per-file token breakdown for diagnostics +func WriteOversizeFlag( + contextDir string, totalTokens int, perFile []FileTokenEntry, +) { + threshold := rc.InjectionTokenWarn() + if threshold == 0 || totalTokens <= threshold { + return + } + + sd := filepath.Join(contextDir, dir.State) + _ = os.MkdirAll(sd, fs.PermRestrictedDir) + + var flag strings.Builder + flag.WriteString(assets.TextDesc(assets.TextDescKeyContextLoadGateOversizeHeader)) + flag.WriteString(strings.Repeat("=", stats.ContextSizeOversizeSepLen) + token.NewlineLF) + flag.WriteString(fmt.Sprintf( + assets.TextDesc(assets.TextDescKeyContextLoadGateOversizeTimestamp), + time.Now().UTC().Format(time.RFC3339))) + flag.WriteString(fmt.Sprintf( + assets.TextDesc(assets.TextDescKeyContextLoadGateOversizeInjected), + totalTokens, threshold)) + flag.WriteString(assets.TextDesc(assets.TextDescKeyContextLoadGateOversizeBreakdown)) + for _, entry := range perFile { + flag.WriteString(fmt.Sprintf( + assets.TextDesc(assets.TextDescKeyContextLoadGateOversizeFileEntry), + entry.Name, entry.Tokens)) + } + flag.WriteString(token.NewlineLF) + 
flag.WriteString(assets.TextDesc(assets.TextDescKeyContextLoadGateOversizeAction)) + + _ = os.WriteFile( + filepath.Join(sd, stats.ContextSizeInjectionOversizeFlag), + []byte(flag.String()), fs.PermSecret) +} diff --git a/internal/cli/system/core/map_staleness.go b/internal/cli/system/core/map_staleness.go new file mode 100644 index 00000000..8f11a04c --- /dev/null +++ b/internal/cli/system/core/map_staleness.go @@ -0,0 +1,96 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package core + +import ( + "encoding/json" + "fmt" + "os/exec" + "strings" + + "github.com/ActiveMemory/ctx/internal/config/architecture" + "github.com/ActiveMemory/ctx/internal/config/hook" + "github.com/ActiveMemory/ctx/internal/config/token" + "github.com/ActiveMemory/ctx/internal/config/tpl" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/notify" + "github.com/ActiveMemory/ctx/internal/rc" + "github.com/ActiveMemory/ctx/internal/validation" +) + +// ReadMapTracking reads and parses the map-tracking.json file from the +// context directory. +// +// Returns: +// - *MapTrackingInfo: parsed tracking info, or nil if not found or invalid +func ReadMapTracking() *MapTrackingInfo { + data, readErr := validation.SafeReadFile(rc.ContextDir(), architecture.MapTracking) + if readErr != nil { + return nil + } + + var info MapTrackingInfo + if jsonErr := json.Unmarshal(data, &info); jsonErr != nil { + return nil + } + + return &info +} + +// CountModuleCommits counts git commits touching internal/ since the given date. 
+// +// Parameters: +// - since: date string in YYYY-MM-DD format +// +// Returns: +// - int: number of commits, or 0 on error or if git is unavailable +func CountModuleCommits(since string) int { + if _, lookErr := exec.LookPath("git"); lookErr != nil { + return 0 + } + out, gitErr := exec.Command("git", "log", "--oneline", "--since="+since, "--", "internal/").Output() //nolint:gosec // date string from JSON + if gitErr != nil { + return 0 + } + lines := strings.TrimSpace(string(out)) + if lines == "" { + return 0 + } + return len(strings.Split(lines, token.NewlineLF)) +} + +// EmitMapStalenessWarning builds and prints the architecture map staleness +// warning box. +// +// Parameters: +// - cmd: Cobra command for output +// - sessionID: session identifier for notifications +// - dateStr: last refresh date (YYYY-MM-DD) +// - moduleCommits: number of commits touching modules since last refresh +func EmitMapStalenessWarning(cmd *cobra.Command, sessionID, dateStr string, moduleCommits int) { + fallback := fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckMapStalenessFallback), dateStr, moduleCommits) + content := LoadMessage(hook.CheckMapStaleness, hook.VariantStale, + map[string]any{ + tpl.VarLastRefreshDate: dateStr, + tpl.VarModuleCount: moduleCommits, + }, fallback) + if content == "" { + return + } + + cmd.Println(NudgeBox( + assets.TextDesc(assets.TextDescKeyCheckMapStalenessRelayPrefix), + assets.TextDesc(assets.TextDescKeyCheckMapStalenessBoxTitle), + content)) + + ref := notify.NewTemplateRef(hook.CheckMapStaleness, hook.VariantStale, + map[string]any{tpl.VarLastRefreshDate: dateStr, tpl.VarModuleCount: moduleCommits}) + notifyMsg := hook.CheckMapStaleness + ": " + assets.TextDesc(assets.TextDescKeyCheckMapStalenessRelayMessage) + NudgeAndRelay(notifyMsg, sessionID, ref) +} diff --git a/internal/cli/system/core/message.go b/internal/cli/system/core/message.go index b936b032..d4f05019 100644 --- a/internal/cli/system/core/message.go +++ 
b/internal/cli/system/core/message.go @@ -8,14 +8,19 @@ package core import ( "bytes" - "os" "path/filepath" "strings" "text/template" "github.com/ActiveMemory/ctx/internal/assets" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/box" + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/config/file" + "github.com/ActiveMemory/ctx/internal/config/session" + "github.com/ActiveMemory/ctx/internal/config/token" + ctxcontext "github.com/ActiveMemory/ctx/internal/context" "github.com/ActiveMemory/ctx/internal/rc" + "github.com/ActiveMemory/ctx/internal/validation" ) // LoadMessage loads a hook message template by hook name and variant. @@ -38,11 +43,11 @@ import ( // Returns: // - string: Rendered message or empty string for intentional silence func LoadMessage(hook, variant string, vars map[string]any, fallback string) string { - filename := variant + ".txt" + filename := variant + file.ExtTxt // 1. User override in .context/ - userPath := filepath.Join(rc.ContextDir(), "hooks", "messages", hook, filename) - if data, readErr := os.ReadFile(userPath); readErr == nil { //nolint:gosec // project-local override path + overrideDir := filepath.Join(rc.ContextDir(), dir.HooksMessages, hook) + if data, readErr := validation.SafeReadFile(overrideDir, filename); readErr == nil { return renderTemplate(string(data), vars, fallback) } @@ -64,7 +69,7 @@ func renderTemplate(tmpl string, vars map[string]any, fallback string) string { return "" // intentional silence } - t, parseErr := template.New("msg").Parse(tmpl) + t, parseErr := template.New(session.TemplateName).Parse(tmpl) if parseErr != nil { return fallback } @@ -76,16 +81,6 @@ func renderTemplate(tmpl string, vars map[string]any, fallback string) string { return buf.String() } -// BoxBottom is the standard bottom border for hook message boxes. 
-const BoxBottom = "└──────────────────────────────────────────────────" - -// VariantBoth is the template variant name used when both ceremony -// conditions are unmet (e.g. neither remember nor wrapup done). -const VariantBoth = "both" - -// SessionUnknown is the fallback session ID used when input lacks one. -const SessionUnknown = "unknown" - // BoxLines wraps each line of content with the │ box-drawing prefix. // Trailing newlines on content are trimmed before splitting to avoid // an empty trailing box line. @@ -97,10 +92,36 @@ const SessionUnknown = "unknown" // - string: Box-wrapped content func BoxLines(content string) string { var b strings.Builder - for _, line := range strings.Split(strings.TrimRight(content, config.NewlineLF), config.NewlineLF) { - b.WriteString("│ ") + for _, line := range strings.Split(strings.TrimRight(content, token.NewlineLF), token.NewlineLF) { + b.WriteString(box.LinePrefix) b.WriteString(line) - b.WriteString(config.NewlineLF) + b.WriteString(token.NewlineLF) } return b.String() } + +// NudgeBox builds a complete nudge box with relay prefix, titled top +// border, box-wrapped content, optional context directory footer, and +// bottom border. 
+// +// Parameters: +// - relayPrefix: VERBATIM relay instruction line +// - title: box title (e.g., "Backup Warning") +// - content: multi-line body text +// +// Returns: +// - string: fully formatted nudge box +func NudgeBox(relayPrefix, title, content string) string { + pad := box.NudgeBoxWidth - len(title) + if pad < 0 { + pad = 0 + } + msg := relayPrefix + token.NewlineLF + token.NewlineLF + + box.Top + title + " " + strings.Repeat("─", pad) + token.NewlineLF + msg += BoxLines(content) + if line := ctxcontext.DirLine(); line != "" { + msg += box.LinePrefix + line + token.NewlineLF + } + msg += box.Bottom + return msg +} diff --git a/internal/cli/system/core/message_cmd.go b/internal/cli/system/core/message_cmd.go new file mode 100644 index 00000000..3a79b382 --- /dev/null +++ b/internal/cli/system/core/message_cmd.go @@ -0,0 +1,83 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package core + +import ( + "fmt" + "os" + "path/filepath" + "strings" + + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/config/file" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/assets/hooks/messages" + ctxerr "github.com/ActiveMemory/ctx/internal/err" + "github.com/ActiveMemory/ctx/internal/rc" +) + +// ValidationError returns an error for an unknown hook/variant combination. +// It distinguishes between an entirely unknown hook and an unknown variant +// within a known hook. 
+// +// Parameters: +// - hook: the hook name to validate +// - variant: the variant name to validate +// +// Returns: +// - error: descriptive error with guidance to list available options +func ValidationError(hook, variant string) error { + if messages.Variants(hook) == nil { + return ctxerr.UnknownHook(hook) + } + return ctxerr.UnknownVariant(variant, hook) +} + +// PrintTemplateVars prints available template variables for a hook message. +// If no variables are defined, prints a "(none)" indicator. +// +// Parameters: +// - cmd: Cobra command for output +// - info: hook message info containing template variable names +func PrintTemplateVars(cmd *cobra.Command, info *messages.HookMessageInfo) { + if len(info.TemplateVars) == 0 { + cmd.Println(assets.TextDesc(assets.TextDescKeyMessageTemplateVarsNone)) + return + } + formatted := make([]string, len(info.TemplateVars)) + for i, v := range info.TemplateVars { + formatted[i] = "{{." + v + "}}" + } + cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyMessageTemplateVarsLabel), strings.Join(formatted, ", "))) +} + +// OverridePath returns the user override file path for a hook/variant. +// +// Parameters: +// - hook: hook name +// - variant: template variant name +// +// Returns: +// - string: full filesystem path to the override file +func OverridePath(hook, variant string) string { + return filepath.Join(rc.ContextDir(), dir.HooksMessages, hook, variant+file.ExtTxt) +} + +// HasOverride checks whether a user override file exists. 
+// +// Parameters: +// - hook: hook name +// - variant: template variant name +// +// Returns: +// - bool: true if an override file exists +func HasOverride(hook, variant string) bool { + _, statErr := os.Stat(OverridePath(hook, variant)) + return statErr == nil +} diff --git a/internal/cli/system/core/persistence.go b/internal/cli/system/core/persistence.go new file mode 100644 index 00000000..02de7ada --- /dev/null +++ b/internal/cli/system/core/persistence.go @@ -0,0 +1,94 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package core + +import ( + "fmt" + "os" + "path/filepath" + "strconv" + "strings" + + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/config/fs" + "github.com/ActiveMemory/ctx/internal/config/nudge" + "github.com/ActiveMemory/ctx/internal/config/token" + "github.com/ActiveMemory/ctx/internal/validation" +) + +// ReadPersistenceState reads a persistence state file and returns the +// parsed state. Returns ok=false if the file does not exist or cannot +// be read. 
+// +// Parameters: +// - path: absolute path to the state file +// +// Returns: +// - PersistenceState: parsed counter state +// - bool: true if the file was read successfully +func ReadPersistenceState(path string) (PersistenceState, bool) { + data, readErr := validation.SafeReadFile(filepath.Dir(path), filepath.Base(path)) + if readErr != nil { + return PersistenceState{}, false + } + + var ps PersistenceState + for _, line := range strings.Split(strings.TrimSpace(string(data)), token.NewlineLF) { + parts := strings.SplitN(line, token.KeyValueSep, 2) + if len(parts) != 2 { + continue + } + switch parts[0] { + case nudge.PersistenceKeyCount: + n, parseErr := strconv.Atoi(parts[1]) + if parseErr == nil { + ps.Count = n + } + case nudge.PersistenceKeyLastNudge: + n, parseErr := strconv.Atoi(parts[1]) + if parseErr == nil { + ps.LastNudge = n + } + case nudge.PersistenceKeyLastMtime: + n, parseErr := strconv.ParseInt(parts[1], 10, 64) + if parseErr == nil { + ps.LastMtime = n + } + } + } + return ps, true +} + +// WritePersistenceState writes the persistence state to the given file. +// +// Parameters: +// - path: absolute path to the state file +// - s: state to persist +func WritePersistenceState(path string, s PersistenceState) { + content := fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckPersistenceStateFormat), + s.Count, s.LastNudge, s.LastMtime) + _ = os.WriteFile(path, []byte(content), fs.PermSecret) +} + +// PersistenceNudgeNeeded determines whether a persistence nudge should +// fire based on prompt count and the number of prompts since the last nudge. 
+// +// Parameters: +// - count: total prompt count for the session +// - sinceNudge: number of prompts since the last nudge or context update +// +// Returns: +// - bool: true if a nudge should be emitted +func PersistenceNudgeNeeded(count, sinceNudge int) bool { + if count >= nudge.PersistenceEarlyMin && count <= nudge.PersistenceEarlyMax && sinceNudge >= nudge.PersistenceEarlyInterval { + return true + } + if count > nudge.PersistenceEarlyMax && sinceNudge >= nudge.PersistenceLateInterval { + return true + } + return false +} diff --git a/internal/cli/system/core/prune.go b/internal/cli/system/core/prune.go index 128bb4c5..894abb80 100644 --- a/internal/cli/system/core/prune.go +++ b/internal/cli/system/core/prune.go @@ -12,6 +12,8 @@ import ( "path/filepath" "regexp" "time" + + time2 "github.com/ActiveMemory/ctx/internal/config/time" ) // UUIDPattern matches a UUID (v4) anywhere in a filename. @@ -37,7 +39,7 @@ func AutoPrune(days int) int { return 0 } - cutoff := time.Now().Add(-time.Duration(days) * 24 * time.Hour) + cutoff := time.Now().Add(-time.Duration(days) * time2.HoursPerDay * time.Hour) var pruned int for _, entry := range entries { diff --git a/internal/cli/system/core/relay.go b/internal/cli/system/core/relay.go new file mode 100644 index 00000000..33e55bec --- /dev/null +++ b/internal/cli/system/core/relay.go @@ -0,0 +1,39 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package core + +import ( + "github.com/ActiveMemory/ctx/internal/config/hook" + "github.com/ActiveMemory/ctx/internal/eventlog" + "github.com/ActiveMemory/ctx/internal/notify" +) + +// Relay sends a relay notification and appends the same event to the +// local event log. This is the standard two-sink pattern used by most +// hooks after emitting output. 
+// +// Parameters: +// - message: human-readable event description +// - sessionID: current session identifier +// - ref: template reference for filtering/aggregation (may be nil) +func Relay(message, sessionID string, ref *notify.TemplateRef) { + _ = notify.Send(hook.NotifyChannelRelay, message, sessionID, ref) + eventlog.Append(hook.NotifyChannelRelay, message, sessionID, ref) +} + +// NudgeAndRelay sends both a nudge and a relay notification, then +// appends the relay event to the local event log. Used by hooks that +// emit both notification types with the same message. +// +// Parameters: +// - message: human-readable event description +// - sessionID: current session identifier +// - ref: template reference for filtering/aggregation (may be nil) +func NudgeAndRelay(message, sessionID string, ref *notify.TemplateRef) { + _ = notify.Send(hook.NotifyChannelNudge, message, sessionID, ref) + Relay(message, sessionID, ref) +} diff --git a/internal/cli/system/core/resources.go b/internal/cli/system/core/resources.go new file mode 100644 index 00000000..1f589525 --- /dev/null +++ b/internal/cli/system/core/resources.go @@ -0,0 +1,256 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package core + +import ( + "encoding/json" + "fmt" + "strings" + + "github.com/ActiveMemory/ctx/internal/config/stats" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/sysinfo" +) + +// PctOf calculates the percentage of used relative to total. +// Returns 0 when total is zero to avoid division by zero. 
+// +// Parameters: +// - used: the consumed amount +// - total: the total capacity +// +// Returns: +// - int: percentage (0-100) +func PctOf(used, total uint64) int { + if total == 0 { + return 0 + } + return int(float64(used) / float64(total) * 100) +} + +// SeverityFor returns the severity level for a given resource name +// from an alert list. Returns SeverityOK if no alert matches. +// +// Parameters: +// - alerts: list of resource alerts to search +// - resource: resource name to match (e.g., "memory", "disk") +// +// Returns: +// - sysinfo.Severity: the severity level for the resource +func SeverityFor(alerts []sysinfo.ResourceAlert, resource string) sysinfo.Severity { + for _, a := range alerts { + if a.Resource == resource { + return a.Severity + } + } + return sysinfo.SeverityOK +} + +// StatusText returns a human-readable status indicator string for a +// severity level, using the embedded text assets. +// +// Parameters: +// - sev: severity level +// +// Returns: +// - string: formatted status text (e.g., "✓ ok", "⚠ WARNING", "✖ DANGER") +func StatusText(sev sysinfo.Severity) string { + switch sev { + case sysinfo.SeverityWarning: + return assets.TextDesc(assets.TextDescKeyResourcesStatusWarn) + case sysinfo.SeverityDanger: + return assets.TextDesc(assets.TextDescKeyResourcesStatusDanger) + default: + return assets.TextDesc(assets.TextDescKeyResourcesStatusOk) + } +} + +// FormatResourceLine builds a single resource output line with +// left-aligned label+values and a right-aligned status indicator +// at the configured column position. 
+// +// Parameters: +// - label: resource label (e.g., "Memory:") +// - values: formatted resource values +// - status: status indicator text +// +// Returns: +// - string: formatted line with aligned status +func FormatResourceLine(label, values, status string) string { + left := fmt.Sprintf("%-7s %s", label, values) + pad := stats.ResourcesStatusCol - len(left) + if pad < 1 { + pad = 1 + } + return left + strings.Repeat(" ", pad) + status +} + +// OutputResourcesText prints system resource information in human-readable +// table format with status indicators and alert summaries. +// +// Parameters: +// - cmd: Cobra command for output +// - snap: collected system resource snapshot +// - alerts: evaluated resource alerts +func OutputResourcesText(cmd *cobra.Command, snap sysinfo.Snapshot, alerts []sysinfo.ResourceAlert) { + cmd.Println(assets.TextDesc(assets.TextDescKeyResourcesHeader)) + cmd.Println(assets.TextDesc(assets.TextDescKeyResourcesSeparator)) + cmd.Println() + + // Memory line + if snap.Memory.Supported { + pct := PctOf(snap.Memory.UsedBytes, snap.Memory.TotalBytes) + values := fmt.Sprintf("%5s / %5s GB (%d%%)", + sysinfo.FormatGiB(snap.Memory.UsedBytes), + sysinfo.FormatGiB(snap.Memory.TotalBytes), + pct) + sev := SeverityFor(alerts, "memory") + cmd.Println(FormatResourceLine("Memory:", values, StatusText(sev))) + } + + // Swap line + if snap.Memory.Supported { + pct := PctOf(snap.Memory.SwapUsedBytes, snap.Memory.SwapTotalBytes) + values := fmt.Sprintf("%5s / %5s GB (%d%%)", + sysinfo.FormatGiB(snap.Memory.SwapUsedBytes), + sysinfo.FormatGiB(snap.Memory.SwapTotalBytes), + pct) + sev := SeverityFor(alerts, "swap") + cmd.Println(FormatResourceLine("Swap:", values, StatusText(sev))) + } + + // Disk line + if snap.Disk.Supported { + pct := PctOf(snap.Disk.UsedBytes, snap.Disk.TotalBytes) + values := fmt.Sprintf("%5s / %5s GB (%d%%)", + sysinfo.FormatGiB(snap.Disk.UsedBytes), + sysinfo.FormatGiB(snap.Disk.TotalBytes), + pct) + sev := SeverityFor(alerts, 
"disk") + cmd.Println(FormatResourceLine("Disk:", values, StatusText(sev))) + } + + // Load line + if snap.Load.Supported { + ratio := 0.0 + if snap.Load.NumCPU > 0 { + ratio = snap.Load.Load1 / float64(snap.Load.NumCPU) + } + values := fmt.Sprintf("%5.2f / %5.2f / %5.2f (%d CPUs, ratio %.2f)", + snap.Load.Load1, snap.Load.Load5, snap.Load.Load15, + snap.Load.NumCPU, ratio) + sev := SeverityFor(alerts, "load") + cmd.Println(FormatResourceLine("Load:", values, StatusText(sev))) + } + + // Summary + cmd.Println() + if len(alerts) == 0 { + cmd.Println(assets.TextDesc(assets.TextDescKeyResourcesAllClear)) + } else { + cmd.Println(assets.TextDesc(assets.TextDescKeyResourcesAlerts)) + for _, a := range alerts { + if a.Severity == sysinfo.SeverityDanger { + cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyResourcesAlertDanger), a.Message)) + } else { + cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyResourcesAlertWarning), a.Message)) + } + } + } +} + +// OutputResourcesJSON writes system resource information as formatted +// JSON to the command's output writer. 
+// +// Parameters: +// - cmd: Cobra command for output +// - snap: collected system resource snapshot +// - alerts: evaluated resource alerts +// +// Returns: +// - error: Non-nil on JSON encoding failure +func OutputResourcesJSON(cmd *cobra.Command, snap sysinfo.Snapshot, alerts []sysinfo.ResourceAlert) error { + type jsonAlert struct { + Severity string `json:"severity"` + Resource string `json:"resource"` + Message string `json:"message"` + } + type jsonOutput struct { + Memory struct { + TotalBytes uint64 `json:"total_bytes"` + UsedBytes uint64 `json:"used_bytes"` + Percent int `json:"percent"` + Supported bool `json:"supported"` + } `json:"memory"` + Swap struct { + TotalBytes uint64 `json:"total_bytes"` + UsedBytes uint64 `json:"used_bytes"` + Percent int `json:"percent"` + Supported bool `json:"supported"` + } `json:"swap"` + Disk struct { + TotalBytes uint64 `json:"total_bytes"` + UsedBytes uint64 `json:"used_bytes"` + Percent int `json:"percent"` + Path string `json:"path"` + Supported bool `json:"supported"` + } `json:"disk"` + Load struct { + Load1 float64 `json:"load1"` + Load5 float64 `json:"load5"` + Load15 float64 `json:"load15"` + NumCPU int `json:"num_cpu"` + Ratio float64 `json:"ratio"` + Supported bool `json:"supported"` + } `json:"load"` + Alerts []jsonAlert `json:"alerts"` + MaxSeverity string `json:"max_severity"` + } + + out := jsonOutput{} + + out.Memory.TotalBytes = snap.Memory.TotalBytes + out.Memory.UsedBytes = snap.Memory.UsedBytes + out.Memory.Percent = PctOf(snap.Memory.UsedBytes, snap.Memory.TotalBytes) + out.Memory.Supported = snap.Memory.Supported + + out.Swap.TotalBytes = snap.Memory.SwapTotalBytes + out.Swap.UsedBytes = snap.Memory.SwapUsedBytes + out.Swap.Percent = PctOf(snap.Memory.SwapUsedBytes, snap.Memory.SwapTotalBytes) + out.Swap.Supported = snap.Memory.Supported + + out.Disk.TotalBytes = snap.Disk.TotalBytes + out.Disk.UsedBytes = snap.Disk.UsedBytes + out.Disk.Percent = PctOf(snap.Disk.UsedBytes, snap.Disk.TotalBytes) + 
out.Disk.Path = snap.Disk.Path + out.Disk.Supported = snap.Disk.Supported + + out.Load.Load1 = snap.Load.Load1 + out.Load.Load5 = snap.Load.Load5 + out.Load.Load15 = snap.Load.Load15 + out.Load.NumCPU = snap.Load.NumCPU + if snap.Load.NumCPU > 0 { + out.Load.Ratio = snap.Load.Load1 / float64(snap.Load.NumCPU) + } + out.Load.Supported = snap.Load.Supported + + out.Alerts = make([]jsonAlert, 0, len(alerts)) + for _, a := range alerts { + out.Alerts = append(out.Alerts, jsonAlert{ + Severity: a.Severity.String(), + Resource: a.Resource, + Message: a.Message, + }) + } + out.MaxSeverity = sysinfo.MaxSeverity(alerts).String() + + enc := json.NewEncoder(cmd.OutOrStdout()) + enc.SetIndent("", " ") + return enc.Encode(out) +} diff --git a/internal/cli/system/core/session_tokens.go b/internal/cli/system/core/session_tokens.go index c3f3b95f..dabb9124 100644 --- a/internal/cli/system/core/session_tokens.go +++ b/internal/cli/system/core/session_tokens.go @@ -15,7 +15,13 @@ import ( "path/filepath" "strings" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/claude" + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/config/file" + "github.com/ActiveMemory/ctx/internal/config/fs" + "github.com/ActiveMemory/ctx/internal/config/session" + "github.com/ActiveMemory/ctx/internal/config/stats" + "github.com/ActiveMemory/ctx/internal/config/token" "github.com/ActiveMemory/ctx/internal/rc" ) @@ -23,13 +29,6 @@ import ( // JSONL file when scanning for the last usage block. const MaxTailBytes = 32768 -// SessionTokenInfo holds token usage and model information extracted from a -// session's JSONL file. 
-type SessionTokenInfo struct { - Tokens int // Total input tokens (input + cache_creation + cache_read) - Model string // Model ID from the last assistant message, or "" -} - // ReadSessionTokenInfo finds the current session's JSONL file and returns // the most recent total input token count and model ID from the last // assistant message. Returns zero value if the file isn't found or has no @@ -42,7 +41,7 @@ type SessionTokenInfo struct { // - SessionTokenInfo: Token count and model from the last assistant message // - error: Non-nil only on unexpected I/O errors func ReadSessionTokenInfo(sessionID string) (SessionTokenInfo, error) { - if sessionID == "" || sessionID == SessionUnknown { + if sessionID == "" || sessionID == session.IDUnknown { return SessionTokenInfo{}, nil } @@ -68,7 +67,7 @@ func ReadSessionTokenInfo(sessionID string) (SessionTokenInfo, error) { // - error: Non-nil only on unexpected errors func FindJSONLPath(sessionID string) (string, error) { // Check cache first - cacheFile := filepath.Join(StateDir(), "jsonl-path-"+sessionID) + cacheFile := filepath.Join(StateDir(), stats.JsonlPathCachePrefix+sessionID) if data, readErr := os.ReadFile(cacheFile); readErr == nil { //nolint:gosec // state dir path cached := strings.TrimSpace(string(data)) if cached != "" { @@ -83,7 +82,7 @@ func FindJSONLPath(sessionID string) (string, error) { return "", nil } - pattern := filepath.Join(home, ".claude", "projects", "*", sessionID+".jsonl") + pattern := filepath.Join(home, dir.Claude, dir.Projects, "*", sessionID+file.ExtJSONL) matches, globErr := filepath.Glob(pattern) if globErr != nil { return "", globErr @@ -94,29 +93,10 @@ func FindJSONLPath(sessionID string) (string, error) { } // Cache the result for subsequent calls this session - _ = os.WriteFile(cacheFile, []byte(matches[0]), 0o600) + _ = os.WriteFile(cacheFile, []byte(matches[0]), fs.PermSecret) return matches[0], nil } -// usageData represents the minimal usage fields from a Claude Code JSONL -// 
assistant message. Only the fields needed for token counting are included. -type usageData struct { - InputTokens int `json:"input_tokens"` - CacheCreationInputTokens int `json:"cache_creation_input_tokens"` - CacheReadInputTokens int `json:"cache_read_input_tokens"` -} - -// jsonlMessage represents the minimal structure of a Claude Code JSONL line -// needed to extract usage and model data from assistant messages. -type jsonlMessage struct { - Type string `json:"type"` - Message struct { - Role string `json:"role"` - Model string `json:"model"` - Usage usageData `json:"usage"` - } `json:"message"` -} - // ParseLastUsageAndModel reads the tail of a JSONL file and extracts the // last assistant message's usage data and model ID. // @@ -155,7 +135,7 @@ func ParseLastUsageAndModel(path string) (SessionTokenInfo, error) { } // Scan lines in reverse for the last assistant message with usage - lines := bytes.Split(tail, []byte(config.NewlineLF)) + lines := bytes.Split(tail, []byte(token.NewlineLF)) for i := len(lines) - 1; i >= 0; i-- { line := bytes.TrimSpace(lines[i]) if len(line) == 0 { @@ -175,7 +155,7 @@ func ParseLastUsageAndModel(path string) (SessionTokenInfo, error) { continue } - if msg.Message.Role != "assistant" { + if msg.Message.Role != claude.RoleAssistant { continue } @@ -257,7 +237,7 @@ func ClaudeSettingsHas1M() bool { if homeErr != nil { return false } - data, readErr := os.ReadFile(filepath.Join(home, ".claude", "settings.json")) //nolint:gosec // user home config + data, readErr := os.ReadFile(filepath.Join(home, dir.Claude, claude.GlobalSettings)) //nolint:gosec // user home config if readErr != nil { return false } diff --git a/internal/cli/system/core/smb.go b/internal/cli/system/core/smb.go index 69a9c40f..7df80dfd 100644 --- a/internal/cli/system/core/smb.go +++ b/internal/cli/system/core/smb.go @@ -13,7 +13,8 @@ import ( "os/exec" "path/filepath" - "github.com/ActiveMemory/ctx/internal/config" + 
"github.com/ActiveMemory/ctx/internal/config/archive" + "github.com/ActiveMemory/ctx/internal/config/fs" ) // SMBConfig holds parsed SMB share connection details. @@ -51,7 +52,7 @@ func ParseSMBConfig(smbURL, subdir string) (*SMBConfig, error) { } if subdir == "" { - subdir = config.BackupDefaultSubdir + subdir = archive.BackupDefaultSubdir } gvfsPath := fmt.Sprintf("/run/user/%d/gvfs/smb-share:server=%s,share=%s", @@ -97,7 +98,7 @@ func EnsureSMBMount(cfg *SMBConfig) error { // - error: Non-nil on copy failure func CopyToSMB(cfg *SMBConfig, localPath string) error { dest := filepath.Join(cfg.GVFSPath, cfg.Subdir) - if mkdirErr := os.MkdirAll(dest, config.PermExec); mkdirErr != nil { + if mkdirErr := os.MkdirAll(dest, fs.PermExec); mkdirErr != nil { return fmt.Errorf("create destination dir: %w", mkdirErr) } @@ -107,7 +108,7 @@ func CopyToSMB(cfg *SMBConfig, localPath string) error { } destFile := filepath.Join(dest, filepath.Base(localPath)) - if writeErr := os.WriteFile(destFile, data, config.PermFile); writeErr != nil { + if writeErr := os.WriteFile(destFile, data, fs.PermFile); writeErr != nil { return fmt.Errorf("write to SMB: %w", writeErr) } diff --git a/internal/cli/system/core/state.go b/internal/cli/system/core/state.go index 31a665d2..18d6dc2c 100644 --- a/internal/cli/system/core/state.go +++ b/internal/cli/system/core/state.go @@ -15,20 +15,13 @@ import ( "strings" "time" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/config/event" + "github.com/ActiveMemory/ctx/internal/config/session" + ctxcontext "github.com/ActiveMemory/ctx/internal/context" "github.com/ActiveMemory/ctx/internal/rc" ) -// ResolvedJournalDir returns the path to the journal directory within the -// configured context directory. Uses rc.ContextDir() so it respects .ctxrc -// and CLI overrides. 
-// -// Returns: -// - string: Absolute path to the journal directory -func ResolvedJournalDir() string { - return filepath.Join(rc.ContextDir(), config.DirJournal) -} - // StateDir returns the project-scoped runtime state directory // (.context/state/). Ensures the directory exists on each call — MkdirAll // is a no-op when the directory is already present. @@ -36,9 +29,9 @@ func ResolvedJournalDir() string { // Returns: // - string: Absolute path to the state directory func StateDir() string { - dir := filepath.Join(rc.ContextDir(), config.DirState) - _ = os.MkdirAll(dir, 0o750) - return dir + d := filepath.Join(rc.ContextDir(), dir.State) + _ = os.MkdirAll(d, 0o750) + return d } // ReadCounter reads an integer counter from a file. Returns 0 if the file @@ -71,7 +64,7 @@ func WriteCounter(path string, n int) { } // LogMessage appends a timestamped log line to the given file. -// Rotates the log when it exceeds config.LogMaxBytes, keeping one +// Rotates the log when it exceeds event.HookLogMaxBytes, keeping one // previous generation (.1 suffix) — same pattern as eventlog. // // Parameters: @@ -101,7 +94,7 @@ func LogMessage(logFile, sessionID, msg string) { } // RotateLog checks the log file size and rotates if it exceeds -// config.LogMaxBytes. The previous generation is replaced. +// event.HookLogMaxBytes. The previous generation is replaced. // // Parameters: // - logFile: Absolute path to the log file @@ -110,7 +103,7 @@ func RotateLog(logFile string) { if statErr != nil { return } - if info.Size() < int64(config.LogMaxBytes) { + if info.Size() < int64(event.HookLogMaxBytes) { return } prev := logFile + ".1" @@ -144,14 +137,14 @@ func TouchFile(path string) { _ = os.WriteFile(path, nil, 0o600) } -// IsInitialized reports whether the context directory has been properly set up +// Initialized reports whether the context directory has been properly set up // via "ctx init". Hooks should no-op when this returns false to avoid // creating partial state (e.g.
logs/) before initialization. // // Returns: // - bool: True if context directory is initialized -func IsInitialized() bool { - return config.Initialized(rc.ContextDir()) +func Initialized() bool { + return ctxcontext.Initialized(rc.ContextDir()) } // PauseMarkerPath returns the path to the session pause marker file. @@ -262,24 +255,11 @@ func WriteSessionStats(sessionID string, stats SessionStats) { // - stdin: File to read input from // // Returns: -// - string: Session ID or SessionUnknown +// - string: Session ID or session.IDUnknown func ReadSessionID(stdin *os.File) string { input := ReadInput(stdin) if input.SessionID == "" { - return SessionUnknown + return session.IDUnknown } return input.SessionID } - -// ContextDirLine returns a one-line context directory identifier. -// Returns empty string if directory cannot be resolved (callers omit footer). -// -// Returns: -// - string: "Context: " or empty string -func ContextDirLine() string { - dir := rc.ContextDir() - if dir == "" { - return "" - } - return "Context: " + dir -} diff --git a/internal/cli/system/core/stats.go b/internal/cli/system/core/stats.go new file mode 100644 index 00000000..8146dd4c --- /dev/null +++ b/internal/cli/system/core/stats.go @@ -0,0 +1,309 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0 + +package core + +import ( + "encoding/json" + "fmt" + "os" + "path/filepath" + "sort" + "strings" + "time" + + "github.com/ActiveMemory/ctx/internal/config/file" + "github.com/ActiveMemory/ctx/internal/config/journal" + "github.com/ActiveMemory/ctx/internal/config/stats" + time2 "github.com/ActiveMemory/ctx/internal/config/time" + "github.com/ActiveMemory/ctx/internal/config/token" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" + ctxerr "github.com/ActiveMemory/ctx/internal/err" +) + +// ReadStatsDir reads all stats JSONL files, optionally filtered by session prefix. +// +// Parameters: +// - dir: path to the state directory +// - sessionFilter: session ID prefix to filter by (empty for all) +// +// Returns: +// - []StatsEntry: sorted stats entries +// - error: non-nil on glob failure +func ReadStatsDir(dir, sessionFilter string) ([]StatsEntry, error) { + pattern := filepath.Join(dir, stats.FilePrefix+"*"+file.ExtJSONL) + matches, globErr := filepath.Glob(pattern) + if globErr != nil { + return nil, ctxerr.StatsGlob(globErr) + } + + var entries []StatsEntry + for _, path := range matches { + sid := ExtractStatsSessionID(filepath.Base(path)) + if sessionFilter != "" && !strings.HasPrefix(sid, sessionFilter) { + continue + } + fileEntries, parseErr := ParseStatsFile(path, sid) + if parseErr != nil { + continue + } + entries = append(entries, fileEntries...) + } + + sort.Slice(entries, func(i, j int) bool { + ti, ei := time.Parse(time.RFC3339, entries[i].Timestamp) + tj, ej := time.Parse(time.RFC3339, entries[j].Timestamp) + if ei != nil || ej != nil { + return entries[i].Timestamp < entries[j].Timestamp + } + return ti.Before(tj) + }) + + return entries, nil +} + +// ExtractStatsSessionID gets the session ID from a filename like +// "stats-abc123.jsonl". 
+// +// Parameters: +// - basename: file basename +// +// Returns: +// - string: session ID +func ExtractStatsSessionID(basename string) string { + s := strings.TrimPrefix(basename, stats.FilePrefix) + return strings.TrimSuffix(s, file.ExtJSONL) +} + +// ParseStatsFile reads all JSONL lines from a stats file. +// +// Parameters: +// - path: absolute path to the stats file +// - sid: session ID for this file +// +// Returns: +// - []StatsEntry: parsed entries +// - error: non-nil on read failure +func ParseStatsFile(path, sid string) ([]StatsEntry, error) { + data, readErr := os.ReadFile(path) //nolint:gosec // project-local state path + if readErr != nil { + return nil, readErr + } + + var entries []StatsEntry + for _, line := range strings.Split(strings.TrimSpace(string(data)), token.NewlineLF) { + if line == "" { + continue + } + var s SessionStats + if jsonErr := json.Unmarshal([]byte(line), &s); jsonErr != nil { + continue + } + entries = append(entries, StatsEntry{SessionStats: s, Session: sid}) + } + return entries, nil +} + +// DumpStats outputs the last N entries in either JSON or human-readable format. +// +// Parameters: +// - cmd: Cobra command for output +// - entries: stats entries to display +// - last: number of entries to show (0 for all) +// - jsonOut: whether to output as JSONL +// +// Returns: +// - error: non-nil on output failure +func DumpStats(cmd *cobra.Command, entries []StatsEntry, last int, jsonOut bool) error { + if len(entries) == 0 { + cmd.Println(assets.TextDesc(assets.TextDescKeyStatsEmpty)) + return nil + } + + // Tail: take last N entries. + if last > 0 && len(entries) > last { + entries = entries[len(entries)-last:] + } + + if jsonOut { + return OutputStatsJSON(cmd, entries) + } + + PrintStatsHeader(cmd) + for i := range entries { + PrintStatsLine(cmd, &entries[i]) + } + return nil +} + +// OutputStatsJSON writes entries as raw JSONL. 
+// +// Parameters: +// - cmd: Cobra command for output +// - entries: stats entries to serialize +// +// Returns: +// - error: Always nil (marshal errors are silently skipped) +func OutputStatsJSON(cmd *cobra.Command, entries []StatsEntry) error { + for _, e := range entries { + line, marshalErr := json.Marshal(e) + if marshalErr != nil { + continue + } + cmd.Println(string(line)) + } + return nil +} + +// PrintStatsHeader prints the column header for human output. +// +// Parameters: +// - cmd: Cobra command for output +func PrintStatsHeader(cmd *cobra.Command) { + fmtStr := assets.TextDesc(assets.TextDescKeyStatsHeaderFormat) + cmd.Println(fmt.Sprintf(fmtStr, + stats.HeaderTime, stats.HeaderSession, + stats.HeaderPrompt, stats.HeaderTokens, + stats.HeaderPct, stats.HeaderEvent)) + cmd.Println(fmt.Sprintf(fmtStr, + stats.SepTime, stats.SepSession, + stats.SepPrompt, stats.SepTokens, + stats.SepPct, stats.SepEvent)) +} + +// PrintStatsLine prints a single stats entry in human-readable format. +// +// Parameters: +// - cmd: Cobra command for output +// - e: stats entry to print +func PrintStatsLine(cmd *cobra.Command, e *StatsEntry) { + ts := FormatStatsTimestamp(e.Timestamp) + sid := e.Session + if len(sid) > journal.SessionIDShortLen { + sid = sid[:journal.SessionIDShortLen] + } + tokens := FormatTokenCount(e.Tokens) + cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyStatsLineFormat), + ts, sid, e.Prompt, tokens, e.Pct, e.Event)) +} + +// FormatStatsTimestamp converts an RFC3339 timestamp to local time display +// using the DateTimePreciseFormat layout. 
+// +// Parameters: +// - ts: RFC3339-formatted timestamp string +// +// Returns: +// - string: local time formatted as "2006-01-02 15:04:05", or the +// original string on parse failure +func FormatStatsTimestamp(ts string) string { + t, parseErr := time.Parse(time.RFC3339, ts) + if parseErr != nil { + return ts + } + return t.Local().Format(time2.DateTimePreciseFormat) +} + +// ReadNewLines reads up to stats.ReadBufSize bytes from offset and parses JSONL lines. +// +// Parameters: +// - path: absolute path to the stats file +// - offset: byte offset to start reading from +// - sid: session ID for this file +// +// Returns: +// - []StatsEntry: newly parsed entries +func ReadNewLines(path string, offset int64, sid string) []StatsEntry { + f, openErr := os.Open(path) //nolint:gosec // project-local state path + if openErr != nil { + return nil + } + defer func() { _ = f.Close() }() + + if _, seekErr := f.Seek(offset, 0); seekErr != nil { + return nil + } + + buf := make([]byte, stats.ReadBufSize) + n, readErr := f.Read(buf) + if readErr != nil || n == 0 { + return nil + } + + var entries []StatsEntry + for _, line := range strings.Split(strings.TrimSpace(string(buf[:n])), token.NewlineLF) { + if line == "" { + continue + } + var s SessionStats + if jsonErr := json.Unmarshal([]byte(line), &s); jsonErr != nil { + continue + } + entries = append(entries, StatsEntry{SessionStats: s, Session: sid}) + } + return entries +} + +// StreamStats polls for new JSONL lines and prints them as they arrive. +// +// Parameters: +// - cmd: Cobra command for output +// - dir: path to the state directory +// - sessionFilter: session ID prefix to filter by (empty for all) +// - jsonOut: whether to output as JSONL +// +// Returns: +// - error: Always nil +func StreamStats(cmd *cobra.Command, dir, sessionFilter string, jsonOut bool) error { + // Track file sizes to detect new content.
+ offsets := make(map[string]int64) + matches, _ := filepath.Glob(filepath.Join(dir, stats.FilePrefix+"*"+file.ExtJSONL)) + for _, path := range matches { + info, statErr := os.Stat(path) + if statErr == nil { + offsets[path] = info.Size() + } + } + + ticker := time.NewTicker(time.Second) + defer ticker.Stop() + + for range ticker.C { + matches, _ = filepath.Glob(filepath.Join(dir, stats.FilePrefix+"*"+file.ExtJSONL)) + for _, path := range matches { + sid := ExtractStatsSessionID(filepath.Base(path)) + if sessionFilter != "" && !strings.HasPrefix(sid, sessionFilter) { + continue + } + + info, statErr := os.Stat(path) + if statErr != nil { + continue + } + prev := offsets[path] + if info.Size() <= prev { + continue + } + + newEntries := ReadNewLines(path, prev, sid) + for i := range newEntries { + if jsonOut { + line, marshalErr := json.Marshal(newEntries[i]) + if marshalErr == nil { + cmd.Println(string(line)) + } + } else { + PrintStatsLine(cmd, &newEntries[i]) + } + } + offsets[path] = info.Size() + } + } + + return nil +} diff --git a/internal/cli/system/core/types.go b/internal/cli/system/core/types.go new file mode 100644 index 00000000..203d6112 --- /dev/null +++ b/internal/cli/system/core/types.go @@ -0,0 +1,133 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package core + +// ArchiveEntry describes a directory or file to include in a backup archive. +type ArchiveEntry struct { + // SourcePath is the absolute path to the directory or file. + SourcePath string + // Prefix is the path prefix inside the tar archive. + Prefix string + // ExcludeDir is a directory name to skip (e.g. "journal-site"). + ExcludeDir string + // Optional means a missing source is not an error. + Optional bool +} + +// BackupResult holds the outcome of a single archive creation. 
+type BackupResult struct { + Scope string `json:"scope"` + Archive string `json:"archive"` + Size int64 `json:"size"` + SMBDest string `json:"smb_dest,omitempty"` +} + +// BlockResponse is the JSON output for blocked commands. +type BlockResponse struct { + Decision string `json:"decision"` + Reason string `json:"reason"` +} + +// HookInput represents the JSON payload that Claude Code sends to hook +// commands via stdin. +type HookInput struct { + SessionID string `json:"session_id"` + ToolInput ToolInput `json:"tool_input"` +} + +// ToolInput contains the tool-specific fields from a Claude Code hook +// invocation. For Bash hooks, Command holds the shell command. +type ToolInput struct { + Command string `json:"command"` +} + +// HookResponse is the JSON output format for Claude Code hooks. +// Using structured JSON ensures the agent processes the output as a directive +// rather than treating it as ignorable plain text. +type HookResponse struct { + HookSpecificOutput *HookSpecificOutput `json:"hookSpecificOutput,omitempty"` +} + +// HookSpecificOutput carries event-specific fields inside a HookResponse. +type HookSpecificOutput struct { + HookEventName string `json:"hookEventName"` + AdditionalContext string `json:"additionalContext,omitempty"` +} + +// FileTokenEntry tracks per-file token counts during context injection. +type FileTokenEntry struct { + Name string + Tokens int +} + +// StatsEntry is a SessionStats with the source file for display. +type StatsEntry struct { + SessionStats + Session string `json:"session"` +} + +// SessionTokenInfo holds token usage and model information extracted from a +// session's JSONL file. +type SessionTokenInfo struct { + Tokens int // Total input tokens (input + cache_creation + cache_read) + Model string // Model ID from the last assistant message, or "" +} + +// usageData represents the minimal usage fields from a Claude Code JSONL +// assistant message. Only the fields needed for token counting are included. 
+type usageData struct { + InputTokens int `json:"input_tokens"` + CacheCreationInputTokens int `json:"cache_creation_input_tokens"` + CacheReadInputTokens int `json:"cache_read_input_tokens"` +} + +// jsonlMessage represents the minimal structure of a Claude Code JSONL line +// needed to extract usage and model data from assistant messages. +type jsonlMessage struct { + Type string `json:"type"` + Message struct { + Role string `json:"role"` + Model string `json:"model"` + Usage usageData `json:"usage"` + } `json:"message"` +} + +// PersistenceState holds the counter state for persistence nudging. +type PersistenceState struct { + Count int + LastNudge int + LastMtime int64 +} + +// MessageListEntry holds the data for a single row in the message list output. +type MessageListEntry struct { + Hook string `json:"hook"` + Variant string `json:"variant"` + Category string `json:"category"` + Description string `json:"description"` + TemplateVars []string `json:"template_vars"` + HasOverride bool `json:"has_override"` +} + +// MapTrackingInfo holds the minimal fields needed from map-tracking.json. +type MapTrackingInfo struct { + OptedOut bool `json:"opted_out"` + LastRun string `json:"last_run"` +} + +// KnowledgeFinding describes a single knowledge file that exceeds its +// configured threshold. +type KnowledgeFinding struct { + // File is the context filename (e.g., DECISIONS.md). + File string + // Count is the actual entry or line count. + Count int + // Threshold is the configured maximum. + Threshold int + // Unit is the measurement unit ("entries" or "lines"). + Unit string +} diff --git a/internal/cli/system/core/version.go b/internal/cli/system/core/version.go new file mode 100644 index 00000000..ad5a3095 --- /dev/null +++ b/internal/cli/system/core/version.go @@ -0,0 +1,90 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +package core + +import ( + "fmt" + "os" + "strings" + "time" + + "github.com/ActiveMemory/ctx/internal/config/hook" + "github.com/ActiveMemory/ctx/internal/config/token" + "github.com/ActiveMemory/ctx/internal/config/tpl" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/crypto" + "github.com/ActiveMemory/ctx/internal/notify" + "github.com/ActiveMemory/ctx/internal/rc" +) + +// ParseMajorMinor extracts major and minor version numbers from a semver +// string like "1.2.3". Returns ok=false for unparseable versions. +// +// Parameters: +// - ver: version string in semver format +// +// Returns: +// - major: major version number +// - minor: minor version number +// - ok: true if parsing succeeded +func ParseMajorMinor(ver string) (major, minor int, ok bool) { + parts := strings.SplitN(ver, ".", 3) + if len(parts) < 2 { + return 0, 0, false + } + var m, n int + if _, scanErr := fmt.Sscanf(parts[0], "%d", &m); scanErr != nil { + return 0, 0, false + } + if _, scanErr := fmt.Sscanf(parts[1], "%d", &n); scanErr != nil { + return 0, 0, false + } + return m, n, true +} + +// CheckKeyAge emits a nudge when the encryption key is older than the +// configured rotation threshold. 
+// +// Parameters: +// - cmd: Cobra command for output +// - sessionID: current session identifier +func CheckKeyAge(cmd *cobra.Command, sessionID string) { + crypto.MigrateKeyFile(rc.ContextDir()) + kp := rc.KeyPath() + info, statErr := os.Stat(kp) + if statErr != nil { + return // no key — nothing to check + } + + ageDays := int(time.Since(info.ModTime()).Hours() / 24) + threshold := rc.KeyRotationDays() + + if ageDays < threshold { + return + } + + keyFallback := fmt.Sprintf( + assets.TextDesc(assets.TextDescKeyCheckVersionKeyFallback), ageDays, + ) + keyContent := LoadMessage(hook.CheckVersion, hook.VariantKeyRotation, + map[string]any{tpl.VarKeyAgeDays: ageDays}, keyFallback) + if keyContent == "" { + return + } + + boxTitle := assets.TextDesc(assets.TextDescKeyCheckVersionKeyBoxTitle) + relayPrefix := assets.TextDesc(assets.TextDescKeyCheckVersionKeyRelayPrefix) + + cmd.Println(token.NewlineLF + NudgeBox(relayPrefix, boxTitle, keyContent)) + + keyRef := notify.NewTemplateRef(hook.CheckVersion, hook.VariantKeyRotation, + map[string]any{tpl.VarKeyAgeDays: ageDays}) + keyNotifyMsg := hook.CheckVersion + ": " + fmt.Sprintf(assets.TextDesc(assets.TextDescKeyCheckVersionKeyRelayFormat), ageDays) + NudgeAndRelay(keyNotifyMsg, sessionID, keyRef) +} diff --git a/internal/cli/system/core/version_drift.go b/internal/cli/system/core/version_drift.go index 691780f9..db603f72 100644 --- a/internal/cli/system/core/version_drift.go +++ b/internal/cli/system/core/version_drift.go @@ -12,10 +12,10 @@ import ( "path/filepath" "strings" + "github.com/ActiveMemory/ctx/internal/config/hook" "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/assets" - "github.com/ActiveMemory/ctx/internal/eventlog" "github.com/ActiveMemory/ctx/internal/notify" ) @@ -52,15 +52,14 @@ func CheckVersionDrift(cmd *cobra.Command, sessionID string) { } fallback := "VERSION (" + fileVer + "), plugin.json (" + pluginVer + "), marketplace.json (" + marketVer + ") are out of sync. 
Update all three before releasing." - msg := LoadMessage("version-drift", "nudge", vars, fallback) + msg := LoadMessage(hook.VersionDrift, hook.VariantNudge, vars, fallback) if msg == "" { return } - PrintHookContext(cmd, "PostToolUse", msg) + PrintHookContext(cmd, hook.EventPostToolUse, msg) - ref := notify.NewTemplateRef("version-drift", "nudge", vars) - _ = notify.Send("relay", "version-drift: versions out of sync", sessionID, ref) - eventlog.Append("relay", "version-drift: versions out of sync", sessionID, ref) + ref := notify.NewTemplateRef(hook.VersionDrift, hook.VariantNudge, vars) + Relay(hook.VersionDrift+": "+assets.TextDesc(assets.TextDescKeyVersionDriftRelayMessage), sessionID, ref) } // ReadVersionFile reads and trims the VERSION file from the project root. diff --git a/internal/cli/system/core/wrapup.go b/internal/cli/system/core/wrapup.go index acb3bb3f..1141aa8a 100644 --- a/internal/cli/system/core/wrapup.go +++ b/internal/cli/system/core/wrapup.go @@ -10,10 +10,9 @@ import ( "os" "path/filepath" "time" -) -// WrappedUpMarker is the filename for the wrap-up suppression marker. -const WrappedUpMarker = "ctx-wrapped-up" + "github.com/ActiveMemory/ctx/internal/config/wrap" +) // WrappedUpExpiry is how long the marker suppresses nudges. 
const WrappedUpExpiry = 2 * time.Hour @@ -26,7 +25,7 @@ const WrappedUpExpiry = 2 * time.Hour // Returns: // - bool: True if wrap-up marker is fresh func WrappedUpRecently() bool { - markerPath := filepath.Join(StateDir(), WrappedUpMarker) + markerPath := filepath.Join(StateDir(), wrap.WrappedUpMarker) info, statErr := os.Stat(markerPath) if statErr != nil { diff --git a/internal/cli/system/system.go b/internal/cli/system/system.go index 6bc058ef..89ddeeda 100644 --- a/internal/cli/system/system.go +++ b/internal/cli/system/system.go @@ -9,35 +9,36 @@ package system import ( "github.com/spf13/cobra" + "github.com/ActiveMemory/ctx/internal/assets" "github.com/ActiveMemory/ctx/internal/cli/system/cmd/backup" - "github.com/ActiveMemory/ctx/internal/cli/system/cmd/blockdangerouscommands" - "github.com/ActiveMemory/ctx/internal/cli/system/cmd/blocknonpathctx" + "github.com/ActiveMemory/ctx/internal/cli/system/cmd/block_dangerous_commands" + "github.com/ActiveMemory/ctx/internal/cli/system/cmd/block_non_path_ctx" "github.com/ActiveMemory/ctx/internal/cli/system/cmd/bootstrap" - "github.com/ActiveMemory/ctx/internal/cli/system/cmd/checkbackupage" - "github.com/ActiveMemory/ctx/internal/cli/system/cmd/checkceremonies" - "github.com/ActiveMemory/ctx/internal/cli/system/cmd/checkcontextsize" - "github.com/ActiveMemory/ctx/internal/cli/system/cmd/checkjournal" - "github.com/ActiveMemory/ctx/internal/cli/system/cmd/checkknowledge" - "github.com/ActiveMemory/ctx/internal/cli/system/cmd/checkmapstaleness" - "github.com/ActiveMemory/ctx/internal/cli/system/cmd/checkmemorydrift" - "github.com/ActiveMemory/ctx/internal/cli/system/cmd/checkpersistence" - "github.com/ActiveMemory/ctx/internal/cli/system/cmd/checkreminders" - "github.com/ActiveMemory/ctx/internal/cli/system/cmd/checkresources" - "github.com/ActiveMemory/ctx/internal/cli/system/cmd/checktaskcompletion" - "github.com/ActiveMemory/ctx/internal/cli/system/cmd/checkversion" - 
"github.com/ActiveMemory/ctx/internal/cli/system/cmd/contextloadgate" + "github.com/ActiveMemory/ctx/internal/cli/system/cmd/check_backup_age" + "github.com/ActiveMemory/ctx/internal/cli/system/cmd/check_ceremonies" + "github.com/ActiveMemory/ctx/internal/cli/system/cmd/check_context_size" + "github.com/ActiveMemory/ctx/internal/cli/system/cmd/check_journal" + "github.com/ActiveMemory/ctx/internal/cli/system/cmd/check_knowledge" + "github.com/ActiveMemory/ctx/internal/cli/system/cmd/check_map_staleness" + "github.com/ActiveMemory/ctx/internal/cli/system/cmd/check_memory_drift" + "github.com/ActiveMemory/ctx/internal/cli/system/cmd/check_persistence" + "github.com/ActiveMemory/ctx/internal/cli/system/cmd/check_reminders" + "github.com/ActiveMemory/ctx/internal/cli/system/cmd/check_resources" + "github.com/ActiveMemory/ctx/internal/cli/system/cmd/check_task_completion" + "github.com/ActiveMemory/ctx/internal/cli/system/cmd/check_version" + "github.com/ActiveMemory/ctx/internal/cli/system/cmd/context_load_gate" "github.com/ActiveMemory/ctx/internal/cli/system/cmd/events" "github.com/ActiveMemory/ctx/internal/cli/system/cmd/heartbeat" - "github.com/ActiveMemory/ctx/internal/cli/system/cmd/markjournal" - "github.com/ActiveMemory/ctx/internal/cli/system/cmd/markwrappedup" + "github.com/ActiveMemory/ctx/internal/cli/system/cmd/mark_journal" + "github.com/ActiveMemory/ctx/internal/cli/system/cmd/mark_wrapped_up" "github.com/ActiveMemory/ctx/internal/cli/system/cmd/message" "github.com/ActiveMemory/ctx/internal/cli/system/cmd/pause" - "github.com/ActiveMemory/ctx/internal/cli/system/cmd/postcommit" + "github.com/ActiveMemory/ctx/internal/cli/system/cmd/post_commit" "github.com/ActiveMemory/ctx/internal/cli/system/cmd/prune" - "github.com/ActiveMemory/ctx/internal/cli/system/cmd/qareminder" + "github.com/ActiveMemory/ctx/internal/cli/system/cmd/qa_reminder" "github.com/ActiveMemory/ctx/internal/cli/system/cmd/resources" 
"github.com/ActiveMemory/ctx/internal/cli/system/cmd/resume" - "github.com/ActiveMemory/ctx/internal/cli/system/cmd/specsnudge" + "github.com/ActiveMemory/ctx/internal/cli/system/cmd/specs_nudge" "github.com/ActiveMemory/ctx/internal/cli/system/cmd/stats" ) @@ -57,80 +58,45 @@ import ( // Returns: // - *cobra.Command: Parent command with resource display, plumbing, and hook subcommands func Cmd() *cobra.Command { + short, long := assets.CommandDesc(assets.CmdDescKeySystem) + cmd := &cobra.Command{ Use: "system", - Short: "System diagnostics and hook commands", - Long: `System diagnostics and hook commands. - -Subcommands: - backup Backup context and Claude data - resources Show system resource usage (memory, swap, disk, load) - bootstrap Print context location for AI agents - message Manage hook message templates (list/show/edit/reset) - - stats Show session token usage stats - -Plumbing subcommands (used by skills and automation): - mark-journal Update journal processing state - mark-wrapped-up Suppress checkpoint nudges after wrap-up - pause Pause context hooks for this session - resume Resume context hooks for this session - prune Clean stale per-session state files - events Query the local hook event log - -Hook subcommands (Claude Code plugin — safe to run manually): - context-load-gate Context file read directive (PreToolUse) - check-context-size Context size checkpoint - check-ceremonies Session ceremony adoption nudge - check-persistence Context persistence nudge - check-journal Journal maintenance reminder - check-resources Resource pressure warning (DANGER only) - check-knowledge Knowledge file growth nudge - check-reminders Pending reminders relay - check-version Version update nudge - check-map-staleness Architecture map staleness nudge - block-non-path-ctx Block non-PATH ctx invocations - block-dangerous-commands Block dangerous command patterns (project-local) - check-backup-age Backup staleness check (project-local) - check-task-completion Task 
completion nudge after edits - post-commit Post-commit context capture nudge - qa-reminder QA reminder before completion - specs-nudge Plan-to-specs directory nudge (PreToolUse) - check-memory-drift Memory drift nudge (MEMORY.md changed) - heartbeat Session heartbeat webhook (no stdout)`, + Short: short, + Long: long, } cmd.AddCommand( backup.Cmd(), - resources.Cmd(), - stats.Cmd(), + block_dangerous_commands.Cmd(), + block_non_path_ctx.Cmd(), bootstrap.Cmd(), + check_backup_age.Cmd(), + check_ceremonies.Cmd(), + check_context_size.Cmd(), + check_journal.Cmd(), + check_knowledge.Cmd(), + check_map_staleness.Cmd(), + check_memory_drift.Cmd(), + check_persistence.Cmd(), + check_reminders.Cmd(), + check_resources.Cmd(), + check_task_completion.Cmd(), + check_version.Cmd(), + context_load_gate.Cmd(), + events.Cmd(), + heartbeat.Cmd(), + mark_journal.Cmd(), + mark_wrapped_up.Cmd(), message.Cmd(), - markjournal.Cmd(), - markwrappedup.Cmd(), pause.Cmd(), - resume.Cmd(), + post_commit.Cmd(), prune.Cmd(), - events.Cmd(), - contextloadgate.Cmd(), - checkcontextsize.Cmd(), - checkpersistence.Cmd(), - checkjournal.Cmd(), - checkceremonies.Cmd(), - checkreminders.Cmd(), - checkversion.Cmd(), - blocknonpathctx.Cmd(), - checktaskcompletion.Cmd(), - postcommit.Cmd(), - qareminder.Cmd(), - checkresources.Cmd(), - checkknowledge.Cmd(), - checkmapstaleness.Cmd(), - blockdangerouscommands.Cmd(), - checkbackupage.Cmd(), - specsnudge.Cmd(), - checkmemorydrift.Cmd(), - heartbeat.Cmd(), + qa_reminder.Cmd(), + resources.Cmd(), + resume.Cmd(), + specs_nudge.Cmd(), + stats.Cmd(), ) return cmd diff --git a/internal/cli/task/cmd/archive/cmd.go b/internal/cli/task/cmd/archive/cmd.go index 5258dc79..4d6bb1da 100644 --- a/internal/cli/task/cmd/archive/cmd.go +++ b/internal/cli/task/cmd/archive/cmd.go @@ -26,7 +26,7 @@ import ( func Cmd() *cobra.Command { var dryRun bool - short, long := assets.CommandDesc("task.archive") + short, long := assets.CommandDesc(assets.CmdDescKeyTaskArchive) cmd := 
&cobra.Command{ Use: "archive", @@ -41,7 +41,7 @@ func Cmd() *cobra.Command { &dryRun, "dry-run", false, - assets.FlagDesc("task.archive.dry-run"), + assets.FlagDesc(assets.FlagDescKeyTaskArchiveDryRun), ) return cmd diff --git a/internal/cli/task/cmd/archive/doc.go b/internal/cli/task/cmd/archive/doc.go new file mode 100644 index 00000000..b696eb9a --- /dev/null +++ b/internal/cli/task/cmd/archive/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package archive implements the ctx task archive subcommand. +// +// It moves completed tasks from TASKS.md to a timestamped archive file, +// leaving pending tasks in place. +package archive diff --git a/internal/cli/task/cmd/archive/run.go b/internal/cli/task/cmd/archive/run.go index 28705032..6518ec5a 100644 --- a/internal/cli/task/cmd/archive/run.go +++ b/internal/cli/task/cmd/archive/run.go @@ -7,16 +7,19 @@ package archive import ( - "fmt" "os" - "path/filepath" "strings" + "github.com/ActiveMemory/ctx/internal/config/archive" + "github.com/ActiveMemory/ctx/internal/config/fs" + "github.com/ActiveMemory/ctx/internal/config/token" "github.com/spf13/cobra" - "github.com/ActiveMemory/ctx/internal/cli/compact" + "github.com/ActiveMemory/ctx/internal/assets" + compactcore "github.com/ActiveMemory/ctx/internal/cli/compact/core" "github.com/ActiveMemory/ctx/internal/cli/task/core" - "github.com/ActiveMemory/ctx/internal/config" + ctxerr "github.com/ActiveMemory/ctx/internal/err" + "github.com/ActiveMemory/ctx/internal/write" ) // runArchive executes the archive subcommand logic. 
@@ -33,36 +36,33 @@ import ( // - error: Non-nil if TASKS.md doesn't exist or file operations fail func runArchive(cmd *cobra.Command, dryRun bool) error { tasksPath := core.TasksFilePath() - nl := config.NewlineLF + nl := token.NewlineLF // Check if TASKS.md exists if _, statErr := os.Stat(tasksPath); os.IsNotExist(statErr) { - return fmt.Errorf("no TASKS.md found") + return ctxerr.TaskFileNotFound() } // Read TASKS.md - content, readErr := os.ReadFile(filepath.Clean(tasksPath)) + content, readErr := os.ReadFile(tasksPath) //nolint:gosec // project-local context path if readErr != nil { - return fmt.Errorf("failed to read TASKS.md: %w", readErr) + return ctxerr.TaskFileRead(readErr) } lines := strings.Split(string(content), nl) // Parse task blocks using block-based parsing - blocks := compact.ParseTaskBlocks(lines) + blocks := compactcore.ParseTaskBlocks(lines) // Filter to only archivable blocks (completed with no incomplete children) - var archivableBlocks []compact.TaskBlock + var archivableBlocks []compactcore.TaskBlock var skippedCount int for _, block := range blocks { if block.IsArchivable { archivableBlocks = append(archivableBlocks, block) } else { skippedCount++ - cmd.Println(fmt.Sprintf( - "! 
Skipping (has incomplete children): %s", - block.ParentTaskText(), - )) + write.ArchiveSkipping(cmd, block.ParentTaskText()) } } @@ -71,12 +71,9 @@ func runArchive(cmd *cobra.Command, dryRun bool) error { if len(archivableBlocks) == 0 { if skippedCount > 0 { - cmd.Println(fmt.Sprintf( - "No tasks to archive (%d skipped due to incomplete children).", - skippedCount, - )) + write.ArchiveSkipIncomplete(cmd, skippedCount) } else { - cmd.Println("No completed tasks to archive.") + write.ArchiveNoCompleted(cmd) } return nil } @@ -89,42 +86,28 @@ func runArchive(cmd *cobra.Command, dryRun bool) error { } if dryRun { - cmd.Println("Dry run - no files modified") - cmd.Println() - cmd.Println(fmt.Sprintf( - "Would archive %d completed tasks (keeping %d pending)", - len(archivableBlocks), pendingCount, - )) - cmd.Println() - cmd.Println("Archived content preview:") - cmd.Println(config.Separator) - cmd.Print(archivedContent.String()) - cmd.Println(config.Separator) + write.ArchiveDryRun(cmd, len(archivableBlocks), pendingCount, + archivedContent.String(), token.Separator) return nil } // Write to archive - archiveFilePath, writeErr := compact.WriteArchive("tasks", config.HeadingArchivedTasks, archivedContent.String()) + archiveFilePath, writeErr := compactcore.WriteArchive(archive.ArchiveScopeTasks, assets.HeadingArchivedTasks, archivedContent.String()) if writeErr != nil { return writeErr } // Remove archived blocks from lines and write back - newLines := compact.RemoveBlocksFromLines(lines, archivableBlocks) + newLines := compactcore.RemoveBlocksFromLines(lines, archivableBlocks) newContent := strings.Join(newLines, nl) if updateErr := os.WriteFile( - tasksPath, []byte(newContent), config.PermFile, + tasksPath, []byte(newContent), fs.PermFile, ); updateErr != nil { - return fmt.Errorf("failed to update TASKS.md: %w", updateErr) + return ctxerr.TaskFileWrite(updateErr) } - cmd.Println(fmt.Sprintf( - "✓ Archived %d completed tasks to %s", - len(archivableBlocks), - 
archiveFilePath, - )) - cmd.Println(fmt.Sprintf(" %d pending tasks remain in TASKS.md", pendingCount)) + write.ArchiveSuccess(cmd, len(archivableBlocks), archiveFilePath, pendingCount) return nil } diff --git a/internal/cli/complete/complete.go b/internal/cli/task/cmd/complete/cmd.go similarity index 73% rename from internal/cli/complete/complete.go rename to internal/cli/task/cmd/complete/cmd.go index 078d7013..b54de494 100644 --- a/internal/cli/complete/complete.go +++ b/internal/cli/task/cmd/complete/cmd.go @@ -10,12 +10,8 @@ import ( "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/assets" - completeroot "github.com/ActiveMemory/ctx/internal/cli/complete/cmd/root" ) -// CompleteTask finds a task and marks it complete. Re-exported from cmd/root. -var CompleteTask = completeroot.CompleteTask - // Cmd returns the "ctx complete" command for marking tasks as done. // // Tasks can be specified by number, partial text match, or full text. @@ -24,14 +20,14 @@ var CompleteTask = completeroot.CompleteTask // Returns: // - *cobra.Command: Configured complete command func Cmd() *cobra.Command { - short, long := assets.CommandDesc("complete") + short, long := assets.CommandDesc(assets.CmdDescKeyComplete) cmd := &cobra.Command{ Use: "complete ", Short: short, Long: long, Args: cobra.ExactArgs(1), - RunE: completeroot.Run, + RunE: Run, } return cmd diff --git a/internal/cli/task/cmd/complete/doc.go b/internal/cli/task/cmd/complete/doc.go new file mode 100644 index 00000000..559d01da --- /dev/null +++ b/internal/cli/task/cmd/complete/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package complete implements the ctx tasks complete command. +// +// It marks a task as completed in TASKS.md by number, partial text +// match, or full text. 
+package complete diff --git a/internal/cli/complete/cmd/root/run.go b/internal/cli/task/cmd/complete/run.go similarity index 71% rename from internal/cli/complete/cmd/root/run.go rename to internal/cli/task/cmd/complete/run.go index b5e0a840..db9cb6c6 100644 --- a/internal/cli/complete/cmd/root/run.go +++ b/internal/cli/task/cmd/complete/run.go @@ -4,20 +4,24 @@ // \ Copyright 2026-present Context contributors. // SPDX-License-Identifier: Apache-2.0 -package root +package complete import ( - "fmt" "os" "path/filepath" "strconv" "strings" + "github.com/ActiveMemory/ctx/internal/config/ctx" + "github.com/ActiveMemory/ctx/internal/config/fs" + "github.com/ActiveMemory/ctx/internal/config/regex" + "github.com/ActiveMemory/ctx/internal/config/token" "github.com/spf13/cobra" - "github.com/ActiveMemory/ctx/internal/config" + ctxerr "github.com/ActiveMemory/ctx/internal/err" "github.com/ActiveMemory/ctx/internal/rc" "github.com/ActiveMemory/ctx/internal/task" + "github.com/ActiveMemory/ctx/internal/write" ) // CompleteTask finds a task in TASKS.md by number or text match and marks @@ -36,21 +40,21 @@ func CompleteTask(query, contextDir string) (string, error) { contextDir = rc.ContextDir() } - filePath := filepath.Join(contextDir, config.FileTask) + filePath := filepath.Join(contextDir, ctx.Task) // Check if the file exists if _, statErr := os.Stat(filePath); os.IsNotExist(statErr) { - return "", fmt.Errorf("TASKS.md not found") + return "", ctxerr.TaskFileNotFound() } // Read existing content content, readErr := os.ReadFile(filepath.Clean(filePath)) if readErr != nil { - return "", fmt.Errorf("failed to read TASKS.md: %w", readErr) + return "", ctxerr.TaskFileRead(readErr) } // Parse tasks and find matching one - lines := strings.Split(string(content), config.NewlineLF) + lines := strings.Split(string(content), token.NewlineLF) var taskNumber int isNumber := false @@ -64,7 +68,7 @@ func CompleteTask(query, contextDir string) (string, error) { matchedTask := "" for i, 
line := range lines { - match := config.RegExTask.FindStringSubmatch(line) + match := regex.Task.FindStringSubmatch(line) if match != nil && task.Pending(match) { currentTaskNum++ taskText := task.Content(match) @@ -81,10 +85,7 @@ func CompleteTask(query, contextDir string) (string, error) { strings.ToLower(taskText), strings.ToLower(query), ) { if matchedLine != -1 { - return "", fmt.Errorf( - "multiple tasks match %q; be more specific or use task number", - query, - ) + return "", ctxerr.TaskMultipleMatches(query) } matchedLine = i matchedTask = taskText @@ -93,18 +94,18 @@ func CompleteTask(query, contextDir string) (string, error) { } if matchedLine == -1 { - return "", fmt.Errorf("no task matching %q found", query) + return "", ctxerr.TaskNotFound(query) } // Mark the task as complete - lines[matchedLine] = config.RegExTask.ReplaceAllString( - lines[matchedLine], "$1- [x] $3", + lines[matchedLine] = regex.Task.ReplaceAllString( + lines[matchedLine], regex.TaskCompleteReplace, ) // Write back - newContent := strings.Join(lines, config.NewlineLF) - if writeErr := os.WriteFile(filePath, []byte(newContent), config.PermFile); writeErr != nil { - return "", fmt.Errorf("failed to write TASKS.md: %w", writeErr) + newContent := strings.Join(lines, token.NewlineLF) + if writeErr := os.WriteFile(filePath, []byte(newContent), fs.PermFile); writeErr != nil { + return "", ctxerr.TaskFileWrite(writeErr) } return matchedTask, nil @@ -124,7 +125,7 @@ func Run(cmd *cobra.Command, args []string) error { return completeErr } - cmd.Println(fmt.Sprintf("✓ Completed: %s", matchedTask)) + write.InfoCompletedTask(cmd, matchedTask) return nil } diff --git a/internal/cli/task/cmd/snapshot/cmd.go b/internal/cli/task/cmd/snapshot/cmd.go index ed8c4475..f898158e 100644 --- a/internal/cli/task/cmd/snapshot/cmd.go +++ b/internal/cli/task/cmd/snapshot/cmd.go @@ -24,14 +24,14 @@ import ( // Returns: // - *cobra.Command: Configured snapshot subcommand func Cmd() *cobra.Command { - short, long 
:= assets.CommandDesc("task.snapshot")
+	short, long := assets.CommandDesc(assets.CmdDescKeyTaskSnapshot)

 	cmd := &cobra.Command{
 		Use:   "snapshot [name]",
 		Short: short,
 		Long:  long,
 		Args:  cobra.MaximumNArgs(1),
-		RunE:  runSnapshot,
+		RunE:  Run,
 	}

 	return cmd
diff --git a/internal/cli/task/cmd/snapshot/doc.go b/internal/cli/task/cmd/snapshot/doc.go
new file mode 100644
index 00000000..676b48ba
--- /dev/null
+++ b/internal/cli/task/cmd/snapshot/doc.go
@@ -0,0 +1,11 @@
+//  /     ctx: https://ctx.ist
+// ,'`./  do you remember?
+// `.,'\
+//     \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package snapshot implements the ctx tasks snapshot subcommand.
+//
+// It creates a point-in-time copy of TASKS.md without modifying the
+// original, storing the snapshot in the archive directory.
+package snapshot
diff --git a/internal/cli/task/cmd/snapshot/run.go b/internal/cli/task/cmd/snapshot/run.go
index e52ae0ae..44a30187 100644
--- a/internal/cli/task/cmd/snapshot/run.go
+++ b/internal/cli/task/cmd/snapshot/run.go
@@ -12,14 +12,18 @@ import (
 	"path/filepath"
 	"time"

+	"github.com/ActiveMemory/ctx/internal/config/archive"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"github.com/spf13/cobra"

 	"github.com/ActiveMemory/ctx/internal/cli/task/core"
-	"github.com/ActiveMemory/ctx/internal/config"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 	"github.com/ActiveMemory/ctx/internal/validation"
+	"github.com/ActiveMemory/ctx/internal/write"
 )

-// runSnapshot executes the snapshot subcommand logic.
+// Run executes the snapshot subcommand logic.
 //
 // Creates a point-in-time copy of TASKS.md in the archive directory.
 // The snapshot includes a header with the name and timestamp.
@@ -30,54 +34,51 @@ import ( // // Returns: // - error: Non-nil if TASKS.md doesn't exist or file operations fail -func runSnapshot(cmd *cobra.Command, args []string) error { +func Run(cmd *cobra.Command, args []string) error { tasksPath := core.TasksFilePath() archivePath := core.ArchiveDirPath() // Check if TASKS.md exists if _, statErr := os.Stat(tasksPath); os.IsNotExist(statErr) { - return fmt.Errorf("no TASKS.md found") + return ctxerr.TaskFileNotFound() } // Read TASKS.md content, readErr := os.ReadFile(filepath.Clean(tasksPath)) if readErr != nil { - return fmt.Errorf("failed to read TASKS.md: %w", readErr) + return ctxerr.TaskFileRead(readErr) } // Ensure the archive directory exists - if mkdirErr := os.MkdirAll(archivePath, config.PermExec); mkdirErr != nil { - return fmt.Errorf("failed to create archive directory: %w", mkdirErr) + if mkdirErr := os.MkdirAll(archivePath, fs.PermExec); mkdirErr != nil { + return ctxerr.CreateArchiveDir(mkdirErr) } // Generate snapshot filename now := time.Now() - name := "snapshot" + name := archive.DefaultSnapshotName if len(args) > 0 { name = validation.SanitizeFilename(args[0]) } snapshotFilename := fmt.Sprintf( - "tasks-%s-%s.md", name, now.Format("2006-01-02-1504"), + archive.SnapshotFilenameFormat, name, now.Format(archive.SnapshotTimeFormat), ) snapshotPath := filepath.Join(archivePath, snapshotFilename) - // Add snapshot header - nl := config.NewlineLF - snapshotContent := fmt.Sprintf( - "# TASKS.md Snapshot — %s"+ - nl+nl+ - "Created: %s"+nl+nl+config.Separator+nl+nl+"%s", - name, now.Format(time.RFC3339), string(content), + // Build snapshot content + nl := token.NewlineLF + snapshotContent := write.SnapshotContent( + name, now.Format(time.RFC3339), token.Separator, nl, string(content), ) // Write snapshot if writeErr := os.WriteFile( - snapshotPath, []byte(snapshotContent), config.PermFile, + snapshotPath, []byte(snapshotContent), fs.PermFile, ); writeErr != nil { - return fmt.Errorf("failed to write snapshot: 
%w", writeErr) + return ctxerr.SnapshotWrite(writeErr) } - cmd.Println(fmt.Sprintf("✓ Snapshot saved to %s", snapshotPath)) + write.SnapshotSaved(cmd, snapshotPath) return nil } diff --git a/internal/cli/task/core/count.go b/internal/cli/task/core/count.go index 38481eb5..c7b0341a 100644 --- a/internal/cli/task/core/count.go +++ b/internal/cli/task/core/count.go @@ -7,7 +7,7 @@ package core import ( - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/regex" "github.com/ActiveMemory/ctx/internal/task" ) @@ -21,7 +21,7 @@ import ( func CountPendingTasks(lines []string) int { count := 0 for _, line := range lines { - match := config.RegExTask.FindStringSubmatch(line) + match := regex.Task.FindStringSubmatch(line) if match != nil && task.Pending(match) && !task.SubTask(match) { count++ } diff --git a/internal/cli/task/core/path.go b/internal/cli/task/core/path.go index 7c11d603..e4ba1db1 100644 --- a/internal/cli/task/core/path.go +++ b/internal/cli/task/core/path.go @@ -9,7 +9,8 @@ package core import ( "path/filepath" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/ctx" + "github.com/ActiveMemory/ctx/internal/config/dir" "github.com/ActiveMemory/ctx/internal/rc" ) @@ -18,7 +19,7 @@ import ( // Returns: // - string: Full path to .context/TASKS.md func TasksFilePath() string { - return filepath.Join(rc.ContextDir(), config.FileTask) + return filepath.Join(rc.ContextDir(), ctx.Task) } // ArchiveDirPath returns the path to the archive directory. 
@@ -26,5 +27,5 @@ func TasksFilePath() string { // Returns: // - string: Full path to .context/archive/ func ArchiveDirPath() string { - return filepath.Join(rc.ContextDir(), config.DirArchive) + return filepath.Join(rc.ContextDir(), dir.Archive) } diff --git a/internal/cli/task/core/process.go b/internal/cli/task/core/process.go deleted file mode 100644 index 9f46971f..00000000 --- a/internal/cli/task/core/process.go +++ /dev/null @@ -1,107 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package core - -import ( - "bufio" - "strings" - - "github.com/ActiveMemory/ctx/internal/config" - "github.com/ActiveMemory/ctx/internal/task" -) - -// SeparateTasks parses TASKS.md and separates completed from pending tasks. -// -// The function scans TASKS.md line by line, identifying task items by their -// checkbox markers ([x] for completed, [ ] for pending). It preserves phase -// headers (### Phase ...) in the archived content for traceability. 
-// -// Subtasks (indented task items) follow their parent task: -// - Subtasks of completed tasks are archived with the parent -// - Subtasks of pending tasks remain with the parent -// -// Parameters: -// - content: Full content of TASKS.md as a string -// -// Returns: -// - remaining: Content with only pending tasks (to write back to TASKS.md) -// - archived: Content with completed tasks and their phase headers -// - stats: Counts of completed and pending tasks processed -func SeparateTasks(content string) (string, string, TaskStats) { - var remaining strings.Builder - var archived strings.Builder - var stats TaskStats - nl := config.NewlineLF - - // Track the current phase header - var currentPhase string - var phaseHasArchivedTasks bool - var phaseArchiveBuffer strings.Builder - - scanner := bufio.NewScanner(strings.NewReader(content)) - var inCompletedTask bool - - for scanner.Scan() { - line := scanner.Text() - - // Check for phase headers - if config.RegExPhase.MatchString(line) { - // Flush previous phase's archived tasks - if phaseHasArchivedTasks { - archived.WriteString(currentPhase + nl) - archived.WriteString(phaseArchiveBuffer.String()) - archived.WriteString(nl) - } - - currentPhase = line - phaseHasArchivedTasks = false - phaseArchiveBuffer.Reset() - remaining.WriteString(line + nl) - inCompletedTask = false - continue - } - - // Check if the line is a task item - match := config.RegExTask.FindStringSubmatch(line) - if match != nil { - if task.SubTask(match) { - // Handle subtasks - follow their parent - if inCompletedTask { - phaseArchiveBuffer.WriteString(line + nl) - } else { - remaining.WriteString(line + nl) - } - continue - } - - // Top-level task - if task.Completed(match) { - stats.Completed++ - phaseHasArchivedTasks = true - phaseArchiveBuffer.WriteString(line + nl) - inCompletedTask = true - } else { - stats.Pending++ - remaining.WriteString(line + nl) - inCompletedTask = false - } - continue - } - - // Non-task lines go to the remaining 
- remaining.WriteString(line + nl) - inCompletedTask = false - } - - // Flush final phase's archived tasks - if phaseHasArchivedTasks { - archived.WriteString(currentPhase + nl) - archived.WriteString(phaseArchiveBuffer.String()) - } - - return remaining.String(), archived.String(), stats -} diff --git a/internal/cli/task/core/types.go b/internal/cli/task/core/types.go deleted file mode 100644 index 8ca3834e..00000000 --- a/internal/cli/task/core/types.go +++ /dev/null @@ -1,20 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package core - -// TaskStats holds counts of completed and pending tasks. -// -// Used by SeparateTasks to report how many tasks were processed during -// an archive operation. -// -// Fields: -// - Completed: Number of tasks marked with [x] -// - Pending: Number of tasks marked with [ ] -type TaskStats struct { - Completed int - Pending int -} diff --git a/internal/cli/task/task.go b/internal/cli/task/task.go index d51ac074..fc0d6bd9 100644 --- a/internal/cli/task/task.go +++ b/internal/cli/task/task.go @@ -20,6 +20,7 @@ import ( "github.com/ActiveMemory/ctx/internal/assets" "github.com/ActiveMemory/ctx/internal/cli/task/cmd/archive" + "github.com/ActiveMemory/ctx/internal/cli/task/cmd/complete" "github.com/ActiveMemory/ctx/internal/cli/task/cmd/snapshot" ) @@ -32,7 +33,7 @@ import ( // Returns: // - *cobra.Command: Configured tasks command with subcommands func Cmd() *cobra.Command { - short, long := assets.CommandDesc("task") + short, long := assets.CommandDesc(assets.CmdDescKeyTask) cmd := &cobra.Command{ Use: "tasks", @@ -41,6 +42,7 @@ func Cmd() *cobra.Command { } cmd.AddCommand(archive.Cmd()) + cmd.AddCommand(complete.Cmd()) cmd.AddCommand(snapshot.Cmd()) return cmd diff --git a/internal/cli/task/task_test.go b/internal/cli/task/task_test.go index 30804ddb..ef7e0fb7 100644 --- a/internal/cli/task/task_test.go +++ 
b/internal/cli/task/task_test.go @@ -16,57 +16,11 @@ import ( "github.com/ActiveMemory/ctx/internal/cli/add" "github.com/ActiveMemory/ctx/internal/cli/initialize" "github.com/ActiveMemory/ctx/internal/cli/task/core" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/ctx" + "github.com/ActiveMemory/ctx/internal/config/dir" "github.com/ActiveMemory/ctx/internal/rc" ) -// TestSeparateTasks tests the separateTasks helper function. -func TestSeparateTasks(t *testing.T) { - tests := []struct { - name string - input string - expectedCompleted int - expectedPending int - }{ - { - name: "mixed tasks", - input: "# Tasks\n\n### Phase 1\n- [x] Done task\n- [ ] Pending task\n", - expectedCompleted: 1, - expectedPending: 1, - }, - { - name: "all completed", - input: "# Tasks\n\n- [x] Task 1\n- [x] Task 2\n", - expectedCompleted: 2, - expectedPending: 0, - }, - { - name: "all pending", - input: "# Tasks\n\n- [ ] Task 1\n- [ ] Task 2\n", - expectedCompleted: 0, - expectedPending: 2, - }, - { - name: "no tasks", - input: "# Tasks\n\nNo tasks here.\n", - expectedCompleted: 0, - expectedPending: 0, - }, - } - - for _, tt := range tests { - t.Run(tt.name, func(t *testing.T) { - _, _, stats := core.SeparateTasks(tt.input) - if stats.Completed != tt.expectedCompleted { - t.Errorf("SeparateTasks() completed = %d, want %d", stats.Completed, tt.expectedCompleted) - } - if stats.Pending != tt.expectedPending { - t.Errorf("SeparateTasks() pending = %d, want %d", stats.Pending, tt.expectedPending) - } - }) - } -} - // TestTasksCommands tests the tasks subcommands. 
func TestTasksCommands(t *testing.T) { tmpDir, err := os.MkdirTemp("", "cli-tasks-test-*") @@ -162,126 +116,6 @@ func runTaskCmd(args ...string) (string, error) { return buf.String(), err } -func TestSeparateTasks_WithSubtasks(t *testing.T) { - content := `# Tasks - -### Phase 1 -- [x] Completed parent - - [ ] Subtask of completed (should be archived) - - [x] Done subtask -- [ ] Pending parent - - [ ] Subtask of pending (should remain) -` - - remaining, archived, stats := core.SeparateTasks(content) - - if stats.Completed != 1 { - t.Errorf("completed = %d, want 1", stats.Completed) - } - if stats.Pending != 1 { - t.Errorf("pending = %d, want 1", stats.Pending) - } - - // Archived should contain the completed parent and its subtasks - if !strings.Contains(archived, "Completed parent") { - t.Error("archived should contain completed parent") - } - if !strings.Contains(archived, "Subtask of completed") { - t.Error("archived should contain subtask of completed parent") - } - - // Remaining should contain the pending parent and its subtask - if !strings.Contains(remaining, "Pending parent") { - t.Error("remaining should contain pending parent") - } - if !strings.Contains(remaining, "Subtask of pending") { - t.Error("remaining should contain subtask of pending parent") - } -} - -func TestSeparateTasks_MultiplePhases(t *testing.T) { - content := `# Tasks - -### Phase 1 -- [x] Phase 1 done -- [ ] Phase 1 pending - -### Phase 2 -- [x] Phase 2 done -- [ ] Phase 2 pending -` - - remaining, archived, stats := core.SeparateTasks(content) - - if stats.Completed != 2 { - t.Errorf("completed = %d, want 2", stats.Completed) - } - if stats.Pending != 2 { - t.Errorf("pending = %d, want 2", stats.Pending) - } - - // Each phase header should appear in archived since both have completed tasks - if !strings.Contains(archived, "Phase 1") { - t.Error("archived should contain Phase 1 header") - } - if !strings.Contains(archived, "Phase 2") { - t.Error("archived should contain Phase 2 
header") - } - - // Remaining should still have phase headers and pending tasks - if !strings.Contains(remaining, "Phase 1 pending") { - t.Error("remaining should contain Phase 1 pending task") - } - if !strings.Contains(remaining, "Phase 2 pending") { - t.Error("remaining should contain Phase 2 pending task") - } -} - -func TestSeparateTasks_PhaseWithNoCompletedTasks(t *testing.T) { - content := `# Tasks - -### Phase 1 -- [ ] Only pending - -### Phase 2 -- [x] Only completed -` - - _, archived, _ := core.SeparateTasks(content) - - // Phase 1 should NOT appear in archived (no completed tasks) - lines := strings.Split(archived, "\n") - for _, line := range lines { - if strings.Contains(line, "Phase 1") { - t.Error("Phase 1 should not be in archived (no completed tasks)") - } - } - if !strings.Contains(archived, "Phase 2") { - t.Error("Phase 2 should be in archived") - } -} - -func TestSeparateTasks_NonTaskLines(t *testing.T) { - content := `# Tasks - -Some description text. - -- [x] Done -- [ ] Pending - -More notes. 
-` - - remaining, _, _ := core.SeparateTasks(content) - - if !strings.Contains(remaining, "Some description text.") { - t.Error("non-task lines should remain") - } - if !strings.Contains(remaining, "More notes.") { - t.Error("trailing non-task lines should remain") - } -} - func TestCountPendingTasks(t *testing.T) { tests := []struct { name string @@ -329,8 +163,8 @@ func TestTasksFilePath(t *testing.T) { setupTaskDir(t) path := core.TasksFilePath() - if !strings.Contains(path, config.FileTask) { - t.Errorf("TasksFilePath() = %q, want to contain %q", path, config.FileTask) + if !strings.Contains(path, ctx.Task) { + t.Errorf("TasksFilePath() = %q, want to contain %q", path, ctx.Task) } } @@ -338,8 +172,8 @@ func TestArchiveDirPath(t *testing.T) { setupTaskDir(t) path := core.ArchiveDirPath() - if !strings.Contains(path, config.DirArchive) { - t.Errorf("ArchiveDirPath() = %q, want to contain %q", path, config.DirArchive) + if !strings.Contains(path, dir.Archive) { + t.Errorf("ArchiveDirPath() = %q, want to contain %q", path, dir.Archive) } } @@ -356,8 +190,8 @@ func TestSnapshotCommand_NoTasks(t *testing.T) { // Create .context but no TASKS.md rc.Reset() - rc.OverrideContextDir(config.DirContext) - if err := os.MkdirAll(config.DirContext, 0750); err != nil { + rc.OverrideContextDir(dir.Context) + if err := os.MkdirAll(dir.Context, 0750); err != nil { t.Fatal(err) } @@ -365,8 +199,8 @@ func TestSnapshotCommand_NoTasks(t *testing.T) { if err == nil { t.Fatal("expected error when TASKS.md doesn't exist") } - if !strings.Contains(err.Error(), "no TASKS.md") { - t.Errorf("error = %q, want 'no TASKS.md'", err.Error()) + if !strings.Contains(err.Error(), "TASKS.md not found") { + t.Errorf("error = %q, want 'TASKS.md not found'", err.Error()) } } @@ -389,7 +223,7 @@ func TestSnapshotCommand_DefaultName(t *testing.T) { } // Verify file was created with default name - entries, err := os.ReadDir(filepath.Join(config.DirContext, config.DirArchive)) + entries, err := 
os.ReadDir(filepath.Join(dir.Context, dir.Archive)) if err != nil { t.Fatal(err) } @@ -416,8 +250,8 @@ func TestArchiveCommand_NoTasks(t *testing.T) { }) rc.Reset() - rc.OverrideContextDir(config.DirContext) - if err := os.MkdirAll(config.DirContext, 0750); err != nil { + rc.OverrideContextDir(dir.Context) + if err := os.MkdirAll(dir.Context, 0750); err != nil { t.Fatal(err) } @@ -457,7 +291,7 @@ func TestArchiveCommand_WithCompletedTasks(t *testing.T) { - [ ] Pending task 1 - [x] Completed task 2 ` - tasksPath := filepath.Join(config.DirContext, config.FileTask) + tasksPath := filepath.Join(dir.Context, ctx.Task) if err := os.WriteFile(tasksPath, []byte(tasksContent), 0600); err != nil { t.Fatal(err) } @@ -494,7 +328,7 @@ func TestArchiveCommand_DryRunWithCompleted(t *testing.T) { - [x] Done task - [ ] Not done task ` - tasksPath := filepath.Join(config.DirContext, config.FileTask) + tasksPath := filepath.Join(dir.Context, ctx.Task) if err := os.WriteFile(tasksPath, []byte(tasksContent), 0600); err != nil { t.Fatal(err) } @@ -550,20 +384,11 @@ func TestArchiveCommand_DryRunFlag(t *testing.T) { } } -func TestSeparateTasks_EmptyContent(t *testing.T) { - remaining, archived, stats := core.SeparateTasks("") - if stats.Completed != 0 || stats.Pending != 0 { - t.Errorf("stats = %+v, want zero for empty content", stats) - } - _ = remaining - _ = archived -} - func TestSnapshotCommand_SnapshotContentFormat(t *testing.T) { setupTaskDir(t) tasksContent := "# Tasks\n\n- [ ] My task\n" - tasksPath := filepath.Join(config.DirContext, config.FileTask) + tasksPath := filepath.Join(dir.Context, ctx.Task) if err := os.WriteFile(tasksPath, []byte(tasksContent), 0600); err != nil { t.Fatal(err) } @@ -574,13 +399,13 @@ func TestSnapshotCommand_SnapshotContentFormat(t *testing.T) { } // Find the snapshot file and verify content - entries, err := os.ReadDir(filepath.Join(config.DirContext, config.DirArchive)) + entries, err := os.ReadDir(filepath.Join(dir.Context, dir.Archive)) if err 
!= nil { t.Fatal(err) } for _, e := range entries { if strings.Contains(e.Name(), "my-snap") { - data, err := os.ReadFile(filepath.Join(config.DirContext, config.DirArchive, e.Name())) + data, err := os.ReadFile(filepath.Join(dir.Context, dir.Archive, e.Name())) if err != nil { t.Fatal(err) } diff --git a/internal/cli/watch/cmd/root/cmd.go b/internal/cli/watch/cmd/root/cmd.go index d37451a9..a1ca0c26 100644 --- a/internal/cli/watch/cmd/root/cmd.go +++ b/internal/cli/watch/cmd/root/cmd.go @@ -5,3 +5,44 @@ // SPDX-License-Identifier: Apache-2.0 package root + +import ( + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the watch command. +// +// Flags: +// - --log: Log file to watch (default: stdin) +// - --dry-run: Show updates without applying +// +// Returns: +// - *cobra.Command: Configured watch command with flags registered +func Cmd() *cobra.Command { + var ( + logPath string + dryRun bool + ) + + short, long := assets.CommandDesc(assets.CmdDescKeyWatch) + + cmd := &cobra.Command{ + Use: "watch", + Short: short, + Long: long, + RunE: func(cmd *cobra.Command, _ []string) error { + return Run(cmd, logPath, dryRun) + }, + } + + cmd.Flags().StringVar( + &logPath, "log", "", assets.FlagDesc(assets.FlagDescKeyWatchLog), + ) + cmd.Flags().BoolVar( + &dryRun, "dry-run", false, assets.FlagDesc(assets.FlagDescKeyWatchDryRun), + ) + + return cmd +} diff --git a/internal/cli/watch/cmd/root/doc.go b/internal/cli/watch/cmd/root/doc.go new file mode 100644 index 00000000..540792ce --- /dev/null +++ b/internal/cli/watch/cmd/root/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package root implements the ctx watch command. +// +// It watches AI tool output for context-update commands and applies them +// to the .context/ directory. 
+package root diff --git a/internal/cli/watch/cmd/root/run.go b/internal/cli/watch/cmd/root/run.go index 8114ec55..0b9a8258 100644 --- a/internal/cli/watch/cmd/root/run.go +++ b/internal/cli/watch/cmd/root/run.go @@ -7,7 +7,6 @@ package root import ( - "fmt" "io" "os" @@ -15,6 +14,8 @@ import ( "github.com/ActiveMemory/ctx/internal/cli/watch/core" "github.com/ActiveMemory/ctx/internal/context" + ctxerr "github.com/ActiveMemory/ctx/internal/err" + "github.com/ActiveMemory/ctx/internal/write" ) // Run executes the watch command logic. @@ -33,26 +34,25 @@ import ( // be opened, or stream processing fails func Run(cmd *cobra.Command, logPath string, dryRun bool) error { if !context.Exists("") { - return fmt.Errorf("no .context/ directory found. Run 'ctx init' first") + return ctxerr.ContextNotInitialized() } - cmd.Println("Watching for context updates...") + write.WatchWatching(cmd) if dryRun { - cmd.Println("DRY RUN — No changes will be made") + write.WatchDryRun(cmd) } - cmd.Println("Press Ctrl+C to stop") + write.WatchStopHint(cmd) cmd.Println() var reader io.Reader if logPath != "" { file, err := os.Open(logPath) //nolint:gosec // user-provided path via --log flag if err != nil { - return fmt.Errorf("failed to open log file: %w", err) + return ctxerr.OpenLogFile(err) } defer func(file *os.File) { - err := file.Close() - if err != nil { - cmd.Println(fmt.Sprintf("failed to close log file: %v", err)) + if closeErr := file.Close(); closeErr != nil { + write.WatchCloseLogError(cmd, closeErr) } }(file) reader = file diff --git a/internal/cli/watch/core/apply.go b/internal/cli/watch/core/apply.go index 0cae39c9..b15f9403 100644 --- a/internal/cli/watch/core/apply.go +++ b/internal/cli/watch/core/apply.go @@ -7,13 +7,17 @@ package core import ( - "fmt" "os" "path/filepath" "strings" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/ctx" + entry2 "github.com/ActiveMemory/ctx/internal/config/entry" + 
"github.com/ActiveMemory/ctx/internal/config/fs" + "github.com/ActiveMemory/ctx/internal/config/regex" + "github.com/ActiveMemory/ctx/internal/config/token" "github.com/ActiveMemory/ctx/internal/entry" + ctxerr "github.com/ActiveMemory/ctx/internal/err" "github.com/ActiveMemory/ctx/internal/rc" "github.com/ActiveMemory/ctx/internal/task" ) @@ -32,18 +36,18 @@ import ( // - error: Non-nil if type is unknown or the handler fails func ApplyUpdate(update ContextUpdate) error { switch update.Type { - case config.EntryTask: + case entry2.Task: return RunAddSilent(update) - case config.EntryDecision: + case entry2.Decision: return RunAddSilent(update) - case config.EntryLearning: + case entry2.Learning: return RunAddSilent(update) - case config.EntryConvention: + case entry2.Convention: return RunAddSilent(update) - case config.EntryComplete: + case entry2.Complete: return RunCompleteSilent([]string{update.Content}) default: - return fmt.Errorf("unknown update type: %s", update.Type) + return ctxerr.UnknownUpdateType(update.Type) } } @@ -97,12 +101,12 @@ func RunAddSilent(update ContextUpdate) error { // or file operations fail func RunCompleteSilent(args []string) error { if len(args) < 1 { - return fmt.Errorf("no task specified") + return ctxerr.NoTaskSpecified() } query := args[0] - filePath := filepath.Join(rc.ContextDir(), config.FileTask) - nl := config.NewlineLF + filePath := filepath.Join(rc.ContextDir(), ctx.Task) + nl := token.NewlineLF content, err := os.ReadFile(filepath.Clean(filePath)) if err != nil { @@ -113,7 +117,7 @@ func RunCompleteSilent(args []string) error { matchedLine := -1 for i, line := range lines { - match := config.RegExTask.FindStringSubmatch(line) + match := regex.Task.FindStringSubmatch(line) if match != nil && task.Pending(match) { if strings.Contains( strings.ToLower(task.Content(match)), @@ -126,11 +130,11 @@ func RunCompleteSilent(args []string) error { } if matchedLine == -1 { - return fmt.Errorf("no task matching %q found", query) + 
return ctxerr.NoTaskMatch(query) } - lines[matchedLine] = config.RegExTask.ReplaceAllString( - lines[matchedLine], "$1- [x] $3", + lines[matchedLine] = regex.Task.ReplaceAllString( + lines[matchedLine], regex.TaskCompleteReplace, ) - return os.WriteFile(filePath, []byte(strings.Join(lines, nl)), config.PermFile) + return os.WriteFile(filePath, []byte(strings.Join(lines, nl)), fs.PermFile) } diff --git a/internal/cli/watch/core/core_test.go b/internal/cli/watch/core/core_test.go index 5dd87ce1..49d90324 100644 --- a/internal/cli/watch/core/core_test.go +++ b/internal/cli/watch/core/core_test.go @@ -14,7 +14,8 @@ import ( "testing" "github.com/ActiveMemory/ctx/internal/cli/initialize" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/ctx" + "github.com/ActiveMemory/ctx/internal/config/entry" "github.com/ActiveMemory/ctx/internal/rc" "github.com/spf13/cobra" ) @@ -49,48 +50,48 @@ func TestApplyUpdate(t *testing.T) { }{ { name: "task update", - update: ContextUpdate{Type: config.EntryTask, Content: "Test task from watch"}, - checkFile: config.FileTask, + update: ContextUpdate{Type: entry.Task, Content: "Test task from watch"}, + checkFile: ctx.Task, checkFor: "Test task from watch", }, { name: "decision update", update: ContextUpdate{ - Type: config.EntryDecision, + Type: entry.Decision, Content: "Test decision from watch", Context: "Testing watch functionality", Rationale: "Need to verify watch applies decisions", Consequences: "Decision will appear in DECISIONS.md", }, - checkFile: config.FileDecision, + checkFile: ctx.Decision, checkFor: "Test decision from watch", }, { name: "learning update", update: ContextUpdate{ - Type: config.EntryLearning, + Type: entry.Learning, Content: "Test learning from watch", Context: "Testing watch functionality", Lesson: "Watch can add learnings", Application: "Use structured attributes in context-update tags", }, - checkFile: config.FileLearning, + checkFile: ctx.Learning, checkFor: "Test 
learning from watch", }, { name: "decision without required fields", - update: ContextUpdate{Type: config.EntryDecision, Content: "Missing fields"}, + update: ContextUpdate{Type: entry.Decision, Content: "Missing fields"}, expectError: true, }, { name: "learning without required fields", - update: ContextUpdate{Type: config.EntryLearning, Content: "Missing fields"}, + update: ContextUpdate{Type: entry.Learning, Content: "Missing fields"}, expectError: true, }, { name: "convention update", - update: ContextUpdate{Type: config.EntryConvention, Content: "Test convention from watch"}, - checkFile: config.FileConvention, + update: ContextUpdate{Type: entry.Convention, Content: "Test convention from watch"}, + checkFile: ctx.Convention, checkFor: "Test convention from watch", }, { @@ -150,7 +151,7 @@ func TestApplyCompleteUpdate(t *testing.T) { } // Add a task to complete - tasksPath := filepath.Join(rc.ContextDir(), config.FileTask) + tasksPath := filepath.Join(rc.ContextDir(), ctx.Task) tasksContent := `# Tasks ## Next Up @@ -163,7 +164,7 @@ func TestApplyCompleteUpdate(t *testing.T) { } // Complete the task - update := ContextUpdate{Type: config.EntryComplete, Content: "authentication"} + update := ContextUpdate{Type: entry.Complete, Content: "authentication"} if err = ApplyUpdate(update); err != nil { t.Fatalf("ApplyUpdate failed: %v", err) } @@ -218,7 +219,7 @@ More output } // Verify task was written - tasksPath := filepath.Join(rc.ContextDir(), config.FileTask) + tasksPath := filepath.Join(rc.ContextDir(), ctx.Task) content, err := os.ReadFile(filepath.Clean(tasksPath)) if err != nil { t.Fatalf("failed to read tasks: %v", err) @@ -265,7 +266,7 @@ More output } // Verify learning was written with structured fields - learningsPath := filepath.Join(rc.ContextDir(), config.FileLearning) + learningsPath := filepath.Join(rc.ContextDir(), ctx.Learning) content, err := os.ReadFile(filepath.Clean(learningsPath)) if err != nil { t.Fatalf("failed to read learnings: %v", err) 
@@ -499,7 +500,7 @@ func TestProcessStream_DecisionWithAttributes(t *testing.T) { } // Verify decision was written - decPath := filepath.Join(rc.ContextDir(), config.FileDecision) + decPath := filepath.Join(rc.ContextDir(), ctx.Decision) content, err := os.ReadFile(filepath.Clean(decPath)) if err != nil { t.Fatal(err) @@ -603,7 +604,7 @@ func TestProcessStream_CompleteUpdate(t *testing.T) { } // Write a task to complete - tasksPath := filepath.Join(rc.ContextDir(), config.FileTask) + tasksPath := filepath.Join(rc.ContextDir(), ctx.Task) tasksContent := "# Tasks\n\n- [ ] Implement login\n- [ ] Write tests\n" if err := os.WriteFile(tasksPath, []byte(tasksContent), 0600); err != nil { t.Fatal(err) diff --git a/internal/cli/watch/core/stream.go b/internal/cli/watch/core/stream.go index 145181fd..35098069 100644 --- a/internal/cli/watch/core/stream.go +++ b/internal/cli/watch/core/stream.go @@ -8,13 +8,17 @@ package core import ( "bufio" - "fmt" "io" + "regexp" "strings" + "github.com/ActiveMemory/ctx/internal/config/cli" + "github.com/ActiveMemory/ctx/internal/config/regex" + "github.com/ActiveMemory/ctx/internal/config/watch" "github.com/spf13/cobra" - "github.com/ActiveMemory/ctx/internal/config" + ctxerr "github.com/ActiveMemory/ctx/internal/err" + "github.com/ActiveMemory/ctx/internal/write" ) // ExtractAttribute extracts a named attribute from an XML tag string. 
@@ -26,7 +30,7 @@ import (
 // Returns:
 //   - string: Attribute value, or empty string if not found
 func ExtractAttribute(tag, attrName string) string {
-	pattern := config.RegExFromAttrName(attrName)
+	pattern := regexp.MustCompile(regexp.QuoteMeta(attrName) + `="([^"]*)"`)
 	match := pattern.FindStringSubmatch(tag)
 	if len(match) >= 2 {
 		return match[1]
@@ -50,8 +54,8 @@ func ExtractAttribute(tag, attrName string) string {
 func ProcessStream(cmd *cobra.Command, reader io.Reader, dryRun bool) error {
 	scanner := bufio.NewScanner(reader)
 	// Use a larger buffer for long lines
-	buf := make([]byte, 0, 64*1024)
-	scanner.Buffer(buf, 1024*1024)
+	buf := make([]byte, 0, watch.StreamScannerInitCap)
+	scanner.Buffer(buf, watch.StreamScannerMaxSize)
 
 	updateCount := 0
 
@@ -59,36 +63,28 @@ func ProcessStream(cmd *cobra.Command, reader io.Reader, dryRun bool) error {
 		line := scanner.Text()
 
 		// Check for context-update commands
-		matches := config.RegExContextUpdate.FindAllStringSubmatch(line, -1)
+		matches := regex.SystemContextUpdate.FindAllStringSubmatch(line, -1)
 		for _, match := range matches {
 			if len(match) >= 3 {
 				openingTag := match[1]
 				update := ContextUpdate{
-					Type:         strings.ToLower(ExtractAttribute(openingTag, "type")),
+					Type:         strings.ToLower(ExtractAttribute(openingTag, cli.AttrType)),
 					Content:      strings.TrimSpace(match[2]),
-					Context:      ExtractAttribute(openingTag, "context"),
-					Lesson:       ExtractAttribute(openingTag, "lesson"),
-					Application:  ExtractAttribute(openingTag, "application"),
-					Rationale:    ExtractAttribute(openingTag, "rationale"),
-					Consequences: ExtractAttribute(openingTag, "consequences"),
+					Context:      ExtractAttribute(openingTag, cli.AttrContext),
+					Lesson:       ExtractAttribute(openingTag, cli.AttrLesson),
+					Application:  ExtractAttribute(openingTag, cli.AttrApplication),
+					Rationale:    ExtractAttribute(openingTag, cli.AttrRationale),
+					Consequences: ExtractAttribute(openingTag, cli.AttrConsequences),
 				}
 
 				if dryRun {
-					cmd.Println(fmt.Sprintf(
-						"○ Would apply: [%s] %s\n",
-						update.Type,
update.Content, - )) + write.WatchDryRunPreview(cmd, update.Type, update.Content) } else { err := ApplyUpdate(update) if err != nil { - cmd.Println(fmt.Sprintf( - "✗ Failed to apply [%s]: %v\n", - update.Type, err, - )) + write.WatchApplyFailed(cmd, update.Type, err) } else { - cmd.Println(fmt.Sprintf( - "✓ Applied: [%s] %s\n", update.Type, update.Content, - )) + write.WatchApplySuccess(cmd, update.Type, update.Content) updateCount++ } } @@ -97,7 +93,7 @@ func ProcessStream(cmd *cobra.Command, reader io.Reader, dryRun bool) error { } if err := scanner.Err(); err != nil { - return fmt.Errorf("error reading input: %w", err) + return ctxerr.ReadInputStream(err) } return nil diff --git a/internal/cli/watch/watch.go b/internal/cli/watch/watch.go index 9c8557e3..0b861ed8 100644 --- a/internal/cli/watch/watch.go +++ b/internal/cli/watch/watch.go @@ -9,41 +9,10 @@ package watch import ( "github.com/spf13/cobra" - "github.com/ActiveMemory/ctx/internal/assets" watchroot "github.com/ActiveMemory/ctx/internal/cli/watch/cmd/root" ) // Cmd returns the watch command. 
-// -// Flags: -// - --log: Log file to watch (default: stdin) -// - --dry-run: Show updates without applying -// -// Returns: -// - *cobra.Command: Configured watch command with flags registered func Cmd() *cobra.Command { - var ( - logPath string - dryRun bool - ) - - short, long := assets.CommandDesc("watch") - - cmd := &cobra.Command{ - Use: "watch", - Short: short, - Long: long, - RunE: func(cmd *cobra.Command, _ []string) error { - return watchroot.Run(cmd, logPath, dryRun) - }, - } - - cmd.Flags().StringVar( - &logPath, "log", "", assets.FlagDesc("watch.log"), - ) - cmd.Flags().BoolVar( - &dryRun, "dry-run", false, assets.FlagDesc("watch.dry-run"), - ) - - return cmd + return watchroot.Cmd() } diff --git a/internal/cli/watch/watch_test.go b/internal/cli/watch/watch_test.go index 4fb86dcf..a22840c2 100644 --- a/internal/cli/watch/watch_test.go +++ b/internal/cli/watch/watch_test.go @@ -14,7 +14,7 @@ import ( "testing" "github.com/ActiveMemory/ctx/internal/cli/initialize" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/ctx" "github.com/ActiveMemory/ctx/internal/rc" ) @@ -83,7 +83,7 @@ More output } // Verify task was written - tasksPath := filepath.Join(rc.ContextDir(), config.FileTask) + tasksPath := filepath.Join(rc.ContextDir(), ctx.Task) content, err := os.ReadFile(filepath.Clean(tasksPath)) if err != nil { t.Fatal(err) diff --git a/internal/cli/why/cmd/root/cmd.go b/internal/cli/why/cmd/root/cmd.go index d37451a9..f121800b 100644 --- a/internal/cli/why/cmd/root/cmd.go +++ b/internal/cli/why/cmd/root/cmd.go @@ -5,3 +5,30 @@ // SPDX-License-Identifier: Apache-2.0 package root + +import ( + "github.com/ActiveMemory/ctx/internal/config/cli" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// Cmd returns the "ctx why" cobra command. 
+// +// Returns: +// - *cobra.Command: Configured why command with document aliases +func Cmd() *cobra.Command { + short, long := assets.CommandDesc(assets.CmdDescKeyWhy) + + cmd := &cobra.Command{ + Use: "why [DOCUMENT]", + Short: short, + Annotations: map[string]string{cli.AnnotationSkipInit: ""}, + ValidArgs: []string{"manifesto", "about", "invariants"}, + Long: long, + Args: cobra.MaximumNArgs(1), + RunE: Run, + } + + return cmd +} diff --git a/internal/cli/why/cmd/root/data.go b/internal/cli/why/cmd/root/data.go new file mode 100644 index 00000000..a520767b --- /dev/null +++ b/internal/cli/why/cmd/root/data.go @@ -0,0 +1,27 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package root + +// DocAliases maps user-facing names to embedded asset names. +var DocAliases = map[string]string{ + "manifesto": "manifesto", + "about": "about", + "invariants": "design-invariants", +} + +// DocEntry pairs a document alias with its display label. +type DocEntry struct { + Alias string + Label string +} + +// DocOrder defines the display order for the interactive menu. +var DocOrder = []DocEntry{ + {"manifesto", "The ctx Manifesto"}, + {"about", "About ctx"}, + {"invariants", "Design Invariants"}, +} diff --git a/internal/cli/why/cmd/root/doc.go b/internal/cli/why/cmd/root/doc.go new file mode 100644 index 00000000..17253c45 --- /dev/null +++ b/internal/cli/why/cmd/root/doc.go @@ -0,0 +1,11 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package root implements the ctx why command. +// +// It displays the philosophy and design rationale behind ctx, including +// the manifesto and project invariants. 
+package root diff --git a/internal/cli/why/cmd/root/run.go b/internal/cli/why/cmd/root/run.go index 2adfcc1a..c8b1f741 100644 --- a/internal/cli/why/cmd/root/run.go +++ b/internal/cli/why/cmd/root/run.go @@ -16,25 +16,9 @@ import ( "github.com/spf13/cobra" "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/write" ) -// DocAliases maps user-facing names to embedded asset names. -var DocAliases = map[string]string{ - "manifesto": "manifesto", - "about": "about", - "invariants": "design-invariants", -} - -// DocOrder defines the display order for the interactive menu. -var DocOrder = []struct { - Alias string - Label string -}{ - {"manifesto", "The ctx Manifesto"}, - {"about", "About ctx"}, - {"invariants", "Design Invariants"}, -} - // Run dispatches to the interactive menu or direct document display. // // Parameters: @@ -52,19 +36,12 @@ func Run(cmd *cobra.Command, args []string) error { // showMenu presents a numbered menu and reads user selection from stdin. func showMenu(cmd *cobra.Command) error { - bt := "`" - cmd.Println(` - / ctx: https://ctx.ist - ,'` + bt + `./ do you remember? 
- ` + bt + `.,'\ - \ - {} -> what - ctx -> why`) + write.WhyBanner(cmd) cmd.Println() for i, doc := range DocOrder { - cmd.Println(fmt.Sprintf(" [%d] %s", i+1, doc.Label)) + write.WhyMenuItem(cmd, i+1, doc.Label) } - cmd.Print("\nSelect a document (1-3): ") + write.WhyMenuPrompt(cmd) reader := bufio.NewReader(os.Stdin) input, readErr := reader.ReadString('\n') diff --git a/internal/cli/why/cmd/root/strip.go b/internal/cli/why/cmd/root/strip.go index 2cac9ecd..9a117ecf 100644 --- a/internal/cli/why/cmd/root/strip.go +++ b/internal/cli/why/cmd/root/strip.go @@ -7,18 +7,15 @@ package root import ( - "regexp" + "fmt" "strings" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/config/regex" + "github.com/ActiveMemory/ctx/internal/config/token" + "github.com/ActiveMemory/ctx/internal/config/zensical" ) -// linkRe matches Markdown links with relative .md targets. -var linkRe = regexp.MustCompile(`\[([^\]]+)\]\([^\)]*\.md[^\)]*\)`) - -// imageRe matches Markdown image lines. -var imageRe = regexp.MustCompile(`^\s*!\[.*\]\(.*\)\s*$`) - // StripMkDocs removes MkDocs-specific syntax from Markdown content so it // reads cleanly in the terminal. // @@ -35,13 +32,13 @@ var imageRe = regexp.MustCompile(`^\s*!\[.*\]\(.*\)\s*$`) // Returns: // - string: Cleaned Markdown suitable for terminal display func StripMkDocs(content string) string { - lines := strings.Split(content, config.NewlineLF) + lines := strings.Split(content, token.NewlineLF) var result []string // Strip YAML frontmatter. 
- if len(lines) > 0 && strings.TrimSpace(lines[0]) == "---" { + if len(lines) > 0 && strings.TrimSpace(lines[0]) == zensical.MkDocsFrontmatterDelim { for i := 1; i < len(lines); i++ { - if strings.TrimSpace(lines[i]) == "---" { + if strings.TrimSpace(lines[i]) == zensical.MkDocsFrontmatterDelim { lines = lines[i+1:] break } @@ -50,29 +47,31 @@ func StripMkDocs(content string) string { inAdmonition := false inTab := false + blockquotePrefix := assets.TextDesc(assets.TextDescKeyWhyBlockquotePrefix) for i := 0; i < len(lines); i++ { line := lines[i] // Skip image lines. - if imageRe.MatchString(line) { + if regex.MarkdownImage.MatchString(line) { continue } // Admonition start: !!! type "Title" - if strings.HasPrefix(strings.TrimSpace(line), "!!!") { + if strings.HasPrefix(strings.TrimSpace(line), zensical.MkDocsAdmonitionPrefix) { inAdmonition = true title := ExtractAdmonitionTitle(line) if title != "" { - result = append(result, "> **"+title+"**") + result = append(result, + fmt.Sprintf(assets.TextDesc(assets.TextDescKeyWhyAdmonitionFormat), title)) } continue } // Inside admonition: dedent 4-space body. if inAdmonition { - if strings.HasPrefix(line, " ") { - result = append(result, "> "+line[4:]) + if strings.HasPrefix(line, zensical.MkDocsIndent) { + result = append(result, blockquotePrefix+line[zensical.MkDocsIndentWidth:]) continue } // End of admonition body. @@ -80,19 +79,20 @@ func StripMkDocs(content string) string { } // Tab marker: === "Name" - if strings.HasPrefix(strings.TrimSpace(line), "=== ") { + if strings.HasPrefix(strings.TrimSpace(line), zensical.MkDocsTabPrefix) { inTab = true title := ExtractTabTitle(line) if title != "" { - result = append(result, "**"+title+"**") + result = append(result, + fmt.Sprintf(assets.TextDesc(assets.TextDescKeyWhyBoldFormat), title)) } continue } // Inside tab: dedent 4-space body. 
if inTab { - if strings.HasPrefix(line, " ") { - result = append(result, line[4:]) + if strings.HasPrefix(line, zensical.MkDocsIndent) { + result = append(result, line[zensical.MkDocsIndentWidth:]) continue } if strings.TrimSpace(line) == "" { @@ -104,12 +104,12 @@ func StripMkDocs(content string) string { } // Strip relative .md links, keep display text. - line = linkRe.ReplaceAllString(line, "$1") + line = regex.MarkdownLink.ReplaceAllString(line, "$1") result = append(result, line) } - return strings.Join(result, config.NewlineLF) + return strings.Join(result, token.NewlineLF) } // ExtractAdmonitionTitle pulls the quoted title from an admonition line. diff --git a/internal/cli/why/why.go b/internal/cli/why/why.go index aa6381ec..c7f5a1f6 100644 --- a/internal/cli/why/why.go +++ b/internal/cli/why/why.go @@ -9,27 +9,10 @@ package why import ( "github.com/spf13/cobra" - "github.com/ActiveMemory/ctx/internal/assets" whyroot "github.com/ActiveMemory/ctx/internal/cli/why/cmd/root" - "github.com/ActiveMemory/ctx/internal/config" ) // Cmd returns the "ctx why" cobra command. 
-// -// Returns: -// - *cobra.Command: Configured why command with document aliases func Cmd() *cobra.Command { - short, long := assets.CommandDesc("why") - - cmd := &cobra.Command{ - Use: "why [DOCUMENT]", - Short: short, - Annotations: map[string]string{config.AnnotationSkipInit: ""}, - ValidArgs: []string{"manifesto", "about", "invariants"}, - Long: long, - Args: cobra.MaximumNArgs(1), - RunE: whyroot.Run, - } - - return cmd + return whyroot.Cmd() } diff --git a/internal/compliance/compliance_test.go b/internal/compliance/compliance_test.go index 2a4d2099..dd18e7dc 100644 --- a/internal/compliance/compliance_test.go +++ b/internal/compliance/compliance_test.go @@ -197,7 +197,7 @@ func TestNoLiteralNewline(t *testing.T) { re := regexp.MustCompile(`"\\n"`) for _, p := range nonTestGoFiles(t, root) { - if strings.HasSuffix(p, "token.go") { + if strings.HasSuffix(p, "token.go") || strings.HasSuffix(p, "whitespace.go") { continue } rel, _ := filepath.Rel(root, p) @@ -225,7 +225,8 @@ func TestNoLiteralMdExtension(t *testing.T) { re := regexp.MustCompile(`"\.md"`) for _, p := range nonTestGoFiles(t, root) { - if strings.HasSuffix(p, filepath.Join("config", "file.go")) { + if strings.HasSuffix(p, filepath.Join("config", "file.go")) || + strings.HasSuffix(p, filepath.Join("file", "ext.go")) { continue } rel, _ := filepath.Rel(root, p) @@ -842,7 +843,7 @@ func TestProjectCompiles(t *testing.T) { // use the expected permission values. 
func TestPermissionConstants(t *testing.T) { root := projectRoot(t) - filePath := filepath.Join(root, "internal", "config", "file.go") + filePath := filepath.Join(root, "internal", "config", "fs", "perm.go") data, err := os.ReadFile(filepath.Clean(filePath)) //nolint:gosec // constructed from test constants if err != nil { diff --git a/internal/compliance/doc.go b/internal/compliance/doc.go new file mode 100644 index 00000000..221fd5bb --- /dev/null +++ b/internal/compliance/doc.go @@ -0,0 +1,14 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package compliance contains cross-cutting tests that verify the entire +// codebase adheres to project standards. +// +// These tests inspect source files, configs, and build artifacts across the +// whole repository, mirroring the checks performed by the lint-drift and +// lint-docs scripts so that violations surface in go test without requiring +// bash. +package compliance diff --git a/internal/config/agent/agent.go b/internal/config/agent/agent.go new file mode 100644 index 00000000..1371f8a0 --- /dev/null +++ b/internal/config/agent/agent.go @@ -0,0 +1,45 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package agent + +import "time" + +// Budget allocation. +const ( + // TaskBudgetPct is the fraction of the token budget allocated to tasks. + TaskBudgetPct = 0.40 + // ConventionBudgetPct is the fraction of the token budget allocated to conventions. + ConventionBudgetPct = 0.20 +) + +// Cooldown configuration. +const ( + // DefaultCooldown is the default cooldown between agent context packet emissions. + DefaultCooldown = 10 * time.Minute + // TombstonePrefix is the filename prefix for agent cooldown tombstone files. + TombstonePrefix = "ctx-agent-" +) + +// Scoring configuration. 
+const ( + // RecencyDaysWeek is the threshold for "recent" entries (0-7 days). + RecencyDaysWeek = 7 + // RecencyDaysMonth is the threshold for "this month" entries (8-30 days). + RecencyDaysMonth = 30 + // RecencyDaysQuarter is the threshold for "this quarter" entries (31-90 days). + RecencyDaysQuarter = 90 + // RecencyScoreWeek is the recency score for entries within a week. + RecencyScoreWeek = 1.0 + // RecencyScoreMonth is the recency score for entries within a month. + RecencyScoreMonth = 0.7 + // RecencyScoreQuarter is the recency score for entries within a quarter. + RecencyScoreQuarter = 0.4 + // RecencyScoreOld is the recency score for entries older than a quarter. + RecencyScoreOld = 0.2 + // RelevanceMatchCap is the keyword match count that yields maximum relevance (1.0). + RelevanceMatchCap = 3 +) diff --git a/internal/config/agent/doc.go b/internal/config/agent/doc.go new file mode 100644 index 00000000..4bbf2cb3 --- /dev/null +++ b/internal/config/agent/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package agent defines budget, cooldown, and scoring constants for the ctx agent command. +package agent diff --git a/internal/config/architecture/arch.go b/internal/config/architecture/arch.go new file mode 100644 index 00000000..73aaf856 --- /dev/null +++ b/internal/config/architecture/arch.go @@ -0,0 +1,13 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package architecture + +// Architecture mapping file constants for .context/ directory. +const ( + // MapTracking is the architecture mapping coverage state file. 
+ MapTracking = "map-tracking.json" +) diff --git a/internal/config/architecture/doc.go b/internal/config/architecture/doc.go new file mode 100644 index 00000000..ca082746 --- /dev/null +++ b/internal/config/architecture/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package architecture defines constants for architecture map files and staleness checks. +package architecture diff --git a/internal/config/architecture/stale.go b/internal/config/architecture/stale.go new file mode 100644 index 00000000..436d2d6f --- /dev/null +++ b/internal/config/architecture/stale.go @@ -0,0 +1,15 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package architecture + +// Map staleness hook configuration. +const ( + // MapStaleDays is the threshold in days before a map refresh is considered stale. + MapStaleDays = 30 + // MapStalenessThrottleID is the state file name for daily throttle of map staleness checks. + MapStalenessThrottleID = "check-map-staleness" +) diff --git a/internal/config/archive/archive.go b/internal/config/archive/archive.go new file mode 100644 index 00000000..c6a9dc1a --- /dev/null +++ b/internal/config/archive/archive.go @@ -0,0 +1,22 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package archive + +import "github.com/ActiveMemory/ctx/internal/config/file" + +// Task archive/snapshot constants. +const ( + // ArchiveScopeTasks is the scope identifier for task archives. + ArchiveScopeTasks = "tasks" + // DefaultSnapshotName is the default name when no snapshot name is provided. + DefaultSnapshotName = "snapshot" + // SnapshotFilenameFormat is the filename template for task snapshots. 
+ // Args: name, formatted timestamp. + SnapshotFilenameFormat = "tasks-%s-%s" + file.ExtMarkdown + // SnapshotTimeFormat is the compact timestamp layout for snapshot filenames. + SnapshotTimeFormat = "2006-01-02-1504" +) diff --git a/internal/config/archive/backup.go b/internal/config/archive/backup.go new file mode 100644 index 00000000..18f7511b --- /dev/null +++ b/internal/config/archive/backup.go @@ -0,0 +1,39 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package archive + +// Backup configuration. +const ( + // BackupDefaultSubdir is the default subdirectory on the SMB share. + BackupDefaultSubdir = "ctx-sessions" + // BackupMarkerFile is the state file touched on successful project backup. + BackupMarkerFile = "ctx-last-backup" + // BackupScopeProject backs up only the project context. + BackupScopeProject = "project" + // BackupScopeGlobal backs up only global Claude data. + BackupScopeGlobal = "global" + // BackupScopeAll backs up both project and global. + BackupScopeAll = "all" + // BackupTplProjectArchive is the filename template for project archives. + // Argument: timestamp. + BackupTplProjectArchive = "ctx-backup-%s.tar.gz" + // BackupTplGlobalArchive is the filename template for global archives. + // Argument: timestamp. + BackupTplGlobalArchive = "claude-global-backup-%s.tar.gz" + // BackupTimestampFormat is the compact timestamp layout for backup filenames. + BackupTimestampFormat = "20060102-150405" + // BackupExcludeTodos is the directory name excluded from global backups. + BackupExcludeTodos = "todos" + // BackupMarkerDir is the XDG state directory for the backup marker. + BackupMarkerDir = ".local/state" + // BackupMaxAgeDays is the threshold in days before a backup is considered stale. + BackupMaxAgeDays = 2 + // BackupThrottleID is the state file name for daily throttle of backup age checks. 
+ BackupThrottleID = "backup-reminded" + // Bashrc is the user's bash configuration file. + Bashrc = ".bashrc" +) diff --git a/internal/config/archive/doc.go b/internal/config/archive/doc.go new file mode 100644 index 00000000..f86c6c44 --- /dev/null +++ b/internal/config/archive/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package archive defines constants for task archival, backups, and snapshot formatting. +package archive diff --git a/internal/config/archive/subtask.go b/internal/config/archive/subtask.go new file mode 100644 index 00000000..9ec85feb --- /dev/null +++ b/internal/config/archive/subtask.go @@ -0,0 +1,14 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package archive + +// Task parsing constants. +const ( + // SubTaskMinIndent is the minimum indent length (in spaces) for a line + // to be considered a subtask rather than a top-level task. + SubTaskMinIndent = 2 +) diff --git a/internal/config/archive/tpl.go b/internal/config/archive/tpl.go new file mode 100644 index 00000000..aac6609b --- /dev/null +++ b/internal/config/archive/tpl.go @@ -0,0 +1,17 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package archive + +import "github.com/ActiveMemory/ctx/internal/config/file" + +const ( + // TplArchiveFilename is the format for dated archive filenames. + // Args: prefix, date. + TplArchiveFilename = "%s-%s" + file.ExtMarkdown + // ArchiveDateSep is the separator between heading and date in archive headers. 
+ ArchiveDateSep = " - " +) diff --git a/internal/config/bootstrap/bootstrap.go b/internal/config/bootstrap/bootstrap.go new file mode 100644 index 00000000..17204da3 --- /dev/null +++ b/internal/config/bootstrap/bootstrap.go @@ -0,0 +1,15 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package bootstrap + +// Bootstrap display constants. +const ( + // BootstrapFileListWidth is the character width at which the file list wraps. + BootstrapFileListWidth = 55 + // BootstrapFileListIndent is the indentation prefix for file list lines. + BootstrapFileListIndent = " " +) diff --git a/internal/config/bootstrap/doc.go b/internal/config/bootstrap/doc.go new file mode 100644 index 00000000..38a25497 --- /dev/null +++ b/internal/config/bootstrap/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package bootstrap defines display and parsing constants for the ctx bootstrap command. +package bootstrap diff --git a/internal/config/bootstrap/list.go b/internal/config/bootstrap/list.go new file mode 100644 index 00000000..d2418292 --- /dev/null +++ b/internal/config/bootstrap/list.go @@ -0,0 +1,15 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package bootstrap + +// Numbered list parsing constants. +const ( + // NumberedListSep is the separator between the number and text in numbered lists (e.g. "1. item"). + NumberedListSep = ". " + // NumberedListMaxDigits is the maximum index position for the separator to be recognized as a prefix. 
+ NumberedListMaxDigits = 2 +) diff --git a/internal/config/box/box.go b/internal/config/box/box.go new file mode 100644 index 00000000..b55d214e --- /dev/null +++ b/internal/config/box/box.go @@ -0,0 +1,22 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package box + +// Nudge box drawing constants. +const ( + // Top is the top-left corner of a nudge box. + Top = "┌─ " + // LinePrefix is the left border prefix for nudge box content lines. + LinePrefix = "│ " + // Bottom is the bottom border of a nudge box. + Bottom = "└──────────────────────────────────────────────────" + // NudgeBoxWidth is the inner character width of the nudge box border. + NudgeBoxWidth = 51 +) + +// PipeSeparator is the inline separator used between navigation links. +const PipeSeparator = " | " diff --git a/internal/config/box/doc.go b/internal/config/box/doc.go new file mode 100644 index 00000000..7bb496dd --- /dev/null +++ b/internal/config/box/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package box defines box-drawing characters and layout constants for nudge display. +package box diff --git a/internal/config/ceremony/ceremony.go b/internal/config/ceremony/ceremony.go new file mode 100644 index 00000000..06759ed5 --- /dev/null +++ b/internal/config/ceremony/ceremony.go @@ -0,0 +1,19 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package ceremony + +// Ceremony configuration. +const ( + // CeremonyThrottleID is the state file name for daily throttle of ceremony checks. + CeremonyThrottleID = "ceremony-reminded" + // CeremonyJournalLookback is the number of recent journal files to scan for ceremony usage. 
+ CeremonyJournalLookback = 3 + // CeremonyRememberCmd is the command name scanned in journals for /ctx-remember usage. + CeremonyRememberCmd = "ctx-remember" + // CeremonyWrapUpCmd is the command name scanned in journals for /ctx-wrap-up usage. + CeremonyWrapUpCmd = "ctx-wrap-up" +) diff --git a/internal/config/ceremony/doc.go b/internal/config/ceremony/doc.go new file mode 100644 index 00000000..8dd3f4a7 --- /dev/null +++ b/internal/config/ceremony/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package ceremony defines configuration constants for end-of-session ceremony hooks. +package ceremony diff --git a/internal/config/claude/claude.go b/internal/config/claude/claude.go new file mode 100644 index 00000000..e4bb5e8f --- /dev/null +++ b/internal/config/claude/claude.go @@ -0,0 +1,27 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package claude + +// Claude API content block types. +const ( + // BlockText is a text content block. + BlockText = "text" + // BlockThinking is an extended thinking content block. + BlockThinking = "thinking" + // BlockToolUse is a tool invocation block. + BlockToolUse = "tool_use" + // BlockToolResult is a tool execution result block. + BlockToolResult = "tool_result" +) + +// Claude API message roles. +const ( + // RoleUser is a user message. + RoleUser = "user" + // RoleAssistant is an assistant message. + RoleAssistant = "assistant" +) diff --git a/internal/config/claude/doc.go b/internal/config/claude/doc.go new file mode 100644 index 00000000..0b6487c0 --- /dev/null +++ b/internal/config/claude/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +// Package claude defines constants for Claude API content types, roles, and integration files. +package claude diff --git a/internal/config/claude/integ.go b/internal/config/claude/integ.go new file mode 100644 index 00000000..e74d5b3a --- /dev/null +++ b/internal/config/claude/integ.go @@ -0,0 +1,28 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package claude + +// Claude Code integration file names. +const ( + // Md is the Claude Code configuration file in the project root. + Md = "CLAUDE.md" + + // Settings is the Claude Code local settings file. + Settings = ".claude/settings.local.json" + // SettingsGolden is the golden image of the Claude Code settings. + SettingsGolden = ".claude/settings.golden.json" + + // GlobalSettings is the Claude Code global settings file. + // Located at ~/.claude/settings.json (not the project-local one). + GlobalSettings = "settings.json" + // InstalledPlugins is the Claude Code installed plugins registry. + // Located at ~/.claude/plugins/installed_plugins.json. + InstalledPlugins = "plugins/installed_plugins.json" + + // PluginID is the ctx plugin identifier in Claude Code. + PluginID = "ctx@activememory-ctx" +) diff --git a/internal/config/cli/attr.go b/internal/config/cli/attr.go new file mode 100644 index 00000000..762a7852 --- /dev/null +++ b/internal/config/cli/attr.go @@ -0,0 +1,23 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package cli + +// XML attribute name constants for context-update tag parsing. +const ( + // AttrType is the "type" attribute on a context-update tag. + AttrType = "type" + // AttrContext is the "context" attribute on a context-update tag. + AttrContext = "context" + // AttrLesson is the "lesson" attribute on a context-update tag. 
+ AttrLesson = "lesson" + // AttrApplication is the "application" attribute on a context-update tag. + AttrApplication = "application" + // AttrRationale is the "rationale" attribute on a context-update tag. + AttrRationale = "rationale" + // AttrConsequences is the "consequences" attribute on a context-update tag. + AttrConsequences = "consequences" +) diff --git a/internal/config/cli/cli.go b/internal/config/cli/cli.go new file mode 100644 index 00000000..462d7a45 --- /dev/null +++ b/internal/config/cli/cli.go @@ -0,0 +1,14 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package cli + +// AnnotationSkipInit is the cobra.Command annotation key that exempts +// a command from the PersistentPreRunE initialization guard. +const AnnotationSkipInit = "skipInitCheck" + +// AnnotationTrue is the canonical value for boolean cobra annotations. +const AnnotationTrue = "true" diff --git a/internal/config/cli/confirm.go b/internal/config/cli/confirm.go new file mode 100644 index 00000000..74890e32 --- /dev/null +++ b/internal/config/cli/confirm.go @@ -0,0 +1,15 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package cli + +// User confirmation input values. +const ( + // ConfirmShort is the short affirmative response for y/N prompts. + ConfirmShort = "y" + // ConfirmLong is the long affirmative response for y/N prompts. + ConfirmLong = "yes" +) diff --git a/internal/config/cli/doc.go b/internal/config/cli/doc.go new file mode 100644 index 00000000..f2eda404 --- /dev/null +++ b/internal/config/cli/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +// Package cli defines CLI annotation keys, XML attribute names, and confirmation input constants. +package cli diff --git a/internal/config/config_test.go b/internal/config/config_test.go index 62abc5fa..560d0411 100644 --- a/internal/config/config_test.go +++ b/internal/config/config_test.go @@ -7,126 +7,14 @@ package config import ( - "os" - "path/filepath" "testing" -) - -func TestUserInputToEntry(t *testing.T) { - tests := []struct { - input string - want string - }{ - // Task variations - {"task", EntryTask}, - {"tasks", EntryTask}, - {"Task", EntryTask}, - {"TASKS", EntryTask}, - - // Decision variations - {"decision", EntryDecision}, - {"decisions", EntryDecision}, - {"Decision", EntryDecision}, - {"DECISION", EntryDecision}, - - // Learning variations - {"learning", EntryLearning}, - {"learnings", EntryLearning}, - {"Learning", EntryLearning}, - {"LEARNINGS", EntryLearning}, - - // Convention variations - {"convention", EntryConvention}, - {"conventions", EntryConvention}, - {"Convention", EntryConvention}, - {"CONVENTIONS", EntryConvention}, - - // Unknown inputs - {"", EntryUnknown}, - {"unknown", EntryUnknown}, - {"foo", EntryUnknown}, - {"taskss", EntryUnknown}, - {"learn", EntryUnknown}, - } - - for _, tt := range tests { - t.Run(tt.input, func(t *testing.T) { - got := UserInputToEntry(tt.input) - if got != tt.want { - t.Errorf("UserInputToEntry(%q) = %q, want %q", tt.input, got, tt.want) - } - }) - } -} - -func TestRegExFromAttrName(t *testing.T) { - tests := []struct { - name string - attrName string - input string - wantMatch bool - wantValue string - }{ - { - name: "type attribute", - attrName: "type", - input: `type="task"`, - wantMatch: true, - wantValue: "task", - }, - { - name: "context attribute", - attrName: "context", - input: `context="some context here"`, - wantMatch: true, - wantValue: "some context here", - }, - { - name: "attribute in larger string", - attrName: "id", - input: ``, - wantMatch: 
true, - wantValue: "123", - }, - { - name: "no match", - attrName: "missing", - input: `type="task"`, - wantMatch: false, - wantValue: "", - }, - { - name: "empty value", - attrName: "empty", - input: `empty=""`, - wantMatch: true, - wantValue: "", - }, - } - - for _, tt := range tests { - t.Run(tt.name, func(t *testing.T) { - re := RegExFromAttrName(tt.attrName) - match := re.FindStringSubmatch(tt.input) - if tt.wantMatch { - if match == nil { - t.Errorf("expected match for %q in %q", tt.attrName, tt.input) - return - } - if len(match) < 2 { - t.Errorf("match has no capture group") - return - } - if match[1] != tt.wantValue { - t.Errorf("got value %q, want %q", match[1], tt.wantValue) - } - } else if match != nil { - t.Errorf("expected no match for %q in %q, got %v", tt.attrName, tt.input, match) - } - }) - } -} + "github.com/ActiveMemory/ctx/internal/config/ctx" + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/config/entry" + "github.com/ActiveMemory/ctx/internal/config/marker" + "github.com/ActiveMemory/ctx/internal/config/regex" +) func TestRegExEntryHeader(t *testing.T) { tests := []struct { @@ -167,7 +55,7 @@ func TestRegExEntryHeader(t *testing.T) { for _, tt := range tests { t.Run(tt.name, func(t *testing.T) { - match := RegExEntryHeader.FindStringSubmatch(tt.input) + match := regex.EntryHeader.FindStringSubmatch(tt.input) if tt.wantMatch { if match == nil { @@ -253,7 +141,7 @@ func TestRegExTask(t *testing.T) { for _, tt := range tests { t.Run(tt.name, func(t *testing.T) { - match := RegExTask.FindStringSubmatch(tt.input) + match := regex.Task.FindStringSubmatch(tt.input) if tt.wantMatch { if match == nil { @@ -291,7 +179,7 @@ func TestRegExTaskMultiline(t *testing.T) { - [ ] Third task ` - matches := RegExTaskMultiline.FindAllStringSubmatch(input, -1) + matches := regex.TaskMultiline.FindAllStringSubmatch(input, -1) if len(matches) != 5 { t.Errorf("expected 5 matches, got %d", len(matches)) @@ -323,9 +211,9 @@ 
func TestRegExPhase(t *testing.T) { for _, tt := range tests { t.Run(tt.input, func(t *testing.T) { - matched := RegExPhase.MatchString(tt.input) + matched := regex.Phase.MatchString(tt.input) if matched != tt.wantMatch { - t.Errorf("RegExPhase.MatchString(%q) = %v, want %v", tt.input, matched, tt.wantMatch) + t.Errorf("Phase.MatchString(%q) = %v, want %v", tt.input, matched, tt.wantMatch) } }) } @@ -359,7 +247,7 @@ func TestRegExTaskDoneTimestamp(t *testing.T) { for _, tt := range tests { t.Run(tt.name, func(t *testing.T) { - match := RegExTaskDoneTimestamp.FindStringSubmatch(tt.input) + match := regex.TaskDoneTimestamp.FindStringSubmatch(tt.input) if tt.wantMatch { if match == nil { @@ -409,7 +297,7 @@ func TestRegExPath(t *testing.T) { for _, tt := range tests { t.Run(tt.name, func(t *testing.T) { - match := RegExPath.FindStringSubmatch(tt.input) + match := regex.CodeFencePath.FindStringSubmatch(tt.input) if tt.wantMatch { if match == nil { @@ -427,17 +315,17 @@ func TestRegExPath(t *testing.T) { } func TestFileTypeMap(t *testing.T) { - // Verify FileType map contains expected mappings + // Verify ToCtxFile map contains expected mappings expected := map[string]string{ - EntryDecision: FileDecision, - EntryTask: FileTask, - EntryLearning: FileLearning, - EntryConvention: FileConvention, + entry.Decision: ctx.Decision, + entry.Task: ctx.Task, + entry.Learning: ctx.Learning, + entry.Convention: ctx.Convention, } - for entry, file := range expected { - if FileType[entry] != file { - t.Errorf("FileType[%q] = %q, want %q", entry, FileType[entry], file) + for ent, ctxFile := range expected { + if entry.ToCtxFile[ent] != ctxFile { + t.Errorf("ToCtxFile[%q] = %q, want %q", ent, entry.ToCtxFile[ent], ctxFile) } } } @@ -445,12 +333,12 @@ func TestFileTypeMap(t *testing.T) { func TestRequiredFiles(t *testing.T) { // Verify FilesRequired contains essential files required := map[string]bool{ - FileConstitution: false, - FileTask: false, - FileDecision: false, + 
ctx.Constitution: false, + ctx.Task: false, + ctx.Decision: false, } - for _, f := range FilesRequired { + for _, f := range ctx.FilesRequired { if _, ok := required[f]; ok { required[f] = true } @@ -464,42 +352,21 @@ func TestRequiredFiles(t *testing.T) { } func TestFileReadOrder(t *testing.T) { - // Verify FileReadOrder has expected files in order - if len(FileReadOrder) == 0 { - t.Error("FileReadOrder is empty") + // Verify ReadOrder has expected files in order + if len(ctx.ReadOrder) == 0 { + t.Error("ReadOrder is empty") } // Constitution should be first (most important) - if FileReadOrder[0] != FileConstitution { - t.Errorf("FileReadOrder[0] = %q, want %q (constitution should be first)", - FileReadOrder[0], FileConstitution) + if ctx.ReadOrder[0] != ctx.Constitution { + t.Errorf("ReadOrder[0] = %q, want %q (constitution should be first)", + ctx.ReadOrder[0], ctx.Constitution) } // Tasks should be second (what to work on) - if FileReadOrder[1] != FileTask { - t.Errorf("FileReadOrder[1] = %q, want %q (tasks should be second)", - FileReadOrder[1], FileTask) - } -} - -func TestEntryPlural(t *testing.T) { - tests := []struct { - entry string - want string - }{ - {EntryTask, "tasks"}, - {EntryDecision, "decisions"}, - {EntryLearning, "learnings"}, - {EntryConvention, "conventions"}, - } - - for _, tt := range tests { - t.Run(tt.entry, func(t *testing.T) { - got := EntryPlural[tt.entry] - if got != tt.want { - t.Errorf("EntryPlural[%q] = %q, want %q", tt.entry, got, tt.want) - } - }) + if ctx.ReadOrder[1] != ctx.Task { + t.Errorf("ReadOrder[1] = %q, want %q (tasks should be second)", + ctx.ReadOrder[1], ctx.Task) } } @@ -510,15 +377,15 @@ func TestConstants(t *testing.T) { got string want string }{ - {"DirContext", DirContext, ".context"}, - {"DirClaude", DirClaude, ".claude"}, - {"FileTask", FileTask, "TASKS.md"}, - {"FileDecision", FileDecision, "DECISIONS.md"}, - {"FileLearning", FileLearning, "LEARNINGS.md"}, - {"PrefixTaskUndone", PrefixTaskUndone, "- [ ]"}, - 
{"PrefixTaskDone", PrefixTaskDone, "- [x]"}, - {"IndexStart", IndexStart, ""}, - {"IndexEnd", IndexEnd, ""}, + {"Context", dir.Context, ".context"}, + {"Claude", dir.Claude, ".claude"}, + {"Task", ctx.Task, "TASKS.md"}, + {"Decision", ctx.Decision, "DECISIONS.md"}, + {"Learning", ctx.Learning, "LEARNINGS.md"}, + {"PrefixTaskUndone", marker.PrefixTaskUndone, "- [ ]"}, + {"PrefixTaskDone", marker.PrefixTaskDone, "- [x]"}, + {"IndexStart", marker.IndexStart, ""}, + {"IndexEnd", marker.IndexEnd, ""}, } for _, tt := range tests { @@ -529,37 +396,3 @@ func TestConstants(t *testing.T) { }) } } - -func TestInitialized_AllFilesPresent(t *testing.T) { - tmp := t.TempDir() - for _, f := range FilesRequired { - path := filepath.Join(tmp, f) - if writeErr := os.WriteFile(path, []byte("# "+f+"\n"), 0o600); writeErr != nil { - t.Fatalf("setup: %v", writeErr) - } - } - if !Initialized(tmp) { - t.Error("Initialized() = false, want true when all required files present") - } -} - -func TestInitialized_MissingFile(t *testing.T) { - tmp := t.TempDir() - // Create all but the last required file. - for _, f := range FilesRequired[:len(FilesRequired)-1] { - path := filepath.Join(tmp, f) - if writeErr := os.WriteFile(path, []byte("# "+f+"\n"), 0o600); writeErr != nil { - t.Fatalf("setup: %v", writeErr) - } - } - if Initialized(tmp) { - t.Error("Initialized() = true, want false when a required file is missing") - } -} - -func TestInitialized_EmptyDir(t *testing.T) { - tmp := t.TempDir() - if Initialized(tmp) { - t.Error("Initialized() = true, want false for empty directory") - } -} diff --git a/internal/config/content/doc.go b/internal/config/content/doc.go new file mode 100644 index 00000000..fa5ead8d --- /dev/null +++ b/internal/config/content/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +// Package content defines constants for content detection and validation. +package content diff --git a/internal/config/content/limit.go b/internal/config/content/limit.go new file mode 100644 index 00000000..724607c5 --- /dev/null +++ b/internal/config/content/limit.go @@ -0,0 +1,14 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package content + +// Content detection constants. +const ( + // MinLen is the minimum byte length for a file to be considered + // non-empty by the "effectively empty" heuristic. + MinLen = 20 +) diff --git a/internal/config/crypto/crypto.go b/internal/config/crypto/crypto.go new file mode 100644 index 00000000..560bd304 --- /dev/null +++ b/internal/config/crypto/crypto.go @@ -0,0 +1,15 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package crypto + +// Crypto constants. +const ( + // KeySize is the required key length in bytes (256 bits). + KeySize = 32 + // NonceSize is the GCM nonce length in bytes. + NonceSize = 12 +) diff --git a/internal/config/crypto/doc.go b/internal/config/crypto/doc.go new file mode 100644 index 00000000..b8295a28 --- /dev/null +++ b/internal/config/crypto/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package crypto defines constants for encryption key sizes, nonce lengths, and key file names. +package crypto diff --git a/internal/config/crypto/enc.go b/internal/config/crypto/enc.go new file mode 100644 index 00000000..5af07da4 --- /dev/null +++ b/internal/config/crypto/enc.go @@ -0,0 +1,13 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +package crypto + +// NotifyEnc is the encrypted webhook URL file. +const NotifyEnc = ".notify.enc" + +// ContextKey is the context encryption key file. +const ContextKey = ".ctx.key" diff --git a/internal/config/ctx/ctx.go b/internal/config/ctx/ctx.go new file mode 100644 index 00000000..85fd614b --- /dev/null +++ b/internal/config/ctx/ctx.go @@ -0,0 +1,67 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package ctx + +// `ctx` file name constants for the .context/ directory. +const ( + // Constitution contains inviolable rules for agents. + Constitution = "CONSTITUTION.md" + // Task contains current work items and their status. + Task = "TASKS.md" + // Convention contains code patterns and standards. + Convention = "CONVENTIONS.md" + // Architecture contains system structure documentation. + Architecture = "ARCHITECTURE.md" + // Decision contains architectural decisions with rationale. + Decision = "DECISIONS.md" + // Learning contains gotchas, tips, and lessons learned. + Learning = "LEARNINGS.md" + // Glossary contains domain terms and definitions. + Glossary = "GLOSSARY.md" + // AgentPlaybook contains the meta-instructions for using the + // context system. + AgentPlaybook = "AGENT_PLAYBOOK.md" + // Dependency contains project dependency documentation. + Dependency = "DEPENDENCIES.md" +) + +// ReadOrder defines the priority order for reading context files. +// +// The order follows a logical progression for AI agents: +// +// 1. CONSTITUTION — Inviolable rules. Must be loaded first so the agent +// knows what it cannot do before attempting anything. +// +// 2. TASKS — Current work items. What the agent should focus on. +// +// 3. CONVENTIONS — How to write code. Patterns and standards to follow. +// +// 4. ARCHITECTURE — System structure. Understanding of components and +// boundaries before making changes. 
+// +// 5. DECISIONS — Historical context. Why things are the way they are, +// to avoid re-debating settled decisions. +// +// 6. LEARNINGS — Gotchas and tips. Lessons from past work that inform +// current implementation. +// +// 7. GLOSSARY — Reference material. Domain terms and abbreviations for +// lookup as needed. +// +// 8. AGENT_PLAYBOOK — Meta instructions. How to use this context system. +// Loaded last because it's about the system itself, not the work. +// The agent should understand the content before the operating manual. +var ReadOrder = []string{ + Constitution, + Task, + Convention, + Architecture, + Decision, + Learning, + Glossary, + AgentPlaybook, +} diff --git a/internal/config/ctx/doc.go b/internal/config/ctx/doc.go new file mode 100644 index 00000000..d48cfe76 --- /dev/null +++ b/internal/config/ctx/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package ctx defines the canonical context file read order and required file lists. +package ctx diff --git a/internal/config/ctx/required.go b/internal/config/ctx/required.go new file mode 100644 index 00000000..e729e3e4 --- /dev/null +++ b/internal/config/ctx/required.go @@ -0,0 +1,17 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package ctx + +// FilesRequired lists the essential context files that must be present. +// +// These are the files created with `ctx init --minimal` and checked by +// drift detection for missing files. +var FilesRequired = []string{ + Constitution, + Task, + Decision, +} diff --git a/internal/config/dep/dep.go b/internal/config/dep/dep.go new file mode 100644 index 00000000..c59df382 --- /dev/null +++ b/internal/config/dep/dep.go @@ -0,0 +1,16 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? 
+// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package dep + +// Packages is used by sync to detect projects and suggest dependency documentation. +var Packages = map[string]string{ + "package.json": "Node.js dependencies", + "go.mod": "Go module dependencies", + "Cargo.toml": "Rust dependencies", + "requirements.txt": "Python dependencies", + "Gemfile": "Ruby dependencies", +} diff --git a/internal/config/dep/doc.go b/internal/config/dep/doc.go new file mode 100644 index 00000000..fcfc20e2 --- /dev/null +++ b/internal/config/dep/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package dep defines the dependency detection map used by ctx sync. +package dep diff --git a/internal/config/dir.go b/internal/config/dir.go deleted file mode 100644 index 2caf692d..00000000 --- a/internal/config/dir.go +++ /dev/null @@ -1,63 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package config - -// Directory path constants used throughout the application. -const ( - // DirArchive is the subdirectory for archived tasks within .context/. - DirArchive = "archive" - // DirClaude is the Claude Code configuration directory in the project root. - DirClaude = ".claude" - // DirClaudeHooks is the hooks subdirectory within .claude/. - DirClaudeHooks = ".claude/hooks" - // DirContext is the default context directory name. - DirContext = ".context" - // DirPrompts is the subdirectory for prompt templates within .context/. - DirPrompts = "prompts" - // DirJournal is the subdirectory for journal entries within .context/. - DirJournal = "journal" - // DirJournalSite is the journal static site output directory within .context/. 
- DirJournalSite = "journal-site" - // DirSessions is the subdirectory for session summaries within .context/. - DirSessions = "sessions" - // DirState is the subdirectory for project-scoped runtime state within .context/. - // Gitignored — ephemeral files (flags, markers) that hooks write and consume. - DirState = "state" - // DirSpecs is the project-root directory for formalized plans and feature specs. - DirSpecs = "specs" - // DirIdeas is the project-root directory for early-stage ideas and explorations. - DirIdeas = "ideas" - // DirMemory is the subdirectory for memory bridge files within .context/. - DirMemory = "memory" - // DirMemoryArchive is the archive subdirectory within .context/memory/. - DirMemoryArchive = "memory/archive" -) - -// GitignoreEntries lists the recommended .gitignore entries added by ctx init. -var GitignoreEntries = []string{ - ".context/journal/", - ".context/journal-site/", - ".context/journal-obsidian/", - ".context/logs/", - ".context/.ctx.key", - ".context/.context.key", - ".context/.scratchpad.key", - ".context/state/", - ".claude/settings.local.json", -} - -// Journal site output directories. -const ( - // JournalDirDocs is the docs subdirectory in the generated site. - JournalDirDocs = "docs" - // JournalDirTopics is the topics subdirectory in the generated site. - JournalDirTopics = "topics" - // JournalDirFiles is the key files subdirectory in the generated site. - JournalDirFiles = "files" - // JournalDirTypes is the session types subdirectory in the generated site. - JournalDirTypes = "types" -) diff --git a/internal/config/dir/dir.go b/internal/config/dir/dir.go new file mode 100644 index 00000000..0478a52d --- /dev/null +++ b/internal/config/dir/dir.go @@ -0,0 +1,57 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package dir + +// Directory path constants used throughout the application. 
+const ( + // Archive is the subdirectory for archived tasks within .context/. + Archive = "archive" + // Claude is the Claude Code configuration directory in the project root. + Claude = ".claude" + // Context is the default context directory name. + Context = ".context" + // HooksMessages is the subdirectory path for hook message overrides within .context/. + HooksMessages = "hooks/messages" + // Ideas is the project-root directory for early-stage ideas and explorations. + Ideas = "ideas" + // Journal is the subdirectory for journal entries within .context/. + Journal = "journal" + // JournalObsidian is the Obsidian export of journal entries within .context/. + JournalObsidian = "journal-obsidian" + // JournalSite is the journal static site output directory within .context/. + JournalSite = "journal-site" + // Logs is the subdirectory name for log files within the context directory. + Logs = "logs" + // Memory is the subdirectory for memory bridge files within .context/. + Memory = "memory" + // MemoryArchive is the archive subdirectory within .context/memory/. + MemoryArchive = "memory/archive" + // Projects is the projects subdirectory within .claude/. + Projects = "projects" + // Prompts is the subdirectory for prompt templates within .context/. + Prompts = "prompts" + // Sessions is the subdirectory for session summaries within .context/. + Sessions = "sessions" + // Specs is the project-root directory for formalized plans and feature specs. + Specs = "specs" + // State is the subdirectory for project-scoped runtime state within .context/. + State = "state" + // CtxData is the user-level ctx data directory (~/.ctx/). + CtxData = ".ctx" +) + +// Journal site output directories. +const ( + // JournalDocs is the docs subdirectory in the generated site. + JournalDocs = "docs" + // JournalTopics is the topics subdirectory in the generated site. + JournalTopics = "topics" + // JournalFiles is the key files subdirectory in the generated site. 
+ JournalFiles = "files" + // JournalTypes is the session types subdirectory in the generated site. + JournalTypes = "types" +) diff --git a/internal/config/dir/doc.go b/internal/config/dir/doc.go new file mode 100644 index 00000000..ca5ab1b4 --- /dev/null +++ b/internal/config/dir/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package dir defines directory path constants used throughout the application. +package dir diff --git a/internal/config/doctor/doc.go b/internal/config/doctor/doc.go new file mode 100644 index 00000000..0006bf65 --- /dev/null +++ b/internal/config/doctor/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package doctor defines check names and category constants for ctx doctor results. +package doctor diff --git a/internal/config/doctor/doctor.go b/internal/config/doctor/doctor.go new file mode 100644 index 00000000..a1ef08e0 --- /dev/null +++ b/internal/config/doctor/doctor.go @@ -0,0 +1,69 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package doctor + +// Doctor check name constants — used as Result.Name values. +const ( + // CheckContextInit identifies the context initialization check. + CheckContextInit = "context_initialized" + // CheckRequiredFiles identifies the required files check. + CheckRequiredFiles = "required_files" + // CheckCtxrcValidation identifies the .ctxrc validation check. + CheckCtxrcValidation = "ctxrc_validation" + // CheckDrift identifies the drift detection check. + CheckDrift = "drift" + // CheckPluginInstalled identifies the plugin installation check. 
+ CheckPluginInstalled = "plugin_installed" + // CheckPluginEnabledGlobal identifies the global plugin enablement check. + CheckPluginEnabledGlobal = "plugin_enabled_global" + // CheckPluginEnabledLocal identifies the local plugin enablement check. + CheckPluginEnabledLocal = "plugin_enabled_local" + // CheckPluginEnabled identifies the plugin enablement check (when neither scope is active). + CheckPluginEnabled = "plugin_enabled" + // CheckEventLogging identifies the event logging check. + CheckEventLogging = "event_logging" + // CheckWebhook identifies the webhook configuration check. + CheckWebhook = "webhook" + // CheckReminders identifies the pending reminders check. + CheckReminders = "reminders" + // CheckTaskCompletion identifies the task completion check. + CheckTaskCompletion = "task_completion" + // CheckContextSize identifies the context token size check. + CheckContextSize = "context_size" + // CheckContextFilePrefix is the prefix for per-file context size results. + CheckContextFilePrefix = "context_file_" + // CheckRecentEvents identifies the recent event log check. + CheckRecentEvents = "recent_events" + // CheckResourceMemory identifies the memory resource check. + CheckResourceMemory = "resource_memory" + // CheckResourceSwap identifies the swap resource check. + CheckResourceSwap = "resource_swap" + // CheckResourceDisk identifies the disk resource check. + CheckResourceDisk = "resource_disk" + // CheckResourceLoad identifies the load resource check. + CheckResourceLoad = "resource_load" +) + +// Doctor category constants — used as Result.Category values. +const ( + // CategoryStructure groups context directory and file checks. + CategoryStructure = "Structure" + // CategoryQuality groups drift and content quality checks. + CategoryQuality = "Quality" + // CategoryPlugin groups plugin installation and enablement checks. + CategoryPlugin = "Plugin" + // CategoryHooks groups hook configuration checks. 
+ CategoryHooks = "Hooks" + // CategoryState groups runtime state checks. + CategoryState = "State" + // CategorySize groups token size and budget checks. + CategorySize = "Size" + // CategoryResources groups system resource checks. + CategoryResources = "Resources" + // CategoryEvents groups event log checks. + CategoryEvents = "Events" +) diff --git a/internal/config/entry.go b/internal/config/entry.go deleted file mode 100644 index 1d703a54..00000000 --- a/internal/config/entry.go +++ /dev/null @@ -1,64 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package config - -import "strings" - -// Entry type constants for context updates. -// -// These are the canonical internal representations used in switch statements -// for routing add/update commands to the appropriate handler. -const ( - // EntryTask represents a task entry in TASKS.md. - EntryTask = "task" - // EntryDecision represents an architectural decision in DECISIONS.md. - EntryDecision = "decision" - // EntryLearning represents a lesson learned in LEARNINGS.md. - EntryLearning = "learning" - // EntryConvention represents a code pattern in CONVENTIONS.md. - EntryConvention = "convention" - // EntryComplete represents a task completion action (marks the task as done). - EntryComplete = "complete" - // EntryUnknown is returned when user input doesn't match any known type. - EntryUnknown = "unknown" -) - -// EntryPlural maps entry type constants to their plural forms. -// -// Used for user-facing messages (e.g., "no decisions found"). -var EntryPlural = map[string]string{ - EntryTask: "tasks", - EntryDecision: "decisions", - EntryLearning: "learnings", - EntryConvention: "conventions", -} - -// UserInputToEntry normalizes user input to a canonical entry type. -// -// Accepts both singular and plural forms (e.g., "task" or "tasks") and -// returns the canonical singular form. 
Matching is case-insensitive. -// Unknown inputs return EntryUnknown. -// -// Parameters: -// - s: User-provided entry type string -// -// Returns: -// - string: Canonical entry type constant (EntryTask, EntryDecision, etc.) -func UserInputToEntry(s string) string { - switch strings.ToLower(s) { - case "task", "tasks": - return EntryTask - case "decision", "decisions": - return EntryDecision - case "learning", "learnings": - return EntryLearning - case "convention", "conventions": - return EntryConvention - default: - return EntryUnknown - } -} diff --git a/internal/config/entry/doc.go b/internal/config/entry/doc.go new file mode 100644 index 00000000..c1dd840a --- /dev/null +++ b/internal/config/entry/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package entry defines entry type identifiers, field names, and file routing maps. +package entry diff --git a/internal/config/entry/entry.go b/internal/config/entry/entry.go new file mode 100644 index 00000000..1b28d5f7 --- /dev/null +++ b/internal/config/entry/entry.go @@ -0,0 +1,52 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package entry + +import "strings" + +// Entry type constants for context updates. +// +// These are the canonical internal representations used in switch statements +// for routing add/update commands to the appropriate handler. +const ( + // Task represents a task entry in TASKS.md. + Task = "task" + // Decision represents an architectural decision in DECISIONS.md. + Decision = "decision" + // Learning represents a lesson learned in LEARNINGS.md. + Learning = "learning" + // Convention represents a code pattern in CONVENTIONS.md. + Convention = "convention" + // Complete represents a task completion action (marks the task as done). 
+ Complete = "complete" + // Unknown is returned when user input doesn't match any known type. + Unknown = "unknown" +) + +// FromUserInput normalizes user input to a canonical entry type. +// +// Accepts singular and plural forms, case-insensitive. +// +// Parameters: +// - s: user-supplied type string (e.g. "tasks", "Decision") +// +// Returns: +// - string: canonical entry constant, or Unknown +func FromUserInput(s string) string { + switch strings.ToLower(s) { + case "task", "tasks": + return Task + case "decision", "decisions": + return Decision + case "learning", "learnings": + return Learning + case "convention", "conventions": + return Convention + default: + return Unknown + } +} diff --git a/internal/config/field.go b/internal/config/entry/field.go similarity index 98% rename from internal/config/field.go rename to internal/config/entry/field.go index aa737781..39d9c381 100644 --- a/internal/config/field.go +++ b/internal/config/entry/field.go @@ -6,7 +6,7 @@ // \ Copyright 2026-present Context contributors. // SPDX-License-Identifier: Apache-2.0 -package config +package entry // Field name constants for structured entry attributes. // diff --git a/internal/config/entry/map.go b/internal/config/entry/map.go new file mode 100644 index 00000000..aeea088a --- /dev/null +++ b/internal/config/entry/map.go @@ -0,0 +1,19 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package entry + +import ( + "github.com/ActiveMemory/ctx/internal/config/ctx" +) + +// ToCtxFile maps short names to actual file names. 
+var ToCtxFile = map[string]string{
+	Decision:   ctx.Decision,
+	Task:       ctx.Task,
+	Learning:   ctx.Learning,
+	Convention: ctx.Convention,
+}
diff --git a/internal/config/env/doc.go b/internal/config/env/doc.go
new file mode 100644
index 00000000..fe815983
--- /dev/null
+++ b/internal/config/env/doc.go
@@ -0,0 +1,8 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package env defines environment variable names and toggle values for ctx configuration.
+package env
diff --git a/internal/config/env/env.go b/internal/config/env/env.go
new file mode 100644
index 00000000..817865ac
--- /dev/null
+++ b/internal/config/env/env.go
@@ -0,0 +1,30 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package env
+
+// Environment variable names.
+const (
+	// Home is the environment variable for the user's home directory.
+	Home = "HOME"
+	// CtxDir is the environment variable for overriding the context directory.
+	CtxDir = "CTX_DIR"
+	// CtxTokenBudget is the environment variable for overriding the token budget.
+	CtxTokenBudget = "CTX_TOKEN_BUDGET" //nolint:gosec // G101: env var name, not a credential
+	// BackupSMBURL is the environment variable for the SMB share URL.
+	BackupSMBURL = "CTX_BACKUP_SMB_URL"
+	// BackupSMBSubdir is the environment variable for the SMB share subdirectory.
+	BackupSMBSubdir = "CTX_BACKUP_SMB_SUBDIR"
+	// SkipPathCheck is the environment variable that skips the PATH
+	// validation during init. Set to True in tests.
+	SkipPathCheck = "CTX_SKIP_PATH_CHECK"
+)
+
+// Environment toggle values.
+const (
+	// True is the canonical truthy value for environment variable toggles.
+ True = "1" +) diff --git a/internal/config/event/doc.go b/internal/config/event/doc.go new file mode 100644 index 00000000..64f3564c --- /dev/null +++ b/internal/config/event/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package event defines event log constants, display limits, and context-size event names. +package event diff --git a/internal/config/event/event.go b/internal/config/event/event.go new file mode 100644 index 00000000..651eeaa9 --- /dev/null +++ b/internal/config/event/event.go @@ -0,0 +1,19 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package event + +// Events display configuration. +const ( + // EventsMessageMaxLen is the maximum character length for event messages + // in human-readable output before truncation. + EventsMessageMaxLen = 60 + // EventsHookFallback is the placeholder displayed when no hook name + // can be determined from an event payload. + EventsHookFallback = "-" + // EventsTruncationSuffix is appended to truncated event messages. + EventsTruncationSuffix = "..." +) diff --git a/internal/config/event/log.go b/internal/config/event/log.go new file mode 100644 index 00000000..6c32544b --- /dev/null +++ b/internal/config/event/log.go @@ -0,0 +1,19 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package event + +// Event log constants for .context/state/ directory. +const ( + // FileEventLog is the current event log file. + FileEventLog = "events.jsonl" + // FileEventLogPrev is the rotated (previous) event log file. + FileEventLogPrev = "events.1.jsonl" + // EventLogMaxBytes is the size threshold for log rotation (1MB). 
+ EventLogMaxBytes = 1 << 20 + // HookLogMaxBytes is the size threshold for hook log rotation (1MB). + HookLogMaxBytes = 1 << 20 +) diff --git a/internal/config/event/size.go b/internal/config/event/size.go new file mode 100644 index 00000000..3641bd8a --- /dev/null +++ b/internal/config/event/size.go @@ -0,0 +1,19 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package event + +// Context-size event names. +const ( + // EventSuppressed is the event name for suppressed prompts. + EventSuppressed = "suppressed" + // EventSilent is the event name for silent (no-action) prompts. + EventSilent = "silent" + // EventCheckpoint is the event name for context checkpoint emissions. + EventCheckpoint = "checkpoint" + // EventWindowWarning is the event name for context window warning emissions. + EventWindowWarning = "window-warning" +) diff --git a/internal/config/file.go b/internal/config/file.go deleted file mode 100644 index d239374d..00000000 --- a/internal/config/file.go +++ /dev/null @@ -1,408 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package config - -import ( - "os" - "path/filepath" - "time" -) - -// AnnotationSkipInit is the cobra.Command annotation key that exempts -// a command from the PersistentPreRunE initialization guard. -const AnnotationSkipInit = "skipInitCheck" - -// CmdCompletion is the name of Cobra's built-in completion parent command. -const CmdCompletion = "completion" - -// Global CLI flag names. -const ( - FlagContextDir = "context-dir" - FlagNoColor = "no-color" // Retained for CLI compatibility - FlagAllowOutsideCwd = "allow-outside-cwd" -) - -// CLI flag prefixes for display formatting. 
-const ( - FlagPrefixShort = "-" - FlagPrefixLong = "--" -) - -// Add command flag names — used for both flag registration and error display. -const ( - FlagContext = "context" - FlagRationale = "rationale" - FlagConsequences = "consequences" - FlagLesson = "lesson" - FlagApplication = "application" - FlagPriority = "priority" - FlagSection = "section" - FlagFile = "file" -) - -// Initialized reports whether the context directory contains all required files. -func Initialized(contextDir string) bool { - for _, f := range FilesRequired { - if _, err := os.Stat(filepath.Join(contextDir, f)); err != nil { - return false - } - } - return true -} - -// File permission constants. -const ( - // PermFile is the standard permission for regular files (owner rw, others r). - PermFile = 0644 - // PermExec is the standard permission for directories and executable files. - PermExec = 0755 - // PermSecret is the permission for secret files (owner rw only). - PermSecret = 0600 -) - -// File extension constants. -const ( - // ExtMarkdown is the Markdown file extension. - ExtMarkdown = ".md" - // ExtJSONL is the JSON Lines file extension. - ExtJSONL = ".jsonl" -) - -// Common filenames. -const ( - // FilenameReadme is the standard README filename. - FilenameReadme = "README.md" - // FilenameIndex is the standard index filename for generated sites. - FilenameIndex = "index.md" -) - -// Journal site configuration. -const ( - // FileZensicalToml is the zensical site configuration filename. - FileZensicalToml = "zensical.toml" - // BinZensical is the zensical binary name. - BinZensical = "zensical" -) - -// Session defaults. -const ( - // DefaultSessionFilename is the fallback filename component when - // sanitization produces an empty string. - DefaultSessionFilename = "session" -) - -// Runtime configuration constants. -const ( - // FileContextRC is the optional runtime configuration file. - FileContextRC = ".ctxrc" -) - -// Environment configuration. 
-const ( - // EnvCtxDir is the environment variable for overriding the context directory. - EnvCtxDir = "CTX_DIR" - // EnvCtxTokenBudget is the environment variable for overriding the token budget. - EnvCtxTokenBudget = "CTX_TOKEN_BUDGET" //nolint:gosec // G101: env var name, not a credential - // EnvBackupSMBURL is the environment variable for the SMB share URL. - EnvBackupSMBURL = "CTX_BACKUP_SMB_URL" - // EnvBackupSMBSubdir is the environment variable for the SMB share subdirectory. - EnvBackupSMBSubdir = "CTX_BACKUP_SMB_SUBDIR" - // EnvSkipPathCheck is the environment variable that skips the PATH - // validation during init. Set to EnvTrue in tests. - EnvSkipPathCheck = "CTX_SKIP_PATH_CHECK" -) - -// Environment toggle values. -const ( - // EnvTrue is the canonical truthy value for environment variable toggles. - EnvTrue = "1" -) - -// User confirmation input values. -const ( - // ConfirmShort is the short affirmative response for y/N prompts. - ConfirmShort = "y" - // ConfirmLong is the long affirmative response for y/N prompts. - ConfirmLong = "yes" -) - -// Backup configuration. -const ( - // BackupDefaultSubdir is the default subdirectory on the SMB share. - BackupDefaultSubdir = "ctx-sessions" - // BackupMarkerFile is the state file touched on successful project backup. - BackupMarkerFile = "ctx-last-backup" -) - -// Date and time format constants. -const ( - // DateFormat is the canonical YYYY-MM-DD date layout for time.Parse. - DateFormat = "2006-01-02" - // DateTimeFormat is DateFormat with hours and minutes (HH:MM). - DateTimeFormat = "2006-01-02 15:04" - // DateTimePreciseFormat is DateFormat with hours, minutes, and seconds. - DateTimePreciseFormat = "2006-01-02 15:04:05" - // TimeFormat is the hours:minutes:seconds layout for timestamps. - TimeFormat = "15:04:05" - // TimestampCompact is the YYYYMMDD-HHMMSS layout used in entry headers - // and task timestamps (e.g., 2026-01-28-143022). 
- TimestampCompact = "2006-01-02-150405" -) - -// InclusiveUntilOffset is the duration added to an --until date to make -// it inclusive of the entire day (23:59:59). -const InclusiveUntilOffset = 24*time.Hour - time.Second - -// Parser configuration. -const ( - // ParserPeekLines is the number of lines to scan when detecting file format. - ParserPeekLines = 50 -) - -// Export configuration. -const ( - // MaxMessagesPerPart is the maximum number of messages per exported - // journal file. Sessions with more messages are split into multiple - // parts for browser performance. - MaxMessagesPerPart = 200 -) - -// Recall show/list display limits. -const ( - // PreviewMaxTurns is the maximum number of user turns shown in - // the conversation preview of recall show. - PreviewMaxTurns = 5 - // PreviewMaxTextLen is the maximum character length for a single - // turn in the conversation preview. - PreviewMaxTextLen = 100 - // SlugMaxLen is the maximum display length for session slugs in - // recall list output. - SlugMaxLen = 36 - // SessionIDShortLen is the prefix length for short session IDs - // in summary output. - SessionIDShortLen = 8 - // SessionIDHintLen is the prefix length for session IDs in - // disambiguation hints (longer than short for uniqueness). - SessionIDHintLen = 12 -) - -// Claude API content block types. -const ( - // ClaudeBlockText is a text content block. - ClaudeBlockText = "text" - // ClaudeBlockThinking is an extended thinking content block. - ClaudeBlockThinking = "thinking" - // ClaudeBlockToolUse is a tool invocation block. - ClaudeBlockToolUse = "tool_use" - // ClaudeBlockToolResult is a tool execution result block. - ClaudeBlockToolResult = "tool_result" -) - -// Claude API content block field keys. -const ( - // ClaudeFieldType is the block type discriminator key. - ClaudeFieldType = "type" - // ClaudeFieldText is the text content key. - ClaudeFieldText = "text" - // ClaudeFieldThinking is the thinking content key. 
- ClaudeFieldThinking = "thinking" - // ClaudeFieldName is the tool name key. - ClaudeFieldName = "name" - // ClaudeFieldInput is the tool input parameters key. - ClaudeFieldInput = "input" - // ClaudeFieldContent is the tool result content key. - ClaudeFieldContent = "content" -) - -// Claude API message roles. -const ( - // RoleUser is a user message. - RoleUser = "user" - // RoleAssistant is an assistant message. - RoleAssistant = "assistant" -) - -// Tool identifiers for session parsers. -const ( - // ToolClaudeCode is the tool identifier for Claude Code sessions. - ToolClaudeCode = "claude-code" - // ToolMarkdown is the tool identifier for Markdown session files. - ToolMarkdown = "markdown" -) - -// Claude Code integration file names. -const ( - // FileClaudeMd is the Claude Code configuration file in the project root. - FileClaudeMd = "CLAUDE.md" - // FilePromptMd is the session prompt file in the project root. - FilePromptMd = "PROMPT.md" - // FileImplementationPlan is the implementation plan file in the project root. - FileImplementationPlan = "IMPLEMENTATION_PLAN.md" - // FileSettings is the Claude Code local settings file. - FileSettings = ".claude/settings.local.json" - // FileSettingsGolden is the golden image of the Claude Code settings. - FileSettingsGolden = ".claude/settings.golden.json" - // FileMakefileCtx is the ctx-owned Makefile include for project root. - FileMakefileCtx = "Makefile.ctx" - - // FileGlobalSettings is the Claude Code global settings file. - // Located at ~/.claude/settings.json (not the project-local one). - FileGlobalSettings = "settings.json" - // FileInstalledPlugins is the Claude Code installed plugins registry. - // Located at ~/.claude/plugins/installed_plugins.json. - FileInstalledPlugins = "plugins/installed_plugins.json" - - // PluginID is the ctx plugin identifier in Claude Code. - PluginID = "ctx@activememory-ctx" -) - -// Context file name constants for .context/ directory. 
-const ( - // FileConstitution contains inviolable rules for agents. - FileConstitution = "CONSTITUTION.md" - // FileTask contains current work items and their status. - FileTask = "TASKS.md" - // FileConvention contains code patterns and standards. - FileConvention = "CONVENTIONS.md" - // FileArchitecture contains system structure documentation. - FileArchitecture = "ARCHITECTURE.md" - // FileDecision contains architectural decisions with rationale. - FileDecision = "DECISIONS.md" - // FileLearning contains gotchas, tips, and lessons learned. - FileLearning = "LEARNINGS.md" - // FileGlossary contains domain terms and definitions. - FileGlossary = "GLOSSARY.md" - // FileAgentPlaybook contains the meta-instructions for using the - // context system. - FileAgentPlaybook = "AGENT_PLAYBOOK.md" - // FileDependency contains project dependency documentation. - FileDependency = "DEPENDENCIES.md" -) - -// Journal state file. -const ( - // FileJournalState is the processing state file in .context/journal/. - FileJournalState = ".state.json" -) - -// Architecture mapping file constants for .context/ directory. -const ( - // FileDetailedDesign is the deep per-module architecture reference. - FileDetailedDesign = "DETAILED_DESIGN.md" - // FileMapTracking is the architecture mapping coverage state file. - FileMapTracking = "map-tracking.json" -) - -// Scratchpad file constants for .context/ directory. -const ( - // FileScratchpadEnc is the encrypted scratchpad file. - FileScratchpadEnc = "scratchpad.enc" - // FileScratchpadMd is the plaintext scratchpad file. - FileScratchpadMd = "scratchpad.md" - // FileContextKey is the context encryption key file. - FileContextKey = ".ctx.key" - // FileNotifyEnc is the encrypted webhook URL file. - FileNotifyEnc = ".notify.enc" -) - -// Reminder file constants for .context/ directory. -const ( - // FileReminders is the session-scoped reminders file. 
- FileReminders = "reminders.json" -) - -// Memory bridge file constants for .context/memory/ directory. -const ( - // FileMemorySource is the Claude Code auto memory filename. - FileMemorySource = "MEMORY.md" - // FileMemoryMirror is the raw copy of Claude Code's MEMORY.md. - FileMemoryMirror = "mirror.md" - // FileMemoryState is the sync/import tracking state file. - FileMemoryState = "memory-import.json" -) - -// PathMemoryMirror is the relative path from the project root to the -// memory mirror file. Constructed from directory and file constants. -var PathMemoryMirror = filepath.Join(DirContext, DirMemory, FileMemoryMirror) - -// Event log constants for .context/state/ directory. -const ( - // FileEventLog is the current event log file. - FileEventLog = "events.jsonl" - // FileEventLogPrev is the rotated (previous) event log file. - FileEventLogPrev = "events.1.jsonl" - // EventLogMaxBytes is the size threshold for log rotation (1MB). - EventLogMaxBytes = 1 << 20 - // LogMaxBytes is the size threshold for hook log rotation (1MB). - LogMaxBytes = 1 << 20 -) - -// FileType maps short names to actual file names. -var FileType = map[string]string{ - EntryDecision: FileDecision, - EntryTask: FileTask, - EntryLearning: FileLearning, - EntryConvention: FileConvention, -} - -// FilesRequired lists the essential context files that must be present. -// -// These are the files created with `ctx init --minimal` and checked by -// drift detection for missing files. -var FilesRequired = []string{ - FileConstitution, - FileTask, - FileDecision, -} - -// FileReadOrder defines the priority order for reading context files. -// -// The order follows a logical progression for AI agents: -// -// 1. CONSTITUTION — Inviolable rules. Must be loaded first so the agent -// knows what it cannot do before attempting anything. -// -// 2. TASKS — Current work items. What the agent should focus on. -// -// 3. CONVENTIONS — How to write code. Patterns and standards to follow. -// -// 4. 
ARCHITECTURE — System structure. Understanding of components and -// boundaries before making changes. -// -// 5. DECISIONS — Historical context. Why things are the way they are, -// to avoid re-debating settled decisions. -// -// 6. LEARNINGS — Gotchas and tips. Lessons from past work that inform -// current implementation. -// -// 7. GLOSSARY — Reference material. Domain terms and abbreviations for -// lookup as needed. -// -// 8. AGENT_PLAYBOOK — Meta instructions. How to use this context system. -// Loaded last because it's about the system itself, not the work. -// The agent should understand the content before the operating manual. -var FileReadOrder = []string{ - FileConstitution, - FileTask, - FileConvention, - FileArchitecture, - FileDecision, - FileLearning, - FileGlossary, - FileAgentPlaybook, -} - -// Packages maps dependency manifest files to their descriptions. -// -// Used by sync to detect projects and suggest dependency documentation. -var Packages = map[string]string{ - "package.json": "Node.js dependencies", - "go.mod": "Go module dependencies", - "Cargo.toml": "Rust dependencies", - "requirements.txt": "Python dependencies", - "Gemfile": "Ruby dependencies", -} diff --git a/internal/config/file/doc.go b/internal/config/file/doc.go new file mode 100644 index 00000000..802e05af --- /dev/null +++ b/internal/config/file/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package file defines file extension, filename, and profile constants used across ctx. +package file diff --git a/internal/config/file/ext.go b/internal/config/file/ext.go new file mode 100644 index 00000000..0aae1d82 --- /dev/null +++ b/internal/config/file/ext.go @@ -0,0 +1,17 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0
+
+package file
+
+// File extension constants.
+const (
+	// ExtMarkdown is the Markdown file extension.
+	ExtMarkdown = ".md"
+	// ExtTxt is the plain text file extension.
+	ExtTxt = ".txt"
+	// ExtJSONL is the JSON Lines file extension.
+	ExtJSONL = ".jsonl"
+)
diff --git a/internal/config/file/ignore.go b/internal/config/file/ignore.go
new file mode 100644
index 00000000..4b6c45f3
--- /dev/null
+++ b/internal/config/file/ignore.go
@@ -0,0 +1,26 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package file
+
+import (
+	"path"
+
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+)
+
+// Gitignore lists the recommended .gitignore entries added by ctx init.
+// Directory entries append "/" after path.Join, because path.Join cleans
+// its result and would strip a trailing slash passed as an element.
+var Gitignore = []string{
+	path.Join(dir.Context, dir.Journal) + "/",
+	path.Join(dir.Context, dir.JournalSite) + "/",
+	path.Join(dir.Context, dir.JournalObsidian) + "/",
+	path.Join(dir.Context, dir.Logs) + "/",
+	".context/.ctx.key",
+	".context/.context.key",
+	".context/.scratchpad.key",
+	".context/state/",
+	".claude/settings.local.json",
+}
diff --git a/internal/config/file/limit.go b/internal/config/file/limit.go
new file mode 100644
index 00000000..ccad0a29
--- /dev/null
+++ b/internal/config/file/limit.go
@@ -0,0 +1,13 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package file
+
+// Filename limits.
+const (
+	// MaxNameLen is the maximum character length for sanitized filename components.
+	MaxNameLen = 50
+)
diff --git a/internal/config/file/name.go b/internal/config/file/name.go
new file mode 100644
index 00000000..9451c748
--- /dev/null
+++ b/internal/config/file/name.go
@@ -0,0 +1,15 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package file
+
+// Common filenames.
+const (
+	// Readme is the standard README filename.
+	Readme = "README.md"
+	// Index is the standard index filename for generated sites.
+	Index = "index.md"
+)
diff --git a/internal/config/file/runtime.go b/internal/config/file/runtime.go
new file mode 100644
index 00000000..84dd3947
--- /dev/null
+++ b/internal/config/file/runtime.go
@@ -0,0 +1,23 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package file
+
+// Profile file names and identifiers for .ctxrc management.
+const (
+	// CtxRC is the optional runtime configuration file.
+	CtxRC = ".ctxrc"
+	// CtxRCBase is the runtime configuration file for the base profile.
+	CtxRCBase = ".ctxrc.base"
+	// CtxRCDev is the runtime configuration file for the dev profile.
+	CtxRCDev = ".ctxrc.dev"
+	// ProfileDev is the development profile identifier.
+	ProfileDev = "dev"
+	// ProfileBase is the base profile identifier.
+	ProfileBase = "base"
+	// ProfileProd is the production profile identifier, an alias for ProfileBase.
+	ProfileProd = "prod"
+)
diff --git a/internal/config/flag/doc.go b/internal/config/flag/doc.go
new file mode 100644
index 00000000..9c8798f5
--- /dev/null
+++ b/internal/config/flag/doc.go
@@ -0,0 +1,8 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package flag defines CLI flag name constants for registration and error display.
+package flag
diff --git a/internal/config/flag/flag.go b/internal/config/flag/flag.go
new file mode 100644
index 00000000..d540e282
--- /dev/null
+++ b/internal/config/flag/flag.go
@@ -0,0 +1,31 @@
+// / ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package flag
+
+// Global CLI flag names.
+const (
+	ContextDir      = "context-dir"
+	AllowOutsideCwd = "allow-outside-cwd"
+)
+
+// CLI flag prefixes for display formatting.
+const (
+	PrefixShort = "-"
+	PrefixLong  = "--"
+)
+
+// Add command flag names — used for both flag registration and error display.
+const ( + Context = "context" + Rationale = "rationale" + Consequences = "consequences" + Lesson = "lesson" + Application = "application" + Priority = "priority" + Section = "section" + File = "file" +) diff --git a/internal/config/fmt/doc.go b/internal/config/fmt/doc.go new file mode 100644 index 00000000..cc8a6181 --- /dev/null +++ b/internal/config/fmt/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package fmt defines output format identifiers for CLI commands. +package fmt diff --git a/internal/config/fmt.go b/internal/config/fmt/fmt.go similarity index 70% rename from internal/config/fmt.go rename to internal/config/fmt/fmt.go index 8c31de6c..58b85ec5 100644 --- a/internal/config/fmt.go +++ b/internal/config/fmt/fmt.go @@ -4,7 +4,7 @@ // \ Copyright 2026-present Context contributors. // SPDX-License-Identifier: Apache-2.0 -package config +package fmt // Output format constants for CLI commands. const ( @@ -12,4 +12,8 @@ const ( FormatJSON = "json" // FormatMarkdown selects Markdown output. FormatMarkdown = "md" + // FormatMermaid selects Mermaid diagram output. + FormatMermaid = "mermaid" + // FormatTable selects plain text table output. + FormatTable = "table" ) diff --git a/internal/config/fs/doc.go b/internal/config/fs/doc.go new file mode 100644 index 00000000..f8f46cb3 --- /dev/null +++ b/internal/config/fs/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package fs defines file and directory permission constants. +package fs diff --git a/internal/config/fs/perm.go b/internal/config/fs/perm.go new file mode 100644 index 00000000..32ed7ed7 --- /dev/null +++ b/internal/config/fs/perm.go @@ -0,0 +1,21 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? 
+// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package fs + +// File permission constants. +const ( + // PermFile is the standard permission for regular files (owner rw, others r). + PermFile = 0644 + // PermExec is the standard permission for directories and executable files. + PermExec = 0755 + // PermRestrictedDir is the permission for internal directories (owner rwx, group rx). + PermRestrictedDir = 0750 + // PermSecret is the permission for secret files (owner rw only). + PermSecret = 0600 + // PermKeyDir is the permission for the user-level key directory (owner rwx only). + PermKeyDir = 0700 +) diff --git a/internal/config/heartbeat/doc.go b/internal/config/heartbeat/doc.go new file mode 100644 index 00000000..ca79ed69 --- /dev/null +++ b/internal/config/heartbeat/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package heartbeat defines state file prefixes and filenames for the heartbeat subsystem. +package heartbeat diff --git a/internal/config/heartbeat/heartbeat.go b/internal/config/heartbeat/heartbeat.go new file mode 100644 index 00000000..93716116 --- /dev/null +++ b/internal/config/heartbeat/heartbeat.go @@ -0,0 +1,19 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package heartbeat + +// Heartbeat state file prefixes. +const ( + // HeartbeatCounterPrefix is the state file prefix for per-session + // heartbeat prompt counters. + HeartbeatCounterPrefix = "heartbeat-" + // HeartbeatMtimePrefix is the state file prefix for per-session + // heartbeat context mtime tracking. + HeartbeatMtimePrefix = "heartbeat-mtime-" + // HeartbeatLogFile is the log filename for heartbeat events. 
+ HeartbeatLogFile = "heartbeat.log" +) diff --git a/internal/config/hook/decision.go b/internal/config/hook/decision.go new file mode 100644 index 00000000..47c30419 --- /dev/null +++ b/internal/config/hook/decision.go @@ -0,0 +1,13 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package hook + +// Hook decision constants — JSON values returned by PreToolUse hooks. +const ( + // HookDecisionBlock is the decision value that prevents tool execution. + HookDecisionBlock = "block" +) diff --git a/internal/config/hook/doc.go b/internal/config/hook/doc.go new file mode 100644 index 00000000..8fd2bddb --- /dev/null +++ b/internal/config/hook/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package hook defines hook names, event lifecycle stages, decision values, and prefixes. +package hook diff --git a/internal/config/hook/hook.go b/internal/config/hook/hook.go new file mode 100644 index 00000000..8d933e38 --- /dev/null +++ b/internal/config/hook/hook.go @@ -0,0 +1,65 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package hook + +// Hook name constants — used for LoadMessage, NewTemplateRef, notify.Send, +// and eventlog.Append to avoid magic strings. +const ( + // BlockDangerousCommands is the hook name for blocking dangerous commands. + BlockDangerousCommands = "block-dangerous-commands" + // BlockNonPathCtx is the hook name for blocking non-PATH ctx invocations. + BlockNonPathCtx = "block-non-path-ctx" + // CheckBackupAge is the hook name for backup staleness checks. + CheckBackupAge = "check-backup-age" + // CheckCeremonies is the hook name for ceremony usage checks. 
+	CheckCeremonies = "check-ceremonies"
+	// CheckContextSize is the hook name for context window size checks.
+	CheckContextSize = "check-context-size"
+	// CheckJournal is the hook name for journal health checks.
+	CheckJournal = "check-journal"
+	// CheckKnowledge is the hook name for knowledge file health checks.
+	CheckKnowledge = "check-knowledge"
+	// CheckMapStaleness is the hook name for architecture map staleness checks.
+	CheckMapStaleness = "check-map-staleness"
+	// CheckMemoryDrift is the hook name for memory drift checks.
+	CheckMemoryDrift = "check-memory-drift"
+	// CheckPersistence is the hook name for context persistence nudges.
+	CheckPersistence = "check-persistence"
+	// CheckReminders is the hook name for session reminder checks.
+	CheckReminders = "check-reminders"
+	// CheckResources is the hook name for resource usage checks.
+	CheckResources = "check-resources"
+	// CheckTaskCompletion is the hook name for task completion nudges.
+	CheckTaskCompletion = "check-task-completion"
+	// CheckVersion is the hook name for version mismatch checks.
+	CheckVersion = "check-version"
+	// Heartbeat is the hook name for session heartbeat events.
+	Heartbeat = "heartbeat"
+	// PostCommit is the hook name for post-commit nudges.
+	PostCommit = "post-commit"
+	// QAReminder is the hook name for QA reminder gates.
+	QAReminder = "qa-reminder"
+	// SpecsNudge is the hook name for specs directory nudges.
+	SpecsNudge = "specs-nudge"
+	// VersionDrift is the hook name for version drift nudges.
+	VersionDrift = "version-drift"
+)
+
+// State file prefixes.
+const (
+	// PrefixMemoryDriftThrottle is the state file prefix for per-session
+	// memory drift nudge tombstones.
+	PrefixMemoryDriftThrottle = "memory-drift-nudged-"
+)
+
+// Hook event names (Claude Code hook lifecycle stages).
+const (
+	// EventPreToolUse is the hook event for pre-tool-use hooks.
+	EventPreToolUse = "PreToolUse"
+	// EventPostToolUse is the hook event for post-tool-use hooks.
+ EventPostToolUse = "PostToolUse" +) diff --git a/internal/config/hook/notify.go b/internal/config/hook/notify.go new file mode 100644 index 00000000..16c1d997 --- /dev/null +++ b/internal/config/hook/notify.go @@ -0,0 +1,17 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package hook + +// Notification channel names. +const ( + // NotifyChannelHeartbeat is the notification channel for heartbeat events. + NotifyChannelHeartbeat = "heartbeat" + // NotifyChannelNudge is the notification channel for nudge messages. + NotifyChannelNudge = "nudge" + // NotifyChannelRelay is the notification channel for relay messages. + NotifyChannelRelay = "relay" +) diff --git a/internal/config/hook/variant.go b/internal/config/hook/variant.go new file mode 100644 index 00000000..27f0a57a --- /dev/null +++ b/internal/config/hook/variant.go @@ -0,0 +1,62 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package hook + +// Hook variant constants — template selectors passed to LoadMessage and +// NewTemplateRef to choose the appropriate message for each trigger type. +const ( + // VariantMidSudo selects the mid-command sudo block message. + VariantMidSudo = "mid-sudo" + // VariantMidGitPush selects the mid-command git push block message. + VariantMidGitPush = "mid-git-push" + // VariantCpToBin selects the cp/mv to bin block message. + VariantCpToBin = "cp-to-bin" + // VariantInstallToLocalBin selects the install to ~/.local/bin block message. + VariantInstallToLocalBin = "install-to-local-bin" + // VariantDotSlash selects the relative path (./ctx) block message. + VariantDotSlash = "dot-slash" + // VariantGoRun selects the go run block message. + VariantGoRun = "go-run" + // VariantAbsolutePath selects the absolute path block message. 
+	VariantAbsolutePath = "absolute-path"
+	// VariantBoth selects the template for both ceremonies missing.
+	VariantBoth = "both"
+	// VariantRemember selects the template for missing /ctx-remember.
+	VariantRemember = "remember"
+	// VariantWrapup selects the template for missing /ctx-wrap-up.
+	VariantWrapup = "wrapup"
+	// VariantUnexported selects the unexported journal entries variant.
+	VariantUnexported = "unexported"
+	// VariantUnenriched selects the unenriched journal entries variant.
+	VariantUnenriched = "unenriched"
+	// VariantWarning selects the generic warning variant.
+	VariantWarning = "warning"
+	// VariantAlert selects the alert variant.
+	VariantAlert = "alert"
+	// VariantBilling selects the billing threshold variant.
+	VariantBilling = "billing"
+	// VariantCheckpoint selects the checkpoint variant.
+	VariantCheckpoint = "checkpoint"
+	// VariantGate selects the gate variant.
+	VariantGate = "gate"
+	// VariantKeyRotation selects the key rotation variant.
+	VariantKeyRotation = "key-rotation"
+	// VariantMismatch selects the version mismatch variant.
+	VariantMismatch = "mismatch"
+	// VariantNudge selects the generic nudge variant.
+	VariantNudge = "nudge"
+	// VariantOversize selects the oversize threshold variant.
+	VariantOversize = "oversize"
+	// VariantPulse selects the heartbeat pulse variant.
+	VariantPulse = "pulse"
+	// VariantReminders selects the reminders variant.
+	VariantReminders = "reminders"
+	// VariantStale selects the staleness variant.
+	VariantStale = "stale"
+	// VariantWindow selects the context window variant.
+	VariantWindow = "window"
+)
diff --git a/internal/config/journal/check.go b/internal/config/journal/check.go
new file mode 100644
index 00000000..8ecc72a3
--- /dev/null
+++ b/internal/config/journal/check.go
@@ -0,0 +1,16 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package journal
+
+// Check-journal configuration.
+const (
+	// CheckJournalThrottleID is the state file name for daily throttle of journal checks.
+	CheckJournalThrottleID = "journal-reminded"
+	// CheckJournalClaudeProjectsSubdir is the relative path under $HOME to
+	// the Claude Code projects directory scanned for unexported sessions.
+	CheckJournalClaudeProjectsSubdir = ".claude/projects"
+)
diff --git a/internal/config/journal/doc.go b/internal/config/journal/doc.go
new file mode 100644
index 00000000..580f8327
--- /dev/null
+++ b/internal/config/journal/doc.go
@@ -0,0 +1,8 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package journal defines constants for journal export, site generation, and display limits.
+package journal
diff --git a/internal/config/journal/len.go b/internal/config/journal/len.go
new file mode 100644
index 00000000..3c2a1dad
--- /dev/null
+++ b/internal/config/journal/len.go
@@ -0,0 +1,26 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package journal
+
+// Recall show/list display limits.
+const (
+	// PreviewMaxTurns is the maximum number of user turns shown in
+	// the conversation preview of recall show.
+	PreviewMaxTurns = 5
+	// PreviewMaxTextLen is the maximum character length for a single
+	// turn in the conversation preview.
+	PreviewMaxTextLen = 100
+	// SlugMaxLen is the maximum display length for session slugs in
+	// recall list output.
+	SlugMaxLen = 36
+	// SessionIDShortLen is the prefix length for short session IDs
+	// in summary output.
+	SessionIDShortLen = 8
+	// SessionIDHintLen is the prefix length for session IDs in
+	// disambiguation hints (longer than short for uniqueness).
+	SessionIDHintLen = 12
+)
diff --git a/internal/config/journal/limit.go b/internal/config/journal/limit.go
new file mode 100644
index 00000000..4c554ff5
--- /dev/null
+++ b/internal/config/journal/limit.go
@@ -0,0 +1,39 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package journal
+
+// Journal site generation constants.
+const (
+	// PopularityThreshold is the minimum number of entries to
+	// mark a topic or key file as "popular" (gets its own dedicated page).
+	PopularityThreshold = 2
+	// LineWrapWidth is the soft wrap target column for journal
+	// content.
+	LineWrapWidth = 80
+	// MaxRecentSessions is the maximum number of sessions shown
+	// in the zensical navigation sidebar.
+	MaxRecentSessions = 20
+	// MaxNavTitleLen is the maximum title length before
+	// truncation in the zensical navigation sidebar.
+	MaxNavTitleLen = 40
+	// DatePrefixLen is the length of a YYYY-MM-DD date prefix.
+	DatePrefixLen = 10
+	// MonthPrefixLen is the length of a YYYY-MM month prefix.
+	MonthPrefixLen = 7
+	// TimePrefixLen is the length of an HH:MM time prefix.
+	TimePrefixLen = 5
+	// MaxTitleLen is the maximum character length for a journal title.
+	// Keeps H1 headings and link text on a single line (below wrap width).
+	MaxTitleLen = 75
+	// ShortIDLen is the truncation length for session IDs in filenames.
+	ShortIDLen = 8
+	// DetailsThreshold is the line count above which tool output is
+	// wrapped in a collapsible <details>
+	// block.
+	DetailsThreshold = 10
+	// DefaultRecallListLimit is the default number of sessions shown by recall list.
+	DefaultRecallListLimit = 20
+)
diff --git a/internal/config/journal/msg.go b/internal/config/journal/msg.go
new file mode 100644
index 00000000..ac31320d
--- /dev/null
+++ b/internal/config/journal/msg.go
@@ -0,0 +1,15 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package journal
+
+// Export configuration.
+const (
+	// MaxMessagesPerPart is the maximum number of messages per exported
+	// journal file. Sessions with more messages are split into multiple
+	// parts for browser performance.
+	MaxMessagesPerPart = 200
+)
diff --git a/internal/config/journal/stage.go b/internal/config/journal/stage.go
new file mode 100644
index 00000000..f8bfac77
--- /dev/null
+++ b/internal/config/journal/stage.go
@@ -0,0 +1,21 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package journal
+
+// Journal processing stage names.
+const (
+	// StageExported marks a journal entry as exported from Claude Code.
+	StageExported = "exported"
+	// StageEnriched marks a journal entry as enriched with metadata.
+	StageEnriched = "enriched"
+	// StageNormalized marks a journal entry as normalized for rendering.
+	StageNormalized = "normalized"
+	// StageFencesVerified marks a journal entry as having verified code fences.
+	StageFencesVerified = "fences_verified"
+	// StageLocked marks a journal entry as locked (read-only).
+	StageLocked = "locked"
+)
diff --git a/internal/config/journal/state.go b/internal/config/journal/state.go
new file mode 100644
index 00000000..cbc81ac8
--- /dev/null
+++ b/internal/config/journal/state.go
@@ -0,0 +1,13 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
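The journal display limits above (MaxTitleLen, ShortIDLen, SessionIDShortLen, SessionIDHintLen) are all simple prefix truncations. A minimal sketch of how such constants are typically applied, with the values copied locally so the snippet is self-contained (the real code would import the `journal` config package instead, and the helper names here are hypothetical):

```go
package main

import "fmt"

// Values mirrored from internal/config/journal for a self-contained sketch.
const (
	maxTitleLen       = 75
	sessionIDShortLen = 8
	sessionIDHintLen  = 12
)

// truncate clips s to max runes, marking the clip with an ellipsis.
func truncate(s string, max int) string {
	r := []rune(s)
	if len(r) <= max {
		return s
	}
	return string(r[:max-1]) + "…"
}

// shortID returns the n-character prefix used in summary output.
func shortID(id string, n int) string {
	if len(id) < n {
		return id
	}
	return id[:n]
}

func main() {
	id := "3f2a9c71-aaaa-bbbb-cccc-ddddeeeeffff"
	fmt.Println(shortID(id, sessionIDShortLen)) // 3f2a9c71
	fmt.Println(shortID(id, sessionIDHintLen))  // 3f2a9c71-aaa
	fmt.Println(truncate("A short title", maxTitleLen))
}
```

The hint length (12) trades a longer prefix for a lower collision chance when several session IDs share the first 8 characters.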
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package journal
+
+// Journal state file.
+const (
+	// FileState is the processing state file in .context/journal/.
+	FileState = ".state.json"
+)
diff --git a/internal/config/knowledge/doc.go b/internal/config/knowledge/doc.go
new file mode 100644
index 00000000..71fe82ae
--- /dev/null
+++ b/internal/config/knowledge/doc.go
@@ -0,0 +1,8 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package knowledge defines configuration constants for the knowledge hook.
+package knowledge
diff --git a/internal/config/knowledge/knowledge.go b/internal/config/knowledge/knowledge.go
new file mode 100644
index 00000000..706e2698
--- /dev/null
+++ b/internal/config/knowledge/knowledge.go
@@ -0,0 +1,13 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package knowledge
+
+// Knowledge hook configuration.
+const (
+	// KnowledgeThrottleID is the state file name for daily throttle of knowledge checks.
+	KnowledgeThrottleID = "check-knowledge"
+)
diff --git a/internal/config/limit.go b/internal/config/limit.go
deleted file mode 100644
index 31453cbc..00000000
--- a/internal/config/limit.go
+++ /dev/null
@@ -1,72 +0,0 @@
-// /     ctx: https://ctx.ist
-// ,'`./ do you remember?
-// `.,'\\
-//    \  Copyright 2026-present Context contributors.
-// SPDX-License-Identifier: Apache-2.0
-
-package config
-
-// MaxDecisionsToSummarize is the number of recent decisions to include
-// in summaries.
-const MaxDecisionsToSummarize = 3
-
-// MaxLearningsToSummarize is the number of recent learnings to include
-// in summaries.
-const MaxLearningsToSummarize = 5
-
-// MaxPreviewLen is the maximum length for preview lines before truncation.
-const MaxPreviewLen = 60
-
-// Content detection constants.
-const (
-	// MinContentLen is the minimum byte length for a file to be considered
-	// non-empty by the effectively-empty heuristic.
-	MinContentLen = 20
-)
-
-// Insight extraction constants.
-const (
-	// InsightMaxLen is the maximum character length for an extracted insight.
-	InsightMaxLen = 150
-	// InsightWordBoundaryMin is the minimum cut position when truncating
-	// at a word boundary.
-	InsightWordBoundaryMin = 100
-)
-
-// BinaryVersion holds the ctx binary version, set by bootstrap at startup.
-// Defaults to "dev" when not set (e.g., during tests).
-var BinaryVersion = "dev"
-
-// Recall/export constants.
-const (
-	// RecallMaxTitleLen is the maximum character length for a journal title.
-	// Keeps H1 headings and link text on a single line (below wrap width).
-	RecallMaxTitleLen = 75
-	// RecallShortIDLen is the truncation length for session IDs in filenames.
-	RecallShortIDLen = 8
-	// RecallDetailsThreshold is the line count above which tool output is
-	// wrapped in a collapsible <details>
-	// block.
-	RecallDetailsThreshold = 10
-)
-
-// Journal site generation constants.
-const (
-	// JournalPopularityThreshold is the minimum number of entries to
-	// mark a topic or key file as "popular" (gets its own dedicated page).
-	JournalPopularityThreshold = 2
-	// JournalLineWrapWidth is the soft wrap target column for journal
-	// content.
-	JournalLineWrapWidth = 80
-	// JournalMaxRecentSessions is the maximum number of sessions shown
-	// in the zensical navigation sidebar.
-	JournalMaxRecentSessions = 20
-	// JournalMaxNavTitleLen is the maximum title length before
-	// truncation in the zensical navigation sidebar.
-	JournalMaxNavTitleLen = 40
-	// JournalDatePrefixLen is the length of a YYYY-MM-DD date prefix.
-	JournalDatePrefixLen = 10
-	// JournalMonthPrefixLen is the length of a YYYY-MM month prefix.
-	JournalMonthPrefixLen = 7
-	// JournalTimePrefixLen is the length of an HH:MM time prefix.
-	JournalTimePrefixLen = 5
-)
diff --git a/internal/config/load_gate/doc.go b/internal/config/load_gate/doc.go
new file mode 100644
index 00000000..0cc3c885
--- /dev/null
+++ b/internal/config/load_gate/doc.go
@@ -0,0 +1,8 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package load_gate defines constants for the context load gate and auto-prune subsystem.
+package load_gate
diff --git a/internal/config/load_gate/load_gate.go b/internal/config/load_gate/load_gate.go
new file mode 100644
index 00000000..52775b66
--- /dev/null
+++ b/internal/config/load_gate/load_gate.go
@@ -0,0 +1,20 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package load_gate
+
+const (
+	// PrefixCtxLoaded is the filename prefix for session-loaded marker files.
+	PrefixCtxLoaded = "ctx-loaded-"
+	// EventContextLoadGate is the event name for context load gate hook events.
+	EventContextLoadGate = "context-load-gate"
+	// ContextLoadSeparatorChar is the character used for header/footer separators.
+	ContextLoadSeparatorChar = "="
+	// ContextLoadSeparatorWidth is the width of header/footer separator lines.
+	ContextLoadSeparatorWidth = 80
+	// ContextLoadIndexSuffix is the suffix appended to filenames for index entries.
+	ContextLoadIndexSuffix = " (idx)"
+)
diff --git a/internal/config/load_gate/prune.go b/internal/config/load_gate/prune.go
new file mode 100644
index 00000000..244182ce
--- /dev/null
+++ b/internal/config/load_gate/prune.go
@@ -0,0 +1,13 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package load_gate
+
+const (
+	// AutoPruneStaleDays is the number of days after which session state
+	// files are eligible for auto-pruning during context load.
+	AutoPruneStaleDays = 7
+)
diff --git a/internal/config/load_gate/ts.go b/internal/config/load_gate/ts.go
new file mode 100644
index 00000000..4a8efd6d
--- /dev/null
+++ b/internal/config/load_gate/ts.go
@@ -0,0 +1,12 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package load_gate
+
+const (
+	// JSONKeyTimestamp is the JSON key for timestamp extraction in event logs.
+	JSONKeyTimestamp = `"timestamp":"`
+)
diff --git a/internal/config/loop/doc.go b/internal/config/loop/doc.go
new file mode 100644
index 00000000..8be706cf
--- /dev/null
+++ b/internal/config/loop/doc.go
@@ -0,0 +1,8 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package loop defines configuration constants for loop script generation.
+package loop
diff --git a/internal/config/loop/loop.go b/internal/config/loop/loop.go
new file mode 100644
index 00000000..25098fb7
--- /dev/null
+++ b/internal/config/loop/loop.go
@@ -0,0 +1,13 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package loop
+
+// Loop script configuration.
+const (
+	// DefaultCompletionSignal is the default loop completion signal string.
+	DefaultCompletionSignal = "SYSTEM_CONVERGED"
+)
diff --git a/internal/config/loop/prompt.go b/internal/config/loop/prompt.go
new file mode 100644
index 00000000..55877c1e
--- /dev/null
+++ b/internal/config/loop/prompt.go
@@ -0,0 +1,10 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package loop
+
+// PromptMd is the session prompt file in the project root.
+const PromptMd = "PROMPT.md"
diff --git a/internal/config/marker/doc.go b/internal/config/marker/doc.go
new file mode 100644
index 00000000..69c11c33
--- /dev/null
+++ b/internal/config/marker/doc.go
@@ -0,0 +1,8 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package marker defines HTML comment markers for parsing and generating embedded context blocks.
+package marker
diff --git a/internal/config/marker/entry.go b/internal/config/marker/entry.go
new file mode 100644
index 00000000..e608dfe4
--- /dev/null
+++ b/internal/config/marker/entry.go
@@ -0,0 +1,7 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package marker
diff --git a/internal/config/marker.go b/internal/config/marker/marker.go
similarity index 81%
rename from internal/config/marker.go
rename to internal/config/marker/marker.go
index c95bdec4..7a5fcdb1 100644
--- a/internal/config/marker.go
+++ b/internal/config/marker/marker.go
@@ -4,7 +4,7 @@
 //    \  Copyright 2026-present Context contributors.
 // SPDX-License-Identifier: Apache-2.0
 
-package config
+package marker
 
 // HTML comment markers for parsing and generation.
 const (
@@ -30,6 +30,14 @@ const (
 	PromptMarkerEnd = ""
 )
 
+// Copilot block markers for .github/copilot-instructions.md.
+const (
+	// CopilotMarkerStart marks the beginning of ctx-managed Copilot content.
+	CopilotMarkerStart = ""
+	// CopilotMarkerEnd marks the end of ctx-managed Copilot content.
+	CopilotMarkerEnd = ""
+)
+
 // Plan block markers for IMPLEMENTATION_PLAN.md.
 const (
 	// PlanMarkerStart marks the beginning of the plan block.
@@ -59,6 +67,21 @@
 	MarkTaskComplete = "x"
 )
 
+// Published block markers for MEMORY.md.
+const (
+	// PublishMarkerStart begins the ctx-published block in MEMORY.md.
+	PublishMarkerStart = ""
+	// PublishMarkerEnd ends the ctx-published block in MEMORY.md.
+	PublishMarkerEnd = ""
+)
+
+// Entry status markers for knowledge files.
+const (
+	// PrefixSuperseded is the strikethrough prefix that marks an entry as
+	// superseded by a newer entry.
+	PrefixSuperseded = "~~Superseded"
+)
+
 // System reminder tags injected by Claude Code into tool results.
 const (
 	// TagSystemReminderOpen is the opening tag for system reminders.
diff --git a/internal/config/mcp/doc.go b/internal/config/mcp/doc.go
new file mode 100644
index 00000000..20acb209
--- /dev/null
+++ b/internal/config/mcp/doc.go
@@ -0,0 +1,8 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package mcp defines constants for the MCP server protocol integration.
+package mcp
diff --git a/internal/config/mcp/mcp.go b/internal/config/mcp/mcp.go
new file mode 100644
index 00000000..7b6273c3
--- /dev/null
+++ b/internal/config/mcp/mcp.go
@@ -0,0 +1,47 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package mcp
+
+// MCP constants.
+const (
+	// MCPResourceURIPrefix is the URI scheme prefix for MCP context resources.
+	MCPResourceURIPrefix = "ctx://context/"
+	// MimeMarkdown is the MIME type for Markdown content.
+	MimeMarkdown = "text/markdown"
+	// MCPScanMaxSize is the maximum scanner buffer size for MCP messages (1 MB).
+	MCPScanMaxSize = 1 << 20
+	// MCPMethodInitialize is the MCP initialize handshake method.
+	MCPMethodInitialize = "initialize"
+	// MCPMethodPing is the MCP ping method.
+	MCPMethodPing = "ping"
+	// MCPMethodResourcesList is the MCP method for listing resources.
+	MCPMethodResourcesList = "resources/list"
+	// MCPMethodResourcesRead is the MCP method for reading a resource.
+	MCPMethodResourcesRead = "resources/read"
+	// MCPMethodToolsList is the MCP method for listing tools.
+	MCPMethodToolsList = "tools/list"
+	// MCPMethodToolsCall is the MCP method for calling a tool.
+	MCPMethodToolsCall = "tools/call"
+	// MCPJSONRPCVersion is the JSON-RPC protocol version string.
+	MCPJSONRPCVersion = "2.0"
+	// MCPServerName is the server name reported during initialization.
+	MCPServerName = "ctx"
+	// MCPContentTypeText is the content type for text tool output.
+	MCPContentTypeText = "text"
+	// MCPSchemaObject is the JSON Schema type for objects.
+	MCPSchemaObject = "object"
+	// MCPSchemaString is the JSON Schema type for strings.
+	MCPSchemaString = "string"
+	// MCPToolStatus is the MCP tool name for context status.
+	MCPToolStatus = "ctx_status"
+	// MCPToolAdd is the MCP tool name for adding entries.
+	MCPToolAdd = "ctx_add"
+	// MCPToolComplete is the MCP tool name for completing tasks.
+	MCPToolComplete = "ctx_complete"
+	// MCPToolDrift is the MCP tool name for drift detection.
+	MCPToolDrift = "ctx_drift"
+)
diff --git a/internal/config/memory/doc.go b/internal/config/memory/doc.go
new file mode 100644
index 00000000..77f924a6
--- /dev/null
+++ b/internal/config/memory/doc.go
@@ -0,0 +1,8 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package memory defines constants for memory bridge files, publish budgets, and mirror paths.
+package memory
diff --git a/internal/config/memory/memory.go b/internal/config/memory/memory.go
new file mode 100644
index 00000000..e5d144b2
--- /dev/null
+++ b/internal/config/memory/memory.go
@@ -0,0 +1,27 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package memory
+
+import (
+	"path/filepath"
+
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+)
+
+// Memory bridge file constants for .context/memory/ directory.
+const (
+	// MemorySource is the Claude Code auto memory filename.
+	MemorySource = "MEMORY.md"
+	// MemoryMirror is the raw copy of Claude Code's MEMORY.md.
+	MemoryMirror = "mirror.md"
+	// MemoryState is the sync/import tracking state file.
+	MemoryState = "memory-import.json"
+)
+
+// PathMemoryMirror is the relative path from the project root to the
+// memory mirror file. Constructed from directory and file constants.
+var PathMemoryMirror = filepath.Join(dir.Context, dir.Memory, MemoryMirror)
diff --git a/internal/config/memory/mirror.go b/internal/config/memory/mirror.go
new file mode 100644
index 00000000..4d4bbed2
--- /dev/null
+++ b/internal/config/memory/mirror.go
@@ -0,0 +1,11 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package memory
+
+const (
+	PrefixMirror = "mirror-"
+)
diff --git a/internal/config/memory/publish.go b/internal/config/memory/publish.go
new file mode 100644
index 00000000..df710620
--- /dev/null
+++ b/internal/config/memory/publish.go
@@ -0,0 +1,23 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package memory
+
+// Publish budget and limits.
+const (
+	// DefaultPublishBudget is the default line budget for published content.
+	DefaultPublishBudget = 80
+	// PublishMaxTasks is the maximum number of pending tasks to publish.
+	PublishMaxTasks = 10
+	// PublishMaxDecisions is the maximum number of recent decisions to publish.
+	PublishMaxDecisions = 5
+	// PublishMaxConventions is the maximum number of convention items to publish.
+	PublishMaxConventions = 10
+	// PublishMaxLearnings is the maximum number of recent learnings to publish.
+	PublishMaxLearnings = 5
+	// PublishRecentDays is the lookback window in days for recent entries.
+	PublishRecentDays = 7
+)
diff --git a/internal/config/msg/doc.go b/internal/config/msg/doc.go
new file mode 100644
index 00000000..81992b7f
--- /dev/null
+++ b/internal/config/msg/doc.go
@@ -0,0 +1,8 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package msg defines formatting constants for hook message table display.
+package msg
diff --git a/internal/config/msg/msg.go b/internal/config/msg/msg.go
new file mode 100644
index 00000000..2a37ce17
--- /dev/null
+++ b/internal/config/msg/msg.go
@@ -0,0 +1,25 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package msg
+
+// Message table formatting.
+const (
+	// MessageColHook is the column width for the Hook field in message list output.
+	MessageColHook = 24
+	// MessageColVariant is the column width for the Variant field in message list output.
+	MessageColVariant = 20
+	// MessageColCategory is the column width for the Category field in message list output.
+	MessageColCategory = 16
+	// MessageSepHook is the separator width for the Hook column underline.
+	MessageSepHook = 22
+	// MessageSepVariant is the separator width for the Variant column underline.
+	MessageSepVariant = 18
+	// MessageSepCategory is the separator width for the Category column underline.
+	MessageSepCategory = 14
+	// MessageSepOverride is the separator width for the Override column underline.
+	MessageSepOverride = 8
+)
diff --git a/internal/config/nudge/doc.go b/internal/config/nudge/doc.go
new file mode 100644
index 00000000..d14ec260
--- /dev/null
+++ b/internal/config/nudge/doc.go
@@ -0,0 +1,8 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package nudge defines configuration constants for persistence and task-completion nudge hooks.
+package nudge
diff --git a/internal/config/nudge/persist.go b/internal/config/nudge/persist.go
new file mode 100644
index 00000000..e96267bc
--- /dev/null
+++ b/internal/config/nudge/persist.go
@@ -0,0 +1,32 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package nudge
+
+// Check-persistence configuration.
+const (
+	// PersistenceNudgePrefix is the state file prefix for per-session
+	// persistence nudge counters.
+	PersistenceNudgePrefix = "persistence-nudge-"
+	// PersistenceEarlyMin is the minimum prompt count before nudging begins.
+	PersistenceEarlyMin = 11
+	// PersistenceEarlyMax is the upper bound for the early nudge window.
+	PersistenceEarlyMax = 25
+	// PersistenceEarlyInterval is the number of prompts between nudges
+	// during the early window (prompts 11-25).
+	PersistenceEarlyInterval = 20
+	// PersistenceLateInterval is the number of prompts between nudges
+	// after the early window (prompts 25+).
+	PersistenceLateInterval = 15
+	// PersistenceLogFile is the log filename for persistence check events.
+	PersistenceLogFile = "check-persistence.log"
+	// PersistenceKeyCount is the state file key for prompt count.
+	PersistenceKeyCount = "count"
+	// PersistenceKeyLastNudge is the state file key for last nudge prompt number.
+	PersistenceKeyLastNudge = "last_nudge"
+	// PersistenceKeyLastMtime is the state file key for last modification time.
+	PersistenceKeyLastMtime = "last_mtime"
+)
diff --git a/internal/config/nudge/task.go b/internal/config/nudge/task.go
new file mode 100644
index 00000000..4349ad66
--- /dev/null
+++ b/internal/config/nudge/task.go
@@ -0,0 +1,14 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package nudge
+
+// Check-task-completion configuration.
+const (
+	// PrefixTask is the state file prefix for per-session
+	// task completion nudge counters.
+	PrefixTask = "task-nudge-"
+)
diff --git a/internal/config/obsidian/doc.go b/internal/config/obsidian/doc.go
new file mode 100644
index 00000000..d3680b6f
--- /dev/null
+++ b/internal/config/obsidian/doc.go
@@ -0,0 +1,8 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package obsidian defines directory, file, and MOC page constants for Obsidian vault generation.
+package obsidian
diff --git a/internal/config/obsidian/obsidian.go b/internal/config/obsidian/obsidian.go
new file mode 100644
index 00000000..b25843d4
--- /dev/null
+++ b/internal/config/obsidian/obsidian.go
@@ -0,0 +1,39 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package obsidian
+
+// Obsidian vault output directory constants.
+const (
+	// DirName is the default output directory for the Obsidian vault
+	// within .context/.
+	DirName = "journal-obsidian"
+	// DirEntries is the subdirectory for journal entry files.
+	DirEntries = "entries"
+	// DirConfig is the Obsidian configuration directory name.
+	DirConfig = ".obsidian"
+)
+
+// Obsidian file constants.
+const (
+	// AppConfigFile is the Obsidian app configuration filename.
+	AppConfigFile = "app.json"
+)
+
+// Obsidian MOC (Map of Content) page filenames.
+const (
+	// MOCPrefix is prepended to MOC filenames so they sort first
+	// in the Obsidian file explorer.
+	MOCPrefix = "_"
+	// MOCHome is the root navigation hub filename.
+	MOCHome = "Home.md"
+	// MOCTopics is the topics index MOC filename.
+	MOCTopics = "_Topics.md"
+	// MOCFiles is the key files index MOC filename.
+	MOCFiles = "_Key Files.md"
+	// MOCTypes is the session types index MOC filename.
+	MOCTypes = "_Session Types.md"
+)
diff --git a/internal/config/pad/blob.go b/internal/config/pad/blob.go
new file mode 100644
index 00000000..7fe53573
--- /dev/null
+++ b/internal/config/pad/blob.go
@@ -0,0 +1,17 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package pad
+
+// Scratchpad blob constants.
+const (
+	// BlobSep separates the label from the base64-encoded file content.
+	BlobSep = ":::"
+	// MaxBlobSize is the maximum file size (pre-encoding) allowed for blob entries.
+	MaxBlobSize = 64 * 1024
+	// BlobTag is the display tag appended to blob labels.
+	BlobTag = " [BLOB]"
+)
diff --git a/internal/config/pad/doc.go b/internal/config/pad/doc.go
new file mode 100644
index 00000000..24e2dccf
--- /dev/null
+++ b/internal/config/pad/doc.go
@@ -0,0 +1,8 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package pad defines constants for encrypted scratchpad blob storage and file management.
+package pad
diff --git a/internal/config/pad/pad.go b/internal/config/pad/pad.go
new file mode 100644
index 00000000..15d696ed
--- /dev/null
+++ b/internal/config/pad/pad.go
@@ -0,0 +1,15 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package pad
+
+// Scratchpad file constants for .context/ directory.
+const (
+	// Enc is the encrypted scratchpad file.
+	Enc = "scratchpad.enc"
+	// Md is the plaintext scratchpad file.
+	Md = "scratchpad.md"
+)
diff --git a/internal/config/parser/buf.go b/internal/config/parser/buf.go
new file mode 100644
index 00000000..cc9077ba
--- /dev/null
+++ b/internal/config/parser/buf.go
@@ -0,0 +1,14 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package parser
+
+const (
+	// BufInitSize is the initial scanner buffer size for session parsing (64 KB).
+	BufInitSize = 64 * 1024
+	// BufMaxSize is the maximum scanner buffer size for session parsing (1 MB).
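The pad blob constants above describe a `label:::<base64 content>` entry format with a 64 KB pre-encoding cap. A minimal sketch of the encoding side, with the values copied locally (the `encodeBlob` helper is illustrative, not the package's actual API):

```go
package main

import (
	"encoding/base64"
	"errors"
	"fmt"
)

// Values mirrored from internal/config/pad for a self-contained sketch.
const (
	blobSep     = ":::"
	maxBlobSize = 64 * 1024
)

// encodeBlob renders a scratchpad blob entry as "label:::<base64 of content>",
// rejecting content over the pre-encoding size limit.
func encodeBlob(label string, content []byte) (string, error) {
	if len(content) > maxBlobSize {
		return "", errors.New("blob too large")
	}
	return label + blobSep + base64.StdEncoding.EncodeToString(content), nil
}

func main() {
	s, err := encodeBlob("notes.txt", []byte("hi"))
	fmt.Println(s, err) // notes.txt:::aGk= <nil>
}
```

Checking the size before encoding matters because base64 inflates the payload by roughly a third; the cap bounds the stored entry predictably.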
+	BufMaxSize = 1024 * 1024
+)
diff --git a/internal/config/parser/doc.go b/internal/config/parser/doc.go
new file mode 100644
index 00000000..8afed569
--- /dev/null
+++ b/internal/config/parser/doc.go
@@ -0,0 +1,8 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package parser defines buffer sizes and directory constants for session parsing.
+package parser
diff --git a/internal/config/parser/parser.go b/internal/config/parser/parser.go
new file mode 100644
index 00000000..0ae28179
--- /dev/null
+++ b/internal/config/parser/parser.go
@@ -0,0 +1,16 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+package parser
+
+// Parser configuration.
+const (
+	// LinesToPeek is the number of lines to scan when detecting file format.
+	LinesToPeek = 50
+	// DirSubagents is the directory name for sidechain sessions that share
+	// the parent sessionId and would cause duplicates if scanned.
+	DirSubagents = "subagents"
+)
diff --git a/internal/config/project/doc.go b/internal/config/project/doc.go
new file mode 100644
index 00000000..6e997a11
--- /dev/null
+++ b/internal/config/project/doc.go
@@ -0,0 +1,8 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0
+
+// Package project defines project-root file constants outside the .context/ directory.
+package project
diff --git a/internal/config/project/project.go b/internal/config/project/project.go
new file mode 100644
index 00000000..e34cf391
--- /dev/null
+++ b/internal/config/project/project.go
@@ -0,0 +1,15 @@
+// /     ctx: https://ctx.ist
+// ,'`./ do you remember?
+// `.,'\
+//    \  Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0 + +package project + +// Project-root file constants (not inside .context/). +const ( + // ImplementationPlan is the high-level project direction file. + ImplementationPlan = "IMPLEMENTATION_PLAN.md" + // MakefileCtx is the ctx-owned Makefile include for project root. + MakefileCtx = "Makefile.ctx" +) diff --git a/internal/config/regex.go b/internal/config/regex.go deleted file mode 100644 index 4b6326f5..00000000 --- a/internal/config/regex.go +++ /dev/null @@ -1,182 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package config - -import "regexp" - -// RegExEntryHeader matches entry headers like "## [2026-01-28-051426] Title here". -// -// Groups: -// - 1: date (YYYY-MM-DD) -// - 2: time (HHMMSS) -// - 3: title -var RegExEntryHeader = regexp.MustCompile( - `## \[(\d{4}-\d{2}-\d{2})-(\d{6})] (.+)`, -) - -// RegExLineNumber matches Claude Code's line number prefixes like " 1→". -var RegExLineNumber = regexp.MustCompile(`(?m)^\s*\d+→`) - -// RegExSystemReminder matches <system-reminder>...</system-reminder> blocks. -// These are injected by Claude Code into tool results. -// Groups: -// - 1: content between tags -var RegExSystemReminder = regexp.MustCompile(`(?s)<system-reminder>\s*(.*?)\s*</system-reminder>`) - -// RegExCodeFenceInline matches code fences that appear inline after text. -// E.g., "some text: ```code" where fence should be on its own line. -// Groups: -// - 1: preceding non-whitespace character -// - 2: the code fence (3+ backticks) -var RegExCodeFenceInline = regexp.MustCompile("(\\S) *(```+)") - -// RegExCodeFenceClose matches code fences immediately followed by text. -// E.g., "```text" where text should be on its own line after the fence.
-// Groups: -// - 1: the code fence (3+ backticks) -// - 2: following non-whitespace character -var RegExCodeFenceClose = regexp.MustCompile("(```+) *(\\S)") - -// RegExPhase matches phase headers at any heading level (e.g., "## Phase 1", "### Phase"). -var RegExPhase = regexp.MustCompile(`^#{1,6}\s+Phase`) - -// RegExBulletItem matches any Markdown bullet item (not just tasks). -// -// Groups: -// - 1: item content -var RegExBulletItem = regexp.MustCompile(`(?m)^-\s*(.+)$`) - -// RegExDecision matches decision entry headers in multiline content. -// Use for finding decision positions without capturing groups. -var RegExDecision = regexp.MustCompile(`(?m)^## \[\d{4}-\d{2}-\d{2}-\d{6}].*$`) - -// RegExLearning matches learning entry headers in multiline content. -// Use for finding learning positions without capturing groups. -var RegExLearning = regexp.MustCompile(`(?m)^- \*\*\[\d{4}-\d{2}-\d{2}]\*\*.*$`) - -// RegExNonFileNameChar matches characters not allowed in file names. -var RegExNonFileNameChar = regexp.MustCompile(`[^a-zA-Z0-9-]+`) - -// RegExEntryHeading matches any entry heading (## [timestamp]). -// Use for counting entries without capturing groups. -var RegExEntryHeading = regexp.MustCompile(`(?m)^## \[`) - -// RegExPath matches file paths in Markdown backticks. -// -// Groups: -// - 1: file path -var RegExPath = regexp.MustCompile("`([^`]+\\.[a-zA-Z]{1,5})`") - -// RegExContextUpdate matches context-update XML tags. -// -// Groups: -// - 1: opening tag attributes (e.g., ` type="task" context="..."`) -// - 2: content between tags -var RegExContextUpdate = regexp.MustCompile(`<context-update([^>]+)>([^<]+)</context-update>`) - -// RegExGlossary matches glossary definition entries (lines with **term**). -var RegExGlossary = regexp.MustCompile(`(?m)(?:^|\n)\s*(?:-\s*)?\*\*[^*]+\*\*`) - -// RegExDecisionPatterns detects decision-like phrases in text.
-var RegExDecisionPatterns = []*regexp.Regexp{ - regexp.MustCompile(`(?i)decided to\s+(.{20,100})`), - regexp.MustCompile(`(?i)decision:\s*(.{20,100})`), - regexp.MustCompile(`(?i)we('ll| will) use\s+(.{10,80})`), - regexp.MustCompile(`(?i)going with\s+(.{10,80})`), - regexp.MustCompile(`(?i)chose\s+(.{10,80})\s+(over|instead)`), -} - -// RegExLearningPatterns detects learning-like phrases in text. -var RegExLearningPatterns = []*regexp.Regexp{ - regexp.MustCompile(`(?i)learned that\s+(.{20,100})`), - regexp.MustCompile(`(?i)gotcha:\s*(.{20,100})`), - regexp.MustCompile(`(?i)lesson:\s*(.{20,100})`), - regexp.MustCompile(`(?i)TIL:?\s*(.{20,100})`), - regexp.MustCompile(`(?i)turns out\s+(.{20,100})`), - regexp.MustCompile(`(?i)important to (note|remember):\s*(.{20,100})`), -} - -// regExTaskPattern captures indent, checkbox state, and content. -// -// Pattern: ^(\s*)-\s*\[([x ]?)]\s*(.+)$ -// -// Groups: -// - 1: indent (leading whitespace, may be empty) -// - 2: state ("x" for completed, " " or "" for pending) -// - 3: content (task text) -const regExTaskPattern = `^(\s*)-\s*\[([x ]?)]\s*(.+)$` - -// RegExTask matches a task item on a single line. -// -// Use with MatchString or FindStringSubmatch on individual lines. -// For multiline content, use RegExTaskMultiline. -var RegExTask = regexp.MustCompile(regExTaskPattern) - -// RegExTaskMultiline matches task items across multiple lines. -// -// Use with FindAllStringSubmatch on multiline content. -var RegExTaskMultiline = regexp.MustCompile(`(?m)` + regExTaskPattern) - -// RegExTaskDoneTimestamp extracts the #done: timestamp from a task line. -// -// Groups: -// - 1: timestamp (YYYY-MM-DD-HHMMSS) -var RegExTaskDoneTimestamp = regexp.MustCompile(`#done:(\d{4}-\d{2}-\d{2}-\d{6})`) - -// RegExClaudeTag matches Claude Code internal markup tags that leak into -// session titles via the first user message. This MUST remain an allowlist -// of known Claude Code tags — do NOT replace with a blanket regex. 
-var RegExClaudeTag = regexp.MustCompile(``) - -// Journal site pipeline patterns. - -// RegExMultiPart matches session part files like "...-p2.md", "...-p3.md", etc. -var RegExMultiPart = regexp.MustCompile(`-p\d+\.md$`) - -// RegExGlobStar matches glob-like wildcards: *.ext, */, *) etc. -var RegExGlobStar = regexp.MustCompile(`\*(\.\w+|[/)])`) - -// RegExToolBold matches tool-use lines like "🔧 **Glob: .context/journal/*.md**". -var RegExToolBold = regexp.MustCompile(`🔧\s*\*\*(.+?)\*\*`) - -// RegExInlineCodeAngle matches single-line inline code spans containing -// angle brackets (e.g., `][^`\n]*)`") - -// RegExTurnHeader matches conversation turn headers. -// -// Groups: -// - 1: turn number -// - 2: role (e.g. "Assistant", "Tool Output") -// - 3: timestamp (HH:MM:SS) -var RegExTurnHeader = regexp.MustCompile(`^### (\d+)\. (.+?) \((\d{2}:\d{2}:\d{2})\)$`) - -// RegExFenceLine matches lines that are code fence markers (3+ backticks or -// tildes, optionally followed by a language tag). -var RegExFenceLine = regexp.MustCompile("^\\s*(`{3,}|~{3,})(.*)$") - -// RegExMarkdownHeading matches Markdown heading lines (1-6 hashes + space). -// -// Groups: -// - 1: hash prefix (e.g., "##") -// - 2: heading text -var RegExMarkdownHeading = regexp.MustCompile(`^(#{1,6}) (.+)$`) - -// RegExListStart matches lines that begin an ordered or unordered list item. -var RegExListStart = regexp.MustCompile(`^(\d+\.|[-*]) `) - -// RegExFromAttrName creates a regex to extract an XML attribute value by name. 
-// -// Parameters: -// - name: The attribute name to match -// -// Returns: -// - *regexp.Regexp: Pattern matching name="value" with value in group 1 -func RegExFromAttrName(name string) *regexp.Regexp { - return regexp.MustCompile(name + `="([^"]*)"`) -} diff --git a/internal/config/regex/budget.go b/internal/config/regex/budget.go new file mode 100644 index 00000000..53074f0f --- /dev/null +++ b/internal/config/regex/budget.go @@ -0,0 +1,15 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package regex + +import "regexp" + +// OversizeTokens matches "Injected: NNNNN tokens" in the injection-oversize flag file. +// +// Groups: +// - 1: token count digits +var OversizeTokens = regexp.MustCompile(`Injected:\s+(\d+)\s+tokens`) diff --git a/internal/config/regex/claude.go b/internal/config/regex/claude.go new file mode 100644 index 00000000..bacfdc40 --- /dev/null +++ b/internal/config/regex/claude.go @@ -0,0 +1,12 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package regex + +import "regexp" + +// LineNumber matches Claude Code's line number prefixes like " 1→". +var LineNumber = regexp.MustCompile(`(?m)^\s*\d+→`) diff --git a/internal/config/regex/cmd.go b/internal/config/regex/cmd.go new file mode 100644 index 00000000..67ae6a0e --- /dev/null +++ b/internal/config/regex/cmd.go @@ -0,0 +1,21 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +package regex + +import "regexp" + +// MidSudo matches mid-command sudo after &&, ||, or ;. +var MidSudo = regexp.MustCompile(`(;|&&|\|\|)\s*sudo\s`) + +// MidGitPush matches mid-command git push after &&, ||, or ;. +var MidGitPush = regexp.MustCompile(`(;|&&|\|\|)\s*git\s+push`) + +// CpMvToBin matches cp/mv to bin directories. +var CpMvToBin = regexp.MustCompile(`(cp|mv)\s+\S+\s+(/usr/local/bin|/usr/bin|~/go/bin|~/.local/bin|/home/\S+/go/bin|/home/\S+/.local/bin)`) + +// InstallToLocalBin matches cp/install to ~/.local/bin. +var InstallToLocalBin = regexp.MustCompile(`(cp|install)\s.*~/\.local/bin`) diff --git a/internal/config/regex/ctx.go b/internal/config/regex/ctx.go new file mode 100644 index 00000000..07e862f8 --- /dev/null +++ b/internal/config/regex/ctx.go @@ -0,0 +1,27 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package regex + +import "regexp" + +// CtxGoRun matches go run ./cmd/ctx. +var CtxGoRun = regexp.MustCompile(`go run \./cmd/ctx`) + +// CtxAbsoluteStart matches absolute paths to ctx at start of command. +var CtxAbsoluteStart = regexp.MustCompile(`^\s*(/home/|/tmp/|/var/)\S*/ctx(\s|$)`) + +// AbsoluteSep matches absolute paths to ctx after command separator. +var AbsoluteSep = regexp.MustCompile(`(&&|;|\|\||\|)\s*(/home/|/tmp/|/var/)\S*/ctx(\s|$)`) + +// CtxTestException matches /tmp/ctx-test for integration test exemption. +var CtxTestException = regexp.MustCompile(`/tmp/ctx-test`) + +// CtxRelativeSep matches ./ctx or ./dist/ctx after command separator. +var CtxRelativeSep = regexp.MustCompile(`(&&|;|\|\||\|)\s*(\./ctx(\s|$)|\./dist/ctx)`) + +// CtxRelativeStart matches ./ctx or ./dist/ctx at start of command.
+var CtxRelativeStart = regexp.MustCompile(`^\s*(\./ctx(\s|$)|\./dist/ctx)`) diff --git a/internal/config/regex/doc.go b/internal/config/regex/doc.go new file mode 100644 index 00000000..d15ba28f --- /dev/null +++ b/internal/config/regex/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package regex defines compiled regular expressions for parsing context files and CLI output. +package regex diff --git a/internal/config/regex/entry.go b/internal/config/regex/entry.go new file mode 100644 index 00000000..21304e29 --- /dev/null +++ b/internal/config/regex/entry.go @@ -0,0 +1,27 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package regex + +import "regexp" + +// EntryHeader matches entry headers like "## [2026-01-28-051426] Title here". +// +// Groups: +// - 1: date (YYYY-MM-DD) +// - 2: time (HHMMSS) +// - 3: title +var EntryHeader = regexp.MustCompile( + `## \[(\d{4}-\d{2}-\d{2})-(\d{6})] (.+)`, +) + +// EntryHeaderGroups is the expected number of groups (including full +// match) returned by EntryHeader.FindStringSubmatch. +const EntryHeaderGroups = 4 + +// EntryHeading matches any entry heading (## [timestamp]). +// Use for counting entries without capturing groups. +var EntryHeading = regexp.MustCompile(`(?m)^## \[`) diff --git a/internal/config/regex/fence.go b/internal/config/regex/fence.go new file mode 100644 index 00000000..c650b93d --- /dev/null +++ b/internal/config/regex/fence.go @@ -0,0 +1,33 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package regex + +import "regexp" + +// CodeFenceInline matches code fences that appear inline after text. 
+// E.g., "some text: ```code" where fence should be on its own line. +// Groups: +// - 1: preceding non-whitespace character +// - 2: the code fence (3+ backticks) +var CodeFenceInline = regexp.MustCompile("(\\S) *(```+)") + +// CodeFenceClose matches code fences immediately followed by text. +// E.g., "```text" where text should be on its own line after the fence. +// Groups: +// - 1: the code fence (3+ backticks) +// - 2: following non-whitespace character +var CodeFenceClose = regexp.MustCompile("(```+) *(\\S)") + +// CodeFenceLine matches lines that are code fence markers (3+ backticks or +// tildes, optionally followed by a language tag). +var CodeFenceLine = regexp.MustCompile("^\\s*(`{3,}|~{3,})(.*)$") + +// CodeFencePath matches file paths in Markdown backticks. +// +// Groups: +// - 1: file path +var CodeFencePath = regexp.MustCompile("`([^`]+\\.[a-zA-Z]{1,5})`") diff --git a/internal/config/regex/file.go b/internal/config/regex/file.go new file mode 100644 index 00000000..4a8334e0 --- /dev/null +++ b/internal/config/regex/file.go @@ -0,0 +1,12 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package regex + +import "regexp" + +// FileNameChar matches characters not allowed in file names. +var FileNameChar = regexp.MustCompile(`[^a-zA-Z0-9-]+`) diff --git a/internal/config/regex/glossary.go b/internal/config/regex/glossary.go new file mode 100644 index 00000000..fb7cfa37 --- /dev/null +++ b/internal/config/regex/glossary.go @@ -0,0 +1,12 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package regex + +import "regexp" + +// Glossary matches glossary definition entries (lines with **term**). 
+var Glossary = regexp.MustCompile(`(?m)(?:^|\n)\s*(?:-\s*)?\*\*[^*]+\*\*`) diff --git a/internal/config/regex/markdown.go b/internal/config/regex/markdown.go new file mode 100644 index 00000000..c0e262fa --- /dev/null +++ b/internal/config/regex/markdown.go @@ -0,0 +1,50 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package regex + +import "regexp" + +// MarkdownHeading matches Markdown heading lines (1-6 hashes + space). +// +// Groups: +// - 1: hash prefix (e.g., "##") +// - 2: heading text +var MarkdownHeading = regexp.MustCompile(`^(#{1,6}) (.+)$`) + +// TurnHeader matches conversation turn headers. +// +// Groups: +// - 1: turn number +// - 2: role (e.g. "Assistant", "Tool Output") +// - 3: timestamp (HH:MM:SS) +var TurnHeader = regexp.MustCompile(`^### (\d+)\. (.+?) \((\d{2}:\d{2}:\d{2})\)$`) + +// ListStart matches lines that begin an ordered or unordered list item. +var ListStart = regexp.MustCompile(`^(\d+\.|[-*]) `) + +// MarkdownLink matches Markdown links with relative .md targets. +var MarkdownLink = regexp.MustCompile(`\[([^]]+)]\([^)]*\.md[^)]*\)`) + +// MarkdownImage matches Markdown image lines. +var MarkdownImage = regexp.MustCompile(`^\s*!\[.*]\(.*\)\s*$`) + +// ToolBold matches tool-use lines like "🔧 **Glob: .context/journal/*.md**". +var ToolBold = regexp.MustCompile(`🔧\s*\*\*(.+?)\*\*`) + +// InlineCodeAngle matches single-line inline code spans containing +// angle brackets (e.g., `][^`\n]*)`") + +// Phase matches phase headers at any heading level (e.g., "## Phase 1", "### Phase"). +var Phase = regexp.MustCompile(`^#{1,6}\s+Phase`) + +// BulletItem matches any Markdown bullet item (not just tasks). 
+// +// Groups: +// - 1: item content +var BulletItem = regexp.MustCompile(`(?m)^-\s*(.+)$`) diff --git a/internal/config/regex/page.go b/internal/config/regex/page.go new file mode 100644 index 00000000..82bc8d3e --- /dev/null +++ b/internal/config/regex/page.go @@ -0,0 +1,15 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package regex + +import "regexp" + +// MultiPart matches session part files like "...-p2.md", "...-p3.md", etc. +var MultiPart = regexp.MustCompile(`-p\d+\.md$`) + +// GlobStar matches glob-like wildcards: *.ext, */, *) etc. +var GlobStar = regexp.MustCompile(`\*(\.\w+|[/)])`) diff --git a/internal/config/regex/system.go b/internal/config/regex/system.go new file mode 100644 index 00000000..b1d93f36 --- /dev/null +++ b/internal/config/regex/system.go @@ -0,0 +1,27 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package regex + +import "regexp" + +// SystemContextUpdate matches context-update XML tags. +// +// Groups: +// - 1: opening tag attributes (e.g., ` type="task" context="..."`) +// - 2: content between tags +var SystemContextUpdate = regexp.MustCompile(`<context-update([^>]+)>([^<]+)</context-update>`) + +// SystemClaudeTag matches Claude Code internal markup tags that leak into +// session titles via the first user message. This MUST remain an allowlist +// of known Claude Code tags — do NOT replace with a blanket regex. +var SystemClaudeTag = regexp.MustCompile(``) + +// SystemReminder matches <system-reminder>...</system-reminder> blocks. +// These are injected by Claude Code into tool results.
+// Groups: + // - 1: content between tags +var SystemReminder = regexp.MustCompile(`(?s)<system-reminder>\s*(.*?)\s*</system-reminder>`) diff --git a/internal/config/regex/task.go b/internal/config/regex/task.go new file mode 100644 index 00000000..fd5a7cb8 --- /dev/null +++ b/internal/config/regex/task.go @@ -0,0 +1,42 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package regex + +import "regexp" + +// taskPattern captures indent, checkbox state, and content. +// +// Pattern: ^(\s*)-\s*\[([x ]?)]\s*(.+)$ +// +// Groups: +// - 1: indent (leading whitespace, may be empty) +// - 2: state ("x" for completed, " " or "" for pending) +// - 3: content (task text) +const taskPattern = `^(\s*)-\s*\[([x ]?)]\s*(.+)$` + +// Task matches a task item on a single line. +// +// Use with MatchString or FindStringSubmatch on individual lines. +// For multiline content, use TaskMultiline. +var Task = regexp.MustCompile(taskPattern) + +// TaskMultiline matches task items across multiple lines. +// +// Use with FindAllStringSubmatch on multiline content. +var TaskMultiline = regexp.MustCompile(`(?m)` + taskPattern) + +// TaskDoneTimestamp extracts the #done: timestamp from a task line. +// +// Groups: +// - 1: timestamp (YYYY-MM-DD-HHMMSS) +var TaskDoneTimestamp = regexp.MustCompile(`#done:(\d{4}-\d{2}-\d{2}-\d{6})`) + +// Task replacement strings. +const ( + // TaskCompleteReplace is the regex replacement string for marking a task done. + TaskCompleteReplace = "$1- [x] $3" +) diff --git a/internal/config/reminder/doc.go b/internal/config/reminder/doc.go new file mode 100644 index 00000000..6d1bef92 --- /dev/null +++ b/internal/config/reminder/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package reminder defines file constants for the session reminder subsystem.
+package reminder diff --git a/internal/config/reminder/reminder.go b/internal/config/reminder/reminder.go new file mode 100644 index 00000000..3e17c142 --- /dev/null +++ b/internal/config/reminder/reminder.go @@ -0,0 +1,13 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package reminder + +// Reminder file constants for .context/ directory. +const ( + // Reminders is the session-scoped reminders file. + Reminders = "reminders.json" +) diff --git a/internal/config/rss/doc.go b/internal/config/rss/doc.go new file mode 100644 index 00000000..b10a15b9 --- /dev/null +++ b/internal/config/rss/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package rss defines defaults and formatting constants for site feed generation. +package rss diff --git a/internal/config/rss/feed.go b/internal/config/rss/feed.go new file mode 100644 index 00000000..792303fa --- /dev/null +++ b/internal/config/rss/feed.go @@ -0,0 +1,27 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package rss + +import "github.com/ActiveMemory/ctx/internal/config/token" + +// Site feed defaults. +const ( + // DefaultFeedInputDir is the default blog source directory. + DefaultFeedInputDir = "docs/blog" + // DefaultFeedOutPath is the default output path for the Atom feed. + DefaultFeedOutPath = "site/feed.xml" + // DefaultFeedBaseURL is the default base URL for feed entry links. + DefaultFeedBaseURL = "https://ctx.ist" + // FeedAtomNS is the Atom XML namespace URI. + FeedAtomNS = "http://www.w3.org/2005/Atom" + // FeedTitle is the default feed title. + FeedTitle = "ctx blog" + // FeedDefaultAuthor is the default author for feed entries. 
+ FeedDefaultAuthor = "Jose Alekhinne" + // FeedXMLHeader is the XML declaration prepended to feed output. + FeedXMLHeader = `<?xml version="1.0" encoding="UTF-8"?>` + token.NewlineLF +) diff --git a/internal/config/runtime/doc.go b/internal/config/runtime/doc.go new file mode 100644 index 00000000..a5f3d3e6 --- /dev/null +++ b/internal/config/runtime/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package runtime defines default values for runtime configuration options overridable via .ctxrc. +package runtime diff --git a/internal/config/runtime/runtime.go b/internal/config/runtime/runtime.go new file mode 100644 index 00000000..606d62d3 --- /dev/null +++ b/internal/config/runtime/runtime.go @@ -0,0 +1,29 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package runtime + +// Runtime configuration defaults (overridable via .ctxrc). +const ( + // DefaultTokenBudget is the default token budget for context assembly. + DefaultTokenBudget = 8000 + // DefaultArchiveAfterDays is the default number of days before archiving completed tasks. + DefaultArchiveAfterDays = 7 + // DefaultEntryCountLearnings is the entry count threshold for LEARNINGS.md. + DefaultEntryCountLearnings = 30 + // DefaultEntryCountDecisions is the entry count threshold for DECISIONS.md. + DefaultEntryCountDecisions = 20 + // DefaultConventionLineCount is the line count threshold for CONVENTIONS.md. + DefaultConventionLineCount = 200 + // DefaultInjectionTokenWarn is the token threshold for oversize injection warning. + DefaultInjectionTokenWarn = 15000 + // DefaultContextWindow is the default context window size in tokens. + DefaultContextWindow = 200000 + // DefaultTaskNudgeInterval is the number of Edit/Write calls between task completion nudges.
+ DefaultTaskNudgeInterval = 5 + // DefaultKeyRotationDays is the days before encryption key rotation nudge. + DefaultKeyRotationDays = 90 +) diff --git a/internal/config/session/doc.go b/internal/config/session/doc.go new file mode 100644 index 00000000..f252e184 --- /dev/null +++ b/internal/config/session/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package session defines session metadata constants and tool identifiers for parsers. +package session diff --git a/internal/config/session/file.go b/internal/config/session/file.go new file mode 100644 index 00000000..28f7466e --- /dev/null +++ b/internal/config/session/file.go @@ -0,0 +1,13 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package session + +const ( + // DefaultSessionFilename is the fallback filename component when + // sanitization produces an empty string. + DefaultSessionFilename = "session" +) diff --git a/internal/config/session/session.go b/internal/config/session/session.go new file mode 100644 index 00000000..20e38dde --- /dev/null +++ b/internal/config/session/session.go @@ -0,0 +1,15 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package session + +// Session and template constants. +const ( + // IDUnknown is the fallback session ID when input lacks one. + IDUnknown = "unknown" + // TemplateName is the name used for Go text/template instances. + TemplateName = "msg" +) diff --git a/internal/config/session/tool.go b/internal/config/session/tool.go new file mode 100644 index 00000000..ff676970 --- /dev/null +++ b/internal/config/session/tool.go @@ -0,0 +1,15 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? 
+// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package session + +// Tool identifiers for session parsers. +const ( + // ToolClaudeCode is the tool identifier for Claude Code sessions. + ToolClaudeCode = "claude-code" + // ToolMarkdown is the tool identifier for Markdown session files. + ToolMarkdown = "markdown" +) diff --git a/internal/config/stats/context.go b/internal/config/stats/context.go new file mode 100644 index 00000000..57a12111 --- /dev/null +++ b/internal/config/stats/context.go @@ -0,0 +1,29 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package stats + +// PercentMultiplier is the multiplier for converting ratios to percentages. +const PercentMultiplier = 100 + +// Context size hook configuration. +const ( + // ContextSizeCounterPrefix is the state file prefix for per-session prompt counters. + ContextSizeCounterPrefix = "context-check-" + // ContextSizeLogFile is the log file name within .context/logs/. + ContextSizeLogFile = "check-context-size.log" + // ContextWindowThresholdPct is the percentage of context window usage + // that triggers an independent warning, regardless of prompt count. + ContextWindowThresholdPct = 80 + // ContextSizeBillingWarnedPrefix is the state file prefix for the one-shot billing warning guard. + ContextSizeBillingWarnedPrefix = "billing-warned-" + // ContextSizeInjectionOversizeFlag is the state file name for the injection-oversize one-shot flag. + ContextSizeInjectionOversizeFlag = "injection-oversize" + // JsonlPathCachePrefix is the state file prefix for cached JSONL file paths. + JsonlPathCachePrefix = "jsonl-path-" + // ContextSizeOversizeSepLen is the separator length for the oversize flag file header. 
+ ContextSizeOversizeSepLen = 35 +) diff --git a/internal/config/stats/doc.go b/internal/config/stats/doc.go new file mode 100644 index 00000000..d3bafd49 --- /dev/null +++ b/internal/config/stats/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package stats defines constants for context size monitoring, health checks, and display formatting. +package stats diff --git a/internal/config/stats/preview.go b/internal/config/stats/preview.go new file mode 100644 index 00000000..3898368f --- /dev/null +++ b/internal/config/stats/preview.go @@ -0,0 +1,10 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package stats + +// MaxPreviewLen is the maximum length for preview lines before truncation. +const MaxPreviewLen = 60 diff --git a/internal/config/stats/stats.go b/internal/config/stats/stats.go new file mode 100644 index 00000000..38497c06 --- /dev/null +++ b/internal/config/stats/stats.go @@ -0,0 +1,40 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package stats + +// Stats display configuration. +const ( + // FilePrefix is the filename prefix for per-session stats JSONL files. + FilePrefix = "stats-" + // ReadBufSize is the byte buffer size for reading new lines + // from stats files during follow/stream mode. + ReadBufSize = 8192 + // HeaderTime is the column header label for timestamp. + HeaderTime = "TIME" + // HeaderSession is the column header label for session ID. + HeaderSession = "SESSION" + // HeaderPrompt is the column header label for prompt count. + HeaderPrompt = "PROMPT" + // HeaderTokens is the column header label for token count. 
+ HeaderTokens = "TOKENS" + // HeaderPct is the column header label for percentage. + HeaderPct = "PCT" + // HeaderEvent is the column header label for the event type. + HeaderEvent = "EVENT" + // SepTime is the column separator for the time field. + SepTime = "-------------------" + // SepSession is the column separator for the session field. + SepSession = "--------" + // SepPrompt is the column separator for the prompt field. + SepPrompt = "------" + // SepTokens is the column separator for the tokens field. + SepTokens = "--------" + // SepPct is the column separator for the percentage field. + SepPct = "----" + // SepEvent is the column separator for the event field. + SepEvent = "------------" +) diff --git a/internal/config/stats/status.go b/internal/config/stats/status.go new file mode 100644 index 00000000..d33696bf --- /dev/null +++ b/internal/config/stats/status.go @@ -0,0 +1,15 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package stats + +// Check result status constants used by doctor, drift, and other health checks. +const ( + StatusOK = "ok" + StatusWarning = "warning" + StatusError = "error" + StatusInfo = "info" +) diff --git a/internal/config/stats/sysinfo.go b/internal/config/stats/sysinfo.go new file mode 100644 index 00000000..605a1df5 --- /dev/null +++ b/internal/config/stats/sysinfo.go @@ -0,0 +1,36 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package stats + +// Resources display formatting. +const ( + // ResourcesStatusCol is the column where the status indicator starts + // in the resources text output. + ResourcesStatusCol = 52 +) + +// Resource threshold constants for health evaluation. +const ( + // ThresholdMemoryWarnPct is the memory usage percentage that triggers a warning. 
+ ThresholdMemoryWarnPct = 80 + // ThresholdMemoryDangerPct is the memory usage percentage that triggers a danger alert. + ThresholdMemoryDangerPct = 90 + // ThresholdSwapWarnPct is the swap usage percentage that triggers a warning. + ThresholdSwapWarnPct = 50 + // ThresholdSwapDangerPct is the swap usage percentage that triggers a danger alert. + ThresholdSwapDangerPct = 75 + // ThresholdDiskWarnPct is the disk usage percentage that triggers a warning. + ThresholdDiskWarnPct = 85 + // ThresholdDiskDangerPct is the disk usage percentage that triggers a danger alert. + ThresholdDiskDangerPct = 95 + // ThresholdLoadWarnRatio is the load-to-CPU ratio that triggers a warning. + ThresholdLoadWarnRatio = 0.8 + // ThresholdLoadDangerRatio is the load-to-CPU ratio that triggers a danger alert. + ThresholdLoadDangerRatio = 1.5 + // ThresholdBytesPerGiB is the number of bytes in one gibibyte. + ThresholdBytesPerGiB = 1 << 30 +) diff --git a/internal/config/time/doc.go b/internal/config/time/doc.go new file mode 100644 index 00000000..37152726 --- /dev/null +++ b/internal/config/time/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package time defines date and time format layouts and duration constants. +package time diff --git a/internal/config/time/hours.go b/internal/config/time/hours.go new file mode 100644 index 00000000..1c954494 --- /dev/null +++ b/internal/config/time/hours.go @@ -0,0 +1,12 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package time + +const ( + // HoursPerDay is the number of hours in a day for duration calculations. 
+ HoursPerDay = 24 +) diff --git a/internal/config/time/time.go b/internal/config/time/time.go new file mode 100644 index 00000000..6490a3fb --- /dev/null +++ b/internal/config/time/time.go @@ -0,0 +1,32 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package time + +import "time" + +// Date and time format constants. +const ( + // DateFormat is the canonical YYYY-MM-DD date layout for time.Parse. + DateFormat = "2006-01-02" + // DateTimeFormat is DateFormat with hours and minutes (HH:MM). + DateTimeFormat = "2006-01-02 15:04" + // DateTimePreciseFormat is DateFormat with hours, minutes, and seconds. + DateTimePreciseFormat = "2006-01-02 15:04:05" + // Format is the hours:minutes:seconds layout for timestamps. + Format = "15:04:05" + // TimestampCompact is the YYYYMMDD-HHMMSS layout used in entry headers + // and task timestamps (e.g., 2026-01-28-143022). + TimestampCompact = "2006-01-02-150405" +) + +// InclusiveUntilOffset is the duration added to an --until date to make +// it inclusive of the entire day (23:59:59). +const InclusiveUntilOffset = 24*time.Hour - time.Second + +// OlderFormat is the Go time layout for dates older than a week. +// Exported because callers must format the fallback date before calling FormatTimeAgo. +const OlderFormat = "Jan 2, 2006" diff --git a/internal/config/token.go b/internal/config/token.go deleted file mode 100644 index e3f856e3..00000000 --- a/internal/config/token.go +++ /dev/null @@ -1,52 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package config - -const ( - // NewlineCRLF is the Windows new line. - // - // We check NewlineCRLF first, then NewlineLF to handle both formats. - NewlineCRLF = "\r\n" - // NewlineLF is Unix new line. 
- NewlineLF = "\n" - // Whitespace is the set of inline whitespace characters (space and tab). - Whitespace = " \t" - // Space is a single space character. - Space = " " - // Tab is a horizontal tab character. - Tab = "\t" - // Separator is a Markdown horizontal rule used between sections. - Separator = "---" - // Ellipsis is a Markdown ellipsis. - Ellipsis = "..." - // HeadingLevelOneStart is the Markdown heading for the first section. - HeadingLevelOneStart = "# " - // HeadingLevelTwoStart is the Markdown heading for subsequent sections. - HeadingLevelTwoStart = "## " - // CodeFence is the standard Markdown code fence delimiter. - CodeFence = "```" - // Backtick is a single backtick character. - Backtick = "`" - // PipeSeparator is the inline separator used between navigation links. - PipeSeparator = " | " - // LinkPrefixParent is the relative link prefix to the parent directory. - LinkPrefixParent = "../" - // PrefixHeading is the Markdown heading character used for prefix checks. - PrefixHeading = "#" - // PrefixBracket is the opening bracket used for placeholder checks. - PrefixBracket = "[" - // LoopComplete is the banner printed when the loop finishes. - LoopComplete = "=== Loop Complete ===" - // TomlNavOpen is the opening bracket for the TOML nav array. - TomlNavOpen = "nav = [" - // TomlNavSectionClose closes a nav section group. - TomlNavSectionClose = " ]}" - // TomlNavClose closes the top-level nav array. - TomlNavClose = "]" - // NudgeBoxBottom is the bottom border of a nudge/notification box. - NudgeBoxBottom = "└──────────────────────────────────────────────────" -) diff --git a/internal/config/token/delim.go b/internal/config/token/delim.go new file mode 100644 index 00000000..9f5cacab --- /dev/null +++ b/internal/config/token/delim.go @@ -0,0 +1,20 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +package token + +const ( + // Colon is the colon character used as a key-value separator. + Colon = ":" + // Dash is a hyphen used as a timestamp segment separator. + Dash = "-" + // KeyValueSep is the equals sign used as a key-value separator in state files. + KeyValueSep = "=" + // Separator is a Markdown horizontal rule used between sections. + Separator = "---" + // Ellipsis is a Markdown ellipsis. + Ellipsis = "..." +) diff --git a/internal/config/token/doc.go b/internal/config/token/doc.go new file mode 100644 index 00000000..45759cdd --- /dev/null +++ b/internal/config/token/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package token defines string tokens, delimiters, and content detection constants. +package token diff --git a/internal/config/token/fence.go b/internal/config/token/fence.go new file mode 100644 index 00000000..3728659d --- /dev/null +++ b/internal/config/token/fence.go @@ -0,0 +1,14 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package token + +const ( + // CodeFence is the standard Markdown code fence delimiter. + CodeFence = "```" + // Backtick is a single backtick character. + Backtick = "`" +) diff --git a/internal/config/token/heading.go b/internal/config/token/heading.go new file mode 100644 index 00000000..e0109bee --- /dev/null +++ b/internal/config/token/heading.go @@ -0,0 +1,16 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package token + +const ( + // HeadingLevelOneStart is the Markdown heading for the first section. + HeadingLevelOneStart = "# " + // HeadingLevelTwoStart is the Markdown heading for subsequent sections. 
+ HeadingLevelTwoStart = "## " + // HeadingLevelThreeStart is the Markdown heading level three. + HeadingLevelThreeStart = "### " +) diff --git a/internal/config/token/prefix.go b/internal/config/token/prefix.go new file mode 100644 index 00000000..b91ff04f --- /dev/null +++ b/internal/config/token/prefix.go @@ -0,0 +1,23 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package token + +const ( + // PrefixHeading is the Markdown heading character used for prefix checks. + PrefixHeading = "#" + // PrefixBracket is the opening bracket used for placeholder checks. + PrefixBracket = "[" + // PrefixListDash is the prefix for a dash list item. + PrefixListDash = "- " + // PrefixListStar is the prefix for a star list item. + PrefixListStar = "* " +) + +const ( + // LinkPrefixParent is the relative link prefix to the parent directory. + LinkPrefixParent = "../" +) diff --git a/internal/config/token/secret.go b/internal/config/token/secret.go new file mode 100644 index 00000000..7a1db87c --- /dev/null +++ b/internal/config/token/secret.go @@ -0,0 +1,17 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package token + +// SecretPatterns are filename substrings that indicate potential secret files. +var SecretPatterns = []string{ + ".env", + "credentials", + "secret", + "api_key", + "apikey", + "password", +} diff --git a/internal/config/token/sep.go b/internal/config/token/sep.go new file mode 100644 index 00000000..d13d4a01 --- /dev/null +++ b/internal/config/token/sep.go @@ -0,0 +1,14 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors.
+// SPDX-License-Identifier: Apache-2.0 + +package token + +// Content detection constants. +const ( + // MaxSeparatorLen is the maximum length of a line to be considered a + // Markdown separator (e.g. "---" or "----"). + MaxSeparatorLen = 5 +) diff --git a/internal/config/token/tpl.go b/internal/config/token/tpl.go new file mode 100644 index 00000000..0ce2817f --- /dev/null +++ b/internal/config/token/tpl.go @@ -0,0 +1,18 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package token + +// TemplateMarkers are content substrings that indicate a file is a template. +var TemplateMarkers = []string{ + "YOUR_", - if len(trimmed) > 0 && trimmed[0] == '-' && len(trimmed) < 5 { + if len(trimmed) > 0 && trimmed[0] == token.Dash[0] && len(trimmed) < token.MaxSeparatorLen { continue } // Check for HTML comment markers if len(trimmed) >= openLen && - string(trimmed[:openLen]) == config.CommentOpen { + string(trimmed[:openLen]) == marker.CommentOpen { continue } if len(trimmed) >= closeLen && - string(trimmed[len(trimmed)-closeLen:]) == config.CommentClose { + string(trimmed[len(trimmed)-closeLen:]) == marker.CommentClose { continue } contentLines++ diff --git a/internal/context/summary.go b/internal/context/summary.go index 35282668..4750d950 100644 --- a/internal/context/summary.go +++ b/internal/context/summary.go @@ -11,11 +11,12 @@ import ( "fmt" "strings" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/config/ctx" + "github.com/ActiveMemory/ctx/internal/config/marker" + "github.com/ActiveMemory/ctx/internal/config/regex" ) -const summaryEmpty = "empty" - // summarizeConstitution counts checkbox items (invariants) in CONSTITUTION.md.
// // Parameters: @@ -26,15 +27,15 @@ const summaryEmpty = "empty" func summarizeConstitution(content []byte) string { // Count checkbox items (invariants) count := bytes.Count( - content, []byte(config.PrefixTaskUndone), + content, []byte(marker.PrefixTaskUndone), ) + bytes.Count( - content, []byte(config.PrefixTaskDone), + content, []byte(marker.PrefixTaskDone), ) if count == 0 { - return "loaded" + return assets.TextDesc(assets.TextDescKeySummaryLoaded) } - return fmt.Sprintf("%d invariants", count) + return fmt.Sprintf(assets.TextDesc(assets.TextDescKeySummaryInvariants), count) } // summarizeTasks counts active and completed tasks in TASKS.md. @@ -46,19 +47,19 @@ func summarizeConstitution(content []byte) string { // - string: Summary like "3 active, 2 completed" or "empty" if none func summarizeTasks(content []byte) string { // Count active (unchecked) and completed (checked) tasks - active := bytes.Count(content, []byte(config.PrefixTaskUndone)) - completed := bytes.Count(content, []byte(config.PrefixTaskDone)) + active := bytes.Count(content, []byte(marker.PrefixTaskUndone)) + completed := bytes.Count(content, []byte(marker.PrefixTaskDone)) if active == 0 && completed == 0 { - return summaryEmpty + return assets.TextDesc(assets.TextDescKeySummaryEmpty) } var parts []string if active > 0 { - parts = append(parts, fmt.Sprintf("%d active", active)) + parts = append(parts, fmt.Sprintf(assets.TextDesc(assets.TextDescKeySummaryActive), active)) } if completed > 0 { - parts = append(parts, fmt.Sprintf("%d completed", completed)) + parts = append(parts, fmt.Sprintf(assets.TextDesc(assets.TextDescKeySummaryCompleted), completed)) } return strings.Join(parts, ", ") } @@ -72,16 +73,16 @@ func summarizeTasks(content []byte) string { // - string: Summary like "3 decisions" or "empty" if none func summarizeDecisions(content []byte) string { // Count decision headers (## [date] or ## Decision) - matches := config.RegExEntryHeading.FindAll(content, -1) + matches := 
regex.EntryHeading.FindAll(content, -1) count := len(matches) if count == 0 { - return summaryEmpty + return assets.TextDesc(assets.TextDescKeySummaryEmpty) } if count == 1 { - return "1 decision" + return assets.TextDesc(assets.TextDescKeySummaryDecision) } - return fmt.Sprintf("%d decisions", count) + return fmt.Sprintf(assets.TextDesc(assets.TextDescKeySummaryDecisions), count) } // summarizeGlossary counts term definitions (**term**) in GLOSSARY.md. @@ -92,16 +93,16 @@ func summarizeDecisions(content []byte) string { // Returns: // - string: Summary like "5 terms" or "empty" if none func summarizeGlossary(content []byte) string { - matches := config.RegExGlossary.FindAll(content, -1) + matches := regex.Glossary.FindAll(content, -1) count := len(matches) if count == 0 { - return summaryEmpty + return assets.TextDesc(assets.TextDescKeySummaryEmpty) } if count == 1 { - return "1 term" + return assets.TextDesc(assets.TextDescKeySummaryTerm) } - return fmt.Sprintf("%d terms", count) + return fmt.Sprintf(assets.TextDesc(assets.TextDescKeySummaryTerms), count) } // generateSummary creates a brief summary for a context file based on its @@ -115,18 +116,18 @@ func summarizeGlossary(content []byte) string { // - string: Summary string (e.g., "3 active, 2 completed" or "empty") func generateSummary(name string, content []byte) string { switch name { - case config.FileConstitution: + case ctx.Constitution: return summarizeConstitution(content) - case config.FileTask: + case ctx.Task: return summarizeTasks(content) - case config.FileDecision: + case ctx.Decision: return summarizeDecisions(content) - case config.FileGlossary: + case ctx.Glossary: return summarizeGlossary(content) default: if len(content) == 0 || effectivelyEmpty(content) { - return summaryEmpty + return assets.TextDesc(assets.TextDescKeySummaryEmpty) } - return "loaded" + return assets.TextDesc(assets.TextDescKeySummaryLoaded) } } diff --git a/internal/context/verify.go b/internal/context/verify.go index 
d30637dc..0acde240 100644 --- a/internal/context/verify.go +++ b/internal/context/verify.go @@ -8,7 +8,10 @@ package context import ( "os" + "path/filepath" + "github.com/ActiveMemory/ctx/internal/config/ctx" + "github.com/ActiveMemory/ctx/internal/config/dir" "github.com/ActiveMemory/ctx/internal/rc" ) @@ -21,6 +24,23 @@ import ( // // Returns: // - bool: True if the directory exists and is a directory + +// Initialized reports whether the context directory contains all required files. +// +// Parameters: +// - contextDir: Directory path to check +// +// Returns: +// - bool: True if all required context files exist +func Initialized(contextDir string) bool { + for _, f := range ctx.FilesRequired { + if _, err := os.Stat(filepath.Join(contextDir, f)); err != nil { + return false + } + } + return true +} + func Exists(dir string) bool { if dir == "" { dir = rc.ContextDir() @@ -28,3 +48,28 @@ func Exists(dir string) bool { info, err := os.Stat(dir) return err == nil && info.IsDir() } + +// ResolvedJournalDir returns the path to the journal directory within the +// configured context directory. +func ResolvedJournalDir() string { + return filepath.Join(rc.ContextDir(), dir.Journal) +} + +// DirLine returns a one-line context directory identifier. +// Returns an empty string if the directory cannot be resolved. +func DirLine() string { + d := rc.ContextDir() + if d == "" { + return "" + } + return "Context: " + d +} + +// AppendDir appends a bracketed context directory footer to msg +// if a context directory is available. Returns msg unchanged otherwise. +func AppendDir(msg string) string { + if line := DirLine(); line != "" { + return msg + " [" + line + "]" + } + return msg +} diff --git a/internal/context/verify_test.go b/internal/context/verify_test.go new file mode 100644 index 00000000..37b04aad --- /dev/null +++ b/internal/context/verify_test.go @@ -0,0 +1,49 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember?
+// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package context + +import ( + "os" + "path/filepath" + "testing" + + "github.com/ActiveMemory/ctx/internal/config/ctx" +) + +func TestInitialized_AllFilesPresent(t *testing.T) { + tmp := t.TempDir() + for _, f := range ctx.FilesRequired { + path := filepath.Join(tmp, f) + if writeErr := os.WriteFile(path, []byte("# "+f+"\n"), 0o600); writeErr != nil { + t.Fatalf("setup: %v", writeErr) + } + } + if !Initialized(tmp) { + t.Error("Initialized() = false, want true when all required files present") + } +} + +func TestInitialized_MissingFile(t *testing.T) { + tmp := t.TempDir() + // Create all but the last required file. + for _, f := range ctx.FilesRequired[:len(ctx.FilesRequired)-1] { + path := filepath.Join(tmp, f) + if writeErr := os.WriteFile(path, []byte("# "+f+"\n"), 0o600); writeErr != nil { + t.Fatalf("setup: %v", writeErr) + } + } + if Initialized(tmp) { + t.Error("Initialized() = true, want false when a required file is missing") + } +} + +func TestInitialized_EmptyDir(t *testing.T) { + tmp := t.TempDir() + if Initialized(tmp) { + t.Error("Initialized() = true, want false for empty directory") + } +} diff --git a/internal/crypto/crypto.go b/internal/crypto/crypto.go index 5b7300ec..da10d4b3 100644 --- a/internal/crypto/crypto.go +++ b/internal/crypto/crypto.go @@ -15,17 +15,13 @@ import ( "crypto/aes" "crypto/cipher" "crypto/rand" - "errors" - "fmt" "io" "os" -) - -// KeySize is the required key length in bytes (256 bits). -const KeySize = 32 -// NonceSize is the GCM nonce length in bytes. -const NonceSize = 12 + "github.com/ActiveMemory/ctx/internal/config/crypto" + "github.com/ActiveMemory/ctx/internal/config/fs" + ctxerr "github.com/ActiveMemory/ctx/internal/err" +) // GenerateKey returns a new 256-bit random key. 
// @@ -33,9 +29,9 @@ const NonceSize = 12 // - []byte: A 32-byte random key // - error: Non-nil if the system random source fails func GenerateKey() ([]byte, error) { - key := make([]byte, KeySize) + key := make([]byte, crypto.KeySize) if _, err := io.ReadFull(rand.Reader, key); err != nil { - return nil, fmt.Errorf("generate key: %w", err) + return nil, ctxerr.CryptoGenerateKey(err) } return key, nil } @@ -56,17 +52,17 @@ func GenerateKey() ([]byte, error) { func Encrypt(key, plaintext []byte) ([]byte, error) { block, err := aes.NewCipher(key) if err != nil { - return nil, fmt.Errorf("create cipher: %w", err) + return nil, ctxerr.CryptoCreateCipher(err) } gcm, err := cipher.NewGCM(block) if err != nil { - return nil, fmt.Errorf("create GCM: %w", err) + return nil, ctxerr.CryptoCreateGCM(err) } - nonce := make([]byte, NonceSize) + nonce := make([]byte, crypto.NonceSize) if _, err := io.ReadFull(rand.Reader, nonce); err != nil { - return nil, fmt.Errorf("generate nonce: %w", err) + return nil, ctxerr.CryptoGenerateNonce(err) } ciphertext := gcm.Seal(nonce, nonce, plaintext, nil) @@ -84,26 +80,26 @@ func Encrypt(key, plaintext []byte) ([]byte, error) { // - error: Non-nil if key is wrong, ciphertext is too short, or // authentication fails func Decrypt(key, ciphertext []byte) ([]byte, error) { - if len(ciphertext) < NonceSize { - return nil, errors.New("ciphertext too short") + if len(ciphertext) < crypto.NonceSize { + return nil, ctxerr.CryptoCiphertextTooShort() } block, err := aes.NewCipher(key) if err != nil { - return nil, fmt.Errorf("create cipher: %w", err) + return nil, ctxerr.CryptoCreateCipher(err) } gcm, err := cipher.NewGCM(block) if err != nil { - return nil, fmt.Errorf("create GCM: %w", err) + return nil, ctxerr.CryptoCreateGCM(err) } - nonce := ciphertext[:NonceSize] - data := ciphertext[NonceSize:] + nonce := ciphertext[:crypto.NonceSize] + data := ciphertext[crypto.NonceSize:] plaintext, err := gcm.Open(nil, nonce, data, nil) if err != nil { - return 
nil, fmt.Errorf("decrypt: %w", err) + return nil, ctxerr.CryptoDecrypt(err) } return plaintext, nil @@ -120,10 +116,10 @@ func Decrypt(key, ciphertext []byte) ([]byte, error) { func LoadKey(path string) ([]byte, error) { key, err := os.ReadFile(path) //nolint:gosec // path is controlled by the caller (config constants) if err != nil { - return nil, fmt.Errorf("read key: %w", err) + return nil, ctxerr.CryptoReadKey(err) } - if len(key) != KeySize { - return nil, fmt.Errorf("invalid key size: got %d bytes, want %d", len(key), KeySize) + if len(key) != crypto.KeySize { + return nil, ctxerr.CryptoInvalidKeySize(len(key), crypto.KeySize) } return key, nil } @@ -137,8 +133,8 @@ func LoadKey(path string) ([]byte, error) { // Returns: // - error: Non-nil if the file cannot be written func SaveKey(path string, key []byte) error { - if err := os.WriteFile(path, key, 0600); err != nil { - return fmt.Errorf("write key: %w", err) + if err := os.WriteFile(path, key, fs.PermSecret); err != nil { + return ctxerr.CryptoWriteKey(err) } return nil } diff --git a/internal/crypto/crypto_test.go b/internal/crypto/crypto_test.go index 89b8c617..e2f8cb76 100644 --- a/internal/crypto/crypto_test.go +++ b/internal/crypto/crypto_test.go @@ -11,6 +11,9 @@ import ( "os" "path/filepath" "testing" + + "github.com/ActiveMemory/ctx/internal/config/crypto" + "github.com/ActiveMemory/ctx/internal/config/fs" ) func TestGenerateKey(t *testing.T) { @@ -18,8 +21,8 @@ func TestGenerateKey(t *testing.T) { if err != nil { t.Fatalf("GenerateKey() error: %v", err) } - if len(key) != KeySize { - t.Errorf("key length = %d, want %d", len(key), KeySize) + if len(key) != crypto.KeySize { + t.Errorf("key length = %d, want %d", len(key), crypto.KeySize) } // Two keys should be different @@ -120,7 +123,7 @@ func TestDecrypt_TamperedCiphertext(t *testing.T) { } // Tamper with the ciphertext (flip a byte after the nonce) - ciphertext[NonceSize+1] ^= 0xFF + ciphertext[crypto.NonceSize+1] ^= 0xFF _, err = Decrypt(key, 
ciphertext) if err == nil { @@ -145,8 +148,8 @@ func TestSaveKey_LoadKey_RoundTrip(t *testing.T) { if err != nil { t.Fatalf("Stat() error: %v", err) } - if perm := info.Mode().Perm(); perm != 0600 { - t.Errorf("key file permissions = %o, want 0600", perm) + if perm := info.Mode().Perm(); perm != fs.PermSecret { + t.Errorf("key file permissions = %o, want %o", perm, fs.PermSecret) } loaded, err := LoadKey(path) @@ -161,7 +164,7 @@ func TestSaveKey_LoadKey_RoundTrip(t *testing.T) { func TestLoadKey_WrongSize(t *testing.T) { path := filepath.Join(t.TempDir(), "bad.key") - if err := os.WriteFile(path, []byte("too short"), 0600); err != nil { + if err := os.WriteFile(path, []byte("too short"), fs.PermSecret); err != nil { t.Fatalf("WriteFile() error: %v", err) } diff --git a/internal/config/keypath.go b/internal/crypto/keypath.go similarity index 89% rename from internal/config/keypath.go rename to internal/crypto/keypath.go index 522ada7f..28c70ae2 100644 --- a/internal/config/keypath.go +++ b/internal/crypto/keypath.go @@ -4,16 +4,16 @@ // \ Copyright 2026-present Context contributors. // SPDX-License-Identifier: Apache-2.0 -package config +package crypto import ( "os" "path/filepath" "strings" -) -// PermKeyDir is the permission for the user-level key directory (owner rwx only). -const PermKeyDir = 0700 + cryptocfg "github.com/ActiveMemory/ctx/internal/config/crypto" + "github.com/ActiveMemory/ctx/internal/config/dir" +) // GlobalKeyPath returns the global encryption key path. // @@ -24,7 +24,7 @@ func GlobalKeyPath() string { if err != nil { return "" } - return filepath.Join(home, ".ctx", FileContextKey) + return filepath.Join(home, dir.CtxData, cryptocfg.ContextKey) } // ExpandHome expands a leading ~/ prefix to the user's home directory. @@ -69,7 +69,7 @@ func ResolveKeyPath(contextDir, overridePath string) string { } // Tier 2: project-local key. 
- local := filepath.Join(contextDir, FileContextKey) + local := filepath.Join(contextDir, cryptocfg.ContextKey) if _, err := os.Stat(local); err == nil { return local } diff --git a/internal/config/keypath_test.go b/internal/crypto/keypath_test.go similarity index 77% rename from internal/config/keypath_test.go rename to internal/crypto/keypath_test.go index a7d2d64b..cedb6de8 100644 --- a/internal/config/keypath_test.go +++ b/internal/crypto/keypath_test.go @@ -4,12 +4,15 @@ // \ Copyright 2026-present Context contributors. // SPDX-License-Identifier: Apache-2.0 -package config +package crypto import ( "os" "path/filepath" "testing" + + cryptocfg "github.com/ActiveMemory/ctx/internal/config/crypto" + "github.com/ActiveMemory/ctx/internal/config/fs" ) func TestGlobalKeyPath(t *testing.T) { @@ -17,7 +20,7 @@ func TestGlobalKeyPath(t *testing.T) { t.Setenv("HOME", dir) got := GlobalKeyPath() - want := filepath.Join(dir, ".ctx", FileContextKey) + want := filepath.Join(dir, ".ctx", cryptocfg.ContextKey) if got != want { t.Errorf("GlobalKeyPath() = %q, want %q", got, want) } @@ -68,17 +71,17 @@ func TestResolveKeyPath_ProjectLocalBeforeGlobal(t *testing.T) { if err := os.MkdirAll(contextDir, 0750); err != nil { t.Fatal(err) } - localKey := filepath.Join(contextDir, FileContextKey) - if err := os.WriteFile(localKey, []byte("local-key"), PermSecret); err != nil { + localKey := filepath.Join(contextDir, cryptocfg.ContextKey) + if err := os.WriteFile(localKey, []byte("local-key"), fs.PermSecret); err != nil { t.Fatal(err) } globalDir := filepath.Join(dir, ".ctx") - if err := os.MkdirAll(globalDir, PermKeyDir); err != nil { + if err := os.MkdirAll(globalDir, fs.PermKeyDir); err != nil { t.Fatal(err) } - globalKey := filepath.Join(globalDir, FileContextKey) - if err := os.WriteFile(globalKey, []byte("global-key"), PermSecret); err != nil { + globalKey := filepath.Join(globalDir, cryptocfg.ContextKey) + if err := os.WriteFile(globalKey, []byte("global-key"), fs.PermSecret); 
err != nil { t.Fatal(err) } @@ -94,11 +97,11 @@ func TestResolveKeyPath_FallbackToGlobal(t *testing.T) { // Create global key only — no project-local. globalDir := filepath.Join(dir, ".ctx") - if err := os.MkdirAll(globalDir, PermKeyDir); err != nil { + if err := os.MkdirAll(globalDir, fs.PermKeyDir); err != nil { t.Fatal(err) } - globalKey := filepath.Join(globalDir, FileContextKey) - if err := os.WriteFile(globalKey, []byte("global-key"), PermSecret); err != nil { + globalKey := filepath.Join(globalDir, cryptocfg.ContextKey) + if err := os.WriteFile(globalKey, []byte("global-key"), fs.PermSecret); err != nil { t.Fatal(err) } diff --git a/internal/config/migrate.go b/internal/crypto/migrate.go similarity index 90% rename from internal/config/migrate.go rename to internal/crypto/migrate.go index 07e8e83a..bd2399d3 100644 --- a/internal/config/migrate.go +++ b/internal/crypto/migrate.go @@ -4,13 +4,15 @@ // \ Copyright 2026-present Context contributors. // SPDX-License-Identifier: Apache-2.0 -package config +package crypto import ( "fmt" "os" "path/filepath" "strings" + + cryptocfg "github.com/ActiveMemory/ctx/internal/config/crypto" ) // MigrateKeyFile warns about legacy key files that should be moved @@ -36,7 +38,7 @@ func MigrateKeyFile(contextDir string) { var found string // Legacy project-local names. - for _, name := range []string{FileContextKey, ".context.key", ".scratchpad.key"} { + for _, name := range []string{cryptocfg.ContextKey, ".context.key", ".scratchpad.key"} { candidate := filepath.Join(contextDir, name) if _, err := os.Stat(candidate); err == nil { found = candidate diff --git a/internal/config/migrate_test.go b/internal/crypto/migrate_test.go similarity index 81% rename from internal/config/migrate_test.go rename to internal/crypto/migrate_test.go index 999b054c..f9bb2fcf 100644 --- a/internal/config/migrate_test.go +++ b/internal/crypto/migrate_test.go @@ -4,12 +4,15 @@ // \ Copyright 2026-present Context contributors. 
// SPDX-License-Identifier: Apache-2.0 -package config +package crypto import ( "os" "path/filepath" "testing" + + cryptocfg "github.com/ActiveMemory/ctx/internal/config/crypto" + "github.com/ActiveMemory/ctx/internal/config/fs" ) func TestMigrateKeyFile_GlobalExists_Noop(t *testing.T) { @@ -18,11 +21,11 @@ func TestMigrateKeyFile_GlobalExists_Noop(t *testing.T) { // Create global key. globalDir := filepath.Join(dir, ".ctx") - if err := os.MkdirAll(globalDir, PermKeyDir); err != nil { + if err := os.MkdirAll(globalDir, fs.PermKeyDir); err != nil { t.Fatal(err) } - globalKey := filepath.Join(globalDir, FileContextKey) - if err := os.WriteFile(globalKey, []byte("global-key"), PermSecret); err != nil { + globalKey := filepath.Join(globalDir, cryptocfg.ContextKey) + if err := os.WriteFile(globalKey, []byte("global-key"), fs.PermSecret); err != nil { t.Fatal(err) } @@ -54,8 +57,8 @@ func TestMigrateKeyFile_LegacyLocal_WarnsOnly(t *testing.T) { } // Create legacy project-local key. - localKey := filepath.Join(contextDir, FileContextKey) - if err := os.WriteFile(localKey, []byte("local-key"), PermSecret); err != nil { + localKey := filepath.Join(contextDir, cryptocfg.ContextKey) + if err := os.WriteFile(localKey, []byte("local-key"), fs.PermSecret); err != nil { t.Fatal(err) } @@ -80,11 +83,11 @@ func TestMigrateKeyFile_LegacyUserLevel_WarnsOnly(t *testing.T) { // Create a legacy user-level key at ~/.local/ctx/keys/. 
legacyKeyDir := filepath.Join(dir, ".local", "ctx", "keys") - if err := os.MkdirAll(legacyKeyDir, PermKeyDir); err != nil { + if err := os.MkdirAll(legacyKeyDir, fs.PermKeyDir); err != nil { t.Fatal(err) } legacyKey := filepath.Join(legacyKeyDir, "some-project--abcd1234.key") - if err := os.WriteFile(legacyKey, []byte("user-level-data"), PermSecret); err != nil { + if err := os.WriteFile(legacyKey, []byte("user-level-data"), fs.PermSecret); err != nil { t.Fatal(err) } diff --git a/internal/drift/detector.go b/internal/drift/detector.go index 82d1bc8b..31b9ebe0 100644 --- a/internal/drift/detector.go +++ b/internal/drift/detector.go @@ -14,7 +14,11 @@ import ( "strings" "time" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/assets" + ctxCfg "github.com/ActiveMemory/ctx/internal/config/ctx" + "github.com/ActiveMemory/ctx/internal/config/marker" + "github.com/ActiveMemory/ctx/internal/config/regex" + "github.com/ActiveMemory/ctx/internal/config/token" "github.com/ActiveMemory/ctx/internal/context" "github.com/ActiveMemory/ctx/internal/index" "github.com/ActiveMemory/ctx/internal/rc" @@ -22,7 +26,7 @@ import ( const staleAgeDays = 30 -var staleAgeExclude = []string{config.FileConstitution} +var staleAgeExclude = []string{ctxCfg.Constitution} // Status returns the overall status of the report. 
// @@ -92,13 +96,13 @@ func checkPathReferences(ctx *context.Context, report *Report) { foundDeadPaths := false for _, f := range ctx.Files { - if f.Name != config.FileArchitecture && f.Name != config.FileConvention { + if f.Name != ctxCfg.Architecture && f.Name != ctxCfg.Convention { continue } - lines := strings.Split(string(f.Content), config.NewlineLF) + lines := strings.Split(string(f.Content), token.NewlineLF) for lineNum, line := range lines { - matches := config.RegExPath.FindAllStringSubmatch(line, -1) + matches := regex.CodeFencePath.FindAllStringSubmatch(line, -1) for _, m := range matches { path := m[1] // Skip URLs and common non-file patterns @@ -123,7 +127,7 @@ func checkPathReferences(ctx *context.Context, report *Report) { File: f.Name, Line: lineNum + 1, Type: IssueDeadPath, - Message: "references path that does not exist", + Message: assets.TextDesc(assets.TextDescKeyDriftDeadPath), Path: path, }) foundDeadPaths = true @@ -148,14 +152,14 @@ func checkPathReferences(ctx *context.Context, report *Report) { func checkStaleness(ctx *context.Context, report *Report) { staleness := false - if f := ctx.File(config.FileTask); f != nil { + if f := ctx.File(ctxCfg.Task); f != nil { // Count completed tasks - completedCount := strings.Count(string(f.Content), "- [x]") + completedCount := strings.Count(string(f.Content), marker.PrefixTaskDone) if completedCount > 10 { report.Warnings = append(report.Warnings, Issue{ File: f.Name, Type: IssueStaleness, - Message: "has many completed items (consider archiving)", + Message: assets.TextDesc(assets.TextDescKeyDriftStaleness), Path: "", }) staleness = true @@ -179,14 +183,7 @@ func checkConstitution(_ *context.Context, report *Report) { // Basic heuristic checks for constitution violations // Check for potential secrets in common config files - secretPatterns := []string{ - ".env", - "credentials", - "secret", - "api_key", - "apikey", - "password", - } + secretPatterns := token.SecretPatterns // Look for common 
secret file patterns in the working directory entries, readErr := os.ReadDir(".") @@ -213,7 +210,7 @@ func checkConstitution(_ *context.Context, report *Report) { report.Violations = append(report.Violations, Issue{ File: entry.Name(), Type: IssueSecret, - Message: "may contain secrets (constitution violation)", + Message: assets.TextDesc(assets.TextDescKeyDriftSecret), Rule: "no_secrets", }) foundViolation = true @@ -242,12 +239,12 @@ func checkRequiredFiles(ctx *context.Context, report *Report) { existingFiles[f.Name] = true } - for _, name := range config.FilesRequired { + for _, name := range ctxCfg.FilesRequired { if !existingFiles[name] { report.Warnings = append(report.Warnings, Issue{ File: name, Type: IssueMissing, - Message: "required context file is missing", + Message: assets.TextDesc(assets.TextDescKeyDriftMissingFile), }) allPresent = false } @@ -287,7 +284,7 @@ func checkFileAge(ctx *context.Context, report *Report) { report.Warnings = append(report.Warnings, Issue{ File: f.Name, Type: IssueStaleAge, - Message: fmt.Sprintf("last modified %d days ago", days), + Message: fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDriftStaleAge), days), }) foundStale = true } @@ -311,8 +308,8 @@ func checkEntryCount(ctx *context.Context, report *Report) { file string threshold int }{ - {config.FileLearning, rc.EntryCountLearnings()}, - {config.FileDecision, rc.EntryCountDecisions()}, + {ctxCfg.Learning, rc.EntryCountLearnings()}, + {ctxCfg.Decision, rc.EntryCountDecisions()}, } found := false @@ -330,7 +327,7 @@ func checkEntryCount(ctx *context.Context, report *Report) { File: f.Name, Type: IssueEntryCount, Message: fmt.Sprintf( - "has %d entries (recommended: ≤%d)", + assets.TextDesc(assets.TextDescKeyDriftEntryCount), len(blocks), c.threshold, ), }) @@ -357,7 +354,7 @@ var reInternalPkg = regexp.MustCompile("`(internal/[^`]+)`") // - ctx: Loaded context containing files to scan // - report: Report to append warnings to (modified in place) func 
checkMissingPackages(ctx *context.Context, report *Report) { - f := ctx.File(config.FileArchitecture) + f := ctx.File(ctxCfg.Architecture) if f == nil { return } @@ -386,7 +383,7 @@ func checkMissingPackages(ctx *context.Context, report *Report) { report.Warnings = append(report.Warnings, Issue{ File: f.Name, Type: IssueMissingPackage, - Message: fmt.Sprintf("package %s is not documented", pkg), + Message: fmt.Sprintf(assets.TextDesc(assets.TextDescKeyDriftMissingPackage), pkg), Path: pkg, }) found = true @@ -422,16 +419,7 @@ func normalizeInternalPkg(path string) string { // - bool: True if content contains template markers func isTemplateFile(content []byte) bool { s := string(content) - // Check for common template markers - templateMarkers := []string{ - "YOUR_", - " 0 { - return write.ErrMissingFields(config.EntryDecision, m) + return add.ErrMissingFields(entry.Decision, m) } - case config.EntryLearning: + case entry.Learning: if m := checkRequired([][2]string{ - {config.FlagPrefixLong + config.FlagContext, params.Context}, - {config.FlagPrefixLong + config.FlagLesson, params.Lesson}, - {config.FlagPrefixLong + config.FlagApplication, params.Application}, + {flag.PrefixLong + flag.Context, params.Context}, + {flag.PrefixLong + flag.Lesson, params.Lesson}, + {flag.PrefixLong + flag.Application, params.Application}, }); len(m) > 0 { - return write.ErrMissingFields(config.EntryLearning, m) + return add.ErrMissingFields(entry.Learning, m) } } diff --git a/internal/entry/write.go b/internal/entry/write.go index 7e811fb5..3b905621 100644 --- a/internal/entry/write.go +++ b/internal/entry/write.go @@ -12,10 +12,11 @@ import ( "strings" "github.com/ActiveMemory/ctx/internal/cli/add/core" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/entry" + "github.com/ActiveMemory/ctx/internal/config/fs" "github.com/ActiveMemory/ctx/internal/index" "github.com/ActiveMemory/ctx/internal/rc" - 
"github.com/ActiveMemory/ctx/internal/write" + "github.com/ActiveMemory/ctx/internal/write/add" ) // Write formats and writes an entry to the appropriate context file. @@ -31,9 +32,9 @@ import ( func Write(params Params) error { fType := strings.ToLower(params.Type) - fileName, ok := config.FileType[fType] + fileName, ok := entry.ToCtxFile[fType] if !ok { - return write.ErrUnknownType(fType) + return add.ErrUnknownType(fType) } contextDir := params.ContextDir @@ -43,54 +44,56 @@ func Write(params Params) error { filePath := filepath.Join(contextDir, fileName) if _, statErr := os.Stat(filePath); os.IsNotExist(statErr) { - return write.ErrFileNotFound(filePath) + return add.ErrFileNotFound(filePath) } existing, readErr := os.ReadFile(filepath.Clean(filePath)) if readErr != nil { - return write.ErrFileRead(filePath, readErr) + return add.ErrFileRead(filePath, readErr) } var formatted string - switch config.UserInputToEntry(fType) { - case config.EntryDecision: + switch fType { + case entry.Decision: formatted = core.FormatDecision( params.Content, params.Context, params.Rationale, params.Consequences, ) - case config.EntryTask: + case entry.Task: formatted = core.FormatTask(params.Content, params.Priority) - case config.EntryLearning: + case entry.Learning: formatted = core.FormatLearning( params.Content, params.Context, params.Lesson, params.Application, ) - case config.EntryConvention: + case entry.Convention: formatted = core.FormatConvention(params.Content) default: - return write.ErrUnknownType(fType) + return add.ErrUnknownType(fType) } newContent := core.AppendEntry(existing, formatted, fType, params.Section) if writeErr := os.WriteFile( - filePath, newContent, config.PermFile, + filePath, newContent, fs.PermFile, ); writeErr != nil { - return write.ErrFileWriteAdd(filePath, writeErr) + return add.ErrFileWriteAdd(filePath, writeErr) } - switch config.UserInputToEntry(fType) { - case config.EntryDecision: + switch fType { + case entry.Decision: indexed := 
index.UpdateDecisions(string(newContent)) if indexErr := os.WriteFile( - filePath, []byte(indexed), config.PermFile, + filePath, []byte(indexed), fs.PermFile, ); indexErr != nil { - return write.ErrIndexUpdate(filePath, indexErr) + return add.ErrIndexUpdate(filePath, indexErr) } - case config.EntryLearning: + case entry.Learning: indexed := index.UpdateLearnings(string(newContent)) - if indexErr := os.WriteFile(filePath, []byte(indexed), config.PermFile); indexErr != nil { - return write.ErrIndexUpdate(filePath, indexErr) + if indexErr := os.WriteFile( + filePath, []byte(indexed), fs.PermFile, + ); indexErr != nil { + return add.ErrIndexUpdate(filePath, indexErr) } - case config.EntryTask, config.EntryConvention: + case entry.Task, entry.Convention: // No index to update for these types } diff --git a/internal/err/errors.go b/internal/err/errors.go index 14e1e150..5f88ce6f 100644 --- a/internal/err/errors.go +++ b/internal/err/errors.go @@ -10,8 +10,21 @@ import ( "errors" "fmt" "os" + + "github.com/ActiveMemory/ctx/internal/assets" ) +// ReadingStateDir wraps a state directory read failure. +// +// Parameters: +// - cause: the underlying error from reading the directory. +// +// Returns: +// - error: "reading state directory: <cause>" +func ReadingStateDir(cause error) error { + return fmt.Errorf("reading state directory: %w", cause) +} + // MemoryNotFound returns an error indicating that MEMORY.md was not // discovered. Used by all memory subcommands (sync, status, diff). // @@ -150,18 +163,20 @@ func SessionIDRequired() error { return fmt.Errorf("please provide a session ID or use --latest") } -// AllWithArgument returns a validation error when --all is used alongside -// a positional argument. +// AllWithSessionID returns a validation error when --all is used with a session ID. // -// Parameters: -// - argType: what the argument represents (e.g. "a session ID", "a pattern").
+// Returns: +// - error: "cannot use --all with a session ID; use one or the other" +func AllWithSessionID() error { + return errors.New("cannot use --all with a session ID; use one or the other") +} + +// AllWithPattern returns a validation error when --all is used with a pattern. // // Returns: -// - error: "cannot use --all with <argType>; use one or the other" -func AllWithArgument(argType string) error { - return fmt.Errorf( - "cannot use --all with %s; use one or the other", argType, - ) +// - error: "cannot use --all with a pattern; use one or the other" +func AllWithPattern() error { + return errors.New("cannot use --all with a pattern; use one or the other") } // NoEntriesMatch returns an error when a pattern matches nothing. @@ -220,7 +235,7 @@ func RegenerateRequiresAll() error { ) } -// InvalidDate returns a validation error for a malformed date flag. +// ReadReminders returns a validation error for a malformed date flag. // // Parameters: // - flag: the flag name (e.g. "--since", "--until"). @@ -229,12 +244,120 @@ // // Returns: // - error: formatted with the expected format hint +// +// ReadReminders wraps a failure to read the reminders file. +// +// Parameters: +// - cause: the underlying read error. +// +// Returns: +// - error: "read reminders: <cause>" +func ReadReminders(cause error) error { + return fmt.Errorf("read reminders: %w", cause) +} + +// ParseReminders wraps a failure to parse the reminders file. +// +// Parameters: +// - cause: the underlying parse error. +// +// Returns: +// - error: "parse reminders: <cause>" +func ParseReminders(cause error) error { + return fmt.Errorf("parse reminders: %w", cause) +} + +// InvalidID returns an error for an unparseable ID string. +// +// Parameters: +// - value: the invalid ID string.
+// +// Returns: +// - error: "invalid ID <value>" +func InvalidID(value string) error { + return fmt.Errorf("invalid ID %q", value) +} + +// ReminderNotFound returns an error when no reminder matches the given ID. +// +// Parameters: +// - id: the ID that was not found. +// +// Returns: +// - error: "no reminder with ID <id>" +func ReminderNotFound(id int) error { + return fmt.Errorf("no reminder with ID %d", id) +} + +// ReminderIDRequired returns an error when no reminder ID is provided. +// +// Returns: +// - error: "provide a reminder ID or use --all" +func ReminderIDRequired() error { + return errors.New("provide a reminder ID or use --all") +} + +// InvalidDateValue returns an error for an invalid date string. +// +// Parameters: +// - value: the invalid date string. +// +// Returns: +// - error: "invalid date <value> (expected YYYY-MM-DD)" +func InvalidDateValue(value string) error { + return fmt.Errorf("invalid date %q (expected YYYY-MM-DD)", value) +} + func InvalidDate(flag, value string, cause error) error { return fmt.Errorf( "invalid %s date %q (expected YYYY-MM-DD): %w", flag, value, cause, ) } +// MemoryDiscoverFailed wraps a MEMORY.md discovery failure. +// +// Parameters: +// - cause: the underlying discovery error. +// +// Returns: +// - error: "MEMORY.md not found: <cause>" +func MemoryDiscoverFailed(cause error) error { + return fmt.Errorf("MEMORY.md not found: %w", cause) +} + +// MemoryDiffFailed wraps a memory diff computation failure. +// +// Parameters: +// - cause: the underlying diff error. +// +// Returns: +// - error: "computing diff: <cause>" +func MemoryDiffFailed(cause error) error { + return fmt.Errorf("computing diff: %w", cause) +} + +// SelectContentFailed wraps a content selection failure. +// +// Parameters: +// - cause: the underlying selection error. +// +// Returns: +// - error: "selecting content: <cause>" +func SelectContentFailed(cause error) error { + return fmt.Errorf("selecting content: %w", cause) +} + +// PublishFailed wraps a publish operation failure.
+// +// Parameters: +// - cause: the underlying publish error. +// +// Returns: +// - error: "publishing: <cause>" +func PublishFailed(cause error) error { + return fmt.Errorf("publishing: %w", cause) +} + // ReadMemory wraps a failure to read MEMORY.md. // // Parameters: @@ -308,6 +431,28 @@ func NoJournalEntries(path string) error { ) } +// DirNotFound returns an error when a directory does not exist. +// +// Parameters: +// - dir: the missing directory path. +// +// Returns: +// - error: "directory not found: <dir>" +func DirNotFound(dir string) error { + return fmt.Errorf("directory not found: %s", dir) +} + +// NoSiteConfig returns an error when the zensical config file is missing. +// +// Parameters: +// - dir: directory where the config was expected. +// +// Returns: +// - error: "no zensical.toml found in <dir>" +func NoSiteConfig(dir string) error { + return fmt.Errorf("no zensical.toml found in %s", dir) +} + // ZensicalNotFound returns an error when zensical is not installed. // // Returns: @@ -420,3 +565,1431 @@ func SkillList(cause error) error { func SkillRead(name string, cause error) error { return fmt.Errorf("failed to read skill %s: %w", name, cause) } + +// DetectReferenceTime wraps a failure to detect the reference time for changes. +// +// Parameters: +// - cause: the underlying detection error +// +// Returns: +// - error: "detecting reference time: <cause>" +func DetectReferenceTime(cause error) error { + return fmt.Errorf("detecting reference time: %w", cause) +} + +// CreateArchiveDir wraps a failure to create the archive directory. +// +// Parameters: +// - cause: the underlying OS error +// +// Returns: +// - error: "failed to create archive directory: <cause>" +func CreateArchiveDir(cause error) error { + return fmt.Errorf("failed to create archive directory: %w", cause) +} + +// WriteArchive wraps a failure to write an archive file.
+// +// Parameters: +// - cause: the underlying OS error +// +// Returns: +// - error: "failed to write archive: <cause>" +func WriteArchive(cause error) error { + return fmt.Errorf("failed to write archive: %w", cause) +} + +// TaskFileNotFound returns an error when TASKS.md does not exist. +// +// Returns: +// - error: "TASKS.md not found" +func TaskFileNotFound() error { + return fmt.Errorf("TASKS.md not found") +} + +// TaskFileRead wraps a failure to read TASKS.md. +// +// Parameters: +// - cause: the underlying read error +// +// Returns: +// - error: "failed to read TASKS.md: <cause>" +func TaskFileRead(cause error) error { + return fmt.Errorf("failed to read TASKS.md: %w", cause) +} + +// TaskFileWrite wraps a failure to write TASKS.md. +// +// Parameters: +// - cause: the underlying write error +// +// Returns: +// - error: "failed to write TASKS.md: <cause>" +func TaskFileWrite(cause error) error { + return fmt.Errorf("failed to write TASKS.md: %w", cause) +} + +// TaskMultipleMatches returns an error when a query matches more than one task. +// +// Parameters: +// - query: the search string that matched multiple tasks +// +// Returns: +// - error: "multiple tasks match <query>; be more specific or use task number" +func TaskMultipleMatches(query string) error { + return fmt.Errorf( + "multiple tasks match %q; be more specific or use task number", + query, + ) +} + +// TaskNotFound returns an error when no task matches the query. +// +// Parameters: +// - query: the search string that matched nothing +// +// Returns: +// - error: "no task matching <query> found" +func TaskNotFound(query string) error { + return fmt.Errorf("no task matching %q found", query) +} + +// ReadEmbeddedSchema wraps a failure to read the embedded JSON Schema.
+// +// Parameters: +// - cause: the underlying read error +// +// Returns: +// - error: "read embedded schema: <cause>" +func ReadEmbeddedSchema(cause error) error { + return fmt.Errorf("read embedded schema: %w", cause) +} + +// LoadJournalStateErr wraps a failure to load journal processing state. +// +// Parameters: +// - cause: the underlying error +// +// Returns: +// - error: "load journal state: <cause>" +func LoadJournalStateErr(cause error) error { + return fmt.Errorf("load journal state: %w", cause) +} + +// UnknownProfile returns an error for an unrecognized config profile name. +// +// Parameters: +// - name: the profile name that was not recognized +// +// Returns: +// - error: "unknown profile <name>: must be dev, base, or prod" +func UnknownProfile(name string) error { + return fmt.Errorf("unknown profile %q: must be dev, base, or prod", name) +} + +// ReadProfile wraps a failure to read a profile file. +// +// Parameters: +// - name: profile filename +// - cause: the underlying read error +// +// Returns: +// - error: "read <name>: <cause>" +func ReadProfile(name string, cause error) error { + return fmt.Errorf("read %s: %w", name, cause) +} + +// GitNotFound returns an error when git is not installed. +// The message is loaded from assets and includes guidance for the user. +// +// Returns: +// - error: message from assets key parser.git-not-found +func GitNotFound() error { + return fmt.Errorf("%s", assets.TextDesc(assets.TextDescKeyParserGitNotFound)) +} + +// NotInGitRepo wraps a failure from git rev-parse. +// +// Parameters: +// - cause: the underlying exec error +// +// Returns: +// - error: "not in a git repository: <cause>" +func NotInGitRepo(cause error) error { + return fmt.Errorf("not in a git repository: %w", cause) +} + +// UnknownFormat returns an error for an unsupported output format.
+// +// Parameters: +// - format: the format string that was not recognized +// - supported: list of valid formats +// +// Returns: +// - error: "unknown format <format> (supported: <supported>)" +func UnknownFormat(format, supported string) error { + return fmt.Errorf("unknown format %q (supported: %s)", format, supported) +} + +// UnknownProjectType returns an error for an unsupported project type. +// +// Parameters: +// - projType: the type string that was not recognized +// - supported: list of valid types +// +// Returns: +// - error: "unknown project type <projType> (supported: <supported>)" +func UnknownProjectType(projType, supported string) error { + return fmt.Errorf("unknown project type %q (supported: %s)", projType, supported) +} + +// InvalidTool returns an error for an unsupported AI tool name. +// +// Parameters: +// - tool: the tool name that was not recognized +// +// Returns: +// - error: "invalid tool <tool>: must be claude, aider, or generic" +func InvalidTool(tool string) error { + return fmt.Errorf("invalid tool %q: must be claude, aider, or generic", tool) +} + +// NoCompletedTasks returns an error when there are no completed tasks to archive. +// +// Returns: +// - error: "no completed tasks to archive" +func NoCompletedTasks() error { + return fmt.Errorf("no completed tasks to archive") +} + +// NoTemplate wraps a failure to find an embedded template. +// +// Parameters: +// - filename: Name of the file without a template +// - cause: the underlying read error +// +// Returns: +// - error: "no template available for <filename>: <cause>" +func NoTemplate(filename string, cause error) error { + return fmt.Errorf("no template available for %s: %w", filename, cause) +} + +// UnsupportedTool returns an error for an unrecognized AI tool name.
+// +// Parameters: +// - tool: the tool name that was not recognized +// +// Returns: +// - error: "unsupported tool: <tool>" +func UnsupportedTool(tool string) error { + return fmt.Errorf("unsupported tool: %s", tool) +} + +// DriftViolations returns an error when drift detection found violations. +// +// Returns: +// - error: "drift detection found violations" +func DriftViolations() error { + return fmt.Errorf("drift detection found violations") +} + +// ListTemplates wraps a failure to list embedded templates. +// +// Parameters: +// - cause: the underlying error from the list operation +// +// Returns: +// - error: "failed to list templates: <cause>" +func ListTemplates(cause error) error { + return fmt.Errorf("failed to list templates: %w", cause) +} + +// ReadTemplate wraps a failure to read an embedded template. +// +// Parameters: +// - name: template name that failed to read +// - cause: the underlying error from the read operation +// +// Returns: +// - error: "failed to read template <name>: <cause>" +func ReadTemplate(name string, cause error) error { + return fmt.Errorf("failed to read template %s: %w", name, cause) +} + +// GenerateKey wraps a failure to generate an encryption key. +// +// Parameters: +// - cause: the underlying error from key generation +// +// Returns: +// - error: "failed to generate scratchpad key: <cause>" +func GenerateKey(cause error) error { + return fmt.Errorf("failed to generate scratchpad key: %w", cause) +} + +// SaveKey wraps a failure to save an encryption key. +// +// Parameters: +// - cause: the underlying error from key saving +// +// Returns: +// - error: "failed to save scratchpad key: <cause>" +func SaveKey(cause error) error { + return fmt.Errorf("failed to save scratchpad key: %w", cause) +} + +// MkdirKeyDir wraps a failure to create the key directory.
+// +// Parameters: +// - cause: the underlying OS error +// +// Returns: +// - error: "failed to create key dir: <cause>" +func MkdirKeyDir(cause error) error { + return fmt.Errorf("failed to create key dir: %w", cause) +} + +// CreateBackup wraps a failure to create a backup file. +// +// Parameters: +// - name: backup filename that could not be created +// - cause: the underlying OS error +// +// Returns: +// - error: "failed to create backup <name>: <cause>" +func CreateBackup(name string, cause error) error { + return fmt.Errorf("failed to create backup %s: %w", name, cause) +} + +// CreateBackupGeneric wraps a generic backup creation failure. +// +// Parameters: +// - cause: the underlying OS error +// +// Returns: +// - error: "failed to create backup: <cause>" +func CreateBackupGeneric(cause error) error { + return fmt.Errorf("failed to create backup: %w", cause) +} + +// WriteMerged wraps a failure to write a merged file. +// +// Parameters: +// - path: file path that could not be written +// - cause: the underlying OS error +// +// Returns: +// - error: "failed to write merged <path>: <cause>" +func WriteMerged(path string, cause error) error { + return fmt.Errorf("failed to write merged %s: %w", path, cause) +} + +// MarkerNotFound returns an error when a section marker is missing. +// +// Parameters: +// - kind: marker kind (e.g. "ctx", "plan", "prompt") +// +// Returns: +// - error: "<kind> start marker not found" +func MarkerNotFound(kind string) error { + return fmt.Errorf("%s start marker not found", kind) +} + +// TemplateMissingMarkers returns an error when a template lacks markers. +// +// Parameters: +// - kind: marker kind (e.g. "ctx", "plan", "prompt") +// +// Returns: +// - error: "template missing <kind> markers" +func TemplateMissingMarkers(kind string) error { + return fmt.Errorf("template missing %s markers", kind) +} + +// FileUpdate wraps a failure to update a file.
+// +// Parameters: +// - path: file path that could not be updated +// - cause: the underlying OS error +// +// Returns: +// - error: "failed to update <path>: <cause>" +func FileUpdate(path string, cause error) error { + return fmt.Errorf("failed to update %s: %w", path, cause) +} + +// ParseFile wraps a failure to parse a file. +// +// Parameters: +// - path: file path that could not be parsed +// - cause: the underlying parse error +// +// Returns: +// - error: "failed to parse %s: <cause>" +func ParseFile(path string, cause error) error { + return fmt.Errorf("failed to parse %s: %w", path, cause) +} + +// MarshalSettings wraps a failure to marshal settings JSON. +// +// Parameters: +// - cause: the underlying marshal error +// +// Returns: +// - error: "failed to marshal settings: <cause>" +func MarshalSettings(cause error) error { + return fmt.Errorf("failed to marshal settings: %w", cause) +} + +// ListPromptTemplates wraps a failure to list prompt templates. +// +// Parameters: +// - cause: the underlying error +// +// Returns: +// - error: "failed to list prompt templates: <cause>" +func ListPromptTemplates(cause error) error { + return fmt.Errorf("failed to list prompt templates: %w", cause) +} + +// ReadPromptTemplate wraps a failure to read a prompt template. +// +// Parameters: +// - name: template name that failed to read +// - cause: the underlying error +// +// Returns: +// - error: "failed to read prompt template <name>: <cause>" +func ReadPromptTemplate(name string, cause error) error { + return fmt.Errorf("failed to read prompt template %s: %w", name, cause) +} + +// ListEntryTemplates wraps a failure to list entry templates. +// +// Parameters: +// - cause: the underlying error +// +// Returns: +// - error: "failed to list entry templates: <cause>" +func ListEntryTemplates(cause error) error { + return fmt.Errorf("failed to list entry templates: %w", cause) +} + +// ReadEntryTemplate wraps a failure to read an entry template.
+// +// Parameters: +// - name: template name that failed to read +// - cause: the underlying error +// +// Returns: +// - error: "failed to read entry template <name>: <cause>" +func ReadEntryTemplate(name string, cause error) error { + return fmt.Errorf("failed to read entry template %s: %w", name, cause) +} + +// HomeDir wraps a failure to determine the home directory. +// +// Parameters: +// - cause: the underlying OS error +// +// Returns: +// - error: "cannot determine home directory: <cause>" +func HomeDir(cause error) error { + return fmt.Errorf("cannot determine home directory: %w", cause) +} + +// MarshalPlugins wraps a failure to marshal enabledPlugins JSON. +// +// Parameters: +// - cause: the underlying marshal error +// +// Returns: +// - error: "failed to marshal enabledPlugins: <cause>" +func MarshalPlugins(cause error) error { + return fmt.Errorf("failed to marshal enabledPlugins: %w", cause) +} + +// FileAmend wraps a failure to amend an existing file. +// +// Parameters: +// - path: file path that could not be amended +// - cause: the underlying OS error +// +// Returns: +// - error: "failed to amend <path>: <cause>" +func FileAmend(path string, cause error) error { + return fmt.Errorf("failed to amend %s: %w", path, cause) +} + +// ReadProjectReadme wraps a failure to read a project README template. +// +// Parameters: +// - dir: directory name whose README failed to read +// - cause: the underlying error +// +// Returns: +// - error: "failed to read <dir> README template: <cause>" +func ReadProjectReadme(dir string, cause error) error { + return fmt.Errorf("failed to read %s README template: %w", dir, cause) +} + +// ReadInitTemplate wraps a failure to read an init template file.
+// +// Parameters: +// - name: template filename that failed to read +// - cause: the underlying error +// +// Returns: +// - error: "failed to read <name> template: <cause>" +func ReadInitTemplate(name string, cause error) error { + return fmt.Errorf("failed to read %s template: %w", name, cause) +} + +// CreateMakefile wraps a failure to create a new Makefile. +// +// Parameters: +// - cause: the underlying OS error +// +// Returns: +// - error: "failed to create Makefile: <cause>" +func CreateMakefile(cause error) error { + return fmt.Errorf("failed to create Makefile: %w", cause) +} + +// NoInput returns an error for missing stdin input. +// +// Returns: +// - error: "no input received" +func NoInput() error { + return errors.New("no input received") +} + +// WebhookEmpty returns an error for blank webhook URL input. +// +// Returns: +// - error: "webhook URL cannot be empty" +func WebhookEmpty() error { + return errors.New("webhook URL cannot be empty") +} + +// SaveWebhook wraps a webhook save failure. +// +// Parameters: +// - cause: the underlying error from the save operation. +// +// Returns: +// - error: "save webhook: <cause>" +func SaveWebhook(cause error) error { + return fmt.Errorf("save webhook: %w", cause) +} + +// LoadWebhook wraps a webhook load failure. +// +// Parameters: +// - cause: the underlying error from the load operation. +// +// Returns: +// - error: "load webhook: <cause>" +func LoadWebhook(cause error) error { + return fmt.Errorf("load webhook: %w", cause) +} + +// MarshalPayload wraps a JSON marshal failure. +// +// Parameters: +// - cause: the underlying marshal error. +// +// Returns: +// - error: "marshal payload: <cause>" +func MarshalPayload(cause error) error { + return fmt.Errorf("marshal payload: %w", cause) +} + +// SendNotification wraps a notification send failure. +// +// Parameters: +// - cause: the underlying HTTP error.
+// +// Returns: +// - error: "send test notification: <cause>" +func SendNotification(cause error) error { + return fmt.Errorf("send test notification: %w", cause) +} + +// FlagRequired returns an error for a missing required flag. +// +// Parameters: +// - name: the flag name. +// +// Returns: +// - error: "required flag \"<name>\" not set" +func FlagRequired(name string) error { + return fmt.Errorf("required flag %q not set", name) +} + +// ParserReadFile wraps a session file read failure. +// +// Parameters: +// - cause: the underlying error from reading the file. +// +// Returns: +// - error: "read file: <cause>" +func ParserReadFile(cause error) error { + return fmt.Errorf("read file: %w", cause) +} + +// ArgRequired returns an error for a missing required argument. +// +// Parameters: +// - name: the argument name. +// +// Returns: +// - error: "<name> argument is required" +func ArgRequired(name string) error { + return fmt.Errorf("%s argument is required", name) +} + +// ReadFile wraps a file read failure. +// +// Parameters: +// - cause: the underlying read error. +// +// Returns: +// - error: "read file: <cause>" +func ReadFile(cause error) error { + return fmt.Errorf("read file: %w", cause) +} + +// FileTooLarge returns an error for a file exceeding the size limit. +// +// Parameters: +// - size: actual file size in bytes. +// - max: maximum allowed size in bytes. +// +// Returns: +// - error: "file too large: <size> bytes (max <max>)" +func FileTooLarge(size, max int) error { + return fmt.Errorf("file too large: %d bytes (max %d)", size, max) +} + +// InvalidIndex returns an error for a non-numeric entry index. +// +// Parameters: +// - value: the invalid index string. +// +// Returns: +// - error: "invalid index: <value>" +func InvalidIndex(value string) error { + return fmt.Errorf("invalid index: %s", value) +} + +// EditBlobTextConflict returns an error when --file/--label and text +// editing flags are used together.
+// +// Returns: +// - error: describing the mutual exclusivity +func EditBlobTextConflict() error { + return errors.New("--file/--label and positional text/--append/--prepend are mutually exclusive") +} + +// EditTextConflict returns an error when multiple text editing modes +// are used together. +// +// Returns: +// - error: describing the mutual exclusivity +func EditTextConflict() error { + return errors.New("--append, --prepend, and positional text are mutually exclusive") +} + +// EditNoMode returns an error when no editing mode was specified. +// +// Returns: +// - error: prompting for a mode +func EditNoMode() error { + return errors.New("provide replacement text, --append, or --prepend") +} + +// BlobAppendNotAllowed returns an error for appending to a blob entry. +// +// Returns: +// - error: "cannot append to a blob entry" +func BlobAppendNotAllowed() error { + return errors.New("cannot append to a blob entry") +} + +// BlobPrependNotAllowed returns an error for prepending to a blob entry. +// +// Returns: +// - error: "cannot prepend to a blob entry" +func BlobPrependNotAllowed() error { + return errors.New("cannot prepend to a blob entry") +} + +// NotBlobEntry returns an error when a blob operation targets a non-blob. +// +// Parameters: +// - n: the 1-based entry index. +// +// Returns: +// - error: "entry <n> is not a blob entry" +func NotBlobEntry(n int) error { + return fmt.Errorf("entry %d is not a blob entry", n) +} + +// OpenFile wraps a file open failure. +// +// Parameters: +// - path: the file path. +// - cause: the underlying OS error. +// +// Returns: +// - error: "open <path>: <cause>" +func OpenFile(path string, cause error) error { + return fmt.Errorf("open %s: %w", path, cause) +} + +// StatPath wraps a stat failure. +// +// Parameters: +// - path: the path that failed. +// - cause: the underlying OS error.
+// +// Returns: +// - error: "stat <path>: <cause>" +func StatPath(path string, cause error) error { + return fmt.Errorf("stat %s: %w", path, cause) +} + +// NotDirectory returns an error when a path is not a directory. +// +// Parameters: +// - path: the path. +// +// Returns: +// - error: "<path> is not a directory" +func NotDirectory(path string) error { + return fmt.Errorf("%s is not a directory", path) +} + +// ReadDirectory wraps a directory read failure. +// +// Parameters: +// - path: the directory path. +// - cause: the underlying OS error. +// +// Returns: +// - error: "read directory <path>: <cause>" +func ReadDirectory(path string, cause error) error { + return fmt.Errorf("read directory %s: %w", path, cause) +} + +// ResolveNotEncrypted returns an error when resolve is used on an +// unencrypted scratchpad. +// +// Returns: +// - error: "resolve is only needed for encrypted scratchpads" +func ResolveNotEncrypted() error { + return errors.New("resolve is only needed for encrypted scratchpads") +} + +// NoConflictFiles returns an error when no merge conflict files are found. +// +// Parameters: +// - filename: the base scratchpad filename. +// +// Returns: +// - error: "no conflict files found (<filename>.ours / <filename>.theirs)" +func NoConflictFiles(filename string) error { + return fmt.Errorf("no conflict files found (%s.ours / %s.theirs)", filename, filename) +} + +// WriteFileFailed wraps a file write failure. +// +// Parameters: +// - cause: the underlying write error. +// +// Returns: +// - error: "write file: <cause>" +func WriteFileFailed(cause error) error { + return fmt.Errorf("write file: %w", cause) +} + +// OutFlagRequiresBlob returns an error when --out is used on a non-blob entry. +// +// Returns: +// - error: "--out can only be used with blob entries" +func OutFlagRequiresBlob() error { + return errors.New("--out can only be used with blob entries") +} + +// ReadJournalDir wraps a failure to read the journal directory. +// +// Parameters: +// - cause: the underlying OS error.
+//
+// Returns:
+// - error: "read journal directory: <cause>"
+func ReadJournalDir(cause error) error {
+	return fmt.Errorf("read journal directory: %w", cause)
+}
+
+// SettingsNotFound returns an error when settings.local.json is missing.
+//
+// Returns:
+// - error: "no .claude/settings.local.json found"
+func SettingsNotFound() error {
+	return errors.New("no .claude/settings.local.json found")
+}
+
+// GoldenNotFound returns an error when settings.golden.json is missing.
+//
+// Returns:
+// - error: advises the user to run 'ctx permissions snapshot' first
+func GoldenNotFound() error {
+	return errors.New(
+		"no .claude/settings.golden.json found — run 'ctx permissions snapshot' first",
+	)
+}
+
+// FileRead wraps a file read failure with path context.
+//
+// Parameters:
+// - path: file path that could not be read.
+// - cause: the underlying OS error.
+//
+// Returns:
+// - error: "failed to read <path>: <cause>"
+func FileRead(path string, cause error) error {
+	return fmt.Errorf("failed to read %s: %w", path, cause)
+}
+
+// PromptExists returns an error when a prompt template already exists.
+//
+// Parameters:
+// - name: the prompt name that already exists.
+//
+// Returns:
+// - error: "prompt <name> already exists"
+func PromptExists(name string) error {
+	return fmt.Errorf("prompt %q already exists", name)
+}
+
+// PromptNotFound returns an error when a prompt template does not exist.
+//
+// Parameters:
+// - name: the prompt name that was not found.
+//
+// Returns:
+// - error: "prompt <name> not found"
+func PromptNotFound(name string) error {
+	return fmt.Errorf("prompt %q not found", name)
+}
+
+// RemovePrompt wraps a failure to remove a prompt template.
+//
+// Parameters:
+// - cause: the underlying OS error.
+//
+// Returns:
+// - error: "remove prompt: <cause>"
+func RemovePrompt(cause error) error {
+	return fmt.Errorf("remove prompt: %w", cause)
+}
+
+// NoPromptTemplate returns an error when no embedded template exists.
+//
+// Parameters:
+// - name: the template name that was not found.
+//
+// Returns:
+// - error: advises the user to use --stdin
+func NoPromptTemplate(name string) error {
+	return fmt.Errorf(
+		"no embedded template %q — use --stdin to provide content", name,
+	)
+}
+
+// ReadScratchpad wraps a scratchpad read failure.
+//
+// Parameters:
+// - cause: the underlying read error.
+//
+// Returns:
+// - error: "read scratchpad: <cause>"
+func ReadScratchpad(cause error) error {
+	return fmt.Errorf("read scratchpad: %w", cause)
+}
+
+// ContextNotInitialized returns an error when no .context/ directory is found.
+//
+// Returns:
+// - error: "no .context/ directory found. Run 'ctx init' first"
+func ContextNotInitialized() error {
+	return errors.New("no .context/ directory found. Run 'ctx init' first")
+}
+
+// InvalidBackupScope returns an error for an unrecognized backup scope value.
+//
+// Parameters:
+// - scope: the invalid scope string
+//
+// Returns:
+// - error: "invalid scope '<scope>': must be project, global, or all"
+func InvalidBackupScope(scope string) error {
+	return fmt.Errorf("invalid scope %q: must be project, global, or all", scope)
+}
+
+// BackupSMBConfig wraps an SMB configuration parse failure.
+//
+// Parameters:
+// - cause: the underlying error
+//
+// Returns:
+// - error: "parse SMB config: <cause>"
+func BackupSMBConfig(cause error) error {
+	return fmt.Errorf("parse SMB config: %w", cause)
+}
+
+// BackupProject wraps a project backup failure.
+//
+// Parameters:
+// - cause: the underlying error
+//
+// Returns:
+// - error: "project backup: <cause>"
+func BackupProject(cause error) error {
+	return fmt.Errorf("project backup: %w", cause)
+}
+
+// BackupGlobal wraps a global backup failure.
+//
+// Parameters:
+// - cause: the underlying error
+//
+// Returns:
+// - error: "global backup: <cause>"
+func BackupGlobal(cause error) error {
+	return fmt.Errorf("global backup: %w", cause)
+}
+
+// CreateArchive wraps an archive creation failure.
+//
+// Parameters:
+// - cause: the underlying error
+//
+// Returns:
+// - error: "create archive file: <cause>"
+func CreateArchive(cause error) error {
+	return fmt.Errorf("create archive file: %w", cause)
+}
+
+// ContextDirNotFound returns an error when the context directory does not exist.
+//
+// Parameters:
+// - dir: the missing context directory path.
+//
+// Returns:
+// - error: "context directory not found: <dir> — run 'ctx init'"
+func ContextDirNotFound(dir string) error {
+	return fmt.Errorf("context directory not found: %s — run 'ctx init'", dir)
+}
+
+// SourceNotFound returns an error when a backup source path is missing.
+//
+// Parameters:
+// - path: the missing source path
+//
+// Returns:
+// - error: "source not found: <path>"
+func SourceNotFound(path string) error {
+	return fmt.Errorf("source not found: %s", path)
+}
+
+// EmbeddedTemplateNotFound returns an error when an embedded hook
+// message template cannot be located.
+//
+// Parameters:
+// - hook: hook name
+// - variant: template variant name
+//
+// Returns:
+// - error: "embedded template not found for <hook>/<variant>"
+func EmbeddedTemplateNotFound(hook, variant string) error {
+	return fmt.Errorf("embedded template not found for %s/%s", hook, variant)
+}
+
+// OverrideExists returns an error when a message override already
+// exists and must be reset before editing.
+//
+// Parameters:
+// - path: existing override file path
+// - hook: hook name
+// - variant: template variant name
+//
+// Returns:
+// - error: "override already exists at ..."
+func OverrideExists(path, hook, variant string) error {
+	return fmt.Errorf("override already exists at %s\nEdit it directly or use `ctx system message reset %s %s` first",
+		path, hook, variant)
+}
+
+// CreateDir wraps a directory creation failure.
+//
+// Parameters:
+// - dir: the directory path that could not be created
+// - cause: the underlying error
+//
+// Returns:
+// - error: "failed to create directory <dir>: <cause>"
+func CreateDir(dir string, cause error) error {
+	return fmt.Errorf("failed to create directory %s: %w", dir, cause)
+}
+
+// WriteOverride wraps a message override write failure.
+//
+// Parameters:
+// - path: the override file path
+// - cause: the underlying error
+//
+// Returns:
+// - error: "failed to write override <path>: <cause>"
+func WriteOverride(path string, cause error) error {
+	return fmt.Errorf("failed to write override %s: %w", path, cause)
+}
+
+// RemoveOverride wraps a message override removal failure.
+//
+// Parameters:
+// - path: the override file path
+// - cause: the underlying error
+//
+// Returns:
+// - error: "failed to remove override <path>: <cause>"
+func RemoveOverride(path string, cause error) error {
+	return fmt.Errorf("failed to remove override %s: %w", path, cause)
+}
+
+// UnknownHook returns an error for an unrecognized hook name.
+//
+// Parameters:
+// - hook: the unknown hook name
+//
+// Returns:
+// - error: "unknown hook: ..."
+func UnknownHook(hook string) error {
+	return fmt.Errorf("unknown hook: %s\nRun `ctx system message list` to see available hooks", hook)
+}
+
+// UnknownVariant returns an error for an unrecognized variant within
+// a known hook.
+//
+// Parameters:
+// - variant: the unknown variant name
+// - hook: the parent hook name
+//
+// Returns:
+// - error: "unknown variant <variant> for hook <hook> ..."
+func UnknownVariant(variant, hook string) error {
+	return fmt.Errorf("unknown variant %q for hook %q\nRun `ctx system message list` to see available variants", variant, hook)
+}
+
+// LoadJournalStateFailed wraps a journal state loading failure.
+//
+// Parameters:
+// - cause: the underlying error
+//
+// Returns:
+// - error: "load journal state: <cause>"
+func LoadJournalStateFailed(cause error) error {
+	return fmt.Errorf("load journal state: %w", cause)
+}
+
+// SaveJournalStateFailed wraps a journal state save failure.
+//
+// Parameters:
+// - cause: the underlying error
+//
+// Returns:
+// - error: "save journal state: <cause>"
+func SaveJournalStateFailed(cause error) error {
+	return fmt.Errorf("save journal state: %w", cause)
+}
+
+// UnknownStage returns an error for an unrecognized journal stage.
+//
+// Parameters:
+// - stage: the unknown stage name
+// - valid: comma-separated list of valid stage names
+//
+// Returns:
+// - error: "unknown stage <stage>; valid: <valid>"
+func UnknownStage(stage, valid string) error {
+	return fmt.Errorf("unknown stage %q; valid: %s", stage, valid)
+}
+
+// StageNotSet returns an error when a journal stage has not been set.
+//
+// Parameters:
+// - filename: the journal filename
+// - stage: the stage name
+//
+// Returns:
+// - error: "<filename>: <stage> not set"
+func StageNotSet(filename, stage string) error {
+	return fmt.Errorf("%s: %s not set", filename, stage)
+}
+
+// EventLogRead wraps a failure to read the event log.
+//
+// Parameters:
+// - cause: the underlying error from the query operation.
+//
+// Returns:
+// - error: "reading event log: <cause>"
+func EventLogRead(cause error) error {
+	return fmt.Errorf("reading event log: %w", cause)
+}
+
+// StatsGlob wraps a failure to glob stats files.
+//
+// Parameters:
+// - cause: the underlying error from the glob operation.
+//
+// Returns:
+// - error: "globbing stats files: <cause>"
+func StatsGlob(cause error) error {
+	return fmt.Errorf("globbing stats files: %w", cause)
+}
+
+// CryptoCreateCipher wraps a failure to create an AES cipher.
+//
+// Parameters:
+// - cause: the underlying crypto error.
+//
+// Returns:
+// - error: "create cipher: <cause>"
+func CryptoCreateCipher(cause error) error {
+	return fmt.Errorf("create cipher: %w", cause)
+}
+
+// CryptoCreateGCM wraps a failure to create a GCM instance.
+//
+// Parameters:
+// - cause: the underlying crypto error.
+//
+// Returns:
+// - error: "create GCM: <cause>"
+func CryptoCreateGCM(cause error) error {
+	return fmt.Errorf("create GCM: %w", cause)
+}
+
+// CryptoGenerateNonce wraps a failure to generate a random nonce.
+//
+// Parameters:
+// - cause: the underlying IO error.
+//
+// Returns:
+// - error: "generate nonce: <cause>"
+func CryptoGenerateNonce(cause error) error {
+	return fmt.Errorf("generate nonce: %w", cause)
+}
+
+// CryptoGenerateKey wraps a failure to generate a random key.
+//
+// Parameters:
+// - cause: the underlying IO error.
+//
+// Returns:
+// - error: "generate key: <cause>"
+func CryptoGenerateKey(cause error) error {
+	return fmt.Errorf("generate key: %w", cause)
+}
+
+// CryptoCiphertextTooShort returns an error when ciphertext is shorter
+// than the nonce size.
+//
+// Returns:
+// - error: "ciphertext too short"
+func CryptoCiphertextTooShort() error {
+	return errors.New("ciphertext too short")
+}
+
+// CryptoDecrypt wraps a decryption failure with cause.
+//
+// Parameters:
+// - cause: the underlying decryption error.
+//
+// Returns:
+// - error: "decrypt: <cause>"
+func CryptoDecrypt(cause error) error {
+	return fmt.Errorf("decrypt: %w", cause)
+}
+
+// CryptoReadKey wraps a failure to read a key file.
+//
+// Parameters:
+// - cause: the underlying read error.
+//
+// Returns:
+// - error: "read key: <cause>"
+func CryptoReadKey(cause error) error {
+	return fmt.Errorf("read key: %w", cause)
+}
+
+// CryptoInvalidKeySize returns an error when a key file has the wrong size.
+//
+// Parameters:
+// - got: actual key size in bytes.
+// - want: expected key size in bytes.
+//
+// Returns:
+// - error: "invalid key size: got N bytes, want M"
+func CryptoInvalidKeySize(got, want int) error {
+	return fmt.Errorf("invalid key size: got %d bytes, want %d", got, want)
+}
+
+// CryptoWriteKey wraps a failure to write a key file.
+//
+// Parameters:
+// - cause: the underlying write error.
+//
+// Returns:
+// - error: "write key: <cause>"
+func CryptoWriteKey(cause error) error {
+	return fmt.Errorf("write key: %w", cause)
+}
+
+// SnapshotWrite wraps a failure to write a task snapshot file.
+//
+// Parameters:
+// - cause: the underlying OS error.
+//
+// Returns:
+// - error: "failed to write snapshot: <cause>"
+func SnapshotWrite(cause error) error {
+	return fmt.Errorf("failed to write snapshot: %w", cause)
+}
+
+// OpenLogFile wraps a failure to open a log file.
+//
+// Parameters:
+// - cause: the underlying OS error.
+//
+// Returns:
+// - error: "failed to open log file: <cause>"
+func OpenLogFile(cause error) error {
+	return fmt.Errorf("failed to open log file: %w", cause)
+}
+
+// UnknownUpdateType returns an error for an unrecognized context update type.
+//
+// Parameters:
+// - typeName: the update type that was not recognized.
+//
+// Returns:
+// - error: "unknown update type: <typeName>"
+func UnknownUpdateType(typeName string) error {
+	return fmt.Errorf("unknown update type: %s", typeName)
+}
+
+// NoTaskSpecified returns an error when no task query was provided.
+//
+// Returns:
+// - error: "no task specified"
+func NoTaskSpecified() error {
+	return errors.New("no task specified")
+}
+
+// NoTaskMatch returns an error when no task matches the search query.
+//
+// Parameters:
+// - query: the search string that matched nothing.
+//
+// Returns:
+// - error: "no task matching \"<query>\" found"
+func NoTaskMatch(query string) error {
+	return fmt.Errorf("no task matching %q found", query)
+}
+
+// ReadInputStream wraps a failure to read from the input stream.
+//
+// Parameters:
+// - cause: the underlying read error.
+//
+// Returns:
+// - error: "error reading input: <cause>"
+func ReadInputStream(cause error) error {
+	return fmt.Errorf("error reading input: %w", cause)
+}
+
+// ReindexFileNotFound returns an error when the file to reindex does not exist.
+//
+// Parameters:
+// - fileName: Display name (e.g., "DECISIONS.md")
+//
+// Returns:
+// - error: "<fileName> not found. Run 'ctx init' first"
+func ReindexFileNotFound(fileName string) error {
+	return fmt.Errorf("%s not found. Run 'ctx init' first", fileName)
+}
+
+// ReindexFileRead wraps a read failure during reindexing.
+//
+// Parameters:
+// - filePath: Path that could not be read
+// - cause: The underlying read error
+//
+// Returns:
+// - error: "failed to read <filePath>: <cause>"
+func ReindexFileRead(filePath string, cause error) error {
+	return fmt.Errorf("failed to read %s: %w", filePath, cause)
+}
+
+// ReindexFileWrite wraps a write failure during reindexing.
+//
+// Parameters:
+// - filePath: Path that could not be written
+// - cause: The underlying write error
+//
+// Returns:
+// - error: "failed to write <filePath>: <cause>"
+func ReindexFileWrite(filePath string, cause error) error {
+	return fmt.Errorf("failed to write %s: %w", filePath, cause)
+}
+
+// DiscoverResolveRoot wraps a project root resolution failure.
+func DiscoverResolveRoot(cause error) error {
+	return fmt.Errorf("resolving project root: %w", cause)
+}
+
+// DiscoverResolveHome wraps a home directory resolution failure.
+func DiscoverResolveHome(cause error) error {
+	return fmt.Errorf("resolving home directory: %w", cause)
+}
+
+// DiscoverNoMemory returns an error when no auto memory file exists.
+func DiscoverNoMemory(path string) error {
+	return fmt.Errorf("no auto memory found at %s", path)
+}
+
+// MemoryReadSource wraps a source file read failure during sync.
+func MemoryReadSource(cause error) error {
+	return fmt.Errorf("reading source: %w", cause)
+}
+
+// MemoryArchivePrevious wraps a failure to archive the previous mirror.
+func MemoryArchivePrevious(cause error) error {
+	return fmt.Errorf("archiving previous mirror: %w", cause)
+}
+
+// MemoryCreateDir wraps a failure to create the memory directory.
+func MemoryCreateDir(cause error) error {
+	return fmt.Errorf("creating memory directory: %w", cause)
+}
+
+// MemoryWriteMirror wraps a failure to write the mirror file.
+func MemoryWriteMirror(cause error) error {
+	return fmt.Errorf("writing mirror: %w", cause)
+}
+
+// MemoryReadMirrorArchive wraps a failure to read the mirror for archiving.
+func MemoryReadMirrorArchive(cause error) error {
+	return fmt.Errorf("reading mirror for archive: %w", cause)
+}
+
+// MemoryCreateArchiveDir wraps a failure to create the archive directory.
+func MemoryCreateArchiveDir(cause error) error {
+	return fmt.Errorf("creating archive directory: %w", cause)
+}
+
+// MemoryWriteArchive wraps a failure to write an archive file.
+func MemoryWriteArchive(cause error) error {
+	return fmt.Errorf("writing archive: %w", cause)
+}
+
+// MemoryReadMirror wraps a failure to read the mirror file.
+func MemoryReadMirror(cause error) error {
+	return fmt.Errorf("reading mirror: %w", cause)
+}
+
+// MemoryReadDiffSource wraps a failure to read the source for diff.
+func MemoryReadDiffSource(cause error) error {
+	return fmt.Errorf("reading source: %w", cause)
+}
+
+// MemorySelectContent wraps a failure to select publish content.
+func MemorySelectContent(cause error) error {
+	return fmt.Errorf("selecting content: %w", cause)
+}
+
+// MemoryWriteMemory wraps a failure to write MEMORY.md.
+func MemoryWriteMemory(cause error) error {
+	return fmt.Errorf("writing MEMORY.md: %w", cause)
+}
+
+// ParserOpenFile wraps a session file open failure.
+//
+// Parameters:
+// - cause: the underlying error from opening the file.
+//
+// Returns:
+// - error: "open file: <cause>"
+func ParserOpenFile(cause error) error {
+	return fmt.Errorf("open file: %w", cause)
+}
+
+// ParserNoMatch returns an error when no parser can handle a file.
+//
+// Parameters:
+// - path: the file path that no parser matched.
+//
+// Returns:
+// - error: "no parser found for file: <path>"
+func ParserNoMatch(path string) error {
+	return fmt.Errorf("no parser found for file: %s", path)
+}
+
+// ParserWalkDir wraps a directory walk failure during session scanning.
+//
+// Parameters:
+// - cause: the underlying error from filepath.Walk.
+//
+// Returns:
+// - error: "walk directory: <cause>"
+func ParserWalkDir(cause error) error {
+	return fmt.Errorf("walk directory: %w", cause)
+}
+
+// ParserFileError wraps a per-file parse failure with the file path.
+//
+// Parameters:
+// - path: the file path that failed to parse.
+// - cause: the underlying parse error.
+//
+// Returns:
+// - error: "<path>: <cause>"
+func ParserFileError(path string, cause error) error {
+	return fmt.Errorf("%s: %w", path, cause)
+}
+
+// ParserScanFile wraps a session file scan failure.
+//
+// Parameters:
+// - cause: the underlying error from scanning the file.
+//
+// Returns:
+// - error: "scan file: <cause>"
+func ParserScanFile(cause error) error {
+	return fmt.Errorf("scan file: %w", cause)
+}
+
+// ParserUnmarshal wraps a JSON unmarshal failure during session parsing.
+//
+// Parameters:
+// - cause: the underlying error from JSON unmarshaling.
+//
+// Returns:
+// - error: "unmarshal: <cause>"
+func ParserUnmarshal(cause error) error {
+	return fmt.Errorf("unmarshal: %w", cause)
+}
diff --git a/internal/eventlog/eventlog.go b/internal/eventlog/event_log.go
similarity index 92%
rename from internal/eventlog/eventlog.go
rename to internal/eventlog/event_log.go
index 8e8fa138..27edc927 100644
--- a/internal/eventlog/eventlog.go
+++ b/internal/eventlog/event_log.go
@@ -17,7 +17,9 @@ import (
 	"path/filepath"
 	"time"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+	"github.com/ActiveMemory/ctx/internal/config/event"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
 	"github.com/ActiveMemory/ctx/internal/notify"
 	"github.com/ActiveMemory/ctx/internal/rc"
 )
@@ -43,7 +45,7 @@ func Append(event, message, sessionID string, detail *notify.TemplateRef) {
 
 	// Ensure state directory exists.
 	stateDir := filepath.Dir(logPath)
-	if mkErr := os.MkdirAll(stateDir, config.PermExec); mkErr != nil {
+	if mkErr := os.MkdirAll(stateDir, fs.PermExec); mkErr != nil {
 		return
 	}
@@ -71,7 +73,7 @@
 	line = append(line, '\n')
 
 	//nolint:gosec // project-local state path
-	f, openErr := os.OpenFile(logPath, os.O_APPEND|os.O_CREATE|os.O_WRONLY, config.PermFile)
+	f, openErr := os.OpenFile(logPath, os.O_APPEND|os.O_CREATE|os.O_WRONLY, fs.PermFile)
 	if openErr != nil {
 		return
 	}
@@ -190,7 +192,7 @@
 	if statErr != nil {
 		return // file doesn't exist yet, nothing to rotate
 	}
-	if info.Size() < int64(config.EventLogMaxBytes) {
+	if info.Size() < int64(event.EventLogMaxBytes) {
 		return
 	}
@@ -201,10 +203,10 @@
 // logFilePath returns the path to the current event log.
 func logFilePath() string {
-	return filepath.Join(rc.ContextDir(), config.DirState, config.FileEventLog)
+	return filepath.Join(rc.ContextDir(), dir.State, event.FileEventLog)
 }
 
 // prevLogFilePath returns the path to the rotated event log.
 func prevLogFilePath() string {
-	return filepath.Join(rc.ContextDir(), config.DirState, config.FileEventLogPrev)
+	return filepath.Join(rc.ContextDir(), dir.State, event.FileEventLogPrev)
 }
diff --git a/internal/eventlog/eventlog_test.go b/internal/eventlog/event_log_test.go
similarity index 77%
rename from internal/eventlog/eventlog_test.go
rename to internal/eventlog/event_log_test.go
index 851b4214..f20a01bf 100644
--- a/internal/eventlog/eventlog_test.go
+++ b/internal/eventlog/event_log_test.go
@@ -13,7 +13,10 @@ import (
 	"strings"
 	"testing"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+	"github.com/ActiveMemory/ctx/internal/config/event"
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
 	"github.com/ActiveMemory/ctx/internal/notify"
 	"github.com/ActiveMemory/ctx/internal/rc"
 )
@@ -22,10 +25,10 @@
 // and returns a cleanup function.
 func setupTestDir(t *testing.T, enableLog bool) string {
 	t.Helper()
-	dir := t.TempDir()
+	tmpDir := t.TempDir()
 
 	rc.Reset()
-	rc.OverrideContextDir(filepath.Join(dir, config.DirContext))
+	rc.OverrideContextDir(filepath.Join(tmpDir, dir.Context))
 
 	// Write .ctxrc to control event_log.
 	rcContent := "event_log: false\n"
@@ -33,14 +36,14 @@ func setupTestDir(t *testing.T, enableLog bool) string {
 		rcContent = "event_log: true\n"
 	}
 	if writeErr := os.WriteFile(
-		filepath.Join(dir, config.FileContextRC), []byte(rcContent), config.PermFile,
+		filepath.Join(tmpDir, file.CtxRC), []byte(rcContent), fs.PermFile,
 	); writeErr != nil {
 		t.Fatalf("failed to write .ctxrc: %v", writeErr)
 	}
 
 	// Change to temp dir so rc loads the .ctxrc.
 	origDir, _ := os.Getwd()
-	if chErr := os.Chdir(dir); chErr != nil {
+	if chErr := os.Chdir(tmpDir); chErr != nil {
 		t.Fatalf("failed to chdir: %v", chErr)
 	}
 	rc.Reset() // force reload with new cwd
@@ -50,12 +53,12 @@
 		rc.Reset()
 	})
 
-	return dir
+	return tmpDir
 }
 
 func TestAppend_Disabled(t *testing.T) {
-	dir := setupTestDir(t, false)
-	logPath := filepath.Join(dir, config.DirContext, config.DirState, config.FileEventLog)
+	tmpDir := setupTestDir(t, false)
+	logPath := filepath.Join(tmpDir, dir.Context, dir.State, event.FileEventLog)
 
 	Append("relay", "test message", "session-1", nil)
 
@@ -65,8 +68,8 @@
 }
 
 func TestAppend_Basic(t *testing.T) {
-	dir := setupTestDir(t, true)
-	logPath := filepath.Join(dir, config.DirContext, config.DirState, config.FileEventLog)
+	tmpDir := setupTestDir(t, true)
+	logPath := filepath.Join(tmpDir, dir.Context, dir.State, event.FileEventLog)
 
 	detail := notify.NewTemplateRef("qa-reminder", "gate", nil)
 	Append("relay", "QA gate reminder", "session-1", detail)
@@ -99,8 +102,8 @@
 }
 
 func TestAppend_CreatesStateDir(t *testing.T) {
-	dir := setupTestDir(t, true)
-	stateDir := filepath.Join(dir, config.DirContext, config.DirState)
+	tmpDir := setupTestDir(t, true)
+	stateDir := filepath.Join(tmpDir, dir.Context, dir.State)
 
 	// Verify state dir doesn't exist yet.
 	if _, statErr := os.Stat(stateDir); !os.IsNotExist(statErr) {
@@ -115,18 +118,18 @@
 }
 
 func TestAppend_Rotation(t *testing.T) {
-	dir := setupTestDir(t, true)
-	logPath := filepath.Join(dir, config.DirContext, config.DirState, config.FileEventLog)
-	prevPath := filepath.Join(dir, config.DirContext, config.DirState, config.FileEventLogPrev)
+	tmpDir := setupTestDir(t, true)
+	logPath := filepath.Join(tmpDir, dir.Context, dir.State, event.FileEventLog)
+	prevPath := filepath.Join(tmpDir, dir.Context, dir.State, event.FileEventLogPrev)
 
 	// Create state dir and write a file that exceeds the max size.
-	stateDir := filepath.Join(dir, config.DirContext, config.DirState)
-	if mkErr := os.MkdirAll(stateDir, config.PermExec); mkErr != nil {
+	stateDir := filepath.Join(tmpDir, dir.Context, dir.State)
+	if mkErr := os.MkdirAll(stateDir, fs.PermExec); mkErr != nil {
 		t.Fatalf("failed to create state dir: %v", mkErr)
 	}
 	bigContent := strings.Repeat(`{"event":"relay","message":"filler"}`+"\n", 40000)
-	if writeErr := os.WriteFile(logPath, []byte(bigContent), config.PermFile); writeErr != nil {
+	if writeErr := os.WriteFile(logPath, []byte(bigContent), fs.PermFile); writeErr != nil {
 		t.Fatalf("failed to write big log: %v", writeErr)
 	}
 
@@ -149,23 +152,23 @@
 }
 
 func TestAppend_RotationOverwrite(t *testing.T) {
-	dir := setupTestDir(t, true)
-	logPath := filepath.Join(dir, config.DirContext, config.DirState, config.FileEventLog)
-	prevPath := filepath.Join(dir, config.DirContext, config.DirState, config.FileEventLogPrev)
+	tmpDir := setupTestDir(t, true)
+	logPath := filepath.Join(tmpDir, dir.Context, dir.State, event.FileEventLog)
+	prevPath := filepath.Join(tmpDir, dir.Context, dir.State, event.FileEventLogPrev)
 
-	stateDir := filepath.Join(dir, config.DirContext, config.DirState)
-	if mkErr := os.MkdirAll(stateDir, config.PermExec); mkErr != nil {
+	stateDir := filepath.Join(tmpDir, dir.Context, dir.State)
+	if mkErr := os.MkdirAll(stateDir, fs.PermExec); mkErr != nil {
 		t.Fatalf("failed to create state dir: %v", mkErr)
 	}
 
 	// Create an existing .1 file.
-	if writeErr := os.WriteFile(prevPath, []byte("old rotated content\n"), config.PermFile); writeErr != nil {
+	if writeErr := os.WriteFile(prevPath, []byte("old rotated content\n"), fs.PermFile); writeErr != nil {
 		t.Fatalf("failed to write old .1 file: %v", writeErr)
 	}
 
 	// Write oversized current log.
 	bigContent := strings.Repeat(`{"event":"relay","message":"filler"}`+"\n", 40000)
-	if writeErr := os.WriteFile(logPath, []byte(bigContent), config.PermFile); writeErr != nil {
+	if writeErr := os.WriteFile(logPath, []byte(bigContent), fs.PermFile); writeErr != nil {
 		t.Fatalf("failed to write big log: %v", writeErr)
 	}
 
@@ -245,16 +248,16 @@
 }
 
 func TestQuery_IncludeRotated(t *testing.T) {
-	dir := setupTestDir(t, true)
-	stateDir := filepath.Join(dir, config.DirContext, config.DirState)
-	if mkErr := os.MkdirAll(stateDir, config.PermExec); mkErr != nil {
+	tmpDir := setupTestDir(t, true)
+	stateDir := filepath.Join(tmpDir, dir.Context, dir.State)
+	if mkErr := os.MkdirAll(stateDir, fs.PermExec); mkErr != nil {
 		t.Fatalf("failed to create state dir: %v", mkErr)
 	}
 
 	// Write events to rotated file.
-	prevPath := filepath.Join(stateDir, config.FileEventLogPrev)
+	prevPath := filepath.Join(stateDir, event.FileEventLogPrev)
 	prevLine := `{"event":"relay","message":"old event","timestamp":"2026-01-01T00:00:00Z","project":"test"}` + "\n"
-	if writeErr := os.WriteFile(prevPath, []byte(prevLine), config.PermFile); writeErr != nil {
+	if writeErr := os.WriteFile(prevPath, []byte(prevLine), fs.PermFile); writeErr != nil {
 		t.Fatalf("failed to write .1 file: %v", writeErr)
 	}
 
@@ -280,18 +283,18 @@
 }
 
 func TestQuery_CorruptLine(t *testing.T) {
-	dir := setupTestDir(t, true)
-	stateDir := filepath.Join(dir, config.DirContext, config.DirState)
-	if mkErr := os.MkdirAll(stateDir, config.PermExec); mkErr != nil {
+	tmpDir := setupTestDir(t, true)
+	stateDir := filepath.Join(tmpDir, dir.Context, dir.State)
+	if mkErr := os.MkdirAll(stateDir, fs.PermExec); mkErr != nil {
 		t.Fatalf("failed to create state dir: %v", mkErr)
 	}
-	logPath := filepath.Join(stateDir, config.FileEventLog)
+	logPath := filepath.Join(stateDir, event.FileEventLog)
 	content := `{"event":"relay","message":"good","timestamp":"2026-01-01T00:00:00Z","project":"test"}
 not valid json
 {"event":"nudge","message":"also good","timestamp":"2026-01-02T00:00:00Z","project":"test"}
 `
-	if writeErr := os.WriteFile(logPath, []byte(content), config.PermFile); writeErr != nil {
+	if writeErr := os.WriteFile(logPath, []byte(content), fs.PermFile); writeErr != nil {
 		t.Fatalf("failed to write log: %v", writeErr)
 	}
diff --git a/internal/index/entry.go b/internal/index/entry.go
index 122e3af7..91ca3c3f 100644
--- a/internal/index/entry.go
+++ b/internal/index/entry.go
@@ -9,24 +9,11 @@ package index
 import (
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/marker"
+	"github.com/ActiveMemory/ctx/internal/config/regex"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )
 
-// EntryBlock represents a parsed entry block from a knowledge file
-// (DECISIONS.md or LEARNINGS.md).
-//
-// Fields:
-// - Entry: The parsed header metadata (timestamp, date, title)
-// - Lines: All lines belonging to this entry (header + body)
-// - StartIndex: Zero-based line index where this entry starts
-// - EndIndex: Zero-based line index where this entry ends (exclusive)
-type EntryBlock struct {
-	Entry      Entry
-	Lines      []string
-	StartIndex int
-	EndIndex   int
-}
-
 // ParseEntryBlocks splits file content into discrete entry blocks.
 //
 // Each block starts at a "## [YYYY-MM-DD-HHMMSS] Title" header and extends
@@ -42,7 +29,7 @@ func ParseEntryBlocks(content string) []EntryBlock {
 		return nil
 	}
 
-	lines := strings.Split(content, config.NewlineLF)
+	lines := strings.Split(content, token.NewlineLF)
 	var blocks []EntryBlock
 
 	// Find all entry header positions
@@ -53,12 +40,12 @@ func ParseEntryBlocks(content string) []EntryBlock {
 
 	var headers []headerPos
 	for i, line := range lines {
-		matches := config.RegExEntryHeader.FindStringSubmatch(line)
-		if len(matches) == 4 {
+		matches := regex.EntryHeader.FindStringSubmatch(line)
+		if len(matches) == regex.EntryHeaderGroups {
 			headers = append(headers, headerPos{
 				lineIdx: i,
 				entry: Entry{
-					Timestamp: matches[1] + "-" + matches[2],
+					Timestamp: matches[1] + token.Dash + matches[2],
 					Date:      matches[1],
 					Title:     matches[3],
 				},
@@ -104,7 +91,7 @@ func (eb *EntryBlock) IsSuperseded() bool {
 	for _, line := range eb.Lines {
 		trimmed := strings.TrimSpace(line)
-		if strings.HasPrefix(trimmed, "~~Superseded") {
+		if strings.HasPrefix(trimmed, marker.PrefixSuperseded) {
 			return true
 		}
 	}
@@ -116,5 +103,5 @@
 // Returns:
 // - string: The full entry content with lines joined by newlines
 func (eb *EntryBlock) BlockContent() string {
-	return strings.Join(eb.Lines, config.NewlineLF)
+	return strings.Join(eb.Lines, token.NewlineLF)
 }
diff --git a/internal/index/index.go b/internal/index/index.go
index bca03434..aa00b8c5 100644
--- a/internal/index/index.go
+++ b/internal/index/index.go
@@ -13,21 +13,14 @@ import (
 	"os"
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	"github.com/ActiveMemory/ctx/internal/config/marker"
+	"github.com/ActiveMemory/ctx/internal/config/regex"
+	"github.com/ActiveMemory/ctx/internal/config/token"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 )
 
-// Entry represents a parsed entry header from a context file.
-//
-// Fields:
-// - Timestamp: Full timestamp (YYYY-MM-DD-HHMMSS)
-// - Date: Date only (YYYY-MM-DD)
-// - Title: Entry title
-type Entry struct {
-	Timestamp string
-	Date      string
-	Title     string
-}
-
 // ParseHeaders extracts all entries from file content.
 //
 // It scans for headers matching the pattern "## [YYYY-MM-DD-HHMMSS] Title"
@@ -41,14 +34,14 @@ type Entry struct {
 func ParseHeaders(content string) []Entry {
 	var entries []Entry
-	matches := config.RegExEntryHeader.FindAllStringSubmatch(content, -1)
+	matches := regex.EntryHeader.FindAllStringSubmatch(content, -1)
 	for _, match := range matches {
-		if len(match) == 4 {
+		if len(match) == regex.EntryHeaderGroups {
 			date := match[1]
 			time := match[2]
 			title := match[3]
 			entries = append(entries, Entry{
-				Timestamp: date + "-" + time,
+				Timestamp: date + token.Dash + time,
 				Date:      date,
 				Title:     title,
 			})
@@ -74,7 +67,7 @@ func GenerateTable(entries []Entry, columnHeader string) string {
 		return ""
 	}
 
-	nl := config.NewlineLF
+	nl := token.NewlineLF
 	var sb strings.Builder
 	sb.WriteString("| Date | ")
 	sb.WriteString(columnHeader)
@@ -112,11 +105,11 @@ func GenerateTable(entries []Entry, columnHeader string) string {
 func Update(content, fileHeader, columnHeader string) string {
 	entries := ParseHeaders(content)
 	indexContent := GenerateTable(entries, columnHeader)
-	nl := config.NewlineLF
+	nl := token.NewlineLF
 
 	// Check if markers already exist
-	startIdx := strings.Index(content, config.IndexStart)
-	endIdx := strings.Index(content, config.IndexEnd)
+	startIdx := strings.Index(content, marker.IndexStart)
+	endIdx := strings.Index(content, marker.IndexEnd)
 
 	if startIdx != -1 && endIdx != -1 && endIdx > startIdx {
 		// Replace the existing index
@@ -124,7 +117,7 @@ func Update(content, fileHeader, columnHeader string) string {
 			// No entries - remove index entirely (including markers
 			// and surrounding whitespace)
 			before := strings.TrimRight(content[:startIdx], nl)
-			after := content[endIdx+len(config.IndexEnd):]
+			after := content[endIdx+len(marker.IndexEnd):]
 			after = strings.TrimLeft(after, nl)
 			if after != "" {
 				return before + nl + nl + after
@@ -132,7 +125,7 @@ func Update(content, fileHeader, columnHeader string) string {
 			return before + nl
 		}
 		// Replace content between markers
-		before := content[:startIdx+len(config.IndexStart)]
+		before := content[:startIdx+len(marker.IndexStart)]
 		after := content[endIdx:]
 		return before + nl + indexContent + after
 	}
@@ -154,8 +147,8 @@ func Update(content, fileHeader, columnHeader string) string {
 	if lineEnd == -1 {
 		// Header is at the end of the file
 		return content + nl + nl +
-			config.IndexStart + nl + indexContent +
-			config.IndexEnd + nl
+			marker.IndexStart + nl + indexContent +
+			marker.IndexEnd + nl
 	}
 
 	insertPoint := headerIdx + lineEnd + 1
@@ -164,10 +157,10 @@ func Update(content, fileHeader, columnHeader string) string {
 	var sb strings.Builder
 	sb.WriteString(content[:insertPoint])
 	sb.WriteString(nl)
-	sb.WriteString(config.IndexStart)
+	sb.WriteString(marker.IndexStart)
 	sb.WriteString(nl)
 	sb.WriteString(indexContent)
-	sb.WriteString(config.IndexEnd)
+	sb.WriteString(marker.IndexEnd)
 	sb.WriteString(nl)
 	sb.WriteString(content[insertPoint:])
 
@@ -182,7 +175,7 @@ func Update(content, fileHeader, columnHeader string) string {
 // Returns:
 // - string: Updated content with regenerated index
 func UpdateDecisions(content string) string {
-	return Update(content, config.HeadingDecisions, config.ColumnDecision)
+	return Update(content, assets.HeadingDecisions, assets.ColumnDecision)
 }
 
 // UpdateLearnings regenerates the learning index in LEARNINGS.md content.
@@ -193,7 +186,7 @@ func UpdateDecisions(content string) string {
 // Returns:
 // - string: Updated content with regenerated index
 func UpdateLearnings(content string) string {
-	return Update(content, config.HeadingLearnings, config.ColumnLearning)
+	return Update(content, assets.HeadingLearnings, assets.ColumnLearning)
 }
 
 // ReindexFile reads a context file, regenerates its index, and writes it back.
@@ -221,31 +214,31 @@ func ReindexFile(
 	entryType string,
 ) error {
 	if _, err := os.Stat(filePath); os.IsNotExist(err) {
-		return fmt.Errorf("%s not found. Run 'ctx init' first", fileName)
+		return ctxerr.ReindexFileNotFound(fileName)
 	}
 
 	content, err := os.ReadFile(filePath) //nolint:gosec // G304: filePath is constructed from known config paths
 	if err != nil {
-		return fmt.Errorf("failed to read %s: %w", filePath, err)
+		return ctxerr.ReindexFileRead(filePath, err)
 	}
 
 	updated := updateFunc(string(content))
-	if err := os.WriteFile(filePath, []byte(updated), config.PermFile); err != nil {
-		return fmt.Errorf("failed to write %s: %w", filePath, err)
+	if err := os.WriteFile(filePath, []byte(updated), fs.PermFile); err != nil {
+		return ctxerr.ReindexFileWrite(filePath, err)
 	}
 
 	entries := ParseHeaders(string(content))
 	if len(entries) == 0 {
 		_, err := fmt.Fprintf(
-			w, "✓ Index cleared (no %s found)\n", entryType)
+			w, assets.TextDesc(assets.TextDescKeyDriftCleared)+token.NewlineLF, entryType)
 		if err != nil {
 			return err
 		}
 	} else {
 		_, err := fmt.Fprintf(
 			w,
-			"✓ Index regenerated with %d entries\n", len(entries),
+			assets.TextDesc(assets.TextDescKeyDriftRegenerated)+token.NewlineLF, len(entries),
 		)
 		if err != nil {
 			return err
diff --git a/internal/index/index_test.go b/internal/index/index_test.go
index fcc4b6b5..b3210016 100644
--- a/internal/index/index_test.go
+++
b/internal/index/index_test.go @@ -10,7 +10,7 @@ import ( "strings" "testing" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/marker" ) func TestParseHeaders(t *testing.T) { @@ -183,7 +183,7 @@ func TestUpdateDecisions(t *testing.T) { { name: "empty file with header", content: "# Decisions\n", - wantNot: []string{config.IndexStart, config.IndexEnd}, + wantNot: []string{marker.IndexStart, marker.IndexEnd}, }, { name: "file with one decision", @@ -194,8 +194,8 @@ func TestUpdateDecisions(t *testing.T) { **Status**: Accepted `, wantHas: []string{ - config.IndexStart, - config.IndexEnd, + marker.IndexStart, + marker.IndexEnd, "| Date | Decision |", "| 2026-01-28 | Test decision |", "## [2026-01-28-051426] Test decision", @@ -216,8 +216,8 @@ func TestUpdateDecisions(t *testing.T) { **Status**: Accepted `, wantHas: []string{ - config.IndexStart, - config.IndexEnd, + marker.IndexStart, + marker.IndexEnd, "| 2026-01-28 | New decision |", }, wantNot: []string{ @@ -237,8 +237,8 @@ func TestUpdateDecisions(t *testing.T) { Some other content. 
`, wantNot: []string{ - config.IndexStart, - config.IndexEnd, + marker.IndexStart, + marker.IndexEnd, "| Date | Decision |", }, wantHas: []string{ @@ -297,10 +297,10 @@ func TestUpdateDecisions_PreservesContent(t *testing.T) { got := UpdateDecisions(content) - if !strings.Contains(got, config.IndexStart) { + if !strings.Contains(got, marker.IndexStart) { t.Error("Missing INDEX:START marker") } - if !strings.Contains(got, config.IndexEnd) { + if !strings.Contains(got, marker.IndexEnd) { t.Error("Missing INDEX:END marker") } @@ -345,7 +345,7 @@ func TestUpdateLearnings(t *testing.T) { { name: "empty file with header", content: "# Learnings\n", - wantNot: []string{config.IndexStart, config.IndexEnd}, + wantNot: []string{marker.IndexStart, marker.IndexEnd}, }, { name: "file with one learning", @@ -360,8 +360,8 @@ func TestUpdateLearnings(t *testing.T) { **Application**: Always use all three flags `, wantHas: []string{ - config.IndexStart, - config.IndexEnd, + marker.IndexStart, + marker.IndexEnd, "| Date | Learning |", "| 2026-01-28 | Required flags now enforced |", }, diff --git a/internal/index/types.go b/internal/index/types.go new file mode 100644 index 00000000..a2637ca2 --- /dev/null +++ b/internal/index/types.go @@ -0,0 +1,34 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package index + +// Entry represents a parsed entry header from a context file. +// +// Fields: +// - Timestamp: Full timestamp (YYYY-MM-DD-HHMMSS) +// - Date: Date only (YYYY-MM-DD) +// - Title: Entry title +type Entry struct { + Timestamp string + Date string + Title string +} + +// EntryBlock represents a parsed entry block from a knowledge file +// (DECISIONS.md or LEARNINGS.md). 
+// +// Fields: +// - Entry: The parsed header metadata (timestamp, date, title) +// - Lines: All lines belonging to this entry (header + body) +// - StartIndex: Zero-based line index where this entry starts +// - EndIndex: Zero-based line index where this entry ends (exclusive) +type EntryBlock struct { + Entry Entry + Lines []string + StartIndex int + EndIndex int +} diff --git a/internal/journal/state/state.go b/internal/journal/state/state.go index 6171fe02..80337e6a 100644 --- a/internal/journal/state/state.go +++ b/internal/journal/state/state.go @@ -17,32 +17,18 @@ import ( "path/filepath" "time" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/file" + "github.com/ActiveMemory/ctx/internal/config/fs" + "github.com/ActiveMemory/ctx/internal/config/journal" ) // CurrentVersion is the schema version for the state file. const CurrentVersion = 1 -// JournalState is the top-level state file structure. -type JournalState struct { - Version int `json:"version"` - Entries map[string]FileState `json:"entries"` -} - -// FileState tracks processing stages for a single journal entry. -// Values are date strings (YYYY-MM-DD) indicating when the stage completed. -type FileState struct { - Exported string `json:"exported,omitempty"` - Enriched string `json:"enriched,omitempty"` - Normalized string `json:"normalized,omitempty"` - FencesVerified string `json:"fences_verified,omitempty"` - Locked string `json:"locked,omitempty"` -} - // Load reads the state file from the journal directory. If the file does // not exist, an empty state is returned (not an error). 
func Load(journalDir string) (*JournalState, error) { - path := filepath.Join(journalDir, config.FileJournalState) + path := filepath.Join(journalDir, journal.FileState) data, err := os.ReadFile(filepath.Clean(path)) if os.IsNotExist(err) { @@ -74,10 +60,10 @@ func (s *JournalState) Save(journalDir string) error { } data = append(data, '\n') - path := filepath.Join(journalDir, config.FileJournalState) + path := filepath.Join(journalDir, journal.FileState) tmp := path + ".tmp" - if err := os.WriteFile(tmp, data, config.PermFile); err != nil { + if err := os.WriteFile(tmp, data, fs.PermFile); err != nil { return err } return os.Rename(tmp, path) @@ -128,15 +114,15 @@ func (s *JournalState) MarkFencesVerified(filename string) { func (s *JournalState) Mark(filename, stage string) bool { fs := s.Entries[filename] switch stage { - case "exported": + case journal.StageExported: fs.Exported = today() - case "enriched": + case journal.StageEnriched: fs.Enriched = today() - case "normalized": + case journal.StageNormalized: fs.Normalized = today() - case "fences_verified": + case journal.StageFencesVerified: fs.FencesVerified = today() - case "locked": + case journal.StageLocked: fs.Locked = today() default: return false @@ -156,15 +142,15 @@ func (s *JournalState) Mark(filename, stage string) bool { func (s *JournalState) Clear(filename, stage string) bool { fs := s.Entries[filename] switch stage { - case "exported": + case journal.StageExported: fs.Exported = "" - case "enriched": + case journal.StageEnriched: fs.Enriched = "" - case "normalized": + case journal.StageNormalized: fs.Normalized = "" - case "fences_verified": + case journal.StageFencesVerified: fs.FencesVerified = "" - case "locked": + case journal.StageLocked: fs.Locked = "" default: return false @@ -203,23 +189,23 @@ func (s *JournalState) ClearEnriched(filename string) { s.Entries[filename] = fs } -// IsEnriched reports whether the file has been enriched. 
-func (s *JournalState) IsEnriched(filename string) bool { +// Enriched reports whether the file has been enriched. +func (s *JournalState) Enriched(filename string) bool { return s.Entries[filename].Enriched != "" } -// IsNormalized reports whether the file has been normalized. -func (s *JournalState) IsNormalized(filename string) bool { +// Normalized reports whether the file has been normalized. +func (s *JournalState) Normalized(filename string) bool { return s.Entries[filename].Normalized != "" } -// IsFencesVerified reports whether the file's fences have been verified. -func (s *JournalState) IsFencesVerified(filename string) bool { +// FencesVerified reports whether the file's fences have been verified. +func (s *JournalState) FencesVerified(filename string) bool { return s.Entries[filename].FencesVerified != "" } -// IsExported reports whether the file has been exported. -func (s *JournalState) IsExported(filename string) bool { +// Exported reports whether the file has been exported. +func (s *JournalState) Exported(filename string) bool { return s.Entries[filename].Exported != "" } @@ -233,10 +219,10 @@ func (s *JournalState) CountUnenriched(journalDir string) int { count := 0 for _, entry := range entries { - if entry.IsDir() || filepath.Ext(entry.Name()) != config.ExtMarkdown { + if entry.IsDir() || filepath.Ext(entry.Name()) != file.ExtMarkdown { continue } - if !s.IsEnriched(entry.Name()) { + if !s.Enriched(entry.Name()) { count++ } } @@ -245,5 +231,5 @@ func (s *JournalState) CountUnenriched(journalDir string) int { // ValidStages lists the recognized stage names for Mark() and Clear(). 
var ValidStages = []string{ - "exported", "enriched", "normalized", "fences_verified", "locked", + journal.StageExported, journal.StageEnriched, journal.StageNormalized, journal.StageFencesVerified, journal.StageLocked, } diff --git a/internal/journal/state/state_test.go b/internal/journal/state/state_test.go index b82eb283..d05c0828 100644 --- a/internal/journal/state/state_test.go +++ b/internal/journal/state/state_test.go @@ -11,7 +11,8 @@ import ( "path/filepath" "testing" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/fs" + "github.com/ActiveMemory/ctx/internal/config/journal" ) func TestLoad_MissingFile(t *testing.T) { @@ -70,12 +71,12 @@ func TestCountUnenriched(t *testing.T) { // Create some .md files for _, name := range []string{"a.md", "b.md", "c.md"} { - if err := os.WriteFile(filepath.Join(dir, name), []byte("content"), config.PermFile); err != nil { + if err := os.WriteFile(filepath.Join(dir, name), []byte("content"), fs.PermFile); err != nil { t.Fatal(err) } } // Create a non-md file that should be ignored - if err := os.WriteFile(filepath.Join(dir, "state.json"), []byte("{}"), config.PermFile); err != nil { + if err := os.WriteFile(filepath.Join(dir, "state.json"), []byte("{}"), fs.PermFile); err != nil { t.Fatal(err) } @@ -152,27 +153,27 @@ func TestQueryHelpers(t *testing.T) { }, } - if !s.IsExported("full.md") { + if !s.Exported("full.md") { t.Error("full.md should be exported") } - if !s.IsEnriched("full.md") { + if !s.Enriched("full.md") { t.Error("full.md should be enriched") } - if !s.IsNormalized("full.md") { + if !s.Normalized("full.md") { t.Error("full.md should be normalized") } - if !s.IsFencesVerified("full.md") { + if !s.FencesVerified("full.md") { t.Error("full.md should have fences verified") } - if !s.IsExported("partial.md") { + if !s.Exported("partial.md") { t.Error("partial.md should be exported") } - if s.IsEnriched("partial.md") { + if s.Enriched("partial.md") { 
t.Error("partial.md should not be enriched") } - if s.IsExported("missing.md") { + if s.Exported("missing.md") { t.Error("missing.md should not be exported") } } @@ -188,17 +189,17 @@ func TestClearEnriched(t *testing.T) { }, } - if !s.IsEnriched("test.md") { + if !s.Enriched("test.md") { t.Fatal("should be enriched before clear") } s.ClearEnriched("test.md") - if s.IsEnriched("test.md") { + if s.Enriched("test.md") { t.Error("should not be enriched after ClearEnriched") } // Other fields should be untouched - if !s.IsExported("test.md") { + if !s.Exported("test.md") { t.Error("exported should be preserved after ClearEnriched") } } @@ -213,7 +214,7 @@ func TestClearEnriched_NoOp(t *testing.T) { // Should not panic on file that isn't enriched s.ClearEnriched("test.md") - if s.IsEnriched("test.md") { + if s.Enriched("test.md") { t.Error("should remain unenriched") } @@ -227,10 +228,10 @@ func TestMark(t *testing.T) { Entries: make(map[string]FileState), } - if ok := s.Mark("test.md", "exported"); !ok { + if ok := s.Mark("test.md", journal.StageExported); !ok { t.Error("Mark exported should succeed") } - if !s.IsExported("test.md") { + if !s.Exported("test.md") { t.Error("test.md should be exported after Mark") } @@ -270,17 +271,17 @@ func TestClear(t *testing.T) { }, } - if ok := s.Clear("test.md", "locked"); !ok { + if ok := s.Clear("test.md", journal.StageLocked); !ok { t.Error("Clear locked should succeed") } if s.Locked("test.md") { t.Error("should not be locked after Clear") } // Other fields preserved. 
- if !s.IsExported("test.md") { + if !s.Exported("test.md") { t.Error("exported should be preserved after Clear locked") } - if !s.IsEnriched("test.md") { + if !s.Enriched("test.md") { t.Error("enriched should be preserved after Clear locked") } @@ -322,12 +323,12 @@ func TestLocked(t *testing.T) { t.Error("should not be locked initially") } - s.Mark("test.md", "locked") + s.Mark("test.md", journal.StageLocked) if !s.Locked("test.md") { t.Error("should be locked after Mark") } - s.Clear("test.md", "locked") + s.Clear("test.md", journal.StageLocked) if s.Locked("test.md") { t.Error("should not be locked after Clear") } diff --git a/internal/journal/state/types.go b/internal/journal/state/types.go new file mode 100644 index 00000000..f6e4ba81 --- /dev/null +++ b/internal/journal/state/types.go @@ -0,0 +1,23 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package state + +// JournalState is the top-level state file structure. +type JournalState struct { + Version int `json:"version"` + Entries map[string]FileState `json:"entries"` +} + +// FileState tracks processing stages for a single journal entry. +// Values are date strings (YYYY-MM-DD) indicating when the stage completed. 
+type FileState struct { + Exported string `json:"exported,omitempty"` + Enriched string `json:"enriched,omitempty"` + Normalized string `json:"normalized,omitempty"` + FencesVerified string `json:"fences_verified,omitempty"` + Locked string `json:"locked,omitempty"` +} diff --git a/internal/mcp/doc.go b/internal/mcp/doc.go index a9f535bd..bd2e4258 100644 --- a/internal/mcp/doc.go +++ b/internal/mcp/doc.go @@ -44,7 +44,7 @@ // // # Usage // -// server := mcp.NewServer(contextDir) +// server := mcp.NewServer(contextDir, version) // server.Serve() // blocks, reads stdin, writes stdout // // # Design Invariants diff --git a/internal/mcp/resources.go b/internal/mcp/resources.go index 59c4a391..19ef2d27 100644 --- a/internal/mcp/resources.go +++ b/internal/mcp/resources.go @@ -11,7 +11,10 @@ import ( "fmt" "strings" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/assets" + ctxCfg "github.com/ActiveMemory/ctx/internal/config/ctx" + "github.com/ActiveMemory/ctx/internal/config/mcp" + "github.com/ActiveMemory/ctx/internal/config/token" "github.com/ActiveMemory/ctx/internal/context" ) @@ -25,19 +28,19 @@ type resourceMapping struct { // resourceTable defines all individual context file resources. 
var resourceTable = []resourceMapping{ - {config.FileConstitution, "constitution", "Hard rules that must never be violated"}, - {config.FileTask, "tasks", "Current work items and their status"}, - {config.FileConvention, "conventions", "Code patterns and standards"}, - {config.FileArchitecture, "architecture", "System architecture documentation"}, - {config.FileDecision, "decisions", "Architectural decisions with rationale"}, - {config.FileLearning, "learnings", "Gotchas, tips, and lessons learned"}, - {config.FileGlossary, "glossary", "Project-specific terminology"}, - {config.FileAgentPlaybook, "playbook", "How agents should use this system"}, + {ctxCfg.Constitution, "constitution", assets.TextDesc(assets.TextDescKeyMCPResConstitution)}, + {ctxCfg.Task, "tasks", assets.TextDesc(assets.TextDescKeyMCPResTasks)}, + {ctxCfg.Convention, "conventions", assets.TextDesc(assets.TextDescKeyMCPResConventions)}, + {ctxCfg.Architecture, "architecture", assets.TextDesc(assets.TextDescKeyMCPResArchitecture)}, + {ctxCfg.Decision, "decisions", assets.TextDesc(assets.TextDescKeyMCPResDecisions)}, + {ctxCfg.Learning, "learnings", assets.TextDesc(assets.TextDescKeyMCPResLearnings)}, + {ctxCfg.Glossary, "glossary", assets.TextDesc(assets.TextDescKeyMCPResGlossary)}, + {ctxCfg.AgentPlaybook, "playbook", assets.TextDesc(assets.TextDescKeyMCPResPlaybook)}, } // resourceURI builds a resource URI from a suffix. func resourceURI(name string) string { - return "ctx://context/" + name + return mcp.MCPResourceURIPrefix + name } // handleResourcesList returns all available MCP resources. 
@@ -49,7 +52,7 @@ func (s *Server) handleResourcesList(req Request) *Response { resources = append(resources, Resource{ URI: resourceURI(rm.name), Name: rm.name, - MimeType: "text/markdown", + MimeType: mcp.MimeMarkdown, Description: rm.desc, }) } @@ -58,8 +61,8 @@ func (s *Server) handleResourcesList(req Request) *Response { resources = append(resources, Resource{ URI: resourceURI("agent"), Name: "agent", - MimeType: "text/markdown", - Description: "All context files assembled in priority read order", + MimeType: mcp.MimeMarkdown, + Description: assets.TextDesc(assets.TextDescKeyMCPResAgent), }) return s.ok(req.ID, ResourceListResult{Resources: resources}) @@ -69,13 +72,13 @@ func (s *Server) handleResourcesList(req Request) *Response { func (s *Server) handleResourcesRead(req Request) *Response { var params ReadResourceParams if err := json.Unmarshal(req.Params, ¶ms); err != nil { - return s.error(req.ID, errCodeInvalidArg, "invalid params") + return s.error(req.ID, errCodeInvalidArg, assets.TextDesc(assets.TextDescKeyMCPInvalidParams)) } ctx, err := context.Load(s.contextDir) if err != nil { return s.error(req.ID, errCodeInternal, - fmt.Sprintf("failed to load context: %v", err)) + fmt.Sprintf(assets.TextDesc(assets.TextDescKeyMCPLoadContext), err)) } // Check for individual file resources. @@ -91,7 +94,7 @@ func (s *Server) handleResourcesRead(req Request) *Response { } return s.error(req.ID, errCodeInvalidArg, - fmt.Sprintf("unknown resource: %s", params.URI)) + fmt.Sprintf(assets.TextDesc(assets.TextDescKeyMCPUnknownResource), params.URI)) } // readContextFile returns the content of a single context file. 
@@ -101,13 +104,13 @@ func (s *Server) readContextFile( f := ctx.File(fileName) if f == nil { return s.error(id, errCodeInvalidArg, - fmt.Sprintf("file not found: %s", fileName)) + fmt.Sprintf(assets.TextDesc(assets.TextDescKeyMCPFileNotFound), fileName)) } return s.ok(id, ReadResourceResult{ Contents: []ResourceContent{{ URI: uri, - MimeType: "text/markdown", + MimeType: mcp.MimeMarkdown, Text: string(f.Content), }}, }) @@ -116,26 +119,27 @@ func (s *Server) readContextFile( // readAgentPacket assembles all context files in read order into a // single response, respecting the configured token budget. // -// Files are added in priority order (FileReadOrder). When the token +// Files are added in priority order (ReadOrder). When the token // budget would be exceeded, remaining files are listed as "Also noted" // summaries instead of included in full. func (s *Server) readAgentPacket( id json.RawMessage, ctx *context.Context, ) *Response { var sb strings.Builder - sb.WriteString("# Context Packet\n\n") + header := assets.TextDesc(assets.TextDescKeyMCPPacketHeader) + sb.WriteString(header) - tokensUsed := context.EstimateTokensString("# Context Packet\n\n") + tokensUsed := context.EstimateTokensString(header) budget := s.tokenBudget var skipped []string - for _, fileName := range config.FileReadOrder { + for _, fileName := range ctxCfg.ReadOrder { f := ctx.File(fileName) if f == nil || f.IsEmpty { continue } - section := fmt.Sprintf("---\n## %s\n\n%s\n\n", fileName, string(f.Content)) + section := fmt.Sprintf(assets.TextDesc(assets.TextDescKeyMCPSectionFormat), fileName, string(f.Content)) sectionTokens := context.EstimateTokensString(section) if budget > 0 && tokensUsed+sectionTokens > budget { @@ -148,17 +152,17 @@ func (s *Server) readAgentPacket( } if len(skipped) > 0 { - sb.WriteString("---\n## Also Noted\n\n") + sb.WriteString(assets.TextDesc(assets.TextDescKeyMCPAlsoNoted)) for _, name := range skipped { - fmt.Fprintf(&sb, "- %s (omitted for budget)\n", name) + 
fmt.Fprintf(&sb, assets.TextDesc(assets.TextDescKeyMCPOmittedFormat), name) } - sb.WriteString(config.NewlineLF) + sb.WriteString(token.NewlineLF) } return s.ok(id, ReadResourceResult{ Contents: []ResourceContent{{ URI: resourceURI("agent"), - MimeType: "text/markdown", + MimeType: mcp.MimeMarkdown, Text: sb.String(), }}, }) diff --git a/internal/mcp/server.go b/internal/mcp/server.go index ae8feb24..46423193 100644 --- a/internal/mcp/server.go +++ b/internal/mcp/server.go @@ -10,36 +10,26 @@ import ( "bufio" "encoding/json" "fmt" - "io" "os" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/config/mcp" + "github.com/ActiveMemory/ctx/internal/config/token" "github.com/ActiveMemory/ctx/internal/rc" ) -// Server is an MCP server that exposes ctx context over JSON-RPC 2.0. -// -// It reads JSON-RPC requests from stdin and writes responses to stdout, -// following the Model Context Protocol specification. -type Server struct { - contextDir string - version string - tokenBudget int - out io.Writer - in io.Reader -} - // NewServer creates a new MCP server for the given context directory. // // Parameters: // - contextDir: Path to the .context/ directory +// - version: Binary version string for the server info response // // Returns: // - *Server: A configured MCP server ready to serve -func NewServer(contextDir string) *Server { +func NewServer(contextDir, version string) *Server { return &Server{ contextDir: contextDir, - version: config.BinaryVersion, + version: version, tokenBudget: rc.TokenBudget(), out: os.Stdout, in: os.Stdin, @@ -56,9 +46,7 @@ func NewServer(contextDir string) *Server { func (s *Server) Serve() error { scanner := bufio.NewScanner(s.in) - // Increase scanner buffer for large messages (1MB). 
- const maxScanSize = 1 << 20 - scanner.Buffer(make([]byte, 0, maxScanSize), maxScanSize) + scanner.Buffer(make([]byte, 0, mcp.MCPScanMaxSize), mcp.MCPScanMaxSize) for scanner.Scan() { line := scanner.Bytes() @@ -75,10 +63,10 @@ func (s *Server) Serve() error { out, err := json.Marshal(resp) if err != nil { // Marshal failure is an internal error; try to report it. - s.writeError(nil, errCodeInternal, "failed to marshal response") + s.writeError(nil, errCodeInternal, assets.TextDesc(assets.TextDescKeyMCPFailedMarshal)) continue } - if _, writeErr := s.out.Write(append(out, '\n')); writeErr != nil { + if _, writeErr := s.out.Write(append(out, token.NewlineLF[0])); writeErr != nil { return writeErr } } @@ -91,8 +79,8 @@ func (s *Server) handleMessage(data []byte) *Response { var req Request if err := json.Unmarshal(data, &req); err != nil { return &Response{ - JSONRPC: "2.0", - Error: &RPCError{Code: errCodeParse, Message: "parse error"}, + JSONRPC: mcp.MCPJSONRPCVersion, + Error: &RPCError{Code: errCodeParse, Message: assets.TextDesc(assets.TextDescKeyMCPParseError)}, } } @@ -108,21 +96,21 @@ func (s *Server) handleMessage(data []byte) *Response { // dispatch routes a request to the correct handler based on method name. 
func (s *Server) dispatch(req Request) *Response { switch req.Method { - case "initialize": + case mcp.MCPMethodInitialize: return s.handleInitialize(req) - case "ping": + case mcp.MCPMethodPing: return s.ok(req.ID, struct{}{}) - case "resources/list": + case mcp.MCPMethodResourcesList: return s.handleResourcesList(req) - case "resources/read": + case mcp.MCPMethodResourcesRead: return s.handleResourcesRead(req) - case "tools/list": + case mcp.MCPMethodToolsList: return s.handleToolsList(req) - case "tools/call": + case mcp.MCPMethodToolsCall: return s.handleToolsCall(req) default: return s.error(req.ID, errCodeNotFound, - fmt.Sprintf("method not found: %s", req.Method)) + fmt.Sprintf(assets.TextDesc(assets.TextDescKeyMCPMethodNotFound), req.Method)) } } @@ -143,7 +131,7 @@ func (s *Server) handleInitialize(req Request) *Response { Tools: &ToolsCap{}, }, ServerInfo: AppInfo{ - Name: "ctx", + Name: mcp.MCPServerName, Version: s.version, }, } @@ -153,7 +141,7 @@ func (s *Server) handleInitialize(req Request) *Response { // ok builds a successful JSON-RPC response. func (s *Server) ok(id json.RawMessage, result interface{}) *Response { return &Response{ - JSONRPC: "2.0", + JSONRPC: mcp.MCPJSONRPCVersion, ID: id, Result: result, } @@ -162,7 +150,7 @@ func (s *Server) ok(id json.RawMessage, result interface{}) *Response { // error builds a JSON-RPC error response. func (s *Server) error(id json.RawMessage, code int, msg string) *Response { return &Response{ - JSONRPC: "2.0", + JSONRPC: mcp.MCPJSONRPCVersion, ID: id, Error: &RPCError{Code: code, Message: msg}, } @@ -175,6 +163,6 @@ func (s *Server) writeError(id json.RawMessage, code int, msg string) { if out, err := json.Marshal(resp); err == nil { // Best-effort: writeError is a last-resort fallback; nowhere // to report a write failure from here. 
-	_, _ = s.out.Write(append(out, '\n'))
+	_, _ = s.out.Write(append(out, token.NewlineLF[0]))
 	}
 }
diff --git a/internal/mcp/server_test.go b/internal/mcp/server_test.go
index 713b3ec5..e707512e 100644
--- a/internal/mcp/server_test.go
+++ b/internal/mcp/server_test.go
@@ -14,7 +14,7 @@ import (
 	"strings"
 	"testing"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/ctx"
 )
 
 func newTestServer(t *testing.T) (*Server, string) {
@@ -25,14 +25,14 @@ func newTestServer(t *testing.T) (*Server, string) {
 		t.Fatalf("mkdir: %v", err)
 	}
 	files := map[string]string{
-		config.FileConstitution:  "# Constitution\n\n- Rule 1: Never break things\n",
-		config.FileTask:          "# Tasks\n\n- [ ] Build MCP server\n- [ ] Write tests\n",
-		config.FileDecision:      "# Decisions\n",
-		config.FileConvention:    "# Conventions\n\n- Use Go idioms\n",
-		config.FileLearning:      "# Learnings\n",
-		config.FileArchitecture:  "# Architecture\n",
-		config.FileGlossary:      "# Glossary\n",
-		config.FileAgentPlaybook: "# Agent Playbook\n\nRead context files first.\n",
+		ctx.Constitution:  "# Constitution\n\n- Rule 1: Never break things\n",
+		ctx.Task:          "# Tasks\n\n- [ ] Build MCP server\n- [ ] Write tests\n",
+		ctx.Decision:      "# Decisions\n",
+		ctx.Convention:    "# Conventions\n\n- Use Go idioms\n",
+		ctx.Learning:      "# Learnings\n",
+		ctx.Architecture:  "# Architecture\n",
+		ctx.Glossary:      "# Glossary\n",
+		ctx.AgentPlaybook: "# Agent Playbook\n\nRead context files first.\n",
 	}
 	for name, content := range files {
 		p := filepath.Join(contextDir, name)
@@ -40,7 +40,7 @@ func newTestServer(t *testing.T) (*Server, string) {
 			t.Fatalf("write %s: %v", name, err)
 		}
 	}
-	srv := NewServer(contextDir)
+	srv := NewServer(contextDir, "test")
 	return srv, contextDir
 }
 
@@ -268,7 +268,7 @@ func TestToolComplete(t *testing.T) {
 	if !strings.Contains(result.Content[0].Text, "Build MCP server") {
 		t.Errorf("expected completed task name, got: %s", result.Content[0].Text)
 	}
-	content, err := os.ReadFile(filepath.Join(contextDir, config.FileTask))
+	content, err := os.ReadFile(filepath.Join(contextDir, ctx.Task))
 	if err != nil {
 		t.Fatalf("read tasks: %v", err)
 	}
@@ -309,13 +309,13 @@ func TestToolAdd(t *testing.T) {
 		{
 			name:         "add task",
 			args:         map[string]interface{}{"type": "task", "content": "Test task"},
-			wantFile:     config.FileTask,
+			wantFile:     ctx.Task,
 			wantContains: "Test task",
 		},
 		{
 			name:         "add convention",
 			args:         map[string]interface{}{"type": "convention", "content": "Use tabs"},
-			wantFile:     config.FileConvention,
+			wantFile:     ctx.Convention,
 			wantContains: "Use tabs",
 		},
 		{
@@ -327,7 +327,7 @@
 				"rationale":    "Fast and simple",
 				"consequences": "Ops must manage Redis",
 			},
-			wantFile:     config.FileDecision,
+			wantFile:     ctx.Decision,
 			wantContains: "Use Redis",
 		},
 		{
@@ -339,7 +339,7 @@
 				"lesson":      "Only same or child dirs",
 				"application": "Keep files in internal",
 			},
-			wantFile:     config.FileLearning,
+			wantFile:     ctx.Learning,
 			wantContains: "Go embed",
 		},
 		{
diff --git a/internal/mcp/tools.go b/internal/mcp/tools.go
index 8c777507..cd385cd4 100644
--- a/internal/mcp/tools.go
+++ b/internal/mcp/tools.go
@@ -11,8 +11,12 @@ import (
 	"fmt"
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/cli/complete"
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	taskcomplete "github.com/ActiveMemory/ctx/internal/cli/task/cmd/complete"
+	"github.com/ActiveMemory/ctx/internal/config/cli"
+	entry2 "github.com/ActiveMemory/ctx/internal/config/entry"
+	"github.com/ActiveMemory/ctx/internal/config/mcp"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	"github.com/ActiveMemory/ctx/internal/context"
 	"github.com/ActiveMemory/ctx/internal/drift"
 	"github.com/ActiveMemory/ctx/internal/entry"
@@ -21,65 +25,65 @@ import (
 
 // toolDefs defines all available MCP tools.
 var toolDefs = []Tool{
 	{
-		Name:        "ctx_status",
-		Description: "Show context health: file count, token estimate, and file summaries",
-		InputSchema: InputSchema{Type: "object"},
+		Name:        mcp.MCPToolStatus,
+		Description: assets.TextDesc(assets.TextDescKeyMCPToolStatusDesc),
+		InputSchema: InputSchema{Type: mcp.MCPSchemaObject},
 		Annotations: &ToolAnnotations{ReadOnlyHint: true},
 	},
 	{
-		Name:        "ctx_add",
-		Description: "Add a task, decision, learning, or convention to the context",
+		Name:        mcp.MCPToolAdd,
+		Description: assets.TextDesc(assets.TextDescKeyMCPToolAddDesc),
 		InputSchema: InputSchema{
-			Type: "object",
+			Type: mcp.MCPSchemaObject,
 			Properties: map[string]Property{
-				"type": {
-					Type:        "string",
-					Description: "Entry type to add",
+				cli.AttrType: {
+					Type:        mcp.MCPSchemaString,
+					Description: assets.TextDesc(assets.TextDescKeyMCPToolPropType),
 					Enum:        []string{"task", "decision", "learning", "convention"},
 				},
 				"content": {
-					Type:        "string",
-					Description: "Title or main content of the entry",
+					Type:        mcp.MCPSchemaString,
+					Description: assets.TextDesc(assets.TextDescKeyMCPToolPropContent),
 				},
 				"priority": {
-					Type:        "string",
-					Description: "Priority level (for tasks only)",
+					Type:        mcp.MCPSchemaString,
+					Description: assets.TextDesc(assets.TextDescKeyMCPToolPropPriority),
 					Enum:        []string{"high", "medium", "low"},
 				},
-				"context": {
-					Type:        "string",
-					Description: "Context field (required for decisions and learnings)",
+				cli.AttrContext: {
+					Type:        mcp.MCPSchemaString,
+					Description: assets.TextDesc(assets.TextDescKeyMCPToolPropContext),
 				},
-				"rationale": {
-					Type:        "string",
-					Description: "Rationale (required for decisions)",
+				cli.AttrRationale: {
+					Type:        mcp.MCPSchemaString,
+					Description: assets.TextDesc(assets.TextDescKeyMCPToolPropRationale),
 				},
-				"consequences": {
-					Type:        "string",
-					Description: "Consequences (required for decisions)",
+				cli.AttrConsequences: {
+					Type:        mcp.MCPSchemaString,
+					Description: assets.TextDesc(assets.TextDescKeyMCPToolPropConseq),
 				},
-				"lesson": {
-					Type:        "string",
-					Description: "Lesson learned (required for learnings)",
+				cli.AttrLesson: {
+					Type:        mcp.MCPSchemaString,
+					Description: assets.TextDesc(assets.TextDescKeyMCPToolPropLesson),
 				},
-				"application": {
-					Type:        "string",
-					Description: "How to apply this lesson (required for learnings)",
+				cli.AttrApplication: {
+					Type:        mcp.MCPSchemaString,
+					Description: assets.TextDesc(assets.TextDescKeyMCPToolPropApplication),
 				},
 			},
-			Required: []string{"type", "content"},
+			Required: []string{cli.AttrType, "content"},
 		},
 		Annotations: &ToolAnnotations{},
 	},
 	{
-		Name:        "ctx_complete",
-		Description: "Mark a task as done by number or text match",
+		Name:        mcp.MCPToolComplete,
+		Description: assets.TextDesc(assets.TextDescKeyMCPToolCompleteDesc),
 		InputSchema: InputSchema{
-			Type: "object",
+			Type: mcp.MCPSchemaObject,
 			Properties: map[string]Property{
 				"query": {
-					Type:        "string",
-					Description: "Task number (e.g. '1') or search text to match",
+					Type:        mcp.MCPSchemaString,
+					Description: assets.TextDesc(assets.TextDescKeyMCPToolPropQuery),
 				},
 			},
 			Required: []string{"query"},
@@ -87,9 +91,9 @@ var toolDefs = []Tool{
 		Annotations: &ToolAnnotations{IdempotentHint: true},
 	},
 	{
-		Name:        "ctx_drift",
-		Description: "Detect stale or invalid context: dead paths, missing files, staleness",
-		InputSchema: InputSchema{Type: "object"},
+		Name:        mcp.MCPToolDrift,
+		Description: assets.TextDesc(assets.TextDescKeyMCPToolDriftDesc),
+		InputSchema: InputSchema{Type: mcp.MCPSchemaObject},
 		Annotations: &ToolAnnotations{ReadOnlyHint: true},
 	},
 }
@@ -103,21 +107,21 @@ func (s *Server) handleToolsList(req Request) *Response {
 func (s *Server) handleToolsCall(req Request) *Response {
 	var params CallToolParams
 	if err := json.Unmarshal(req.Params, &params); err != nil {
-		return s.error(req.ID, errCodeInvalidArg, "invalid params")
+		return s.error(req.ID, errCodeInvalidArg, assets.TextDesc(assets.TextDescKeyMCPInvalidParams))
 	}
 
 	switch params.Name {
-	case "ctx_status":
+	case mcp.MCPToolStatus:
 		return s.toolStatus(req.ID)
-	case "ctx_add":
+	case mcp.MCPToolAdd:
 		return s.toolAdd(req.ID, params.Arguments)
-	case "ctx_complete":
+	case mcp.MCPToolComplete:
 		return s.toolComplete(req.ID, params.Arguments)
-	case "ctx_drift":
+	case mcp.MCPToolDrift:
 		return s.toolDrift(req.ID)
 	default:
 		return s.error(req.ID, errCodeNotFound,
-			fmt.Sprintf("unknown tool: %s", params.Name))
+			fmt.Sprintf(assets.TextDesc(assets.TextDescKeyMCPUnknownTool), params.Name))
 	}
 }
 
@@ -125,20 +129,20 @@ func (s *Server) handleToolsCall(req Request) *Response {
 func (s *Server) toolStatus(id json.RawMessage) *Response {
 	ctx, err := context.Load(s.contextDir)
 	if err != nil {
-		return s.toolError(id, fmt.Sprintf("failed to load context: %v", err))
+		return s.toolError(id, fmt.Sprintf(assets.TextDesc(assets.TextDescKeyMCPLoadContext), err))
 	}
 
 	var sb strings.Builder
-	fmt.Fprintf(&sb, "Context: %s\n", ctx.Dir)
-	fmt.Fprintf(&sb, "Files: %d\n", len(ctx.Files))
-	fmt.Fprintf(&sb, "Tokens: ~%d\n\n", ctx.TotalTokens)
+	_, _ = fmt.Fprintf(&sb, assets.TextDesc(assets.TextDescKeyMCPStatusContextFormat), ctx.Dir)
+	_, _ = fmt.Fprintf(&sb, assets.TextDesc(assets.TextDescKeyMCPStatusFilesFormat), len(ctx.Files))
+	_, _ = fmt.Fprintf(&sb, assets.TextDesc(assets.TextDescKeyMCPStatusTokensFormat), ctx.TotalTokens)
 	for _, f := range ctx.Files {
-		status := "OK"
+		status := assets.TextDesc(assets.TextDescKeyMCPStatusOK)
 		if f.IsEmpty {
-			status = "EMPTY"
+			status = assets.TextDesc(assets.TextDescKeyMCPStatusEmpty)
 		}
-		fmt.Fprintf(&sb, " %-22s %6d tokens [%s]\n",
+		_, _ = fmt.Fprintf(&sb, assets.TextDesc(assets.TextDescKeyMCPStatusFileFormat),
 			f.Name, f.Tokens, status)
 	}
 
@@ -149,11 +153,11 @@ func (s *Server) toolStatus(id json.RawMessage) *Response {
 func (s *Server) toolAdd(
 	id json.RawMessage, args map[string]interface{},
 ) *Response {
-	entryType, _ := args["type"].(string)
+	entryType, _ := args[cli.AttrType].(string)
 	content, _ := args["content"].(string)
 	if entryType == "" || content == "" {
-		return s.toolError(id, "type and content are required")
+		return s.toolError(id, assets.TextDesc(assets.TextDescKeyMCPTypeContentRequired))
 	}
 
 	params := entry.Params{
@@ -188,11 +192,11 @@ func (s *Server) toolAdd(
 	}
 
 	if wErr := entry.Write(params); wErr != nil {
-		return s.toolError(id, fmt.Sprintf("write failed: %v", wErr))
+		return s.toolError(id, fmt.Sprintf(assets.TextDesc(assets.TextDescKeyMCPWriteFailed), wErr))
 	}
 
-	fileName := config.FileType[strings.ToLower(entryType)]
-	return s.toolOK(id, fmt.Sprintf("Added %s to %s", entryType, fileName))
+	fileName := entry2.ToCtxFile[strings.ToLower(entryType)]
+	return s.toolOK(id, fmt.Sprintf(assets.TextDesc(assets.TextDescKeyMCPAddedFormat), entryType, fileName))
 }
 
 // toolComplete marks a task as done by number or text match.
@@ -201,51 +205,51 @@ func (s *Server) toolComplete(
 ) *Response {
 	query, _ := args["query"].(string)
 	if query == "" {
-		return s.toolError(id, "query is required")
+		return s.toolError(id, assets.TextDesc(assets.TextDescKeyMCPQueryRequired))
 	}
 
-	completedTask, err := complete.CompleteTask(query, s.contextDir)
+	completedTask, err := taskcomplete.CompleteTask(query, s.contextDir)
 	if err != nil {
 		return s.toolError(id, err.Error())
 	}
 
-	return s.toolOK(id, fmt.Sprintf("Completed: %s", completedTask))
+	return s.toolOK(id, fmt.Sprintf(assets.TextDesc(assets.TextDescKeyMCPCompletedFormat), completedTask))
}
 
 // toolDrift runs drift detection and returns the report.
 func (s *Server) toolDrift(id json.RawMessage) *Response {
 	ctx, err := context.Load(s.contextDir)
 	if err != nil {
-		return s.toolError(id, fmt.Sprintf("failed to load context: %v", err))
+		return s.toolError(id, fmt.Sprintf(assets.TextDesc(assets.TextDescKeyMCPLoadContext), err))
 	}
 	report := drift.Detect(ctx)
 
 	var sb strings.Builder
-	fmt.Fprintf(&sb, "Status: %s\n\n", report.Status())
+	_, _ = fmt.Fprintf(&sb, assets.TextDesc(assets.TextDescKeyMCPDriftStatusFormat), report.Status())
 
 	if len(report.Violations) > 0 {
-		sb.WriteString("Violations:\n")
+		sb.WriteString(assets.TextDesc(assets.TextDescKeyMCPDriftViolations))
 		for _, v := range report.Violations {
-			fmt.Fprintf(&sb, " - [%s] %s: %s\n",
+			_, _ = fmt.Fprintf(&sb, assets.TextDesc(assets.TextDescKeyMCPDriftIssueFormat),
 				v.Type, v.File, v.Message)
 		}
-		sb.WriteString(config.NewlineLF)
+		sb.WriteString(token.NewlineLF)
 	}
 
 	if len(report.Warnings) > 0 {
-		sb.WriteString("Warnings:\n")
+		sb.WriteString(assets.TextDesc(assets.TextDescKeyMCPDriftWarnings))
 		for _, w := range report.Warnings {
-			fmt.Fprintf(&sb, " - [%s] %s: %s\n",
+			_, _ = fmt.Fprintf(&sb, assets.TextDesc(assets.TextDescKeyMCPDriftIssueFormat),
 				w.Type, w.File, w.Message)
 		}
-		sb.WriteString(config.NewlineLF)
+		sb.WriteString(token.NewlineLF)
 	}
 
 	if len(report.Passed) > 0 {
-		sb.WriteString("Passed:\n")
+		sb.WriteString(assets.TextDesc(assets.TextDescKeyMCPDriftPassed))
 		for _, p := range report.Passed {
-			fmt.Fprintf(&sb, " - %s\n", p)
+			_, _ = fmt.Fprintf(&sb, assets.TextDesc(assets.TextDescKeyMCPDriftPassedFormat), p)
 		}
 	}
 
@@ -255,14 +259,14 @@ func (s *Server) toolDrift(id json.RawMessage) *Response {
 // toolOK builds a successful tool result.
 func (s *Server) toolOK(id json.RawMessage, text string) *Response {
 	return s.ok(id, CallToolResult{
-		Content: []ToolContent{{Type: "text", Text: text}},
+		Content: []ToolContent{{Type: mcp.MCPContentTypeText, Text: text}},
 	})
 }
 
 // toolError builds a tool error result.
 func (s *Server) toolError(id json.RawMessage, msg string) *Response {
 	return s.ok(id, CallToolResult{
-		Content: []ToolContent{{Type: "text", Text: msg}},
+		Content: []ToolContent{{Type: mcp.MCPContentTypeText, Text: msg}},
 		IsError: true,
 	})
 }
diff --git a/internal/mcp/types.go b/internal/mcp/types.go
new file mode 100644
index 00000000..247934a1
--- /dev/null
+++ b/internal/mcp/types.go
@@ -0,0 +1,21 @@
+//   /   ctx: https://ctx.ist
+// ,'`./  do you remember?
+// `.,'\
+//      \ Copyright 2026-present Context contributors.
+//        SPDX-License-Identifier: Apache-2.0
+
+package mcp
+
+import "io"
+
+// Server is an MCP server that exposes ctx context over JSON-RPC 2.0.
+//
+// It reads JSON-RPC requests from stdin and writes responses to stdout,
+// following the Model Context Protocol specification.
+type Server struct {
+	contextDir  string
+	version     string
+	tokenBudget int
+	out         io.Writer
+	in          io.Reader
+}
diff --git a/internal/memory/classify.go b/internal/memory/classify.go
index 81e1bfdf..1d54cbaf 100644
--- a/internal/memory/classify.go
+++ b/internal/memory/classify.go
@@ -9,15 +9,9 @@ package memory
 import (
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/entry"
 )
 
-// Classification is the result of heuristic entry classification.
-type Classification struct {
-	Target   string   // config.Entry* constant or "skip"
-	Keywords []string // Keywords that triggered the match
-}
-
 // TargetSkip indicates an entry that doesn't match any classification rule.
 const TargetSkip = "skip"
 
@@ -30,19 +24,19 @@ type classRule struct {
 // rules are evaluated in priority order: conventions > decisions > learnings > tasks.
 var rules = []classRule{
 	{
-		target:   config.EntryConvention,
+		target:   entry.Convention,
 		keywords: []string{"always use", "prefer", "convention", "never use", "standard", "always "},
 	},
 	{
-		target:   config.EntryDecision,
+		target:   entry.Decision,
 		keywords: []string{"decided", "chose", "trade-off", "approach", "over", "instead of"},
 	},
 	{
-		target:   config.EntryLearning,
+		target:   entry.Learning,
 		keywords: []string{"gotcha", "learned", "watch out", "bug", "caveat", "careful", "turns out"},
 	},
 	{
-		target:   config.EntryTask,
+		target:   entry.Task,
 		keywords: []string{"todo", "need to", "follow up", "should", "task"},
 	},
 }
diff --git a/internal/memory/classify_test.go b/internal/memory/classify_test.go
index 2880bbcd..4b22825f 100644
--- a/internal/memory/classify_test.go
+++ b/internal/memory/classify_test.go
@@ -9,7 +9,7 @@ package memory
 import (
 	"testing"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/entry"
 )
 
 func TestClassify(t *testing.T) {
@@ -21,52 +21,52 @@ func TestClassify(t *testing.T) {
 		{
 			name:   "convention: always use",
 			text:   "always use bun for this project",
-			target: config.EntryConvention,
+			target: entry.Convention,
 		},
 		{
 			name:   "convention: prefer",
 			text:   "prefer filepath.Join over string concatenation",
-			target: config.EntryConvention,
+			target: entry.Convention,
 		},
 		{
 			name:   "convention: never use",
 			text:   "never use global state in handlers",
-			target: config.EntryConvention,
+			target: entry.Convention,
 		},
 		{
 			name:   "decision: decided",
 			text:   "decided to use SQLite over Postgres for local storage",
-			target: config.EntryDecision,
+			target: entry.Decision,
 		},
 		{
 			name:   "decision: chose",
 			text:   "chose marker-based merge for bidirectional sync",
-			target: config.EntryDecision,
+			target: entry.Decision,
 		},
 		{
 			name:   "learning: learned",
 			text:   "learned that golangci-lint v2 ignores inline nolint",
-			target: config.EntryLearning,
+			target: entry.Learning,
 		},
 		{
 			name:   "learning: gotcha",
 			text:   "gotcha: symlinks in project path produce different slugs",
-			target: config.EntryLearning,
+			target: entry.Learning,
 		},
 		{
 			name:   "learning: watch out",
 			text:   "watch out for race conditions in concurrent map access",
-			target: config.EntryLearning,
+			target: entry.Learning,
 		},
 		{
 			name:   "task: need to",
 			text:   "need to add tests for the import command",
-			target: config.EntryTask,
+			target: entry.Task,
 		},
 		{
 			name:   "task: todo",
 			text:   "todo: wire up the publish command",
-			target: config.EntryTask,
+			target: entry.Task,
 		},
 		{
 			name: "skip: session notes",
@@ -81,12 +81,12 @@ func TestClassify(t *testing.T) {
 		{
 			name:   "case insensitive",
 			text:   "ALWAYS USE ctx from PATH",
-			target: config.EntryConvention,
+			target: entry.Convention,
 		},
 		{
 			name:   "convention wins over decision (priority order)",
 			text:   "always use the approach we decided on",
-			target: config.EntryConvention,
+			target: entry.Convention,
 		},
 	}
diff --git a/internal/memory/discover.go b/internal/memory/discover.go
index 83134c7c..e97fb83a 100644
--- a/internal/memory/discover.go
+++ b/internal/memory/discover.go
@@ -7,10 +7,13 @@ package memory
 
 import (
-	"fmt"
 	"os"
 	"path/filepath"
 	"strings"
+
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+	"github.com/ActiveMemory/ctx/internal/config/memory"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 )
 
 // DiscoverMemoryPath locates Claude Code's auto memory file for the
@@ -23,19 +26,19 @@
 func DiscoverMemoryPath(projectRoot string) (string, error) {
 	abs, absErr := filepath.Abs(projectRoot)
 	if absErr != nil {
-		return "", fmt.Errorf("resolving project root: %w", absErr)
+		return "", ctxerr.DiscoverResolveRoot(absErr)
 	}
 
 	home, homeErr := os.UserHomeDir()
 	if homeErr != nil {
-		return "", fmt.Errorf("resolving home directory: %w", homeErr)
+		return "", ctxerr.DiscoverResolveHome(homeErr)
 	}
 
 	slug := ProjectSlug(abs)
-	memPath := filepath.Join(home, ".claude", "projects", slug, "memory", "MEMORY.md")
+	memPath := filepath.Join(home, dir.Claude, dir.Projects, slug, dir.Memory, memory.MemorySource)
 
 	if _, statErr := os.Stat(memPath); statErr != nil {
-		return "", fmt.Errorf("no auto memory found at %s", memPath)
+		return "", ctxerr.DiscoverNoMemory(memPath)
 	}
 	return memPath, nil
 }
diff --git a/internal/memory/integration_test.go b/internal/memory/integration_test.go
index 9708740a..b16b75ce 100644
--- a/internal/memory/integration_test.go
+++ b/internal/memory/integration_test.go
@@ -12,7 +12,7 @@ import (
 	"strings"
 	"testing"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/ctx"
 )
 
 const fixtureMemory = `# Auto Memory
@@ -70,22 +70,22 @@ func TestIntegration_ParseClassifyPromote(t *testing.T) {
 	}
 
 	// Verify entries landed in correct files
-	convData, _ := os.ReadFile(filepath.Join(contextDir, config.FileConvention))
+	convData, _ := os.ReadFile(filepath.Join(contextDir, ctx.Convention))
 	if !strings.Contains(string(convData), "ctx from PATH") {
 		t.Error("expected convention 'always use ctx from PATH' in CONVENTIONS.md")
 	}
 
-	decData, _ := os.ReadFile(filepath.Join(contextDir, config.FileDecision))
+	decData, _ := os.ReadFile(filepath.Join(contextDir, ctx.Decision))
 	if !strings.Contains(string(decData), "heuristic classification") {
 		t.Error("expected decision about classification in DECISIONS.md")
 	}
 
-	lrnData, _ := os.ReadFile(filepath.Join(contextDir, config.FileLearning))
+	lrnData, _ := os.ReadFile(filepath.Join(contextDir, ctx.Learning))
 	if !strings.Contains(string(lrnData), "symlinks") {
 		t.Error("expected learning about symlinks in LEARNINGS.md")
 	}
 
-	taskData, _ := os.ReadFile(filepath.Join(contextDir, config.FileTask))
+	taskData, _ := os.ReadFile(filepath.Join(contextDir, ctx.Task))
 	if !strings.Contains(string(taskData), "integration tests") {
 		t.Error("expected task about integration tests in TASKS.md")
 	}
diff --git a/internal/memory/mirror.go b/internal/memory/mirror.go
index a4a315a7..7a758db0 100644
--- a/internal/memory/mirror.go
+++ b/internal/memory/mirror.go
@@ -14,27 +14,25 @@ import (
 	"strings"
 	"time"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/fs"
+	"github.com/ActiveMemory/ctx/internal/config/memory"
+	time2 "github.com/ActiveMemory/ctx/internal/config/time"
+	"github.com/ActiveMemory/ctx/internal/config/token"
+	ctxerr "github.com/ActiveMemory/ctx/internal/err"
 )
 
-// SyncResult holds the outcome of a Sync operation.
-type SyncResult struct {
-	SourcePath  string
-	MirrorPath  string
-	ArchivedTo  string // empty if no prior mirror existed
-	SourceLines int
-	MirrorLines int // lines in the previous mirror (0 if first sync)
-}
-
 // Sync copies sourcePath to .context/memory/mirror.md, archiving the
 // previous mirror if one exists. Creates directories as needed.
 func Sync(contextDir, sourcePath string) (SyncResult, error) {
-	mirrorDir := filepath.Join(contextDir, config.DirMemory)
-	mirrorPath := filepath.Join(mirrorDir, config.FileMemoryMirror)
+	mirrorDir := filepath.Join(contextDir, dir.Memory)
+	mirrorPath := filepath.Join(mirrorDir, memory.MemoryMirror)
 
 	sourceData, readErr := os.ReadFile(sourcePath) //nolint:gosec // caller-provided path
 	if readErr != nil {
-		return SyncResult{}, fmt.Errorf("reading source: %w", readErr)
+		return SyncResult{}, ctxerr.MemoryReadSource(readErr)
 	}
 
 	result := SyncResult{
@@ -48,17 +46,17 @@ func Sync(contextDir, sourcePath string) (SyncResult, error) {
 		result.MirrorLines = countLines(existingData)
 		archivePath, archiveErr := Archive(contextDir)
 		if archiveErr != nil {
-			return SyncResult{}, fmt.Errorf("archiving previous mirror: %w", archiveErr)
+			return SyncResult{}, ctxerr.MemoryArchivePrevious(archiveErr)
 		}
 		result.ArchivedTo = archivePath
 	}
 
-	if mkErr := os.MkdirAll(mirrorDir, config.PermExec); mkErr != nil {
-		return SyncResult{}, fmt.Errorf("creating memory directory: %w", mkErr)
+	if mkErr := os.MkdirAll(mirrorDir, fs.PermExec); mkErr != nil {
+		return SyncResult{}, ctxerr.MemoryCreateDir(mkErr)
 	}
 
-	if writeErr := os.WriteFile(mirrorPath, sourceData, config.PermFile); writeErr != nil {
-		return SyncResult{}, fmt.Errorf("writing mirror: %w", writeErr)
+	if writeErr := os.WriteFile(mirrorPath, sourceData, fs.PermFile); writeErr != nil {
+		return SyncResult{}, ctxerr.MemoryWriteMirror(writeErr)
 	}
 
 	return result, nil
@@ -67,23 +65,23 @@ func Sync(contextDir, sourcePath string) (SyncResult, error) {
 // Archive copies the current mirror.md to archive/mirror-<timestamp>.md.
 // Returns the archive path. Returns an error if no mirror exists.
 func Archive(contextDir string) (string, error) {
-	mirrorPath := filepath.Join(contextDir, config.DirMemory, config.FileMemoryMirror)
-	archiveDir := filepath.Join(contextDir, config.DirMemoryArchive)
+	mirrorPath := filepath.Join(contextDir, dir.Memory, memory.MemoryMirror)
+	archiveDir := filepath.Join(contextDir, dir.MemoryArchive)
 
 	data, readErr := os.ReadFile(mirrorPath) //nolint:gosec // project-local path
 	if readErr != nil {
-		return "", fmt.Errorf("reading mirror for archive: %w", readErr)
+		return "", ctxerr.MemoryReadMirrorArchive(readErr)
 	}
 
-	if mkErr := os.MkdirAll(archiveDir, config.PermExec); mkErr != nil {
-		return "", fmt.Errorf("creating archive directory: %w", mkErr)
+	if mkErr := os.MkdirAll(archiveDir, fs.PermExec); mkErr != nil {
+		return "", ctxerr.MemoryCreateArchiveDir(mkErr)
 	}
 
-	ts := time.Now().Format(config.TimestampCompact)
-	archivePath := filepath.Join(archiveDir, "mirror-"+ts+config.ExtMarkdown)
+	ts := time.Now().Format(time2.TimestampCompact)
+	archivePath := filepath.Join(archiveDir, memory.PrefixMirror+ts+file.ExtMarkdown)
 
-	if writeErr := os.WriteFile(archivePath, data, config.PermFile); writeErr != nil {
-		return "", fmt.Errorf("writing archive: %w", writeErr)
+	if writeErr := os.WriteFile(archivePath, data, fs.PermFile); writeErr != nil {
+		return "", ctxerr.MemoryWriteArchive(writeErr)
 	}
 
 	return archivePath, nil
@@ -92,24 +90,24 @@ func Archive(contextDir string) (string, error) {
 // Diff returns a simple line-based diff between the mirror and the source.
 // Returns empty string when files are identical.
 func Diff(contextDir, sourcePath string) (string, error) {
-	mirrorPath := filepath.Join(contextDir, config.DirMemory, config.FileMemoryMirror)
+	mirrorPath := filepath.Join(contextDir, dir.Memory, memory.MemoryMirror)
 
 	mirrorData, mirrorErr := os.ReadFile(mirrorPath) //nolint:gosec // project-local path
 	if mirrorErr != nil {
-		return "", fmt.Errorf("reading mirror: %w", mirrorErr)
+		return "", ctxerr.MemoryReadMirror(mirrorErr)
 	}
 
 	sourceData, sourceErr := os.ReadFile(sourcePath) //nolint:gosec // caller-provided path
 	if sourceErr != nil {
-		return "", fmt.Errorf("reading source: %w", sourceErr)
+		return "", ctxerr.MemoryReadDiffSource(sourceErr)
 	}
 
 	if bytes.Equal(mirrorData, sourceData) {
 		return "", nil
 	}
 
-	mirrorLines := strings.Split(string(mirrorData), config.NewlineLF)
-	sourceLines := strings.Split(string(sourceData), config.NewlineLF)
+	mirrorLines := strings.Split(string(mirrorData), token.NewlineLF)
+	sourceLines := strings.Split(string(sourceData), token.NewlineLF)
 	return simpleDiff(mirrorPath, sourcePath, mirrorLines, sourceLines), nil
 }
 
@@ -117,7 +115,7 @@ func Diff(contextDir, sourcePath string) (string, error) {
 // HasDrift checks whether MEMORY.md has been modified since the last sync.
 // Returns false if either file is missing (no drift to report).
 func HasDrift(contextDir, sourcePath string) bool {
-	mirrorPath := filepath.Join(contextDir, config.DirMemory, config.FileMemoryMirror)
+	mirrorPath := filepath.Join(contextDir, dir.Memory, memory.MemoryMirror)
 
 	sourceInfo, sourceErr := os.Stat(sourcePath)
 	if sourceErr != nil {
@@ -134,14 +132,14 @@ func HasDrift(contextDir, sourcePath string) bool {
 
 // ArchiveCount returns the number of archived mirror snapshots.
 func ArchiveCount(contextDir string) int {
-	archiveDir := filepath.Join(contextDir, config.DirMemoryArchive)
+	archiveDir := filepath.Join(contextDir, dir.MemoryArchive)
 	entries, readErr := os.ReadDir(archiveDir)
 	if readErr != nil {
 		return 0
 	}
 	count := 0
 	for _, e := range entries {
-		if !e.IsDir() && strings.HasPrefix(e.Name(), "mirror-") {
+		if !e.IsDir() && strings.HasPrefix(e.Name(), memory.PrefixMirror) {
 			count++
 		}
 	}
@@ -152,14 +150,14 @@ func countLines(data []byte) int {
 	if len(data) == 0 {
 		return 0
 	}
-	return bytes.Count(data, []byte(config.NewlineLF))
+	return bytes.Count(data, []byte(token.NewlineLF))
 }
 
 // simpleDiff produces a minimal unified-style diff header with added/removed lines.
 func simpleDiff(oldPath, newPath string, oldLines, newLines []string) string {
 	var buf strings.Builder
-	buf.WriteString(fmt.Sprintf("--- %s (mirror)\n", oldPath))
-	buf.WriteString(fmt.Sprintf("+++ %s (source)\n", newPath))
+	_, _ = fmt.Fprintf(&buf, assets.TextDesc(assets.TextDescKeyMemoryDiffOldFormat), oldPath)
+	_, _ = fmt.Fprintf(&buf, assets.TextDesc(assets.TextDescKeyMemoryDiffNewFormat), newPath)
 
 	oldSet := make(map[string]bool, len(oldLines))
 	for _, l := range oldLines {
@@ -172,12 +170,12 @@ func simpleDiff(oldPath, newPath string, oldLines, newLines []string) string {
 
 	for _, l := range oldLines {
 		if !newSet[l] {
-			buf.WriteString("-" + l + config.NewlineLF)
+			buf.WriteString("-" + l + token.NewlineLF)
 		}
 	}
 	for _, l := range newLines {
 		if !oldSet[l] {
-			buf.WriteString("+" + l + config.NewlineLF)
+			buf.WriteString("+" + l + token.NewlineLF)
 		}
 	}
diff --git a/internal/memory/mirror_test.go b/internal/memory/mirror_test.go
index c20545ab..b3a1c2a9 100644
--- a/internal/memory/mirror_test.go
+++ b/internal/memory/mirror_test.go
@@ -12,7 +12,8 @@ import (
 	"strings"
 	"testing"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+	"github.com/ActiveMemory/ctx/internal/config/memory"
 )
 
 func TestSync_FirstRun(t *testing.T) {
@@ -37,7 +38,7 @@ func TestSync_FirstRun(t *testing.T) {
 		t.Errorf("SourceLines = %d, want 4", result.SourceLines)
 	}
 
-	mirrorPath := filepath.Join(contextDir, config.DirMemory, config.FileMemoryMirror)
+	mirrorPath := filepath.Join(contextDir, dir.Memory, memory.MemoryMirror)
 	mirrorData, readErr := os.ReadFile(mirrorPath)
 	if readErr != nil {
 		t.Fatalf("reading mirror: %v", readErr)
@@ -53,11 +54,11 @@ func TestSync_WithArchive(t *testing.T) {
 	sourcePath := filepath.Join(sourceDir, "MEMORY.md")
 
 	// Create initial mirror
-	mirrorDir := filepath.Join(contextDir, config.DirMemory)
+	mirrorDir := filepath.Join(contextDir, dir.Memory)
 	if mkErr := os.MkdirAll(mirrorDir, 0o755); mkErr != nil {
 		t.Fatal(mkErr)
 	}
-	mirrorPath := filepath.Join(mirrorDir, config.FileMemoryMirror)
+	mirrorPath := filepath.Join(mirrorDir, memory.MemoryMirror)
 	oldContent := "# Memory v1\n"
 	if writeErr := os.WriteFile(mirrorPath, []byte(oldContent), 0o644); writeErr != nil {
 		t.Fatal(writeErr)
@@ -109,11 +110,11 @@ func TestDiff_Identical(t *testing.T) {
 
 	content := "# Memory\nsame content\n"
 
-	mirrorDir := filepath.Join(contextDir, config.DirMemory)
+	mirrorDir := filepath.Join(contextDir, dir.Memory)
 	if mkErr := os.MkdirAll(mirrorDir, 0o755); mkErr != nil {
 		t.Fatal(mkErr)
 	}
-	mirrorPath := filepath.Join(mirrorDir, config.FileMemoryMirror)
+	mirrorPath := filepath.Join(mirrorDir, memory.MemoryMirror)
 	if writeErr := os.WriteFile(mirrorPath, []byte(content), 0o644); writeErr != nil {
 		t.Fatal(writeErr)
 	}
@@ -136,11 +137,11 @@ func TestDiff_WithChanges(t *testing.T) {
 	contextDir := t.TempDir()
 	sourceDir := t.TempDir()
 
-	mirrorDir := filepath.Join(contextDir, config.DirMemory)
+	mirrorDir := filepath.Join(contextDir, dir.Memory)
 	if mkErr := os.MkdirAll(mirrorDir, 0o755); mkErr != nil {
 		t.Fatal(mkErr)
 	}
-	mirrorPath := filepath.Join(mirrorDir, config.FileMemoryMirror)
+	mirrorPath := filepath.Join(mirrorDir, memory.MemoryMirror)
 	if writeErr := os.WriteFile(mirrorPath, []byte("# Memory\nold line\n"), 0o644); writeErr != nil {
 		t.Fatal(writeErr)
 	}
@@ -182,7 +183,7 @@ func TestSync_EmptySource(t *testing.T) {
 
 func TestArchiveCount(t *testing.T) {
 	contextDir := t.TempDir()
-	archiveDir := filepath.Join(contextDir, config.DirMemoryArchive)
+	archiveDir := filepath.Join(contextDir, dir.MemoryArchive)
 	if mkErr := os.MkdirAll(archiveDir, 0o755); mkErr != nil {
 		t.Fatal(mkErr)
 	}
diff --git a/internal/memory/parse.go b/internal/memory/parse.go
index cd17f840..d5a24580 100644
--- a/internal/memory/parse.go
+++ b/internal/memory/parse.go
@@ -9,28 +9,9 @@ package memory
 import (
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )
 
-// EntryKind identifies how an entry was delimited in MEMORY.md.
-type EntryKind int
-
-const (
-	// EntryHeader is a Markdown heading (## or ###).
-	EntryHeader EntryKind = iota
-	// EntryParagraph is a blank-line-separated paragraph.
-	EntryParagraph
-	// EntryList is one or more consecutive list items.
-	EntryList
-)
-
-// Entry is a discrete block parsed from MEMORY.md.
-type Entry struct {
-	Text      string    // Raw text of the entry (trimmed)
-	StartLine int       // 1-based line number where the entry begins
-	Kind      EntryKind // How the entry was delimited
-}
-
 // ParseEntries splits MEMORY.md content into discrete entries.
 //
 // Entry boundaries:
@@ -44,7 +25,7 @@ func ParseEntries(content string) []Entry {
 		return nil
 	}
 
-	lines := strings.Split(content, config.NewlineLF)
+	lines := strings.Split(content, token.NewlineLF)
 	var entries []Entry
 	var current []string
 	var currentKind EntryKind
@@ -52,7 +33,7 @@ func ParseEntries(content string) []Entry {
 	inEntry := false
 
 	flush := func() {
-		text := strings.TrimSpace(strings.Join(current, config.NewlineLF))
+		text := strings.TrimSpace(strings.Join(current, token.NewlineLF))
 		if text != "" {
 			entries = append(entries, Entry{
 				Text: text,
@@ -69,7 +50,7 @@ func ParseEntries(content string) []Entry {
 		trimmed := strings.TrimSpace(line)
 
 		// Skip top-level heading
-		if strings.HasPrefix(trimmed, "# ") && !strings.HasPrefix(trimmed, "## ") {
+		if strings.HasPrefix(trimmed, token.HeadingLevelOneStart) && !strings.HasPrefix(trimmed, token.HeadingLevelTwoStart) {
 			if inEntry {
 				flush()
 			}
@@ -77,7 +58,7 @@ func ParseEntries(content string) []Entry {
 		}
 
 		// Section header (## or ###) starts a new entry
-		if strings.HasPrefix(trimmed, "## ") || strings.HasPrefix(trimmed, "### ") {
+		if strings.HasPrefix(trimmed, token.HeadingLevelTwoStart) || strings.HasPrefix(trimmed, token.HeadingLevelThreeStart) {
 			if inEntry {
 				flush()
 			}
@@ -97,7 +78,7 @@ func ParseEntries(content string) []Entry {
 		}
 
 		// List item — each top-level item is a separate entry for classification
-		if strings.HasPrefix(trimmed, "- ") || strings.HasPrefix(trimmed, "* ") {
+		if strings.HasPrefix(trimmed, token.PrefixListDash) || strings.HasPrefix(trimmed, token.PrefixListStar) {
 			if inEntry {
 				flush()
 			}
diff --git a/internal/memory/promote.go b/internal/memory/promote.go
index 29178adb..644ccb2d 100644
--- a/internal/memory/promote.go
+++ b/internal/memory/promote.go
@@ -9,17 +9,17 @@ package memory
 import (
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/entry"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 	ctxentry "github.com/ActiveMemory/ctx/internal/entry"
 )
 
-const importSource = "auto-memory import"
-
 // Promote writes a classified entry to the appropriate .context/ file.
 // Uses the add package's WriteEntry for consistent formatting and indexing.
-func Promote(entry Entry, classification Classification) error {
+func Promote(e Entry, classification Classification) error {
 	// Extract a title from the entry text (first line, trimmed of Markdown markers)
-	title := extractTitle(entry.Text)
+	title := extractTitle(e.Text)
 
 	params := ctxentry.Params{
 		Type: classification.Target,
@@ -27,20 +27,20 @@ func Promote(e Entry, classification Classification) error {
 	}
 
 	switch classification.Target {
-	case config.EntryDecision:
-		params.Context = importSource
-		params.Rationale = extractBody(entry.Text)
-		params.Consequences = "Imported from MEMORY.md — review and update as needed"
+	case entry.Decision:
+		params.Context = assets.TextDesc(assets.TextDescKeyMemoryImportSource)
+		params.Rationale = extractBody(e.Text)
+		params.Consequences = assets.TextDesc(assets.TextDescKeyMemoryImportReview)
 
-	case config.EntryLearning:
-		params.Context = importSource
-		params.Lesson = extractBody(entry.Text)
-		params.Application = "Imported from MEMORY.md — review and update as needed"
+	case entry.Learning:
+		params.Context = assets.TextDesc(assets.TextDescKeyMemoryImportSource)
+		params.Lesson = extractBody(e.Text)
+		params.Application = assets.TextDesc(assets.TextDescKeyMemoryImportReview)
 
-	case config.EntryTask:
+	case entry.Task:
 		// Tasks just need content — FormatTask handles the rest
 
-	case config.EntryConvention:
+	case entry.Convention:
 		// Conventions just need content — FormatConvention handles the rest
 	}
 
@@ -50,16 +50,16 @@ func Promote(entry Entry, classification Classification) error {
 // extractTitle returns the first meaningful line of an entry, cleaned of
 // Markdown heading markers and list item prefixes.
 func extractTitle(text string) string {
-	line := strings.SplitN(text, config.NewlineLF, 2)[0]
+	line := strings.SplitN(text, token.NewlineLF, 2)[0]
 	line = strings.TrimSpace(line)
 	// Strip heading markers
-	line = strings.TrimLeft(line, "#")
+	line = strings.TrimLeft(line, token.PrefixHeading)
 	line = strings.TrimSpace(line)
 	// Strip list item markers
-	if strings.HasPrefix(line, "- ") {
-		line = line[2:]
-	} else if strings.HasPrefix(line, "* ") {
-		line = line[2:]
+	if strings.HasPrefix(line, token.PrefixListDash) {
+		line = line[len(token.PrefixListDash):]
+	} else if strings.HasPrefix(line, token.PrefixListStar) {
+		line = line[len(token.PrefixListStar):]
 	}
 	return strings.TrimSpace(line)
 }
@@ -67,7 +67,7 @@ func extractTitle(text string) string {
 // extractBody returns everything after the first line, or the first line
 // itself if there's only one line.
 func extractBody(text string) string {
-	parts := strings.SplitN(text, config.NewlineLF, 2)
+	parts := strings.SplitN(text, token.NewlineLF, 2)
 	if len(parts) < 2 || strings.TrimSpace(parts[1]) == "" {
 		return extractTitle(text)
 	}
diff --git a/internal/memory/promote_test.go b/internal/memory/promote_test.go
index b765a2b6..14bbc275 100644
--- a/internal/memory/promote_test.go
+++ b/internal/memory/promote_test.go
@@ -12,7 +12,9 @@ import (
 	"strings"
 	"testing"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/ctx"
+	"github.com/ActiveMemory/ctx/internal/config/dir"
+	"github.com/ActiveMemory/ctx/internal/config/entry"
 	"github.com/ActiveMemory/ctx/internal/rc"
 )
 
@@ -24,15 +26,15 @@ func setupContextDir(t *testing.T) (string, func()) {
 	_ = os.Chdir(workDir)
 	rc.Reset()
 
-	contextDir := filepath.Join(workDir, config.DirContext)
+	contextDir := filepath.Join(workDir, dir.Context)
 	if mkErr := os.MkdirAll(contextDir, 0o755); mkErr != nil {
 		t.Fatal(mkErr)
 	}
 
 	// Create required context files
 	for _, f := range []string{
-		config.FileConstitution, config.FileTask, config.FileDecision,
-		config.FileLearning, config.FileConvention,
+		ctx.Constitution, ctx.Task, ctx.Decision,
+		ctx.Learning, ctx.Convention,
 	} {
 		content := "# " + strings.TrimSuffix(f, ".md") + "\n\n"
 		if writeErr := os.WriteFile(filepath.Join(contextDir, f), []byte(content), 0o644); writeErr != nil {
@@ -47,14 +49,14 @@ func TestPromote_Convention(t *testing.T) {
 	contextDir, cleanup := setupContextDir(t)
 	defer cleanup()
 
-	entry := Entry{Text: "always use bun for this project", Kind: EntryList}
-	classification := Classification{Target: config.EntryConvention, Keywords: []string{"always use"}}
+	e := Entry{Text: "always use bun for this project", Kind: EntryList}
+	classification := Classification{Target: entry.Convention, Keywords: []string{"always use"}}
 
-	if promoteErr := Promote(entry, classification); promoteErr != nil {
+	if promoteErr := Promote(e, classification); promoteErr != nil {
 		t.Fatalf("Promote: %v", promoteErr)
 	}
 
-	data, readErr := os.ReadFile(filepath.Join(contextDir, config.FileConvention))
+	data, readErr := os.ReadFile(filepath.Join(contextDir, ctx.Convention))
 	if readErr != nil {
 		t.Fatal(readErr)
 	}
@@ -67,14 +69,14 @@ func TestPromote_Learning(t *testing.T) {
 	contextDir, cleanup := setupContextDir(t)
 	defer cleanup()
 
-	entry := Entry{Text: "learned that nolint is ignored in v2", Kind: EntryParagraph}
-	classification := Classification{Target: config.EntryLearning, Keywords: []string{"learned"}}
+	e := Entry{Text: "learned that nolint is ignored in v2", Kind: EntryParagraph}
+	classification := Classification{Target: entry.Learning, Keywords: []string{"learned"}}
 
-	if promoteErr := Promote(entry, classification); promoteErr != nil {
+	if promoteErr := Promote(e, classification); promoteErr != nil {
 		t.Fatalf("Promote: %v", promoteErr)
 	}
 
-	data, readErr := os.ReadFile(filepath.Join(contextDir, config.FileLearning))
+	data, readErr := os.ReadFile(filepath.Join(contextDir, ctx.Learning))
 	if readErr != nil {
 		t.Fatal(readErr)
 	}
@@ -87,14 +89,14 @@
func TestPromote_Decision(t *testing.T) { contextDir, cleanup := setupContextDir(t) defer cleanup() - entry := Entry{Text: "decided to use SQLite over Postgres", Kind: EntryParagraph} - classification := Classification{Target: config.EntryDecision, Keywords: []string{"decided"}} + e := Entry{Text: "decided to use SQLite over Postgres", Kind: EntryParagraph} + classification := Classification{Target: entry.Decision, Keywords: []string{"decided"}} - if promoteErr := Promote(entry, classification); promoteErr != nil { + if promoteErr := Promote(e, classification); promoteErr != nil { t.Fatalf("Promote: %v", promoteErr) } - data, readErr := os.ReadFile(filepath.Join(contextDir, config.FileDecision)) + data, readErr := os.ReadFile(filepath.Join(contextDir, ctx.Decision)) if readErr != nil { t.Fatal(readErr) } @@ -107,14 +109,14 @@ func TestPromote_Task(t *testing.T) { contextDir, cleanup := setupContextDir(t) defer cleanup() - entry := Entry{Text: "need to add tests for import", Kind: EntryList} - classification := Classification{Target: config.EntryTask, Keywords: []string{"need to"}} + e := Entry{Text: "need to add tests for import", Kind: EntryList} + classification := Classification{Target: entry.Task, Keywords: []string{"need to"}} - if promoteErr := Promote(entry, classification); promoteErr != nil { + if promoteErr := Promote(e, classification); promoteErr != nil { t.Fatalf("Promote: %v", promoteErr) } - data, readErr := os.ReadFile(filepath.Join(contextDir, config.FileTask)) + data, readErr := os.ReadFile(filepath.Join(contextDir, ctx.Task)) if readErr != nil { t.Fatal(readErr) } diff --git a/internal/memory/publish.go b/internal/memory/publish.go index 96c1d55e..c565ad23 100644 --- a/internal/memory/publish.go +++ b/internal/memory/publish.go @@ -7,41 +7,22 @@ package memory import ( - "fmt" "os" "path/filepath" "strings" "time" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/assets" + 
"github.com/ActiveMemory/ctx/internal/config/ctx" + "github.com/ActiveMemory/ctx/internal/config/fs" + "github.com/ActiveMemory/ctx/internal/config/marker" + "github.com/ActiveMemory/ctx/internal/config/memory" + time2 "github.com/ActiveMemory/ctx/internal/config/time" + "github.com/ActiveMemory/ctx/internal/config/token" + ctxerr "github.com/ActiveMemory/ctx/internal/err" "github.com/ActiveMemory/ctx/internal/index" ) -const ( - // MarkerStart is the HTML comment that begins the ctx-published block. - MarkerStart = "" - // MarkerEnd is the HTML comment that ends the ctx-published block. - MarkerEnd = "" - - // DefaultPublishBudget is the default line budget for published content. - DefaultPublishBudget = 80 - - maxTasks = 10 - maxDecisions = 5 - maxConventions = 10 - maxLearnings = 5 - recentDays = 7 -) - -// PublishResult holds what was selected for publishing. -type PublishResult struct { - Tasks []string - Decisions []string - Conventions []string - Learnings []string - TotalLines int -} - // SelectContent reads .context/ files and selects content within the line budget. // // Priority order: tasks > decisions > conventions > learnings. 
@@ -50,27 +31,27 @@ func SelectContent(contextDir string, budget int) (PublishResult, error) { var result PublishResult // Pending tasks - taskPath := filepath.Join(contextDir, config.FileTask) + taskPath := filepath.Join(contextDir, ctx.Task) if data, readErr := os.ReadFile(taskPath); readErr == nil { //nolint:gosec // project-local path - result.Tasks = extractPendingTasks(string(data), maxTasks) + result.Tasks = extractPendingTasks(string(data), memory.PublishMaxTasks) } // Recent decisions - decPath := filepath.Join(contextDir, config.FileDecision) + decPath := filepath.Join(contextDir, ctx.Decision) if data, readErr := os.ReadFile(decPath); readErr == nil { //nolint:gosec // project-local path - result.Decisions = extractRecentEntries(string(data), maxDecisions) + result.Decisions = extractRecentEntries(string(data), memory.PublishMaxDecisions) } // Key conventions (first N lines that are list items) - convPath := filepath.Join(contextDir, config.FileConvention) + convPath := filepath.Join(contextDir, ctx.Convention) if data, readErr := os.ReadFile(convPath); readErr == nil { //nolint:gosec // project-local path - result.Conventions = extractConventionItems(string(data), maxConventions) + result.Conventions = extractConventionItems(string(data), memory.PublishMaxConventions) } // Recent learnings - lrnPath := filepath.Join(contextDir, config.FileLearning) + lrnPath := filepath.Join(contextDir, ctx.Learning) if data, readErr := os.ReadFile(lrnPath); readErr == nil { //nolint:gosec // project-local path - result.Learnings = extractRecentEntries(string(data), maxLearnings) + result.Learnings = extractRecentEntries(string(data), memory.PublishMaxLearnings) } // Trim to budget (tasks always fit, trim from bottom) @@ -83,41 +64,41 @@ func SelectContent(contextDir string, budget int) (PublishResult, error) { // Format renders the publish result as a Markdown block (without markers). 
func (r PublishResult) Format() string { var buf strings.Builder - buf.WriteString("# Project Context (managed by ctx)\n\n") + buf.WriteString(assets.TextDesc(assets.TextDescKeyMemoryPublishTitle)) if len(r.Tasks) > 0 { - buf.WriteString("## Pending Tasks" + config.NewlineLF) + buf.WriteString(assets.TextDesc(assets.TextDescKeyMemoryPublishTasks) + token.NewlineLF) for _, t := range r.Tasks { - buf.WriteString(t + config.NewlineLF) + buf.WriteString(t + token.NewlineLF) } - buf.WriteString(config.NewlineLF) + buf.WriteString(token.NewlineLF) } if len(r.Decisions) > 0 { - buf.WriteString("## Recent Decisions" + config.NewlineLF) + buf.WriteString(assets.TextDesc(assets.TextDescKeyMemoryPublishDec) + token.NewlineLF) for _, d := range r.Decisions { - buf.WriteString("- " + d + config.NewlineLF) + buf.WriteString(token.PrefixListDash + d + token.NewlineLF) } - buf.WriteString(config.NewlineLF) + buf.WriteString(token.NewlineLF) } if len(r.Conventions) > 0 { - buf.WriteString("## Key Conventions" + config.NewlineLF) + buf.WriteString(assets.TextDesc(assets.TextDescKeyMemoryPublishConv) + token.NewlineLF) for _, c := range r.Conventions { - buf.WriteString(c + config.NewlineLF) + buf.WriteString(c + token.NewlineLF) } - buf.WriteString(config.NewlineLF) + buf.WriteString(token.NewlineLF) } if len(r.Learnings) > 0 { - buf.WriteString("## Recent Learnings" + config.NewlineLF) + buf.WriteString(assets.TextDesc(assets.TextDescKeyMemoryPublishLrn) + token.NewlineLF) for _, l := range r.Learnings { - buf.WriteString("- " + l + config.NewlineLF) + buf.WriteString(token.PrefixListDash + l + token.NewlineLF) } - buf.WriteString(config.NewlineLF) + buf.WriteString(token.NewlineLF) } - return strings.TrimRight(buf.String(), config.NewlineLF) + config.NewlineLF + return strings.TrimRight(buf.String(), token.NewlineLF) + token.NewlineLF } // MergePublished inserts or replaces the marker block in existing MEMORY.md content. 
@@ -125,24 +106,24 @@ func (r PublishResult) Format() string { // If markers exist, replaces everything between them. If markers are missing, // appends the block at the end (recovery). Returns (merged content, markers were missing). func MergePublished(existing, published string) (string, bool) { - block := MarkerStart + config.NewlineLF + published + MarkerEnd + config.NewlineLF + block := marker.PublishMarkerStart + token.NewlineLF + published + marker.PublishMarkerEnd + token.NewlineLF - startIdx := strings.Index(existing, MarkerStart) - endIdx := strings.Index(existing, MarkerEnd) + startIdx := strings.Index(existing, marker.PublishMarkerStart) + endIdx := strings.Index(existing, marker.PublishMarkerEnd) if startIdx >= 0 && endIdx > startIdx { // Replace existing block before := existing[:startIdx] - after := existing[endIdx+len(MarkerEnd):] + after := existing[endIdx+len(marker.PublishMarkerEnd):] // Trim trailing newline from after to avoid double blank lines - after = strings.TrimPrefix(after, config.NewlineLF) + after = strings.TrimPrefix(after, token.NewlineLF) return before + block + after, false } // Markers missing — append - sep := config.NewlineLF - if !strings.HasSuffix(existing, config.NewlineLF) { - sep = config.NewlineLF + config.NewlineLF + sep := token.NewlineLF + if !strings.HasSuffix(existing, token.NewlineLF) { + sep = token.NewlineLF + token.NewlineLF } return existing + sep + block, startIdx < 0 } @@ -150,22 +131,22 @@ func MergePublished(existing, published string) (string, bool) { // RemovePublished strips the marker block from MEMORY.md content. // Returns (cleaned content, true if markers were found and removed). 
func RemovePublished(content string) (string, bool) { - startIdx := strings.Index(content, MarkerStart) - endIdx := strings.Index(content, MarkerEnd) + startIdx := strings.Index(content, marker.PublishMarkerStart) + endIdx := strings.Index(content, marker.PublishMarkerEnd) if startIdx < 0 || endIdx <= startIdx { return content, false } before := content[:startIdx] - after := content[endIdx+len(MarkerEnd):] - after = strings.TrimPrefix(after, config.NewlineLF) + after := content[endIdx+len(marker.PublishMarkerEnd):] + after = strings.TrimPrefix(after, token.NewlineLF) - result := strings.TrimRight(before, config.NewlineLF) + result := strings.TrimRight(before, token.NewlineLF) if after != "" { - result += config.NewlineLF + after + result += token.NewlineLF + after } else { - result += config.NewlineLF + result += token.NewlineLF } return result, true @@ -203,9 +184,9 @@ func (r *PublishResult) trimToBudget(budget int) { // extractPendingTasks finds unchecked task items from TASKS.md. func extractPendingTasks(content string, max int) []string { var tasks []string - for _, line := range strings.Split(content, config.NewlineLF) { + for _, line := range strings.Split(content, token.NewlineLF) { trimmed := strings.TrimSpace(line) - if strings.HasPrefix(trimmed, "- [ ] ") { + if strings.HasPrefix(trimmed, marker.PrefixTaskUndone+token.Space) { tasks = append(tasks, trimmed) if len(tasks) >= max { break @@ -218,7 +199,7 @@ func extractPendingTasks(content string, max int) []string { // extractRecentEntries returns titles of entries from the last N days. 
func extractRecentEntries(content string, max int) []string { blocks := index.ParseEntryBlocks(content) - cutoff := time.Now().AddDate(0, 0, -recentDays).Format("2006-01-02") + cutoff := time.Now().AddDate(0, 0, -memory.PublishRecentDays).Format(time2.DateFormat) var titles []string for _, b := range blocks { @@ -235,9 +216,9 @@ func extractRecentEntries(content string, max int) []string { // extractConventionItems returns the first N list items from CONVENTIONS.md. func extractConventionItems(content string, max int) []string { var items []string - for _, line := range strings.Split(content, config.NewlineLF) { + for _, line := range strings.Split(content, token.NewlineLF) { trimmed := strings.TrimSpace(line) - if strings.HasPrefix(trimmed, "- ") || strings.HasPrefix(trimmed, "* ") { + if strings.HasPrefix(trimmed, token.PrefixListDash) || strings.HasPrefix(trimmed, token.PrefixListStar) { items = append(items, trimmed) if len(items) >= max { break @@ -251,7 +232,7 @@ func extractConventionItems(content string, max int) []string { func Publish(contextDir, memoryPath string, budget int) (PublishResult, error) { result, selectErr := SelectContent(contextDir, budget) if selectErr != nil { - return PublishResult{}, fmt.Errorf("selecting content: %w", selectErr) + return PublishResult{}, ctxerr.MemorySelectContent(selectErr) } formatted := result.Format() @@ -264,8 +245,8 @@ func Publish(contextDir, memoryPath string, budget int) (PublishResult, error) { merged, _ := MergePublished(string(existing), formatted) - if writeErr := os.WriteFile(memoryPath, []byte(merged), config.PermFile); writeErr != nil { - return PublishResult{}, fmt.Errorf("writing MEMORY.md: %w", writeErr) + if writeErr := os.WriteFile(memoryPath, []byte(merged), fs.PermFile); writeErr != nil { + return PublishResult{}, ctxerr.MemoryWriteMemory(writeErr) } return result, nil diff --git a/internal/memory/publish_test.go b/internal/memory/publish_test.go index 84646c45..3f5e4e65 100644 --- 
a/internal/memory/publish_test.go +++ b/internal/memory/publish_test.go @@ -14,7 +14,11 @@ import ( "testing" "time" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/ctx" + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/config/marker" + cfgmem "github.com/ActiveMemory/ctx/internal/config/memory" + time2 "github.com/ActiveMemory/ctx/internal/config/time" "github.com/ActiveMemory/ctx/internal/rc" ) @@ -22,10 +26,10 @@ func TestMergePublished_EmptyFile(t *testing.T) { published := "# Project Context (managed by ctx)\n\n## Pending Tasks\n- [ ] task one\n" merged, missing := MergePublished("", published) - if !strings.Contains(merged, MarkerStart) { + if !strings.Contains(merged, marker.PublishMarkerStart) { t.Error("expected marker start in output") } - if !strings.Contains(merged, MarkerEnd) { + if !strings.Contains(merged, marker.PublishMarkerEnd) { t.Error("expected marker end in output") } if !strings.Contains(merged, "task one") { @@ -38,7 +42,7 @@ func TestMergePublished_EmptyFile(t *testing.T) { func TestMergePublished_ReplaceExisting(t *testing.T) { existing := "# Auto Memory\n\nClaude notes here.\n\n" + - MarkerStart + "\nold content\n" + MarkerEnd + "\n\nMore Claude notes.\n" + marker.PublishMarkerStart + "\nold content\n" + marker.PublishMarkerEnd + "\n\nMore Claude notes.\n" published := "# Project Context (managed by ctx)\n\nnew content\n" merged, missing := MergePublished(existing, published) @@ -79,7 +83,7 @@ func TestMergePublished_MarkersStripped(t *testing.T) { func TestRemovePublished(t *testing.T) { content := "# Auto Memory\n\nNotes.\n\n" + - MarkerStart + "\npublished stuff\n" + MarkerEnd + "\n\nMore notes.\n" + marker.PublishMarkerStart + "\npublished stuff\n" + marker.PublishMarkerEnd + "\n\nMore notes.\n" cleaned, found := RemovePublished(content) @@ -135,36 +139,36 @@ func TestSelectContent(t *testing.T) { rc.Reset() defer func() { _ = os.Chdir(origDir) 
}() - contextDir := filepath.Join(workDir, config.DirContext) + contextDir := filepath.Join(workDir, dir.Context) if mkErr := os.MkdirAll(contextDir, 0o755); mkErr != nil { t.Fatal(mkErr) } // Create TASKS.md with pending items tasks := "# Tasks\n\n- [x] done task\n- [ ] pending task one\n- [ ] pending task two\n" - if writeErr := os.WriteFile(filepath.Join(contextDir, config.FileTask), []byte(tasks), 0o644); writeErr != nil { + if writeErr := os.WriteFile(filepath.Join(contextDir, ctx.Task), []byte(tasks), 0o644); writeErr != nil { t.Fatal(writeErr) } // Create DECISIONS.md with a recent entry - ts := time.Now().Format(config.TimestampCompact) + ts := time.Now().Format(time2.TimestampCompact) decisions := fmt.Sprintf("# Decisions\n\n## [%s] Use SQLite\n\nContext: testing\n", ts) - if writeErr := os.WriteFile(filepath.Join(contextDir, config.FileDecision), []byte(decisions), 0o644); writeErr != nil { + if writeErr := os.WriteFile(filepath.Join(contextDir, ctx.Decision), []byte(decisions), 0o644); writeErr != nil { t.Fatal(writeErr) } // Create CONVENTIONS.md conventions := "# Conventions\n\n- Always use ctx from PATH\n- Prefer filepath.Join\n" - if writeErr := os.WriteFile(filepath.Join(contextDir, config.FileConvention), []byte(conventions), 0o644); writeErr != nil { + if writeErr := os.WriteFile(filepath.Join(contextDir, ctx.Convention), []byte(conventions), 0o644); writeErr != nil { t.Fatal(writeErr) } // Create empty LEARNINGS.md - if writeErr := os.WriteFile(filepath.Join(contextDir, config.FileLearning), []byte("# Learnings\n"), 0o644); writeErr != nil { + if writeErr := os.WriteFile(filepath.Join(contextDir, ctx.Learning), []byte("# Learnings\n"), 0o644); writeErr != nil { t.Fatal(writeErr) } - result, selectErr := SelectContent(contextDir, DefaultPublishBudget) + result, selectErr := SelectContent(contextDir, cfgmem.DefaultPublishBudget) if selectErr != nil { t.Fatalf("SelectContent: %v", selectErr) } diff --git a/internal/memory/state.go 
b/internal/memory/state.go index ead3f2ee..b6259e2b 100644 --- a/internal/memory/state.go +++ b/internal/memory/state.go @@ -15,18 +15,13 @@ import ( "path/filepath" "time" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/config/fs" + "github.com/ActiveMemory/ctx/internal/config/memory" + time2 "github.com/ActiveMemory/ctx/internal/config/time" + "github.com/ActiveMemory/ctx/internal/config/token" ) -// State tracks memory bridge sync timestamps and (in future phases) -// import/publish progress. -type State struct { - LastSync *time.Time `json:"last_sync"` - LastImport *time.Time `json:"last_import"` - LastPublish *time.Time `json:"last_publish"` - ImportedHashes []string `json:"imported_hashes"` -} - // LoadState reads the sync state from .context/state/memory-import.json. // Returns a zero-value State if the file does not exist. func LoadState(contextDir string) (State, error) { @@ -53,7 +48,7 @@ func LoadState(contextDir string) (State, error) { func SaveState(contextDir string, s State) error { path := statePath(contextDir) dir := filepath.Dir(path) - if mkErr := os.MkdirAll(dir, config.PermExec); mkErr != nil { + if mkErr := os.MkdirAll(dir, fs.PermExec); mkErr != nil { return mkErr } @@ -61,8 +56,8 @@ func SaveState(contextDir string, s State) error { if marshalErr != nil { return marshalErr } - data = append(data, '\n') - return os.WriteFile(path, data, config.PermFile) + data = append(data, token.NewlineLF[0]) + return os.WriteFile(path, data, fs.PermFile) } // MarkSynced updates the state with the current timestamp. @@ -81,7 +76,7 @@ func EntryHash(text string) string { // Imported reports whether an entry hash has already been imported. // Stored entries use format "hash:target:date"; matches on hash prefix. 
func (s *State) Imported(hash string) bool { - prefix := hash + ":" + prefix := hash + token.Colon for _, h := range s.ImportedHashes { if h == hash || len(h) > len(hash) && h[:len(prefix)] == prefix { return true @@ -92,7 +87,7 @@ func (s *State) Imported(hash string) bool { // MarkImported records an entry hash with its target and date. func (s *State) MarkImported(hash, target string) { - date := time.Now().Format("2006-01-02") + date := time.Now().Format(time2.DateFormat) entry := fmt.Sprintf("%s:%s:%s", hash, target, date) s.ImportedHashes = append(s.ImportedHashes, entry) } @@ -104,5 +99,5 @@ func (s *State) MarkImportedDone() { } func statePath(contextDir string) string { - return filepath.Join(contextDir, config.DirState, config.FileMemoryState) + return filepath.Join(contextDir, dir.State, memory.MemoryState) } diff --git a/internal/memory/state_test.go b/internal/memory/state_test.go index ae5431e3..46b160d5 100644 --- a/internal/memory/state_test.go +++ b/internal/memory/state_test.go @@ -11,12 +11,13 @@ import ( "path/filepath" "testing" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/config/memory" ) func TestStateRoundtrip(t *testing.T) { contextDir := t.TempDir() - stateDir := filepath.Join(contextDir, config.DirState) + stateDir := filepath.Join(contextDir, dir.State) if mkErr := os.MkdirAll(stateDir, 0o755); mkErr != nil { t.Fatal(mkErr) } @@ -106,12 +107,12 @@ func TestDedup_MarkImportedFormat(t *testing.T) { func TestLoadState_CorruptJSON(t *testing.T) { contextDir := t.TempDir() - stateDir := filepath.Join(contextDir, config.DirState) + stateDir := filepath.Join(contextDir, dir.State) if mkErr := os.MkdirAll(stateDir, 0o755); mkErr != nil { t.Fatal(mkErr) } - path := filepath.Join(stateDir, config.FileMemoryState) + path := filepath.Join(stateDir, memory.MemoryState) if writeErr := os.WriteFile(path, []byte("{corrupt"), 0o644); writeErr != nil { 
t.Fatal(writeErr) } diff --git a/internal/memory/types.go b/internal/memory/types.go new file mode 100644 index 00000000..03a8417f --- /dev/null +++ b/internal/memory/types.go @@ -0,0 +1,60 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package memory + +import "time" + +// EntryKind identifies how an entry was delimited in MEMORY.md. +type EntryKind int + +const ( + // EntryHeader is a Markdown heading (## or ###). + EntryHeader EntryKind = iota + // EntryParagraph is a blank-line-separated paragraph. + EntryParagraph + // EntryList is one or more consecutive list items. + EntryList +) + +// Entry is a discrete block parsed from MEMORY.md. +type Entry struct { + Text string // Raw text of the entry (trimmed) + StartLine int // 1-based line number where the entry begins + Kind EntryKind // How the entry was delimited +} + +// Classification is the result of heuristic entry classification. +type Classification struct { + Target string // config.Entry* constant or "skip" + Keywords []string // Keywords that triggered the match +} + +// PublishResult holds what was selected for publishing. +type PublishResult struct { + Tasks []string + Decisions []string + Conventions []string + Learnings []string + TotalLines int +} + +// State tracks memory bridge sync timestamps and import/publish progress. +type State struct { + LastSync *time.Time `json:"last_sync"` + LastImport *time.Time `json:"last_import"` + LastPublish *time.Time `json:"last_publish"` + ImportedHashes []string `json:"imported_hashes"` +} + +// SyncResult holds the outcome of a Sync operation. 
+type SyncResult struct { + SourcePath string + MirrorPath string + ArchivedTo string // empty if no prior mirror existed + SourceLines int + MirrorLines int // lines in the previous mirror (0 if first sync) +} diff --git a/internal/notify/notify.go b/internal/notify/notify.go index fcd70333..283378e2 100644 --- a/internal/notify/notify.go +++ b/internal/notify/notify.go @@ -19,7 +19,8 @@ import ( "path/filepath" "time" - "github.com/ActiveMemory/ctx/internal/config" + crypto2 "github.com/ActiveMemory/ctx/internal/config/crypto" + "github.com/ActiveMemory/ctx/internal/config/fs" "github.com/ActiveMemory/ctx/internal/crypto" "github.com/ActiveMemory/ctx/internal/rc" ) @@ -54,9 +55,9 @@ type Payload struct { // (silent noop — webhook not configured). func LoadWebhook() (string, error) { contextDir := rc.ContextDir() - config.MigrateKeyFile(contextDir) + crypto.MigrateKeyFile(contextDir) kp := rc.KeyPath() - encPath := filepath.Join(contextDir, config.FileNotifyEnc) + encPath := filepath.Join(contextDir, crypto2.NotifyEnc) key, err := crypto.LoadKey(kp) if err != nil { @@ -87,9 +88,9 @@ func LoadWebhook() (string, error) { // If the scratchpad key does not exist, it is generated and saved first. 
func SaveWebhook(url string) error { contextDir := rc.ContextDir() - config.MigrateKeyFile(contextDir) + crypto.MigrateKeyFile(contextDir) kp := rc.KeyPath() - encPath := filepath.Join(contextDir, config.FileNotifyEnc) + encPath := filepath.Join(contextDir, crypto2.NotifyEnc) key, err := crypto.LoadKey(kp) if err != nil { @@ -98,7 +99,7 @@ func SaveWebhook(url string) error { if err != nil { return err } - if mkdirErr := os.MkdirAll(filepath.Dir(kp), config.PermKeyDir); mkdirErr != nil { + if mkdirErr := os.MkdirAll(filepath.Dir(kp), fs.PermKeyDir); mkdirErr != nil { return mkdirErr } if saveErr := crypto.SaveKey(kp, key); saveErr != nil { @@ -111,7 +112,7 @@ func SaveWebhook(url string) error { return err } - return os.WriteFile(encPath, ciphertext, config.PermSecret) + return os.WriteFile(encPath, ciphertext, fs.PermSecret) } // EventAllowed reports whether the given event passes the filter. @@ -168,8 +169,7 @@ func Send(event, message, sessionID string, detail *TemplateRef) error { return nil } - client := &http.Client{Timeout: 5 * time.Second} - resp, err := client.Post(url, "application/json", bytes.NewReader(body)) //nolint:gosec // URL is user-configured via encrypted storage + resp, err := PostJSON(url, body) if err != nil { return nil // fire-and-forget } @@ -177,3 +177,41 @@ func Send(event, message, sessionID string, detail *TemplateRef) error { return nil } + +// PostJSON sends a JSON payload to a webhook URL and returns the response. +// The URL is always user-configured via encrypted storage. +// +// Parameters: +// - url: webhook endpoint. +// - body: JSON-encoded payload bytes. +// +// Returns: +// - *http.Response: the HTTP response (caller must close Body). +// - error: on HTTP failure. 
+func PostJSON(url string, body []byte) (*http.Response, error) { + client := &http.Client{Timeout: 5 * time.Second} + return client.Post(url, "application/json", bytes.NewReader(body)) //nolint:gosec // URL is user-configured via encrypted storage +} + +// MaskURL shows the scheme + host and masks everything after the path start. +// +// Parameters: +// - url: full webhook URL. +// +// Returns: +// - string: masked URL safe for display. +func MaskURL(url string) string { + count := 0 + for i, c := range url { + if c == '/' { + count++ + if count == 3 { + return url[:i] + "/***" + } + } + } + if len(url) > 20 { + return url[:20] + "***" + } + return url +} diff --git a/internal/notify/notify_test.go b/internal/notify/notify_test.go index 89f64787..b712d415 100644 --- a/internal/notify/notify_test.go +++ b/internal/notify/notify_test.go @@ -14,7 +14,7 @@ import ( "path/filepath" "testing" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/crypto" "github.com/ActiveMemory/ctx/internal/rc" ) @@ -52,7 +52,7 @@ func TestLoadWebhook_NoFile(t *testing.T) { defer cleanup() // Create key but no encrypted file - keyPath := filepath.Join(tempDir, ".context", config.FileContextKey) + keyPath := filepath.Join(tempDir, ".context", crypto.ContextKey) _ = os.WriteFile(keyPath, make([]byte, 32), 0o600) url, err := LoadWebhook() @@ -297,7 +297,7 @@ func TestLoadWebhook_CorruptedFile(t *testing.T) { } // Corrupt the encrypted file with garbage bytes. 
- encPath := filepath.Join(tempDir, ".context", config.FileNotifyEnc) + encPath := filepath.Join(tempDir, ".context", crypto.NotifyEnc) if writeErr := os.WriteFile(encPath, []byte("corrupted-garbage-data"), 0o600); writeErr != nil { t.Fatalf("WriteFile() error = %v", writeErr) } diff --git a/internal/parse/date.go b/internal/parse/date.go index f1288730..b12ff67e 100644 --- a/internal/parse/date.go +++ b/internal/parse/date.go @@ -9,7 +9,7 @@ package parse import ( "time" - "github.com/ActiveMemory/ctx/internal/config" + time2 "github.com/ActiveMemory/ctx/internal/config/time" ) // Date parses a YYYY-MM-DD string into a time.Time at midnight UTC. @@ -25,5 +25,5 @@ func Date(s string) (time.Time, error) { if s == "" { return time.Time{}, nil } - return time.Parse(config.DateFormat, s) + return time.Parse(time2.DateFormat, s) } diff --git a/internal/rc/default.go b/internal/rc/default.go index b3c423b3..420bb972 100644 --- a/internal/rc/default.go +++ b/internal/rc/default.go @@ -6,36 +6,17 @@ package rc -// DefaultTokenBudget is the default token budget when not configured. -const DefaultTokenBudget = 8000 - -// DefaultArchiveAfterDays is the default days before archiving. -const DefaultArchiveAfterDays = 7 - -// DefaultEntryCountLearnings is the entry count threshold for LEARNINGS.md. -// Learnings are situational; many become stale. Warn above this count. -const DefaultEntryCountLearnings = 30 - -// DefaultEntryCountDecisions is the entry count threshold for DECISIONS.md. -// Decisions are more durable but still compound. Warn above this count. -const DefaultEntryCountDecisions = 20 - -// DefaultConventionLineCount is the line count threshold for CONVENTIONS.md. -// Conventions lack dated entry headers, so line count is used instead. -const DefaultConventionLineCount = 200 - -// DefaultInjectionTokenWarn is the token threshold for oversize injection warning. 
-// When auto-injected context exceeds this count, a flag file is written for -// check-context-size to pick up. 0 disables the check. -const DefaultInjectionTokenWarn = 15000 - -// DefaultContextWindow is the default context window size in tokens. -// Matches Claude Opus/Sonnet (200k). Override via `context_window` in .ctxrc. -const DefaultContextWindow = 200000 - -// DefaultTaskNudgeInterval is the number of Edit/Write calls between task -// completion nudges. Set to 0 in .ctxrc to disable. -const DefaultTaskNudgeInterval = 5 - -// DefaultKeyRotationDays is the number of days before a key rotation nudge. -const DefaultKeyRotationDays = 90 +import "github.com/ActiveMemory/ctx/internal/config/runtime" + +// Aliases re-exported from config/runtime for use within rc. +const ( + DefaultTokenBudget = runtime.DefaultTokenBudget + DefaultArchiveAfterDays = runtime.DefaultArchiveAfterDays + DefaultEntryCountLearnings = runtime.DefaultEntryCountLearnings + DefaultEntryCountDecisions = runtime.DefaultEntryCountDecisions + DefaultConventionLineCount = runtime.DefaultConventionLineCount + DefaultInjectionTokenWarn = runtime.DefaultInjectionTokenWarn + DefaultContextWindow = runtime.DefaultContextWindow + DefaultTaskNudgeInterval = runtime.DefaultTaskNudgeInterval + DefaultKeyRotationDays = runtime.DefaultKeyRotationDays +) diff --git a/internal/rc/load.go b/internal/rc/load.go index 6a374b54..0325505f 100644 --- a/internal/rc/load.go +++ b/internal/rc/load.go @@ -11,9 +11,12 @@ import ( "os" "strconv" + "github.com/ActiveMemory/ctx/internal/config/env" + "github.com/ActiveMemory/ctx/internal/config/file" + "github.com/ActiveMemory/ctx/internal/config/token" "gopkg.in/yaml.v3" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/assets" ) // loadRC loads configuration from the .ctxrc file and applies env @@ -25,19 +28,19 @@ func loadRC() *CtxRC { cfg := Default() // Try to load .ctxrc from the current directory - data, err := 
os.ReadFile(config.FileContextRC) + data, err := os.ReadFile(file.CtxRC) if err == nil { if yamlErr := yaml.Unmarshal(data, cfg); yamlErr != nil { - fmt.Fprintf(os.Stderr, "ctx: warning: failed to parse %s: %v (using defaults)\n", - config.FileContextRC, yamlErr) + _, _ = fmt.Fprintf(os.Stderr, assets.TextDesc(assets.TextDescKeyRcParseWarning)+token.NewlineLF, + file.CtxRC, yamlErr) } } // Apply environment variable overrides - if envDir := os.Getenv(config.EnvCtxDir); envDir != "" { + if envDir := os.Getenv(env.CtxDir); envDir != "" { cfg.ContextDir = envDir } - if envBudget := os.Getenv(config.EnvCtxTokenBudget); envBudget != "" { + if envBudget := os.Getenv(env.CtxTokenBudget); envBudget != "" { if budget, err := strconv.Atoi(envBudget); err == nil && budget > 0 { cfg.TokenBudget = budget } diff --git a/internal/rc/rc.go b/internal/rc/rc.go index c360280a..ed117663 100644 --- a/internal/rc/rc.go +++ b/internal/rc/rc.go @@ -10,7 +10,9 @@ package rc import ( "sync" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/ctx" + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/crypto" ) // Default returns a new CtxRC with hardcoded default values. @@ -20,9 +22,9 @@ import ( // (8000 token budget, 7-day archive, etc.) 
func Default() *CtxRC { return &CtxRC{ - ContextDir: config.DirContext, + ContextDir: dir.Context, TokenBudget: DefaultTokenBudget, - PriorityOrder: nil, // nil means use config.FileReadOrder + PriorityOrder: nil, // nil means use config.ReadOrder AutoArchive: true, ArchiveAfterDays: DefaultArchiveAfterDays, EntryCountLearnings: DefaultEntryCountLearnings, @@ -77,7 +79,7 @@ func TokenBudget() int { // // Returns: // - []string: File names in priority order, or nil if not configured -// (callers should fall back to config.FileReadOrder) +// (callers should fall back to config.ReadOrder) func PriorityOrder() []string { return RC().PriorityOrder } @@ -201,7 +203,7 @@ func NotifyEvents() []string { // Returns: // - string: Resolved path to the encryption key file func KeyPath() string { - return config.ResolveKeyPath(ContextDir(), RC().KeyPathOverride) + return crypto.ResolveKeyPath(ContextDir(), RC().KeyPathOverride) } // KeyRotationDays returns the configured key rotation threshold in days. @@ -278,7 +280,7 @@ func Reset() { // FilePriority returns the priority of a context file. // // If a priority_order is configured in .ctxrc, that order is used. -// Otherwise, the default config.FileReadOrder is used. +// Otherwise, the default config.ReadOrder is used. // // Lower numbers indicate higher priority (1 = highest). // Unknown files return 100. 
@@ -300,8 +302,8 @@ func FilePriority(name string) int { return 100 } - // Use the default priority from config.FileReadOrder - for i, fName := range config.FileReadOrder { + // Use the default priority from config.ReadOrder + for i, fName := range ctx.ReadOrder { if fName == name { return i + 1 } diff --git a/internal/rc/rc_test.go b/internal/rc/rc_test.go index bd43b5fa..f48f059c 100644 --- a/internal/rc/rc_test.go +++ b/internal/rc/rc_test.go @@ -11,14 +11,16 @@ import ( "path/filepath" "testing" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/ctx" + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/config/env" ) func TestDefaultRC(t *testing.T) { rc := Default() - if rc.ContextDir != config.DirContext { - t.Errorf("ContextDir = %q, want %q", rc.ContextDir, config.DirContext) + if rc.ContextDir != dir.Context { + t.Errorf("ContextDir = %q, want %q", rc.ContextDir, dir.Context) } if rc.TokenBudget != DefaultTokenBudget { t.Errorf("TokenBudget = %d, want %d", rc.TokenBudget, DefaultTokenBudget) @@ -45,8 +47,8 @@ func TestGetRC_NoFile(t *testing.T) { rc := RC() - if rc.ContextDir != config.DirContext { - t.Errorf("ContextDir = %q, want %q", rc.ContextDir, config.DirContext) + if rc.ContextDir != dir.Context { + t.Errorf("ContextDir = %q, want %q", rc.ContextDir, dir.Context) } if rc.TokenBudget != DefaultTokenBudget { t.Errorf("TokenBudget = %d, want %d", rc.TokenBudget, DefaultTokenBudget) @@ -104,8 +106,8 @@ token_budget: 4000 _ = os.WriteFile(filepath.Join(tempDir, ".ctxrc"), []byte(rcContent), 0600) // Set environment variables (t.Setenv auto-restores after test) - t.Setenv(config.EnvCtxDir, "env-context") - t.Setenv(config.EnvCtxTokenBudget, "2000") + t.Setenv(env.CtxDir, "env-context") + t.Setenv(env.CtxTokenBudget, "2000") Reset() @@ -131,7 +133,7 @@ func TestGetContextDir_CLIOverride(t *testing.T) { _ = os.WriteFile(filepath.Join(tempDir, ".ctxrc"), []byte(rcContent), 
0600) // Set env override (t.Setenv auto-restores after test) - t.Setenv(config.EnvCtxDir, "env-context") + t.Setenv(env.CtxDir, "env-context") Reset() @@ -197,8 +199,8 @@ func TestGetRC_PartialConfig(t *testing.T) { t.Errorf("TokenBudget = %d, want %d", rc.TokenBudget, 5000) } // Unspecified values should use defaults - if rc.ContextDir != config.DirContext { - t.Errorf("ContextDir = %q, want %q (default)", rc.ContextDir, config.DirContext) + if rc.ContextDir != dir.Context { + t.Errorf("ContextDir = %q, want %q (default)", rc.ContextDir, dir.Context) } } @@ -208,7 +210,7 @@ func TestGetRC_InvalidEnvBudget(t *testing.T) { _ = os.Chdir(tempDir) defer func() { _ = os.Chdir(origDir) }() - t.Setenv(config.EnvCtxTokenBudget, "not-a-number") + t.Setenv(env.CtxTokenBudget, "not-a-number") Reset() @@ -389,16 +391,16 @@ func TestFilePriority_DefaultOrder(t *testing.T) { Reset() - // CONSTITUTION.md should be first in default FileReadOrder - p := FilePriority(config.FileConstitution) + // CONSTITUTION.md should be first in default ReadOrder + p := FilePriority(ctx.Constitution) if p != 1 { - t.Errorf("FilePriority(%q) = %d, want 1", config.FileConstitution, p) + t.Errorf("FilePriority(%q) = %d, want 1", ctx.Constitution, p) } // TASKS.md should be second - p = FilePriority(config.FileTask) + p = FilePriority(ctx.Task) if p != 2 { - t.Errorf("FilePriority(%q) = %d, want 2", config.FileTask, p) + t.Errorf("FilePriority(%q) = %d, want 2", ctx.Task, p) } // Unknown file gets 100 @@ -423,15 +425,15 @@ func TestFilePriority_CustomOrder(t *testing.T) { Reset() // DECISIONS.md should be first in custom order - p := FilePriority(config.FileDecision) + p := FilePriority(ctx.Decision) if p != 1 { - t.Errorf("FilePriority(%q) = %d, want 1", config.FileDecision, p) + t.Errorf("FilePriority(%q) = %d, want 1", ctx.Decision, p) } // TASKS.md should be second - p = FilePriority(config.FileTask) + p = FilePriority(ctx.Task) if p != 2 { - t.Errorf("FilePriority(%q) = %d, want 2", 
config.FileTask, p) + t.Errorf("FilePriority(%q) = %d, want 2", ctx.Task, p) } // File not in custom order gets 100 @@ -449,9 +451,9 @@ func TestContextDir_NoOverride(t *testing.T) { Reset() - dir := ContextDir() - if dir != config.DirContext { - t.Errorf("ContextDir() = %q, want %q", dir, config.DirContext) + got := ContextDir() + if got != dir.Context { + t.Errorf("ContextDir() = %q, want %q", got, dir.Context) } } @@ -598,7 +600,7 @@ func TestGetRC_NegativeEnvBudget(t *testing.T) { _ = os.Chdir(tempDir) defer func() { _ = os.Chdir(origDir) }() - t.Setenv(config.EnvCtxTokenBudget, "-100") + t.Setenv(env.CtxTokenBudget, "-100") Reset() diff --git a/internal/rc/types.go b/internal/rc/types.go index f42dafa1..bef79f3c 100644 --- a/internal/rc/types.go +++ b/internal/rc/types.go @@ -29,6 +29,7 @@ package rc // - TaskNudgeInterval: Edit/Write calls between task completion nudges (default 5, 0 = disabled) // - KeyPathOverride: Explicit encryption key file path (default: auto-resolved) type CtxRC struct { + Profile string `yaml:"profile"` ContextDir string `yaml:"context_dir"` TokenBudget int `yaml:"token_budget"` PriorityOrder []string `yaml:"priority_order"` diff --git a/internal/rc/validate.go b/internal/rc/validate.go index 26368635..40046142 100644 --- a/internal/rc/validate.go +++ b/internal/rc/validate.go @@ -8,6 +8,7 @@ package rc import ( "bytes" + "errors" "io" "gopkg.in/yaml.v3" @@ -36,7 +37,8 @@ func Validate(data []byte) (warnings []string, err error) { } // yaml.v3 returns *yaml.TypeError for unknown fields. 
- if te, ok := decErr.(*yaml.TypeError); ok { + var te *yaml.TypeError + if errors.As(decErr, &te) { return te.Errors, nil } diff --git a/internal/recall/parser/claude.go b/internal/recall/parser/claude.go index 487f47a0..f123869e 100644 --- a/internal/recall/parser/claude.go +++ b/internal/recall/parser/claude.go @@ -9,13 +9,16 @@ package parser import ( "bufio" "encoding/json" - "fmt" "os" "path/filepath" "sort" "strings" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/claude" + "github.com/ActiveMemory/ctx/internal/config/file" + "github.com/ActiveMemory/ctx/internal/config/parser" + "github.com/ActiveMemory/ctx/internal/config/session" + ctxerr "github.com/ActiveMemory/ctx/internal/err" ) // ClaudeCodeParser parses Claude Code JSONL session files. @@ -38,7 +41,7 @@ func NewClaudeCodeParser() *ClaudeCodeParser { // Returns: // - string: The identifier "claude-code" func (p *ClaudeCodeParser) Tool() string { - return config.ToolClaudeCode + return session.ToolClaudeCode } // Matches returns true if the file appears to be a Claude Code session file. 
@@ -53,21 +56,21 @@ func (p *ClaudeCodeParser) Tool() string { // - bool: True if this parser can handle the file func (p *ClaudeCodeParser) Matches(path string) bool { // Check extension - if !strings.HasSuffix(path, config.ExtJSONL) { + if !strings.HasSuffix(path, file.ExtJSONL) { return false } // Peek at the first few lines to detect the Claude Code format - file, openErr := os.Open(filepath.Clean(path)) + f, openErr := os.Open(filepath.Clean(path)) if openErr != nil { return false } - defer func() { _ = file.Close() }() + defer func() { _ = f.Close() }() - scanner := bufio.NewScanner(file) + scanner := bufio.NewScanner(f) // Check the first N lines for Claude Code message structure // (early lines can be file-history-snapshot which should be skipped) - for i := 0; i < config.ParserPeekLines && scanner.Scan(); i++ { + for i := 0; i < parser.LinesToPeek && scanner.Scan(); i++ { line := scanner.Bytes() if len(line) == 0 { continue @@ -80,7 +83,8 @@ func (p *ClaudeCodeParser) Matches(path string) bool { // Claude Code messages have sessionId and type (user/assistant) // Note: slug field was removed in newer Claude Code versions - if raw.SessionID != "" && (raw.Type == config.RoleUser || raw.Type == config.RoleAssistant) { + if raw.SessionID != "" && (raw.Type == claude.RoleUser || + raw.Type == claude.RoleAssistant) { return true } } @@ -101,19 +105,19 @@ func (p *ClaudeCodeParser) Matches(path string) bool { // - []*Session: All sessions found in the file, sorted by start time // - error: Non-nil if the file cannot be opened or read func (p *ClaudeCodeParser) ParseFile(path string) ([]*Session, error) { - file, openErr := os.Open(filepath.Clean(path)) + f, openErr := os.Open(filepath.Clean(path)) if openErr != nil { - return nil, fmt.Errorf("open file: %w", openErr) + return nil, ctxerr.ParserOpenFile(openErr) } - defer func() { _ = file.Close() }() + defer func() { _ = f.Close() }() // Group messages by session ID sessionMsgs := 
make(map[string][]claudeRawMessage) - scanner := bufio.NewScanner(file) + scanner := bufio.NewScanner(f) // Increase buffer size for large lines - buf := make([]byte, 0, 64*1024) - scanner.Buffer(buf, 1024*1024) // 1MB max line size + buf := make([]byte, 0, parser.BufInitSize) + scanner.Buffer(buf, parser.BufMaxSize) lineNum := 0 for scanner.Scan() { @@ -130,7 +134,7 @@ func (p *ClaudeCodeParser) ParseFile(path string) ([]*Session, error) { } // Skip non-message lines (e.g., file-history-snapshot) - if raw.Type != config.RoleUser && raw.Type != config.RoleAssistant { + if raw.Type != claude.RoleUser && raw.Type != claude.RoleAssistant { continue } @@ -142,7 +146,7 @@ func (p *ClaudeCodeParser) ParseFile(path string) ([]*Session, error) { } if scanErr := scanner.Err(); scanErr != nil { - return nil, fmt.Errorf("scan file: %w", scanErr) + return nil, ctxerr.ParserScanFile(scanErr) } // Convert to sessions @@ -182,11 +186,11 @@ func (p *ClaudeCodeParser) ParseLine(line []byte) (*Message, string, error) { var raw claudeRawMessage if unmarshalErr := json.Unmarshal(line, &raw); unmarshalErr != nil { - return nil, "", fmt.Errorf("unmarshal: %w", unmarshalErr) + return nil, "", ctxerr.ParserUnmarshal(unmarshalErr) } // Skip non-message lines - if raw.Type != config.RoleUser && raw.Type != config.RoleAssistant { + if raw.Type != claude.RoleUser && raw.Type != claude.RoleAssistant { return nil, "", nil } diff --git a/internal/recall/parser/git.go b/internal/recall/parser/git.go index 21bd11b0..c39a0d7d 100644 --- a/internal/recall/parser/git.go +++ b/internal/recall/parser/git.go @@ -35,6 +35,10 @@ func gitRemote(dir string) string { return "" } + if _, lookErr := exec.LookPath("git"); lookErr != nil { + return "" + } + cmd := exec.Command("git", "-C", dir, "remote", "get-url", "origin") output, cmdErr := cmd.Output() if cmdErr != nil { diff --git a/internal/recall/parser/markdown.go b/internal/recall/parser/markdown.go index 54ef9d31..51790f2d 100644 --- 
a/internal/recall/parser/markdown.go +++ b/internal/recall/parser/markdown.go @@ -8,13 +8,20 @@ package parser import ( "bufio" - "fmt" "os" "path/filepath" "strings" "time" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/config/claude" + "github.com/ActiveMemory/ctx/internal/config/dir" + "github.com/ActiveMemory/ctx/internal/config/file" + "github.com/ActiveMemory/ctx/internal/config/parser" + "github.com/ActiveMemory/ctx/internal/config/session" + time2 "github.com/ActiveMemory/ctx/internal/config/time" + "github.com/ActiveMemory/ctx/internal/config/token" + ctxerr "github.com/ActiveMemory/ctx/internal/err" ) // MarkdownSessionParser parses Markdown session files written by AI agents. @@ -53,7 +60,7 @@ func NewMarkdownSessionParser() *MarkdownSessionParser { // Returns: // - string: The identifier "markdown" func (p *MarkdownSessionParser) Tool() string { - return config.ToolMarkdown + return session.ToolMarkdown } // Matches returns true if the file appears to be a Markdown session file. 
@@ -67,24 +74,24 @@ func (p *MarkdownSessionParser) Tool() string { // Returns: // - bool: True if this parser can handle the file func (p *MarkdownSessionParser) Matches(path string) bool { - if !strings.HasSuffix(path, config.ExtMarkdown) { + if !strings.HasSuffix(path, file.ExtMarkdown) { return false } // Skip README.md files base := filepath.Base(path) - if strings.EqualFold(base, config.FilenameReadme) { + if strings.EqualFold(base, file.Readme) { return false } - file, err := os.Open(filepath.Clean(path)) + f, err := os.Open(filepath.Clean(path)) if err != nil { return false } - defer func() { _ = file.Close() }() + defer func() { _ = f.Close() }() - scanner := bufio.NewScanner(file) - for i := 0; i < config.ParserPeekLines && scanner.Scan(); i++ { + scanner := bufio.NewScanner(f) + for i := 0; i < parser.LinesToPeek && scanner.Scan(); i++ { line := strings.TrimSpace(scanner.Text()) if isSessionHeader(line) { return true @@ -108,7 +115,7 @@ func (p *MarkdownSessionParser) Matches(path string) bool { func (p *MarkdownSessionParser) ParseFile(path string) ([]*Session, error) { content, err := os.ReadFile(filepath.Clean(path)) if err != nil { - return nil, fmt.Errorf("read file: %w", err) + return nil, ctxerr.ParserReadFile(err) } session := p.parseMarkdownSession(string(content), path) @@ -141,7 +148,7 @@ func (p *MarkdownSessionParser) ParseLine(_ []byte) (*Message, string, error) { func (p *MarkdownSessionParser) parseMarkdownSession( content string, sourcePath string, ) *Session { - lines := strings.Split(content, config.NewlineLF) + lines := strings.Split(content, token.NewlineLF) var headerLine string for _, line := range lines { @@ -160,7 +167,7 @@ func (p *MarkdownSessionParser) parseMarkdownSession( // Derive a session ID from the filename (stable, OS-agnostic) base := filepath.Base(sourcePath) - sessionID := strings.TrimSuffix(base, config.ExtMarkdown) + sessionID := strings.TrimSuffix(base, file.ExtMarkdown) // Parse date from header or fall back to 
file modification time startTime := parseSessionDate(date) @@ -184,7 +191,7 @@ func (p *MarkdownSessionParser) parseMarkdownSession( var bodyParts []string for _, sec := range sections { if sec.body != "" { - bodyParts = append(bodyParts, "## "+sec.heading+config.NewlineLF+sec.body) + bodyParts = append(bodyParts, token.HeadingLevelTwoStart+sec.heading+token.NewlineLF+sec.body) } } @@ -192,8 +199,8 @@ func (p *MarkdownSessionParser) parseMarkdownSession( messages = append(messages, Message{ ID: sessionID + "-summary", Timestamp: startTime, - Role: config.RoleAssistant, - Text: strings.Join(bodyParts, config.NewlineLF+config.NewlineLF), + Role: claude.RoleAssistant, + Text: strings.Join(bodyParts, token.NewlineLF+token.NewlineLF), }) } @@ -203,7 +210,7 @@ func (p *MarkdownSessionParser) parseMarkdownSession( messages = append([]Message{{ ID: sessionID + "-topic", Timestamp: startTime, - Role: config.RoleUser, + Role: claude.RoleUser, Text: topic, }}, messages...) } @@ -211,10 +218,10 @@ func (p *MarkdownSessionParser) parseMarkdownSession( cwd := "" project := "" // Try to infer project from the path (look for .context/sessions/ pattern) - dir := filepath.Dir(sourcePath) - if filepath.Base(dir) == "sessions" { - contextDir := filepath.Dir(dir) - if filepath.Base(contextDir) == config.DirContext { + d := filepath.Dir(sourcePath) + if filepath.Base(d) == "sessions" { + contextDir := filepath.Dir(d) + if filepath.Base(contextDir) == dir.Context { projectDir := filepath.Dir(contextDir) project = filepath.Base(projectDir) cwd = projectDir @@ -224,7 +231,7 @@ func (p *MarkdownSessionParser) parseMarkdownSession( return &Session{ ID: sessionID, Slug: sessionID, - Tool: config.ToolMarkdown, + Tool: session.ToolMarkdown, SourceFile: sourcePath, CWD: cwd, Project: project, @@ -242,7 +249,6 @@ func (p *MarkdownSessionParser) parseMarkdownSession( // Recognized formats: // - "# Session: YYYY-MM-DD — Topic" // - "# Session: YYYY-MM-DD - Topic" -// - "# Oturum: YYYY-MM-DD — 
Topic" (Turkish) // - "# YYYY-MM-DD — Topic" // - "# YYYY-MM-DD - Topic" // @@ -252,14 +258,17 @@ func (p *MarkdownSessionParser) parseMarkdownSession( // Returns: // - bool: True if the line matches a session header pattern func isSessionHeader(line string) bool { - if !strings.HasPrefix(line, "# ") { + if !strings.HasPrefix(line, token.HeadingLevelOneStart) { return false } - rest := line[2:] + rest := line[len(token.HeadingLevelOneStart):] // Check for "Session:" or "Oturum:" prefix - for _, prefix := range []string{"Session:", "Oturum:"} { + for _, prefix := range []string{ + assets.TextDesc(assets.TextDescKeyParserSessionPrefix), + assets.TextDesc(assets.TextDescKeyParserSessionPrefixAlt), + } { if strings.HasPrefix(rest, prefix) { return true } @@ -283,17 +292,21 @@ func isSessionHeader(line string) bool { // - string: The topic portion (e.g., "Fix API") func parseSessionHeader(line string) (string, string) { // Remove "# " prefix - rest := strings.TrimPrefix(line, "# ") - - // Remove "Session: " or "Oturum: " prefix if present - for _, prefix := range []string{"Session: ", "Oturum: ", "Session:", "Oturum:"} { + rest := strings.TrimPrefix(line, token.HeadingLevelOneStart) + + // Remove "Session: " / "Oturum: " prefix if present + for _, prefix := range []string{ + assets.TextDesc(assets.TextDescKeyParserSessionPrefix), + assets.TextDesc(assets.TextDescKeyParserSessionPrefixAlt), + } { + rest = strings.TrimPrefix(rest, prefix+token.Space) rest = strings.TrimPrefix(rest, prefix) } rest = strings.TrimSpace(rest) // Split on " — " (em dash) or " - " (hyphen) - for _, sep := range []string{" \u2014 ", " - "} { + for _, sep := range []string{" — ", " - "} { if idx := strings.Index(rest, sep); idx >= 0 { return strings.TrimSpace(rest[:idx]), strings.TrimSpace(rest[idx+len(sep):]) } @@ -313,7 +326,7 @@ func parseSessionHeader(line string) (string, string) { // Returns: // - time.Time: Parsed time, or zero value on failure func parseSessionDate(dateStr string) 
time.Time { - t, err := time.Parse("2006-01-02", dateStr) + t, err := time.Parse(time2.DateFormat, dateStr) if err != nil { return time.Time{} } @@ -342,17 +355,17 @@ func extractSections(lines []string) []section { for _, line := range lines { trimmed := strings.TrimSpace(line) - if strings.HasPrefix(trimmed, "## ") { + if strings.HasPrefix(trimmed, token.HeadingLevelTwoStart) { // Save previous section if currentHeading != "" { sections = append(sections, section{ heading: currentHeading, body: strings.TrimSpace( - strings.Join(currentBody, config.NewlineLF), + strings.Join(currentBody, token.NewlineLF), ), }) } - currentHeading = strings.TrimPrefix(trimmed, "## ") + currentHeading = strings.TrimPrefix(trimmed, token.HeadingLevelTwoStart) currentBody = nil continue } @@ -367,7 +380,7 @@ func extractSections(lines []string) []section { sections = append(sections, section{ heading: currentHeading, body: strings.TrimSpace( - strings.Join(currentBody, config.NewlineLF), + strings.Join(currentBody, token.NewlineLF), ), }) } diff --git a/internal/recall/parser/markdown_test.go b/internal/recall/parser/markdown_test.go index 2256f552..9dd339b4 100644 --- a/internal/recall/parser/markdown_test.go +++ b/internal/recall/parser/markdown_test.go @@ -11,13 +11,13 @@ import ( "path/filepath" "testing" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/session" ) func TestMarkdownSessionParser_Tool(t *testing.T) { p := NewMarkdownSessionParser() - if got := p.Tool(); got != config.ToolMarkdown { - t.Errorf("Tool() = %q, want %q", got, config.ToolMarkdown) + if got := p.Tool(); got != session.ToolMarkdown { + t.Errorf("Tool() = %q, want %q", got, session.ToolMarkdown) } } @@ -138,8 +138,8 @@ func TestMarkdownSessionParser_ParseFile(t *testing.T) { if s.ID != "2026-01-15-fix-api" { t.Errorf("ID = %q, want %q", s.ID, "2026-01-15-fix-api") } - if s.Tool != config.ToolMarkdown { - t.Errorf("Tool = %q, want %q", s.Tool, config.ToolMarkdown) 
+ if s.Tool != session.ToolMarkdown { + t.Errorf("Tool = %q, want %q", s.Tool, session.ToolMarkdown) } if s.FirstUserMsg != "Fix API Rate Limiting" { t.Errorf("FirstUserMsg = %q, want %q", s.FirstUserMsg, "Fix API Rate Limiting") @@ -355,10 +355,10 @@ func TestScanDirectory_WithMarkdown(t *testing.T) { for _, s := range sessions { tools[s.Tool] = true } - if !tools[config.ToolMarkdown] { + if !tools[session.ToolMarkdown] { t.Error("expected markdown session in results") } - if !tools[config.ToolClaudeCode] { + if !tools[session.ToolClaudeCode] { t.Error("expected claude-code session in results") } } @@ -367,22 +367,22 @@ func TestRegisteredTools_IncludesMarkdown(t *testing.T) { tools := RegisteredTools() found := false for _, tool := range tools { - if tool == config.ToolMarkdown { + if tool == session.ToolMarkdown { found = true break } } if !found { - t.Errorf("expected %q in registered tools", config.ToolMarkdown) + t.Errorf("expected %q in registered tools", session.ToolMarkdown) } } func TestGetParser_Markdown(t *testing.T) { - p := Parser(config.ToolMarkdown) + p := Parser(session.ToolMarkdown) if p == nil { - t.Fatalf("expected parser for %q", config.ToolMarkdown) + t.Fatalf("expected parser for %q", session.ToolMarkdown) } - if p.Tool() != config.ToolMarkdown { - t.Errorf("Tool() = %q, want %q", p.Tool(), config.ToolMarkdown) + if p.Tool() != session.ToolMarkdown { + t.Errorf("Tool() = %q, want %q", p.Tool(), session.ToolMarkdown) } } diff --git a/internal/recall/parser/message.go b/internal/recall/parser/message.go index 42bb7992..c8efebb4 100644 --- a/internal/recall/parser/message.go +++ b/internal/recall/parser/message.go @@ -6,48 +6,17 @@ package parser -import "time" - -// Message represents a single message in a session. -// -// This is tool-agnostic - all parsers normalize to this format. 
-// -// Fields: -// -// Identity: -// - ID: Unique message identifier -// - Timestamp: When the message was created -// - Role: Message role ("user" or "assistant") -// -// Content: -// - Text: Main text content -// - Thinking: Reasoning content (if available) -// - ToolUses: Tool invocations in this message -// - ToolResults: Results from tool invocations -// -// Token Usage: -// - TokensIn: Input tokens for this message (if available) -// - TokensOut: Output tokens for this message (if available) -type Message struct { - ID string `json:"id"` - Timestamp time.Time `json:"timestamp"` - Role string `json:"role"` - - Text string `json:"text,omitempty"` - Thinking string `json:"thinking,omitempty"` - ToolUses []ToolUse `json:"tool_uses,omitempty"` - ToolResults []ToolResult `json:"tool_results,omitempty"` - - TokensIn int `json:"tokens_in,omitempty"` - TokensOut int `json:"tokens_out,omitempty"` -} +import ( + "github.com/ActiveMemory/ctx/internal/config/claude" + "github.com/ActiveMemory/ctx/internal/config/token" +) // BelongsToUser returns true if this is a user message. // // Returns: // - bool: True if Role is "user" func (m *Message) BelongsToUser() bool { - return m.Role == "user" + return m.Role == claude.RoleUser } // BelongsToAssistant returns true if this is an assistant message. @@ -55,7 +24,7 @@ func (m *Message) BelongsToUser() bool { // Returns: // - bool: True if Role is "assistant" func (m *Message) BelongsToAssistant() bool { - return m.Role == "assistant" + return m.Role == claude.RoleAssistant } // UsesTools returns true if this message contains tool invocations. @@ -77,5 +46,5 @@ func (m *Message) Preview(maxLen int) string { if len(m.Text) <= maxLen { return m.Text } - return m.Text[:maxLen] + "..." 
+ return m.Text[:maxLen] + token.Ellipsis } diff --git a/internal/recall/parser/parse.go b/internal/recall/parser/parse.go index ea596b54..3ca28b23 100644 --- a/internal/recall/parser/parse.go +++ b/internal/recall/parser/parse.go @@ -11,7 +11,9 @@ import ( "path/filepath" "sort" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/config/claude" + "github.com/ActiveMemory/ctx/internal/config/session" + "github.com/ActiveMemory/ctx/internal/config/token" ) // buildSession constructs a Session from raw Claude Code messages. @@ -41,7 +43,7 @@ func (p *ClaudeCodeParser) buildSession( session := &Session{ ID: id, Slug: first.Slug, - Tool: config.ToolClaudeCode, + Tool: session.ToolClaudeCode, SourceFile: sourcePath, CWD: first.CWD, Project: filepath.Base(first.CWD), @@ -62,7 +64,7 @@ func (p *ClaudeCodeParser) buildSession( // Truncate preview preview := msg.Text if len(preview) > 100 { - preview = preview[:100] + "..." + preview = preview[:100] + token.Ellipsis } session.FirstUserMsg = preview } @@ -115,19 +117,19 @@ func (p *ClaudeCodeParser) convertMessage(raw claudeRawMessage) Message { // Extract content from blocks for _, block := range blocks { switch block.Type { - case config.ClaudeBlockText: + case claude.BlockText: if msg.Text != "" { - msg.Text += config.NewlineLF + msg.Text += token.NewlineLF } msg.Text += block.Text - case config.ClaudeBlockThinking: + case claude.BlockThinking: if msg.Thinking != "" { - msg.Thinking += config.NewlineLF + msg.Thinking += token.NewlineLF } msg.Thinking += block.Thinking - case config.ClaudeBlockToolUse: + case claude.BlockToolUse: inputStr := "" if block.Input != nil { inputStr = string(block.Input) @@ -138,7 +140,7 @@ func (p *ClaudeCodeParser) convertMessage(raw claudeRawMessage) Message { Input: inputStr, }) - case config.ClaudeBlockToolResult: + case claude.BlockToolResult: contentStr := "" if block.Content != nil { // Try to unmarshal as JSON string first (handles escaping) @@ -185,7 
+187,7 @@ func (p *ClaudeCodeParser) parseContentBlocks( // Try parsing as a simple string var text string if err := json.Unmarshal(content, &text); err == nil && text != "" { - return []claudeRawBlock{{Type: config.ClaudeBlockText, Text: text}} + return []claudeRawBlock{{Type: claude.BlockText, Text: text}} } return nil diff --git a/internal/recall/parser/parser.go b/internal/recall/parser/parser.go index a0fd449d..f8ccb7d3 100644 --- a/internal/recall/parser/parser.go +++ b/internal/recall/parser/parser.go @@ -7,11 +7,13 @@ package parser import ( - "fmt" "os" "path/filepath" "sort" "strings" + + "github.com/ActiveMemory/ctx/internal/config/parser" + ctxerr "github.com/ActiveMemory/ctx/internal/err" ) // registeredParsers holds all available session parsers. @@ -37,7 +39,7 @@ func ParseFile(path string) ([]*Session, error) { return parser.ParseFile(path) } } - return nil, fmt.Errorf("no parser found for file: %s", path) + return nil, ctxerr.ParserNoMatch(path) } // ScanDirectory recursively scans a directory for session files. 
@@ -84,14 +86,14 @@ func ScanDirectoryWithErrors(dir string) ([]*Session, []error, error) { if info.IsDir() { // Skip subagents directories - they contain sidechain sessions // that share the parent sessionId and would cause duplicates - if info.Name() == "subagents" { + if info.Name() == parser.DirSubagents { return filepath.SkipDir } return nil } // Skip files in paths containing /subagents/ (defensive check) - if strings.Contains(path, string(filepath.Separator)+"subagents"+string(filepath.Separator)) { + if strings.Contains(path, string(filepath.Separator)+parser.DirSubagents+string(filepath.Separator)) { return nil } @@ -100,7 +102,7 @@ func ScanDirectoryWithErrors(dir string) ([]*Session, []error, error) { if parser.Matches(path) { sessions, err := parser.ParseFile(path) if err != nil { - parseErrors = append(parseErrors, fmt.Errorf("%s: %w", path, err)) + parseErrors = append(parseErrors, ctxerr.ParserFileError(path, err)) break } allSessions = append(allSessions, sessions...) @@ -112,7 +114,7 @@ func ScanDirectoryWithErrors(dir string) ([]*Session, []error, error) { }) if err != nil { - return nil, nil, fmt.Errorf("walk directory: %w", err) + return nil, nil, ctxerr.ParserWalkDir(err) } // Sort by start time (newest first) diff --git a/internal/recall/parser/path.go b/internal/recall/parser/path.go index 52b6e031..ff01a61d 100644 --- a/internal/recall/parser/path.go +++ b/internal/recall/parser/path.go @@ -12,7 +12,16 @@ import ( ) // getPathRelativeToHome returns the path relative to the user's home directory. -// Returns an empty string if the path is not under a home directory. +// +// Handles both Linux (/home/username/...) and macOS (/Users/username/...) +// home directory patterns. Returns an empty string if the path is empty +// or not under a recognized home directory root. 
+//
+// Parameters:
+// - path: Absolute file path to make relative
+//
+// Returns:
+// - string: Path relative to the home directory, or empty string
 func getPathRelativeToHome(path string) string {
 	if path == "" {
 		return ""
diff --git a/internal/recall/parser/query.go b/internal/recall/parser/query.go
index 8467e48b..14b31785 100644
--- a/internal/recall/parser/query.go
+++ b/internal/recall/parser/query.go
@@ -11,7 +11,7 @@ import (
 	"path/filepath"
 	"sort"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/dir"
 )
 
 // findSessionsWithFilter scans common locations and additional directories
@@ -53,12 +53,12 @@ func findSessionsWithFilter(
 	// Check Claude Code default location
 	home, err := os.UserHomeDir()
 	if err == nil {
-		scanOnce(filepath.Join(home, ".claude", "projects"))
+		scanOnce(filepath.Join(home, dir.Claude, dir.Projects))
 	}
 
 	// Check .context/sessions/ in the current working directory
 	if cwd, cwdErr := os.Getwd(); cwdErr == nil {
-		scanOnce(filepath.Join(cwd, config.DirContext, config.DirSessions))
+		scanOnce(filepath.Join(cwd, dir.Context, dir.Sessions))
 	}
 
 	// Check additional directories
diff --git a/internal/recall/parser/raw.go b/internal/recall/parser/raw.go
deleted file mode 100644
index 14aec48e..00000000
--- a/internal/recall/parser/raw.go
+++ /dev/null
@@ -1,115 +0,0 @@
-// / ctx: https://ctx.ist
-// ,'`./ do you remember?
-// `.,'\\
-// \ Copyright 2026-present Context contributors.
-// SPDX-License-Identifier: Apache-2.0
-
-package parser
-
-import (
-	"encoding/json"
-	"time"
-)
-
-// Claude Code JSONL raw types.
-//
-// These types mirror the on-disk JSONL format produced by Claude Code.
-// Each line in a Claude Code session file is a self-contained JSON object
-// that deserializes into claudeRawMessage.
-
-// claudeRawMessage represents a single JSONL line from a Claude Code session.
-//
-// Fields:
-// - UUID: Unique message identifier
-// - ParentUUID: Parent message for threading (nil for root messages)
-// - SessionID: Groups messages into a single session
-// - RequestID: API request correlation identifier
-// - Timestamp: When the message was created
-// - Type: Message role ("user", "assistant", or system types)
-// - UserType: Sub-type for user messages
-// - IsSidechain: True if the message is on a sidechain branch
-// - CWD: Working directory at message time
-// - GitBranch: Active git branch at message time
-// - Version: Claude Code version that created the message
-// - Slug: URL-friendly session identifier (removed in newer versions)
-// - Message: Nested content payload
-type claudeRawMessage struct {
-	UUID string `json:"uuid"`
-	ParentUUID *string `json:"parentUuid"`
-	SessionID string `json:"sessionId"`
-	RequestID string `json:"requestId,omitempty"`
-	Timestamp time.Time `json:"timestamp"`
-	Type string `json:"type"`
-	UserType string `json:"userType,omitempty"`
-	IsSidechain bool `json:"isSidechain,omitempty"`
-	CWD string `json:"cwd"`
-	GitBranch string `json:"gitBranch,omitempty"`
-	Version string `json:"version"`
-	Slug string `json:"slug"`
-	Message claudeRawContent `json:"message"`
-}
-
-// claudeRawContent is the nested content envelope inside a claudeRawMessage.
-//
-// Fields:
-// - ID: Content block identifier
-// - Type: Content type discriminator
-// - Model: AI model used for this response
-// - Role: Message role ("user" or "assistant")
-// - Content: Raw JSON that may be a string or []claudeRawBlock
-// - StopReason: Why the model stopped generating
-// - StopSequence: Stop sequence that was hit, if any
-// - Usage: Token usage statistics for this message
-type claudeRawContent struct {
-	ID string `json:"id"`
-	Type string `json:"type"`
-	Model string `json:"model,omitempty"`
-	Role string `json:"role"`
-	Content json.RawMessage `json:"content"`
-	StopReason *string `json:"stop_reason,omitempty"`
-	StopSequence *string `json:"stop_sequence,omitempty"`
-	Usage *claudeRawUsage `json:"usage,omitempty"`
-}
-
-// claudeRawBlock represents a single content block in a Claude response.
-//
-// The Type field discriminates between text, thinking, tool_use, and
-// tool_result blocks. Only fields relevant to the block type are populated.
-//
-// Fields:
-// - Type: Block type ("text", "thinking", "tool_use", "tool_result")
-// - Text: Text content (for text blocks)
-// - Thinking: Reasoning content (for thinking blocks)
-// - Signature: Cryptographic signature (for thinking blocks)
-// - ID: Block identifier (for tool_use blocks)
-// - Name: Tool name (for tool_use blocks)
-// - Input: Raw JSON tool parameters (for tool_use blocks)
-// - ToolUseID: References the tool_use block (for tool_result blocks)
-// - Content: Raw JSON tool output (for tool_result blocks)
-// - IsError: True if tool execution failed (for tool_result blocks)
-type claudeRawBlock struct {
-	Type string `json:"type"`
-	Text string `json:"text,omitempty"`
-	Thinking string `json:"thinking,omitempty"`
-	Signature string `json:"signature,omitempty"`
-	ID string `json:"id,omitempty"`
-	Name string `json:"name,omitempty"`
-	Input json.RawMessage `json:"input,omitempty"`
-	ToolUseID string `json:"tool_use_id,omitempty"`
-	Content json.RawMessage `json:"content,omitempty"`
-	IsError bool `json:"is_error,omitempty"`
-}
-
-// claudeRawUsage contains token usage statistics from the Claude API.
-//
-// Fields:
-// - InputTokens: Number of input tokens consumed
-// - OutputTokens: Number of output tokens generated
-// - CacheCreationInputTokens: Tokens used to create prompt cache
-// - CacheReadInputTokens: Tokens read from prompt cache
-type claudeRawUsage struct {
-	InputTokens int `json:"input_tokens"`
-	OutputTokens int `json:"output_tokens"`
-	CacheCreationInputTokens int `json:"cache_creation_input_tokens,omitempty"`
-	CacheReadInputTokens int `json:"cache_read_input_tokens,omitempty"`
-}
diff --git a/internal/recall/parser/session.go b/internal/recall/parser/session.go
index 43e4f47b..445e9117 100644
--- a/internal/recall/parser/session.go
+++ b/internal/recall/parser/session.go
@@ -8,6 +8,27 @@ package parser
 
 import "time"
 
+// SessionParser defines the interface for tool-specific session parsers.
+//
+// Each AI tool (Claude Code, Aider, Cursor) implements this interface
+// to parse its specific format into the common Session type.
+type SessionParser interface {
+	// ParseFile reads a session file and returns all sessions found.
+	// A single file may contain multiple sessions (grouped by session ID).
+	ParseFile(path string) ([]*Session, error)
+
+	// ParseLine parses a single line from a session file.
+	// Returns nil if the line should be skipped (e.g., non-message lines).
+	ParseLine(line []byte) (*Message, string, error) // message, sessionID, error
+
+	// Matches returns true if this parser can handle the given file.
+	// Implementations may check file extension, peek at content, etc.
+	Matches(path string) bool
+
+	// Tool returns the tool identifier (e.g., "claude-code", "aider").
+	Tool() string
+}
+
 // Session represents a reconstructed conversation session.
 //
 // This is the tool-agnostic output type that all parsers produce.
diff --git a/internal/recall/parser/types.go b/internal/recall/parser/types.go
index 7eb7fda8..94c89555 100644
--- a/internal/recall/parser/types.go
+++ b/internal/recall/parser/types.go
@@ -11,6 +11,148 @@
 // Session output type with tool-specific parsers (e.g., ClaudeCodeParser).
 package parser
 
+import (
+	"encoding/json"
+	"time"
+)
+
+// Claude Code JSONL raw types.
+//
+// These types mirror the on-disk JSONL format produced by Claude Code.
+// Each line in a Claude Code session file is a self-contained JSON object
+// that deserializes into claudeRawMessage.
+
+// claudeRawMessage represents a single JSONL line from a Claude Code session.
+//
+// Fields:
+// - UUID: Unique message identifier
+// - ParentUUID: Parent message for threading (nil for root messages)
+// - SessionID: Groups messages into a single session
+// - RequestID: API request correlation identifier
+// - Timestamp: When the message was created
+// - Type: Message role ("user", "assistant", or system types)
+// - UserType: Sub-type for user messages
+// - IsSidechain: True if the message is on a sidechain branch
+// - CWD: Working directory at message time
+// - GitBranch: Active git branch at message time
+// - Version: Claude Code version that created the message
+// - Slug: URL-friendly session identifier (removed in newer versions)
+// - Message: Nested content payload
+type claudeRawMessage struct {
+	UUID string `json:"uuid"`
+	ParentUUID *string `json:"parentUuid"`
+	SessionID string `json:"sessionId"`
+	RequestID string `json:"requestId,omitempty"`
+	Timestamp time.Time `json:"timestamp"`
+	Type string `json:"type"`
+	UserType string `json:"userType,omitempty"`
+	IsSidechain bool `json:"isSidechain,omitempty"`
+	CWD string `json:"cwd"`
+	GitBranch string `json:"gitBranch,omitempty"`
+	Version string `json:"version"`
+	Slug string `json:"slug"`
+	Message claudeRawContent `json:"message"`
+}
+
+// claudeRawContent is the nested content envelope inside a claudeRawMessage.
+//
+// Fields:
+// - ID: Content block identifier
+// - Type: Content type discriminator
+// - Model: AI model used for this response
+// - Role: Message role ("user" or "assistant")
+// - Content: Raw JSON that may be a string or []claudeRawBlock
+// - StopReason: Why the model stopped generating
+// - StopSequence: Stop sequence that was hit, if any
+// - Usage: Token usage statistics for this message
+type claudeRawContent struct {
+	ID string `json:"id"`
+	Type string `json:"type"`
+	Model string `json:"model,omitempty"`
+	Role string `json:"role"`
+	Content json.RawMessage `json:"content"`
+	StopReason *string `json:"stop_reason,omitempty"`
+	StopSequence *string `json:"stop_sequence,omitempty"`
+	Usage *claudeRawUsage `json:"usage,omitempty"`
+}
+
+// claudeRawBlock represents a single content block in a Claude response.
+//
+// The Type field discriminates between text, thinking, tool_use, and
+// tool_result blocks. Only fields relevant to the block type are populated.
+//
+// Fields:
+// - Type: Block type ("text", "thinking", "tool_use", "tool_result")
+// - Text: Text content (for text blocks)
+// - Thinking: Reasoning content (for thinking blocks)
+// - Signature: Cryptographic signature (for thinking blocks)
+// - ID: Block identifier (for tool_use blocks)
+// - Name: Tool name (for tool_use blocks)
+// - Input: Raw JSON tool parameters (for tool_use blocks)
+// - ToolUseID: References the tool_use block (for tool_result blocks)
+// - Content: Raw JSON tool output (for tool_result blocks)
+// - IsError: True if tool execution failed (for tool_result blocks)
+type claudeRawBlock struct {
+	Type string `json:"type"`
+	Text string `json:"text,omitempty"`
+	Thinking string `json:"thinking,omitempty"`
+	Signature string `json:"signature,omitempty"`
+	ID string `json:"id,omitempty"`
+	Name string `json:"name,omitempty"`
+	Input json.RawMessage `json:"input,omitempty"`
+	ToolUseID string `json:"tool_use_id,omitempty"`
+	Content json.RawMessage `json:"content,omitempty"`
+	IsError bool `json:"is_error,omitempty"`
+}
+
+// claudeRawUsage contains token usage statistics from the Claude API.
+//
+// Fields:
+// - InputTokens: Number of input tokens consumed
+// - OutputTokens: Number of output tokens generated
+// - CacheCreationInputTokens: Tokens used to create prompt cache
+// - CacheReadInputTokens: Tokens read from prompt cache
+type claudeRawUsage struct {
+	InputTokens int `json:"input_tokens"`
+	OutputTokens int `json:"output_tokens"`
+	CacheCreationInputTokens int `json:"cache_creation_input_tokens,omitempty"`
+	CacheReadInputTokens int `json:"cache_read_input_tokens,omitempty"`
+}
+
+// Message represents a single message in a session.
+//
+// This is tool-agnostic - all parsers normalize to this format.
+//
+// Fields:
+//
+// Identity:
+// - ID: Unique message identifier
+// - Timestamp: When the message was created
+// - Role: Message role ("user" or "assistant")
+//
+// Content:
+// - Text: Main text content
+// - Thinking: Reasoning content (if available)
+// - ToolUses: Tool invocations in this message
+// - ToolResults: Results from tool invocations
+//
+// Token Usage:
+// - TokensIn: Input tokens for this message (if available)
+// - TokensOut: Output tokens for this message (if available)
+type Message struct {
+	ID string `json:"id"`
+	Timestamp time.Time `json:"timestamp"`
+	Role string `json:"role"`
+
+	Text string `json:"text,omitempty"`
+	Thinking string `json:"thinking,omitempty"`
+	ToolUses []ToolUse `json:"tool_uses,omitempty"`
+	ToolResults []ToolResult `json:"tool_results,omitempty"`
+
+	TokensIn int `json:"tokens_in,omitempty"`
+	TokensOut int `json:"tokens_out,omitempty"`
+}
+
 // ToolUse represents a tool invocation by the assistant.
 //
 // Fields:
@@ -34,24 +176,3 @@ type ToolResult struct {
 	Content string `json:"content"`
 	IsError bool `json:"is_error,omitempty"`
 }
-
-// SessionParser defines the interface for tool-specific session parsers.
-//
-// Each AI tool (Claude Code, Aider, Cursor) implements this interface
-// to parse its specific format into the common Session type.
-type SessionParser interface {
-	// ParseFile reads a session file and returns all sessions found.
-	// A single file may contain multiple sessions (grouped by session ID).
-	ParseFile(path string) ([]*Session, error)
-
-	// ParseLine parses a single line from a session file.
-	// Returns nil if the line should be skipped (e.g., non-message lines).
-	ParseLine(line []byte) (*Message, string, error) // message, sessionID, error
-
-	// Matches returns true if this parser can handle the given file.
-	// Implementations may check file extension, peek at content, etc.
-	Matches(path string) bool
-
-	// Tool returns the tool identifier (e.g., "claude-code", "aider").
-	Tool() string
-}
diff --git a/internal/sysinfo/disk.go b/internal/sysinfo/disk.go
index bc04eb34..a7aa745c 100644
--- a/internal/sysinfo/disk.go
+++ b/internal/sysinfo/disk.go
@@ -10,6 +10,17 @@ package sysinfo
 
 import "syscall"
 
+// collectDisk queries filesystem statistics for the given mount path.
+//
+// Uses syscall.Statfs to obtain total and available block counts,
+// then converts to byte values. Returns a DiskInfo with Supported=false
+// if the statfs call fails (e.g. path does not exist).
+//
+// Parameters:
+// - path: Filesystem path to query (typically "/" or a mount point)
+//
+// Returns:
+// - DiskInfo: Disk usage statistics for the filesystem containing path
 func collectDisk(path string) DiskInfo {
 	var stat syscall.Statfs_t
 	if err := syscall.Statfs(path, &stat); err != nil {
diff --git a/internal/sysinfo/load_darwin.go b/internal/sysinfo/load_darwin.go
index ce1529c9..f34987de 100644
--- a/internal/sysinfo/load_darwin.go
+++ b/internal/sysinfo/load_darwin.go
@@ -15,6 +15,14 @@ import (
 	"strings"
 )
 
+// collectLoad queries system load averages on macOS via sysctl.
+//
+// Parses the output of `sysctl -n vm.loadavg` (format: "{ 0.52 0.41 0.38 }")
+// into 1-, 5-, and 15-minute load averages. Returns a LoadInfo with
+// Supported=false if the command fails or output cannot be parsed.
+//
+// Returns:
+// - LoadInfo: System load averages and CPU count
 func collectLoad() LoadInfo {
 	out, err := exec.Command("sysctl", "-n", "vm.loadavg").Output()
 	if err != nil {
diff --git a/internal/sysinfo/load_linux.go b/internal/sysinfo/load_linux.go
index f2f7083f..1beee441 100644
--- a/internal/sysinfo/load_linux.go
+++ b/internal/sysinfo/load_linux.go
@@ -15,6 +15,13 @@ import (
 	"runtime"
 )
 
+// collectLoad reads system load averages from /proc/loadavg on Linux.
+//
+// Returns a LoadInfo with Supported=false if /proc/loadavg cannot be
+// opened or its content cannot be parsed.
+//
+// Returns:
+// - LoadInfo: System load averages and CPU count
 func collectLoad() LoadInfo {
 	f, err := os.Open("/proc/loadavg")
 	if err != nil {
@@ -25,6 +32,15 @@
 }
 
 // parseLoadavg parses /proc/loadavg content into a LoadInfo struct.
+//
+// Expects space-separated 1-, 5-, and 15-minute load averages as the
+// first three fields. Returns Supported=false if parsing fails.
+//
+// Parameters:
+// - r: Reader providing /proc/loadavg content
+//
+// Returns:
+// - LoadInfo: Parsed load averages and CPU count
 func parseLoadavg(r io.Reader) LoadInfo {
 	var load1, load5, load15 float64
 	_, err := fmt.Fscanf(r, "%f %f %f", &load1, &load5, &load15)
diff --git a/internal/sysinfo/load_other.go b/internal/sysinfo/load_other.go
index 47e3176e..56994538 100644
--- a/internal/sysinfo/load_other.go
+++ b/internal/sysinfo/load_other.go
@@ -8,6 +8,10 @@
 
 package sysinfo
 
+// collectLoad is a no-op stub for unsupported platforms.
+//
+// Returns:
+// - LoadInfo: Always returns Supported=false
 func collectLoad() LoadInfo {
 	return LoadInfo{Supported: false}
 }
diff --git a/internal/sysinfo/memory_darwin.go b/internal/sysinfo/memory_darwin.go
index 3aedb399..05a1dca2 100644
--- a/internal/sysinfo/memory_darwin.go
+++ b/internal/sysinfo/memory_darwin.go
@@ -9,12 +9,21 @@
 package sysinfo
 
 import (
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/token"
+
 	"os/exec"
 	"strconv"
 	"strings"
 )
 
+// collectMemory queries physical and swap memory usage on macOS.
+//
+// Uses `sysctl -n hw.memsize` for total RAM, `vm_stat` for page-level
+// usage, and `sysctl -n vm.swapusage` for swap statistics. Returns a
+// MemInfo with Supported=false if the total memory cannot be determined.
+//
+// Returns:
+// - MemInfo: Physical and swap memory statistics
 func collectMemory() MemInfo {
 	// Total physical memory
 	out, err := exec.Command("sysctl", "-n", "hw.memsize").Output()
@@ -50,12 +59,22 @@
 }
 
 // parseVMStat extracts used memory from vm_stat output.
-// Used = Total - (free + inactive) * pageSize.
+//
+// Computes used bytes as Total - (free + inactive) * pageSize.
+// Defaults to 16384-byte pages (Apple Silicon) if page size is not
+// found in the output.
+//
+// Parameters:
+// - output: Raw output from the vm_stat command
+// - totalBytes: Total physical memory in bytes
+//
+// Returns:
+// - uint64: Estimated used memory in bytes
 func parseVMStat(output string, totalBytes uint64) uint64 {
 	var pageSize uint64 = 16384 // default on Apple Silicon
 	pages := make(map[string]uint64)
 
-	for _, line := range strings.Split(output, config.NewlineLF) {
+	for _, line := range strings.Split(output, token.NewlineLF) {
 		if strings.Contains(line, "page size of") {
 			for _, word := range strings.Fields(line) {
 				if n, err := strconv.ParseUint(word, 10, 64); err == nil && n > 0 {
@@ -84,7 +103,16 @@
 }
 
 // parseSwapUsage parses sysctl vm.swapusage output.
-// Format: "total = 2048.00M used = 123.45M free = 1924.55M (encrypted)"
+//
+// Expected format: "total = 2048.00M used = 123.45M free = 1924.55M (encrypted)"
+// Values are parsed as megabytes and converted to bytes.
+//
+// Parameters:
+// - output: Raw output from `sysctl -n vm.swapusage`
+//
+// Returns:
+// - total: Total swap space in bytes
+// - used: Used swap space in bytes
 func parseSwapUsage(output string) (total, used uint64) {
 	parseMB := func(s string) uint64 {
 		s = strings.TrimSuffix(strings.TrimSpace(s), "M")
diff --git a/internal/sysinfo/memory_linux.go b/internal/sysinfo/memory_linux.go
index 03045461..07c75e00 100644
--- a/internal/sysinfo/memory_linux.go
+++ b/internal/sysinfo/memory_linux.go
@@ -16,6 +16,12 @@ import (
 	"strings"
 )
 
+// collectMemory reads physical and swap memory usage from /proc/meminfo on Linux.
+//
+// Returns a MemInfo with Supported=false if /proc/meminfo cannot be opened.
+//
+// Returns:
+// - MemInfo: Physical and swap memory statistics
 func collectMemory() MemInfo {
 	f, err := os.Open("/proc/meminfo")
 	if err != nil {
@@ -26,7 +32,16 @@
 }
 
 // parseMeminfo parses /proc/meminfo content into a MemInfo struct.
-// Exported-in-tests via parse_linux_test.go.
+//
+// Reads key-value pairs in "Key: value kB" format. Used memory is
+// computed as Total - Available (with a fallback to Free + Buffers +
+// Cached for kernels before 3.14 that lack MemAvailable).
+//
+// Parameters:
+// - r: Reader providing /proc/meminfo content
+//
+// Returns:
+// - MemInfo: Parsed memory and swap statistics
 func parseMeminfo(r io.Reader) MemInfo {
 	vals := make(map[string]uint64)
 	scanner := bufio.NewScanner(r)
diff --git a/internal/sysinfo/memory_other.go b/internal/sysinfo/memory_other.go
index 87456754..e2de39b7 100644
--- a/internal/sysinfo/memory_other.go
+++ b/internal/sysinfo/memory_other.go
@@ -8,6 +8,10 @@
 
 package sysinfo
 
+// collectMemory is a no-op stub for unsupported platforms.
+//
+// Returns:
+// - MemInfo: Always returns Supported=false
 func collectMemory() MemInfo {
 	return MemInfo{Supported: false}
 }
diff --git a/internal/sysinfo/resources.go b/internal/sysinfo/resources.go
index ae9c1554..a80c401e 100644
--- a/internal/sysinfo/resources.go
+++ b/internal/sysinfo/resources.go
@@ -80,11 +80,11 @@ func Collect(path string) Snapshot {
 // MaxSeverity returns the highest severity among the given alerts.
 // Returns SeverityOK when the slice is empty.
 func MaxSeverity(alerts []ResourceAlert) Severity {
-	max := SeverityOK
+	highest := SeverityOK
 	for _, a := range alerts {
-		if a.Severity > max {
-			max = a.Severity
+		if a.Severity > highest {
+			highest = a.Severity
 		}
 	}
-	return max
+	return highest
 }
diff --git a/internal/sysinfo/threshold.go b/internal/sysinfo/threshold.go
index 441b8d95..2562d8c6 100644
--- a/internal/sysinfo/threshold.go
+++ b/internal/sysinfo/threshold.go
@@ -6,81 +6,89 @@
 
 package sysinfo
 
-import "fmt"
+import (
+	"fmt"
+
+	"github.com/ActiveMemory/ctx/internal/assets"
+	"github.com/ActiveMemory/ctx/internal/config/stats"
+)
 
 // Evaluate checks a snapshot against resource thresholds and returns any
 // alerts. Unsupported or zero-total resources are silently skipped.
+//
+// Thresholds:
+// - Memory: WARNING >= 80%, DANGER >= 90%
+// - Swap: WARNING >= 50%, DANGER >= 75%
+// - Disk: WARNING >= 85%, DANGER >= 95%
+// - Load: WARNING >= 0.8x CPUs, DANGER >= 1.5x CPUs
+//
+// Parameters:
+// - snap: System resource snapshot to evaluate
+//
+// Returns:
+// - []ResourceAlert: Alerts for any resources exceeding thresholds
 func Evaluate(snap Snapshot) []ResourceAlert {
 	var alerts []ResourceAlert
 
-	// Memory: WARNING >= 80%, DANGER >= 90%
+	// Memory
 	if snap.Memory.Supported && snap.Memory.TotalBytes > 0 {
 		pct := percent(snap.Memory.UsedBytes, snap.Memory.TotalBytes)
-		if pct >= 90 {
+		msg := fmt.Sprintf(assets.TextDesc(assets.TextDescKeyResourcesAlertMemory),
+			pct, FormatGiB(snap.Memory.UsedBytes), FormatGiB(snap.Memory.TotalBytes))
+		if pct >= stats.ThresholdMemoryDangerPct {
 			alerts = append(alerts, ResourceAlert{
-				Severity: SeverityDanger,
-				Resource: "memory",
-				Message:  fmt.Sprintf("Memory %.0f%% used (%s / %s GB)", pct, FormatGiB(snap.Memory.UsedBytes), FormatGiB(snap.Memory.TotalBytes)),
+				Severity: SeverityDanger, Resource: "memory", Message: msg,
 			})
-		} else if pct >= 80 {
+		} else if pct >= stats.ThresholdMemoryWarnPct {
 			alerts = append(alerts, ResourceAlert{
-				Severity: SeverityWarning,
-				Resource: "memory",
-				Message:  fmt.Sprintf("Memory %.0f%% used (%s / %s GB)", pct, FormatGiB(snap.Memory.UsedBytes), FormatGiB(snap.Memory.TotalBytes)),
+				Severity: SeverityWarning, Resource: "memory", Message: msg,
 			})
 		}
 	}
 
-	// Swap: WARNING >= 50%, DANGER >= 75%
+	// Swap
 	if snap.Memory.Supported && snap.Memory.SwapTotalBytes > 0 {
 		pct := percent(snap.Memory.SwapUsedBytes, snap.Memory.SwapTotalBytes)
-		if pct >= 75 {
+		msg := fmt.Sprintf(assets.TextDesc(assets.TextDescKeyResourcesAlertSwap),
+			pct, FormatGiB(snap.Memory.SwapUsedBytes), FormatGiB(snap.Memory.SwapTotalBytes))
+		if pct >= stats.ThresholdSwapDangerPct {
 			alerts = append(alerts, ResourceAlert{
-				Severity: SeverityDanger,
-				Resource: "swap",
-				Message:  fmt.Sprintf("Swap %.0f%% used (%s / %s GB)", pct, FormatGiB(snap.Memory.SwapUsedBytes), FormatGiB(snap.Memory.SwapTotalBytes)),
+				Severity: SeverityDanger, Resource: "swap", Message: msg,
 			})
-		} else if pct >= 50 {
+		} else if pct >= stats.ThresholdSwapWarnPct {
 			alerts = append(alerts, ResourceAlert{
-				Severity: SeverityWarning,
-				Resource: "swap",
-				Message:  fmt.Sprintf("Swap %.0f%% used (%s / %s GB)", pct, FormatGiB(snap.Memory.SwapUsedBytes), FormatGiB(snap.Memory.SwapTotalBytes)),
+				Severity: SeverityWarning, Resource: "swap", Message: msg,
 			})
 		}
 	}
 
-	// Disk: WARNING >= 85%, DANGER >= 95%
+	// Disk
 	if snap.Disk.Supported && snap.Disk.TotalBytes > 0 {
 		pct := percent(snap.Disk.UsedBytes, snap.Disk.TotalBytes)
-		if pct >= 95 {
+		msg := fmt.Sprintf(assets.TextDesc(assets.TextDescKeyResourcesAlertDisk),
+			pct, FormatGiB(snap.Disk.UsedBytes), FormatGiB(snap.Disk.TotalBytes))
+		if pct >= stats.ThresholdDiskDangerPct {
 			alerts = append(alerts, ResourceAlert{
-				Severity: SeverityDanger,
-				Resource: "disk",
-				Message:  fmt.Sprintf("Disk %.0f%% used (%s / %s GB)", pct, FormatGiB(snap.Disk.UsedBytes), FormatGiB(snap.Disk.TotalBytes)),
+				Severity: SeverityDanger, Resource: "disk", Message: msg,
 			})
-		} else if pct >= 85 {
+		} else if pct >= stats.ThresholdDiskWarnPct {
 			alerts = append(alerts, ResourceAlert{
-				Severity: SeverityWarning,
-				Resource: "disk",
-				Message:  fmt.Sprintf("Disk %.0f%% used (%s / %s GB)", pct, FormatGiB(snap.Disk.UsedBytes), FormatGiB(snap.Disk.TotalBytes)),
+				Severity: SeverityWarning, Resource: "disk", Message: msg,
 			})
 		}
 	}
 
-	// Load (1m): WARNING >= 0.8x CPUs, DANGER >= 1.5x CPUs
+	// Load (1m)
 	if snap.Load.Supported && snap.Load.NumCPU > 0 {
 		ratio := snap.Load.Load1 / float64(snap.Load.NumCPU)
-		if ratio >= 1.5 {
+		msg := fmt.Sprintf(assets.TextDesc(assets.TextDescKeyResourcesAlertLoad), ratio)
+		if ratio >= stats.ThresholdLoadDangerRatio {
 			alerts = append(alerts, ResourceAlert{
-				Severity: SeverityDanger,
-				Resource: "load",
-				Message:  fmt.Sprintf("Load %.2fx CPU count", ratio),
+				Severity: SeverityDanger, Resource: "load", Message: msg,
 			})
-		} else if ratio >= 0.8 {
+		} else if ratio >= stats.ThresholdLoadWarnRatio {
 			alerts = append(alerts, ResourceAlert{
-				Severity: SeverityWarning,
-				Resource: "load",
-				Message:  fmt.Sprintf("Load %.2fx CPU count", ratio),
+				Severity: SeverityWarning, Resource: "load", Message: msg,
 			})
 		}
 	}
@@ -89,14 +97,30 @@ func Evaluate(snap Snapshot) []ResourceAlert {
 }
 
 // FormatGiB formats bytes as a GiB value with one decimal place (e.g. "14.7").
+//
+// Parameters:
+// - bytes: Value in bytes to format
+//
+// Returns:
+// - string: Formatted GiB string (e.g. "14.7")
 func FormatGiB(bytes uint64) string {
-	gib := float64(bytes) / (1 << 30)
+	gib := float64(bytes) / stats.ThresholdBytesPerGiB
 	return fmt.Sprintf("%.1f", gib)
 }
 
+// percent computes the percentage of used relative to total.
+//
+// Returns 0 when total is zero to avoid division by zero.
+//
+// Parameters:
+// - used: Numerator value
+// - total: Denominator value
+//
+// Returns:
+// - float64: Percentage (0-100)
 func percent(used, total uint64) float64 {
 	if total == 0 {
 		return 0
 	}
-	return float64(used) / float64(total) * 100
+	return float64(used) / float64(total) * stats.PercentMultiplier
 }
diff --git a/internal/task/task.go b/internal/task/task.go
index 3e9c3117..eab260d5 100644
--- a/internal/task/task.go
+++ b/internal/task/task.go
@@ -10,7 +10,10 @@
 // their Markdown representation.
 package task
 
-import "github.com/ActiveMemory/ctx/internal/config"
+import (
+	"github.com/ActiveMemory/ctx/internal/config/archive"
+	"github.com/ActiveMemory/ctx/internal/config/marker"
+)
 
 // Match indices for accessing capture groups.
 //
@@ -40,7 +43,7 @@ func Completed(match []string) bool {
 	if len(match) <= MatchState {
 		return false
 	}
-	return match[MatchState] == config.MarkTaskComplete
+	return match[MatchState] == marker.MarkTaskComplete
 }
 
 // Pending reports whether a match represents a pending task.
@@ -54,7 +57,7 @@ func Pending(match []string) bool {
 	if len(match) <= MatchState {
 		return false
 	}
-	return match[MatchState] != config.MarkTaskComplete
+	return match[MatchState] != marker.MarkTaskComplete
 }
 
 // Indent returns the leading whitespace from a match.
@@ -93,5 +96,5 @@ func Content(match []string) string {
 // Returns:
 // - bool: True if indent is 2+ spaces
 func SubTask(match []string) bool {
-	return len(Indent(match)) >= 2
+	return len(Indent(match)) >= archive.SubTaskMinIndent
 }
diff --git a/internal/task/task_test.go b/internal/task/task_test.go
index a1024fa5..25d7c2e1 100644
--- a/internal/task/task_test.go
+++ b/internal/task/task_test.go
@@ -9,7 +9,7 @@ package task
 import (
 	"testing"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/regex"
 )
 
 func TestCompleted(t *testing.T) {
@@ -47,7 +47,7 @@ func TestCompleted(t *testing.T) {
 
 	for _, tt := range tests {
 		t.Run(tt.name, func(t *testing.T) {
-			match := config.RegExTask.FindStringSubmatch(tt.line)
+			match := regex.Task.FindStringSubmatch(tt.line)
 			if match == nil {
 				t.Fatalf("line did not match task pattern: %q", tt.line)
 			}
@@ -102,7 +102,7 @@ func TestIsPending(t *testing.T) {
 
 	for _, tt := range tests {
 		t.Run(tt.name, func(t *testing.T) {
-			match := config.RegExTask.FindStringSubmatch(tt.line)
+			match := regex.Task.FindStringSubmatch(tt.line)
 			if match == nil {
 				t.Fatalf("line did not match task pattern: %q", tt.line)
 			}
@@ -153,7 +153,7 @@ func TestIndent(t *testing.T) {
 
 	for _, tt := range tests {
 		t.Run(tt.name, func(t *testing.T) {
-			match := config.RegExTask.FindStringSubmatch(tt.line)
+			match := regex.Task.FindStringSubmatch(tt.line)
 			if match == nil {
 				t.Fatalf("line did not match task pattern: %q", tt.line)
 			}
@@ -204,7 +204,7 @@ func TestContent(t *testing.T) {
 
 	for _, tt := range tests {
 		t.Run(tt.name, func(t *testing.T) {
-			match := config.RegExTask.FindStringSubmatch(tt.line)
+			match := regex.Task.FindStringSubmatch(tt.line)
 			if match == nil {
 				t.Fatalf("line did not match task pattern: %q", tt.line)
 			}
@@ -260,7 +260,7 @@ func TestIsSubTask(t *testing.T) {
 
 	for _, tt := range tests {
 		t.Run(tt.name, func(t *testing.T) {
-			match := config.RegExTask.FindStringSubmatch(tt.line)
+			match := regex.Task.FindStringSubmatch(tt.line)
 			if match == nil {
 				t.Fatalf("line did not match task pattern: %q", tt.line)
 			}
@@ -275,7 +275,7 @@ func TestMatchConstants(t *testing.T) {
 	// Verify match indices work correctly
 	line := " - [x] Task content here"
-	match := config.RegExTask.FindStringSubmatch(line)
+	match := regex.Task.FindStringSubmatch(line)
 	if match == nil {
 		t.Fatal("line did not match task pattern")
 	}
diff --git a/internal/validation/path.go b/internal/validation/path.go
index e54ebc2c..95970002 100644
--- a/internal/validation/path.go
+++ b/internal/validation/path.go
@@ -85,6 +85,55 @@ func SafeReadFile(baseDir, filename string) ([]byte, error) {
 	return data, nil
 }
 
+// OpenUserFile opens a file at a user-provided path for reading.
+//
+// Use this instead of raw os.Open when the path comes directly from
+// user input. Centralises the gosec suppression.
+//
+// Parameters:
+// - path: user-provided file path.
+//
+// Returns:
+// - *os.File: open file handle (caller must close).
+// - error: non-nil on open failure.
+func OpenUserFile(path string) (*os.File, error) {
+	clean := filepath.Clean(path)
+	return os.Open(clean) //nolint:gosec // user-provided path is intentional
+}
+
+// ReadUserFile reads a file at a user-provided path.
+//
+// Use this instead of raw os.ReadFile when the path comes directly from
+// user input (CLI argument, flag, or interactive prompt). Centralises
+// the gosec suppression so call sites stay clean.
+//
+// Parameters:
+// - path: user-provided file path.
+//
+// Returns:
+// - []byte: file content.
+// - error: non-nil on read failure.
+func ReadUserFile(path string) ([]byte, error) {
+	clean := filepath.Clean(path)
+	return os.ReadFile(clean) //nolint:gosec // user-provided path is intentional
+}
+
+// WriteFile writes data to a cleaned file path.
+//
+// This centralises the gosec suppression for WriteFile calls where the
+// path is constructed internally but flagged by the linter.
+//
+// Parameters:
+// - path: file path (will be cleaned).
+// - data: content to write.
+// - perm: file permission bits.
+//
+// Returns:
+// - error: non-nil on write failure.
+func WriteFile(path string, data []byte, perm os.FileMode) error {
+	return os.WriteFile(filepath.Clean(path), data, perm) //nolint:gosec // path is internally constructed
+}
+
 // CheckSymlinks checks whether dir itself or any of its immediate children
 // are symlinks. Returns an error describing the first symlink found.
 func CheckSymlinks(dir string) error {
diff --git a/internal/validation/validate.go b/internal/validation/validate.go
index bbe33828..16c94c4e 100644
--- a/internal/validation/validate.go
+++ b/internal/validation/validate.go
@@ -9,7 +9,10 @@ package validation
 import (
 	"strings"
 
-	"github.com/ActiveMemory/ctx/internal/config"
+	"github.com/ActiveMemory/ctx/internal/config/file"
+	"github.com/ActiveMemory/ctx/internal/config/regex"
+	"github.com/ActiveMemory/ctx/internal/config/session"
+	"github.com/ActiveMemory/ctx/internal/config/token"
 )
 
 // SanitizeFilename converts a topic string to a safe filename component.
@@ -24,17 +27,17 @@ import ( // - string: Safe filename component (lowercase, hyphenated, max 50 chars) func SanitizeFilename(s string) string { // Replace spaces and special chars with hyphens - s = config.RegExNonFileNameChar.ReplaceAllString(s, "-") + s = regex.FileNameChar.ReplaceAllString(s, "-") // Remove leading/trailing hyphens - s = strings.Trim(s, "-") + s = strings.Trim(s, token.Dash) // Convert to lowercase s = strings.ToLower(s) // Limit length - if len(s) > 50 { - s = s[:50] + if len(s) > file.MaxNameLen { + s = s[:file.MaxNameLen] } if s == "" { - s = config.DefaultSessionFilename + s = session.DefaultSessionFilename } return s } diff --git a/internal/write/add.go b/internal/write/add/add.go similarity index 99% rename from internal/write/add.go rename to internal/write/add/add.go index d08c0fcb..0a707ae6 100644 --- a/internal/write/add.go +++ b/internal/write/add/add.go @@ -4,7 +4,7 @@ // \ Copyright 2026-present Context contributors. // SPDX-License-Identifier: Apache-2.0 -package write +package add import ( "fmt" diff --git a/internal/write/add/doc.go b/internal/write/add/doc.go new file mode 100644 index 00000000..ad4d1497 --- /dev/null +++ b/internal/write/add/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package add provides formatted output helpers for the add command. +package add diff --git a/internal/write/backup/backup.go b/internal/write/backup/backup.go new file mode 100644 index 00000000..42af40d9 --- /dev/null +++ b/internal/write/backup/backup.go @@ -0,0 +1,34 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +package backup + +import ( + "fmt" + + "github.com/ActiveMemory/ctx/internal/write" + "github.com/ActiveMemory/ctx/internal/write/config" + "github.com/spf13/cobra" +) + +// BackupResultLine prints a single backup result with optional SMB destination. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - scope: backup scope label (e.g. "project", "global"). +// - archive: archive file path. +// - size: archive size in bytes. +// - smbDest: optional SMB destination (empty string skips). +func BackupResultLine(cmd *cobra.Command, scope, archive string, size int64, smbDest string) { + if cmd == nil { + return + } + line := fmt.Sprintf(config.TplBackupResult, scope, archive, write.FormatBytes(size)) + if smbDest != "" { + line += fmt.Sprintf(config.TplBackupSMBDest, smbDest) + } + cmd.Println(line) +} diff --git a/internal/write/backup/doc.go b/internal/write/backup/doc.go new file mode 100644 index 00000000..3f2b6ea6 --- /dev/null +++ b/internal/write/backup/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package backup provides formatted output helpers for the backup command. +package backup diff --git a/internal/write/bootstrap/bootstrap.go b/internal/write/bootstrap/bootstrap.go new file mode 100644 index 00000000..4ecc99b1 --- /dev/null +++ b/internal/write/bootstrap/bootstrap.go @@ -0,0 +1,87 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package bootstrap + +import ( + "encoding/json" + "fmt" + + "github.com/ActiveMemory/ctx/internal/write/config" + "github.com/spf13/cobra" +) + +// BootstrapJSONOutput is the JSON output structure for the bootstrap command. 
+type BootstrapJSONOutput struct { + ContextDir string `json:"context_dir"` + Files []string `json:"files"` + Rules []string `json:"rules"` + NextSteps []string `json:"next_steps"` + Warnings []string `json:"warnings,omitempty"` +} + +// BootstrapText prints the human-readable bootstrap output to stdout. +// +// Parameters: +// - cmd: Cobra command whose stdout stream receives the output. +// - dir: absolute path to the context directory. +// - fileList: pre-formatted, wrapped file list string. +// - rules: ordered rule strings (numbered automatically). +// - nextSteps: ordered next-step strings (numbered automatically). +// - warning: optional warning string (empty string skips). +func BootstrapText(cmd *cobra.Command, dir string, fileList string, rules []string, nextSteps []string, warning string) { + cmd.Println(config.TplBootstrapTitle) + cmd.Println(config.TplBootstrapSep) + cmd.Println() + cmd.Println(fmt.Sprintf(config.TplBootstrapDir, dir)) + cmd.Println() + cmd.Println(config.TplBootstrapFiles) + cmd.Println(fileList) + cmd.Println() + cmd.Println(config.TplBootstrapRules) + for i, r := range rules { + cmd.Println(fmt.Sprintf(config.TplBootstrapNumbered, i+1, r)) + } + cmd.Println() + cmd.Println(config.TplBootstrapNextSteps) + for i, s := range nextSteps { + cmd.Println(fmt.Sprintf(config.TplBootstrapNumbered, i+1, s)) + } + + if warning != "" { + cmd.Println() + cmd.Println(fmt.Sprintf(config.TplBootstrapWarning, warning)) + } +} + +// BootstrapJSON prints the JSON bootstrap output to stdout. +// +// Parameters: +// - cmd: Cobra command whose stdout stream receives the output. +// - dir: absolute path to the context directory. +// - files: list of context file names. +// - rules: list of rule strings. +// - nextSteps: list of next-step strings. +// - warning: optional warning string (empty string omits warnings). +// +// Returns: +// - error: non-nil if JSON encoding fails. 
+func BootstrapJSON(cmd *cobra.Command, dir string, files []string, rules []string, nextSteps []string, warning string) error { + out := BootstrapJSONOutput{ + ContextDir: dir, + Files: files, + Rules: rules, + NextSteps: nextSteps, + } + + if warning != "" { + out.Warnings = []string{warning} + } + + enc := json.NewEncoder(cmd.OutOrStdout()) + enc.SetIndent("", " ") + return enc.Encode(out) +} diff --git a/internal/write/bootstrap/doc.go b/internal/write/bootstrap/doc.go new file mode 100644 index 00000000..9830fd71 --- /dev/null +++ b/internal/write/bootstrap/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package bootstrap provides formatted output helpers for the bootstrap command. +package bootstrap diff --git a/internal/write/config.go b/internal/write/config.go deleted file mode 100644 index 40804fd2..00000000 --- a/internal/write/config.go +++ /dev/null @@ -1,60 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package write - -// prefixError is prepended to all error messages written to stderr. -const prefixError = "Error: " - -// tplPathExists is a format template for reporting that a destination path -// already exists. Arguments: original path, resolved destination path. -const tplPathExists = " %s -> %s (exists)" - -// tplExistsWritingAsAlternative is a format template for reporting that a -// file exists and content was written to an alternative filename instead. -// Arguments: original path, alternative path. -const tplExistsWritingAsAlternative = " ! %s exists, writing as %s" - -// tplDryRun is printed when a command runs in dry-run mode. -const tplDryRun = "Dry run — no files will be written." - -// tplSource is a format template for reporting a source path. -// Arguments: path. 
-const tplSource = " Source: %s" - -// tplMirror is a format template for reporting a mirror path. -// Arguments: relative mirror path. -const tplMirror = " Mirror: %s" - -// tplStatusDrift is printed when drift is detected. -const tplStatusDrift = " Status: drift detected (source is newer)" - -// tplStatusNoDrift is printed when no drift is detected. -const tplStatusNoDrift = " Status: no drift" - -// tplArchived is a format template for reporting an archived file. -// Arguments: archive filename. -const tplArchived = "Archived previous mirror to %s" - -// tplSynced is a format template for reporting a successful sync. -// Arguments: source label, destination relative path. -const tplSynced = "Synced %s -> %s" - -// tplLines is a format template for reporting line counts. -// Arguments: line count. -const tplLines = " Lines: %d" - -// tplLinesPrevious is a format template appended to line counts when a -// previous count is available. Arguments: previous line count. -const tplLinesPrevious = " (was %d)" - -// tplNewContent is a format template for reporting new content since last sync. -// Arguments: line count. -const tplNewContent = " New content: %d lines since last sync" - -// tplAddedTo is a format template for confirming an entry was added. -// Arguments: filename. -const tplAddedTo = "✓ Added to %s" diff --git a/internal/write/config/bootstrap.go b/internal/write/config/bootstrap.go new file mode 100644 index 00000000..9351a4ea --- /dev/null +++ b/internal/write/config/bootstrap.go @@ -0,0 +1,7 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package config diff --git a/internal/write/config/config.go b/internal/write/config/config.go new file mode 100644 index 00000000..6c9ee23c --- /dev/null +++ b/internal/write/config/config.go @@ -0,0 +1,1016 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? 
+// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package config + +import ( + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/config/time" +) + +// TplBootstrapTitle is the heading for bootstrap output. +var TplBootstrapTitle = assets.TextDesc(assets.TextDescKeyWriteBootstrapTitle) + +// TplBootstrapSep is the visual separator under the bootstrap heading. +var TplBootstrapSep = assets.TextDesc(assets.TextDescKeyWriteBootstrapSep) + +// TplBootstrapDir is a format template for the context directory. +// Arguments: context directory path. +var TplBootstrapDir = assets.TextDesc(assets.TextDescKeyWriteBootstrapDir) + +// TplBootstrapFiles is the heading for the file list section. +var TplBootstrapFiles = assets.TextDesc(assets.TextDescKeyWriteBootstrapFiles) + +// TplBootstrapRules is the heading for the rules section. +var TplBootstrapRules = assets.TextDesc(assets.TextDescKeyWriteBootstrapRules) + +// TplBootstrapNextSteps is the heading for the next steps section. +var TplBootstrapNextSteps = assets.TextDesc(assets.TextDescKeyWriteBootstrapNextSteps) + +// TplBootstrapNumbered is a format template for a numbered list item. +// Arguments: index, text. +var TplBootstrapNumbered = assets.TextDesc(assets.TextDescKeyWriteBootstrapNumbered) + +// TplBootstrapWarning is a format template for a warning line. +// Arguments: warning text. +var TplBootstrapWarning = assets.TextDesc(assets.TextDescKeyWriteBootstrapWarning) + +// PrefixError is prepended to all error messages written to stderr. +var PrefixError = assets.TextDesc(assets.TextDescKeyWritePrefixError) + +// TplPathExists is a format template for reporting that a destination path +// already exists. Arguments: original path, resolved destination path. 
+var TplPathExists = assets.TextDesc(assets.TextDescKeyWritePathExists) + +// TplExistsWritingAsAlternative is a format template for reporting that a +// file exists and content was written to an alternative filename instead. +// Arguments: original path, alternative path. +var TplExistsWritingAsAlternative = assets.TextDesc(assets.TextDescKeyWriteExistsWritingAsAlternative) + +// TplDryRun is printed when a command runs in dry-run mode. +var TplDryRun = assets.TextDesc(assets.TextDescKeyWriteDryRun) + +// TplSource is a format template for reporting a source path. +// Arguments: path. +var TplSource = assets.TextDesc(assets.TextDescKeyWriteSource) + +// TplMirror is a format template for reporting a mirror path. +// Arguments: relative mirror path. +var TplMirror = assets.TextDesc(assets.TextDescKeyWriteMirror) + +// TplStatusDrift is printed when drift is detected. +var TplStatusDrift = assets.TextDesc(assets.TextDescKeyWriteStatusDrift) + +// TplStatusNoDrift is printed when no drift is detected. +var TplStatusNoDrift = assets.TextDesc(assets.TextDescKeyWriteStatusNoDrift) + +// TplArchived is a format template for reporting an archived file. +// Arguments: archive filename. +var TplArchived = assets.TextDesc(assets.TextDescKeyWriteArchived) + +// TplSynced is a format template for reporting a successful sync. +// Arguments: source label, destination relative path. +var TplSynced = assets.TextDesc(assets.TextDescKeyWriteSynced) + +// TplLines is a format template for reporting line counts. +// Arguments: line count. +var TplLines = assets.TextDesc(assets.TextDescKeyWriteLines) + +// TplLinesPrevious is a format template appended to line counts when a +// previous count is available. Arguments: previous line count. +var TplLinesPrevious = assets.TextDesc(assets.TextDescKeyWriteLinesPrevious) + +// TplNewContent is a format template for reporting new content since last sync. +// Arguments: line count. 
+var TplNewContent = assets.TextDesc(assets.TextDescKeyWriteNewContent) + +// TplAddedTo is a format template for confirming an entry was added. +// Arguments: filename. +var TplAddedTo = assets.TextDesc(assets.TextDescKeyWriteAddedTo) + +// TplMovingTask is a format template for a task being moved to completed. +// Arguments: truncated task text. +var TplMovingTask = assets.TextDesc(assets.TextDescKeyWriteMovingTask) + +// TplCompletedTask is a format template for a task marked complete. +// Arguments: task text. +var TplCompletedTask = assets.TextDesc(assets.TextDescKeyWriteCompletedTask) + +// TplConfigProfileDev is the status output for dev profile. +var TplConfigProfileDev = assets.TextDesc(assets.TextDescKeyWriteConfigProfileDev) + +// TplConfigProfileBase is the status output for base profile. +var TplConfigProfileBase = assets.TextDesc(assets.TextDescKeyWriteConfigProfileBase) + +// TplConfigProfileNone is the status output when no profile exists. +// Arguments: ctxrc filename. +var TplConfigProfileNone = assets.TextDesc(assets.TextDescKeyWriteConfigProfileNone) + +// TplDepsNoProject is printed when no supported project is detected. +var TplDepsNoProject = assets.TextDesc(assets.TextDescKeyWriteDepsNoProject) + +// TplDepsLookingFor is printed with the list of files checked. +var TplDepsLookingFor = assets.TextDesc(assets.TextDescKeyWriteDepsLookingFor) + +// TplDepsUseType hints at the --type flag. +// Arguments: comma-separated list of builder names. +var TplDepsUseType = assets.TextDesc(assets.TextDescKeyWriteDepsUseType) + +// TplDepsNoDeps is printed when no dependencies are found. +var TplDepsNoDeps = assets.TextDesc(assets.TextDescKeyWriteDepsNoDeps) + +// TplSkillsHeader is the heading for the skills list. +var TplSkillsHeader = assets.TextDesc(assets.TextDescKeyWriteSkillsHeader) + +// TplSkillLine formats a single skill entry. +// Arguments: name, description. 
+var TplSkillLine = assets.TextDesc(assets.TextDescKeyWriteSkillLine) + +// TplHookCopilotSkipped reports that copilot instructions were skipped. +// Arguments: target file path. +var TplHookCopilotSkipped = assets.TextDesc(assets.TextDescKeyWriteHookCopilotSkipped) + +// TplHookCopilotForceHint tells the user about the --force flag. +var TplHookCopilotForceHint = assets.TextDesc(assets.TextDescKeyWriteHookCopilotForceHint) + +// TplHookCopilotMerged reports that copilot instructions were merged. +// Arguments: target file path. +var TplHookCopilotMerged = assets.TextDesc(assets.TextDescKeyWriteHookCopilotMerged) + +// TplHookCopilotCreated reports that copilot instructions were created. +// Arguments: target file path. +var TplHookCopilotCreated = assets.TextDesc(assets.TextDescKeyWriteHookCopilotCreated) + +// TplHookCopilotSessionsDir reports that the sessions directory was created. +// Arguments: sessions directory path. +var TplHookCopilotSessionsDir = assets.TextDesc(assets.TextDescKeyWriteHookCopilotSessionsDir) + +// TplHookCopilotSummary is the post-write summary for copilot. +var TplHookCopilotSummary = assets.TextDesc(assets.TextDescKeyWriteHookCopilotSummary) + +// TplHookUnknownTool reports an unrecognized tool name. +// Arguments: tool name. +var TplHookUnknownTool = assets.TextDesc(assets.TextDescKeyWriteHookUnknownTool) + +// TplInitOverwritePrompt prompts the user before overwriting .context/. +// Arguments: context directory path. +var TplInitOverwritePrompt = assets.TextDesc(assets.TextDescKeyWriteInitOverwritePrompt) + +// TplInitAborted is printed when the user declines overwriting. +var TplInitAborted = assets.TextDesc(assets.TextDescKeyWriteInitAborted) + +// TplInitExistsSkipped reports a file that was skipped because it exists. +// Arguments: filename. +var TplInitExistsSkipped = assets.TextDesc(assets.TextDescKeyWriteInitExistsSkipped) + +// TplInitFileCreated reports a file that was successfully created. +// Arguments: filename. 
+var TplInitFileCreated = assets.TextDesc(assets.TextDescKeyWriteInitFileCreated) + +// TplInitialized reports successful context initialization. +// Arguments: context directory path. +var TplInitialized = assets.TextDesc(assets.TextDescKeyWriteInitialized) + +// TplInitWarnNonFatal reports a non-fatal warning during init. +// Arguments: label, error. +var TplInitWarnNonFatal = assets.TextDesc(assets.TextDescKeyWriteInitWarnNonFatal) + +// TplInitScratchpadPlaintext reports a plaintext scratchpad was created. +// Arguments: path. +var TplInitScratchpadPlaintext = assets.TextDesc(assets.TextDescKeyWriteInitScratchpadPlaintext) + +// TplInitScratchpadNoKey warns about a missing key for an encrypted scratchpad. +// Arguments: key path. +var TplInitScratchpadNoKey = assets.TextDesc(assets.TextDescKeyWriteInitScratchpadNoKey) + +// TplInitScratchpadKeyCreated reports a scratchpad key was generated. +// Arguments: key path. +var TplInitScratchpadKeyCreated = assets.TextDesc(assets.TextDescKeyWriteInitScratchpadKeyCreated) + +// TplInitCreatingRootFiles is the heading before project root file creation. +var TplInitCreatingRootFiles = assets.TextDesc(assets.TextDescKeyWriteInitCreatingRootFiles) + +// TplInitSettingUpPermissions is the heading before permissions setup. +var TplInitSettingUpPermissions = assets.TextDesc(assets.TextDescKeyWriteInitSettingUpPermissions) + +// TplInitGitignoreUpdated reports .gitignore entries were added. +// Arguments: count of entries added. +var TplInitGitignoreUpdated = assets.TextDesc(assets.TextDescKeyWriteInitGitignoreUpdated) + +// TplInitGitignoreReview hints how to review the .gitignore changes. +var TplInitGitignoreReview = assets.TextDesc(assets.TextDescKeyWriteInitGitignoreReview) + +// TplInitNextSteps is the next-steps guidance block after init completes. +var TplInitNextSteps = assets.TextDesc(assets.TextDescKeyWriteInitNextSteps) + +// TplInitPluginInfo is the plugin installation guidance block. 
+var TplInitPluginInfo = assets.TextDesc(assets.TextDescKeyWriteInitPluginInfo) + +// TplInitPluginNote is the note about local plugin enabling. +var TplInitPluginNote = assets.TextDesc(assets.TextDescKeyWriteInitPluginNote) + +// TplInitCtxContentExists reports a file skipped because ctx content exists. +// Arguments: path. +var TplInitCtxContentExists = assets.TextDesc( + assets.TextDescKeyWriteInitCtxContentExists, +) + +// TplInitMerged reports a file merged during init. +// Arguments: path. +var TplInitMerged = assets.TextDesc(assets.TextDescKeyWriteInitMerged) + +// TplInitBackup reports a backup file created. +// Arguments: backup path. +var TplInitBackup = assets.TextDesc(assets.TextDescKeyWriteInitBackup) + +// TplInitUpdatedCtxSection reports a file whose ctx section was updated. +// Arguments: path. +var TplInitUpdatedCtxSection = assets.TextDesc( + assets.TextDescKeyWriteInitUpdatedCtxSection, +) + +// TplInitUpdatedPlanSection reports a file whose plan section was updated. +// Arguments: path. +var TplInitUpdatedPlanSection = assets.TextDesc( + assets.TextDescKeyWriteInitUpdatedPlanSection, +) + +// TplInitUpdatedPromptSection reports a file whose prompt section was updated. +// Arguments: path. +var TplInitUpdatedPromptSection = assets.TextDesc( + assets.TextDescKeyWriteInitUpdatedPromptSection, +) + +// TplInitFileExistsNoCtx reports a file exists without ctx content. +// Arguments: path. +var TplInitFileExistsNoCtx = assets.TextDesc( + assets.TextDescKeyWriteInitFileExistsNoCtx, +) + +// TplInitNoChanges reports a settings file with no changes needed. +// Arguments: path. +var TplInitNoChanges = assets.TextDesc(assets.TextDescKeyWriteInitNoChanges) + +// TplInitPermsMergedDeduped reports permissions merged and deduped. +// Arguments: path. +var TplInitPermsMergedDeduped = assets.TextDesc( + assets.TextDescKeyWriteInitPermsMergedDeduped, +) + +// TplInitPermsDeduped reports duplicate permissions removed. +// Arguments: path. 
+var TplInitPermsDeduped = assets.TextDesc( + assets.TextDescKeyWriteInitPermsDeduped, +) + +// TplInitPermsAllowDeny reports allow+deny permissions added. +// Arguments: path. +var TplInitPermsAllowDeny = assets.TextDesc( + assets.TextDescKeyWriteInitPermsAllowDeny, +) + +// TplInitPermsDeny reports deny permissions added. +// Arguments: path. +var TplInitPermsDeny = assets.TextDesc(assets.TextDescKeyWriteInitPermsDeny) + +// TplInitPermsAllow reports ctx permissions added. +// Arguments: path. +var TplInitPermsAllow = assets.TextDesc(assets.TextDescKeyWriteInitPermsAllow) + +// TplInitMakefileCreated is printed when a new Makefile is created. +var TplInitMakefileCreated = assets.TextDesc( + assets.TextDescKeyWriteInitMakefileCreated, +) + +// TplInitMakefileIncludes reports Makefile already includes the directive. +// Arguments: filename. +var TplInitMakefileIncludes = assets.TextDesc( + assets.TextDescKeyWriteInitMakefileIncludes, +) + +// TplInitMakefileAppended reports an include appended to Makefile. +// Arguments: filename. +var TplInitMakefileAppended = assets.TextDesc( + assets.TextDescKeyWriteInitMakefileAppended, +) + +// TplInitPluginSkipped is printed when plugin enablement is skipped. +var TplInitPluginSkipped = assets.TextDesc( + assets.TextDescKeyWriteInitPluginSkipped, +) + +// TplInitPluginAlreadyEnabled is printed when plugin is already enabled. +var TplInitPluginAlreadyEnabled = assets.TextDesc( + assets.TextDescKeyWriteInitPluginAlreadyEnabled, +) + +// TplInitPluginEnabled reports plugin enabled globally. +// Arguments: settings path. +var TplInitPluginEnabled = assets.TextDesc( + assets.TextDescKeyWriteInitPluginEnabled, +) + +// TplInitSkippedDir reports a directory skipped because it exists. +// Arguments: dir. +var TplInitSkippedDir = assets.TextDesc( + assets.TextDescKeyWriteInitSkippedDir, +) + +// TplInitCreatedDir reports a directory created during init. +// Arguments: dir. 
+var TplInitCreatedDir = assets.TextDesc( + assets.TextDescKeyWriteInitCreatedDir, +) + +// TplInitCreatedWith reports a file created with a qualifier. +// Arguments: path, qualifier. +var TplInitCreatedWith = assets.TextDesc( + assets.TextDescKeyWriteInitCreatedWith, +) + +// TplInitSkippedPlain reports a file skipped without detail. +// Arguments: path. +var TplInitSkippedPlain = assets.TextDesc( + assets.TextDescKeyWriteInitSkippedPlain, +) + +// TplObsidianGenerated reports successful Obsidian vault generation. +// Arguments: entry count, output directory. +var TplObsidianGenerated = assets.TextDesc( + assets.TextDescKeyWriteObsidianGenerated, +) + +// TplObsidianNextSteps is the post-generation guidance. +// Arguments: output directory. +var TplObsidianNextSteps = assets.TextDesc( + assets.TextDescKeyWriteObsidianNextSteps, +) + +// TplJournalOrphanRemoved reports a removed orphan file. +// Arguments: filename. +var TplJournalOrphanRemoved = assets.TextDesc( + assets.TextDescKeyWriteJournalOrphanRemoved, +) + +// TplJournalSiteGenerated reports successful site generation. +// Arguments: entry count, output directory. +var TplJournalSiteGenerated = assets.TextDesc( + assets.TextDescKeyWriteJournalSiteGenerated, +) + +// TplJournalSiteStarting reports the server is starting. +var TplJournalSiteStarting = assets.TextDesc( + assets.TextDescKeyWriteJournalSiteStarting, +) + +// TplJournalSiteBuilding reports a build is in progress. +var TplJournalSiteBuilding = assets.TextDesc( + assets.TextDescKeyWriteJournalSiteBuilding, +) + +// TplJournalSiteNextSteps shows post-generation guidance. +// Arguments: output directory, zensical binary name. +var TplJournalSiteNextSteps = assets.TextDesc( + assets.TextDescKeyWriteJournalSiteNextSteps, +) + +// TplJournalSiteAlt is the alternative command hint. +var TplJournalSiteAlt = assets.TextDesc( + assets.TextDescKeyWriteJournalSiteAlt, +) + +// TplLoopGenerated reports successful loop script generation. 
+// Arguments: output file path. +var TplLoopGenerated = assets.TextDesc( + assets.TextDescKeyWriteLoopGenerated, +) + +// TplLoopRunCmd shows how to run the generated script. +// Arguments: output file path. +var TplLoopRunCmd = assets.TextDesc( + assets.TextDescKeyWriteLoopRunCmd, +) + +// TplLoopTool shows the selected tool. +// Arguments: tool name. +var TplLoopTool = assets.TextDesc(assets.TextDescKeyWriteLoopTool) + +// TplLoopPrompt shows the prompt file. +// Arguments: prompt file path. +var TplLoopPrompt = assets.TextDesc(assets.TextDescKeyWriteLoopPrompt) + +// TplLoopMaxIterations shows the max iterations setting. +// Arguments: count. +var TplLoopMaxIterations = assets.TextDesc( + assets.TextDescKeyWriteLoopMaxIterations, +) + +// TplLoopUnlimited shows unlimited iterations. +var TplLoopUnlimited = assets.TextDesc(assets.TextDescKeyWriteLoopUnlimited) + +// TplLoopCompletion shows the completion signal. +// Arguments: signal string. +var TplLoopCompletion = assets.TextDesc(assets.TextDescKeyWriteLoopCompletion) + +// TplUnpublishNotFound reports no published block was found. +// Arguments: source filename. +var TplUnpublishNotFound = assets.TextDesc( + assets.TextDescKeyWriteUnpublishNotFound, +) + +// TplUnpublishDone reports the published block was removed. +// Arguments: source filename. +var TplUnpublishDone = assets.TextDesc(assets.TextDescKeyWriteUnpublishDone) + +// TplPublishHeader reports publishing has started. +var TplPublishHeader = assets.TextDesc(assets.TextDescKeyWritePublishHeader) + +// TplPublishSourceFiles lists the source files used for publishing. +var TplPublishSourceFiles = assets.TextDesc( + assets.TextDescKeyWritePublishSourceFiles, +) + +// TplPublishBudget reports the line budget. +// Arguments: budget. +var TplPublishBudget = assets.TextDesc(assets.TextDescKeyWritePublishBudget) + +// TplPublishBlock is the heading for the published block detail. 
+var TplPublishBlock = assets.TextDesc(assets.TextDescKeyWritePublishBlock) + +// TplPublishTasks reports pending tasks count. +// Arguments: count. +var TplPublishTasks = assets.TextDesc(assets.TextDescKeyWritePublishTasks) + +// TplPublishDecisions reports recent decisions count. +// Arguments: count. +var TplPublishDecisions = assets.TextDesc( + assets.TextDescKeyWritePublishDecisions, +) + +// TplPublishConventions reports key conventions count. +// Arguments: count. +var TplPublishConventions = assets.TextDesc( + assets.TextDescKeyWritePublishConventions, +) + +// TplPublishLearnings reports recent learnings count. +// Arguments: count. +var TplPublishLearnings = assets.TextDesc( + assets.TextDescKeyWritePublishLearnings, +) + +// TplPublishTotal reports the total line count within budget. +// Arguments: total lines, budget. +var TplPublishTotal = assets.TextDesc(assets.TextDescKeyWritePublishTotal) + +// TplPublishDryRun reports a publish dry run. +var TplPublishDryRun = assets.TextDesc(assets.TextDescKeyWritePublishDryRun) + +// TplPublishDone reports successful publishing with marker info. +var TplPublishDone = assets.TextDesc(assets.TextDescKeyWritePublishDone) + +// TplImportNoEntries reports no entries found in MEMORY.md. +var TplImportNoEntries = assets.TextDesc(assets.TextDescKeyWriteImportNoEntries) + +// TplImportScanning reports scanning has started. +// Arguments: source filename. +var TplImportScanning = assets.TextDesc(assets.TextDescKeyWriteImportScanning) + +// TplImportFound reports the number of entries found. +// Arguments: count. +var TplImportFound = assets.TextDesc(assets.TextDescKeyWriteImportFound) + +// TplImportEntry reports an entry being processed. +// Arguments: truncated title (already quoted). +var TplImportEntry = assets.TextDesc(assets.TextDescKeyWriteImportEntry) + +// TplImportClassifiedSkip reports an entry classified as skip. 
+var TplImportClassifiedSkip = assets.TextDesc( + assets.TextDescKeyWriteImportClassifiedSkip, +) + +// TplImportClassified reports an entry classification. +// Arguments: target file, comma-joined keywords. +var TplImportClassified = assets.TextDesc(assets.TextDescKeyWriteImportClassified) + +// TplImportAdded reports an entry added to a target file. +// Arguments: target filename. +var TplImportAdded = assets.TextDesc(assets.TextDescKeyWriteImportAdded) + +// TplImportSummaryDryRun is the dry-run summary prefix. +// Arguments: count. +var TplImportSummaryDryRun = assets.TextDesc( + assets.TextDescKeyWriteImportSummaryDryRun, +) + +// TplImportSummary is the import summary prefix. +// Arguments: count. +var TplImportSummary = assets.TextDesc(assets.TextDescKeyWriteImportSummary) + +// TplImportSkipped reports skipped entries. +// Arguments: count. +var TplImportSkipped = assets.TextDesc(assets.TextDescKeyWriteImportSkipped) + +// TplImportDuplicates reports duplicate entries. +// Arguments: count. +var TplImportDuplicates = assets.TextDesc(assets.TextDescKeyWriteImportDuplicates) + +// TplMemoryNoChanges reports no changes since last sync. +var TplMemoryNoChanges = assets.TextDesc(assets.TextDescKeyWriteMemoryNoChanges) + +// TplMemoryBridgeHeader is the heading for memory status output. +var TplMemoryBridgeHeader = assets.TextDesc( + assets.TextDescKeyWriteMemoryBridgeHeader, +) + +// TplMemorySourceNotActive reports that auto memory is not active. +var TplMemorySourceNotActive = assets.TextDesc( + assets.TextDescKeyWriteMemorySourceNotActive, +) + +// TplMemorySource is a format template for the source path. +// Arguments: path. +var TplMemorySource = assets.TextDesc(assets.TextDescKeyWriteMemorySource) + +// TplMemoryMirror is a format template for the mirror relative path. +// Arguments: relative path. +var TplMemoryMirror = assets.TextDesc(assets.TextDescKeyWriteMemoryMirror) + +// TplMemoryLastSync is a format template for the last sync time. 
+// Arguments: formatted time, human-readable duration. +var TplMemoryLastSync = assets.TextDesc(assets.TextDescKeyWriteMemoryLastSync) + +// TplMemoryLastSyncNever reports no sync has occurred. +var TplMemoryLastSyncNever = assets.TextDesc( + assets.TextDescKeyWriteMemoryLastSyncNever, +) + +// TplMemorySourceLines is a format template for MEMORY.md line count. +// Arguments: line count. +var TplMemorySourceLines = assets.TextDesc(assets.TextDescKeyWriteMemorySourceLines) + +// TplMemorySourceLinesDrift is a format template for MEMORY.md line count +// when drift is detected. Arguments: line count. +var TplMemorySourceLinesDrift = assets.TextDesc( + assets.TextDescKeyWriteMemorySourceLinesDrift, +) + +// TplMemoryMirrorLines is a format template for mirror line count. +// Arguments: line count. +var TplMemoryMirrorLines = assets.TextDesc( + assets.TextDescKeyWriteMemoryMirrorLines, +) + +// TplMemoryMirrorNotSynced reports the mirror has not been synced. +var TplMemoryMirrorNotSynced = assets.TextDesc( + assets.TextDescKeyWriteMemoryMirrorNotSynced, +) + +// TplMemoryDriftDetected reports drift was detected. +var TplMemoryDriftDetected = assets.TextDesc( + assets.TextDescKeyWriteMemoryDriftDetected, +) + +// TplMemoryDriftNone reports no drift. +var TplMemoryDriftNone = assets.TextDesc(assets.TextDescKeyWriteMemoryDriftNone) + +// TplMemoryArchives is a format template for archive snapshot count. +// Arguments: count, archive directory name. +var TplMemoryArchives = assets.TextDesc(assets.TextDescKeyWriteMemoryArchives) + +// TplPadEntryAdded is a format template for pad entry confirmation. +// Arguments: entry number. +var TplPadEntryAdded = assets.TextDesc(assets.TextDescKeyWritePadEntryAdded) + +// TplPadEntryUpdated is a format template for pad entry update confirmation. +// Arguments: entry number. +var TplPadEntryUpdated = assets.TextDesc(assets.TextDescKeyWritePadEntryUpdated) + +// TplPadExportPlan is a format template for a dry-run export line. 
+// Arguments: label, output path. +var TplPadExportPlan = assets.TextDesc(assets.TextDescKeyWritePadExportPlan) + +// TplPadExportDone is a format template for a successfully exported blob. +// Arguments: label. +var TplPadExportDone = assets.TextDesc(assets.TextDescKeyWritePadExportDone) + +// TplPadExportWriteFailed is a format template for a failed blob write (stderr). +// Arguments: label, error. +var TplPadExportWriteFailed = assets.TextDesc( + assets.TextDescKeyWritePadExportWriteFailed, +) + +// TplPadExportNone is the message when no blob entries exist to export. +var TplPadExportNone = assets.TextDesc(assets.TextDescKeyWritePadExportNone) + +// TplPadExportSummary is a format template for the export summary. +// Arguments: verb ("Exported"/"Would export"), count. +var TplPadExportSummary = assets.TextDesc(assets.TextDescKeyWritePadExportSummary) + +// TplPadExportVerbDone is the past-tense verb for export summary. +var TplPadExportVerbDone = assets.TextDesc(assets.TextDescKeyWritePadExportVerbDone) + +// TplPadExportVerbDryRun is the dry-run verb for export summary. +var TplPadExportVerbDryRun = assets.TextDesc( + assets.TextDescKeyWritePadExportVerbDryRun, +) + +// TplPadImportNone is the message when no entries were found to import. +var TplPadImportNone = assets.TextDesc(assets.TextDescKeyWritePadImportNone) + +// TplPadImportDone is a format template for successful line import. +// Arguments: count. +var TplPadImportDone = assets.TextDesc(assets.TextDescKeyWritePadImportDone) + +// TplPadImportBlobAdded is a format template for a successfully imported blob. +// Arguments: filename. +var TplPadImportBlobAdded = assets.TextDesc( + assets.TextDescKeyWritePadImportBlobAdded, +) + +// TplPadImportBlobSkipped is a format template for a skipped blob (stderr). +// Arguments: filename, reason. 
+var TplPadImportBlobSkipped = assets.TextDesc( + assets.TextDescKeyWritePadImportBlobSkipped, +) + +// TplPadImportBlobTooLarge is a format template for a blob exceeding the size limit (stderr). +// Arguments: filename, max bytes. +var TplPadImportBlobTooLarge = assets.TextDesc( + assets.TextDescKeyWritePadImportBlobTooLarge, +) + +// TplPadImportBlobNone is the message when no files were found to import. +var TplPadImportBlobNone = assets.TextDesc( + assets.TextDescKeyWritePadImportBlobNone, +) + +// TplPadImportBlobSummary is a format template for blob import summary. +// Arguments: added count, skipped count. +var TplPadImportBlobSummary = assets.TextDesc( + assets.TextDescKeyWritePadImportBlobSummary, +) + +// TplPadImportCloseWarning is a format template for file close warning (stderr). +// Arguments: filename, error. +var TplPadImportCloseWarning = assets.TextDesc( + assets.TextDescKeyWritePadImportCloseWarning, +) + +// TplPaused is a format template for the pause confirmation. +// Arguments: session ID. +var TplPaused = assets.TextDesc(assets.TextDescKeyWritePaused) + +// TplRestoreNoLocal is the message when golden is restored with no local file. +var TplRestoreNoLocal = assets.TextDesc(assets.TextDescKeyWriteRestoreNoLocal) + +// TplRestoreMatch is the message when settings already match golden. +var TplRestoreMatch = assets.TextDesc(assets.TextDescKeyWriteRestoreMatch) + +// TplRestoreDroppedHeader is a format template for dropped permissions header. +// Arguments: count. +var TplRestoreDroppedHeader = assets.TextDesc( + assets.TextDescKeyWriteRestoreDroppedHeader, +) + +// TplRestoreRestoredHeader is a format template for restored permissions header. +// Arguments: count. +var TplRestoreRestoredHeader = assets.TextDesc( + assets.TextDescKeyWriteRestoreRestoredHeader, +) + +// TplRestoreDenyDroppedHeader is a format template for dropped deny rules header. +// Arguments: count. 
+var TplRestoreDenyDroppedHeader = assets.TextDesc( + assets.TextDescKeyWriteRestoreDenyDroppedHeader, +) + +// TplRestoreDenyRestoredHeader is a format template for restored deny rules header. +// Arguments: count. +var TplRestoreDenyRestoredHeader = assets.TextDesc( + assets.TextDescKeyWriteRestoreDenyRestoredHeader, +) + +// TplRestoreRemoved is a format template for a removed permission line. +// Arguments: permission string. +var TplRestoreRemoved = assets.TextDesc(assets.TextDescKeyWriteRestoreRemoved) + +// TplRestoreAdded is a format template for an added permission line. +// Arguments: permission string. +var TplRestoreAdded = assets.TextDesc(assets.TextDescKeyWriteRestoreAdded) + +// TplRestorePermMatch is the message when only non-permission settings differ. +var TplRestorePermMatch = assets.TextDesc(assets.TextDescKeyWriteRestorePermMatch) + +// TplRestoreDone is the message after successful restore. +var TplRestoreDone = assets.TextDesc(assets.TextDescKeyWriteRestoreDone) + +// TplSnapshotSaved is a format template for golden image save. +// Arguments: golden file path. +var TplSnapshotSaved = assets.TextDesc(assets.TextDescKeyWriteSnapshotSaved) + +// TplSnapshotUpdated is a format template for golden image update. +// Arguments: golden file path. +var TplSnapshotUpdated = assets.TextDesc(assets.TextDescKeyWriteSnapshotUpdated) + +// TplResumed is a format template for the resume confirmation. +// Arguments: session ID. +var TplResumed = assets.TextDesc(assets.TextDescKeyWriteResumed) + +// TplPadEmpty is the message when the scratchpad has no entries. +var TplPadEmpty = assets.TextDesc(assets.TextDescKeyWritePadEmpty) + +// TplPadKeyCreated is a format template for key creation notice (stderr). +// Arguments: key file path. +var TplPadKeyCreated = assets.TextDesc(assets.TextDescKeyWritePadKeyCreated) + +// TplPadBlobWritten is a format template for blob file write confirmation. +// Arguments: byte count, output path. 
+var TplPadBlobWritten = assets.TextDesc(assets.TextDescKeyWritePadBlobWritten) + +// TplPadEntryRemoved is a format template for pad entry removal confirmation. +// Arguments: entry number. +var TplPadEntryRemoved = assets.TextDesc(assets.TextDescKeyWritePadEntryRemoved) + +// TplPadResolveHeader is a format template for a conflict side header. +// Arguments: side label ("OURS"/"THEIRS"). +var TplPadResolveHeader = assets.TextDesc(assets.TextDescKeyWritePadResolveHeader) + +// TplPadResolveEntry is a format template for a numbered conflict entry. +// Arguments: 1-based index, display string. +var TplPadResolveEntry = assets.TextDesc(assets.TextDescKeyWritePadResolveEntry) + +// TplPadEntryMoved is a format template for pad entry move confirmation. +// Arguments: source position, destination position. +var TplPadEntryMoved = assets.TextDesc(assets.TextDescKeyWritePadEntryMoved) + +// TplPadMergeDupe is a format template for a duplicate entry during merge. +// Arguments: display string. +var TplPadMergeDupe = assets.TextDesc(assets.TextDescKeyWritePadMergeDupe) + +// TplPadMergeAdded is a format template for a newly added entry during merge. +// Arguments: display string, source file. +var TplPadMergeAdded = assets.TextDesc(assets.TextDescKeyWritePadMergeAdded) + +// TplPadMergeBlobConflict is a format template for a blob label conflict warning. +// Arguments: label. +var TplPadMergeBlobConflict = assets.TextDesc( + assets.TextDescKeyWritePadMergeBlobConflict, +) + +// TplPadMergeBinaryWarning is a format template for a binary data warning. +// Arguments: filename. +var TplPadMergeBinaryWarning = assets.TextDesc( + assets.TextDescKeyWritePadMergeBinaryWarning, +) + +// TplPadMergeNone is the message when no entries were found to merge. +var TplPadMergeNone = assets.TextDesc(assets.TextDescKeyWritePadMergeNone) + +// TplPadMergeNoneNew is a format template when all entries are duplicates. +// Arguments: dupe count, pluralized "duplicate". 
+var TplPadMergeNoneNew = assets.TextDesc(assets.TextDescKeyWritePadMergeNoneNew) + +// TplPadMergeDryRun is a format template for dry-run merge summary. +// Arguments: added count, pluralized "entry", dupe count, pluralized "duplicate". +var TplPadMergeDryRun = assets.TextDesc(assets.TextDescKeyWritePadMergeDryRun) + +// TplPadMergeDone is a format template for successful merge summary. +// Arguments: added count, pluralized "entry", dupe count, pluralized "duplicate". +var TplPadMergeDone = assets.TextDesc(assets.TextDescKeyWritePadMergeDone) + +// TplSetupPrompt is the interactive prompt for webhook URL entry. +var TplSetupPrompt = assets.TextDesc(assets.TextDescKeyWriteSetupPrompt) + +// TplSetupDone is a format template for successful webhook configuration. +// Arguments: masked URL, encrypted file path. +var TplSetupDone = assets.TextDesc(assets.TextDescKeyWriteSetupDone) + +// TplTestNoWebhook is the message when no webhook is configured. +var TplTestNoWebhook = assets.TextDesc(assets.TextDescKeyWriteTestNoWebhook) + +// TplTestFiltered is the notice when the test event is filtered. +var TplTestFiltered = assets.TextDesc(assets.TextDescKeyWriteTestFiltered) + +// TplTestResult is a format template for webhook test response. +// Arguments: HTTP status code, status text. +var TplTestResult = assets.TextDesc(assets.TextDescKeyWriteTestResult) + +// TplTestWorking is the success message after a webhook test. +// Arguments: encrypted file path. +var TplTestWorking = assets.TextDesc(assets.TextDescKeyWriteTestWorking) + +// TplPromptCreated is the confirmation after creating a prompt template. +// Arguments: prompt name. +var TplPromptCreated = assets.TextDesc(assets.TextDescKeyWritePromptCreated) + +// TplPromptNone is printed when no prompts are found. +var TplPromptNone = assets.TextDesc(assets.TextDescKeyWritePromptNone) + +// TplPromptItem is a format template for listing a prompt name. +// Arguments: prompt name. 
+var TplPromptItem = assets.TextDesc(assets.TextDescKeyWritePromptItem) + +// TplPromptRemoved is the confirmation after removing a prompt template. +// Arguments: prompt name. +var TplPromptRemoved = assets.TextDesc(assets.TextDescKeyWritePromptRemoved) + +// TplReminderAdded is the confirmation for a newly added reminder. +// Arguments: id, message, suffix (e.g. " (after 2026-03-10)" or ""). +var TplReminderAdded = assets.TextDesc(assets.TextDescKeyWriteReminderAdded) + +// TplReminderDismissed is the confirmation for a dismissed reminder. +// Arguments: id, message. +var TplReminderDismissed = assets.TextDesc(assets.TextDescKeyWriteReminderDismissed) + +// TplReminderNone is printed when there are no reminders. +var TplReminderNone = assets.TextDesc(assets.TextDescKeyWriteReminderNone) + +// TplReminderDismissedAll is the summary after dismissing all reminders. +// Arguments: count. +var TplReminderDismissedAll = assets.TextDesc( + assets.TextDescKeyWriteReminderDismissedAll, +) + +// TplReminderItem is a format template for listing a reminder. +// Arguments: id, message, annotation. +var TplReminderItem = assets.TextDesc(assets.TextDescKeyWriteReminderItem) + +// TplReminderNotDue is the annotation for reminders not yet due. +// Arguments: date string. +var TplReminderNotDue = assets.TextDesc(assets.TextDescKeyWriteReminderNotDue) + +// TplReminderAfterSuffix formats the date-gate suffix for a reminder. +// Arguments: date string. +var TplReminderAfterSuffix = assets.TextDesc( + assets.TextDescKeyWriteReminderAfterSuffix, +) + +// TplLockUnlockEntry is the confirmation for a single locked/unlocked entry. +// Arguments: filename, verb ("locked" or "unlocked"). +var TplLockUnlockEntry = assets.TextDesc(assets.TextDescKeyWriteLockUnlockEntry) + +// TplLockUnlockNoChanges is printed when all entries already have the target state. +// Arguments: verb. 
+var TplLockUnlockNoChanges = assets.TextDesc( + assets.TextDescKeyWriteLockUnlockNoChanges, +) + +// TplLockUnlockSummary is the summary after locking/unlocking entries. +// Arguments: capitalized verb, count. +var TplLockUnlockSummary = assets.TextDesc(assets.TextDescKeyWriteLockUnlockSummary) + +// TplBackupResult is a format template for a backup result line. +// Arguments: scope, archive path, formatted size. +var TplBackupResult = assets.TextDesc(assets.TextDescKeyWriteBackupResult) + +// TplBackupSMBDest is a format template for the SMB destination suffix. +// Arguments: SMB destination path. +var TplBackupSMBDest = assets.TextDesc(assets.TextDescKeyWriteBackupSMBDest) + +// TplStatusTitle is the heading for the status output. +var TplStatusTitle = assets.TextDesc(assets.TextDescKeyWriteStatusTitle) + +// TplStatusSeparator is the visual separator under the heading. +var TplStatusSeparator = assets.TextDesc(assets.TextDescKeyWriteStatusSeparator) + +// TplStatusDir is a format template for the context directory. +// Arguments: context directory path. +var TplStatusDir = assets.TextDesc(assets.TextDescKeyWriteStatusDir) + +// TplStatusFiles is a format template for the total file count. +// Arguments: count. +var TplStatusFiles = assets.TextDesc(assets.TextDescKeyWriteStatusFiles) + +// TplStatusTokens is a format template for the token estimate. +// Arguments: formatted token count. +var TplStatusTokens = assets.TextDesc(assets.TextDescKeyWriteStatusTokens) + +// TplStatusFilesHeader is the heading for the file list section. +var TplStatusFilesHeader = assets.TextDesc( + assets.TextDescKeyWriteStatusFilesHeader, +) + +// TplStatusFileVerbose is a format template for a verbose file entry. +// Arguments: indicator, name, status, formatted tokens, formatted size. +var TplStatusFileVerbose = assets.TextDesc( + assets.TextDescKeyWriteStatusFileVerbose, +) + +// TplStatusFileCompact is a format template for a compact file entry. 
+// Arguments: indicator, name, status. +var TplStatusFileCompact = assets.TextDesc( + assets.TextDescKeyWriteStatusFileCompact, +) + +// TplStatusPreviewLine is a format template for a content preview line. +// Arguments: line text. +var TplStatusPreviewLine = assets.TextDesc( + assets.TextDescKeyWriteStatusPreviewLine, +) + +// TplStatusActivityHeader is the heading for the recent activity section. +var TplStatusActivityHeader = assets.TextDesc( + assets.TextDescKeyWriteStatusActivityHeader, +) + +// TplStatusActivityItem is a format template for a recent activity entry. +// Arguments: filename, relative time string. +var TplStatusActivityItem = assets.TextDesc( + assets.TextDescKeyWriteStatusActivityItem, +) + +// TplTimeJustNow is the display string for "just now" relative time. +var TplTimeJustNow = assets.TextDesc(assets.TextDescKeyWriteTimeJustNow) + +// TplTimeMinuteAgo is the display string for "1 minute ago". +var TplTimeMinuteAgo = assets.TextDesc(assets.TextDescKeyWriteTimeMinuteAgo) + +// TplTimeMinutesAgo is a format template for minutes ago. +// Arguments: count. +var TplTimeMinutesAgo = assets.TextDesc(assets.TextDescKeyWriteTimeMinutesAgo) + +// TplTimeHourAgo is the display string for "1 hour ago". +var TplTimeHourAgo = assets.TextDesc(assets.TextDescKeyWriteTimeHourAgo) + +// TplTimeHoursAgo is a format template for hours ago. +// Arguments: count. +var TplTimeHoursAgo = assets.TextDesc(assets.TextDescKeyWriteTimeHoursAgo) + +// TplTimeDayAgo is the display string for "1 day ago". +var TplTimeDayAgo = assets.TextDesc(assets.TextDescKeyWriteTimeDayAgo) + +// TplTimeDaysAgo is a format template for days ago. +// Arguments: count. +var TplTimeDaysAgo = assets.TextDesc(assets.TextDescKeyWriteTimeDaysAgo) + +// TplTimeOlderFormat is the Go time layout for dates older than a week. +// Exported because callers must format the fallback date before calling FormatTimeAgo. +// +// Deprecated: Use config.OlderFormat instead. 
+const TplTimeOlderFormat = time.OlderFormat + +// TplSyncInSync is printed when context is fully in sync. +var TplSyncInSync = assets.TextDesc(assets.TextDescKeyWriteSyncInSync) + +// TplSyncHeader is the heading for the sync analysis output. +var TplSyncHeader = assets.TextDesc(assets.TextDescKeyWriteSyncHeader) + +// TplSyncSeparator is the visual separator under the heading. +var TplSyncSeparator = assets.TextDesc(assets.TextDescKeyWriteSyncSeparator) + +// TplSyncDryRun is printed when running in dry-run mode. +var TplSyncDryRun = assets.TextDesc(assets.TextDescKeyWriteSyncDryRun) + +// TplSyncAction is a format template for a sync action item. +// Arguments: index, type, description. +var TplSyncAction = assets.TextDesc(assets.TextDescKeyWriteSyncAction) + +// TplSyncSuggestion is a format template for a suggestion under an action. +// Arguments: suggestion text. +var TplSyncSuggestion = assets.TextDesc(assets.TextDescKeyWriteSyncSuggestion) + +// TplSyncDryRunSummary is a format template for dry-run summary. +// Arguments: count. +var TplSyncDryRunSummary = assets.TextDesc( + assets.TextDescKeyWriteSyncDryRunSummary, +) + +// TplSyncSummary is a format template for the sync summary. +// Arguments: count. +var TplSyncSummary = assets.TextDesc(assets.TextDescKeyWriteSyncSummary) + +// TplJournalSyncNone is printed when no journal entries are found. +var TplJournalSyncNone = assets.TextDesc(assets.TextDescKeyWriteJournalSyncNone) + +// TplJournalSyncLocked is a format template for a newly locked entry. +// Arguments: filename. +var TplJournalSyncLocked = assets.TextDesc( + assets.TextDescKeyWriteJournalSyncLocked, +) + +// TplJournalSyncUnlocked is a format template for a newly unlocked entry. +// Arguments: filename. +var TplJournalSyncUnlocked = assets.TextDesc( + assets.TextDescKeyWriteJournalSyncUnlocked, +) + +// TplJournalSyncMatch is printed when state already matches frontmatter. 
+var TplJournalSyncMatch = assets.TextDesc( + assets.TextDescKeyWriteJournalSyncMatch, +) + +// TplJournalSyncLockedCount is a format template for locked entry count. +// Arguments: count. +var TplJournalSyncLockedCount = assets.TextDesc( + assets.TextDescKeyWriteJournalSyncLockedCount, +) + +// TplJournalSyncUnlockedCount is a format template for unlocked entry count. +// Arguments: count. +var TplJournalSyncUnlockedCount = assets.TextDesc( + assets.TextDescKeyWriteJournalSyncUnlockedCount, +) diff --git a/internal/write/config/doc.go b/internal/write/config/doc.go new file mode 100644 index 00000000..20181fd0 --- /dev/null +++ b/internal/write/config/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package config provides formatted output helpers for the config command. +package config diff --git a/internal/write/errors.go b/internal/write/errors.go index 71b46374..b464f25a 100644 --- a/internal/write/errors.go +++ b/internal/write/errors.go @@ -7,6 +7,8 @@ package write import ( + "fmt" + "github.com/ActiveMemory/ctx/internal/write/config" "github.com/spf13/cobra" ) @@ -19,7 +21,7 @@ func ErrWithError(cmd *cobra.Command, err error) { if cmd == nil { return } - cmd.PrintErrln(prefixError, err) + cmd.PrintErrln(config.PrefixError, err) } // WarnFileErr prints a non-fatal file operation warning to stderr. @@ -32,5 +34,5 @@ func WarnFileErr(cmd *cobra.Command, path string, err error) { if cmd == nil { return } - sprintf(cmd, " ! %s: %v", path, err) + cmd.PrintErrln(fmt.Sprintf(" ! %s: %v", path, err)) } diff --git a/internal/write/export.go b/internal/write/export.go index 674846b5..02379d3d 100644 --- a/internal/write/export.go +++ b/internal/write/export.go @@ -13,22 +13,21 @@ import ( "github.com/spf13/cobra" ) -// ExportCounts holds aggregate counters for export summary output.
-type ExportCounts struct { - New int - Regen int - Skip int - Locked int -} - // ExportSummary prints what an export will (or would) do based on // aggregate counters. // // Parameters: // - cmd: Cobra command for output. Nil is a no-op. -// - counts: aggregate export counters. +// - newCount: number of new files to export. +// - regenCount: number of existing files to regenerate. +// - skipCount: number of existing files to skip. +// - lockedCount: number of locked files to skip. // - dryRun: when true, uses "Would" instead of "Will". -func ExportSummary(cmd *cobra.Command, counts ExportCounts, dryRun bool) { +func ExportSummary( + cmd *cobra.Command, + newCount, regenCount, skipCount, lockedCount int, + dryRun bool, +) { if cmd == nil { return } @@ -38,21 +37,21 @@ func ExportSummary(cmd *cobra.Command, counts ExportCounts, dryRun bool) { verb = "Would" } var parts []string - if counts.New > 0 { - parts = append(parts, fmt.Sprintf("export %d new", counts.New)) + if newCount > 0 { + parts = append(parts, fmt.Sprintf("export %d new", newCount)) } - if counts.Regen > 0 { - parts = append(parts, fmt.Sprintf("regenerate %d existing", counts.Regen)) + if regenCount > 0 { + parts = append(parts, fmt.Sprintf("regenerate %d existing", regenCount)) } - if counts.Skip > 0 { - parts = append(parts, fmt.Sprintf("skip %d existing", counts.Skip)) + if skipCount > 0 { + parts = append(parts, fmt.Sprintf("skip %d existing", skipCount)) } - if counts.Locked > 0 { - parts = append(parts, fmt.Sprintf("skip %d locked", counts.Locked)) + if lockedCount > 0 { + parts = append(parts, fmt.Sprintf("skip %d locked", lockedCount)) } if len(parts) == 0 { cmd.Println("Nothing to export.") return } - sprintf(cmd, "%s %s.", verb, strings.Join(parts, ", ")) + cmd.Println(fmt.Sprintf("%s %s.", verb, strings.Join(parts, ", "))) } diff --git a/internal/write/fmt.go b/internal/write/fmt.go new file mode 100644 index 00000000..d1aecd79 --- /dev/null +++ b/internal/write/fmt.go @@ -0,0 +1,88 @@ +// / 
ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package write + +import ( + "fmt" + + "github.com/ActiveMemory/ctx/internal/write/config" +) + +// FormatTimeAgo returns a human-readable relative time string. +// +// Examples: "just now", "5 minutes ago", "2 hours ago", "3 days ago", +// or a formatted date for times older than a week. +// +// Parameters: +// - hours: Hours elapsed since the event +// - mins: Whole minutes elapsed, used for sub-hour durations +// - fallbackDate: Formatted date string for durations older than a week +// +// Returns: +// - string: Human-readable relative time +func FormatTimeAgo(hours float64, mins int, fallbackDate string) string { + switch { + case hours < 1.0/60: // less than a minute + return config.TplTimeJustNow + case hours < 1: + if mins == 1 { + return config.TplTimeMinuteAgo + } + return fmt.Sprintf(config.TplTimeMinutesAgo, mins) + case hours < 24: + h := int(hours) + if h == 1 { + return config.TplTimeHourAgo + } + return fmt.Sprintf(config.TplTimeHoursAgo, h) + case hours < 7*24: + days := int(hours / 24) + if days == 1 { + return config.TplTimeDayAgo + } + return fmt.Sprintf(config.TplTimeDaysAgo, days) + default: + return fallbackDate + } +} + +// FormatNumber returns a number with thousand separators. +// +// Examples: 500 -> "500", 1500 -> "1,500", 1234567 -> "1,234,567" +// +// Parameters: +// - n: The number to format +// +// Returns: +// - string: Formatted number with commas +func FormatNumber(n int) string { + if n < 1000 { + return fmt.Sprintf("%d", n) + } + // Recurse on the leading digit groups so every thousand boundary + // gets a separator, including values of one million and above. + return FormatNumber(n/1000) + fmt.Sprintf(",%03d", n%1000) +} + +// FormatBytes returns a human-readable byte-size string. +// +// Uses binary units (1024-based): B, KB, MB, GB, etc.
+// +// Parameters: +// - b: The byte count to format +// +// Returns: +// - string: Human-readable size with unit +func FormatBytes(b int64) string { + const unit = 1024 + if b < unit { + return fmt.Sprintf("%d B", b) + } + div, exp := int64(unit), 0 + for n := b / unit; n >= unit; n /= unit { + div *= unit + exp++ + } + return fmt.Sprintf("%.1f %cB", float64(b)/float64(div), "KMGTPE"[exp]) +} diff --git a/internal/write/hook.go b/internal/write/hook.go new file mode 100644 index 00000000..eac089c1 --- /dev/null +++ b/internal/write/hook.go @@ -0,0 +1,98 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package write + +import ( + "fmt" + "github.com/ActiveMemory/ctx/internal/write/config" + "github.com/spf13/cobra" +) + +// HookNudge prints a pre-built nudge box to stdout. +// +// Used by system hooks to emit nudge messages through the write layer +// rather than calling cmd.Println directly. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - nudgeBox: fully formatted nudge box string. +func HookNudge(cmd *cobra.Command, nudgeBox string) { + if cmd == nil { + return + } + cmd.Println(nudgeBox) +} + +// InfoHookTool prints a tool integration section to stdout. +// +// The content is a pre-formatted multi-line text block loaded from +// commands.yaml. A trailing newline is not added — the content is +// expected to include its own formatting. +// +// Parameters: +// - cmd: Cobra command for output +// - content: Pre-formatted text block +func InfoHookTool(cmd *cobra.Command, content string) { + cmd.Print(content) +} + +// InfoHookCopilotSkipped reports that copilot instructions were skipped +// because the ctx marker already exists in the target file. 
+// +// Parameters: +// - cmd: Cobra command for output +// - targetFile: Path to the existing file +func InfoHookCopilotSkipped(cmd *cobra.Command, targetFile string) { + cmd.Println(fmt.Sprintf(config.TplHookCopilotSkipped, targetFile)) + cmd.Println(config.TplHookCopilotForceHint) +} + +// InfoHookCopilotMerged reports that copilot instructions were merged +// into an existing file. +// +// Parameters: +// - cmd: Cobra command for output +// - targetFile: Path to the merged file +func InfoHookCopilotMerged(cmd *cobra.Command, targetFile string) { + cmd.Println(fmt.Sprintf(config.TplHookCopilotMerged, targetFile)) +} + +// InfoHookCopilotCreated reports that copilot instructions were created. +// +// Parameters: +// - cmd: Cobra command for output +// - targetFile: Path to the created file +func InfoHookCopilotCreated(cmd *cobra.Command, targetFile string) { + cmd.Println(fmt.Sprintf(config.TplHookCopilotCreated, targetFile)) +} + +// InfoHookCopilotSessionsDir reports that the sessions directory was created. +// +// Parameters: +// - cmd: Cobra command for output +// - sessionsDir: Path to the sessions directory +func InfoHookCopilotSessionsDir(cmd *cobra.Command, sessionsDir string) { + cmd.Println(fmt.Sprintf(config.TplHookCopilotSessionsDir, sessionsDir)) +} + +// InfoHookCopilotSummary prints the post-write summary for copilot. +// +// Parameters: +// - cmd: Cobra command for output +func InfoHookCopilotSummary(cmd *cobra.Command) { + cmd.Println() + cmd.Println(config.TplHookCopilotSummary) +} + +// InfoHookUnknownTool prints the unknown tool message. 
+// +// Parameters: +// - cmd: Cobra command for output +// - tool: The unrecognized tool name +func InfoHookUnknownTool(cmd *cobra.Command, tool string) { + cmd.Println(fmt.Sprintf(config.TplHookUnknownTool, tool)) +} diff --git a/internal/write/import.go b/internal/write/import.go new file mode 100644 index 00000000..b444f506 --- /dev/null +++ b/internal/write/import.go @@ -0,0 +1,167 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package write + +import ( + "fmt" + "strings" + + "github.com/ActiveMemory/ctx/internal/write/config" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// ImportNoEntries prints that no entries were found in the source file. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - filename: source file name (e.g. "MEMORY.md"). +func ImportNoEntries(cmd *cobra.Command, filename string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplImportNoEntries, filename)) +} + +// ImportScanHeader prints the scanning header: source name, entry count, +// and a trailing blank line. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - filename: source file name being scanned. +// - count: number of entries discovered. +func ImportScanHeader(cmd *cobra.Command, filename string, count int) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplImportScanning, filename)) + cmd.Println(fmt.Sprintf(config.TplImportFound, count)) + cmd.Println() +} + +// ImportEntrySkipped prints a skipped entry block: title, "skip" +// classification, and a trailing blank line. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - title: truncated entry title. 
+func ImportEntrySkipped(cmd *cobra.Command, title string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplImportEntry, title)) + cmd.Println(config.TplImportClassifiedSkip) + cmd.Println() +} + +// ImportEntryClassified prints a classified entry block (dry run): +// title, target file with keywords, and a trailing blank line. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - title: truncated entry title. +// - targetFile: destination filename. +// - keywords: matched classification keywords. +func ImportEntryClassified(cmd *cobra.Command, title, targetFile string, keywords []string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplImportEntry, title)) + cmd.Println(fmt.Sprintf(config.TplImportClassified, targetFile, strings.Join(keywords, ", "))) + cmd.Println() +} + +// ImportEntryAdded prints a promoted entry block: title, target file, +// and a trailing blank line. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - title: truncated entry title. +// - targetFile: destination filename. +func ImportEntryAdded(cmd *cobra.Command, title, targetFile string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplImportEntry, title)) + cmd.Println(fmt.Sprintf(config.TplImportAdded, targetFile)) + cmd.Println() +} + +// ErrImportPromote prints a promotion error to stderr. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - targetFile: destination filename. +// - cause: the promotion error. +func ErrImportPromote(cmd *cobra.Command, targetFile string, cause error) { + if cmd == nil { + return + } + cmd.PrintErrln(fmt.Sprintf(" Error promoting to %s: %v", targetFile, cause)) +} + +// ImportCounts holds the per-type tallies for import summary output. 
+type ImportCounts struct { + Conventions int + Decisions int + Learnings int + Tasks int + Skipped int + Dupes int +} + +// ImportSummary prints the full import summary block: total with +// per-type breakdown, skipped count, and duplicate count. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - counts: aggregate import counters. +// - dryRun: whether this was a dry run. +func ImportSummary(cmd *cobra.Command, counts ImportCounts, dryRun bool) { + if cmd == nil { + return + } + + total := counts.Conventions + counts.Decisions + counts.Learnings + counts.Tasks + + var summary string + if dryRun { + summary = fmt.Sprintf(config.TplImportSummaryDryRun, total) + } else { + summary = fmt.Sprintf(config.TplImportSummary, total) + } + + var parts []string + if counts.Conventions > 0 { + parts = append(parts, fmt.Sprintf( + assets.TextDesc(assets.TextDescKeyImportCountConvention), counts.Conventions)) + } + if counts.Decisions > 0 { + parts = append(parts, fmt.Sprintf( + assets.TextDesc(assets.TextDescKeyImportCountDecision), counts.Decisions)) + } + if counts.Learnings > 0 { + parts = append(parts, fmt.Sprintf( + assets.TextDesc(assets.TextDescKeyImportCountLearning), counts.Learnings)) + } + if counts.Tasks > 0 { + parts = append(parts, fmt.Sprintf( + assets.TextDesc(assets.TextDescKeyImportCountTask), counts.Tasks)) + } + if len(parts) > 0 { + summary += fmt.Sprintf(" (%s)", strings.Join(parts, ", ")) + } + cmd.Println(summary) + + if counts.Skipped > 0 { + cmd.Println(fmt.Sprintf(config.TplImportSkipped, counts.Skipped)) + } + if counts.Dupes > 0 { + cmd.Println(fmt.Sprintf(config.TplImportDuplicates, counts.Dupes)) + } +} diff --git a/internal/write/info.go b/internal/write/info.go index 7bf63268..202686ee 100644 --- a/internal/write/info.go +++ b/internal/write/info.go @@ -7,9 +7,13 @@ package write import ( + "fmt" "path/filepath" + "github.com/ActiveMemory/ctx/internal/write/config" "github.com/spf13/cobra" + + 
"github.com/ActiveMemory/ctx/internal/assets" ) // InfoPathConversionExists reports that a path conversion target already @@ -27,7 +31,7 @@ func InfoPathConversionExists( if cmd == nil { return } - sprintf(cmd, tplPathExists, oldPath, filepath.Join(rootDir, newPath)) + cmd.Println(fmt.Sprintf(config.TplPathExists, oldPath, filepath.Join(rootDir, newPath))) } // InfoAddedTo confirms an entry was added to a context file. @@ -36,7 +40,108 @@ func InfoPathConversionExists( // - cmd: Cobra command for output // - filename: Name of the file the entry was added to func InfoAddedTo(cmd *cobra.Command, filename string) { - sprintf(cmd, tplAddedTo, filename) + cmd.Println(fmt.Sprintf(config.TplAddedTo, filename)) +} + +// InfoMovingTask reports a completed task being moved. +// +// Parameters: +// - cmd: Cobra command for output +// - taskText: Truncated task description +func InfoMovingTask(cmd *cobra.Command, taskText string) { + cmd.Println(fmt.Sprintf(config.TplMovingTask, taskText)) +} + +// InfoSkippingTask reports a task skipped due to incomplete children. +// +// Parameters: +// - cmd: Cobra command for output +// - taskText: Truncated task description +func InfoSkippingTask(cmd *cobra.Command, taskText string) { + cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyTaskArchiveSkipping), taskText)) +} + +// InfoArchivedTasks reports the number of tasks archived. +// +// Parameters: +// - cmd: Cobra command for output +// - count: Number of tasks archived +// - archiveFile: Path to the archive file +// - days: Age threshold in days +func InfoArchivedTasks(cmd *cobra.Command, count int, archiveFile string, days int) { + cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyTaskArchiveSuccessWithAge), count, archiveFile, days)) +} + +// InfoCompletedTask reports a task marked complete. 
+// +// Parameters: +// - cmd: Cobra command for output +// - taskText: The completed task description +func InfoCompletedTask(cmd *cobra.Command, taskText string) { + cmd.Println(fmt.Sprintf(config.TplCompletedTask, taskText)) +} + +// InfoConfigProfileDev reports that the dev profile is active. +// +// Parameters: +// - cmd: Cobra command for output +func InfoConfigProfileDev(cmd *cobra.Command) { + cmd.Println(config.TplConfigProfileDev) +} + +// InfoConfigProfileBase reports that the base profile is active. +// +// Parameters: +// - cmd: Cobra command for output +func InfoConfigProfileBase(cmd *cobra.Command) { + cmd.Println(config.TplConfigProfileBase) +} + +// InfoConfigProfileNone reports that no profile exists. +// +// Parameters: +// - cmd: Cobra command for output +// - filename: The .ctxrc filename +func InfoConfigProfileNone(cmd *cobra.Command, filename string) { + cmd.Println(fmt.Sprintf(config.TplConfigProfileNone, filename)) +} + +// InfoDepsNoProject reports that no supported project was detected. +// +// Parameters: +// - cmd: Cobra command for output +// - builderNames: Comma-separated list of supported project types +func InfoDepsNoProject(cmd *cobra.Command, builderNames string) { + cmd.Println(config.TplDepsNoProject) + cmd.Println(config.TplDepsLookingFor) + cmd.Println(fmt.Sprintf(config.TplDepsUseType, builderNames)) +} + +// InfoDepsNoDeps reports that no dependencies were found. +// +// Parameters: +// - cmd: Cobra command for output +func InfoDepsNoDeps(cmd *cobra.Command) { + cmd.Println(config.TplDepsNoDeps) +} + +// InfoSkillsHeader prints the skills list heading. +// +// Parameters: +// - cmd: Cobra command for output +func InfoSkillsHeader(cmd *cobra.Command) { + cmd.Println(config.TplSkillsHeader) + cmd.Println() +} + +// InfoSkillLine prints a single skill entry. 
+// +// Parameters: +// - cmd: Cobra command for output +// - name: Skill name +// - description: Truncated skill description +func InfoSkillLine(cmd *cobra.Command, name, description string) { + cmd.Println(fmt.Sprintf(config.TplSkillLine, name, description)) } // InfoExistsWritingAsAlternative reports that a file already exists and the @@ -52,5 +157,222 @@ func InfoExistsWritingAsAlternative( if cmd == nil { return } - sprintf(cmd, tplExistsWritingAsAlternative, path, alternative) + cmd.Println(fmt.Sprintf(config.TplExistsWritingAsAlternative, path, alternative)) +} + +// InfoInitOverwritePrompt prints the overwrite confirmation prompt. +// +// Parameters: +// - cmd: Cobra command for output +// - contextDir: path to the existing .context/ directory +func InfoInitOverwritePrompt(cmd *cobra.Command, contextDir string) { + cmd.Print(fmt.Sprintf(config.TplInitOverwritePrompt, contextDir)) +} + +// InfoInitAborted reports that the user cancelled the init operation. +// +// Parameters: +// - cmd: Cobra command for output +func InfoInitAborted(cmd *cobra.Command) { + cmd.Println(config.TplInitAborted) +} + +// InfoInitExistsSkipped reports a template file skipped because it exists. +// +// Parameters: +// - cmd: Cobra command for output +// - name: the template filename that was skipped +func InfoInitExistsSkipped(cmd *cobra.Command, name string) { + cmd.Println(fmt.Sprintf(config.TplInitExistsSkipped, name)) +} + +// InfoInitFileCreated reports a template file that was created. +// +// Parameters: +// - cmd: Cobra command for output +// - name: the template filename that was created +func InfoInitFileCreated(cmd *cobra.Command, name string) { + cmd.Println(fmt.Sprintf(config.TplInitFileCreated, name)) +} + +// InfoInitialized reports successful context directory initialization. 
+// +// Parameters: +// - cmd: Cobra command for output +// - contextDir: the path to the initialized .context/ directory +func InfoInitialized(cmd *cobra.Command, contextDir string) { + cmd.Println() + cmd.Println(fmt.Sprintf(config.TplInitialized, contextDir)) +} + +// InfoInitWarnNonFatal reports a non-fatal warning during init. +// +// Parameters: +// - cmd: Cobra command for output +// - label: short description of what failed (e.g. "CLAUDE.md") +// - err: the non-fatal error +func InfoInitWarnNonFatal(cmd *cobra.Command, label string, err error) { + cmd.Println(fmt.Sprintf(config.TplInitWarnNonFatal, label, err)) +} + +// InfoInitScratchpadPlaintext reports a plaintext scratchpad was created. +// +// Parameters: +// - cmd: Cobra command for output +// - path: the scratchpad file path +func InfoInitScratchpadPlaintext(cmd *cobra.Command, path string) { + cmd.Println(fmt.Sprintf(config.TplInitScratchpadPlaintext, path)) +} + +// InfoInitScratchpadNoKey warns about a missing key for an encrypted scratchpad. +// +// Parameters: +// - cmd: Cobra command for output +// - keyPath: the expected key path +func InfoInitScratchpadNoKey(cmd *cobra.Command, keyPath string) { + cmd.Println(fmt.Sprintf(config.TplInitScratchpadNoKey, keyPath)) +} + +// InfoInitScratchpadKeyCreated reports a scratchpad key was generated. +// +// Parameters: +// - cmd: Cobra command for output +// - keyPath: the path where the key was saved +func InfoInitScratchpadKeyCreated(cmd *cobra.Command, keyPath string) { + cmd.Println(fmt.Sprintf(config.TplInitScratchpadKeyCreated, keyPath)) +} + +// InfoInitCreatingRootFiles prints the heading before root file creation. +// +// Parameters: +// - cmd: Cobra command for output +func InfoInitCreatingRootFiles(cmd *cobra.Command) { + cmd.Println() + cmd.Println(config.TplInitCreatingRootFiles) +} + +// InfoInitSettingUpPermissions prints the heading before permissions setup. 
+// +// Parameters: +// - cmd: Cobra command for output +func InfoInitSettingUpPermissions(cmd *cobra.Command) { + cmd.Println() + cmd.Println(config.TplInitSettingUpPermissions) +} + +// InfoInitGitignoreUpdated reports .gitignore entries were added. +// +// Parameters: +// - cmd: Cobra command for output +// - count: number of entries added +func InfoInitGitignoreUpdated(cmd *cobra.Command, count int) { + cmd.Println(fmt.Sprintf(config.TplInitGitignoreUpdated, count)) +} + +// InfoInitGitignoreReview hints how to review changes. +// +// Parameters: +// - cmd: Cobra command for output +func InfoInitGitignoreReview(cmd *cobra.Command) { + cmd.Println(config.TplInitGitignoreReview) +} + +// InfoInitNextSteps prints the post-init guidance block. +// +// Parameters: +// - cmd: Cobra command for output +func InfoInitNextSteps(cmd *cobra.Command) { + cmd.Println() + cmd.Println(config.TplInitNextSteps) + cmd.Println() + cmd.Println(config.TplInitPluginInfo) + cmd.Println() + cmd.Println(config.TplInitPluginNote) +} + +// InfoObsidianGenerated reports successful Obsidian vault generation. +// +// Parameters: +// - cmd: Cobra command for output +// - count: Number of entries generated +// - output: Output directory path +func InfoObsidianGenerated(cmd *cobra.Command, count int, output string) { + cmd.Println(fmt.Sprintf(config.TplObsidianGenerated, count, output)) + cmd.Println() + cmd.Println("Next steps:") + cmd.Println(fmt.Sprintf(config.TplObsidianNextSteps, output)) +} + +// InfoJournalOrphanRemoved reports a removed orphan file. +// +// Parameters: +// - cmd: Cobra command for output +// - name: Filename that was removed +func InfoJournalOrphanRemoved(cmd *cobra.Command, name string) { + cmd.Println(fmt.Sprintf(config.TplJournalOrphanRemoved, name)) +} + +// InfoJournalSiteGenerated reports successful site generation with next steps. 
+// +// Parameters: +// - cmd: Cobra command for output +// - count: Number of entries generated +// - output: Output directory path +// - zensicalBin: Zensical binary name +func InfoJournalSiteGenerated(cmd *cobra.Command, count int, output, zensicalBin string) { + cmd.Println(fmt.Sprintf(config.TplJournalSiteGenerated, count, output)) + cmd.Println() + cmd.Println("Next steps:") + cmd.Println(fmt.Sprintf(config.TplJournalSiteNextSteps, output, zensicalBin)) + cmd.Println(" or") + cmd.Println(config.TplJournalSiteAlt) +} + +// InfoJournalSiteStarting reports the server is starting. +// +// Parameters: +// - cmd: Cobra command for output +func InfoJournalSiteStarting(cmd *cobra.Command) { + cmd.Println() + cmd.Println(config.TplJournalSiteStarting) +} + +// InfoJournalSiteBuilding reports a build is in progress. +// +// Parameters: +// - cmd: Cobra command for output +func InfoJournalSiteBuilding(cmd *cobra.Command) { + cmd.Println() + cmd.Println(config.TplJournalSiteBuilding) +} + +// InfoLoopGenerated reports successful loop script generation with details. 
+//
+// Parameters:
+//   - cmd: Cobra command for output
+//   - outputFile: Generated script path
+//   - heading: Start heading text
+//   - tool: Selected AI tool
+//   - promptFile: Prompt file path
+//   - maxIterations: Max iterations (0 = unlimited)
+//   - completionMsg: Completion signal string
+func InfoLoopGenerated(
+	cmd *cobra.Command,
+	outputFile, heading, tool, promptFile string,
+	maxIterations int,
+	completionMsg string,
+) {
+	cmd.Println(fmt.Sprintf(config.TplLoopGenerated, outputFile))
+	cmd.Println()
+	cmd.Println(heading)
+	cmd.Println(fmt.Sprintf(config.TplLoopRunCmd, outputFile))
+	cmd.Println()
+	cmd.Println(fmt.Sprintf(config.TplLoopTool, tool))
+	cmd.Println(fmt.Sprintf(config.TplLoopPrompt, promptFile))
+	if maxIterations > 0 {
+		cmd.Println(fmt.Sprintf(config.TplLoopMaxIterations, maxIterations))
+	} else {
+		cmd.Println(config.TplLoopUnlimited)
+	}
+	cmd.Println(fmt.Sprintf(config.TplLoopCompletion, completionMsg))
 }
diff --git a/internal/write/init.go b/internal/write/init.go
new file mode 100644
index 00000000..f468d48b
--- /dev/null
+++ b/internal/write/init.go
@@ -0,0 +1,237 @@
+//    /  ctx: https://ctx.ist
+// ,'`./  do you remember?
+// `.,'\
+//    \   Copyright 2026-present Context contributors.
+//        SPDX-License-Identifier: Apache-2.0
+
+package write
+
+import (
+	"fmt"
+
+	"github.com/ActiveMemory/ctx/internal/write/config"
+	"github.com/spf13/cobra"
+)
+
+// InitCreated reports a file created during init.
+//
+// Parameters:
+//   - cmd: Cobra command for output
+//   - path: created file path
+func InitCreated(cmd *cobra.Command, path string) {
+	cmd.Println(fmt.Sprintf(config.TplInitFileCreated, path))
+}
+
+// InitCreatedWith reports a file created with a qualifier (e.g. " (ralph mode)").
+// +// Parameters: +// - cmd: Cobra command for output +// - path: created file path +// - qualifier: additional info appended after the path +func InitCreatedWith(cmd *cobra.Command, path, qualifier string) { + cmd.Println(fmt.Sprintf(config.TplInitCreatedWith, path, qualifier)) +} + +// InitSkipped reports a file skipped because it already exists. +// +// Parameters: +// - cmd: Cobra command for output +// - path: skipped file path +func InitSkipped(cmd *cobra.Command, path string) { + cmd.Println(fmt.Sprintf(config.TplInitExistsSkipped, path)) +} + +// InitSkippedPlain reports a file skipped without detail. +// +// Parameters: +// - cmd: Cobra command for output +// - path: skipped file path +func InitSkippedPlain(cmd *cobra.Command, path string) { + cmd.Println(fmt.Sprintf(config.TplInitSkippedPlain, path)) +} + +// InitCtxContentExists reports a file skipped because ctx content exists. +// +// Parameters: +// - cmd: Cobra command for output +// - path: skipped file path +func InitCtxContentExists(cmd *cobra.Command, path string) { + cmd.Println(fmt.Sprintf(config.TplInitCtxContentExists, path)) +} + +// InitMerged reports a file merged during init. +// +// Parameters: +// - cmd: Cobra command for output +// - path: merged file path +func InitMerged(cmd *cobra.Command, path string) { + cmd.Println(fmt.Sprintf(config.TplInitMerged, path)) +} + +// InitBackup reports a backup file created. +// +// Parameters: +// - cmd: Cobra command for output +// - path: backup file path +func InitBackup(cmd *cobra.Command, path string) { + cmd.Println(fmt.Sprintf(config.TplInitBackup, path)) +} + +// InitUpdatedCtxSection reports a file whose ctx section was updated. +// +// Parameters: +// - cmd: Cobra command for output +// - path: updated file path +func InitUpdatedCtxSection(cmd *cobra.Command, path string) { + cmd.Println(fmt.Sprintf(config.TplInitUpdatedCtxSection, path)) +} + +// InitUpdatedPlanSection reports a file whose plan section was updated. 
+// +// Parameters: +// - cmd: Cobra command for output +// - path: updated file path +func InitUpdatedPlanSection(cmd *cobra.Command, path string) { + cmd.Println(fmt.Sprintf(config.TplInitUpdatedPlanSection, path)) +} + +// InitUpdatedPromptSection reports a file whose prompt section was updated. +// +// Parameters: +// - cmd: Cobra command for output +// - path: updated file path +func InitUpdatedPromptSection(cmd *cobra.Command, path string) { + cmd.Println(fmt.Sprintf(config.TplInitUpdatedPromptSection, path)) +} + +// InitFileExistsNoCtx reports a file exists without ctx content. +// +// Parameters: +// - cmd: Cobra command for output +// - path: file path +func InitFileExistsNoCtx(cmd *cobra.Command, path string) { + cmd.Println(fmt.Sprintf(config.TplInitFileExistsNoCtx, path)) +} + +// InitNoChanges reports a settings file with no changes needed. +// +// Parameters: +// - cmd: Cobra command for output +// - path: settings file path +func InitNoChanges(cmd *cobra.Command, path string) { + cmd.Println(fmt.Sprintf(config.TplInitNoChanges, path)) +} + +// InitPermsMergedDeduped reports permissions merged and deduped. +// +// Parameters: +// - cmd: Cobra command for output +// - path: settings file path +func InitPermsMergedDeduped(cmd *cobra.Command, path string) { + cmd.Println(fmt.Sprintf(config.TplInitPermsMergedDeduped, path)) +} + +// InitPermsDeduped reports duplicate permissions removed. +// +// Parameters: +// - cmd: Cobra command for output +// - path: settings file path +func InitPermsDeduped(cmd *cobra.Command, path string) { + cmd.Println(fmt.Sprintf(config.TplInitPermsDeduped, path)) +} + +// InitPermsAllowDeny reports allow+deny permissions added. +// +// Parameters: +// - cmd: Cobra command for output +// - path: settings file path +func InitPermsAllowDeny(cmd *cobra.Command, path string) { + cmd.Println(fmt.Sprintf(config.TplInitPermsAllowDeny, path)) +} + +// InitPermsDeny reports deny permissions added. 
+// +// Parameters: +// - cmd: Cobra command for output +// - path: settings file path +func InitPermsDeny(cmd *cobra.Command, path string) { + cmd.Println(fmt.Sprintf(config.TplInitPermsDeny, path)) +} + +// InitPermsAllow reports ctx permissions added. +// +// Parameters: +// - cmd: Cobra command for output +// - path: settings file path +func InitPermsAllow(cmd *cobra.Command, path string) { + cmd.Println(fmt.Sprintf(config.TplInitPermsAllow, path)) +} + +// InitMakefileCreated reports a new Makefile created with ctx include. +// +// Parameters: +// - cmd: Cobra command for output +func InitMakefileCreated(cmd *cobra.Command) { + cmd.Println(config.TplInitMakefileCreated) +} + +// InitMakefileIncludes reports Makefile already includes the directive. +// +// Parameters: +// - cmd: Cobra command for output +// - filename: included filename +func InitMakefileIncludes(cmd *cobra.Command, filename string) { + cmd.Println(fmt.Sprintf(config.TplInitMakefileIncludes, filename)) +} + +// InitMakefileAppended reports an include appended to Makefile. +// +// Parameters: +// - cmd: Cobra command for output +// - filename: included filename +func InitMakefileAppended(cmd *cobra.Command, filename string) { + cmd.Println(fmt.Sprintf(config.TplInitMakefileAppended, filename)) +} + +// InitPluginSkipped reports plugin enablement was skipped. +// +// Parameters: +// - cmd: Cobra command for output +func InitPluginSkipped(cmd *cobra.Command) { + cmd.Println(config.TplInitPluginSkipped) +} + +// InitPluginAlreadyEnabled reports plugin is already enabled globally. +// +// Parameters: +// - cmd: Cobra command for output +func InitPluginAlreadyEnabled(cmd *cobra.Command) { + cmd.Println(config.TplInitPluginAlreadyEnabled) +} + +// InitPluginEnabled reports plugin enabled globally. 
+// +// Parameters: +// - cmd: Cobra command for output +// - settingsPath: path to the settings file +func InitPluginEnabled(cmd *cobra.Command, settingsPath string) { + cmd.Println(fmt.Sprintf(config.TplInitPluginEnabled, settingsPath)) +} + +// InitSkippedDir reports a directory skipped because it exists. +// +// Parameters: +// - cmd: Cobra command for output +// - dir: directory name +func InitSkippedDir(cmd *cobra.Command, dir string) { + cmd.Println(fmt.Sprintf(config.TplInitSkippedDir, dir)) +} + +// InitCreatedDir reports a directory created during init. +// +// Parameters: +// - cmd: Cobra command for output +// - dir: directory name +func InitCreatedDir(cmd *cobra.Command, dir string) { + cmd.Println(fmt.Sprintf(config.TplInitCreatedDir, dir)) +} diff --git a/internal/write/io/doc.go b/internal/write/io/doc.go new file mode 100644 index 00000000..29615513 --- /dev/null +++ b/internal/write/io/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package io provides low-level print helpers shared across write subpackages. +package io diff --git a/internal/cli/complete/cmd/root/cmd.go b/internal/write/io/print.go similarity index 94% rename from internal/cli/complete/cmd/root/cmd.go rename to internal/write/io/print.go index d37451a9..c27d944c 100644 --- a/internal/cli/complete/cmd/root/cmd.go +++ b/internal/write/io/print.go @@ -4,4 +4,4 @@ // \ Copyright 2026-present Context contributors. // SPDX-License-Identifier: Apache-2.0 -package root +package io diff --git a/internal/cli/load/core/out.go b/internal/write/load.go similarity index 53% rename from internal/cli/load/core/out.go rename to internal/write/load.go index 7eb35a17..2bdd7512 100644 --- a/internal/cli/load/core/out.go +++ b/internal/write/load.go @@ -4,33 +4,31 @@ // \ Copyright 2026-present Context contributors. 
// SPDX-License-Identifier: Apache-2.0 -package core +package write import ( "fmt" "strings" + "github.com/ActiveMemory/ctx/internal/config/token" "github.com/spf13/cobra" - "github.com/ActiveMemory/ctx/internal/config" + "github.com/ActiveMemory/ctx/internal/assets" "github.com/ActiveMemory/ctx/internal/context" ) -// OutputRaw outputs context files without assembly or headers. +// LoadRaw outputs context files without assembly or headers. // -// Files are output in read order (see [config.FileReadOrder]), separated -// by blank lines. Content is printed as-is without modification. +// Files are output in read order, separated by blank lines. +// Content is printed as-is without modification. // // Parameters: // - cmd: Cobra command for output stream -// - ctx: Loaded context containing files to output +// - files: Context files sorted by read order // // Returns: // - error: Always nil (included for interface consistency) -func OutputRaw(cmd *cobra.Command, ctx *context.Context) error { - // Sort files by read order - files := SortByReadOrder(ctx.Files) - +func LoadRaw(cmd *cobra.Command, files []context.FileInfo) error { for i, f := range files { if i > 0 { cmd.Println() @@ -40,7 +38,7 @@ func OutputRaw(cmd *cobra.Command, ctx *context.Context) error { return nil } -// OutputAssembled outputs context as formatted Markdown with token budgeting. +// LoadAssembled outputs context as formatted Markdown with token budgeting. // // Assembles context files into a single Markdown document with headers, // respecting the token budget. 
Files are included in read order until the @@ -48,53 +46,41 @@ func OutputRaw(cmd *cobra.Command, ctx *context.Context) error { // // Parameters: // - cmd: Cobra command for output stream -// - ctx: Loaded context containing files to assemble +// - files: Context files sorted by read order // - budget: Maximum token count for the output +// - totalTokens: Total available tokens in context +// - titleFn: Function to convert filename to display title // // Returns: // - error: Always nil (included for interface consistency) -func OutputAssembled( - cmd *cobra.Command, ctx *context.Context, budget int, +func LoadAssembled( + cmd *cobra.Command, + files []context.FileInfo, + budget, totalTokens int, + titleFn func(string) string, ) error { var sb strings.Builder - nl := config.NewlineLF - sep := config.Separator + nl := token.NewlineLF + sep := token.Separator - // Header - sb.WriteString(config.LoadHeadingContext + nl + nl) - sb.WriteString( - fmt.Sprintf( - config.TplLoadBudget+nl+nl, - budget, ctx.TotalTokens, - ), - ) + sb.WriteString(assets.LoadHeadingContext + nl + nl) + _, _ = fmt.Fprintf(&sb, assets.TplLoadBudget+nl+nl, budget, totalTokens) sb.WriteString(sep + nl + nl) - // Sort files by read order - files := SortByReadOrder(ctx.Files) - tokensUsed := context.EstimateTokensString(sb.String()) for _, f := range files { - // Skip empty files if f.IsEmpty { continue } - // Check if we have the budget for this file fileTokens := f.Tokens if tokensUsed+fileTokens > budget { - // Add a truncation notice - sb.WriteString( - fmt.Sprintf(nl+sep+nl+nl+config.TplLoadTruncated+nl, f.Name), - ) + _, _ = fmt.Fprintf(&sb, nl+sep+nl+nl+assets.TplLoadTruncated+nl, f.Name) break } - // Add the file section - sb.WriteString(fmt.Sprintf( - config.TplLoadSectionHeading+nl+nl, FileNameToTitle(f.Name)), - ) + _, _ = fmt.Fprintf(&sb, assets.TplLoadSectionHeading+nl+nl, titleFn(f.Name)) sb.Write(f.Content) if !strings.HasSuffix(string(f.Content), nl) { sb.WriteString(nl) diff --git 
a/internal/write/memory.go b/internal/write/memory.go new file mode 100644 index 00000000..c24d14dc --- /dev/null +++ b/internal/write/memory.go @@ -0,0 +1,169 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package write + +import ( + "fmt" + "github.com/ActiveMemory/ctx/internal/write/config" + "github.com/spf13/cobra" +) + +// MemoryNoChanges prints that no changes exist since last sync. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func MemoryNoChanges(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(config.TplMemoryNoChanges) +} + +// MemoryBridgeHeader prints the "Memory Bridge Status" heading. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func MemoryBridgeHeader(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(config.TplMemoryBridgeHeader) +} + +// MemorySourceNotActive prints that auto memory is not active. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func MemorySourceNotActive(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(config.TplMemorySourceNotActive) +} + +// MemorySource prints the source path. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - path: absolute path to MEMORY.md. +func MemorySource(cmd *cobra.Command, path string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplMemorySource, path)) +} + +// MemoryMirror prints the mirror relative path. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - relativePath: mirror path relative to project root. +func MemoryMirror(cmd *cobra.Command, relativePath string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplMemoryMirror, relativePath)) +} + +// MemoryLastSync prints the last sync timestamp with age. +// +// Parameters: +// - cmd: Cobra command for output. 
Nil is a no-op. +// - formatted: formatted timestamp string. +// - ago: human-readable duration since sync. +func MemoryLastSync(cmd *cobra.Command, formatted, ago string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplMemoryLastSync, formatted, ago)) +} + +// MemoryLastSyncNever prints that no sync has occurred. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func MemoryLastSyncNever(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(config.TplMemoryLastSyncNever) +} + +// MemorySourceLines prints the MEMORY.md line count. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - count: number of lines. +// - drifted: whether the source has changed since last sync. +func MemorySourceLines(cmd *cobra.Command, count int, drifted bool) { + if cmd == nil { + return + } + if drifted { + cmd.Println(fmt.Sprintf(config.TplMemorySourceLinesDrift, count)) + return + } + cmd.Println(fmt.Sprintf(config.TplMemorySourceLines, count)) +} + +// MemoryMirrorLines prints the mirror line count. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - count: number of lines. +func MemoryMirrorLines(cmd *cobra.Command, count int) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplMemoryMirrorLines, count)) +} + +// MemoryMirrorNotSynced prints that the mirror has not been synced yet. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func MemoryMirrorNotSynced(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(config.TplMemoryMirrorNotSynced) +} + +// MemoryDriftDetected prints that drift was detected. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func MemoryDriftDetected(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(config.TplMemoryDriftDetected) +} + +// MemoryDriftNone prints that no drift was detected. +// +// Parameters: +// - cmd: Cobra command for output. 
Nil is a no-op. +func MemoryDriftNone(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(config.TplMemoryDriftNone) +} + +// MemoryArchives prints the archive snapshot count. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - count: number of archived snapshots. +// - dir: archive directory name relative to .context/. +func MemoryArchives(cmd *cobra.Command, count int, dir string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplMemoryArchives, count, dir)) +} diff --git a/internal/write/notify.go b/internal/write/notify.go new file mode 100644 index 00000000..e5c3453f --- /dev/null +++ b/internal/write/notify.go @@ -0,0 +1,79 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package write + +import ( + "fmt" + "net/http" + + "github.com/ActiveMemory/ctx/internal/write/config" + "github.com/spf13/cobra" +) + +// SetupPrompt prints the interactive webhook URL prompt. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func SetupPrompt(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Print(config.TplSetupPrompt) +} + +// SetupDone prints the success block after saving a webhook: +// configured URL (masked) and encrypted file path. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - maskedURL: masked webhook URL for display. +// - encPath: encrypted file path. +func SetupDone(cmd *cobra.Command, maskedURL, encPath string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplSetupDone, maskedURL, encPath)) +} + +// TestNoWebhook prints the message when no webhook is configured. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. 
+func TestNoWebhook(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(config.TplTestNoWebhook) +} + +// TestFiltered prints the notice when the test event is filtered. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func TestFiltered(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(config.TplTestFiltered) +} + +// TestResult prints the webhook test response block: status line +// and optional working confirmation. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - statusCode: HTTP response status code. +// - encPath: encrypted file path for the working message. +func TestResult(cmd *cobra.Command, statusCode int, encPath string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplTestResult, statusCode, http.StatusText(statusCode))) + if statusCode >= 200 && statusCode < 300 { + cmd.Println(fmt.Sprintf(config.TplTestWorking, encPath)) + } +} diff --git a/internal/write/pad.go b/internal/write/pad.go new file mode 100644 index 00000000..76807d1b --- /dev/null +++ b/internal/write/pad.go @@ -0,0 +1,355 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package write + +import ( + "fmt" + + "github.com/ActiveMemory/ctx/internal/write/config" + "github.com/spf13/cobra" +) + +// PadEmpty prints the message when the scratchpad has no entries. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func PadEmpty(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(config.TplPadEmpty) +} + +// PadKeyCreated prints a key creation notice to stderr. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - path: key file path. 
+func PadKeyCreated(cmd *cobra.Command, path string) { + if cmd == nil { + return + } + cmd.PrintErrln(fmt.Sprintf(config.TplPadKeyCreated, path)) +} + +// PadEntryAdded prints confirmation that a pad entry was added. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - n: entry number (1-based). +func PadEntryAdded(cmd *cobra.Command, n int) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplPadEntryAdded, n)) +} + +// PadEntryUpdated prints confirmation that a pad entry was updated. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - n: entry number (1-based). +func PadEntryUpdated(cmd *cobra.Command, n int) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplPadEntryUpdated, n)) +} + +// PadExportPlan prints a dry-run export line. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - label: blob label. +// - outPath: target file path. +func PadExportPlan(cmd *cobra.Command, label, outPath string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplPadExportPlan, label, outPath)) +} + +// PadExportDone prints a successfully exported blob line. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - label: blob label. +func PadExportDone(cmd *cobra.Command, label string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplPadExportDone, label)) +} + +// ErrPadExportWrite prints a blob write failure to stderr. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - label: blob label. +// - cause: the write error. +func ErrPadExportWrite(cmd *cobra.Command, label string, cause error) { + if cmd == nil { + return + } + cmd.PrintErrln(fmt.Sprintf(config.TplPadExportWriteFailed, label, cause)) +} + +// PadBlobWritten prints confirmation that a blob was written to a file. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - size: number of bytes written. 
+// - path: output file path. +func PadBlobWritten(cmd *cobra.Command, size int, path string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplPadBlobWritten, size, path)) +} + +// PadEntryRemoved prints confirmation that a pad entry was removed. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - n: entry number (1-based). +func PadEntryRemoved(cmd *cobra.Command, n int) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplPadEntryRemoved, n)) +} + +// PadResolveSide prints a conflict side block: header and numbered entries. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - side: label ("OURS" or "THEIRS"). +// - entries: display strings for each entry. +func PadResolveSide(cmd *cobra.Command, side string, entries []string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplPadResolveHeader, side)) + for i, entry := range entries { + cmd.Println(fmt.Sprintf(config.TplPadResolveEntry, i+1, entry)) + } +} + +// PadEntryMoved prints confirmation that a pad entry was moved. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - from: source position (1-based). +// - to: destination position (1-based). +func PadEntryMoved(cmd *cobra.Command, from, to int) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplPadEntryMoved, from, to)) +} + +// PadImportNone prints the message when no entries were found to import. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func PadImportNone(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(config.TplPadImportNone) +} + +// PadImportDone prints the count of successfully imported line entries. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - count: number of entries imported. 
+func PadImportDone(cmd *cobra.Command, count int) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplPadImportDone, count)) +} + +// PadImportBlobAdded prints a successfully imported blob line. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - name: filename of the imported blob. +func PadImportBlobAdded(cmd *cobra.Command, name string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplPadImportBlobAdded, name)) +} + +// ErrPadImportBlobSkipped prints a skipped blob to stderr. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - name: filename. +// - cause: the error reason. +func ErrPadImportBlobSkipped(cmd *cobra.Command, name string, cause error) { + if cmd == nil { + return + } + cmd.PrintErrln(fmt.Sprintf(config.TplPadImportBlobSkipped, name, cause)) +} + +// ErrPadImportBlobTooLarge prints a notice to stderr that a too-large blob was skipped. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - name: filename. +// - maxBytes: maximum allowed size in bytes. +func ErrPadImportBlobTooLarge(cmd *cobra.Command, name string, maxBytes int) { + if cmd == nil { + return + } + cmd.PrintErrln(fmt.Sprintf(config.TplPadImportBlobTooLarge, name, maxBytes)) +} + +// PadImportBlobSummary prints the blob import summary or "no files" message. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - added: number of blobs imported. +// - skipped: number of blobs skipped. +func PadImportBlobSummary(cmd *cobra.Command, added, skipped int) { + if cmd == nil { + return + } + if added == 0 && skipped == 0 { + cmd.Println(config.TplPadImportBlobNone) + return + } + cmd.Println(fmt.Sprintf(config.TplPadImportBlobSummary, added, skipped)) +} + +// ErrPadImportCloseWarning prints a file close warning to stderr. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - name: filename. +// - cause: the close error. 
+func ErrPadImportCloseWarning(cmd *cobra.Command, name string, cause error) { + if cmd == nil { + return + } + cmd.PrintErrln(fmt.Sprintf(config.TplPadImportCloseWarning, name, cause)) +} + +// PadMergeDupe prints a duplicate-skipped line during merge. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - display: entry display string. +func PadMergeDupe(cmd *cobra.Command, display string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplPadMergeDupe, display)) +} + +// PadMergeAdded prints a newly added entry line during merge. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - display: entry display string. +// - file: source file path. +func PadMergeAdded(cmd *cobra.Command, display, file string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplPadMergeAdded, display, file)) +} + +// PadMergeBlobConflict prints a blob label conflict warning. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - label: conflicting blob label. +func PadMergeBlobConflict(cmd *cobra.Command, label string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplPadMergeBlobConflict, label)) +} + +// PadMergeBinaryWarning prints a binary data warning for a source file. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - file: source file path. +func PadMergeBinaryWarning(cmd *cobra.Command, file string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplPadMergeBinaryWarning, file)) +} + +// PadMergeSummary prints the merge summary based on counts and mode. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - added: number of entries added. +// - dupes: number of duplicates skipped. +// - dryRun: whether this was a dry run. 
+func PadMergeSummary(cmd *cobra.Command, added, dupes int, dryRun bool) { + if cmd == nil { + return + } + if added == 0 && dupes == 0 { + cmd.Println(config.TplPadMergeNone) + return + } + if added == 0 { + cmd.Println(fmt.Sprintf(config.TplPadMergeNoneNew, dupes, padPluralize("duplicate", dupes))) + return + } + if dryRun { + cmd.Println(fmt.Sprintf(config.TplPadMergeDryRun, + added, padPluralize("entry", added), + dupes, padPluralize("duplicate", dupes))) + return + } + cmd.Println(fmt.Sprintf(config.TplPadMergeDone, + added, padPluralize("entry", added), + dupes, padPluralize("duplicate", dupes))) +} + +// padPluralize is an internal helper matching core.Pluralize for write templates. +func padPluralize(word string, count int) string { + if count == 1 { + return word + } + if len(word) > 0 && word[len(word)-1] == 'y' { + return word[:len(word)-1] + "ies" + } + return word + "s" +} + +// PadExportSummary prints the export summary or "no blobs" message. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - count: number of blobs exported. +// - dryRun: whether this was a dry run. +func PadExportSummary(cmd *cobra.Command, count int, dryRun bool) { + if cmd == nil { + return + } + if count == 0 { + cmd.Println(config.TplPadExportNone) + return + } + verb := config.TplPadExportVerbDone + if dryRun { + verb = config.TplPadExportVerbDryRun + } + cmd.Println(fmt.Sprintf(config.TplPadExportSummary, verb, count)) +} diff --git a/internal/write/permissions.go b/internal/write/permissions.go new file mode 100644 index 00000000..e3e6fe08 --- /dev/null +++ b/internal/write/permissions.go @@ -0,0 +1,101 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +package write + +import ( + "fmt" + "github.com/ActiveMemory/ctx/internal/write/config" + "github.com/spf13/cobra" +) + +// RestoreNoLocal prints the message when golden is restored with no local file. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func RestoreNoLocal(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(config.TplRestoreNoLocal) +} + +// RestoreMatch prints the message when settings already match golden. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func RestoreMatch(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(config.TplRestoreMatch) +} + +// RestoreDiff prints the permission diff block: dropped/restored +// allow and deny entries, or a note that only non-permission settings differ. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - dropped: allow permissions removed. +// - restored: allow permissions added back. +// - denyDropped: deny rules removed. +// - denyRestored: deny rules added back. +func RestoreDiff( + cmd *cobra.Command, + dropped, restored, denyDropped, denyRestored []string, +) { + if cmd == nil { + return + } + printSection(cmd, config.TplRestoreDroppedHeader, config.TplRestoreRemoved, dropped) + printSection(cmd, config.TplRestoreRestoredHeader, config.TplRestoreAdded, restored) + printSection(cmd, config.TplRestoreDenyDroppedHeader, config.TplRestoreRemoved, denyDropped) + printSection(cmd, config.TplRestoreDenyRestoredHeader, config.TplRestoreAdded, denyRestored) + + if len(dropped) == 0 && len(restored) == 0 && + len(denyDropped) == 0 && len(denyRestored) == 0 { + cmd.Println(config.TplRestorePermMatch) + } +} + +// RestoreDone prints the success message after restore. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. 
+func RestoreDone(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(config.TplRestoreDone) +} + +// SnapshotDone prints the golden image save/update confirmation. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - updated: true if golden already existed (update vs save). +// - path: golden file path. +func SnapshotDone(cmd *cobra.Command, updated bool, path string) { + if cmd == nil { + return + } + if updated { + cmd.Println(fmt.Sprintf(config.TplSnapshotUpdated, path)) + } else { + cmd.Println(fmt.Sprintf(config.TplSnapshotSaved, path)) + } +} + +// printSection prints a header and list items if the list is non-empty. +func printSection(cmd *cobra.Command, headerTpl, itemTpl string, items []string) { + if len(items) == 0 { + return + } + cmd.Println(fmt.Sprintf(headerTpl, len(items))) + for _, item := range items { + cmd.Println(fmt.Sprintf(itemTpl, item)) + } +} diff --git a/internal/write/print.go b/internal/write/print.go deleted file mode 100644 index 77c01c2e..00000000 --- a/internal/write/print.go +++ /dev/null @@ -1,25 +0,0 @@ -// / ctx: https://ctx.ist -// ,'`./ do you remember? -// `.,'\ -// \ Copyright 2026-present Context contributors. -// SPDX-License-Identifier: Apache-2.0 - -package write - -import ( - "fmt" - - "github.com/spf13/cobra" -) - -// sprintf formats a string and prints it to the command's stdout stream. -// -// This is the internal building block for all formatted output in the package. -// -// Parameters: -// - cmd: Cobra command whose stdout stream receives the output. -// - format: fmt.Sprintf format string. -// - args: format arguments. -func sprintf(cmd *cobra.Command, format string, args ...any) { - cmd.Println(fmt.Sprintf(format, args...)) -} diff --git a/internal/write/prompt.go b/internal/write/prompt.go new file mode 100644 index 00000000..befcabd7 --- /dev/null +++ b/internal/write/prompt.go @@ -0,0 +1,60 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? 
+// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package write + +import ( + "fmt" + "github.com/ActiveMemory/ctx/internal/write/config" + "github.com/spf13/cobra" +) + +// PromptCreated prints the confirmation after creating a prompt template. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - name: prompt template name. +func PromptCreated(cmd *cobra.Command, name string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplPromptCreated, name)) +} + +// PromptNone prints the message when no prompts are found. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func PromptNone(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(config.TplPromptNone) +} + +// PromptRemoved prints the confirmation after removing a prompt template. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - name: prompt template name. +func PromptRemoved(cmd *cobra.Command, name string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplPromptRemoved, name)) +} + +// PromptItem prints a single prompt name in the list. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - name: prompt template name. +func PromptItem(cmd *cobra.Command, name string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplPromptItem, name)) +} diff --git a/internal/write/prune.go b/internal/write/prune.go new file mode 100644 index 00000000..a7169439 --- /dev/null +++ b/internal/write/prune.go @@ -0,0 +1,63 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package write + +import ( + "fmt" + + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// PruneDryRunLine prints a single dry-run prune candidate. 
+// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - name: file name being considered for pruning. +// - age: human-readable age string. +func PruneDryRunLine(cmd *cobra.Command, name, age string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyPruneDryRunLine), name, age)) +} + +// PruneErrorLine prints an error encountered while removing a file. +// +// Parameters: +// - cmd: Cobra command for error output. Nil is a no-op. +// - name: file name that failed to remove. +// - err: the removal error. +func PruneErrorLine(cmd *cobra.Command, name string, err error) { + if cmd == nil { + return + } + cmd.PrintErrln(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyPruneErrorLine), name, err)) +} + +// PruneSummary prints the prune results summary. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - dryRun: whether this was a dry-run invocation. +// - pruned: number of files pruned (or would be pruned). +// - skipped: number of files skipped (too recent). +// - preserved: number of global files preserved. +func PruneSummary(cmd *cobra.Command, dryRun bool, pruned, skipped, preserved int) { + if cmd == nil { + return + } + if dryRun { + cmd.Println() + cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyPruneDryRunSummary), + pruned, skipped, preserved)) + } else { + cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyPruneSummary), + pruned, skipped, preserved)) + } +} diff --git a/internal/write/publish.go b/internal/write/publish.go new file mode 100644 index 00000000..b879a889 --- /dev/null +++ b/internal/write/publish.go @@ -0,0 +1,101 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. 
+// SPDX-License-Identifier: Apache-2.0 + +package write + +import ( + "fmt" + "github.com/ActiveMemory/ctx/internal/write/config" + "github.com/spf13/cobra" +) + +// UnpublishNotFound prints that no published block was found. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - filename: source file name (e.g. "MEMORY.md"). +func UnpublishNotFound(cmd *cobra.Command, filename string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplUnpublishNotFound, filename)) +} + +// UnpublishDone prints that the published block was removed. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - filename: source file name (e.g. "MEMORY.md"). +func UnpublishDone(cmd *cobra.Command, filename string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplUnpublishDone, filename)) +} + +// PublishPlan prints the full publish plan: header, source files, +// budget, per-file counts, and total. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - budget: maximum line count for the published block. +// - tasks: number of pending tasks selected. +// - decisions: number of recent decisions selected. +// - conventions: number of key conventions selected. +// - learnings: number of recent learnings selected. +// - totalLines: total lines in the published block. 
+func PublishPlan( + cmd *cobra.Command, + budget, tasks, decisions, conventions, learnings, totalLines int, +) { + if cmd == nil { + return + } + cmd.Println(config.TplPublishHeader) + cmd.Println() + cmd.Println(config.TplPublishSourceFiles) + cmd.Println(fmt.Sprintf(config.TplPublishBudget, budget)) + cmd.Println() + cmd.Println(config.TplPublishBlock) + if tasks > 0 { + cmd.Println(fmt.Sprintf(config.TplPublishTasks, tasks)) + } + if decisions > 0 { + cmd.Println(fmt.Sprintf(config.TplPublishDecisions, decisions)) + } + if conventions > 0 { + cmd.Println(fmt.Sprintf(config.TplPublishConventions, conventions)) + } + if learnings > 0 { + cmd.Println(fmt.Sprintf(config.TplPublishLearnings, learnings)) + } + cmd.Println() + cmd.Println(fmt.Sprintf(config.TplPublishTotal, totalLines, budget)) +} + +// PublishDryRun prints the dry-run notice. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func PublishDryRun(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println() + cmd.Println(config.TplPublishDryRun) +} + +// PublishDone prints the success message with marker info. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func PublishDone(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println() + cmd.Println(config.TplPublishDone) +} diff --git a/internal/write/recall.go b/internal/write/recall.go index e128d9cd..de6922e0 100644 --- a/internal/write/recall.go +++ b/internal/write/recall.go @@ -10,9 +10,10 @@ import ( "fmt" "strings" + "github.com/ActiveMemory/ctx/internal/assets" + "github.com/ActiveMemory/ctx/internal/config/token" + config2 "github.com/ActiveMemory/ctx/internal/write/config" "github.com/spf13/cobra" - - "github.com/ActiveMemory/ctx/internal/config" ) // SkipFile prints that a file was skipped during export. 
@@ -25,7 +26,7 @@ func SkipFile(cmd *cobra.Command, filename, reason string) { if cmd == nil { return } - sprintf(cmd, " skip %s (%s)", filename, reason) + cmd.Println(fmt.Sprintf(" skip %s (%s)", filename, reason)) } // ExportedFile prints that a file was exported or updated. @@ -40,9 +41,9 @@ func ExportedFile(cmd *cobra.Command, filename, suffix string) { return } if suffix != "" { - sprintf(cmd, " ok %s (%s)", filename, suffix) + cmd.Println(fmt.Sprintf(" ok %s (%s)", filename, suffix)) } else { - sprintf(cmd, " ok %s", filename) + cmd.Println(fmt.Sprintf(" ok %s", filename)) } } @@ -140,16 +141,16 @@ func ExportFinalSummary(cmd *cobra.Command, exported, updated, renamed, skipped } cmd.Println() if exported > 0 { - sprintf(cmd, "Exported %d new session(s)", exported) + cmd.Println(fmt.Sprintf("Exported %d new session(s)", exported)) } if updated > 0 { - sprintf(cmd, "Updated %d existing session(s) (YAML frontmatter preserved)", updated) + cmd.Println(fmt.Sprintf("Updated %d existing session(s) (YAML frontmatter preserved)", updated)) } if renamed > 0 { - sprintf(cmd, "Renamed %d session(s) to title-based filenames", renamed) + cmd.Println(fmt.Sprintf("Renamed %d session(s) to title-based filenames", renamed)) } if skipped > 0 { - sprintf(cmd, "Skipped %d existing file(s).", skipped) + cmd.Println(fmt.Sprintf("Skipped %d existing file(s).", skipped)) } } @@ -209,6 +210,58 @@ func SessionListFooter(cmd *cobra.Command, hasMore bool) { } } +// SessionInfo holds pre-formatted session metadata for display. +type SessionInfo struct { + Slug string + ID string + Tool string + Project string + Branch string // empty to omit + Model string // empty to omit + Started string + Duration string + Turns int + Messages int + TokensIn string + TokensOut string + TokensAll string +} + +// SessionMetadata prints the full session metadata block: identity, +// timing, and token usage sections separated by blank lines. +// +// Parameters: +// - cmd: Cobra command for output. 
Nil is a no-op. +// - info: pre-formatted session metadata. +func SessionMetadata(cmd *cobra.Command, info SessionInfo) { + if cmd == nil { + return + } + SectionHeader(cmd, 1, info.Slug) + + SessionDetail(cmd, assets.MetadataID, info.ID) + SessionDetail(cmd, assets.MetadataTool, info.Tool) + SessionDetail(cmd, assets.MetadataProject, info.Project) + if info.Branch != "" { + SessionDetail(cmd, assets.MetadataBranch, info.Branch) + } + if info.Model != "" { + SessionDetail(cmd, assets.MetadataModel, info.Model) + } + BlankLine(cmd) + + SessionDetail(cmd, assets.MetadataStarted, info.Started) + SessionDetail(cmd, assets.MetadataDuration, info.Duration) + SessionDetailInt(cmd, assets.MetadataTurns, info.Turns) + SessionDetailInt(cmd, assets.MetadataMessages, info.Messages) + BlankLine(cmd) + + SessionDetail(cmd, assets.MetadataInputUsage, info.TokensIn) + SessionDetail(cmd, assets.MetadataOutputUsage, info.TokensOut) + SessionDetail(cmd, assets.MetadataTotal, info.TokensAll) + BlankLine(cmd) +} + // SessionDetail prints a labeled metadata line to stdout. // // Parameters: @@ -311,7 +364,7 @@ func ListItem(cmd *cobra.Command, format string, args ...any) { if cmd == nil { return } - _, _ = fmt.Fprintf(cmd.OutOrStdout(), "- "+format+config.NewlineLF, args...) + _, _ = fmt.Fprintf(cmd.OutOrStdout(), "- "+format+token.NewlineLF, args...) } // NumberedItem prints a numbered item to stdout. @@ -350,3 +403,102 @@ func Hint(cmd *cobra.Command, text string) { } cmd.Println(text) } + +// LockUnlockNone prints the message when no journal entries are found (lock/unlock context). +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func LockUnlockNone(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(config2.TplJournalSyncNone) +} + +// LockUnlockEntry prints the confirmation for a single locked/unlocked entry. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - filename: journal filename. 
+// - verb: "locked" or "unlocked". +func LockUnlockEntry(cmd *cobra.Command, filename, verb string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config2.TplLockUnlockEntry, filename, verb)) +} + +// LockUnlockSummary prints the lock/unlock summary. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - verb: "locked" or "unlocked". +// - count: number of entries changed. Zero prints no-changes message. +func LockUnlockSummary(cmd *cobra.Command, verb string, count int) { + if cmd == nil { + return + } + if count == 0 { + cmd.Println(fmt.Sprintf(config2.TplLockUnlockNoChanges, verb)) + return + } + cmd.Println(fmt.Sprintf(config2.TplLockUnlockSummary, strings.ToUpper(verb[:1])+verb[1:], count)) // capitalize the single-word verb; avoids deprecated strings.Title +} + +// JournalSyncNone prints the message when no journal entries are found. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func JournalSyncNone(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(config2.TplJournalSyncNone) +} + +// JournalSyncLocked prints a single locked-entry confirmation. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - filename: the journal filename that was locked. +func JournalSyncLocked(cmd *cobra.Command, filename string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config2.TplJournalSyncLocked, filename)) +} + +// JournalSyncUnlocked prints a single unlocked-entry confirmation. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - filename: the journal filename that was unlocked. +func JournalSyncUnlocked(cmd *cobra.Command, filename string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config2.TplJournalSyncUnlocked, filename)) +} + +// JournalSyncSummary prints the sync summary: match, locked count, +// and/or unlocked count. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. 
+// - locked: number of newly locked entries. +// - unlocked: number of newly unlocked entries. +func JournalSyncSummary(cmd *cobra.Command, locked, unlocked int) { + if cmd == nil { + return + } + if locked == 0 && unlocked == 0 { + cmd.Println(config2.TplJournalSyncMatch) + return + } + if locked > 0 { + cmd.Println(fmt.Sprintf(config2.TplJournalSyncLockedCount, locked)) + } + if unlocked > 0 { + cmd.Println(fmt.Sprintf(config2.TplJournalSyncUnlockedCount, unlocked)) + } +} diff --git a/internal/write/remind.go b/internal/write/remind.go new file mode 100644 index 00000000..d1c2f25f --- /dev/null +++ b/internal/write/remind.go @@ -0,0 +1,87 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package write + +import ( + "fmt" + + "github.com/ActiveMemory/ctx/internal/write/config" + "github.com/spf13/cobra" +) + +// ReminderAdded prints the confirmation for a newly added reminder. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - id: reminder ID. +// - message: reminder text. +// - after: optional date gate (nil if none). +func ReminderAdded(cmd *cobra.Command, id int, message string, after *string) { + if cmd == nil { + return + } + suffix := "" + if after != nil { + suffix = fmt.Sprintf(config.TplReminderAfterSuffix, *after) + } + cmd.Println(fmt.Sprintf(config.TplReminderAdded, id, message, suffix)) +} + +// ReminderItem prints a single reminder in the list. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - id: reminder ID. +// - message: reminder text. +// - after: optional date gate (nil if none). +// - today: current date in YYYY-MM-DD format. 
+func ReminderItem(cmd *cobra.Command, id int, message string, after *string, today string) { + if cmd == nil { + return + } + annotation := "" + if after != nil && *after > today { + annotation = fmt.Sprintf(config.TplReminderNotDue, *after) + } + cmd.Println(fmt.Sprintf(config.TplReminderItem, id, message, annotation)) +} + +// ReminderDismissed prints the confirmation for a dismissed reminder. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - id: reminder ID. +// - message: reminder text. +func ReminderDismissed(cmd *cobra.Command, id int, message string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplReminderDismissed, id, message)) +} + +// ReminderNone prints the message when there are no reminders. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func ReminderNone(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(config.TplReminderNone) +} + +// ReminderDismissedAll prints the summary after dismissing all reminders. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - count: number of dismissed reminders. +func ReminderDismissedAll(cmd *cobra.Command, count int) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplReminderDismissedAll, count)) +} diff --git a/internal/write/session.go b/internal/write/session.go new file mode 100644 index 00000000..19d6240d --- /dev/null +++ b/internal/write/session.go @@ -0,0 +1,50 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package write + +import ( + "fmt" + "github.com/ActiveMemory/ctx/internal/write/config" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// SessionPaused prints confirmation that hooks were paused. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - sessionID: the session identifier. 
+func SessionPaused(cmd *cobra.Command, sessionID string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplPaused, sessionID)) +} + +// SessionResumed prints confirmation that hooks were resumed. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - sessionID: the session identifier. +func SessionResumed(cmd *cobra.Command, sessionID string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplResumed, sessionID)) +} + +// SessionWrappedUp prints confirmation that the wrap-up marker was written. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func SessionWrappedUp(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(assets.TextDesc(assets.TextDescKeyMarkWrappedUpConfirmed)) +} diff --git a/internal/write/status.go b/internal/write/status.go new file mode 100644 index 00000000..01aac7f9 --- /dev/null +++ b/internal/write/status.go @@ -0,0 +1,88 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package write + +import ( + "fmt" + "github.com/ActiveMemory/ctx/internal/write/config" + "github.com/spf13/cobra" +) + +// StatusFileInfo holds prepared data for a single file in status output. +type StatusFileInfo struct { + Indicator string + Name string + Status string + Tokens int + Size int64 + Preview []string +} + +// StatusActivityInfo holds prepared data for a recent activity entry. +type StatusActivityInfo struct { + Name string + Ago string +} + +// StatusHeader prints the status heading and summary block. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - dir: Context directory path. +// - fileCount: Number of context files. +// - totalTokens: Estimated total token count. 
+func StatusHeader(cmd *cobra.Command, dir string, fileCount, totalTokens int) { + if cmd == nil { + return + } + cmd.Println(config.TplStatusTitle) + cmd.Println(config.TplStatusSeparator) + cmd.Println() + cmd.Println(fmt.Sprintf(config.TplStatusDir, dir)) + cmd.Println(fmt.Sprintf(config.TplStatusFiles, fileCount)) + cmd.Println(fmt.Sprintf(config.TplStatusTokens, FormatNumber(totalTokens))) + cmd.Println() + cmd.Println(config.TplStatusFilesHeader) +} + +// StatusFileItem prints a single file entry in the status list. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - f: Prepared file info. +// - verbose: If true, include tokens, size, and preview. +func StatusFileItem(cmd *cobra.Command, f StatusFileInfo, verbose bool) { + if cmd == nil { + return + } + if verbose { + cmd.Println(fmt.Sprintf(config.TplStatusFileVerbose, + f.Indicator, f.Name, f.Status, + FormatNumber(f.Tokens), FormatBytes(f.Size))) + for _, line := range f.Preview { + cmd.Println(fmt.Sprintf(config.TplStatusPreviewLine, line)) + } + } else { + cmd.Println(fmt.Sprintf(config.TplStatusFileCompact, f.Indicator, f.Name, f.Status)) + } +} + +// StatusActivity prints the recent activity section. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - entries: Recent activity entries. +func StatusActivity(cmd *cobra.Command, entries []StatusActivityInfo) { + if cmd == nil { + return + } + cmd.Println() + cmd.Println(config.TplStatusActivityHeader) + for _, e := range entries { + cmd.Println(fmt.Sprintf(config.TplStatusActivityItem, e.Name, e.Ago)) + } +} diff --git a/internal/write/sync.go b/internal/write/sync.go index ec8505a2..dd9b8d73 100644 --- a/internal/write/sync.go +++ b/internal/write/sync.go @@ -7,120 +7,69 @@ package write import ( + "fmt" + "github.com/ActiveMemory/ctx/internal/write/config" "github.com/spf13/cobra" ) -// DryRun prints the dry-run header to stdout. 
+// SyncDryRun prints the full dry-run plan block: header, source path, +// mirror path, and drift status. // // Parameters: // - cmd: Cobra command for output. Nil is a no-op. -func DryRun(cmd *cobra.Command) { +// - sourcePath: absolute path to MEMORY.md. +// - mirrorPath: relative mirror path. +// - hasDrift: whether the source has changed since last sync. +func SyncDryRun(cmd *cobra.Command, sourcePath, mirrorPath string, hasDrift bool) { if cmd == nil { return } - cmd.Println(tplDryRun) -} - -// Source prints an indented source path line to stdout. -// -// Parameters: -// - cmd: Cobra command for output. Nil is a no-op. -// - path: the source file path to display. -func Source(cmd *cobra.Command, path string) { - if cmd == nil { - return - } - sprintf(cmd, tplSource, path) -} - -// Mirror prints an indented mirror path line to stdout. -// -// Parameters: -// - cmd: Cobra command for output. Nil is a no-op. -// - relativePath: the mirror path relative to the project root. -func Mirror(cmd *cobra.Command, relativePath string) { - if cmd == nil { - return - } - sprintf(cmd, tplMirror, relativePath) -} - -// StatusDrift prints that drift was detected. -// -// Parameters: -// - cmd: Cobra command for output. Nil is a no-op. -func StatusDrift(cmd *cobra.Command) { - if cmd == nil { - return - } - cmd.Println(tplStatusDrift) -} - -// StatusNoDrift prints that no drift was detected. -// -// Parameters: -// - cmd: Cobra command for output. Nil is a no-op. -func StatusNoDrift(cmd *cobra.Command) { - if cmd == nil { - return + cmd.Println(config.TplDryRun) + cmd.Println(fmt.Sprintf(config.TplSource, sourcePath)) + cmd.Println(fmt.Sprintf(config.TplMirror, mirrorPath)) + if hasDrift { + cmd.Println(config.TplStatusDrift) + } else { + cmd.Println(config.TplStatusNoDrift) } - cmd.Println(tplStatusNoDrift) } -// Archived prints that a previous file was archived. 
+// SyncResult prints the full sync result block: optional archive notice, +// synced confirmation, source path, line counts, and optional new content. // // Parameters: // - cmd: Cobra command for output. Nil is a no-op. -// - filename: the archive filename (basename, not full path). -func Archived(cmd *cobra.Command, filename string) { +// - sourceLabel: source label (e.g. "MEMORY.md"). +// - mirrorPath: relative mirror path. +// - sourcePath: absolute source path for display. +// - archivedTo: archive basename, or empty if no archive was created. +// - sourceLines: current source line count. +// - mirrorLines: previous mirror line count. +func SyncResult( + cmd *cobra.Command, + sourceLabel, mirrorPath, sourcePath, archivedTo string, + sourceLines, mirrorLines int, +) { if cmd == nil { return } - sprintf(cmd, tplArchived, filename) -} - -// Synced prints that a sync completed successfully. -// -// Parameters: -// - cmd: Cobra command for output. Nil is a no-op. -// - source: label for the source (e.g. "MEMORY.md"). -// - destination: relative path to the destination. -func Synced(cmd *cobra.Command, source, destination string) { - if cmd == nil { - return + if archivedTo != "" { + cmd.Println(fmt.Sprintf(config.TplArchived, archivedTo)) } - sprintf(cmd, tplSynced, source, destination) -} + cmd.Println(fmt.Sprintf(config.TplSynced, sourceLabel, mirrorPath)) + cmd.Println(fmt.Sprintf(config.TplSource, sourcePath)) -// Lines prints a line count, optionally including the previous count. -// -// Parameters: -// - cmd: Cobra command for output. Nil is a no-op. -// - count: current line count. -// - previous: previous line count. Zero omits the "(was N)" suffix. 
-func Lines(cmd *cobra.Command, count, previous int) { - if cmd == nil { - return - } - line := tplLines - if previous > 0 { - line += tplLinesPrevious - sprintf(cmd, line, count, previous) - return + line := config.TplLines + if mirrorLines > 0 { + line += config.TplLinesPrevious + cmd.Println(fmt.Sprintf(line, sourceLines, mirrorLines)) + } else { + cmd.Println(fmt.Sprintf(line, sourceLines)) } - sprintf(cmd, line, count) -} -// NewContent prints how many new lines appeared since the last sync. -// -// Parameters: -// - cmd: Cobra command for output. Nil is a no-op. -// - count: number of new lines. -func NewContent(cmd *cobra.Command, count int) { - if cmd == nil { - return + if sourceLines > mirrorLines { + cmd.Println(fmt.Sprintf(config.TplNewContent, sourceLines-mirrorLines)) } - sprintf(cmd, tplNewContent, count) } // ErrAutoMemoryNotActive prints an informational stderr message when diff --git a/internal/write/sync/ctxsync.go b/internal/write/sync/ctxsync.go new file mode 100644 index 00000000..1a358d20 --- /dev/null +++ b/internal/write/sync/ctxsync.go @@ -0,0 +1,79 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package sync + +import ( + "fmt" + + "github.com/ActiveMemory/ctx/internal/write/config" + "github.com/spf13/cobra" +) + +// CtxSyncInSync prints the all-clear message when context is in sync. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func CtxSyncInSync(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(config.TplSyncInSync) +} + +// CtxSyncHeader prints the sync analysis heading and optional dry-run notice. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - dryRun: If true, includes the dry-run notice. 
+func CtxSyncHeader(cmd *cobra.Command, dryRun bool) { + if cmd == nil { + return + } + cmd.Println(config.TplSyncHeader) + cmd.Println(config.TplSyncSeparator) + cmd.Println() + if dryRun { + cmd.Println(config.TplSyncDryRun) + cmd.Println() + } +} + +// CtxSyncAction prints a single sync action item with optional suggestion. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - index: 1-based action number. +// - actionType: Action type label (e.g. "DEPS", "CONFIG"). +// - description: Action description. +// - suggestion: Optional suggestion text (empty string skips). +func CtxSyncAction(cmd *cobra.Command, index int, actionType, description, suggestion string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(config.TplSyncAction, index, actionType, description)) + if suggestion != "" { + cmd.Println(fmt.Sprintf(config.TplSyncSuggestion, suggestion)) + } + cmd.Println() +} + +// CtxSyncSummary prints the sync summary with item count. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - count: Number of sync items found. +// - dryRun: If true, shows the dry-run variant. +func CtxSyncSummary(cmd *cobra.Command, count int, dryRun bool) { + if cmd == nil { + return + } + if dryRun { + cmd.Println(fmt.Sprintf(config.TplSyncDryRunSummary, count)) + } else { + cmd.Println(fmt.Sprintf(config.TplSyncSummary, count)) + } +} diff --git a/internal/write/sync/doc.go b/internal/write/sync/doc.go new file mode 100644 index 00000000..fdbc3f6e --- /dev/null +++ b/internal/write/sync/doc.go @@ -0,0 +1,8 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +// Package sync provides formatted output helpers for the sync command. 
+package sync diff --git a/internal/write/task.go b/internal/write/task.go new file mode 100644 index 00000000..7484942d --- /dev/null +++ b/internal/write/task.go @@ -0,0 +1,123 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package write + +import ( + "fmt" + + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// ArchiveSkipping prints a notice that a task block was skipped due to +// incomplete children. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - taskText: the parent task description. +func ArchiveSkipping(cmd *cobra.Command, taskText string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyTaskArchiveSkipping), taskText)) +} + +// ArchiveSkipIncomplete prints a summary when no tasks could be archived +// due to incomplete children. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - skippedCount: number of skipped task blocks. +func ArchiveSkipIncomplete(cmd *cobra.Command, skippedCount int) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyTaskArchiveSkipIncomplete), skippedCount)) +} + +// ArchiveNoCompleted prints a message when there are no completed tasks +// to archive. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func ArchiveNoCompleted(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(assets.TextDesc(assets.TextDescKeyTaskArchiveNoCompleted)) +} + +// ArchiveDryRun prints the dry-run preview for task archiving. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - archivableCount: number of tasks that would be archived. +// - pendingCount: number of pending tasks remaining. +// - preview: the archived content preview string. +// - separator: the separator string for framing the preview. 
+func ArchiveDryRun(cmd *cobra.Command, archivableCount, pendingCount int, preview, separator string) { + if cmd == nil { + return + } + cmd.Println(assets.TextDesc(assets.TextDescKeyTaskArchiveDryRunHeader)) + cmd.Println() + cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyTaskArchiveDryRunSummary), archivableCount, pendingCount)) + cmd.Println() + cmd.Println(assets.TextDesc(assets.TextDescKeyTaskArchiveContentPreview)) + cmd.Println(separator) + cmd.Print(preview) + cmd.Println(separator) +} + +// ArchiveSuccess prints the result of a successful task archive operation. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - archivedCount: number of tasks archived. +// - archiveFilePath: path to the created archive file. +// - pendingCount: number of pending tasks remaining. +func ArchiveSuccess(cmd *cobra.Command, archivedCount int, archiveFilePath string, pendingCount int) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyTaskArchiveSuccess), archivedCount, archiveFilePath)) + cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyTaskArchivePendingRemain), pendingCount)) +} + +// SnapshotSaved prints the result of a successful task snapshot. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - snapshotPath: path to the created snapshot file. +func SnapshotSaved(cmd *cobra.Command, snapshotPath string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyTaskSnapshotSaved), snapshotPath)) +} + +// SnapshotContent builds the snapshot file content with header and body. +// +// Parameters: +// - name: snapshot name. +// - created: RFC3339 formatted creation timestamp. +// - separator: the separator string. +// - nl: newline string. +// - body: the original TASKS.md content. +// +// Returns: +// - string: formatted snapshot content. 
+func SnapshotContent(name, created, separator, nl, body string) string { + return fmt.Sprintf( + assets.TextDesc(assets.TextDescKeyTaskSnapshotHeaderFormat)+ + nl+nl+ + assets.TextDesc(assets.TextDescKeyTaskSnapshotCreatedFormat)+ + nl+nl+separator+nl+nl+"%s", + name, created, body, + ) +} diff --git a/internal/write/watch.go b/internal/write/watch.go new file mode 100644 index 00000000..12e45fcf --- /dev/null +++ b/internal/write/watch.go @@ -0,0 +1,98 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package write + +import ( + "fmt" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// WatchWatching prints the initial "watching" status line. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func WatchWatching(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(assets.TextDesc(assets.TextDescKeyWatchWatching)) +} + +// WatchDryRun prints the dry-run notice. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func WatchDryRun(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(assets.TextDesc(assets.TextDescKeyWatchDryRun)) +} + +// WatchStopHint prints the Ctrl+C stop hint. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func WatchStopHint(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(assets.TextDesc(assets.TextDescKeyWatchStopHint)) +} + +// WatchCloseLogError prints a log file close error. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - err: the close error. +func WatchCloseLogError(cmd *cobra.Command, err error) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyWatchCloseLogError), err)) +} + +// WatchDryRunPreview prints a dry-run preview of an update that would be applied. 
+// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - updateType: the context update type. +// - content: the update content. +func WatchDryRunPreview(cmd *cobra.Command, updateType, content string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyWatchDryRunPreview), updateType, content)) +} + +// WatchApplyFailed prints a failure message for an update that could not be applied. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - updateType: the context update type. +// - err: the apply error. +func WatchApplyFailed(cmd *cobra.Command, updateType string, err error) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyWatchApplyFailed), updateType, err)) +} + +// WatchApplySuccess prints a success message for an applied update. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - updateType: the context update type. +// - content: the update content. +func WatchApplySuccess(cmd *cobra.Command, updateType, content string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyWatchApplySuccess), updateType, content)) +} diff --git a/internal/write/why.go b/internal/write/why.go new file mode 100644 index 00000000..ad1249aa --- /dev/null +++ b/internal/write/why.go @@ -0,0 +1,49 @@ +// / ctx: https://ctx.ist +// ,'`./ do you remember? +// `.,'\ +// \ Copyright 2026-present Context contributors. +// SPDX-License-Identifier: Apache-2.0 + +package write + +import ( + "fmt" + "github.com/spf13/cobra" + + "github.com/ActiveMemory/ctx/internal/assets" +) + +// WhyBanner prints the ctx ASCII art banner for the why menu. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func WhyBanner(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Println(assets.TextDesc(assets.TextDescKeyWhyBanner)) +} + +// WhyMenuItem prints a numbered menu item. 
+// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +// - index: 1-based menu index. +// - label: display label for the document. +func WhyMenuItem(cmd *cobra.Command, index int, label string) { + if cmd == nil { + return + } + cmd.Println(fmt.Sprintf(assets.TextDesc(assets.TextDescKeyWhyMenuItemFormat), index, label)) +} + +// WhyMenuPrompt prints the selection prompt. +// +// Parameters: +// - cmd: Cobra command for output. Nil is a no-op. +func WhyMenuPrompt(cmd *cobra.Command) { + if cmd == nil { + return + } + cmd.Print(assets.TextDesc(assets.TextDescKeyWhyMenuPrompt)) +} diff --git a/specs/mcp-server.md b/specs/mcp-server.md index 40460637..2d1d020f 100644 --- a/specs/mcp-server.md +++ b/specs/mcp-server.md @@ -325,7 +325,7 @@ server resolves it via the fallback chain (roots > flag > CWD). | `ctx_complete` | `cli/complete` logic | Yes | Mark a task done by number or text | **Critical**: `ctx_complete` **MUST** delegate to the same code path as -`ctx complete`, not reimplement task parsing: +`ctx tasks complete`, not reimplement task parsing: PR #27 reimplements this in `tools.go` (*~60 lines of task matching logic*): This **must** use the existing `internal/task` package and the complete