diff --git a/CHANGELOG.md b/CHANGELOG.md index 1f28b75c9..c36396c90 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -8,8 +8,21 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 ## [Unreleased] +### Added + +- **Package namespaces.** Declare `namespace: acme` in `apm.yml` to install skills under direct runtime-visible directories like `skills/acme-<skill-name>/` -- ship multiple packages from the same org without name collisions while staying compatible with harnesses that expect direct child skill folders. The namespace flows from the manifest into `apm.lock.yaml`, the install tree, and per-skill confirmation messages so the routing is visible end-to-end. (#1028) + ### Changed +- **BREAKING: `apm compile --target vscode/all` no longer generates `.github/copilot-instructions.md`.** APM now leaves that file alone, matching its "additive, never overwrite user files" contract. If your workflow consumed this generated file, copy your last generated version into the repo and manage it manually, or move the content into a dedicated APM package. (#1028) + + ```bash + # Before: apm compile --target vscode/all wrote this file for you. + # After: the file is repository-owned. To keep the previous generated copy: + git show HEAD~1:.github/copilot-instructions.md > .github/copilot-instructions.md + ``` + + See `docs/src/content/docs/getting-started/migration.md` for CI guidance. - **BREAKING: `apm pack` now produces a Claude Code plugin directory by default — zero extra flags, schema-validated `plugin.json`, convention dirs auto-discovered.** The legacy APM bundle layout is preserved under `--format apm`. Migration: CI workflows and scripts that consume the legacy bundle must add `--format apm` (the [`microsoft/apm-action`](https://github.com/microsoft/apm-action) wrapper has been updated accordingly). 
(#1061) - **Plugin manifest schema conformance.** The synthesized/written `plugin.json` no longer emits `agents`/`skills`/`commands`/`instructions` keys pointing at the convention directories — these are auto-discovered by Claude Code, and per the [official schema](https://json.schemastore.org/claude-code-plugin.json) those array entries must be `./*.md` paths to *additional* files. The convention dirs themselves are still copied to disk. When stripping such keys from an authored `plugin.json`, `apm pack` now emits a warning so authors can clean up their source. (#1061) diff --git a/docs/src/content/docs/getting-started/migration.md b/docs/src/content/docs/getting-started/migration.md index 78a06b93b..f87e3d5aa 100644 --- a/docs/src/content/docs/getting-started/migration.md +++ b/docs/src/content/docs/getting-started/migration.md @@ -5,7 +5,23 @@ sidebar: order: 5 --- -APM is additive. It never deletes, overwrites, or modifies your existing configuration files. Your current `.github/copilot-instructions.md`, `AGENTS.md`, `.claude/` config, `.cursor-rules` — all stay exactly where they are, untouched. +APM is additive. It never deletes, overwrites, or modifies your existing configuration files. Your current `.github/copilot-instructions.md`, `AGENTS.md`, `.claude/` config, `.cursor-rules` -- all stay exactly where they are, untouched. + +:::caution[Unreleased compile change] +`apm compile --target vscode` and `apm compile --target all` no longer write `.github/copilot-instructions.md`. Existing files stay in place, but APM will not regenerate that path. + +Before: `apm compile --target vscode` generated `AGENTS.md`, `.github/` primitives, and `.github/copilot-instructions.md`. + +After: `apm compile --target vscode` generates `AGENTS.md` and `.github/` primitives only. 
+ +To keep the last generated file from the previous commit: + +```bash +git show HEAD~1:.github/copilot-instructions.md > .github/copilot-instructions.md +``` + +For CI, remove assertions that expect APM to regenerate `.github/copilot-instructions.md`, or commit the file and manage it as a normal repository-owned file. +::: ## Add APM in three steps diff --git a/docs/src/content/docs/guides/skills.md b/docs/src/content/docs/guides/skills.md index 065aa5b90..9df8b2061 100644 --- a/docs/src/content/docs/guides/skills.md +++ b/docs/src/content/docs/guides/skills.md @@ -322,6 +322,52 @@ name: my-project target: vscode # or claude, or all ``` +## Package Namespaces + +If you publish multiple APM packages from the same org, skills from different +packages can otherwise collide in the flat layout. Namespaces isolate them: +`acme-tools` and `acme-infra` can each own a skill directory without overwriting +the other package's files. + +Package authors opt in with `namespace` in `apm.yml`: + +```yaml +name: acme-tools +version: 1.0.0 +namespace: acme +type: skill +``` + +With this manifest, APM deploys package-owned skills under +`.github/skills/acme-<skill-name>/` (and the equivalent skills directory for +other targets). The skill remains a direct child of the target `skills/` +directory so GitHub Copilot, Claude, Codex, and other harnesses can discover it +without relying on recursive skill lookup. Packages without `namespace` keep +the legacy flat layout. + +Before: + +```text +.github/ +└── skills/ + └── brand-guidelines/ +``` + +After: + +```text +.github/ +└── skills/ + └── acme-brand-guidelines/ +``` + +Use a namespace when your org publishes more than one package, when multiple +teams publish skills with common names like `review` or `brand-guidelines`, or +when you want dependency-owned skills to be visually separate from local skills. 
+Adding or changing a namespace is a consumer-visible path change; run +`apm install` after updating the manifest and remove any old flat-path skill +directories only after confirming they are no longer referenced. + ## Best Practices ### 1. Clear Naming diff --git a/docs/src/content/docs/reference/cli-commands.md b/docs/src/content/docs/reference/cli-commands.md index 7e8ab8e14..f1cc06328 100644 --- a/docs/src/content/docs/reference/cli-commands.md +++ b/docs/src/content/docs/reference/cli-commands.md @@ -37,6 +37,7 @@ apm init [PROJECT_NAME] [OPTIONS] - `-y, --yes` - Skip interactive prompts and use auto-detected defaults - `--plugin` - Initialize as a plugin authoring project (creates `plugin.json` + `apm.yml` with `devDependencies`) - `--marketplace` - Seed `apm.yml` with a `marketplace:` authoring block. See the [Authoring a marketplace guide](../../guides/marketplace-authoring/). +- `--namespace TEXT` - Add an optional package namespace so package-owned skills install under `skills/<namespace>-<skill-name>/` **Examples:** ```bash @@ -57,11 +58,15 @@ apm init my-plugin --plugin # Initialize a project that also publishes a marketplace apm init my-marketplace --marketplace + +# Initialize with a package namespace for owned skills +apm init my-package --namespace acme ``` **Behavior:** - **Minimal by default**: Creates only `apm.yml` with auto-detected metadata - **Interactive mode**: Prompts for project details unless `--yes` specified +- **Namespace** (`--namespace`): Writes `namespace:` to `apm.yml`; interactive mode also prompts for it and allows an empty value - **Auto-detection**: Automatically detects author from `git config user.name` and description from project context - **Brownfield friendly**: Works cleanly in existing projects without file pollution - **Plugin mode** (`--plugin`): Creates both `plugin.json` and `apm.yml` with an empty `devDependencies` section. 
Plugin names must be kebab-case (`^[a-z][a-z0-9-]{0,63}$`), max 64 characters @@ -131,6 +136,8 @@ See [Dependencies: Transport selection](../../guides/dependencies/#transport-sel - `apm install <package>`: Installs **only** the specified package (adds to `apm.yml` if not present) - Each `http://` dependency is warned at install time before any fetch begins - Transitive `http://` dependencies are allowed automatically when they use the same host as a direct insecure dependency you approved with `--allow-insecure`; other transitive hosts require `--allow-insecure-host HOSTNAME` +- Packages that declare `namespace:` in `apm.yml` install package-owned skills under `skills/<namespace>-<skill-name>/` and show the namespace identity in install output, for example `Integrated skill acme/brand-guidelines -> .github/skills/acme-brand-guidelines/` +- `apm install --verbose` keeps the same namespace-aware install tree while adding the usual file-level diagnostic detail; namespace routing is reported from the install summary layer, not as a second per-skill line **Claude Code: prompt `input:` -> slash command `arguments:`:** @@ -326,6 +333,7 @@ Skills are copied directly to target directories: - **Primary**: `.github/skills/{skill-name}/` — Entire skill folder copied - **Compatibility**: `.claude/skills/{skill-name}/` — Also copied if `.claude/` folder exists +- **Namespaced packages**: `.github/skills/{namespace}-{skill-name}/` and `.claude/skills/{namespace}-{skill-name}/` **Example Integration Output**: ``` @@ -1712,6 +1720,11 @@ target: [claude, copilot] # multiple targets -- only these are compiled/install | `gemini` | GEMINI.md, .gemini/commands/, .gemini/skills/ | Gemini CLI | | `all` | All of the above | Universal compatibility | +`apm compile --target vscode` and `apm compile --target all` no longer write +`.github/copilot-instructions.md`. Existing files at that path are left under +your control; keep them manually or move the content into an APM package if your +workflow still depends on that file. 
+ **Examples:** ```bash # Basic compilation with auto-detected context diff --git a/docs/src/content/docs/reference/lockfile-spec.md b/docs/src/content/docs/reference/lockfile-spec.md index 699645964..3b1ba80a6 100644 --- a/docs/src/content/docs/reference/lockfile-spec.md +++ b/docs/src/content/docs/reference/lockfile-spec.md @@ -121,6 +121,7 @@ fields: | `depth` | integer | MUST | Dependency depth. `1` = direct dependency, `2`+ = transitive. | | `resolved_by` | string | MAY | `repo_url` of the parent that introduced this transitive dependency. Present only when `depth >= 2`. | | `package_type` | string | MUST | Package type: `apm_package`, `plugin`, `virtual`, or other registered types. | +| `namespace` | string | MAY | Manifest-declared namespace applied to package-owned skills. Omitted for legacy flat installs. | | `content_hash` | string | MAY | SHA-256 hash of the package file tree, in the format `"sha256:<hex>"`. Used to verify cached packages on subsequent installs. Omitted for local path dependencies. See [section 4.4](#44-content-integrity). | | `is_dev` | boolean | MAY | `true` if the dependency was resolved through [`devDependencies`](../manifest-schema/#5-devdependencies). Omitted when `false`. Dev deps are excluded from `apm pack` plugin output (and from `--format apm` bundles). | | `deployed_files` | array of strings | MUST | Every file path APM deployed for this dependency, relative to project root. | diff --git a/docs/src/content/docs/reference/manifest-schema.md b/docs/src/content/docs/reference/manifest-schema.md index ed8eb38b7..a0877ab42 100644 --- a/docs/src/content/docs/reference/manifest-schema.md +++ b/docs/src/content/docs/reference/manifest-schema.md @@ -47,6 +47,7 @@ author: license: target: type: +namespace: scripts: > includes: > dependencies: @@ -158,7 +159,36 @@ Declares how the package's content is processed during install and compile. Curr | `hybrid` | Both AGENTS.md compilation and skill installation. 
| | `prompts` | Commands/prompts only. No instructions or skills. | -### 3.8. `scripts` +### 3.8. `namespace` + +| | | +|---|---| +| **Type** | `string` | +| **Required** | OPTIONAL | +| **Pattern** | `^[a-z0-9](?:[a-z0-9]|-(?!-))*[a-z0-9]$|^[a-z0-9]$` | +| **Description** | Optional namespace for package-owned skills. Max 64 characters. Consecutive hyphens (`--`) are not allowed. | + +When present, installed native skills and promoted `.apm/skills/` entries are +deployed under `skills/<namespace>-<skill-name>/` instead of the legacy flat +`skills/<skill-name>/` layout. The namespace remains a direct child directory +prefix because target harnesses document direct skill directories, not a +portable recursive `skills/**/SKILL.md` contract. Packages without `namespace` +continue to install flat for backward compatibility. + +Namespace values MUST be a single safe path segment. Resolvers MUST reject empty +values, traversal (`.` or `..`), path separators, uppercase characters, and +filesystem-unsafe punctuation. + +See also: [Package Namespaces](../../guides/skills/#package-namespaces). + +```yaml +name: acme-tools +version: 1.0.0 +namespace: acme +type: skill +``` + +### 3.9. `scripts` | | | |---|---| @@ -168,7 +198,7 @@ Declares how the package's content is processed during install and compile. Curr | **Value** | Shell command string | | **Description** | Named commands executed via `apm run <name>`. MUST support `--param key=value` substitution. | -### 3.9. `includes` +### 3.10. `includes` | | | |---|---| @@ -200,7 +230,7 @@ includes: auto When `policy.manifest.require_explicit_includes` is `true` (see [Governance guide](../../enterprise/governance-guide/)), only form 3 passes the policy check; `auto` and undeclared are rejected at install/audit time by the `explicit-includes` policy check (not at YAML parse time). -### 3.10. `policy` +### 3.11. 
`policy` | | | |---|---| diff --git a/packages/apm-guide/.apm/skills/apm-usage/package-authoring.md b/packages/apm-guide/.apm/skills/apm-usage/package-authoring.md index 843927210..44a882856 100644 --- a/packages/apm-guide/.apm/skills/apm-usage/package-authoring.md +++ b/packages/apm-guide/.apm/skills/apm-usage/package-authoring.md @@ -99,6 +99,22 @@ the same way at every entry point. Invalid values fail at parse time with a message naming the apm.yml path and the offending token -- they do **not** silently fall through to auto-detect. +## Manifest fields: `namespace:` (optional) + +Declare `namespace: <value>` at the top level of `apm.yml` to install the +package's skills under `skills/<namespace>-<skill-name>/` instead of the legacy +flat `skills/<skill-name>/` layout. This lets one org publish multiple +packages (e.g. `acme-security`, `acme-brand`) without skill-name collisions. + +- The segment must be kebab-case: lowercase letters, digits, and hyphens, max + 64 characters, no leading/trailing hyphen, no consecutive `--`. +- Omitting `namespace:` keeps the legacy flat layout. +- The namespace flows from `apm.yml` into `apm.lock.yaml` (per-dependency + `namespace:` field) and surfaces in `apm install` tree output as + `Integrated skill <namespace>/<skill-name> -> .github/skills/<namespace>-<skill-name>/`. +- `--verbose` keeps the same namespace-aware install tree while adding the + usual file-level detail; namespace routing is reported from the install + summary line, not as a second per-skill line. + | Form | Behaviour | |------|-----------| | `target: copilot` | Single token; allowed values: `vscode`, `agents`, `copilot`, `claude`, `cursor`, `opencode`, `codex`, `all` | diff --git a/src/apm_cli/commands/_helpers.py b/src/apm_cli/commands/_helpers.py index 4cd8b4f9c..f24c53012 100644 --- a/src/apm_cli/commands/_helpers.py +++ b/src/apm_cli/commands/_helpers.py @@ -490,6 +490,8 @@ def _create_minimal_apm_yml(config, plugin=False, target_path=None): # gate what gets deployed. 
"includes": "auto", } + if config.get("namespace"): + apm_yml_data["namespace"] = config["namespace"] if plugin: apm_yml_data["devDependencies"] = {"apm": []} diff --git a/src/apm_cli/commands/init.py b/src/apm_cli/commands/init.py index 3d7b558e9..7100f4999 100644 --- a/src/apm_cli/commands/init.py +++ b/src/apm_cli/commands/init.py @@ -8,11 +8,13 @@ from ..constants import APM_YML_FILENAME from ..core.command_logger import CommandLogger +from ..models.apm_package import validate_namespace from ..utils.console import ( _create_files_table, _rich_panel, ) from ._helpers import ( + ERROR, INFO, RESET, _create_minimal_apm_yml, @@ -39,9 +41,13 @@ is_flag=True, help="Seed apm.yml with a 'marketplace:' authoring block", ) +@click.option( + "--namespace", + help="Optional namespace to prevent skill name collisions", +) @click.option("--verbose", "-v", is_flag=True, help="Show detailed output") @click.pass_context -def init(ctx, project_name, yes, plugin, marketplace_flag, verbose): +def init(ctx, project_name, yes, plugin, marketplace_flag, namespace, verbose): """Initialize a new APM project (like npm init). Creates a minimal apm.yml with auto-detected metadata. 
@@ -50,6 +56,9 @@ def init(ctx, project_name, yes, plugin, marketplace_flag, verbose): """ logger = CommandLogger("init", verbose=verbose) try: + if namespace: + namespace = validate_namespace(namespace) + # Handle explicit current directory if project_name == ".": project_name = None @@ -100,10 +109,12 @@ def init(ctx, project_name, yes, plugin, marketplace_flag, verbose): # Get project configuration (interactive mode or defaults) if not yes: - config = _interactive_project_setup(final_project_name, logger) + config = _interactive_project_setup(final_project_name, logger, namespace=namespace) else: # Use auto-detected defaults config = _get_default_config(final_project_name) + if namespace: + config["namespace"] = namespace # Plugin mode uses 0.1.0 as default version if plugin and yes: @@ -209,7 +220,7 @@ def init(ctx, project_name, yes, plugin, marketplace_flag, verbose): sys.exit(1) -def _interactive_project_setup(default_name, logger): +def _interactive_project_setup(default_name, logger, namespace=None): """Interactive setup for new APM projects with auto-detection.""" from ._helpers import _auto_detect_author, _auto_detect_description, _validate_project_name @@ -239,11 +250,19 @@ def _interactive_project_setup(default_name, logger): version = Prompt.ask("Version", default="1.0.0").strip() description = Prompt.ask("Description", default=auto_description).strip() author = Prompt.ask("Author", default=auto_author).strip() + if not namespace: + namespace = Prompt.ask("Namespace (optional)", default="").strip() + else: + namespace = namespace.strip() + if namespace: + namespace = validate_namespace(namespace) summary_content = f"""name: {name} version: {version} description: {description} author: {author}""" + if namespace: + summary_content += f"\nnamespace: {namespace}" console.print(Panel(summary_content, title="About to create", border_style="cyan")) if not Confirm.ask("\nIs this OK?", default=True): @@ -267,20 +286,31 @@ def 
_interactive_project_setup(default_name, logger): version = click.prompt("Version", default="1.0.0").strip() description = click.prompt("Description", default=auto_description).strip() author = click.prompt("Author", default=auto_author).strip() + if not namespace: + namespace = click.prompt("Namespace (optional)", default="").strip() + else: + namespace = namespace.strip() + if namespace: + namespace = validate_namespace(namespace) click.echo(f"\n{INFO}About to create:{RESET}") click.echo(f" name: {name}") click.echo(f" version: {version}") click.echo(f" description: {description}") click.echo(f" author: {author}") + if namespace: + click.echo(f" namespace: {namespace}") if not click.confirm("\nIs this OK?", default=True): logger.progress("Aborted.") sys.exit(0) - return { + config = { "name": name, "version": version, "description": description, "author": author, } + if namespace: + config["namespace"] = namespace + return config diff --git a/src/apm_cli/deps/lockfile.py b/src/apm_cli/deps/lockfile.py index e2c20333c..0bca2e068 100644 --- a/src/apm_cli/deps/lockfile.py +++ b/src/apm_cli/deps/lockfile.py @@ -11,7 +11,7 @@ import yaml -from ..models.apm_package import DependencyReference +from ..models.apm_package import DependencyReference, validate_namespace logger = logging.getLogger(__name__) @@ -34,6 +34,7 @@ class LockedDependency: depth: int = 1 resolved_by: str | None = None package_type: str | None = None + namespace: str | None = None deployed_files: list[str] = field(default_factory=list) deployed_file_hashes: dict[str, str] = field(default_factory=dict) source: str | None = None # "local" for local deps, None/absent for remote @@ -79,6 +80,8 @@ def to_dict(self) -> dict[str, Any]: result["resolved_by"] = self.resolved_by if self.package_type: result["package_type"] = self.package_type + if self.namespace: + result["namespace"] = self.namespace if self.deployed_files: result["deployed_files"] = sorted(self.deployed_files) if self.deployed_file_hashes: @@ 
-131,6 +134,13 @@ def from_dict(cls, data: dict[str, Any]) -> "LockedDependency": if _p_int is not None and 1 <= _p_int <= 65535: port = _p_int + namespace = data.get("namespace") + if namespace is not None: + try: + namespace = validate_namespace(namespace) + except ValueError: + namespace = None + return cls( repo_url=data["repo_url"], host=data.get("host"), @@ -144,6 +154,7 @@ def from_dict(cls, data: dict[str, Any]) -> "LockedDependency": depth=data.get("depth", 1), resolved_by=data.get("resolved_by"), package_type=data.get("package_type"), + namespace=namespace, deployed_files=deployed_files, deployed_file_hashes=dict(data.get("deployed_file_hashes") or {}), source=data.get("source"), diff --git a/src/apm_cli/drift.py b/src/apm_cli/drift.py index dc70eba9f..39296bdf8 100644 --- a/src/apm_cli/drift.py +++ b/src/apm_cli/drift.py @@ -249,20 +249,25 @@ def build_download_ref( locked_dep = existing_lockfile.get_dependency(dep_ref.get_unique_key()) if locked_dep: overrides: dict[str, Any] = {} + locked_host = getattr(locked_dep, "host", None) + locked_registry_prefix = getattr(locked_dep, "registry_prefix", None) + locked_resolved_ref = getattr(locked_dep, "resolved_ref", None) # Prefer the lockfile host so re-installs fetch from the exact same # source (proxy host preserved) — fixes air-gapped reproducibility. # When registry_prefix is set, also restore the artifactory_prefix # field on dep_ref so the downloader takes the proxy code-path and # uses PROXY_REGISTRY_TOKEN for auth instead of the GitHub PAT. 
- if locked_dep.registry_prefix and locked_dep.host: - overrides["host"] = locked_dep.host - overrides["artifactory_prefix"] = locked_dep.registry_prefix - elif ( - isinstance(getattr(locked_dep, "host", None), str) - and locked_dep.host != dep_ref.host + if ( + isinstance(locked_registry_prefix, str) + and locked_registry_prefix + and isinstance(locked_host, str) + and locked_host ): - overrides["host"] = locked_dep.host + overrides["host"] = locked_host + overrides["artifactory_prefix"] = locked_registry_prefix + elif isinstance(locked_host, str) and locked_host != dep_ref.host: + overrides["host"] = locked_host if getattr(locked_dep, "is_insecure", False) is True: overrides["is_insecure"] = True @@ -273,8 +278,14 @@ def build_download_ref( overrides["reference"] = locked_dep.resolved_commit # For proxy deps without a commit SHA (Artifactory zip archives), # preserve the locked ref so we download the same ref on replay. - elif locked_dep.registry_prefix and locked_dep.resolved_ref and not dep_ref.reference: - overrides["reference"] = locked_dep.resolved_ref + elif ( + isinstance(locked_registry_prefix, str) + and locked_registry_prefix + and isinstance(locked_resolved_ref, str) + and locked_resolved_ref + and not dep_ref.reference + ): + overrides["reference"] = locked_resolved_ref if overrides: return _dataclass_replace(dep_ref, **overrides) diff --git a/src/apm_cli/install/context.py b/src/apm_cli/install/context.py index 2b16fd144..32fa75ea1 100644 --- a/src/apm_cli/install/context.py +++ b/src/apm_cli/install/context.py @@ -103,6 +103,7 @@ class InstallContext: intended_dep_keys: set[str] = field(default_factory=set) package_deployed_files: dict[str, list[str]] = field(default_factory=dict) package_types: dict[str, str] = field(default_factory=dict) + package_namespaces: dict[str, str] = field(default_factory=dict) package_hashes: dict[str, str] = field(default_factory=dict) installed_count: int = 0 # integrate unpinned_count: int = 0 # integrate diff --git 
a/src/apm_cli/install/phases/lockfile.py b/src/apm_cli/install/phases/lockfile.py index 1ff650176..31f6d12e6 100644 --- a/src/apm_cli/install/phases/lockfile.py +++ b/src/apm_cli/install/phases/lockfile.py @@ -78,6 +78,7 @@ def build_and_save(self) -> None: # Attach deployed_files and package_type to each LockedDependency self._attach_deployed_files(lockfile) self._attach_package_types(lockfile) + self._attach_package_namespaces(lockfile) # Apply CLI --skill override to lockfile entries (skill_bundle only) self._attach_skill_subset_override(lockfile) # Attach content hashes captured at download/verify time @@ -127,6 +128,11 @@ def _attach_package_types(self, lockfile: LockFile) -> None: if dep_key in lockfile.dependencies: lockfile.dependencies[dep_key].package_type = pkg_type + def _attach_package_namespaces(self, lockfile: LockFile) -> None: + for dep_key, namespace in self.ctx.package_namespaces.items(): + if dep_key in lockfile.dependencies: + lockfile.dependencies[dep_key].namespace = namespace + def _attach_skill_subset_override(self, lockfile: LockFile) -> None: """Apply CLI --skill override to lockfile skill_bundle entries. diff --git a/src/apm_cli/install/services.py b/src/apm_cli/install/services.py index 28ad48944..eb17e0e1e 100644 --- a/src/apm_cli/install/services.py +++ b/src/apm_cli/install/services.py @@ -232,20 +232,55 @@ def _log_integration(msg): targets=targets, skill_subset=skill_subset, ) + # Resolve the package namespace once so install-summary labels can + # surface the namespace identity while deploy paths stay direct children + # of the target skills root for runtime compatibility. + _pkg_namespace = getattr(getattr(package_info, "package", None), "namespace", None) + # Group target_paths by their parent directory (the skills root) for the + # default summary. Namespaced native skills override this with exact + # deployed directories below. 
_skill_target_dirs: set = builtins.set() for tp in skill_result.target_paths: try: - rel = tp.relative_to(project_root) - if rel.parts: - _skill_target_dirs.add(rel.parts[0]) + parent_rel = tp.parent.relative_to(project_root).as_posix() + _skill_target_dirs.add(parent_rel) except ValueError: # Dynamic-root target (copilot-cowork) -- path is outside project tree. - _skill_target_dirs.add("copilot-cowork") + label = "copilot-cowork/skills" + if _pkg_namespace: + label = f"{label}/{_pkg_namespace}" + _skill_target_dirs.add(label) _skill_targets = sorted(_skill_target_dirs) - _skill_target_str = ", ".join(f"{d}/skills/" for d in _skill_targets) or "skills/" + _skill_target_str = ", ".join(f"{d}/" for d in _skill_targets) or "skills/" if skill_result.skill_created: result["skills"] += 1 - _log_integration(f" |-- Skill integrated -> {_skill_target_str}") + if _pkg_namespace: + # Per-skill confirmation surfaces the namespace/name identity + # while the deployed directory remains a direct child that + # current harnesses can discover. 
+ _deployed_name = skill_result.skill_path.parent.name + _prefix = f"{_pkg_namespace}-" + _skill_name = ( + _deployed_name[len(_prefix) :] + if _deployed_name.startswith(_prefix) + else _deployed_name + ) + _exact_targets = [] + for tp in skill_result.target_paths: + if tp.name != _deployed_name: + continue + try: + _exact_targets.append(tp.relative_to(project_root).as_posix()) + except ValueError: + _exact_targets.append(tp.as_posix()) + _exact_target_str = ( + ", ".join(f"{d}/" for d in sorted(_exact_targets)) or _skill_target_str + ) + _log_integration( + f" |-- Integrated skill {_pkg_namespace}/{_skill_name} -> {_exact_target_str}" + ) + else: + _log_integration(f" |-- Integrated skill -> {_skill_target_str}") if skill_result.sub_skills_promoted > 0: result["sub_skills"] += skill_result.sub_skills_promoted _log_integration( diff --git a/src/apm_cli/install/sources.py b/src/apm_cli/install/sources.py index 35bf4fe9d..bd105879e 100644 --- a/src/apm_cli/install/sources.py +++ b/src/apm_cli/install/sources.py @@ -61,6 +61,13 @@ def _format_package_type_label(pkg_type) -> str | None: }.get(pkg_type) +def _store_namespace(ctx: InstallContext, dep_key: str, package_info: PackageInfo) -> None: + """Record a package namespace for later lockfile assembly.""" + namespace = getattr(getattr(package_info, "package", None), "namespace", None) + if namespace: + ctx.package_namespaces[dep_key] = namespace + + @dataclass class Materialization: """Outcome of ``DependencySource.acquire()``. 
@@ -223,6 +230,7 @@ def acquire(self) -> Materialization | None: if local_info.package_type: ctx.package_types[dep_key] = local_info.package_type.value + _store_namespace(ctx, dep_key, local_info) return Materialization( package_info=local_info, @@ -377,6 +385,7 @@ def acquire(self) -> Materialization | None: ctx.package_hashes[dep_key] = _compute_hash(install_path) if cached_package_info.package_type: ctx.package_types[dep_key] = cached_package_info.package_type.value + _store_namespace(ctx, dep_key, cached_package_info) return Materialization( package_info=cached_package_info, @@ -546,6 +555,7 @@ def acquire(self) -> Materialization | None: if hasattr(package_info, "package_type") and package_info.package_type: ctx.package_types[dep_key] = package_info.package_type.value + _store_namespace(ctx, dep_key, package_info) if hasattr(package_info, "package_type"): package_type = package_info.package_type diff --git a/src/apm_cli/integration/skill_integrator.py b/src/apm_cli/integration/skill_integrator.py index c38fb6380..04c6dfad2 100644 --- a/src/apm_cli/integration/skill_integrator.py +++ b/src/apm_cli/integration/skill_integrator.py @@ -9,9 +9,10 @@ from pathlib import Path from typing import Dict, List, Optional # noqa: F401, UP035 -import frontmatter # noqa: F401 +import frontmatter from apm_cli.integration.base_integrator import BaseIntegrator +from apm_cli.models.apm_package import PackageContentType # DEPRECATED -- use IntegrationResult directly for new code. @@ -154,6 +155,79 @@ def normalize_skill_name(name: str) -> str: return to_hyphen_case(name) +def _package_namespace(package_info) -> str | None: + """Return the package namespace, if the manifest opted in.""" + namespace = getattr(getattr(package_info, "package", None), "namespace", None) + return namespace if isinstance(namespace, str) else None + + +def _deployed_skill_name(skill_name: str, namespace: str | None) -> str: + """Return the runtime-visible skill directory name. 
+ + Harness docs describe skills as direct children of the target skills root + (for example ``.github/skills/<name>/SKILL.md``). Keep namespaced skills + as direct child directories instead of relying on recursive discovery under + ``skills/<namespace>/<name>/``. + """ + if namespace: + return f"{namespace}-{skill_name}" + return skill_name + + +def _rewrite_skill_name( + target_skill_dir: Path, deployed_skill_name: str, namespace: str | None +) -> None: + """Align copied SKILL.md metadata with the runtime-visible directory name.""" + if not namespace: + return + + skill_md = target_skill_dir / "SKILL.md" + if not skill_md.exists(): + return + + post = frontmatter.load(skill_md) + if post.metadata.get("name") == deployed_skill_name: + return + post.metadata["name"] = deployed_skill_name + skill_md.write_text(frontmatter.dumps(post), encoding="utf-8") + + +def _skill_dir(target_skills_root: Path, skill_name: str, namespace: str | None) -> Path: + """Build a deployed skill directory, optionally namespaced. + + Applies a runtime path-containment check so that a symlink planted at + the computed skill directory cannot redirect ``shutil.copytree`` outside + the skills root. Parse-time regex validation rejects traversal in the + namespace value itself; this 
+ """ + from apm_cli.models.apm_package import validate_namespace + from apm_cli.utils.path_security import ensure_path_within + + if namespace: + namespace = validate_namespace(namespace) + path = target_skills_root / _deployed_skill_name(skill_name, namespace) + ensure_path_within(path, target_skills_root) + return path + + +def _skill_owner_key(skill_name: str, namespace: str | None) -> str: + """Key lockfile ownership by full skill identity, not just leaf name.""" + if namespace: + return f"{namespace}/{skill_name}" + return skill_name + + +def _skill_owner_key_from_deployed_path(deployed_path: str, namespace: str | None = None) -> str: + """Extract the path under ``skills/`` from a deployed_files entry.""" + normalized = deployed_path.rstrip("/").replace("\\", "/") + marker = "/skills/" + key = normalized.split(marker, 1)[1] if marker in normalized else normalized.rsplit("/", 1)[-1] + if namespace and key.startswith(f"{namespace}-") and "/" not in key: + return f"{namespace}/{key[len(namespace) + 1 :]}" + return key + + # ============================================================================= # Package Type Routing Functions (T4) # ============================================================================= @@ -167,7 +241,7 @@ def normalize_skill_name(name: str) -> str: # - Packages with only instructions -> compile to AGENTS.md, NOT skills -def get_effective_type(package_info) -> "PackageContentType": +def get_effective_type(package_info) -> PackageContentType: """Get effective package content type based on package structure. 
Determines type by: @@ -181,7 +255,7 @@ def get_effective_type(package_info) -> "PackageContentType": Returns: PackageContentType: The effective type """ - from apm_cli.models.apm_package import PackageContentType, PackageType + from apm_cli.models.apm_package import PackageType # Check if package has SKILL.md (via package_type field) # PackageType.CLAUDE_SKILL = has root SKILL.md only @@ -282,7 +356,9 @@ def copy_skill_to_target( When *targets* is provided, only those targets are used. Otherwise falls back to ``active_targets()``. - Source SKILL.md is copied verbatim -- no metadata injection. + Source SKILL.md is copied verbatim unless a package namespace is applied, + in which case the copied ``name`` frontmatter is aligned to the deployed + namespace-prefixed directory. Copies: - SKILL.md (required) @@ -340,7 +416,11 @@ def copy_skill_to_target( if not target.auto_create and not target_root_dir.is_dir(): continue - skill_dir = target_base / effective_root / "skills" / skill_name + skill_dir = _skill_dir( + target_base / effective_root / "skills", + skill_name, + _package_namespace(package_info), + ) skill_dir.parent.mkdir(parents=True, exist_ok=True) if skill_dir.exists(): shutil.rmtree(skill_dir) @@ -490,6 +570,7 @@ def _promote_sub_skills( target_skills_root: Path, parent_name: str, *, + namespace: str | None = None, warn: bool = True, owned_by: dict[str, str] | None = None, diagnostics=None, @@ -541,8 +622,10 @@ def _promote_sub_skills( continue is_valid, _ = validate_skill_name(raw_sub_name) sub_name = raw_sub_name if is_valid else normalize_skill_name(raw_sub_name) - target = target_skills_root / sub_name - rel_path = f"{rel_prefix}/{sub_name}" + target = _skill_dir(target_skills_root, sub_name, namespace) + deployed_name = _deployed_skill_name(sub_name, namespace) + owner_key = _skill_owner_key(sub_name, namespace) + rel_path = f"{rel_prefix}/{owner_key}" if target.exists(): # Content-identical → skip entirely (no copy, no warning) if 
SkillIntegrator._dirs_equal(sub_skill_path, target): @@ -554,7 +637,7 @@ def _promote_sub_skills( is_managed = ( managed_files is not None and rel_path.replace("\\", "/") in managed_files ) - prev_owner = (owned_by or {}).get(sub_name) + prev_owner = (owned_by or {}).get(owner_key) is_self_overwrite = prev_owner is not None and prev_owner == parent_name if managed_files is not None and not is_managed and not is_self_overwrite: @@ -604,6 +687,7 @@ def _promote_sub_skills( from apm_cli.security.gate import ignore_symlinks shutil.copytree(sub_skill_path, target, dirs_exist_ok=True, ignore=ignore_symlinks) + _rewrite_skill_name(target, deployed_name, namespace) promoted += 1 deployed.append(target) return promoted, deployed @@ -630,7 +714,7 @@ def _build_ownership_maps(project_root: Path) -> tuple[dict[str, str], dict[str, unique_key = dep.get_unique_key() for deployed_path in dep.deployed_files: normalized = deployed_path.rstrip("/").replace("\\", "/") - skill_name = normalized.rsplit("/", 1)[-1] + skill_name = _skill_owner_key_from_deployed_path(normalized, dep.namespace) # Both maps cover all paths for sub-skill self-overwrite tracking. owned_by[skill_name] = short_owner # Native-owner map is scoped to skill paths only to avoid false @@ -694,6 +778,7 @@ def _promote_sub_skills_standalone( targets = active_targets(project_root) parent_name = package_path.name + namespace = _package_namespace(package_info) owned_by = self._build_skill_ownership_map(project_root) count = 0 all_deployed: list[Path] = [] @@ -716,6 +801,7 @@ def _promote_sub_skills_standalone( sub_skills_dir, target_skills_root, parent_name, + namespace=namespace, warn=is_primary, owned_by=owned_by if is_primary else None, diagnostics=diagnostics if is_primary else None, @@ -749,8 +835,10 @@ def _integrate_native_skill( The skill folder name is the source folder name (e.g., ``mcp-builder``), validated and normalized per the agentskills.io spec. 
- Source SKILL.md is copied verbatim -- no metadata injection. Orphan - detection uses apm.lock via directory name matching instead. + Source SKILL.md is copied verbatim unless a package namespace is applied, + in which case the copied ``name`` frontmatter is aligned to the deployed + namespace-prefixed directory. Orphan detection uses apm.lock via + directory name matching instead. Copies: - SKILL.md (required) @@ -768,6 +856,7 @@ def _integrate_native_skill( SkillIntegrationResult: Results of the integration operation """ package_path = package_info.install_path + namespace = _package_namespace(package_info) # Use the source folder name as the skill name # e.g., apm_modules/ComposioHQ/awesome-claude-skills/mcp-builder -> mcp-builder @@ -828,10 +917,19 @@ def _integrate_native_skill( skills_mapping = target.primitives["skills"] # Dynamic-root targets (cowork): use resolved_deploy_root. if target.resolved_deploy_root is not None: - target_skill_dir = target.resolved_deploy_root / skill_name + target_skill_dir = _skill_dir( + target.resolved_deploy_root, + skill_name, + namespace, + ) else: effective_root = skills_mapping.deploy_root or target.root_dir - target_skill_dir = project_root / effective_root / "skills" / skill_name + target_skill_dir = _skill_dir( + project_root / effective_root / "skills", + skill_name, + namespace, + ) + owner_key = _skill_owner_key(skill_name, namespace) if is_primary: skill_created = not target_skill_dir.exists() @@ -844,19 +942,17 @@ def _integrate_native_skill( # map (current run) so that same-manifest collisions are caught even # before the lockfile has been written for this run. 
prev_owner = lockfile_native_owners.get( - skill_name - ) or self._native_skill_session_owners.get(skill_name) + owner_key + ) or self._native_skill_session_owners.get(owner_key) is_self_overwrite = prev_owner is not None and prev_owner == current_key if prev_owner is not None and not is_self_overwrite: try: rel_prefix = target_skill_dir.parent.relative_to( project_root ).as_posix() + rel_path = f"{rel_prefix}/{skill_name}" except ValueError: - # Dynamic-root targets (cowork): directory is - # outside the project tree. - rel_prefix = "skills" - rel_path = f"{rel_prefix}/{skill_name}" + rel_path = owner_key # Issue 1: package= should identify the package causing the # collision (current_key), not the skill name, so render_summary() # groups diagnostics by the package responsible. @@ -896,6 +992,11 @@ def _ignore_symlinks_and_apm(directory, contents): ) shutil.copytree(package_path, target_skill_dir, ignore=_ignore_symlinks_and_apm) + _rewrite_skill_name( + target_skill_dir, + _deployed_skill_name(skill_name, namespace), + namespace, + ) all_target_paths.append(target_skill_dir) if is_primary: @@ -910,6 +1011,7 @@ def _ignore_symlinks_and_apm(directory, contents): sub_skills_dir, target_skills_root, skill_name, + namespace=namespace, warn=is_primary, owned_by=owned_by if is_primary else None, diagnostics=diagnostics if is_primary else None, @@ -923,12 +1025,14 @@ def _ignore_symlinks_and_apm(directory, contents): # Record ownership in the session map so subsequent packages installed in # the same run can detect a collision even before the lockfile is written. 
if current_key is not None: - self._native_skill_session_owners[skill_name] = current_key + self._native_skill_session_owners[_skill_owner_key(skill_name, namespace)] = current_key # Count unique sub-skills from primary target only primary_root = project_root / ".github" / "skills" sub_skills_count = sum( - 1 for p in all_target_paths if p.parent == primary_root and p.name != skill_name + 1 + for p in all_target_paths + if p.parent == primary_root and p.name != _deployed_skill_name(skill_name, namespace) ) return SkillIntegrationResult( @@ -986,6 +1090,7 @@ def _integrate_skill_bundle( targets = active_targets(project_root) parent_name = package_info.install_path.name + namespace = _package_namespace(package_info) owned_by, lockfile_native_owners = self._build_ownership_maps(project_root) # noqa: RUF059 total_promoted = 0 @@ -1001,14 +1106,18 @@ def _integrate_skill_bundle( is_primary = idx == 0 skills_mapping = target.primitives["skills"] - effective_root = skills_mapping.deploy_root or target.root_dir - target_skills_root = project_root / effective_root / "skills" + if target.resolved_deploy_root is not None: + target_skills_root = target.resolved_deploy_root + else: + effective_root = skills_mapping.deploy_root or target.root_dir + target_skills_root = project_root / effective_root / "skills" target_skills_root.mkdir(parents=True, exist_ok=True) n, deployed = self._promote_sub_skills( skills_dir, target_skills_root, parent_name, + namespace=namespace, warn=is_primary, owned_by=owned_by if is_primary else None, diagnostics=diagnostics if is_primary else None, diff --git a/src/apm_cli/models/apm_package.py b/src/apm_cli/models/apm_package.py index 70b8216e5..191f8cde6 100644 --- a/src/apm_cli/models/apm_package.py +++ b/src/apm_cli/models/apm_package.py @@ -6,6 +6,7 @@ compatibility. 
""" +import re from dataclasses import dataclass from pathlib import Path from typing import Dict, List, Optional, Union # noqa: F401, UP035 @@ -50,10 +51,40 @@ "APMPackage", "PackageInfo", "clear_apm_yml_cache", + "validate_namespace", ] # Module-level parse cache: resolved path -> APMPackage (#171) _apm_yml_cache: dict[Path, "APMPackage"] = {} +NAMESPACE_PATTERN = re.compile(r"^[a-z0-9](?:[a-z0-9]|-(?!-))*[a-z0-9]$|^[a-z0-9]$") + + +def validate_namespace(namespace: str, *, field_name: str = "namespace") -> str: + """Return a normalized package namespace or raise ``ValueError``.""" + if not isinstance(namespace, str): + raise ValueError( + f"Invalid '{field_name}' field: expected string, got {type(namespace).__name__}" + ) + + normalized = namespace.strip() + if not normalized: + raise ValueError(f"'{field_name}' must not be empty") + if "/" in normalized or "\\" in normalized: + raise ValueError(f"'{field_name}' must be a single path segment") + if len(normalized) > 64: + raise ValueError(f"'{field_name}' must be 1-64 characters") + if "--" in normalized: + raise ValueError(f"'{field_name}' must not contain consecutive hyphens") + if not NAMESPACE_PATTERN.fullmatch(normalized): + raise ValueError( + f"'{field_name}' must contain only lowercase letters, digits, " + "and hyphens, and must not start or end with a hyphen" + ) + + from ..utils.path_security import validate_path_segments + + validate_path_segments(normalized, context=field_name, reject_empty=True) + return normalized def clear_apm_yml_cache() -> None: @@ -85,6 +116,7 @@ class APMPackage: None # Package content type: instructions, skill, hybrid, or prompts ) includes: str | list[str] | None = None # Include-only manifest: 'auto' or list of repo paths + namespace: str | None = None # Optional skill namespace for deployed package skills @classmethod def _parse_dependency_dict(cls, raw_deps: dict, label: str = "") -> dict: @@ -208,6 +240,10 @@ def from_apm_yml(cls, apm_yml_path: Path) -> "APMPackage": 
            else:
                raise ValueError("'includes' must be 'auto' or a list of strings")
 
+        namespace = None
+        if "namespace" in data and data["namespace"] is not None:
+            namespace = validate_namespace(data["namespace"])
+
         # Parse target field through the same validator as --target so a CSV
         # string like ``target: "claude,copilot"`` resolves identically to
         # ``--target claude,copilot`` and unknown tokens fail at parse time
@@ -230,6 +266,7 @@ def from_apm_yml(cls, apm_yml_path: Path) -> "APMPackage":
             target=target_value,
             type=pkg_type,
             includes=includes,
+            namespace=namespace,
         )
         _apm_yml_cache[resolved] = result
         return result
diff --git a/templates/hello-world/apm.yml b/templates/hello-world/apm.yml
index becbfc5e6..3e0264d4b 100644
--- a/templates/hello-world/apm.yml
+++ b/templates/hello-world/apm.yml
@@ -3,6 +3,8 @@ version: {{version}}
 description: {{description}}
 author: {{author}}
 
+# namespace: my-org  # Optional: skills install under skills/my-org-<skill>/ (kebab-case, alphanumeric+hyphens)
+
 scripts:
   start: "copilot --log-level all --log-dir copilot-logs --allow-all-tools -p hello-world.prompt.md"
   codex: "RUST_LOG=debug codex --skip-git-repo-check hello-world.prompt.md"
diff --git a/tests/integration/test_skill_install.py b/tests/integration/test_skill_install.py
index f4cac28cf..7c7e80fe8 100644
--- a/tests/integration/test_skill_install.py
+++ b/tests/integration/test_skill_install.py
@@ -118,7 +118,7 @@ def test_skill_detection_in_output(self, temp_project, apm_command):
 
         # Check for skill detection/integration message
         assert (
-            "Skill integrated" in result.stdout
+            "Integrated skill" in result.stdout
             or "Claude Skill" in result.stdout
             or "SKILL.md detected" in result.stdout
         )
diff --git a/tests/test_apm_package_models.py b/tests/test_apm_package_models.py
index c233566b9..12296703d 100644
--- a/tests/test_apm_package_models.py
+++ b/tests/test_apm_package_models.py
@@ -1981,3 +1981,71 @@ def test_build_download_ref_preserves_virtual_path(self):
         assert result.host == 
"git.example.com" assert result.reference == "abc123" assert result.is_virtual is True + + +class TestPackageNamespace: + def test_from_apm_yml_parses_namespace(self, tmp_path): + from apm_cli.models.apm_package import APMPackage, clear_apm_yml_cache + + clear_apm_yml_cache() + manifest = tmp_path / "apm.yml" + manifest.write_text( + "name: pkg\nversion: 1.0.0\nnamespace: example-tools\n", + encoding="utf-8", + ) + + package = APMPackage.from_apm_yml(manifest) + + assert package.namespace == "example-tools" + + def test_from_apm_yml_rejects_traversal_namespace(self, tmp_path): + from apm_cli.models.apm_package import APMPackage, clear_apm_yml_cache + + clear_apm_yml_cache() + manifest = tmp_path / "apm.yml" + manifest.write_text( + "name: pkg\nversion: 1.0.0\nnamespace: ../evil\n", + encoding="utf-8", + ) + + with pytest.raises(ValueError, match="namespace"): + APMPackage.from_apm_yml(manifest) + + def test_from_apm_yml_rejects_multi_segment_namespace(self, tmp_path): + from apm_cli.models.apm_package import APMPackage, clear_apm_yml_cache + + clear_apm_yml_cache() + manifest = tmp_path / "apm.yml" + manifest.write_text( + "name: pkg\nversion: 1.0.0\nnamespace: example/team\n", + encoding="utf-8", + ) + + with pytest.raises(ValueError, match="single path segment"): + APMPackage.from_apm_yml(manifest) + + def test_from_apm_yml_rejects_consecutive_hyphen_namespace(self, tmp_path): + from apm_cli.models.apm_package import APMPackage, clear_apm_yml_cache + + clear_apm_yml_cache() + manifest = tmp_path / "apm.yml" + manifest.write_text( + "name: pkg\nversion: 1.0.0\nnamespace: example--tools\n", + encoding="utf-8", + ) + + with pytest.raises(ValueError, match="consecutive hyphens"): + APMPackage.from_apm_yml(manifest) + + def test_from_apm_yml_rejects_long_namespace(self, tmp_path): + from apm_cli.models.apm_package import APMPackage, clear_apm_yml_cache + + clear_apm_yml_cache() + manifest = tmp_path / "apm.yml" + manifest.write_text( + f"name: pkg\nversion: 
1.0.0\nnamespace: {'a' * 65}\n", + encoding="utf-8", + ) + + with pytest.raises(ValueError, match="1-64 characters"): + APMPackage.from_apm_yml(manifest) diff --git a/tests/test_lockfile.py b/tests/test_lockfile.py index 9e7bc0893..0fd7a27ab 100644 --- a/tests/test_lockfile.py +++ b/tests/test_lockfile.py @@ -12,6 +12,7 @@ get_lockfile_path, migrate_lockfile_if_needed, ) +from apm_cli.install.phases.lockfile import LockfileBuilder from apm_cli.models.apm_package import DependencyReference @@ -119,6 +120,23 @@ def test_from_dict_missing_hashes_defaults_empty(self): loaded = LockedDependency.from_dict({"repo_url": "owner/repo"}) assert loaded.deployed_file_hashes == {} + def test_namespace_round_trip(self): + dep = LockedDependency(repo_url="owner/repo", namespace="example") + data = dep.to_dict() + assert data["namespace"] == "example" + restored = LockedDependency.from_dict(data) + assert restored.namespace == "example" + + def test_namespace_omitted_when_empty(self): + dep = LockedDependency(repo_url="owner/repo") + assert "namespace" not in dep.to_dict() + + def test_from_dict_drops_invalid_namespace(self): + dep = LockedDependency.from_dict( + {"repo_url": "owner/repo", "namespace": "../example--tools"} + ) + assert dep.namespace is None + class TestLockFile: def test_add_and_get_dependency(self): @@ -379,3 +397,16 @@ def test_new_lockfile_vs_empty(self): a = self._make_lock() b = LockFile() assert not a.is_semantically_equivalent(b) + + +class TestLockfileBuilderNamespaces: + """Tests for package namespace attachment during lockfile assembly.""" + + def test_attach_package_namespaces_records_dependency_namespace(self): + lock = LockFile() + lock.add_dependency(LockedDependency(repo_url="owner/repo")) + ctx = Mock(package_namespaces={"owner/repo": "example"}) + + LockfileBuilder(ctx)._attach_package_namespaces(lock) + + assert lock.dependencies["owner/repo"].namespace == "example" diff --git a/tests/unit/install/test_mcp_warnings.py 
b/tests/unit/install/test_mcp_warnings.py new file mode 100644 index 000000000..ca1e15e43 --- /dev/null +++ b/tests/unit/install/test_mcp_warnings.py @@ -0,0 +1,308 @@ +"""Unit tests for ``apm_cli.install.mcp.warnings``. + +Covers F5 (SSRF) and F7 (shell metacharacter) non-blocking safety warnings +that fire during ``apm install --mcp``. +""" + +from __future__ import annotations + +from unittest.mock import MagicMock, patch + +from apm_cli.install.mcp.warnings import ( + _is_internal_or_metadata_host, + warn_shell_metachars, + warn_ssrf_url, +) + +# ================================================================ +# _is_internal_or_metadata_host +# ================================================================ + + +class TestIsInternalOrMetadataHost: + """Tests for the host-classification helper.""" + + # -- empty / falsy inputs -- + + def test_empty_string_returns_false(self): + assert _is_internal_or_metadata_host("") is False + + # -- loopback addresses -- + + def test_ipv4_loopback_returns_true(self): + assert _is_internal_or_metadata_host("127.0.0.1") is True + + def test_ipv4_loopback_other_returns_true(self): + assert _is_internal_or_metadata_host("127.255.0.1") is True + + def test_ipv6_loopback_returns_true(self): + assert _is_internal_or_metadata_host("::1") is True + + # -- cloud metadata endpoints -- + + def test_aws_imds_returns_true(self): + assert _is_internal_or_metadata_host("169.254.169.254") is True + + def test_alibaba_cloud_imds_returns_true(self): + assert _is_internal_or_metadata_host("100.100.100.200") is True + + def test_aws_ipv6_imds_returns_true(self): + assert _is_internal_or_metadata_host("fd00:ec2::254") is True + + # -- link-local -- + + def test_link_local_ipv4_returns_true(self): + assert _is_internal_or_metadata_host("169.254.1.1") is True + + # -- RFC1918 private ranges -- + + def test_rfc1918_class_a_returns_true(self): + assert _is_internal_or_metadata_host("10.0.0.1") is True + + def test_rfc1918_class_b_returns_true(self): 
+ assert _is_internal_or_metadata_host("172.16.0.1") is True + + def test_rfc1918_class_c_returns_true(self): + assert _is_internal_or_metadata_host("192.168.1.100") is True + + # -- IPv6 brackets (literal URL host) -- + + def test_ipv6_loopback_bracketed_returns_true(self): + assert _is_internal_or_metadata_host("[::1]") is True + + def test_ipv6_private_bracketed_returns_true(self): + assert _is_internal_or_metadata_host("[fc00::1]") is True + + # -- public / external addresses (should return False) -- + + def test_public_ipv4_returns_false(self): + assert _is_internal_or_metadata_host("8.8.8.8") is False + + def test_public_ipv4_other_returns_false(self): + assert _is_internal_or_metadata_host("1.1.1.1") is False + + # -- hostname resolution -- + + def test_hostname_resolves_to_loopback_returns_true(self): + with patch("socket.gethostbyname", return_value="127.0.0.1"): + assert _is_internal_or_metadata_host("my-internal-host") is True + + def test_hostname_resolves_to_public_returns_false(self): + with patch("socket.gethostbyname", return_value="93.184.216.34"): + assert _is_internal_or_metadata_host("example.com") is False + + def test_hostname_resolution_failure_returns_false(self): + with patch("socket.gethostbyname", side_effect=OSError("no route")): + assert _is_internal_or_metadata_host("unresolvable.local") is False + + def test_hostname_unicode_error_returns_false(self): + with patch("socket.gethostbyname", side_effect=UnicodeError): + assert _is_internal_or_metadata_host("bad\x00host") is False + + +# ================================================================ +# warn_ssrf_url (F5) +# ================================================================ + + +class TestWarnSsrfUrl: + """Tests for the SSRF URL warning helper.""" + + def _make_logger(self): + return MagicMock() + + def test_none_url_does_not_warn(self): + logger = self._make_logger() + warn_ssrf_url(None, logger) + logger.warning.assert_not_called() + + def 
test_empty_url_does_not_warn(self): + logger = self._make_logger() + warn_ssrf_url("", logger) + logger.warning.assert_not_called() + + def test_internal_url_warns(self): + logger = self._make_logger() + warn_ssrf_url("http://127.0.0.1:8080/api", logger) + logger.warning.assert_called_once() + msg = logger.warning.call_args[0][0] + # Extract the URL embedded in the warning ("URL '' points...") + # and assert via urlparse rather than substring matching, per repo + # test convention (CodeQL py/incomplete-url-substring-sanitization). + import re + from urllib.parse import urlparse + + match = re.search(r"URL '([^']+)'", msg) + assert match is not None, f"warning message has no quoted URL: {msg!r}" + assert urlparse(match.group(1)).hostname == "127.0.0.1" + + def test_metadata_url_warns(self): + logger = self._make_logger() + warn_ssrf_url("http://169.254.169.254/latest/meta-data", logger) + logger.warning.assert_called_once() + + def test_private_range_url_warns(self): + logger = self._make_logger() + warn_ssrf_url("https://192.168.0.1/mcp", logger) + logger.warning.assert_called_once() + + def test_public_url_does_not_warn(self): + logger = self._make_logger() + warn_ssrf_url("https://mcp.example.com/v1", logger) + logger.warning.assert_not_called() + + def test_malformed_url_does_not_crash(self): + logger = self._make_logger() + # Should swallow ValueError/TypeError gracefully + warn_ssrf_url("not-a-url", logger) + # May or may not warn; must not raise + + def test_url_with_no_hostname_does_not_crash(self): + logger = self._make_logger() + warn_ssrf_url("file:///etc/passwd", logger) + # file:// has no hostname; must not raise + + def test_hostname_resolves_to_private_warns(self): + logger = self._make_logger() + with patch("socket.gethostbyname", return_value="10.0.0.1"): + warn_ssrf_url("http://internal-host/endpoint", logger) + logger.warning.assert_called_once() + + +# ================================================================ +# warn_shell_metachars (F7) 
+# ================================================================ + + +class TestWarnShellMetachars: + """Tests for the shell metacharacter warning helper.""" + + def _make_logger(self): + return MagicMock() + + # -- no-op paths -- + + def test_none_env_and_no_command_does_nothing(self): + logger = self._make_logger() + warn_shell_metachars(None, logger) + logger.warning.assert_not_called() + + def test_empty_env_and_no_command_does_nothing(self): + logger = self._make_logger() + warn_shell_metachars({}, logger) + logger.warning.assert_not_called() + + def test_clean_env_does_not_warn(self): + logger = self._make_logger() + warn_shell_metachars({"MY_TOKEN": "abc123", "PORT": "3000"}, logger) + logger.warning.assert_not_called() + + def test_clean_command_does_not_warn(self): + logger = self._make_logger() + warn_shell_metachars(None, logger, command="npx -y @modelcontextprotocol/server") + logger.warning.assert_not_called() + + # -- env value metacharacters -- + + def test_dollar_paren_in_env_warns(self): + logger = self._make_logger() + warn_shell_metachars({"SECRET": "$(cat /etc/passwd)"}, logger) + logger.warning.assert_called_once() + msg = logger.warning.call_args[0][0] + assert "SECRET" in msg + + def test_backtick_in_env_warns(self): + logger = self._make_logger() + warn_shell_metachars({"VAL": "`id`"}, logger) + logger.warning.assert_called_once() + + def test_semicolon_in_env_warns(self): + logger = self._make_logger() + warn_shell_metachars({"CMD": "foo; bar"}, logger) + logger.warning.assert_called_once() + + def test_double_ampersand_in_env_warns(self): + logger = self._make_logger() + warn_shell_metachars({"RUN": "true && bad"}, logger) + logger.warning.assert_called_once() + + def test_double_pipe_in_env_warns(self): + logger = self._make_logger() + warn_shell_metachars({"FALLBACK": "a || b"}, logger) + logger.warning.assert_called_once() + + def test_pipe_in_env_warns(self): + logger = self._make_logger() + warn_shell_metachars({"PIPE": "cmd | grep 
x"}, logger) + logger.warning.assert_called_once() + + def test_redirect_append_in_env_warns(self): + logger = self._make_logger() + warn_shell_metachars({"OUT": "cmd >> /tmp/log"}, logger) + logger.warning.assert_called_once() + + def test_redirect_write_in_env_warns(self): + logger = self._make_logger() + warn_shell_metachars({"OUT": "cmd > /tmp/out"}, logger) + logger.warning.assert_called_once() + + def test_redirect_read_in_env_warns(self): + logger = self._make_logger() + warn_shell_metachars({"IN": "cmd < input.txt"}, logger) + logger.warning.assert_called_once() + + def test_only_first_metachar_triggers_warning_per_key(self): + """Only one warning is emitted per env key (break after first match).""" + logger = self._make_logger() + warn_shell_metachars({"MULTI": "$(foo) && bar"}, logger) + # Only one call despite multiple metacharacters + assert logger.warning.call_count == 1 + + def test_multiple_keys_each_warn_independently(self): + """Each offending env key produces its own warning.""" + logger = self._make_logger() + warn_shell_metachars({"K1": "$(a)", "K2": "$(b)"}, logger) + assert logger.warning.call_count == 2 + + def test_none_value_in_env_does_not_crash(self): + """None env values are treated as empty strings without error.""" + logger = self._make_logger() + warn_shell_metachars({"TOKEN": None}, logger) + logger.warning.assert_not_called() + + def test_integer_value_in_env_does_not_crash(self): + """Non-string env values are coerced to str without error.""" + logger = self._make_logger() + warn_shell_metachars({"PORT": 3000}, logger) + logger.warning.assert_not_called() + + # -- command field metacharacters -- + + def test_command_with_pipe_warns(self): + logger = self._make_logger() + warn_shell_metachars(None, logger, command="npx|curl evil.com") + logger.warning.assert_called_once() + msg = logger.warning.call_args[0][0] + assert "command" in msg.lower() + + def test_command_with_semicolon_warns(self): + logger = self._make_logger() + 
warn_shell_metachars(None, logger, command="node server.js; rm -rf /") + logger.warning.assert_called_once() + + def test_command_with_subshell_warns(self): + logger = self._make_logger() + warn_shell_metachars(None, logger, command="echo $(secret)") + logger.warning.assert_called_once() + + def test_non_string_command_does_not_crash(self): + """Non-string command (e.g. list) is skipped gracefully.""" + logger = self._make_logger() + warn_shell_metachars(None, logger, command=["npx", "server"]) + logger.warning.assert_not_called() + + def test_env_and_command_both_warn(self): + """Warnings fire for both env and command when both have metacharacters.""" + logger = self._make_logger() + warn_shell_metachars({"K": "$(x)"}, logger, command="cmd; bad") + assert logger.warning.call_count == 2 diff --git a/tests/unit/install/test_services.py b/tests/unit/install/test_services.py index 7e286ab29..a53e78878 100644 --- a/tests/unit/install/test_services.py +++ b/tests/unit/install/test_services.py @@ -551,3 +551,104 @@ def test_warning_with_prompts_only_does_not_mention_commands( msg = str(warning_calls[0]) assert "prompts" in msg assert "commands" not in msg + + +class TestNamespaceTreeOutput: + """Regression tests for PR #1028 panel findings 2 and 4: install summary + must surface namespace identity when a package declares one.""" + + def _make_pkg_info(self, tmp_path: Path, namespace: str | None) -> MagicMock: + pkg_dir = tmp_path / "pkg" + pkg_dir.mkdir(parents=True, exist_ok=True) + (pkg_dir / ".apm").mkdir(exist_ok=True) + pkg = MagicMock() + pkg.install_path = pkg_dir + pkg.name = "ns-pkg" + pkg.package = MagicMock() + pkg.package.namespace = namespace + return pkg + + def _run( + self, + tmp_path: Path, + pkg_info: Any, + target_paths: list[Path], + skill_created: bool = True, + sub_skills_promoted: int = 0, + primary_skill_path: Path | None = None, + ) -> MagicMock: + from apm_cli.install.services import integrate_package_primitives + + copilot = 
KNOWN_TARGETS["copilot"]
+        targets = [copilot]
+
+        logger = MagicMock()
+
+        integrators = {
+            k: MagicMock()
+            for k in [
+                "prompt_integrator",
+                "agent_integrator",
+                "skill_integrator",
+                "instruction_integrator",
+                "command_integrator",
+                "hook_integrator",
+            ]
+        }
+        skill_result = MagicMock()
+        skill_result.target_paths = target_paths
+        skill_result.skill_created = skill_created
+        skill_result.sub_skills_promoted = sub_skills_promoted
+        # services.py reads .skill_path.parent.name for the per-skill label
+        if primary_skill_path is None and target_paths:
+            primary_skill_path = target_paths[0] / "SKILL.md"
+        skill_result.skill_path = primary_skill_path
+        integrators["skill_integrator"].integrate_package_skill.return_value = skill_result
+
+        with patch(
+            "apm_cli.integration.dispatch.get_dispatch_table",
+            return_value={},
+        ):
+            integrate_package_primitives(
+                pkg_info,
+                tmp_path,
+                targets=targets,
+                diagnostics=MagicMock(),
+                package_name="ns-pkg",
+                logger=logger,
+                ctx=None,
+                **integrators,
+                force=False,
+                managed_files=None,
+            )
+        return logger
+
+    def test_namespaced_skill_install_label_includes_namespace(self, tmp_path: Path) -> None:
+        """When namespace is set, the confirmation line leads with the outcome
+        and shows ``<namespace>/<skill>`` plus the namespaced target path."""
+        pkg_info = self._make_pkg_info(tmp_path, namespace="acme")
+        deployed = tmp_path / ".github" / "skills" / "acme-brand-guidelines"
+        deployed.mkdir(parents=True, exist_ok=True)
+
+        logger = self._run(tmp_path, pkg_info, [deployed])
+
+        tree_msgs = [str(c) for c in logger.tree_item.call_args_list]
+        joined = "\n".join(tree_msgs)
+        assert "Integrated skill acme/brand-guidelines" in joined, joined
+        assert "acme/brand-guidelines" in joined, joined
+        assert ".github/skills/acme-brand-guidelines/" in joined, joined
+
+    def test_unnamespaced_skill_install_label_unchanged(self, tmp_path: Path) -> None:
+        """Packages without a namespace still use the plain skill label."""
+        pkg_info = 
self._make_pkg_info(tmp_path, namespace=None) + deployed = tmp_path / ".github" / "skills" / "plain-skill" + deployed.mkdir(parents=True, exist_ok=True) + + logger = self._run(tmp_path, pkg_info, [deployed]) + + tree_msgs = [str(c) for c in logger.tree_item.call_args_list] + joined = "\n".join(tree_msgs) + assert "Integrated skill ->" in joined, joined + assert ".github/skills/" in joined, joined + # Must NOT contain a namespace-style label segment + assert "skill /" not in joined, joined diff --git a/tests/unit/integration/test_skill_integrator.py b/tests/unit/integration/test_skill_integrator.py index 09cb25507..b5948a0de 100644 --- a/tests/unit/integration/test_skill_integrator.py +++ b/tests/unit/integration/test_skill_integrator.py @@ -9,6 +9,7 @@ from apm_cli.integration.skill_integrator import ( SkillIntegrationResult, SkillIntegrator, + _skill_owner_key_from_deployed_path, copy_skill_to_target, normalize_skill_name, to_hyphen_case, @@ -306,6 +307,7 @@ def _create_package_info( dependency_ref: DependencyReference = None, package_type: PackageType = None, content_type: "PackageContentType" = None, + namespace: str | None = None, ) -> PackageInfo: """Helper to create PackageInfo objects for tests. 
@@ -320,6 +322,7 @@ def _create_package_info(
         source=source or f"github.com/test/{name}",
         description=description,
         type=content_type,
+        namespace=namespace,
     )
     resolved_ref = ResolvedReference(
         original_ref="main",
@@ -431,6 +434,162 @@ def test_integrate_package_skill_processes_virtual_subdirectory_packages(self):
         # Skill directory should be created
         assert result.skill_path.exists()
 
+    def test_integrate_package_skill_uses_manifest_namespace(self):
+        """Packages with namespace get a direct, namespace-prefixed skill dir."""
+        package_dir = self.project_root / "my-skill"
+        package_dir.mkdir()
+        (package_dir / "SKILL.md").write_text("# My Skill")
+
+        package_info = self._create_package_info(
+            name="my-skill",
+            install_path=package_dir,
+            package_type=PackageType.CLAUDE_SKILL,
+            namespace="example",
+        )
+
+        result = self.integrator.integrate_package_skill(package_info, self.project_root)
+
+        target = self.project_root / ".github" / "skills" / "example-my-skill"
+        assert result.skill_created is True
+        assert result.skill_path == target / "SKILL.md"
+        assert (target / "SKILL.md").exists()
+        assert "name: example-my-skill" in (target / "SKILL.md").read_text()
+        assert not (self.project_root / ".github" / "skills" / "my-skill").exists()
+
+    def test_namespaced_deployed_path_maps_back_to_owner_key(self):
+        """Lockfile ownership keeps namespace identity for direct child paths."""
+        assert (
+            _skill_owner_key_from_deployed_path(
+                ".github/skills/example-my-skill",
+                namespace="example",
+            )
+            == "example/my-skill"
+        )
+
+    def test_nested_deployed_path_owner_key_stays_backward_compatible(self):
+        """Older nested lockfile paths already carry the namespace segment."""
+        assert (
+            _skill_owner_key_from_deployed_path(
+                ".github/skills/example/my-skill",
+                namespace="example",
+            )
+            == "example/my-skill"
+        )
+
+    def test_integrate_package_skill_namespace_does_not_emit_duplicate_verbose_log(self):
+        """Namespace routing is surfaced by install/services.py, not a second
+        inconsistent verbose line from the integrator.
+        """
+        package_dir = self.project_root / "brand-guidelines"
+        package_dir.mkdir()
+        (package_dir / "SKILL.md").write_text("# Brand Guidelines")
+
+        package_info = self._create_package_info(
+            name="brand-guidelines",
+            install_path=package_dir,
+            package_type=PackageType.CLAUDE_SKILL,
+            namespace="acme",
+        )
+
+        captured: list[str] = []
+
+        class _StubLogger:
+            def verbose_detail(self, msg):
+                captured.append(msg)
+
+            def warning(self, *_args, **_kwargs):
+                pass
+
+            def info(self, *_args, **_kwargs):
+                pass
+
+        self.integrator.integrate_package_skill(
+            package_info, self.project_root, logger=_StubLogger()
+        )
+
+        assert not any("namespace" in m for m in captured), captured
+
+    def test_integrate_package_skill_namespace_no_log_when_absent(self):
+        """When no namespace is set, no namespace-specific verbose line fires."""
+        package_dir = self.project_root / "plain-skill"
+        package_dir.mkdir()
+        (package_dir / "SKILL.md").write_text("# Plain")
+
+        package_info = self._create_package_info(
+            name="plain-skill",
+            install_path=package_dir,
+            package_type=PackageType.CLAUDE_SKILL,
+            namespace=None,
+        )
+
+        captured: list[str] = []
+
+        class _StubLogger:
+            def verbose_detail(self, msg):
+                captured.append(msg)
+
+            def warning(self, *_args, **_kwargs):
+                pass
+
+            def info(self, *_args, **_kwargs):
+                pass
+
+        self.integrator.integrate_package_skill(
+            package_info, self.project_root, logger=_StubLogger()
+        )
+
+        assert not any("namespace" in m for m in captured), captured
+
+    def test_integrate_package_skill_namespace_preserved_in_target_paths(self):
+        """IntegrationResult.target_paths must include the namespace prefix so
+        downstream consumers can reconstruct the full <namespace>/<name>
+        identity without relying on recursive harness discovery.
+        """
+        package_dir = self.project_root / "my-skill"
+        package_dir.mkdir()
+        (package_dir / "SKILL.md").write_text("# My Skill")
+
+        package_info = self._create_package_info(
+            name="my-skill",
+            install_path=package_dir,
+            package_type=PackageType.CLAUDE_SKILL,
+            namespace="acme",
+        )
+
+        result = self.integrator.integrate_package_skill(package_info, self.project_root)
+
+        assert result.target_paths, "expected at least one deployed path"
+        for tp in result.target_paths:
+            rel_parts = tp.relative_to(self.project_root).parts
+            assert "acme-my-skill" in rel_parts, f"namespace prefix missing from path: {tp}"
+            assert tp.parent.name == "skills", f"expected direct child of skills root: {tp}"
+
+    def test_integrate_sub_skills_uses_manifest_namespace(self):
+        """Sub-skills inherit the package namespace."""
+        package_dir = self.project_root / "bundle"
+        sub_skill = package_dir / ".apm" / "skills" / "helper"
+        sub_skill.mkdir(parents=True)
+        (sub_skill / "SKILL.md").write_text("# Helper")
+
+        package_info = self._create_package_info(
+            name="bundle",
+            install_path=package_dir,
+            package_type=PackageType.APM_PACKAGE,
+            namespace="example",
+        )
+
+        result = self.integrator.integrate_package_skill(package_info, self.project_root)
+
+        assert result.sub_skills_promoted == 1
+        assert (self.project_root / ".github" / "skills" / "example-helper" / "SKILL.md").exists()
+        assert (
+            "name: example-helper"
+            in (
+                self.project_root / ".github" / "skills" / "example-helper" / "SKILL.md"
+            ).read_text()
+        )
+        assert not (self.project_root / ".github" / "skills" / "helper").exists()
+
     def test_integrate_package_skill_multiple_virtual_file_packages_no_collision(self):
         """Test that multiple virtual FILE packages from same repo don't create conflicting Skills.
@@ -3684,10 +3843,10 @@ def test_copy_skill_to_target_fallback_without_targets(self):
 from dataclasses import replace as _dc_replace  # noqa: E402
 from unittest.mock import MagicMock  # noqa: E402
 
-from apm_cli.integration.targets import KNOWN_TARGETS  # noqa: E402
+from apm_cli.integration.targets import KNOWN_TARGETS, TargetProfile  # noqa: E402
 
 
-def _make_resolved_cowork_target(cowork_root: Path) -> "TargetProfile":  # noqa: F821
+def _make_resolved_cowork_target(cowork_root: Path) -> TargetProfile:
     """Return a frozen TargetProfile with resolved_deploy_root set for cowork.
 
     Args:
@@ -3696,8 +3855,6 @@ def _make_resolved_cowork_target(cowork_root: Path) -> "TargetProfile":  # noqa:
     Returns:
         A frozen TargetProfile suitable for cowork deployment tests.
     """
-    from apm_cli.integration.targets import TargetProfile  # noqa: F401
-
     return _dc_replace(KNOWN_TARGETS["copilot-cowork"], resolved_deploy_root=cowork_root)
@@ -3876,3 +4033,51 @@ def test_skill_only_agents_skipped_on_cowork(self, tmp_path: Path) -> None:
         assert deployed_skill.exists()
         # .apm dir is excluded via shutil.ignore_patterns('.apm')
         assert not (cowork_root / "my-skill" / ".apm").exists()
+
+
+# =============================================================================
+# Path-containment guard (PR #1028 round 4 — supply-chain-security finding)
+# =============================================================================
+
+
+class TestSkillDirContainmentGuard:
+    """Regression tests for the runtime path-containment check inside
+    ``_skill_dir``. Parse-time regex validation rejects traversal in the
+    namespace value itself; this guard catches symlink-based escapes that
+    resolve only at runtime."""
+
+    def test_skill_dir_returns_namespaced_path_when_safe(self, tmp_path: Path) -> None:
+        from apm_cli.integration.skill_integrator import _skill_dir
+
+        skills_root = tmp_path / "skills"
+        skills_root.mkdir()
+        result = _skill_dir(skills_root, "my-skill", "acme")
+        assert result == skills_root / "acme-my-skill"
+
+    def test_skill_dir_returns_flat_path_when_no_namespace(self, tmp_path: Path) -> None:
+        from apm_cli.integration.skill_integrator import _skill_dir
+
+        skills_root = tmp_path / "skills"
+        skills_root.mkdir()
+        result = _skill_dir(skills_root, "my-skill", None)
+        assert result == skills_root / "my-skill"
+
+    def test_skill_dir_rejects_symlink_skill_escaping_root(self, tmp_path: Path) -> None:
+        """If a previous install planted the computed skill dir as a symlink
+        pointing outside the skills root, ``_skill_dir`` must raise
+        ``PathTraversalError`` so ``shutil.copytree`` cannot follow it and
+        write outside the deploy root."""
+        import pytest
+
+        from apm_cli.integration.skill_integrator import _skill_dir
+        from apm_cli.utils.path_security import PathTraversalError
+
+        skills_root = tmp_path / "skills"
+        skills_root.mkdir()
+        outside = tmp_path / "outside"
+        outside.mkdir()
+        # Plant a malicious symlink at skills/evil-my-skill -> outside.
+        (skills_root / "evil-my-skill").symlink_to(outside, target_is_directory=True)
+
+        with pytest.raises(PathTraversalError):
+            _skill_dir(skills_root, "my-skill", "evil")
diff --git a/tests/unit/test_init_command.py b/tests/unit/test_init_command.py
index 985ed740d..d74dd775d 100644
--- a/tests/unit/test_init_command.py
+++ b/tests/unit/test_init_command.py
@@ -54,6 +54,33 @@ def test_init_current_directory(self):
         finally:
             os.chdir(self.original_dir)  # restore CWD before TemporaryDirectory cleanup
 
+    def test_init_namespace_yes_writes_manifest_field(self):
+        """--namespace makes the package namespace discoverable from apm init."""
+        with tempfile.TemporaryDirectory() as tmp_dir:
+            os.chdir(tmp_dir)
+            try:
+                result = self.runner.invoke(cli, ["init", "--yes", "--namespace", "acme"])
+
+                assert result.exit_code == 0
+                data = yaml.safe_load(Path("apm.yml").read_text(encoding="utf-8"))
+                assert data["namespace"] == "acme"
+            finally:
+                os.chdir(self.original_dir)
+
+    def test_init_namespace_rejects_invalid_value(self):
+        """--namespace uses the same validation contract as apm.yml parsing."""
+        with tempfile.TemporaryDirectory() as tmp_dir:
+            os.chdir(tmp_dir)
+            try:
+                result = self.runner.invoke(cli, ["init", "--yes", "--namespace", "acme--tools"])
+
+                assert result.exit_code == 1
+                assert "consecutive" in result.output
+                assert "hyphens" in result.output
+                assert not Path("apm.yml").exists()
+            finally:
+                os.chdir(self.original_dir)
+
     def test_init_explicit_current_directory(self):
         """Test initialization with explicit '.' argument."""
         with tempfile.TemporaryDirectory() as tmp_dir:
@@ -161,7 +188,7 @@ def test_init_interactive_mode(self):
             os.chdir(tmp_dir)
             try:
                 # Simulate user input
-                user_input = "my-test-project\n1.5.0\nTest description\nTest Author\ny\n"
+                user_input = "my-test-project\n1.5.0\nTest description\nTest Author\n\ny\n"
 
                 result = self.runner.invoke(cli, ["init"], input=user_input)
 
@@ -188,7 +215,7 @@ def test_init_interactive_mode_abort(self):
             os.chdir(tmp_dir)
             try:
                 # Simulate user input with 'no' to confirmation
-                user_input = "my-test-project\n1.5.0\nTest description\nTest Author\nn\n"
+                user_input = "my-test-project\n1.5.0\nTest description\nTest Author\n\nn\n"
 
                 result = self.runner.invoke(cli, ["init"], input=user_input)
 
@@ -229,7 +256,7 @@ def test_init_existing_project_confirm_prompt_shown_once(self):
         Path("apm.yml").write_text("name: existing-project\nversion: 0.1.0\n")
 
         # Say yes to overwrite, then provide interactive setup input
-        user_input = "y\nmy-project\n1.0.0\nA description\nAuthor\ny\n"
+        user_input = "y\nmy-project\n1.0.0\nA description\nAuthor\n\ny\n"
 
         result = self.runner.invoke(cli, ["init"], input=user_input)
         assert result.exit_code == 0
@@ -480,7 +507,7 @@ def test_init_interactive_reprompts_on_invalid_name_click(self):
         result = self.runner.invoke(
             cli,
             ["init"],
-            input="bad/name\nmy-project\n1.0.0\n\n\ny\n",
+            input="bad/name\nmy-project\n1.0.0\n\n\n\ny\n",
             catch_exceptions=False,
         )
         assert "Invalid project name" in result.output
@@ -492,7 +519,7 @@ def test_init_interactive_reprompts_on_dotdot_click(self):
         result = self.runner.invoke(
             cli,
             ["init"],
-            input="..\nmy-project\n1.0.0\n\n\ny\n",
+            input="..\nmy-project\n1.0.0\n\n\n\ny\n",
             catch_exceptions=False,
         )
        assert "Invalid project name" in result.output
diff --git a/tests/unit/test_self_entry_caller_guards.py b/tests/unit/test_self_entry_caller_guards.py
index 392533890..4a913968d 100644
--- a/tests/unit/test_self_entry_caller_guards.py
+++ b/tests/unit/test_self_entry_caller_guards.py
@@ -121,5 +121,5 @@ def test_ownership_map_excludes_self_entry(self, tmp_path):
         # Self-entry's owner string would have been '' if not skipped.
         assert "" not in owned_by.values()
         assert "" not in native_owners.values()
-        # The remote dep's skill leaf-name is 'SKILL.md' (last path segment).
-        assert owned_by.get("SKILL.md") == "repo"
+        # The remote dep's skill key is the path under skills/ (skill identity).
+        assert owned_by.get("remote-skill/SKILL.md") == "repo"